WO2018076614A1 - Method, apparatus, and device for processing live video, and computer-readable medium - Google Patents

Method, apparatus, and device for processing live video, and computer-readable medium Download PDF

Info

Publication number
WO2018076614A1
WO2018076614A1 · PCT/CN2017/079622 · CN2017079622W
Authority
WO
WIPO (PCT)
Prior art keywords
video
frame
data stream
average value
image blocks
Prior art date
Application number
PCT/CN2017/079622
Other languages
English (en)
French (fr)
Inventor
郑伟 (Zheng Wei)
Original Assignee
武汉斗鱼网络科技有限公司 (Wuhan Douyu Network Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 武汉斗鱼网络科技有限公司 (Wuhan Douyu Network Technology Co., Ltd.)
Publication of WO2018076614A1 publication Critical patent/WO2018076614A1/zh

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/23805 Controlling the feeding rate to the network, e.g. by controlling the video pump
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23418 Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008 Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting

Definitions

  • The present invention relates to the field of Internet and live video technologies, and in particular to a method, an apparatus, and a device for processing live video, and a computer-readable medium.
  • Live video broadcasting refers to using the Internet and streaming media technology to broadcast video recorded by an anchor (host) to multiple viewer users in real time.
  • The anchor client captures the anchor's live video through a camera and then sends the captured video to a server.
  • The server then distributes the live video to the viewer clients.
  • Live video is interactive, in two main ways: interaction among viewers, and interaction between viewers and the anchor. Because live video broadcasts the anchor's video in real time, the anchor may leave the camera's capture area or be in a static state (for example, sleeping) during the broadcast. At such times the viewers and the anchor cannot interact, yet if the current video is still processed in the prior-art manner, a large amount of redundant data and wasted bandwidth is likely to result.
  • The purpose of the embodiments of the present application is to provide a method, an apparatus, and a device for processing live video that solve the prior-art problem that continuing to process live video while the viewers and the anchor cannot interact easily produces a large amount of redundant data and wastes bandwidth.
  • An embodiment of the present application provides a method for processing live video, the method including: acquiring a video data stream captured by a camera, the video data stream including multiple video frames; dividing each video frame into multiple image blocks according to the same layout; calculating a luminance value of each image block, and calculating, based on the luminance values, the differences in luminance between image blocks at the same position in the video frames; determining, for the image blocks at each position, the average of the corresponding differences; determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; and, after determining that the video data stream is a still-frame video stream, performing at least one of: reducing the frame rate of the camera step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
  • An embodiment of the present application provides a first possible implementation of the foregoing first aspect, wherein dividing each video frame into multiple image blocks according to the same layout includes: periodically determining a division layout of the video frames; and dividing each video frame into multiple image blocks according to the division layout.
  • An embodiment of the present application provides a second possible implementation of the foregoing first aspect, wherein periodically determining the division layout of the video frames includes: after each still-frame detection period starts, extracting the luminance signal of one video frame in the current detection period; dividing the video frame whose luminance signal was extracted into multiple regions according to a first preset size; calculating the brightness average of the pixels in each region; and merging at least two adjacent regions whose brightness averages are equal into one region, to obtain the division layout of the video frames.
  • An embodiment of the present application provides a third possible implementation of the foregoing first aspect, wherein periodically determining the division layout of the video frames includes: after each still-frame detection period starts, extracting the luminance signal of the first video frame in the current detection period; dividing that frame into multiple regions; and merging adjacent regions whose brightness averages are equal into one region, to obtain the division layout of the video frames.
  • An embodiment of the present application provides a fourth possible implementation of the foregoing first aspect, wherein dividing each video frame into multiple image blocks according to the same layout includes: evenly dividing each video frame into multiple image blocks according to a second preset size.
  • An embodiment of the present application provides a fifth possible implementation of the foregoing first aspect, wherein calculating the luminance value of each image block includes: acquiring the luminance value of each pixel in the image block; calculating the brightness average of the pixels in the image block from the luminance values of the pixels; and taking that brightness average as the luminance value of the image block.
  • An embodiment of the present application provides a sixth possible implementation of the foregoing first aspect, wherein calculating the luminance value of each image block includes: acquiring the luminance value of each pixel in the image block; and taking the sum of the luminance values of the pixels in the image block as the luminance value of the image block.
  • An embodiment of the present application provides a seventh possible implementation of the foregoing first aspect, wherein calculating, based on the luminance values, the differences in luminance between image blocks at the same position in the video frames includes: calculating the difference in luminance between each image block in each video frame of the current detection period and the image block at the same position in the first video frame of the current detection period.
  • An embodiment of the present application provides an eighth possible implementation of the foregoing first aspect, wherein determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than the preset mean includes: counting, in each detection period, the number of image blocks whose average has an absolute value greater than the preset mean; and comparing that number with a first preset threshold; when the number is less than the first preset threshold for a consecutive preset number of detection periods, determining that the video data stream is a still-frame video stream.
  • An embodiment of the present application provides a ninth possible implementation of the foregoing first aspect, wherein determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value less than or equal to the preset mean includes: counting, in each detection period, the number of image blocks whose average has an absolute value less than or equal to the preset mean; and comparing that number with a second preset threshold; when the number is greater than the second preset threshold for a consecutive preset number of detection periods, determining that the video data stream is a still-frame video stream.
  • An embodiment of the present application provides a tenth possible implementation of the foregoing first aspect, wherein, after determining that the video data stream is a still-frame video stream, performing at least one of reducing the capture frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server includes: after determining that the video data stream is a still-frame video stream, counting the duration of the still-frame video stream; and, according to the duration, performing at least one of reducing the capture frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
  • An embodiment of the present application provides an eleventh possible implementation of the foregoing first aspect, wherein, when the video data stream is determined to be a non-still-frame video stream, whether the current push-stream link is disconnected is detected, and if so, a reconnect message is sent to the server.
  • An embodiment of the present application provides an apparatus for processing live video, the apparatus including: an acquiring module configured to acquire a video data stream captured by a camera, the video data stream including multiple video frames; a dividing module configured to divide each video frame into multiple image blocks according to the same layout; a computing module configured to calculate a luminance value of each image block and to calculate, based on the luminance values, the differences in luminance between image blocks at the same position in the video frames; and a determining module configured to determine, for the image blocks at each position, the average of the corresponding differences, and to determine whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean.
  • An embodiment of the present application provides a first possible implementation of the foregoing second aspect, wherein the dividing module includes: a determining unit configured to periodically determine a division layout of the video frames; and a dividing unit configured to divide each video frame into multiple image blocks according to the division layout.
  • An embodiment of the present application provides a device for processing live video, including a memory configured to store a program and a processor configured to execute, by calling the program stored in the memory, a method including the following steps: acquiring a video data stream captured by a camera, the video data stream including multiple video frames; dividing each video frame into multiple image blocks according to the same layout; calculating a luminance value of each image block, and calculating, based on the luminance values, the differences in luminance between image blocks at the same position in the video frames; determining, for the image blocks at each position, the average of the corresponding differences; determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; and, after determining that the video data stream is a still-frame video stream, performing at least one of reducing the frame rate of the camera step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
  • a computer readable medium having non-volatile program code executable by a processor, the program code causing the processor to perform the above method.
  • The method, apparatus, and device for processing live video, and the computer-readable medium, provided by the embodiments of the present application perform still-frame detection on the acquired video data stream. When a still-frame video stream is detected, that is, when there is currently no interaction between the viewers and the anchor, the implementation reduces the capture frame rate of the video data stream step by step, increases the key-frame interval in the video data stream step by step, and/or sends a cut-off message to the server, thereby avoiding the large amount of redundant data and wasted bandwidth that would otherwise be produced while the viewers and the anchor are not interacting.
  • FIG. 1 is a flowchart of a method for processing live video provided by an embodiment of the present application;
  • FIG. 2 is a flowchart of periodically determining a division layout of a video frame in a method for processing live video according to an embodiment of the present application;
  • FIG. 3 is a flowchart of calculating a luminance value of each image block in a method for processing live video according to an embodiment of the present application;
  • FIG. 4 is a schematic structural diagram of an apparatus for processing live video provided by an embodiment of the present application;
  • FIG. 5 is a schematic block diagram of a device for processing live video provided by an embodiment of the present application.
  • During a live video broadcast, the anchor may leave the image capture device that captures the video data stream, or may be in a static state such as sleeping. At such times the viewers and the anchor cannot interact, and if the current video is still processed in the prior-art manner, a large amount of redundant data and wasted bandwidth is likely to result. Based on this, the embodiments of the present application provide a method, an apparatus, and a device for processing live video, described below by way of embodiments.
  • An embodiment of the present application provides a method for processing live video. As shown in FIG. 1, the method includes the following steps S110 to S150.
  • Capturing the video data stream can be implemented by an image capture device, such as a camera.
  • A camera is taken as an example for the description below.
  • The camera may be one attached to a device such as a mobile phone or computer, or a separately installed camera for live video; this is not limited in this embodiment.
  • The camera captures the video data stream in real time: once turned on, it starts capturing and outputs the captured stream to the processor that executes the processing method of this embodiment. The video data stream captured by the camera is transmitted in units of video frames, so the acquired video data stream includes multiple video frames.
  • each video frame in the video data stream is divided into a plurality of image blocks.
  • the division of the video frame includes two cases:
  • In the first case, a division layout of the video frames is periodically determined, and each video frame is divided into multiple image blocks according to that layout.
  • Periodically determining the division layout means that, in each detection period, a division layout is determined from one video frame of the video data stream received in that period, and the other video frames received in the period are divided according to that layout to obtain multiple image blocks.
  • periodically determining the division layout of the video frame includes steps S210-S220, as follows.
  • Still-frame detection is performed on the video data stream periodically, and the length of each detection period may be preset, for example 5, 8, or 10 seconds per period. The specific length can be set according to the actual application scenario and is not limited in the present application.
  • The acquired video data stream is in YUV (luminance-chrominance) format, where Y is the luminance signal. The Y data is stored contiguously, with each pixel of the video frame corresponding to one byte. With the resolution expressed as width × height, extracting the luminance signal of a video frame amounts to copying the first width × height bytes of data from the frame.
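  • As a minimal sketch (assuming a planar YUV format such as YUV420, with one luminance byte per pixel stored at the start of each frame, as described above), the extraction could look like:

```python
def extract_luma(frame_bytes, width, height):
    """Return the Y (luminance) plane of a planar YUV frame.

    In planar YUV layouts the Y data is stored contiguously at the
    start of the frame, one byte per pixel, so the luminance signal
    is simply the first width * height bytes.
    """
    return frame_bytes[: width * height]
```

For a 1280 × 720 frame this copies 921,600 bytes, matching the width × height bytes described above.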
  • The video frame whose luminance signal was extracted is divided into multiple regions according to the first preset size; the brightness average of the pixels in each region is calculated; and at least two adjacent regions whose brightness averages are equal are merged into one region, yielding the division layout of the video frames.
  • The first preset size may be 64 × 64 pixels, or another size; it can be chosen according to the actual application scenario, and its specific value is not limited in this embodiment of the present application.
  • The size of the video frame is an integer multiple of the first preset size.
  • A video frame received in the detection period is divided into multiple regions, each of a size equal to the first preset size.
  • Calculating the brightness average of the pixels in each region means summing the brightness values of all pixels in the region and dividing that sum by the number of pixels in the region. It can be understood that the brightness value of each pixel is determined from its luminance signal.
  • The brightness average of each region is compared with those of its four adjacent regions, and adjacent regions whose averages are equal are merged into one region. In this way the first video frame is divided into a number of larger regions, each larger region becomes an image block, and the resulting division is taken as the division layout for the current detection period.
  • The division layout is determined once per period. Therefore, after a video frame is acquired, it is first determined whether the frame is the one used to determine the layout for the current detection period (for example, the first video frame). If it is, the layout for the current detection period is determined from it; if not, the frame is divided according to the layout already determined in the current detection period to obtain multiple image blocks.
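  • The first case above (grid the frame at the first preset size, average the luminance per region, then merge adjacent regions with equal averages) can be sketched as follows. The 64-pixel default and the flood-fill merging strategy are illustrative assumptions; the application only requires that 4-adjacent regions with equal brightness averages end up in the same merged block:

```python
def region_averages(luma, width, height, block=64):
    """Average luminance of each block x block region of the Y plane,
    returned as a rows x cols grid (frame size assumed to be an
    integer multiple of the block size)."""
    cols, rows = width // block, height // block
    avgs = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            total = 0
            for y in range(r * block, (r + 1) * block):
                total += sum(luma[y * width + c * block : y * width + (c + 1) * block])
            avgs[r][c] = total / (block * block)
    return avgs

def merge_layout(avgs):
    """Label the grid cells: 4-adjacent cells with equal brightness
    averages receive the same label (flood fill), so each label
    corresponds to one merged image block of the division layout."""
    rows, cols = len(avgs), len(avgs[0])
    labels = [[-1] * cols for _ in range(rows)]
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if labels[r][c] != -1:
                continue
            labels[r][c] = next_label
            stack = [(r, c)]
            while stack:
                y, x = stack.pop()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and labels[ny][nx] == -1
                            and avgs[ny][nx] == avgs[y][x]):
                        labels[ny][nx] = next_label
                        stack.append((ny, nx))
            next_label += 1
    return labels
```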
  • In the second case, each video frame is evenly divided into multiple image blocks according to a second preset size.
  • The specific value of the second preset size may be set according to the actual application scenario and is not limited in this embodiment of the present application.
  • Each acquired video frame is evenly divided into multiple image blocks according to the second preset size, and the image blocks are all of equal size.
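  • The second case (uniform division at the second preset size) is simpler. A sketch, with the block width and height as parameters and the frame size assumed to be a multiple of the block size; keying the result by grid position (row, col) is an assumed representation that makes it easy to compare blocks at the same position across frames:

```python
def split_blocks(luma, width, height, bw, bh):
    """Evenly divide a frame's luma plane into bw x bh image blocks.

    Returns a dict mapping each block's grid position (row, col) to
    the list of its pixel luminance values.
    """
    blocks = {}
    for r in range(height // bh):
        for c in range(width // bw):
            pixels = []
            for y in range(r * bh, (r + 1) * bh):
                pixels.extend(luma[y * width + c * bw : y * width + (c + 1) * bw])
            blocks[(r, c)] = pixels
    return blocks
```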
  • Whether the video frame is divided into multiple image blocks according to the first case or the second case above, after the division the brightness value of each image block needs to be calculated.
  • The luminance value of each image block is calculated through steps S310 to S330, as follows.
  • The sum of the brightness values of all pixels in each image block is calculated, and that sum is divided by the number of pixels in the image block to obtain the brightness average of the pixels in the block. This brightness average is taken as the luminance value of the image block.
  • Alternatively, the sum of the luminance values of all pixels in the image block may be used as the luminance value of the image block.
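  • Both variants can be sketched in one helper; treating the choice between the average and the plain sum as a parameter is an illustrative simplification, not something from the application:

```python
def block_brightness(pixels, use_average=True):
    """Luminance value of an image block: either the mean luminance of
    its pixels (steps S310 to S330) or, as the cheaper variant, their
    plain sum."""
    total = sum(pixels)
    return total / len(pixels) if use_average else total
```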
  • The differences in luminance between image blocks at the same position in the video frames are calculated from the image blocks' luminance values. Specifically, for each image block of each video frame in the current detection period, the difference between its luminance value and that of the image block at the same position in the first video frame of the current detection period is calculated.
  • For example, suppose 20 video frames are acquired in the current detection period and each video frame is divided into 10 image blocks. The differences between the luminance values of the 10 image blocks of the second video frame and those of the image blocks at the same positions in the first video frame are calculated; the same is done for the third video frame, and so on up to the twentieth video frame. Of course, 20 frames per period and 10 blocks per frame are used here only as an example; the number of frames acquired per period and the number of image blocks per frame are not limited.
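  • The per-position differencing and averaging over a detection period might be sketched as follows; representing each frame as a flat list of block luminance values in a fixed position order is an assumed encoding:

```python
def average_differences(frames):
    """frames: per-frame lists of block luminance values in a fixed
    block-position order; frames[0] is the first frame of the period.

    Returns, for each block position, the average of the differences
    between that block's luminance in each later frame and its
    luminance in the first frame.
    """
    first = frames[0]
    later = frames[1:]
    sums = [0.0] * len(first)
    for frame in later:
        for i, value in enumerate(frame):
            sums[i] += value - first[i]
    return [s / len(later) for s in sums]
```

With the 20-frame, 10-block example above, this yields 10 averages, one per block position.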
  • S140: Determine, for the image blocks at each position, the average of the corresponding differences, and determine whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean.
  • For the image blocks at each position, the average of the corresponding differences is calculated; as many averages are obtained as there are image blocks per video frame.
  • Determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than the preset mean includes: counting that number in each detection period and comparing it with a first preset threshold; when the number remains below the first preset threshold for a consecutive preset number of detection periods, the video data stream is determined to be a still-frame video stream.
  • The specific value of the preset mean is not limited; it is usually chosen as a small value, for example a value less than 1 or less than 2.
  • Preferably, the preset mean is zero. It can be understood that, when the preset mean is zero, an average whose absolute value is greater than the preset mean is simply a non-zero average. The averages obtained above are each compared with zero, the number of image blocks with a non-zero average is counted, and that number is compared with the first preset threshold; if, for a consecutive preset number of detection periods, the number of image blocks with a non-zero average is less than the first preset threshold, the video data stream is determined to be a still-frame video stream.
  • The first preset threshold may be 10% of the number of image blocks per video frame, though other values may be used; its specific value can be set according to the actual application scenario and is not limited in the present application.
  • The preset number of consecutive detection periods may be, for example, 5 or 6; the specific number can be set according to the actual application scenario and is not limited.
  • Alternatively, the number of image blocks whose average has an absolute value less than or equal to the preset mean may be counted and compared with a second preset threshold; when, for a consecutive preset number of detection periods, that number is greater than or equal to the second preset threshold, the video data stream is determined to be a still-frame video stream.
  • When the preset mean is preferably zero, the number of image blocks whose average is zero may be counted; when, for a consecutive preset number of detection periods, that number is greater than or equal to the second preset threshold, the video data stream is determined to be a still-frame video stream.
  • The second preset threshold may be 90% of the number of image blocks per video frame, though other values may be used; its specific value can be set according to the actual application scenario and is not limited in this embodiment of the present application.
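  • Both counting rules reduce to comparing, per detection period, the number of "changed" blocks against a threshold over several consecutive periods. A sketch of the first rule described above (function and parameter names are illustrative; the thresholds and period count would come from the configuration discussed above):

```python
def is_still_frame(period_averages, changed_threshold,
                   preset_mean=0.0, required_periods=5):
    """Decide whether the stream is a still-frame video stream.

    period_averages: one list of per-block-position averages per
    detection period, oldest first. In each of the last
    required_periods periods, the number of blocks whose average has
    absolute value greater than preset_mean must stay below
    changed_threshold (e.g. 10% of the block count).
    """
    if len(period_averages) < required_periods:
        return False
    for averages in period_averages[-required_periods:]:
        changed = sum(1 for a in averages if abs(a) > preset_mean)
        if changed >= changed_threshold:
            return False
    return True
```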
  • When the video data stream is determined to be a still-frame video stream, the bandwidth occupied by the current video data stream needs to be reduced, which includes performing at least one of: reducing the frame rate of the camera step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
  • That is, any one, any two, or all three of reducing the camera's frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server may be performed.
  • the duration of the still-frame video stream is counted.
  • As the frame rate of the camera is gradually reduced, the number of video frames output by the camera falls. Reducing the camera's frame rate therefore reduces the amount of video at the source, which on the one hand reduces the data-processing load and on the other hand reduces data redundancy.
  • Under normal conditions the camera's frame rate is generally 25 to 30 fps (frames per second), that is, the number of video frames the camera outputs per second.
  • The camera's frame rate can typically be set in the range of 1 to 30 fps, so it can be lowered step by step within that range.
  • The above range can be adjusted according to the specific application scenario; it is given here only as an example and is not limiting.
  • the setting of the camera frame rate can be achieved through the provided API.
  • Alternatively, when the duration of the still-frame video stream reaches a time T2, the interval between key frames in the video data stream is increased step by step. Because a key frame carries a large amount of information, its data size is tens or even hundreds of times that of other frame types, and key frames appear in the video data stream at regular intervals. Therefore, as the duration of the still-frame video stream grows, increasing the key-frame interval step by step reduces the size of the whole video data stream and thus the bandwidth it occupies.
  • For example, under normal conditions the key-frame interval is 30, meaning there is only one key frame in every 30 video frames. After the interval is increased step by step, the maximum key-frame interval may grow to 100; of course, other values may be set, and the maximum adjustable interval is not limited here. It can be understood that in the H.264 video coding format, the I frame is the key frame.
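The two step-by-step reductions can be pictured with a small sketch that maps the still-frame duration to a camera frame rate and a key-frame interval. The thresholds t1 and t2 and the step sizes (5 fps and 10 frames per extra minute of stillness) are invented for illustration; the text only fixes the ranges (1 to 30 fps, key-frame interval up to 100) and leaves the schedule, and the relation between T1 and T2, open.

```python
def degrade_settings(still_seconds, t1=60, t2=120,
                     normal_fps=30, min_fps=1,
                     normal_gop=30, max_gop=100):
    """Map still-frame duration (seconds) to (camera fps, key-frame interval).

    t1/t2 are the assumed onsets of each reduction; the step sizes per
    extra minute are likewise assumptions. Values are clamped to the
    ranges stated in the text: 1-30 fps and a key-frame interval <= 100.
    """
    fps = normal_fps
    gop = normal_gop
    if still_seconds >= t1:
        steps = 1 + (still_seconds - t1) // 60   # one step per extra minute
        fps = max(min_fps, normal_fps - 5 * steps)
    if still_seconds >= t2:
        steps = 1 + (still_seconds - t2) // 60
        gop = min(max_gop, normal_gop + 10 * steps)
    return fps, gop
```

Under this assumed schedule, an hour of stillness bottoms out at 1 fps and a key-frame interval of 100, after which the text's cut-off message would take over.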
  • The time T1 and the time T2 may be equal, T1 may be greater than T2, or T1 may be less than T2; the embodiments of the present application do not limit the specific relationship between T1 and T2.
  • If the still-frame video stream keeps lasting, so that its duration far exceeds both T1 and T2, the push-stream link needs to be disconnected, that is, the audio-video data stream is no longer pushed out. At this point a cut-off message is sent to the server, which avoids wasting bandwidth.
  • In this embodiment, the cut-off message may be sent to the server when the still-frame video stream has lasted, for example, one hour; exactly how long the still-frame stream must last before the cut-off message is sent is not a limitation of this embodiment.
  • Sending the cut-off message informs the server of the reason the live room's stream stopped and asks it to keep the live broadcast open even though no new data is being pushed.
  • For viewer users who enter the live room after the cut-off and request audio-video data, the server may provide only the last cached still-frame image; for viewer users who entered the live room before the cut-off, the server notifies each of them to disconnect the playback-data request but not to release the player, so the playback interface keeps showing the last frame.
  • When the video data stream is determined to be a non-still-frame video stream, it is detected whether the current push-stream link has been disconnected; if so, the push-stream link to the server is re-established. After the push-stream link is re-established, the current camera frame rate and the key-frame interval in the video data stream are detected. If the camera frame rate is found to have been modified, it is adjusted back to the frame rate used when the original video data stream was a non-still-frame video stream; if the key-frame interval in the current video data stream is found to have been modified, it is adjusted back to the interval used when the original video data stream was a non-still-frame video stream.
  • The live-video processing method provided by the embodiments of the present application performs still-frame detection on the obtained video data stream. When a still-frame video stream is detected, that is, when there is no interaction between the viewers and the anchor, at least one of the following is performed: reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server. This avoids generating a large amount of redundant data, and the resulting waste of bandwidth, while there is no interaction between the viewers and the anchor.
  • An embodiment of the present application provides a live-video processing device. As shown in FIG. 4, the device includes an obtaining module 410, a dividing module 420, a calculating module 430, a determining module 440, and an executing module 450.
  • The obtaining module 410 is configured to obtain the video data stream collected by a camera, the video data stream including a plurality of video frames. The dividing module 420 is configured to divide each video frame into a plurality of image blocks in the same layout. The calculating module 430 is configured to calculate a luminance value for each image block and, from those luminance values, calculate the differences in luminance value between image blocks at the same position in each video frame. The determining module 440 is configured to determine the average of the differences corresponding to the image blocks at the same position, and to determine whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean. The executing module 450 is configured to, after the video data stream is determined to be a still-frame video stream, perform at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
  • The dividing module 420 divides each video frame into a plurality of image blocks in the same layout by means of a determining unit and a dividing unit. The determining unit is configured to periodically determine the division layout of the video frames, and the dividing unit is configured to divide each video frame into a plurality of image blocks according to that layout.
  • Specifically, the determining unit is configured to, after a still-frame detection period starts, extract the luminance signal of one of the video frames in the current detection period; divide that video frame into a plurality of regions according to a first preset size; calculate the average luminance of the pixels in each region; and merge at least two adjacent regions whose pixel luminance averages are equal into one region, obtaining the division layout of the video frames. Preferably, the determining unit extracts the luminance signal of the first video frame in the current detection period and derives the layout from that frame in the same way.
  • Alternatively, the dividing module 420 may divide each video frame evenly into a plurality of image blocks according to a second preset size.
  • The calculating module 430 includes: a luminance-value acquiring unit configured to acquire the luminance value of each pixel in an image block; a calculating unit configured to calculate, from those per-pixel luminance values, the average luminance of the pixels in the image block; and a luminance-value determining unit configured to take that average as the luminance value of the image block.
  • Alternatively, the calculating module 430 may be configured to acquire the luminance value of each pixel in the image block and use the sum of the pixel values in the image block as its luminance value.
  • Further, the calculating module 430 calculates, for each video frame in the current detection period, the differences in luminance value between its image blocks and the image blocks at the same positions in the first video frame of the current detection period.
  • The determining module 440 includes: a first statistics unit configured to count, in each detection period, the number of image blocks whose average has an absolute value greater than the preset mean; and a first comparison unit configured to compare that number with a first preset threshold and, when the number is less than the first preset threshold for a consecutive preset number of detection periods, determine that the video data stream is a still-frame video stream.
  • Alternatively, the determining module 440 may include: a second statistics unit configured to count, in each detection period, the number of image blocks whose average has an absolute value less than or equal to the preset mean; and a second comparison unit configured to compare that number with a second preset threshold and, when the number is greater than the second preset threshold for a consecutive preset number of detection periods, determine that the video data stream is a still-frame video stream.
  • The executing module 450 is configured to, after the video data stream is determined to be a still-frame video stream, count the duration of the still-frame video stream and, according to that duration, perform at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
  • The executing module 450 is further configured to, when the video data stream is determined to be a non-still-frame video stream, detect whether the current push-stream link has been disconnected and, if so, send a reconnect message to the server.
  • The live-video processing device provided by the embodiments of the present application performs still-frame detection on the obtained video data stream. When the video data stream is detected to be a still-frame video stream, that is, when there is no interaction between the viewers and the anchor, at least one of the following is performed: reducing the acquisition frame rate of the video data stream step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server. This avoids generating a large amount of redundant data, and the resulting waste of bandwidth, while there is no interaction between the viewers and the anchor.
  • An embodiment of the present application further provides a live-video processing apparatus.
  • FIG. 5 is a schematic block diagram of a live-video processing apparatus 500 according to an embodiment of the present application.
  • As shown in FIG. 5, the live-video processing apparatus 500 includes a memory 501 and a processor 502.
  • The memory 501 is configured to store a program.
  • The processor 502 is configured to, by calling the program stored in the memory 501, perform a method comprising: obtaining a video data stream collected by a camera, the video data stream including a plurality of video frames; dividing each video frame into a plurality of image blocks in the same layout; calculating a luminance value for each image block and, from those luminance values, calculating the differences in luminance value between image blocks at the same position in each video frame; determining the average of the differences corresponding to the image blocks at the same position, and determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; and, after the video data stream is determined to be a still-frame video stream, performing at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to a server.
  • The processor 502 runs the program stored in the memory 501 to execute the various function applications and data processing, that is, the live-video processing method of the embodiments of the present application.
  • The memory 501 may include, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
  • The processor 502 may execute the program stored in the memory 501 after receiving an execution instruction, thereby implementing the method defined by the flow disclosed in any of the foregoing embodiments of the present application.
  • The processor 502 may be an integrated circuit chip with signal-processing capability. It may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, and so on.
  • It can be understood that the structure shown in FIG. 5 is merely illustrative; the processing apparatus 500 may include more or fewer components than shown in FIG. 5, or have a configuration different from that shown in FIG. 5. The components shown in FIG. 5 may be implemented in hardware, software, or a combination of the two.
  • The modules and units of the live-video processing device in the foregoing embodiments may be implemented by software code, in which case they may be stored in the memory 501 of the processing apparatus 500. The modules and units may likewise be implemented by hardware such as an integrated circuit chip.
  • The live-video processing device provided by the embodiments of the present application may be specific hardware on an apparatus, or software or firmware installed on an apparatus. The implementation principle and technical effects of the device provided by the embodiments of the present application are the same as those of the foregoing method embodiments; for brevity, where the device embodiments are silent, refer to the corresponding content of the foregoing method embodiments.
  • A person skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the system, device, and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
  • In the embodiments provided in the present application, it should be understood that the disclosed device and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and in actual implementation there may be other ways of dividing them. As another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
  • The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • In addition, the functional units in the embodiments provided in the present application may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit.
  • If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the various embodiments of the present application.
  • The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application provides a live-video processing method, device, apparatus, and computer-readable medium, including: obtaining a video data stream collected by a camera, the video data stream including a plurality of video frames; dividing each video frame into a plurality of image blocks in the same layout; calculating a luminance value for each image block and, from those luminance values, calculating the differences in luminance value between image blocks at the same position in each video frame; determining the average of the differences corresponding to the image blocks at the same position, and determining from the number of image blocks corresponding to those averages whether the video data stream is a still-frame video stream; and, when it is determined to be a still-frame video stream, performing at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server. The present application avoids generating a large amount of redundant data, and wasting bandwidth, when there is no interaction between the viewers and the anchor.

Description

Live-video processing method, device and apparatus, and computer-readable medium
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on October 31, 2016, with application number CN201610929382.5 and invention title "Live-video processing method and device", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of the Internet and live video streaming, and in particular to a live-video processing method, device, apparatus, and computer-readable medium.
Background
Live video streaming broadcasts an anchor's recorded live video to many viewer users over the Internet using streaming-media technology. During a live broadcast, the anchor's client collects the anchor's live video through a camera and sends the collected live video to a server, which forwards it to the clients.
What most distinguishes live streaming from other forms of video playback is its interactivity, which shows in two ways: interaction among viewers, and interaction between viewers and the anchor. Live streaming sends the anchor's video out in real time, but during a broadcast the anchor may leave the camera's capture area or remain motionless, for example while sleeping. At such times viewers and the anchor cannot interact, and if the current video is still processed in the prior-art way, a large amount of redundant data and wasted bandwidth easily result.
Summary of the Invention
In view of this, embodiments of the present application aim to provide a live-video processing method, device, and apparatus, to solve the problem that, when viewers and the anchor cannot interact, continuing to process the live video with the prior-art method easily produces a large amount of redundant data and wasted bandwidth.
In a first aspect, an embodiment of the present application provides a live-video processing method, the method comprising: obtaining a video data stream collected by a camera, the video data stream comprising a plurality of video frames; dividing each video frame into a plurality of image blocks in the same layout; calculating a luminance value of each image block and, from the luminance values, calculating the differences in luminance value between image blocks at the same position in each video frame; determining the average of the differences corresponding to the image blocks at the same position, and determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; and, after the video data stream is determined to be a still-frame video stream, performing at least one of reducing the camera frame rate step by step, increasing the interval between key frames in the video data stream step by step, and sending a cut-off message to a server.
With reference to the first aspect, an embodiment of the present application provides a first possible implementation of the first aspect, in which dividing each video frame into a plurality of image blocks in the same layout comprises: periodically determining a division layout of the video frames; and dividing each video frame into a plurality of image blocks according to the division layout.
With reference to the first possible implementation of the first aspect, an embodiment provides a second possible implementation, in which periodically determining the division layout comprises: after a still-frame detection period starts, extracting the luminance signal of one of the video frames in the current detection period; dividing that video frame into a plurality of regions according to a first preset size, calculating the average luminance of the pixels corresponding to each region, and merging at least two adjacent regions whose pixel luminance averages are equal into one region, to obtain the division layout of the video frames.
With reference to the first possible implementation of the first aspect, an embodiment provides a third possible implementation, in which periodically determining the division layout comprises: after a still-frame detection period starts, extracting the luminance signal of the first video frame in the current detection period;
dividing the first video frame into a plurality of regions according to the first preset size, calculating the average luminance of the pixels corresponding to each region, and merging at least two adjacent regions whose pixel luminance averages are equal into one region, to obtain the division layout of the video frames.
With reference to the first aspect, an embodiment provides a fourth possible implementation, in which dividing each video frame into a plurality of image blocks in the same layout comprises: dividing each video frame evenly into a plurality of image blocks according to a second preset size.
With reference to the first aspect, an embodiment provides a fifth possible implementation, in which calculating the luminance value of each image block comprises: acquiring the luminance value of each pixel in the image block; calculating, from the per-pixel luminance values, the average luminance of the pixels in the image block; and taking that average as the luminance value of the image block.
With reference to the first aspect, an embodiment provides a sixth possible implementation, in which calculating the luminance value of each image block comprises: acquiring the luminance value of each pixel in the image block; and taking the sum of the pixel values in the image block as its luminance value.
With reference to the first aspect, an embodiment provides a seventh possible implementation, in which calculating the differences in luminance value between image blocks at the same position in each video frame comprises: calculating the differences in luminance value between the image blocks of each video frame in the current detection period and the image blocks at the same positions in the first video frame of the current detection period.
With reference to the first aspect, an embodiment provides an eighth possible implementation, in which determining whether the stream is a still-frame video stream from the number of blocks whose average has an absolute value greater than the preset mean comprises: counting, in each detection period, the number of image blocks whose average has an absolute value greater than the preset mean; comparing that number with a first preset threshold; and, when the number is less than the first preset threshold for a consecutive preset number of detection periods, determining that the video data stream is a still-frame video stream.
With reference to the first aspect, an embodiment provides a ninth possible implementation, in which determining whether the stream is a still-frame video stream from the number of blocks whose average has an absolute value less than or equal to the preset mean comprises: counting, in each detection period, the number of image blocks whose average has an absolute value less than or equal to the preset mean; comparing that number with a second preset threshold; and, when the number is greater than the second preset threshold for a consecutive preset number of detection periods, determining that the video data stream is a still-frame video stream.
With reference to the first aspect, an embodiment provides a tenth possible implementation, in which performing the measures after the determination comprises: after the video data stream is determined to be a still-frame video stream, counting the duration of the still-frame video stream; and, according to that duration, performing at least one of reducing the acquisition frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
With reference to the first aspect, an embodiment provides an eleventh possible implementation, in which, when the video data stream is determined to be a non-still-frame video stream, it is detected whether the current push-stream link has been disconnected and, if so, a reconnect message is sent to the server.
In a second aspect, an embodiment of the present application provides a live-video processing device, the device comprising: an obtaining module configured to obtain a video data stream collected by a camera, the video data stream comprising a plurality of video frames; a dividing module configured to divide each video frame into a plurality of image blocks in the same layout; a calculating module configured to calculate a luminance value of each image block and, from the luminance values, calculate the differences in luminance value between image blocks at the same position in each video frame; a determining module configured to determine the average of the differences corresponding to the image blocks at the same position and to determine whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; and an executing module configured to, after the video data stream is determined to be a still-frame video stream, perform at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to a server.
With reference to the second aspect, an embodiment provides a first possible implementation of the second aspect, in which the dividing module comprises: a determining unit configured to periodically determine the division layout of the video frames; and a dividing unit configured to divide each video frame into a plurality of image blocks according to the division layout.
In a third aspect, an embodiment of the present application provides a live-video processing apparatus, comprising: a memory configured to store a program, and a processor configured to perform, by calling the program stored in the memory, a method comprising the following steps: obtaining a video data stream collected by a camera, the video data stream comprising a plurality of video frames; dividing each video frame into a plurality of image blocks in the same layout; calculating a luminance value of each image block and, from the luminance values, calculating the differences in luminance value between image blocks at the same position in each video frame; determining the average of the differences corresponding to the image blocks at the same position, and determining whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; and, after the video data stream is determined to be a still-frame video stream, performing at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to a server.
According to yet another aspect of the present invention, a computer-readable medium having non-volatile program code executable by a processor is provided, the program code causing the processor to perform the above method.
The live-video processing method, device, apparatus, and computer-readable medium provided by the embodiments of the present application perform still-frame detection on the obtained video data stream. When a still-frame video stream is detected, that is, when there is currently no interaction between the viewers and the anchor, at least one of the following is performed: reducing the acquisition frame rate of the video data stream step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server. This avoids generating a large amount of redundant data, and wasting bandwidth, while there is no interaction between the viewers and the anchor.
To make the above objects, features, and advantages of the present application clearer and easier to understand, preferred embodiments are described in detail below in conjunction with the accompanying drawings.
Brief Description of the Drawings
In order to explain the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of the present application and should therefore not be regarded as limiting the scope; for those of ordinary skill in the art, other related drawings can be obtained from these drawings without creative effort.
FIG. 1 shows a flowchart of a live-video processing method provided by an embodiment of the present application;
FIG. 2 shows a flowchart of periodically determining the division layout of video frames in a live-video processing method provided by an embodiment of the present application;
FIG. 3 shows a flowchart of calculating the luminance value of each image block in a live-video processing method provided by an embodiment of the present application;
FIG. 4 shows a schematic structural diagram of a live-video processing device provided by an embodiment of the present application;
FIG. 5 is a schematic block diagram of a live-video processing apparatus provided by an embodiment of the present application.
Detailed Description of the Embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments of the present application, as generally described and illustrated in the drawings here, may be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments of the present application provided in the drawings is not intended to limit the claimed scope of the present application but merely represents selected embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
Considering that in the prior art, during a live broadcast, the anchor may leave the image-collection device used to collect the video data stream or remain motionless, for example while sleeping, so that viewers and the anchor cannot interact, and that processing the current video in the prior-art way then easily produces a large amount of redundant data and wasted bandwidth, the embodiments of the present application provide a live-video processing method, device, and apparatus, described below through embodiments.
Embodiment 1
An embodiment of the present application provides a live-video processing method. As shown in FIG. 1, processing live video with this method includes steps S110 to S150, as follows.
S110: obtain the video data stream collected by a camera, the video data stream including a plurality of video frames.
The video data stream may be collected by an image-collection device such as a camera; the embodiments of the present application take a camera as the example. The camera may be one built into a device such as a mobile phone or computer, or a separately installed camera used for live video; this is not a limitation in this embodiment.
The camera collects the video data stream in real time: once the camera is turned on, it starts collecting and outputs the collected stream to the processor that executes the processing method of the embodiments of the present application. The camera transmits the collected video data stream in units of video frames, so the obtained video data stream includes a plurality of video frames.
S120: divide each video frame into a plurality of image blocks in the same layout.
After the video data stream collected by the camera is obtained, every video frame in the stream is divided into a plurality of image blocks. In the embodiments of the present application, the division of video frames covers two cases:
First case: periodically determine the division layout of the video frames; divide each video frame into a plurality of image blocks according to that layout.
Here, periodically determining the division layout means that the layout is determined once in each detection period, and it may be determined from one video frame of the video data stream received in that period; all other video frames received in the period are divided into image blocks according to the layout determined for that period. In the embodiments of the present application, the layout is preferably determined from the first video frame received in the period.
In the embodiments of the present application, as shown in FIG. 2, periodically determining the division layout of the video frames includes steps S210 and S220, as follows.
S210: after a still-frame detection period starts, extract the luminance signal of one of the video frames in the current detection period.
In the embodiments of the present application, still-frame detection is performed on the video data stream periodically. The length of each detection period may be preset, for example 5 seconds, 8 seconds, or 10 seconds per period; the specific length can be set for the actual application scenario and is not limited by the embodiments of the present application.
The obtained video data stream is in YUV (luminance-chrominance) format, where Y is the luminance signal and the Y data is stored contiguously, with one byte of data per pixel of the video frame. When extracting the luminance signal of a video frame, according to the resolution of the collected video data stream, which can be expressed as width × height, width times height bytes of data are copied from each video frame of the stream; this data is the extracted luminance signal of the frame.
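The copy of width × height luminance bytes described above can be sketched as follows. The function name is hypothetical, and a planar YUV layout with the Y plane stored first, one byte per pixel, is taken for granted, as the text assumes.

```python
def extract_luma(frame_bytes, width, height):
    """Return the luminance (Y) plane of one YUV frame.

    Assumes a planar YUV layout (e.g. I420) in which the Y plane comes
    first, one byte per pixel, so the first width*height bytes of the
    frame are exactly the luminance signal.
    """
    expected = width * height
    if len(frame_bytes) < expected:
        raise ValueError("frame shorter than one Y plane")
    return frame_bytes[:expected]  # copy out width*height bytes of Y data
```

For a 1280×720 stream this copies 921,600 bytes per frame, leaving the chrominance planes untouched.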
S220: divide the video frame whose luminance signal was extracted into a plurality of regions according to a first preset size, calculate the average luminance of the pixels in each region, and merge at least two adjacent regions whose pixel luminance averages are equal into one region, obtaining the division layout of the video frames.
The first preset size may be 64×64 pixels or another size; it can be chosen for the actual application scenario, and the embodiments of the present application do not limit it. Preferably, the size of a video frame is an integer multiple of the first preset size.
According to the first preset size, one video frame received in the detection period is divided into a plurality of regions; preferably, each region is equal in size to the first preset size.
The average luminance of the pixels in each region is calculated by first summing the luminance values of all pixels in the region and then dividing that sum by the number of pixels in the region. It can be understood that the luminance value of each pixel is determined from its luminance signal.
The average pixel luminance of each region obtained by dividing the first video frame is compared with the average pixel luminance of its adjacent regions; if the averages are equal, the regions are merged into one. In this way the first video frame is divided into a number of larger regions; each larger region becomes an image block, and the divided frame is taken as the division layout for the current detection period. Preferably, each region's average pixel luminance is compared with those of its four adjacent regions.
In the embodiments of the present application, the layout is determined once per period. Therefore, when a video frame is obtained, it is first determined whether it is the frame used to determine the layout in the current detection period, for example the first video frame. If it is, the layout for the current period is determined from it; if not, the frame is divided into a plurality of image blocks according to the layout already determined in the current period.
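Steps S210 and S220, tiling the frame by the first preset size, averaging the luminance per tile, and merging adjacent tiles with equal averages, can be sketched as a flood fill over the grid of tile means. Reading "merge adjacent equal regions" as connected components of equal integer means is one interpretation of the text; the function name and the integer rounding of the means are assumptions.

```python
import numpy as np

def division_layout(y_plane, region=64):
    """Label image blocks for the periodic division layout.

    Splits the Y plane into region x region tiles, computes each tile's
    mean luminance (rounded to int here, an assumption), then merges
    4-adjacent tiles with equal means into one block via flood fill.
    Returns a 2-D array with one block label per tile.
    """
    h, w = y_plane.shape
    rows, cols = h // region, w // region
    means = y_plane[:rows * region, :cols * region] \
        .reshape(rows, region, cols, region).mean(axis=(1, 3)).astype(int)
    labels = -np.ones((rows, cols), dtype=int)
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if labels[r, c] >= 0:
                continue
            stack = [(r, c)]  # flood fill over tiles with equal means
            labels[r, c] = next_label
            while stack:
                cr, cc = stack.pop()
                for nr, nc in ((cr - 1, cc), (cr + 1, cc),
                               (cr, cc - 1), (cr, cc + 1)):
                    if 0 <= nr < rows and 0 <= nc < cols \
                            and labels[nr, nc] < 0 \
                            and means[nr, nc] == means[cr, cc]:
                        labels[nr, nc] = next_label
                        stack.append((nr, nc))
            next_label += 1
    return labels
```

A frame with a flat background then collapses into a few large blocks, while a detailed frame keeps many small ones, which is the point of determining the layout per detection period.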
Second case: divide each video frame evenly into a plurality of image blocks according to a second preset size.
The specific value of the second preset size can be set for the actual application scenario; the embodiments of the present application do not limit it.
In the second case, every obtained video frame, whether it is the first frame of the current detection period or a later one, is divided evenly into a plurality of image blocks of equal size according to the second preset size.
S130: calculate the luminance value of each image block, and from those values calculate the differences in luminance value between image blocks at the same position in each video frame.
Whether a video frame is divided into image blocks according to the first case or the second case, once it has been divided, the luminance value of each image block must be calculated.
As shown in FIG. 3, calculating the luminance value of each image block includes steps S310 to S330, as follows.
S310: acquire the luminance value of each pixel in the image block.
S320: from the per-pixel luminance values, calculate the average luminance of the pixels in the image block.
In the embodiments of the present application, the sum of the luminance values of all pixels in each image block is calculated and divided by the number of pixels in the block, giving the average pixel luminance of the block.
S330: take the average pixel luminance of the image block as the luminance value of the image block.
The average pixel luminance calculated above is the luminance value of the block. Of course, in this embodiment the sum of the pixel values of all pixels in the block may also be used as the block's luminance value.
After the luminance value of each image block is determined, the differences in luminance value between image blocks at the same position in each video frame are calculated from the block luminance values, specifically:
calculating the differences in luminance value between the image blocks of each video frame in the current detection period and the image blocks at the same positions in the first video frame of the current detection period.
For example, if 20 video frames are obtained in the current detection period and each is divided into 10 image blocks, the differences between the luminance values of the 10 blocks of the second frame and those of the blocks at the same positions in the first frame are calculated, then likewise for the third frame, and so on up to the twentieth frame. Of course, 20 frames per period and 10 blocks per frame are only an example; neither the number of frames obtained in a period nor the number of blocks per frame is limited.
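The per-position difference-and-average computation of steps S130 and S140 can be sketched with NumPy, using the 20-frames by 10-blocks example above as the shape of the input; the function name is hypothetical.

```python
import numpy as np

def block_difference_averages(block_lumas):
    """Average per-position luminance differences within one period.

    block_lumas: array of shape (frames, blocks), one luminance value
    per image block per video frame. Every frame after the first is
    compared block-by-block with the first frame of the period, and
    the differences at each position are averaged.
    """
    lumas = np.asarray(block_lumas, dtype=float)
    diffs = lumas[1:] - lumas[0]   # shape (frames - 1, blocks)
    return diffs.mean(axis=0)      # one average per block position
```

A perfectly still period yields an all-zero result, which is exactly the condition the still-frame decision in step S140 counts.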
S140: determine the average of the differences corresponding to image blocks at the same position, and determine whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean.
After the differences between each frame's image blocks and the same-position blocks of the first frame of the period are calculated, the average of the differences at each position is computed; as many averages are obtained as there are image blocks per frame.
Determining whether the stream is a still-frame video stream from the number of blocks whose average has an absolute value greater than the preset mean includes:
counting, in each detection period, the number of image blocks whose average has an absolute value greater than the preset mean; comparing that number with a preset threshold; and, when the number is less than the threshold for a consecutive preset number of detection periods, determining that the video data stream is a still-frame video stream.
The absolute values of the averages obtained above are each compared with the preset mean, the number of image blocks whose average exceeds it is counted, and that count is compared with the first preset threshold. If the count is less than the first preset threshold, and within a consecutive preset number of detection periods the count stays below the first preset threshold, the video data stream is judged to be a still-frame data stream.
In this example the specific value of the preset mean is not limited; it is usually chosen small, for example a value less than 1 or less than 2. Preferably the preset mean is zero. It can be understood that, with a preset mean of zero, an average whose absolute value exceeds the preset mean is simply a non-zero average. In that case the averages are compared with zero, the number of blocks with non-zero averages is counted and compared with the first preset threshold, and if that count stays below the first preset threshold for a consecutive preset number of detection periods, the stream is judged to be a still-frame data stream.
The first preset threshold may be ten percent of the number of image blocks per frame, or another value; it can be set for the actual application scenario and is not limited by the embodiments of the present application.
The preset number of detection periods may be 5, 6, 8, or another value; it can be set for the actual application scenario and is not limited by the embodiments of the present application.
Of course, the number of blocks whose average has an absolute value less than or equal to the preset mean may instead be counted and compared with a second preset threshold; when that count is greater than or equal to the second threshold, and remains so within a consecutive preset number of detection periods, the stream is judged to be a still-frame data stream.
It can be understood that, with the preset mean preferably zero, this is the count of blocks whose average is zero; when that count is greater than or equal to the second threshold for a consecutive preset number of detection periods, the stream is judged to be a still-frame data stream.
The second preset threshold may be ninety percent of the number of image blocks per frame, or another value; it can be set for the actual application scenario and is not limited by the embodiments of the present application.
S150: after the video data stream is determined to be a still-frame video stream, perform at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
After the video data stream is determined to be a still-frame video stream, the bandwidth it occupies needs to be reduced. Specifically: after the determination, count the duration of the still-frame video stream; according to that duration, perform at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the video data stream step by step, and sending a cut-off message to the server.
Depending on the duration of the still-frame video stream, any one, any two, or all three of these measures may be taken.
In the embodiments of the present application, after the determination the duration of the still-frame video stream is counted, and when it reaches T1 the camera frame rate begins to be reduced step by step, so the number of frames the camera outputs decreases. Reducing the camera frame rate therefore cuts the amount of video to process at the source: it lowers the data-processing load on the one hand and reduces data redundancy on the other.
Under normal conditions the camera frame rate is generally 25 to 30 fps (frames per second), the number of frames output by the camera per second. When lowering it step by step, the camera frame rate can be set within 1 to 30 fps, so it can be reduced step by step within that range; of course, the range can be adjusted for the specific application scenario, and the example here does not limit it. The camera frame rate can be set through the API the camera provides.
Alternatively, when the duration of the still-frame video stream reaches T2, the key-frame interval in the video data stream is increased step by step. Because a key frame carries a large amount of information, its data size is tens or even hundreds of times that of other frame types, and key frames appear in the stream at regular intervals. Therefore, as the still-frame duration grows, increasing the key-frame interval step by step reduces the size of the whole video data stream and thus the bandwidth it occupies.
For example, under normal conditions the key-frame interval is 30, meaning one key frame in every 30 video frames. After stepwise increases, the maximum key-frame interval may grow to 100; other values may also be set, and the maximum adjustable interval is not limited here. It can be understood that in the H.264 video coding format, the I frame is the key frame.
The time T1 and the time T2 may be equal, T1 may be greater than T2, or T1 may be less than T2; the embodiments of the present application do not limit their specific relationship.
If the still-frame video stream keeps lasting, so that its duration far exceeds both T1 and T2, the push-stream link needs to be disconnected, that is, the audio-video data stream is no longer pushed out; at this point a cut-off message is sent to the server, which avoids wasting bandwidth.
In this embodiment, the cut-off message may be sent to the server when the still-frame video stream has lasted one hour; of course, exactly how long the still-frame stream must last before the cut-off message is sent is not a limitation of this embodiment.
Sending the cut-off message informs the server of the reason the live room's stream stopped and asks it to keep the live broadcast open even though no new data is being pushed. For viewer users who enter the live room after the cut-off and request audio-video data, the server may provide only the last cached still-frame image; for viewer users who entered before the cut-off, the server notifies each of them to disconnect the playback-data request but not to release the player, so the playback interface keeps showing the last frame.
When the video data stream is determined to be a non-still-frame video stream, it is detected whether the current push-stream link has been disconnected; if so, the push-stream link to the server is re-established.
When re-establishing the push-stream link, a reconnect message is sent to the server, telling it that the live room will resume pushing and should stay live with updated audio-video data. After receiving the reconnect message, the server gives the viewer users already in the room the new playback address, and their players reload the audio-video data.
After the push-stream link is re-established, the current camera frame rate and the key-frame interval in the video data stream are detected. If the camera frame rate has been modified, it is adjusted back to the frame rate used when the original video data stream was a non-still-frame video stream; if the key-frame interval has been modified, it is adjusted back to the interval used when the original video data stream was a non-still-frame video stream.
The live-video processing method provided by the embodiments of the present application performs still-frame detection on the obtained video data stream. When a still-frame video stream is detected, that is, when there is no interaction between the viewers and the anchor, at least one of the following is performed: reducing the camera frame rate step by step, increasing the key-frame interval step by step, and sending a cut-off message to the server. This avoids generating a large amount of redundant data, and wasting bandwidth, while there is no interaction between the viewers and the anchor.
Embodiment 2
An embodiment of the present application provides a live-video processing device. As shown in FIG. 4, the device includes an obtaining module 410, a dividing module 420, a calculating module 430, a determining module 440, and an executing module 450. The obtaining module 410 is configured to obtain the video data stream collected by a camera, the stream including a plurality of video frames; the dividing module 420 is configured to divide each video frame into a plurality of image blocks in the same layout; the calculating module 430 is configured to calculate a luminance value for each image block and, from those values, calculate the differences in luminance value between image blocks at the same position in each video frame; the determining module 440 is configured to determine the average of the differences corresponding to the image blocks at the same position and to determine whether the video data stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; the executing module 450 is configured to, after the stream is determined to be a still-frame video stream, perform at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the stream step by step, and sending a cut-off message to the server.
The dividing module 420 divides each video frame into image blocks in the same layout by means of a determining unit and a dividing unit: the determining unit is configured to periodically determine the division layout of the video frames, and the dividing unit is configured to divide each video frame into a plurality of image blocks according to that layout.
Specifically, the determining unit is configured to, after a still-frame detection period starts, extract the luminance signal of one of the video frames in the current detection period; divide that frame into a plurality of regions according to a first preset size; calculate the average luminance of the pixels in each region; and merge at least two adjacent regions whose pixel luminance averages are equal into one region, obtaining the division layout of the video frames. Preferably, the determining unit extracts the luminance signal of the first video frame of the current detection period and derives the layout from that frame in the same way.
Alternatively, the dividing module 420 may divide each video frame evenly into a plurality of image blocks according to a second preset size.
Further, the calculating module 430 includes: a luminance-value acquiring unit configured to acquire the luminance value of each pixel in an image block; a calculating unit configured to calculate, from the per-pixel values, the average luminance of the pixels in the block; and a luminance-value determining unit configured to take that average as the block's luminance value.
Alternatively, the calculating module 430 may be configured to acquire the luminance value of each pixel in the image block and use the sum of the pixel values in the block as its luminance value.
Further, the calculating module 430 calculates the differences in luminance value between the image blocks of each video frame in the current detection period and the image blocks at the same positions in the first video frame of the current detection period.
Further, the determining module 440 includes: a first statistics unit configured to count, in each detection period, the number of image blocks whose average has an absolute value greater than the preset mean; and a first comparison unit configured to compare that number with a first preset threshold and, when the number is less than the first preset threshold for a consecutive preset number of detection periods, determine that the video data stream is a still-frame video stream.
In addition, in this embodiment the determining module 440 may instead include: a second statistics unit configured to count, in each detection period, the number of image blocks whose average has an absolute value less than or equal to the preset mean; and a second comparison unit configured to compare that number with a second preset threshold and, when the number is greater than the second preset threshold for a consecutive preset number of detection periods, determine that the video data stream is a still-frame video stream.
Specifically, the executing module 450 is configured to, after the stream is determined to be a still-frame video stream, count the duration of the still-frame video stream and, according to that duration, perform at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the stream step by step, and sending a cut-off message to the server.
Further, the executing module 450 is also configured to, when the stream is determined to be a non-still-frame video stream, detect whether the current push-stream link has been disconnected and, if so, send a reconnect message to the server.
The live-video processing device provided by the embodiments of the present application performs still-frame detection on the obtained video data stream. When the stream is detected to be a still-frame video stream, that is, when there is no interaction between the viewers and the anchor, it performs at least one of reducing the acquisition frame rate of the stream step by step, increasing the key-frame interval step by step, and sending a cut-off message to the server, avoiding a large amount of redundant data and wasted bandwidth while there is no interaction between the viewers and the anchor.
Further, an embodiment of the present application also provides a live-video processing apparatus.
FIG. 5 is a schematic block diagram of a live-video processing apparatus 500 according to an embodiment of the present application.
As shown in FIG. 5, the live-video processing apparatus 500 provided by the embodiment of the present application includes a memory 501 and a processor 502.
The memory 501 is configured to store a program.
The processor 502 is configured to perform, by calling the program stored in the memory 501, a method comprising the following steps: obtaining a video data stream collected by a camera, the stream comprising a plurality of video frames; dividing each video frame into a plurality of image blocks in the same layout; calculating a luminance value of each image block and, from the luminance values, calculating the differences in luminance value between image blocks at the same position in each video frame; determining the average of the differences corresponding to the image blocks at the same position, and determining whether the stream is a still-frame video stream according to the number of image blocks whose average has an absolute value greater than a preset mean, or according to the number of image blocks whose average has an absolute value less than or equal to the preset mean; and, after the stream is determined to be a still-frame video stream, performing at least one of reducing the camera frame rate step by step, increasing the key-frame interval in the stream step by step, and sending a cut-off message to a server.
It should be noted that, for further description of the above method, refer to the detailed description in the foregoing method flow, which is not repeated here.
In this embodiment, the processor 502 runs the above program stored in the memory 501 to execute the various function applications and data processing, that is, to implement the live-video processing method of the embodiments of the present application. The memory 501 may include, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The processor 502 may, after receiving an execution instruction, execute the above program stored in the memory 501 and accordingly implement the method defined by the flow disclosed in any of the foregoing embodiments of the present application.
The processor 502 may be an integrated circuit chip with signal-processing capability. It may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. It can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, and so on.
It can be understood that the structure shown in FIG. 5 is merely illustrative; the processing apparatus 500 may include more or fewer components than shown in FIG. 5, or have a configuration different from that shown in FIG. 5. The components shown in FIG. 5 may be implemented in hardware, software, or a combination of the two.
The modules and units of the live-video processing device in the foregoing embodiments may be implemented by software code, in which case they may be stored in the memory 501 of the processing apparatus 500. They may likewise be implemented by hardware such as an integrated circuit chip.
The live-video processing device provided by the embodiments of the present application may be specific hardware on an apparatus, or software or firmware installed on an apparatus. The implementation principle and technical effects of the device provided by the embodiments of the present application are the same as those of the foregoing method embodiments; for brevity, where the device embodiments are silent, refer to the corresponding content of the foregoing method embodiments. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the system, device, and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the embodiments provided in the present application, it should be understood that the disclosed device and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and in actual implementation there may be other ways of dividing them; as another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments provided in the present application may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the various embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing it need not be further defined or explained in subsequent drawings. In addition, the terms "first", "second", "third", and the like are used only to distinguish the descriptions and cannot be understood as indicating or implying relative importance.
Finally, it should be noted that the embodiments described above are only specific implementations of the present application, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present application is not limited to them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that anyone familiar with the art may, within the technical scope disclosed in the present application, still modify the technical solutions recorded in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not remove the essence of the corresponding technical solutions from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

  1. A live-video processing method, characterized in that the method comprises:
    acquiring a video data stream captured by a camera, the video data stream comprising a plurality of video frames;
    dividing each of the video frames into a plurality of image blocks according to the same layout;
    calculating a luminance value of each of the image blocks, and calculating, from the luminance values, the differences between the luminance values of co-located image blocks across the video frames;
    determining an average value of the differences corresponding to each co-located image block, and determining whether the video data stream is a static-frame video stream according to the number of image blocks whose average values have an absolute value greater than a preset mean, or according to the number of image blocks whose average values have an absolute value less than or equal to the preset mean;
    when the video data stream is determined to be a static-frame video stream, performing at least one of: lowering the frame rate of the camera step by step, increasing the interval between key frames in the video data stream step by step, and sending a stream-stopped message to a server.
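For orientation, the block-difference test recited in claim 1 can be sketched in Python. This is one illustrative reading, not the patented implementation: frames are modeled as 2-D lists of luminance values, and `rows`, `cols`, `diff_mean_threshold` (standing in for the "preset mean"), and `max_changed_blocks` are placeholder parameters with no counterpart values in the patent.

```python
def block_means(frame, rows, cols):
    """Mean luminance of each image block when the frame is divided
    under a shared rows x cols grid (claim 1's 'same layout')."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // rows, w // cols
    means = []
    for r in range(rows):
        for c in range(cols):
            total = sum(frame[y][x]
                        for y in range(r * bh, (r + 1) * bh)
                        for x in range(c * bw, (c + 1) * bw))
            means.append(total / (bh * bw))
    return means


def is_static_stream(frames, rows=4, cols=4,
                     diff_mean_threshold=2.0, max_changed_blocks=1):
    """Judge a window of frames static: average, per block position, the
    luminance differences of each later frame against the first frame,
    then count the blocks whose average exceeds the preset mean."""
    ref = block_means(frames[0], rows, cols)
    n = rows * cols
    sums = [0.0] * n
    for frame in frames[1:]:
        means = block_means(frame, rows, cols)
        for i in range(n):
            sums[i] += means[i] - ref[i]
    avg_diff = [s / (len(frames) - 1) for s in sums]
    changed = sum(1 for d in avg_diff if abs(d) > diff_mean_threshold)
    return changed < max_changed_blocks
```

On this reading, a frozen capture drives the changed-block count to zero while live motion keeps it above the cutoff, which is what triggers the frame-rate, key-frame-interval, and stream-stopped responses.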
  2. The method according to claim 1, characterized in that dividing each of the video frames into a plurality of image blocks according to the same layout comprises:
    periodically determining a division layout of the video frames;
    dividing each of the video frames into a plurality of image blocks according to the division layout.
  3. The method according to claim 2, characterized in that periodically determining the division layout of the video frames comprises:
    after a static-frame detection period starts, extracting the luminance signal of one of the video frames within the current detection period;
    dividing that video frame, from which the luminance signal was extracted, into a plurality of regions according to a first preset size, calculating the mean luminance of the pixels of each region, and merging at least two adjacent regions whose pixel luminance means are equal into one region, to obtain the division layout of the video frames.
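One way to realize the merge step of claim 3 is a row-wise run merge: split the frame into regions of the first preset size, then collapse horizontally adjacent regions with equal pixel-luminance means. This is a deliberate simplification — the claim fixes neither a merge order nor a direction — and `region_h`/`region_w` (the "first preset size") are placeholders.

```python
def derive_layout(frame, region_h, region_w):
    """Derive a division layout from one frame of the detection period:
    fixed-size regions, with horizontally adjacent equal-mean regions
    merged into a single wider region."""
    h, w = len(frame), len(frame[0])
    layout = []  # each region: (top, left, height, width)
    for top in range(0, h, region_h):
        height = min(region_h, h - top)
        run_left, run_mean = None, None
        for left in range(0, w, region_w):
            width = min(region_w, w - left)
            pixels = [frame[y][x]
                      for y in range(top, top + height)
                      for x in range(left, left + width)]
            mean = sum(pixels) / len(pixels)
            if run_mean is None:
                run_left, run_mean = left, mean
            elif mean != run_mean:
                # Close the current run of equal-mean regions.
                layout.append((top, run_left, height, left - run_left))
                run_left, run_mean = left, mean
        layout.append((top, run_left, height, w - run_left))
    return layout
```

A flat background thus collapses to a few wide blocks, so the later per-block differencing does less work on frames with large uniform areas.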
  4. The method according to claim 2, characterized in that periodically determining the division layout of the video frames comprises:
    after a static-frame detection period starts, extracting the luminance signal of the first video frame within the current detection period;
    dividing the first video frame into a plurality of regions according to a first preset size, calculating the mean luminance of the pixels of each region, and merging at least two adjacent regions whose pixel luminance means are equal into one region, to obtain the division layout of the video frames.
  5. The method according to any one of claims 1 to 4, characterized in that dividing each of the video frames into a plurality of image blocks according to the same layout comprises:
    evenly dividing each of the video frames into a plurality of image blocks according to a second preset size.
  6. The method according to any one of claims 1 to 5, characterized in that calculating the luminance value of each of the image blocks comprises:
    obtaining the luminance value of each pixel in the image block;
    calculating, from the luminance values of the pixels, the mean luminance of the pixels of the image block;
    determining the mean luminance of the pixels of the image block as the luminance value of the image block.
  7. The method according to any one of claims 1 to 5, characterized in that calculating the luminance value of each of the image blocks comprises:
    obtaining the luminance value of each pixel in the image block;
    calculating the sum of the pixel values of the pixels in the image block as the luminance value of the image block.
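Claims 6 and 7 give two interchangeable definitions of a block's luminance value, which can be transcribed directly (pixel values modeled as plain integers):

```python
def block_luminance_mean(block):
    """Claim 6 reading: the block's luminance value is the mean of its
    pixels' luminance values."""
    pixels = [p for row in block for p in row]
    return sum(pixels) / len(pixels)


def block_luminance_sum(block):
    """Claim 7 reading: the block's luminance value is the raw sum of
    its pixel values."""
    return sum(p for row in block for p in row)
```

The sum variant saves a division per block and, for equally sized blocks, orders blocks the same way as the mean.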
  8. The method according to any one of claims 1 to 7, characterized in that calculating, from the luminance values, the differences between the luminance values of co-located image blocks across the video frames comprises:
    calculating, for each video frame within the current detection period, the differences between the luminance values of its image blocks and those of the co-located image blocks of the first video frame within the current detection period.
  9. The method according to any one of claims 1 to 8, characterized in that determining whether the video data stream is a static-frame video stream according to the number of image blocks whose average values have an absolute value greater than the preset mean comprises:
    counting, within each detection period, the number of image blocks whose average values have an absolute value greater than the preset mean;
    comparing the number with a first preset threshold, and determining that the video data stream is a static-frame video stream when the number is below the first preset threshold for a preset number of consecutive detection periods.
  10. The method according to any one of claims 1 to 9, characterized in that determining whether the video data stream is a static-frame video stream according to the number of image blocks whose average values have an absolute value less than or equal to the preset mean comprises:
    counting, within each detection period, the number of image blocks whose average values have an absolute value less than or equal to the preset mean;
    comparing the number with a second preset threshold, and determining that the video data stream is a static-frame video stream when the number is above the second preset threshold for a preset number of consecutive detection periods.
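The counting criteria of claims 9 and 10 are mirror images of one another; both can be sketched as a run-length check over per-period counts. The `consecutive` argument stands in for the "preset number" of periods, which the claims leave open.

```python
def static_by_changed_blocks(changed_counts, first_threshold, consecutive):
    """Claim 9 reading: changed_counts[i] is the number of blocks in
    detection period i whose average difference exceeds the preset mean
    in absolute value. Static once that count stays below the first
    preset threshold for `consecutive` periods in a row."""
    run = 0
    for count in changed_counts:
        run = run + 1 if count < first_threshold else 0
        if run >= consecutive:
            return True
    return False


def static_by_unchanged_blocks(unchanged_counts, second_threshold, consecutive):
    """Claim 10 reading (the mirror image): static once the count of
    blocks at or below the preset mean stays above the second preset
    threshold for `consecutive` periods in a row."""
    run = 0
    for count in unchanged_counts:
        run = run + 1 if count > second_threshold else 0
        if run >= consecutive:
            return True
    return False
```

Requiring several consecutive periods keeps a single noisy period (a dropped frame, a brief exposure change) from toggling the static-frame decision.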
  11. The method according to any one of claims 1 to 10, characterized in that, when the video data stream is determined to be a static-frame video stream, performing at least one of lowering the frame rate of the camera step by step, increasing the interval between key frames in the video data stream step by step, and sending a stream-stopped message to a server comprises:
    when the video data stream is determined to be a static-frame video stream, counting the duration of the static-frame video stream;
    according to the duration, performing at least one of: lowering the frame rate of the camera step by step, increasing the interval between key frames in the video data stream step by step, and sending a stream-stopped message to a server.
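Claim 11 grades the response by how long the stream has stayed static. The sketch below uses invented 5/15/60-second boundaries and a fixed escalation order — the patent specifies neither the durations nor an order beyond "at least one of" the three actions, so the schedule is purely illustrative.

```python
def actions_for_static_duration(seconds,
                                schedule=((5, "lower_camera_frame_rate"),
                                          (15, "increase_keyframe_interval"),
                                          (60, "send_stream_stopped_message"))):
    """Map the measured static duration to escalating actions; each
    (threshold_seconds, action) pair fires once the duration reaches
    its threshold."""
    return [action for threshold, action in schedule if seconds >= threshold]
```

A short freeze thus costs only camera frame rate, while a long one widens key-frame spacing (saving uplink bandwidth) and finally notifies the server that the stream has stopped.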
  12. The method according to any one of claims 1 to 11, characterized in that the method further comprises:
    when the video data stream is determined not to be a static-frame video stream, detecting whether the current stream-push connection has been broken and, if so, sending a reconnect message to the server.
  13. A live-video processing apparatus, characterized in that the apparatus comprises:
    an acquisition module, configured to acquire a video data stream captured by a camera, the video data stream comprising a plurality of video frames;
    a division module, configured to divide each of the video frames into a plurality of image blocks according to the same layout;
    a calculation module, configured to calculate a luminance value of each of the image blocks and calculate, from the luminance values, the differences between the luminance values of co-located image blocks across the video frames;
    a determination module, configured to determine an average value of the differences corresponding to each co-located image block, and determine whether the video data stream is a static-frame video stream according to the number of image blocks whose average values have an absolute value greater than a preset mean, or according to the number of image blocks whose average values have an absolute value less than or equal to the preset mean;
    an execution module, configured to, when the video data stream is determined to be a static-frame video stream, perform at least one of: lowering the frame rate of the camera step by step, increasing the interval between key frames in the video data stream step by step, and sending a stream-stopped message to a server.
  14. The apparatus according to claim 13, characterized in that the division module comprises:
    a determination unit, configured to periodically determine a division layout of the video frames;
    a division unit, configured to divide each of the video frames into a plurality of image blocks according to the division layout.
  15. A live-video processing device, characterized by comprising:
    a memory, configured to store a program; and
    a processor, configured to, by invoking the program stored in the memory, perform a method comprising the following steps:
    acquiring a video data stream captured by a camera, the video data stream comprising a plurality of video frames; dividing each of the video frames into a plurality of image blocks according to the same layout; calculating a luminance value of each of the image blocks, and calculating, from the luminance values, the differences between the luminance values of co-located image blocks across the video frames; determining an average value of the differences corresponding to each co-located image block, and determining whether the video data stream is a static-frame video stream according to the number of image blocks whose average values have an absolute value greater than a preset mean, or according to the number of image blocks whose average values have an absolute value less than or equal to the preset mean; and, when the video data stream is determined to be a static-frame video stream, performing at least one of: lowering the frame rate of the camera step by step, increasing the interval between key frames in the video data stream step by step, and sending a stream-stopped message to a server.
  16. A computer-readable medium having non-volatile program code executable by a processor, characterized in that the program code causes the processor to perform the method according to any one of claims 1 to 12.
PCT/CN2017/079622 2016-10-31 2017-04-06 Live-video processing method, apparatus and device, and computer-readable medium WO2018076614A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610929382.5A CN106412626B (zh) 2016-10-31 2016-10-31 Live-video processing method and device
CN201610929382.5 2016-10-31

Publications (1)

Publication Number Publication Date
WO2018076614A1 true WO2018076614A1 (zh) 2018-05-03

Family

ID=58012825

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/079622 WO2018076614A1 (zh) 2016-10-31 2017-04-06 Live-video processing method, apparatus and device, and computer-readable medium

Country Status (2)

Country Link
CN (1) CN106412626B (zh)
WO (1) WO2018076614A1 (zh)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109147043A (zh) * 2018-09-30 2019-01-04 Oppo广东移动通信有限公司 一种数据处理方法、服务器及计算机存储介质
CN110166780A (zh) * 2018-06-06 2019-08-23 腾讯科技(深圳)有限公司 视频的码率控制方法、转码处理方法、装置和机器设备
CN111091118A (zh) * 2019-12-31 2020-05-01 北京奇艺世纪科技有限公司 图像的识别方法、装置及电子设备和存储介质
CN111223058A (zh) * 2019-12-27 2020-06-02 杭州雄迈集成电路技术股份有限公司 一种图像增强方法
CN112115295A (zh) * 2020-08-27 2020-12-22 广州华多网络科技有限公司 视频图像检测方法、装置、及电子设备
CN112752120A (zh) * 2019-10-31 2021-05-04 深圳市中兴微电子技术有限公司 像素检测方法及装置、像素判断方法及装置
CN113365101A (zh) * 2020-03-05 2021-09-07 腾讯科技(深圳)有限公司 对视频进行多任务处理的方法及相关设备
CN113873097A (zh) * 2021-09-27 2021-12-31 北京紫光展锐通信技术有限公司 一种运动检测方法、装置、存储介质和电子设备
CN114332721A (zh) * 2021-12-31 2022-04-12 上海商汤临港智能科技有限公司 摄像装置遮挡检测方法、装置、电子设备及存储介质
CN114332738A (zh) * 2022-01-18 2022-04-12 浙江高信技术股份有限公司 一种用于智慧工地的安全帽检测系统
CN114549821A (zh) * 2022-01-14 2022-05-27 三一建筑机器人(西安)研究院有限公司 视觉模板生成、目标检测方法和装置及机器人系统
CN115103156A (zh) * 2022-06-10 2022-09-23 慧之安信息技术股份有限公司 一种动态的视频流传递方法
CN115396696A (zh) * 2022-08-22 2022-11-25 网易(杭州)网络有限公司 视频数据传输方法、系统、处理设备及存储介质
CN115529481A (zh) * 2021-06-25 2022-12-27 杭州海康威视数字技术股份有限公司 基于融合信号源的视频同步显示系统、方法及输入设备
CN117440190A (zh) * 2023-11-23 2024-01-23 北京视睿讯科技有限公司 一种分布式视频同步方法、系统、设备和介质
CN117560468A (zh) * 2023-11-10 2024-02-13 山东居安特消防科技有限公司 一种基于大数据的一体化消防器材生产监控系统
CN117596425A (zh) * 2023-10-24 2024-02-23 书行科技(北京)有限公司 编码帧率的确定方法、装置、电子设备及存储介质

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106412626B (zh) * 2016-10-31 2019-06-18 武汉斗鱼网络科技有限公司 一种直播视频的处理方法及装置
CN107622234B (zh) * 2017-09-12 2020-04-24 广州酷狗计算机科技有限公司 一种显示萌脸礼物的方法和装置
CN110753237B (zh) * 2019-11-05 2021-09-07 北京金和网络股份有限公司 节省流媒体服务器上行带宽流量的方法及装置
CN112788329A (zh) * 2020-12-24 2021-05-11 深圳创维-Rgb电子有限公司 视频静帧检测方法、装置、电视及存储介质
CN113573153B (zh) * 2021-02-02 2022-08-12 腾讯科技(深圳)有限公司 图像处理方法、装置及设备
CN113411569B (zh) * 2021-06-15 2022-08-12 北京百度网讯科技有限公司 检测静态画面的方法和装置
CN113965814B (zh) * 2021-08-30 2023-07-04 国网山东省电力公司信息通信公司 基于视频会议场景的多会场关键帧提取方法及系统
CN117255222A (zh) * 2023-11-20 2023-12-19 上海科江电子信息技术有限公司 一种数字电视监测的方法、系统及应用

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102547370A (zh) * 2011-11-01 2012-07-04 大连捷成实业发展有限公司 一种视频信号的黑场和静帧监测方法及系统
US20130002737A1 (en) * 2009-04-16 2013-01-03 Chunghwa Picture Tubes, Ltd. Driving circuit and gray insertion method of liquid crystal display
CN103281559A (zh) * 2013-05-31 2013-09-04 于京 视频质量检测的方法及系统
CN105578177A (zh) * 2015-12-15 2016-05-11 浙江广播电视集团 基于crc校验的视频静帧检测系统及方法
CN106412626A (zh) * 2016-10-31 2017-02-15 武汉斗鱼网络科技有限公司 一种直播视频的处理方法及装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102695028B (zh) * 2012-05-22 2015-01-21 广东威创视讯科技股份有限公司 视频图像动态降帧方法和系统
CN102946505B (zh) * 2012-11-22 2015-02-18 四川虹微技术有限公司 一种基于图像分块统计的自适应运动检测方法
CN103618911B (zh) * 2013-10-12 2017-02-01 北京视博云科技有限公司 一种基于视频属性信息的视频流提供方法及装置
US9407926B2 (en) * 2014-05-27 2016-08-02 Intel Corporation Block-based static region detection for video processing
CN104270561A (zh) * 2014-08-01 2015-01-07 Tcl通讯(宁波)有限公司 移动终端自动调节Camera帧率的方法及移动终端



Also Published As

Publication number Publication date
CN106412626A (zh) 2017-02-15
CN106412626B (zh) 2019-06-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17865988

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17865988

Country of ref document: EP

Kind code of ref document: A1