CN114051142A - Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium - Google Patents

Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium

Info

Publication number
CN114051142A
Authority
CN
China
Prior art keywords
image data
coding
buffer layer
data buffer
encoding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210035003.3A
Other languages
Chinese (zh)
Other versions
CN114051142B (en)
Inventor
周志文
刘明
朱宇翔
纪向晴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mapgoo Technology Co ltd
Original Assignee
Shenzhen Mapgoo Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mapgoo Technology Co ltd filed Critical Shenzhen Mapgoo Technology Co ltd
Priority to CN202210035003.3A priority Critical patent/CN114051142B/en
Publication of CN114051142A publication Critical patent/CN114051142A/en
Application granted granted Critical
Publication of CN114051142B publication Critical patent/CN114051142B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/156 - Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146 - Data rate or code amount at the encoder output
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N19/423 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation characterised by memory arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention discloses a hardware multi-channel coding anti-shake method, a device, an intelligent terminal and a storage medium, wherein the method comprises the following steps: calculating the coding interval time of each encoder in real time; when the coding interval time of one coding line exceeds a preset range, increasing the capacity of the image data buffer layers of all coding lines; when the coding interval time of every line returns to the preset range, increasing the coding rate of each encoder by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer; when the amount of image data in each image data buffer layer is detected to be smaller than the preset normal-coding buffer capacity, restoring the buffer capacity and the encoder coding rate to their preset normal-coding values; and outputting the image data encoded by each encoder in real time. The method solves the live-broadcast frame dropping and stuttering caused by hardware jitter, and improves the fluency of multi-channel live broadcasting on weak hardware.

Description

Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium
Technical Field
The invention relates to the technical field of video coding, in particular to a hardware multi-channel coding anti-shake method, a hardware multi-channel coding anti-shake device, an intelligent terminal and a storage medium.
Background
In the prior art, there is no anti-shake scheme for hardware multi-channel coding, so multi-channel live broadcasting on hardware with weak performance is prone to frame dropping and stuttering when the hardware jitters.
Disclosure of Invention
The invention mainly aims to provide a hardware multi-channel coding anti-shake method, a device, an intelligent terminal and a storage medium, aiming to solve the problem that the prior art lacks a hardware multi-channel coding anti-shake scheme, so that multi-channel live broadcasting on hardware with weak performance can achieve a smooth, stutter-free live broadcast effect.
In order to achieve the above object, a first aspect of the present invention provides a hardware multi-channel coding anti-shake method, wherein the method includes:
acquiring image data to be coded through a preset image data buffer layer;
calculating the coding interval time of each encoder in real time based on the encoded image data, wherein the coding interval time is the time interval between encoding two successive images;
when the coding interval time of one coding line is detected to exceed a preset normal value range, controlling and increasing the capacity of the image data buffer layers in all the coding lines;
when the coding interval time of every line is detected to have returned to the preset normal value range, increasing the coding rate of each encoder by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer;
when the amount of image data stored in each image data buffer layer is detected to be smaller than the capacity of the image data buffer layer during preset normal coding, restoring the capacity of the image data buffer layer and the coding rate of the encoder to their preset normal-coding values;
and outputting the image data coded by each encoder in real time.
Optionally, the step of obtaining the image data to be encoded through the preset image data buffer layer includes:
an image data buffer layer is constructed in advance and used for temporarily storing image data to be coded;
presetting a normal value range of the coding interval time for judging whether hardware shakes;
the capacity value of the image data buffer layer and the coding rate of the coder during normal coding are preset and are used for recovering the normal coding working state after the hardware jitter is finished.
Optionally, the step of calculating the encoding interval time of each encoder in real time based on the encoded image data includes:
calculating the coding interval time between every two images in real time, based on the time at which the encoder extracts the current image data from the image data buffer layer and the time at which it extracted the previous image data;
and calculating the coding interval time of all the coding lines in real time.
Optionally, the step of controlling to increase the capacity of the image data buffer layer in all the encoding lines when it is detected that the encoding interval time of one of the encoding lines exceeds a preset normal value range includes:
detecting, based on the coding interval time of each line calculated in real time, that the coding interval time of one of the lines exceeds the preset normal value range; and
increasing the capacity of the image data buffer layers in all the encoding lines under control.
Optionally, when it is detected that the coding interval time of each path is recovered to the preset normal value range, the step of controlling the coding rate of each path of encoder to be increased according to a predetermined ratio based on the image data amount accumulated in each path of image data buffer layer includes:
based on the increased capacity of the image data buffer layer, when the coding interval time of each path is detected to be restored to be within a preset normal value range;
acquiring the image data quantity accumulated in each image data buffer layer;
and increasing the coding rate of the corresponding line's encoder by 10% for every 2 frames of image data, according to the amount of image data accumulated.
Optionally, when it is detected that the amount of the image data stored in the image data buffer layer of each path is smaller than the capacity of the image data buffer layer during the preset normal encoding, the step of controlling the capacity of the image data buffer layer and the encoding rate of the encoder to be restored to the capacity during the preset normal encoding includes:
detecting, based on the raised coding rate of the encoders, that the amount of image data stored in each image data buffer layer is smaller than the capacity of the image data buffer layer during preset normal coding;
restoring the capacity of the corresponding line's image data buffer layer to the preset normal-coding capacity; and
restoring the coding rate of the corresponding line's encoder to the preset normal-coding rate.
Optionally, the multi-channel encoding includes encoding image data at different resolutions from one lens and/or encoding image data captured by multiple lenses.
The second aspect of the present invention provides a hardware multi-channel coding anti-shake apparatus, wherein the apparatus comprises:
the data buffer layer is used for temporarily storing image data to be coded, and the storage capacity of the data buffer layer can be changed;
the coding interval calculation layer is used for calculating the time interval between two images coded by the coder in real time;
an encoding layer for encoding the image data taken out from the data buffer layer;
the data output layer is used for outputting the coded image;
and the buffer control layer is used for controlling changes to the capacity of the data buffer layer and the coding rate of the encoder according to a weighted average of abnormal coding interval times.
A third aspect of the present invention provides an intelligent terminal, where the intelligent terminal includes a memory, a processor, and a hardware multi-channel coding anti-shake program stored in the memory and executable on the processor, and the hardware multi-channel coding anti-shake program, when executed by the processor, implements the steps of any one of the hardware multi-channel coding anti-shake methods.
A fourth aspect of the present invention provides a storage medium, where a hardware multi-channel coding anti-shake program is stored in the storage medium, and the hardware multi-channel coding anti-shake program, when executed by a processor, implements the steps of any one of the hardware multi-channel coding anti-shake methods.
Therefore, in the scheme of the invention, image data to be encoded is obtained through a preset image data buffer layer; the coding interval time of each encoder is calculated in real time based on the encoded image data, wherein the coding interval time is the time interval between encoding two successive images; when the coding interval time of one coding line is detected to exceed a preset normal value range, the capacity of the image data buffer layers in all coding lines is increased under control; when the coding interval time of every line is detected to have returned to the preset normal value range, the coding rate of each encoder is increased by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer; when the amount of image data stored in each image data buffer layer is detected to be smaller than the capacity of the image data buffer layer during preset normal coding, the capacity of the image data buffer layer and the coding rate of the encoder are restored to their preset normal-coding values; and the image data encoded by each encoder is output in real time. Compared with the prior art, the invention provides a hardware multi-channel coding anti-shake device: when one line is detected to have its coding rate reduced by hardware jitter during live broadcasting, the image data buffer layers of all lines are enlarged under control and the coding rate is raised, so that live-broadcast stuttering caused by lost image data is avoided and the fluency of multi-channel live broadcasting on weak hardware is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of a hardware multi-channel coding anti-shake method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a hardware multi-channel coding anti-shake apparatus according to an embodiment of the present invention.
Fig. 3 is a schematic flowchart of a specific implementation of step S200 in fig. 1 according to the present invention.
Fig. 4 is a schematic flowchart of a specific implementation of step S300 in fig. 1 according to the present invention.
Fig. 5 is a schematic flowchart of a specific implementation of step S400 in fig. 1 according to the present invention.
Fig. 6 is a schematic flowchart of a specific implementation of step S500 in fig. 1 according to the present invention.
Fig. 7 is a schematic block diagram of an internal structure of an intelligent terminal according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when …" or "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted depending on the context to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings of the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways than those specifically described and will be readily apparent to those of ordinary skill in the art without departing from the spirit of the present invention, and therefore the present invention is not limited to the specific embodiments disclosed below.
With the development of science and technology and the improvement of people's living standards, more and more people like to share their daily lives on social or video platforms, and even interact with audiences and friends in real time through live broadcasting. However, not every occasion allows live broadcasting with high-performance equipment such as a mobile phone or a computer; for example, one cannot hold up a mobile phone to broadcast live while exercising or driving.
However, when people broadcast live from an automobile data recorder while driving, or through a large-screen camera, jitter caused by insufficient device performance often leads to frame dropping and stuttering of the live picture. This is especially true when a device with weak hardware is broadcasting multiple channels at multiple resolutions or from multiple lenses, where several lines often become abnormal at the same time. The prior art offers no solution to the frame dropping and stuttering caused by hardware jitter when weak hardware performs multi-line live broadcasting.
In order to solve the problems in the prior art, in the scheme of the invention, image data to be encoded is obtained through a preset image data buffer layer; the coding interval time of each encoder is calculated in real time based on the encoded image data, wherein the coding interval time is the time interval between encoding two successive images; when the coding interval time of one coding line is detected to exceed a preset normal value range, the capacity of the image data buffer layers in all coding lines is increased under control; when the coding interval time of every line is detected to have returned to the preset normal value range, the coding rate of each encoder is increased by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer; when the amount of image data stored in each image data buffer layer is detected to be smaller than the capacity of the image data buffer layer during preset normal coding, the capacity of the image data buffer layer and the coding rate of the encoder are restored to their preset normal-coding values; and the image data encoded by each encoder is output in real time. Compared with the prior art, the invention provides a hardware multi-channel coding anti-shake device: when one line is detected to have its coding rate reduced by hardware jitter during live broadcasting, the image data buffer layers of all lines are enlarged under control and the coding rate is raised, so that live-broadcast stuttering caused by lost image data is avoided and the fluency of multi-channel live broadcasting on weak hardware is improved.
Exemplary method
As shown in fig. 1, an embodiment of the present invention provides a hardware multi-channel coding anti-shake method. Specifically, the method includes the following steps:
and step S100, acquiring image data to be coded through a preset image data buffer layer.
In this embodiment, the live broadcast system is provided in advance with an image data buffer layer for storing images to be encoded, and the images captured by the camera are stored in the image data buffer layer to wait for encoding in sequence. Compared with a live-broadcast encoding system without an image data buffer layer, frame dropping is much less likely to occur.
Step S200, calculating the coding interval time of each path of coder in real time based on the coded image data, wherein the coding interval time is the time interval between coding two images.
In this embodiment, during live broadcasting, the encoder in the live broadcast system sequentially takes images from the image data buffer layer for encoding, and the current coding interval time can then be calculated in real time as the interval between every two encoded images. The coding interval time thus directly reveals whether the hardware is currently jittering and the coding rate has noticeably slowed down.
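As a minimal, non-limiting sketch of this step (the class and function names below are illustrative assumptions, not the patent's interfaces), the coding interval of one line can be tracked by timestamping each extraction the encoder makes from its image data buffer layer:

#include <chrono>
#include <optional>

// Illustrative tracker for one encoding line: records the time of each
// frame extraction and reports the interval to the previous extraction.
class EncodeIntervalTracker {
public:
    using Clock = std::chrono::steady_clock;

    // Call whenever this line's encoder takes one image out of its image
    // data buffer layer; returns the coding interval, empty for the first frame.
    std::optional<std::chrono::milliseconds> on_frame_extracted() {
        const auto now = Clock::now();
        std::optional<std::chrono::milliseconds> interval;
        if (last_extraction_) {
            interval = std::chrono::duration_cast<std::chrono::milliseconds>(
                now - *last_extraction_);
        }
        last_extraction_ = now;
        return interval;
    }

private:
    std::optional<Clock::time_point> last_extraction_;
};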
And step S300, when the encoding interval time of one encoding line is detected to be beyond a preset normal value range, controlling to increase the capacity of the image data buffer layers in all the encoding lines.
In this embodiment, based on the coding interval time obtained in real time in the previous step, when the coding interval of one of the coding lines is detected to exceed the preset normal value range, it is very likely that hardware jitter will also reduce the coding rate of the other lines at the same time or shortly afterwards, so the capacity of the image data buffer layers in all coding lines is increased under control. The normal value range of the coding interval time is determined by the live frame rate: when the live frame rate is 25 frames, 25 images need to be played every second and the encoder likewise needs to encode 25 images per second, so the normal coding interval time can be set to 40 ± 10 ms. When the hardware jitters, the coding interval time may reach 200 ms or even 500 ms, encoded image data cannot be produced in time and the stream stutters, so the capacity of the image data buffer layers of all lines is increased under control. Therefore, when this step detects abnormal encoding caused by hardware jitter, increasing the amount of image data that can be temporarily stored prevents image data from being lost and avoids live-broadcast stuttering and frame loss. Furthermore, expanding the image data buffer layers of all lines as soon as one coding line becomes abnormal effectively improves the fault tolerance of the live broadcast system.
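The detection and expansion described above can be sketched as follows, assuming a 25-frame live broadcast with a normal interval of 40 ± 10 ms and an expanded capacity of at most 12 frames as in this embodiment; the structure and function names are illustrative assumptions:

#include <cstddef>
#include <vector>

struct LineState {
    std::size_t buffer_capacity = 4;  // 4 YUV frames during normal encoding
    long last_interval_ms = 40;       // latest coding interval of this line
};

constexpr long kNormalMinMs = 30;          // 40 ms - 10 ms
constexpr long kNormalMaxMs = 50;          // 40 ms + 10 ms
constexpr std::size_t kExpandedCapacity = 12;

// If any line's coding interval leaves the normal range (e.g. jumps to
// 200-500 ms under hardware jitter), enlarge the buffer layer of every line.
void check_and_expand(std::vector<LineState>& lines) {
    bool jitter_detected = false;
    for (const LineState& line : lines) {
        if (line.last_interval_ms < kNormalMinMs || line.last_interval_ms > kNormalMaxMs) {
            jitter_detected = true;
            break;
        }
    }
    if (jitter_detected) {
        for (LineState& line : lines) {
            line.buffer_capacity = kExpandedCapacity;
        }
    }
}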
Step S400, when the coding interval time of every line is detected to have returned to the preset normal value range, increasing the coding rate of each encoder by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer.
In this embodiment, when the hardware jitter of the device is over, the live broadcast system detects that the coding interval time of every channel has returned to the normal value range of 40 ± 10 ms, indicating that the encoders are working normally. The image data that accumulated in each image data buffer layer during the previous step is then encoded more quickly by raising the coding rate of the encoders. Specifically, how much the coding rate of each encoder is raised is determined by the number of images stored in its image data buffer layer, which avoids both the situation where too much buffered image data cannot be processed in a short time and the situation where little buffered image data is processed at an unnecessarily high coding rate and resources are wasted, thereby improving the coding stability of each coding line.
Step S500, when the amount of image data stored in each image data buffer layer is detected to be smaller than the capacity of the image data buffer layer during preset normal coding, restoring the capacity of the image data buffer layer and the coding rate of the encoder to their preset normal-coding values.
In this embodiment, based on the coding rate raised in the previous step, when it is detected that the amount of image data stored in each image data buffer layer has fallen back, thanks to the raised coding rate, to the buffer capacity used during normal coding, the coding state of the live broadcast system is the same as during normal live-broadcast coding; the preset data buffer layer capacity and the preset encoder coding rate are therefore restored under control, and the system waits for the next occurrence of hardware jitter.
And step S600, outputting the image data coded by each channel of coder in real time.
In this embodiment, whether the encoding is normal live encoding or anti-shake encoding triggered by hardware jitter, the live broadcast system outputs the encoded images at the live frame rate, so the user sees a smooth live broadcast in the live broadcast room.
It can be seen that, by providing a hardware multi-channel coding anti-shake device, this embodiment enlarges the image data buffer layers of all lines and raises the coding rate as soon as one line is detected to have its coding rate reduced by hardware jitter during live broadcasting, which prevents image data from being lost and the live broadcast from stuttering, and effectively improves the fluency of multi-channel live broadcasting on weak hardware.
Specifically, in this embodiment an automobile data recorder commonly used by users serves as the multi-channel live broadcast hardware and performs two-channel live broadcasting with a single lens at two different resolutions. When the device is of another type, or the multi-channel live broadcast uses different frame rates, multiple lenses, or a combination of these with multiple resolutions, the specific scheme of this embodiment may still be referred to.
Referring to fig. 2, the live broadcast system of the automobile data recorder is pre-constructed with the hardware two-channel coding anti-shake apparatus shown in fig. 2, which includes an image data buffer layer, an encoding interval calculation layer, an encoding layer, a data output layer, and a buffer control layer.
The image data buffer layer is used for receiving the YUV image data captured by the camera; the encoding interval calculation layer is used for calculating the interval time at which the encoder successively extracts YUV image data from the image data buffer layer; the encoding layer is used for extracting YUV image data from the image data buffer layer and encoding it into images for live output; the data output layer is used for outputting the encoded images in H.264 format to the live broadcast room address; and the buffer control layer controls the capacity of the image data buffer layer and the coding rate of the encoder according to the coding interval time calculated by the encoding interval calculation layer.
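The five layers named above can be pictured, for one encoding line, roughly as in the following sketch; the class names and fields are illustrative placeholders, since fig. 2 only names the layers and not their interfaces:

#include <cstddef>
#include <cstdint>
#include <deque>
#include <vector>

struct YuvFrame { std::vector<std::uint8_t> data; };  // one captured YUV image

struct ImageDataBufferLayer {   // temporarily stores YUV frames awaiting encoding
    std::deque<YuvFrame> frames;
    std::size_t capacity = 4;   // changeable by the buffer control layer
};

struct EncodingIntervalLayer {  // measures the interval between extractions
    long last_interval_ms = 40;
};

struct EncodingLayer {          // pulls frames from the buffer and encodes them
    double coding_rate_fps = 25.0;
};

struct DataOutputLayer {};      // sends H.264 output to the live broadcast room

struct BufferControlLayer {     // resizes buffers and retunes encoders; one instance supervises all lines
    std::size_t normal_capacity = 4;
    double normal_rate_fps = 25.0;
};

struct EncodingLine {           // one live broadcast channel
    ImageDataBufferLayer buffer;
    EncodingIntervalLayer interval;
    EncodingLayer encoder;
    DataOutputLayer output;
};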
In an application scenario, a live broadcast system acquires image data to be encoded through a preset image data buffer layer.
Specifically, the step of obtaining the image data to be encoded through the preset image data buffer layer includes:
an image data buffer layer is constructed in advance and used for temporarily storing image data to be coded;
presetting a normal value range of the coding interval time for judging whether hardware shakes;
the capacity value of the image data buffer layer and the coding rate of the coder during normal coding are preset and are used for recovering the normal coding working state after the hardware jitter is finished.
For example, an image data buffer layer is pre-constructed in the live broadcast system of the automobile data recorder for temporarily storing the YUV image data to be encoded, see fig. 2. A normal value range of the coding interval time is also preset for judging whether the hardware is jittering: for example, when the live frame rate is 25 frames the normal range can be set to 40 ± 10 ms, and when the live frame rate is 60 frames it can be set to 15 ± 5 ms. The capacity of the image data buffer layer during normal coding and the coding rate of the encoder are also preset, so that the normal coding state can be restored once the hardware jitter ends, which improves the stability of the live broadcast system. Specifically, the capacity of the image data buffer layer during normal coding can be set to 4 YUV images, and the coding rate of the encoder equals the live frame rate, i.e. 25 YUV images per second when a 25-frame live broadcast is selected.
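These preset values can be summarized in a small configuration sketch; only the 25-frame and 60-frame cases quoted above are covered, and the structure and function names are assumptions made for illustration:

#include <cstddef>

struct NormalEncodingConfig {
    int live_fps;                   // live broadcast frame rate
    long interval_center_ms;        // normal coding interval midpoint
    long interval_tolerance_ms;     // allowed deviation around the midpoint
    std::size_t buffer_capacity;    // YUV frames held during normal encoding
    double coding_rate_fps;         // equals the live frame rate
};

// Only the two frame rates quoted in the embodiment are handled here.
constexpr NormalEncodingConfig make_normal_config(int live_fps) {
    return NormalEncodingConfig{
        live_fps,
        live_fps == 60 ? 15 : 40,   // 15 +/- 5 ms at 60 fps, 40 +/- 10 ms at 25 fps
        live_fps == 60 ? 5 : 10,
        4,
        static_cast<double>(live_fps),
    };
}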
Further, when the automobile data recorder captures, through its lens, YUV image data of the scene in front of the vehicle, the live broadcast system feeds the YUV image data into the image data buffer layer for temporary storage.
In one application scenario, the live broadcast system calculates the encoding interval time of each encoder in real time based on encoded image data, wherein the encoding interval time is a time interval between encoding two images.
Specifically, as shown in fig. 3, the step S200 includes:
step S201, calculating in real time to obtain the coding interval time between every two images based on the time of the encoder for currently extracting the image data in the image data buffer layer and the time of extracting the previous image data;
step S202, calculating the coding interval time of all coding lines in real time.
For example, during live broadcasting, the encoding interval calculation layer records in real time the time at which the encoding layer extracts YUV image data this time and the time at which it extracted YUV image data last time, and calculates the interval between them. The coding interval time of both lines is calculated and obtained in real time in this way.
In one application scenario, when the live broadcast system detects that the coding interval time of one coding line exceeds a preset normal value range, the live broadcast system controls to increase the capacity of an image data buffer layer in all coding lines.
Specifically, as shown in fig. 4, the step S300 includes:
step S301, when the coding interval time of one of the lines is detected to exceed a preset normal value range based on the coding interval time of each line obtained by real-time calculation;
step S302, controlling and increasing the capacity of the image data buffer layer in all the coding lines.
For example, based on the coding interval time of the two coding lines calculated in real time, when the coding interval time of one of the two lines is detected to exceed the preset normal value range, the live broadcast system increases the capacity of the image data buffer layers in both coding lines under control. In this embodiment, taking a 25-frame live broadcast as an example, the normal value range of the coding interval time may be set to 40 ± 10 ms. Suppose the coding interval times of the two lines during normal coding are 44 ms and 39 ms; when the coding interval time of the first coding line suddenly becomes 400 ms at some moment, it is determined that hardware jitter has occurred in the automobile data recorder, and it is considered that the encoding of the other line may also become abnormal at the same time or within a very short time. The capacity of the image data buffer layers in both lines is therefore increased at the same time, from storing at most 4 YUV images to storing 8 or even 12. For example, in the embodiment of the present invention, under a 25-frame live broadcast the hardware jitter may cause an accumulation of 7-8 frames of YUV image data; adding the default storage capacity of 4 YUV images, the expanded capacity is set to store 12 YUV images. Preferably, to ensure stable operation of the live broadcast system, the capacity of the expanded image data buffer layer does not exceed 12. In extreme cases, image data exceeding the storage capacity may be selectively discarded.
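A sketch of such an expandable buffer layer, using the figures of this embodiment (default capacity 4, expansion capped at 12, overflow frames discarded only as a last resort), might look as follows; the class name and methods are illustrative assumptions:

#include <cstddef>
#include <cstdint>
#include <deque>
#include <utility>
#include <vector>

struct YuvFrame { std::vector<std::uint8_t> data; };

class ImageDataBuffer {
public:
    // Called by the buffer control layer: 4 frames normally, up to 12 under jitter.
    void set_capacity(std::size_t frames) { capacity_ = frames; }

    // Returns false when the frame has to be dropped because the buffer is
    // full even after expansion, which the embodiment allows only in extreme cases.
    bool push(YuvFrame frame) {
        if (queue_.size() >= capacity_) {
            return false;
        }
        queue_.push_back(std::move(frame));
        return true;
    }

    std::size_t backlog() const { return queue_.size(); }

private:
    std::deque<YuvFrame> queue_;
    std::size_t capacity_ = 4;
};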
Therefore, in this step, once abnormal coding is detected on one line, the image data buffer layers of all lines are enlarged under control, which effectively improves the live broadcast fluency of all lines and raises the fault tolerance.
In an application scenario, when the live broadcast system detects that the coding interval time of every line has returned to the preset normal value range, it increases the coding rate of each encoder by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer.
Specifically, as shown in fig. 5, the step S400 includes:
step S401, based on the increased capacity of the image data buffer layer, when the coding interval time of each path is detected to be recovered to a preset normal value range;
s402, acquiring the image data quantity accumulated in each image data buffer layer;
and S403, controlling to increase the coding rate of a corresponding line coder by 10% every time 2 frames of image data exist according to the quantity of the image data accumulated.
For example, working from the image data buffer layers that were expanded because of the jitter: when the hardware jitter of the automobile data recorder ends, i.e. the coding interval times of both coding lines are detected to have returned to the normal range of 40 ± 10 ms, the amount of YUV image data that accumulated in the two image data buffer layers because of the hardware jitter is obtained. Suppose the first line has stored 12 frames of YUV image data and the second line 8 frames. The coding rate of each encoder is then increased according to a preset rule based on the amount of accumulated YUV image data, so that the YUV image data backlogged by the jitter is encoded quickly. In this embodiment, the coding rate of a coding line is increased by 10% for every 2 frames of YUV image data in its image data buffer layer. The 12 frames of the first line therefore correspond to a 60% increase in coding rate, raising it from 25 frames per second to 38-39 frames per second; the 8 frames of the second line raise its rate to 33-34 frames per second, so the accumulated YUV image data can be processed within 1-2 seconds. By taking the amount of stored YUV image data as the reference for raising the coding rate, every coding line can be brought back to the normal coding state at roughly the same moment, which improves the stability of the live broadcast system's multi-channel coding.
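The boost rule stated above (10% of the normal coding rate for every 2 backlogged frames) can be written as the following arithmetic sketch; the function name is an assumption, and the resulting rates it yields (40 and 35 frames per second for backlogs of 12 and 8 frames) come out slightly higher than the approximate figures quoted above:

#include <cstddef>

// 10% of the normal coding rate for every complete pair of backlogged frames.
double boosted_rate_fps(double normal_rate_fps, std::size_t backlog_frames) {
    const double boost_percent = 10.0 * static_cast<double>(backlog_frames / 2);
    return normal_rate_fps * (1.0 + boost_percent / 100.0);
}

// Example: boosted_rate_fps(25.0, 12) yields 40.0 (a 60% boost) and
// boosted_rate_fps(25.0, 8) yields 35.0 (a 40% boost).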
In another embodiment, steps S300 and S400 may be executed simultaneously. Specifically, when hardware jitter is detected through the coding interval time, the capacity of the image data buffer layers in all coding lines is increased under control, and at the same time the coding rate of each line's encoder is raised based on the number of YUV images stored in that line in real time. In this way the accumulation of YUV image data caused by hardware jitter can be eliminated even faster, improving the anti-shake performance.
In an application scenario, when the live broadcast system detects that the amount of image data stored in each image data buffer layer is smaller than the capacity of the image data buffer layer during preset normal coding, it restores the capacity of the image data buffer layer and the coding rate of the encoder to their preset normal-coding values.
Specifically, as shown in fig. 6, the step S500 includes:
step S501, based on the improved encoder coding rate, when the quantity of the image data stored in each image data buffer layer is detected to be smaller than the capacity of the image data buffer layer during the preset normal encoding;
step S502, controlling the capacity of the corresponding line image data buffer layer to be restored to the capacity of the image data buffer layer when the normal coding is preset;
and step S503, controlling the coding rate of the corresponding line coder to be restored to the coding rate when the normal coding is preset.
For example, based on the coding rate of each encoder raised in the previous step, when the amount of image data stored in an image data buffer layer is detected to be smaller than the capacity used during preset normal coding, that is, smaller than 4 YUV images, the buffer capacity of that line is restored to 4 YUV images and the encoder coding rate is restored to 25 frames per second. The hardware anti-shake ends once the normal-coding buffer capacity and coding rate of both lines have been restored. It should be noted, however, that if the hardware jitters again during the above steps, a new round of hardware anti-shake can be carried out directly, without first restoring the normal-coding buffer capacity and coding rate.
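The restore step can be sketched as below, assuming the per-line state fields of the earlier sketches and the embodiment's figures (normal capacity 4 frames, normal rate 25 frames per second); the names are illustrative, and jitter that reappears first simply leaves the anti-shake settings in place:

#include <cstddef>

struct LineRecoveryState {
    std::size_t backlog_frames = 0;    // frames still waiting in the buffer layer
    std::size_t buffer_capacity = 12;  // currently expanded
    double coding_rate_fps = 40.0;     // currently boosted
};

constexpr std::size_t kNormalCapacity = 4;
constexpr double kNormalRateFps = 25.0;

// Restore a line once its backlog has drained below the normal capacity;
// if jitter is still present, keep the anti-shake settings and start over.
void maybe_restore(LineRecoveryState& line, bool jitter_still_present) {
    if (jitter_still_present) {
        return;
    }
    if (line.backlog_frames < kNormalCapacity) {
        line.buffer_capacity = kNormalCapacity;
        line.coding_rate_fps = kNormalRateFps;
    }
}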
In an application scene, the live broadcast system outputs image data encoded by each encoder in real time.
For example, the live broadcast system outputs the encoded data to the live broadcast address in H.264 format. Because no YUV image data is lost to hardware jitter during encoding, viewers do not experience stuttering or frame dropping when watching the live broadcast, which effectively improves the fluency of multi-channel live broadcasting on weak hardware and the user's live broadcast experience.
Exemplary device
Based on the hardware multi-channel coding anti-shake method and device of the above embodiments, the invention further provides an intelligent terminal, whose functional block diagram may be as shown in fig. 7. The intelligent terminal comprises a processor, a memory, a network interface and a display screen connected through a system bus. The processor of the intelligent terminal provides computing and control capability. The memory of the intelligent terminal comprises a nonvolatile storage medium and an internal memory. The nonvolatile storage medium stores an operating system and a hardware multi-channel coding anti-shake program. The internal memory provides an environment for running the operating system and the hardware multi-channel coding anti-shake program in the nonvolatile storage medium. The network interface of the intelligent terminal is used for connecting and communicating with external terminals through a network. When executed by the processor, the hardware multi-channel coding anti-shake program implements the steps of any one of the hardware multi-channel coding anti-shake methods. The display screen of the intelligent terminal may be a liquid crystal display screen or an electronic ink display screen.
It will be understood by those skilled in the art that the block diagram of fig. 7 is only a block diagram of a part of the structure related to the solution of the present invention, and does not constitute a limitation to the intelligent terminal to which the solution of the present invention is applied, and a specific intelligent terminal may include more or less components than those shown in the figure, or combine some components, or have different arrangements of components.
In one embodiment, an intelligent terminal is provided, where the intelligent terminal includes a memory, a processor, and a hardware multi-channel coding anti-shake program stored in the memory and executable on the processor, and the hardware multi-channel coding anti-shake program performs the following operations when executed by the processor:
acquiring image data to be coded through a preset image data buffer layer;
calculating the coding interval time of each encoder in real time based on the encoded image data, wherein the coding interval time is the time interval between encoding two successive images;
when the coding interval time of one coding line is detected to exceed a preset normal value range, controlling and increasing the capacity of the image data buffer layers in all the coding lines;
when the coding interval time of every line is detected to have returned to the preset normal value range, increasing the coding rate of each encoder by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer;
when the amount of image data stored in each image data buffer layer is detected to be smaller than the capacity of the image data buffer layer during preset normal coding, restoring the capacity of the image data buffer layer and the coding rate of the encoder to their preset normal-coding values;
and outputting the image data coded by each encoder in real time.
The embodiment of the invention further provides a storage medium on which a hardware multi-channel coding anti-shake program is stored, and the hardware multi-channel coding anti-shake program, when executed by a processor, implements the steps of any of the hardware multi-channel coding anti-shake methods provided by the embodiments of the invention.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art would appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the above modules or units is only one logical division, and the actual implementation may be implemented by another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The integrated modules/units described above may be stored in a storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the flow in the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a storage medium and executed by a processor, to instruct related hardware to implement the steps of the above-described embodiments of the method. The computer program includes computer program code, and the computer program code may be in a source code form, an object code form, an executable file or some intermediate form. The computer readable medium may include: any entity or device capable of carrying the above-mentioned computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signal, telecommunication signal, software distribution medium, etc. It should be noted that the contents contained in the storage medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in the jurisdiction.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, not for limiting them; although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be construed as being included therein.

Claims (10)

1. A hardware multi-channel coding anti-shake method, the method comprising:
acquiring image data to be coded through a preset image data buffer layer;
calculating the coding interval time of each encoder in real time based on the encoded image data, wherein the coding interval time is the time interval between encoding two successive images;
when the coding interval time of one coding line is detected to exceed a preset normal value range, controlling and increasing the capacity of the image data buffer layers in all the coding lines;
when the coding interval time of every line is detected to have returned to the preset normal value range, increasing the coding rate of each encoder by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer;
when the amount of image data stored in each image data buffer layer is detected to be smaller than the capacity of the image data buffer layer during preset normal coding, restoring the capacity of the image data buffer layer and the coding rate of the encoder to their preset normal-coding values;
and outputting the image data coded by each encoder in real time.
2. The hardware multi-channel coding anti-shake method according to claim 1, wherein the step of obtaining the image data to be encoded through a preset image data buffer layer comprises:
an image data buffer layer is constructed in advance and used for temporarily storing image data to be coded;
presetting a normal value range of the coding interval time for judging whether hardware shakes;
the capacity value of the image data buffer layer and the coding rate of the coder during normal coding are preset and are used for recovering the normal coding working state after the hardware jitter is finished.
3. The hardware multi-channel coding anti-shake method of claim 1, wherein the step of calculating the coding interval time of each encoder in real time based on the encoded image data comprises:
calculating the coding interval time between every two images in real time, based on the time at which the encoder extracts the current image data from the image data buffer layer and the time at which it extracted the previous image data;
and calculating the coding interval time of all the coding lines in real time.
4. The hardware multi-channel coding anti-shake method of claim 1, wherein the step of controlling the increase of the capacity of the image data buffer layers in all the coding lines when the coding interval time of one of the coding lines is detected to exceed a preset normal value range comprises:
detecting, based on the coding interval time of each line calculated in real time, that the coding interval time of one of the lines exceeds the preset normal value range; and
increasing the capacity of the image data buffer layers in all the encoding lines under control.
5. The hardware multi-channel coding anti-shake method as claimed in claim 4, wherein the step of increasing the coding rate of each encoder by a predetermined proportion based on the amount of image data accumulated in each image data buffer layer when detecting that the coding interval time of every line has returned to the preset normal value range comprises:
based on the increased capacity of the image data buffer layer, when the coding interval time of each path is detected to be restored to be within a preset normal value range;
acquiring the image data quantity accumulated in each image data buffer layer;
and increasing the coding rate of the corresponding line's encoder by 10% for every 2 frames of image data, according to the amount of image data accumulated.
6. The hardware multi-channel coding anti-shake method according to claim 5, wherein the step of restoring the capacity of the image data buffer layer and the coding rate of the encoder to their preset normal-coding values when detecting that the amount of image data stored in each image data buffer layer is smaller than the capacity of the image data buffer layer during preset normal coding comprises:
detecting, based on the raised coding rate of the encoders, that the amount of image data stored in each image data buffer layer is smaller than the capacity of the image data buffer layer during preset normal coding;
restoring the capacity of the corresponding line's image data buffer layer to the preset normal-coding capacity; and
restoring the coding rate of the corresponding line's encoder to the preset normal-coding rate.
7. The hardware multi-channel coding anti-shake method according to claim 1, wherein the multi-channel encoding comprises encoding image data at different resolutions from one lens and/or encoding image data captured by multiple lenses.
8. A hardware multi-channel coding anti-shake apparatus, the apparatus comprising:
the data buffer layer is used for temporarily storing image data to be coded, and the storage capacity of the data buffer layer can be changed;
the coding interval calculation layer is used for calculating the time interval between two images coded by the coder in real time;
an encoding layer for encoding the image data taken out from the data buffer layer;
the data output layer is used for outputting the coded image;
and the buffer control layer is used for controlling changes to the capacity of the data buffer layer and the coding rate of the encoder according to a weighted average of abnormal coding interval times.
9. An intelligent terminal, characterized in that the intelligent terminal comprises a memory, a processor and a hardware multi-channel coding anti-shake program stored in the memory and executable on the processor, wherein the hardware multi-channel coding anti-shake program, when executed by the processor, implements the steps of the hardware multi-channel coding anti-shake method according to any one of claims 1-7.
10. A storage medium on which a hardware multi-channel coding anti-shake program is stored, wherein the hardware multi-channel coding anti-shake program, when executed by a processor, implements the steps of the hardware multi-channel coding anti-shake method according to any one of claims 1-7.
CN202210035003.3A 2022-01-13 2022-01-13 Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium Active CN114051142B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210035003.3A CN114051142B (en) 2022-01-13 2022-01-13 Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210035003.3A CN114051142B (en) 2022-01-13 2022-01-13 Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium

Publications (2)

Publication Number Publication Date
CN114051142A true CN114051142A (en) 2022-02-15
CN114051142B CN114051142B (en) 2022-04-29

Family

ID=80196466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210035003.3A Active CN114051142B (en) 2022-01-13 2022-01-13 Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114051142B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5729303A (en) * 1995-02-23 1998-03-17 Hitachi, Ltd. Memory control system and picture decoder using the same
US20030174769A1 (en) * 2001-05-10 2003-09-18 Takefumi Nagumo Motion picture encoding apparatus
JP2005033599A (en) * 2003-07-08 2005-02-03 Sony Corp Coding device and coding method, and program
US20100272170A1 (en) * 2009-04-28 2010-10-28 Fujitsu Limited Image encoding apparatus, image encoding method and medium on which image encoding program is recorded
US20120051420A1 (en) * 2010-08-31 2012-03-01 General Instrument Corporation Method and apparatus for encoding
CN102378065A (en) * 2011-10-19 2012-03-14 江西省南城县网信电子有限公司 Method and system for configuring buffer area at streaming media server side based on MPEG (Moving Picture Experts Group)-4
US20120166670A1 (en) * 2010-12-28 2012-06-28 Yoshinobu Kure Transmitting apparatus, transmitting method, and program
CN103269433A (en) * 2013-04-28 2013-08-28 广东威创视讯科技股份有限公司 Video data transmission method and video data transmission system
JP2014155084A (en) * 2013-02-12 2014-08-25 Mitsubishi Electric Corp Image encoder
JP2016052000A (en) * 2014-08-29 2016-04-11 日本放送協会 Video transmitter
WO2018177165A1 (en) * 2017-03-30 2018-10-04 上海七牛信息技术有限公司 Method and system for optimizing quality network pushed stream
CN111385563A (en) * 2018-12-29 2020-07-07 广州市百果园信息技术有限公司 Hardware encoder detection method and device and terminal
CN113382265A (en) * 2021-05-19 2021-09-10 北京大学深圳研究生院 Hardware implementation method, apparatus, medium, and program product for video data entropy coding

Also Published As

Publication number Publication date
CN114051142B (en) 2022-04-29

Similar Documents

Publication Publication Date Title
CN111277779B (en) Video processing method and related device
CN111885305B (en) Preview picture processing method and device, storage medium and electronic equipment
US10097611B2 (en) Individual adjustment of audio and video properties in network conferencing
CN109413563B (en) Video sound effect processing method and related product
CN106303157B (en) Video noise reduction processing method and video noise reduction processing device
JP5190117B2 (en) System and method for generating photos with variable image quality
CN109118430B (en) Super-resolution image reconstruction method and device, electronic equipment and storage medium
CN112532880B (en) Video processing method and device, terminal equipment and storage medium
CN112399123B (en) Video definition adjusting method and device, electronic equipment and storage medium
CN108347580B (en) Method for processing video frame data and electronic equipment
CN113099272A (en) Video processing method and device, electronic equipment and storage medium
CN105578110B (en) A kind of video call method
CN111935442A (en) Information display method and device and electronic equipment
CN112637476A (en) Video recording method, device, terminal and computer readable storage medium
KR100719841B1 (en) Method for creation and indication of thumbnail view
CN115103210A (en) Information processing method, device, terminal and storage medium
CN110913118B (en) Video processing method, device and storage medium
CN114051142B (en) Hardware multi-channel coding anti-shake method and device, intelligent terminal and storage medium
US20190306462A1 (en) Image processing apparatus, videoconference system, image processing method, and recording medium
CN111444909A (en) Image data acquisition method, terminal device and medium
CN115834795A (en) Image processing method, device, equipment and computer readable storage medium
CN113395531B (en) Play switching method and device, electronic equipment and computer readable storage medium
CN113452915A (en) Camera switching method and electronic equipment
CN106254873B (en) Video coding method and video coding device
CN112291476B (en) Shooting anti-shake processing method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant