CN112004090A - Target boundary determining method, computer device and storage medium - Google Patents


Info

Publication number
CN112004090A
CN112004090A (Application CN202010674498.5A)
Authority
CN
China
Prior art keywords
boundary
code stream
target
stream information
image frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010674498.5A
Other languages
Chinese (zh)
Inventor
方瑞东
林聚财
殷俊
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202010674498.5A
Publication of CN112004090A
Legal status: Pending

Classifications

    • H04N 19/176 — Adaptive coding of digital video signals where the coding unit is an image region that is a block, e.g. a macroblock
    • G06T 5/70 — Image enhancement or restoration; denoising, smoothing
    • G06T 7/13 — Image analysis; segmentation; edge detection
    • G06T 7/136 — Segmentation; edge detection involving thresholding
    • G06T 7/194 — Segmentation involving foreground-background segmentation
    • H04N 19/124 — Adaptive coding; quantisation
    • H04N 19/136 — Adaptive coding controlled by incoming video signal characteristics or properties
    • H04N 19/184 — Adaptive coding where the coding unit is bits, e.g. of the compressed video stream
    • H04N 19/186 — Adaptive coding where the coding unit is a colour or a chrominance component
    • H04N 19/96 — Tree coding, e.g. quad-tree coding
    • G06T 2207/10016 — Indexing scheme for image analysis; image acquisition modality: video; image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application discloses a target boundary determining method, computer equipment and a storage medium, wherein the method comprises the following steps: acquiring code stream information of each macro block in a target image frame; selecting boundary macro blocks of which code stream information meets the requirement of a boundary code stream from a target image frame; and determining the boundary of the target object in the target image frame based on the selected boundary macro block. By means of the method, accuracy of obtaining the boundary of the target object can be improved, complexity is low, and consumed time is short.

Description

Target boundary determining method, computer device and storage medium
Technical Field
The present application relates to the field of encoding and decoding technologies, and in particular, to a target boundary determining method, a computer device, and a computer-readable storage medium.
Background
Boundary extraction of a target object refers to segmenting the target object from the background in an image in order to determine its position or to track it, and it is increasingly applied in scenarios such as the security and traffic fields. At present, target boundary extraction methods process the raw data of the image directly, for example by applying various boundary extraction algorithms to detect the target boundary in the raw image data; this entails a large amount of computation and long processing time, gives poor real-time performance, and the obtained target boundary may be inaccurate.
Disclosure of Invention
The application provides a target boundary determining method, a computer device and a storage medium, which can improve the accuracy of obtaining the boundary of a target object, and have low complexity and less time consumption.
In order to solve the technical problem, the application adopts a technical scheme that: an object boundary determination method is provided. The method comprises the following steps: acquiring code stream information of each macro block in a target image frame; selecting boundary macro blocks of which code stream information meets the requirement of a boundary code stream from a target image frame; and determining the boundary of the target object in the target image frame based on the selected boundary macro block.
In order to solve the above technical problem, another technical solution adopted by the present application is: a computer device is provided. The computer device includes: a processor and a memory for storing a computer program for execution by the processor to implement the above-mentioned target boundary determination method.
In order to solve the above technical problem, another technical solution adopted by the present application is: a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program that is executed by a processor to implement the above-described target boundary determination method.
The beneficial effect of this application is: code stream information of the macro blocks is obtained after the target image frame is encoded, and boundary macro blocks whose code stream information meets the boundary code stream requirement are selected, according to the characteristics of the code stream information of boundary macro blocks, to determine the boundary of the target object in the target image frame. Compared with analyzing the raw data of the image to determine the target boundary, directly determining the boundary of the target object from the code stream information of the macro blocks of the target image frame improves the accuracy of the obtained boundary while keeping the complexity low and the time consumption short.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic flow chart diagram illustrating a first embodiment of a target boundary determination method provided herein;
FIG. 2 is a schematic flow chart of step S120 of FIG. 1 provided herein;
FIG. 3 is a schematic flow chart of step S130 in FIG. 1 provided herein;
FIG. 4 is a schematic flowchart of a second embodiment of a target boundary determination method provided in the present application;
FIG. 5 is a schematic flowchart of a third embodiment of a target boundary determination method provided in the present application;
FIG. 6 is a schematic diagram of the structure of the target boundary determining apparatus provided in the present application;
FIG. 7 is a schematic diagram of a computer device provided herein;
fig. 8 is a schematic structural diagram of a computer-readable storage medium provided in the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and "third" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any indication of the number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise. All directional indications (such as up, down, left, right, front, and back) in the embodiments of the present application are only used to explain the relative positional relationship between the components, the movement, and the like in a specific posture (as shown in the drawings), and if the specific posture is changed, the directional indication is changed accordingly. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
The applicant of the present application has found, through long-term research, that in the prior art, when determining the boundary of a target object in an image, or of a moving target object in a video, the raw data of the image frames is generally analyzed directly and some boundary algorithm is used to find the boundary of the target object. Finding an accurate boundary of the target object in this way requires a large amount of time, and real-time requirements are generally difficult to meet.
In order to solve the above problems, the present application provides the following embodiments, and the embodiments of the present application are explained below.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a first embodiment of a target boundary determining method according to the present application, the method including the following steps:
s110: and acquiring code stream information of each macro block in the target image frame.
In the present application, a target image frame of an image or a video may be encoded to obtain the code stream information of the encoded image of the target image frame. The code stream type of the encoding may be standard or non-standard; for example, standard code stream types include H.264/H.265/H.266, SVAC1/SVAC2, AVS1/AVS2/AVS3, VP8/VP9/AV1, and the like, which is not limited in the present application.
The encoded image of the target image frame is divided into a plurality of macro blocks (Macro Block, MB for short) to obtain the code stream information of each macro block in the target image frame. The basic size of a macro block may be consistent with the basic coded macroblock size of the corresponding code stream type; for example, for the H.264 coding standard the coded macroblock size may be 16 × 16, and for the H.265 coding standard the coded macroblock size may be 64 × 64, which is not limited in this application.
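As a purely illustrative sketch (not part of the patent; the helper name and the rounding behavior at frame edges are assumptions), the macroblock grid of a frame follows from the frame size and the basic block size of the code stream type:

```python
def macroblock_grid(frame_width, frame_height, mb_size):
    """Number of macroblock (columns, rows) covering a frame, rounding up
    so partial blocks at the right/bottom edges are counted."""
    cols = (frame_width + mb_size - 1) // mb_size
    rows = (frame_height + mb_size - 1) // mb_size
    return cols, rows

# A 1920x1080 frame under H.264 (16x16 macroblocks) vs H.265 (64x64 CTUs):
print(macroblock_grid(1920, 1080, 16))  # (120, 68)
print(macroblock_grid(1920, 1080, 64))  # (30, 17)
```

The (w, h) macroblock numbering used in the formulas later in this description indexes positions within this grid.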
The code stream information of the macro block comprises one or a combination of a quantization parameter value, a residual value, a coded bit number, time domain texture complexity, space domain texture complexity, a maximum division depth and the number of contained sub blocks. Of course, it is understood that the code stream information of the macro block in the present application may also be other available code stream information, and the present application is not limited herein.
Specifically, a Quantization Parameter (QP) of the macroblock reflects a spatial detail compression condition of the target image frame, and if the QP value is small, most details of the target image frame are retained; if the QP value is increased, some details of the target image frame are lost, and the code rate is reduced, resulting in image distortion and quality degradation. In addition, due to the correlation of adjacent pixels in the same target image frame, the coding data is compressed by adopting an intra-frame prediction mode and an inter-frame prediction mode, and the difference value between the obtained predicted value and the actual sampling value is used as a residual value. In addition, the temporal texture complexity and the spatial texture complexity of the macroblock may be the complexity of texture information, texture features, and the like of the target image frame in both temporal and spatial domains.
The coded image is divided into a plurality of macro blocks, and the macro blocks of the target image frame can be further divided. For example, a macroblock may be a Coding Tree Unit (CTU), where a Coding Tree Unit is composed of a luminance Coding Tree Block (CTB) and two additional chrominance pixel blocks, the depth of the Coding Tree Unit may be the maximum partition depth of the macroblock, and the number of subblocks included in the Coding Tree Unit may be the number of subblocks included in the macroblock.
S120: and selecting boundary macro blocks of which the code stream information meets the requirement of the boundary code stream from the target image frame.
The code stream information comprises at least one code stream information capable of reflecting the difference between the boundary and the non-boundary of the target object in the target image frame. Due to the difference of code stream characteristics between the code stream information of the target object at the boundary and the code stream information of the non-boundary in the target image frame, the boundary and the non-boundary of the target object can be distinguished according to the code stream information.
In some embodiments, because of the consistency of the code stream characteristics at the boundary of the target object, the code stream information of the macro blocks at the boundary positions in the target image frame is similar. For some kinds of code stream information, the value for a macro block in the boundary area is smaller than that for a macro block in a non-boundary area, where the non-boundary area may be a flat area of the target object in the image frame; the macro blocks can therefore be partitioned by their code stream information to determine the boundary macro blocks of the target object. For example, when the code stream information is the quantization parameter, the quantization parameter value of a macro block in the boundary area is smaller than that of the non-boundary area, and the boundary macro blocks of the target object can be determined from the difference between the code stream information of macro blocks in the boundary area and that of macro blocks in the non-boundary area.
In some embodiments, the code stream information of a macro block in the boundary area is larger than that of the non-boundary area; for example, when the code stream information is any one of the residual value, the number of coded bits, the time domain texture complexity, the space domain texture complexity, the maximum division depth, or the number of contained subblocks, the boundary of the target object in the target image frame can be distinguished by the difference between the code stream information of the boundary area and that of the non-boundary area.
In some embodiments, the boundary code stream requirement is that at least one kind of code stream information of the boundary macro block satisfies: the code stream information falls in a target value region of the code stream information distribution of the target image frame, where the distribution is divided by value into two regions and either region may be the target value region. For example, the code stream information is divided by value into an upper value region (larger values) and a lower value region (smaller values) to distinguish the macro blocks of the boundary area: if the target value region is the upper value region, the macro blocks in the upper value region are determined to be boundary macro blocks, and if the target value region is the lower value region, the macro blocks in the lower value region are determined to be boundary macro blocks.
In some embodiments, referring to fig. 2 for the step S120, the step S120 includes the following steps:
s121: and taking at least two code stream information as target code stream information.
The code stream information of the macro block may include any at least two code stream information of a quantization parameter value, a residual value, a coded bit number, a time domain texture complexity, a spatial domain texture complexity, a maximum division depth, and a number of subblocks included therein as target code stream information, for example, the quantization parameter value and the coded bit number may be used as the target code stream information, or all the code stream information may be used as the target code stream information, which is not limited in this application.
S122: and traversing each macro block in the target image frame to detect whether each target code stream information of the macro block meets a preset condition.
The preset condition is that the difference between the maximum value of each kind of target code stream information in the target image frame and the corresponding target code stream information of the macro block is greater than, or less than, a reference threshold corresponding to that target code stream information. Because the difference between the code stream information of a macro block in the boundary area and the maximum of that code stream information over the target image frame is large, while the difference for a macro block in a flat area is small, each macro block in the target image frame is traversed to detect the macro blocks for which that difference is greater than or less than the corresponding reference threshold, and a macro block detected to meet the preset condition is determined as a boundary macro block.
When the target code stream information comprises the quantization parameter, the quantization parameter value of the macro block on the boundary area is smaller than that of the non-boundary area, the lower value area with a smaller value is the target value area, and the macro block in the lower value area is determined to be the boundary macro block. The preset condition may be that a difference between a maximum value of the target code stream information in the target image frame and the target code stream information of the macro block is greater than a reference threshold corresponding to the target code stream information.
When the target code stream information includes any one of residual value, coded bit number, time domain texture complexity, space domain texture complexity, maximum division depth or the number of contained subblocks, the code stream information of the macro block in the boundary region is larger than that of the code stream information of the non-boundary region, the upper value region of a larger value is a target value region, and the macro block in the upper value region is determined to be a boundary macro block. The preset condition may be that a difference value between a maximum value of target code stream information in the target image frame and the target code stream information of the macro block is smaller than a reference threshold corresponding to the target code stream information.
In some embodiments, the reference threshold corresponding to the target code stream information is the average of, or an intermediate value between, the maximum value and the minimum value of that target code stream information in the target image frame. Specifically, the average of the maximum and minimum values of the target code stream information in the target image frame may be used as the reference threshold. Alternatively, an intermediate value of the target code stream information in the target image frame may be used as the reference threshold. Of course, an appropriate reference threshold may also be obtained for the application scenes of different target image frames; the reference threshold is not limited in the present application.
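As an illustrative sketch (the function name is an assumption, not the patent's literal implementation), the reference threshold for one kind of target code stream information is the midpoint of its range over the frame's macroblocks:

```python
def reference_threshold(values):
    """Reference threshold TH = (max + min) / 2 over the per-macroblock
    values of one kind of target code stream information."""
    return (max(values) + min(values)) / 2

# Example: quantization parameter values of a frame's macroblocks.
qp_values = [22, 24, 30, 34, 38]
print(reference_threshold(qp_values))  # 30.0
```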
Step S122 will be described below by taking the reference threshold corresponding to the target code stream information as an average value between the maximum value and the minimum value of the target code stream information in the target image frame.
The target code stream information in the target image frame is acquired. For example, when the target code stream information includes the quantization parameter value of a macro block, each macro block in the target image frame is traversed to detect whether its quantization parameter value meets a first preset condition, where the first preset condition is:
|QPmax-Mqp(w,h)|>TH(qp);
where TH(qp) = (QPmax + QPmin)/2.
In the above formula, TH(qp) is the first reference threshold, QPmax and QPmin are respectively the maximum value and the minimum value of the quantization parameter values actually encoded in the target image frame, and Mqp(w, h) is the quantization parameter value of the current macroblock, where w represents the macroblock number in the horizontal direction and h represents the macroblock number in the vertical direction.
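The first preset condition can be sketched directly from the formula (a hypothetical helper using the patent's notation; the sample values are made up):

```python
def meets_first_condition(qp_max, qp_min, mqp):
    """First preset condition: |QPmax - Mqp(w, h)| > TH(qp),
    where TH(qp) = (QPmax + QPmin) / 2."""
    th_qp = (qp_max + qp_min) / 2
    return abs(qp_max - mqp) > th_qp

# A boundary macroblock has a small QP, so its distance from QPmax is large:
print(meets_first_condition(38, 22, 5))   # True  (|38 - 5| = 33 > 30)
print(meets_first_condition(38, 22, 36))  # False (|38 - 36| = 2 <= 30)
```

The second to fifth preset conditions have the same shape with the inequality reversed, since for those features the boundary macroblock's value lies near the frame maximum rather than far from it.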
For example, when the target code stream information includes a residual value of a macro block, traversing a current macro block in the target image frame to detect whether the residual value of the macro block meets a second preset condition, where the second preset condition is:
|RESmax-Mres(w,h)|<TH(res);
where TH(res) = (RESmax + RESmin)/2.
In the above formula, TH(res) is the second reference threshold, RESmax and RESmin are respectively the maximum value and the minimum value of the residual values of macroblock coding in the target image frame, and Mres(w, h) is the residual value of the current macroblock, where w represents the macroblock number in the horizontal direction and h represents the macroblock number in the vertical direction.
For example, when the target code stream information includes the bit number of the macroblock after being encoded, traversing the current macroblock in the target image frame to detect whether the bit number of the macroblock after being encoded meets a third preset condition, where the third preset condition is:
|BITmax-Mbit(w,h)|<TH(bit);
where TH(bit) = (BITmax + BITmin)/2.
In the above formula, TH(bit) is the third reference threshold, BITmax and BITmin are respectively the maximum value and the minimum value of the number of bits after macroblock coding in the target image frame, and Mbit(w, h) is the number of bits after coding of the current macroblock, where w represents the macroblock number in the horizontal direction and h represents the macroblock number in the vertical direction.
For example, when the target code stream information includes a macroblock time domain texture complexity or a spatial domain texture complexity, traversing a current macroblock in the target image frame to detect whether the macroblock texture complexity satisfies a fourth preset condition, where the fourth preset condition is:
|TEXmax-Mtex(w,h)|<TH(tex);
where TH(tex) = (TEXmax + TEXmin)/2.
In the above formula, TH(tex) is the fourth reference threshold, TEXmax and TEXmin are respectively the maximum value and the minimum value of the texture complexity of the macroblocks in the target image frame, and Mtex(w, h) is the texture complexity of the current macroblock, where w represents the macroblock number in the horizontal direction and h represents the macroblock number in the vertical direction.
For example, when the target code stream information includes the number of subblocks included in the macroblock, traversing the current macroblock in the target image frame to detect whether the number of subblocks included in the macroblock satisfies a fifth preset condition, where the fifth preset condition is:
|SUBmax-Msub(w,h)|<TH(sub);
where TH(sub) = (SUBmax + SUBmin)/2.
In the above formula, TH(sub) is the fifth reference threshold, SUBmax and SUBmin are respectively the maximum value and the minimum value of the number of subblocks contained in the macroblocks in the target image frame, and Msub(w, h) is the number of subblocks contained in the current macroblock, where w represents the macroblock number in the horizontal direction and h represents the macroblock number in the vertical direction.
If the quantization parameter value and the number of coded bits of a macro block are used as the target code stream information, each macro block in the target image frame is traversed to detect whether its quantization parameter value and number of coded bits simultaneously satisfy the first preset condition and the third preset condition. If the quantization parameter value, the residual value, the number of coded bits, the texture complexity, and the number of contained sub-blocks of a macro block are used as the target code stream information, each macro block in the target image frame is traversed to detect whether its target code stream information simultaneously satisfies the first, second, third, fourth, and fifth preset conditions. In some embodiments, different preset conditions and reference thresholds may be set for different kinds of code stream information, which is not limited in the present application.
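The traversal described above can be sketched as follows; this is an illustration under assumed names, in which each selected kind of target code stream information contributes one predicate and a macroblock is kept only when all predicates hold:

```python
def select_boundary_macroblocks(frame_mbs, conditions):
    """frame_mbs maps (w, h) -> dict of that macroblock's code stream
    information; conditions is a list of predicates over such a dict.
    A macroblock is selected only if every preset condition is met."""
    return sorted(pos for pos, info in frame_mbs.items()
                  if all(cond(info) for cond in conditions))
```

For example, with the quantization-parameter and coded-bit conditions supplied as predicates (their thresholds precomputed for the frame), `select_boundary_macroblocks(frame, [qp_cond, bit_cond])` returns the positions of the candidate boundary macroblocks.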
Using at least two kinds of code stream information as the target code stream information makes it possible to obtain more accurate boundary code stream information. In some application scenarios, more kinds of macro block code stream information may be used as the target code stream information to further improve the accuracy with which the code stream information determines the boundary of the target object in the target image frame. The present application does not limit the target code stream information or the preset conditions.
In step S122, if each piece of target code stream information of the macroblock meets the preset condition, step S123 is executed.
S123: and taking the macro block as a boundary macro block of which the code stream information meets the boundary code stream condition.
If the target code stream information of the current macro block being traversed satisfies the preset conditions, the current macro block may be taken as a boundary macro block whose code stream information satisfies the boundary code stream condition. Each macro block in the target image frame is traversed so that all macro blocks whose target code stream information satisfies the preset conditions are detected, and the boundary macro blocks satisfying the boundary code stream condition are determined.
S130: and determining the boundary of the target object in the target image frame based on the selected boundary macro block.
After determining the boundary macro block satisfying the boundary code stream condition, the boundary of the target object in the target image frame may be determined based on the selected boundary macro block, for example, the boundary of the target object in the target image frame may be generated by connecting the selected boundary macro blocks.
In some embodiments, referring to fig. 3 for step S130, the step S130 includes the following steps:
s131: and finding out whether the boundary macro blocks of other boundary macro blocks exist in a preset range.
Among the boundary macro blocks that satisfy the preset conditions, there may be macro blocks that are not actually located at boundary positions. The boundary macro blocks can therefore be refined by searching, for each boundary macro block, whether other boundary macro blocks exist within a preset range. For example, after the current macro block is determined to be a boundary macro block, whether other boundary macro blocks exist within its preset range may be searched, and the boundary macro blocks may be refined in this way to obtain a more accurate boundary of the target object.
In some embodiments, the preset range includes the positions above left, above, above right, left, right, below left, below, and below right of the boundary macro block, spaced from the boundary macro block by fewer than a preset number of macro blocks. The preset number can be set according to specific requirements; the present application takes a preset number of zero as an example. When the preset number is zero, the preset range covers the macro blocks adjacent to the current macro block. When the current macro block is determined to be a boundary macro block, whether its adjacent macro blocks above left, above, above right, left, right, below left, below, and below right are boundary macro blocks is detected; if at least one of them is a boundary macro block, the current macro block can be determined to be a refined boundary macro block.
For example, if the current macroblock meets the preset boundary code stream conditions and is determined to be a boundary macroblock, whether the surrounding macroblocks above left, above, above right, left, right, below left, below, and below right of the current boundary macroblock are boundary macroblocks is detected; if a boundary macroblock exists above left, the current boundary macroblock is determined to be a refined boundary macroblock, that is, a boundary macroblock for which other boundary macroblocks exist within the preset range.
After the current macro block is determined to be a boundary macro block, finding at least one other boundary macro block around it allows the boundary macro blocks to be refined, so that the current macro block is more reliably confirmed as a boundary macro block and a more accurate boundary of the target object is obtained.
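A minimal sketch of this refinement step, assuming a preset number of zero (adjacent macroblocks only) and hypothetical names:

```python
# Offsets of the eight positions above left, above, above right, left,
# right, below left, below, and below right of a macroblock (w, h).
NEIGHBORS = [(-1, -1), (0, -1), (1, -1), (-1, 0),
             (1, 0), (-1, 1), (0, 1), (1, 1)]

def refine_boundary_macroblocks(boundary):
    """Keep only boundary macroblocks with at least one other boundary
    macroblock among their eight adjacent positions."""
    return {(w, h) for (w, h) in boundary
            if any((w + dw, h + dh) in boundary for dw, dh in NEIGHBORS)}
```

An isolated candidate with no boundary neighbors is discarded as a likely false detection.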
If the boundary macro block having other boundary macro blocks within the preset range is found in step S131, step S132 is executed.
S132: and connecting the found boundary macro blocks to generate the boundary of the target object in the target image frame.
The found boundary macro blocks may be connected through their center sub-blocks to generate the boundary of the target object in the target image frame, or may be connected through a fixed corner sub-block, such as the upper left corner sub-block or the upper right corner sub-block of each boundary macro block.
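As an illustrative sketch (the 16x16 macroblock size and all names are assumptions, since the actual size depends on the codec), connecting the found boundary macroblocks through their center sub-blocks could look like:

```python
def center_subblock_px(w, h, mb_size=16):
    """Pixel coordinates of the center of macroblock (w, h),
    assuming mb_size x mb_size macroblocks."""
    return (w * mb_size + mb_size // 2, h * mb_size + mb_size // 2)

def boundary_polyline(boundary_mbs):
    # Connect the found boundary macroblocks through their center
    # sub-blocks to form the boundary of the target object.
    return [center_subblock_px(w, h) for (w, h) in sorted(boundary_mbs)]
```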
In the prior art, when the boundary of a target object obtained by a boundary extraction algorithm is synchronized to the corresponding position in the corresponding video or original image frame, the obtained boundary may deviate from the boundary of the target object in the video or original image frame due to the time consumed by the boundary extraction algorithm and other reasons, so the boundary of the target object is out of sync and the real-time performance is poor.
In this embodiment, the code stream information of the macro blocks of the target image frame is utilized, and boundary macro blocks whose code stream information satisfies the boundary code stream requirements are selected according to the characteristics of the code stream information at boundaries, so as to determine the boundary of the target object in the target image frame. When selecting boundary macro blocks, the numerical differences between the code stream information of the macro blocks are used for the judgment, and the macro blocks surrounding a boundary macro block are also examined: only when at least one other boundary macro block exists around it is the current macro block confirmed as a boundary macro block, which improves the accuracy of the obtained boundary of the target object. Compared with determining the target boundary by analyzing the original image data, this method directly uses the code stream information of the macro blocks of the target image frame to determine the boundary of the target object. The process has low complexity and consumes little time, so it is suitable for detecting and extracting the target boundary in real time, has good real-time performance, and reduces the problem of the boundary of the target object being out of sync when the obtained boundary is synchronized to the target image frame.
In addition, in actual services the boundary of the target object is generally superimposed on the reconstructed image data, that is, the target boundary is extracted after decoding. By contrast, in this method the boundary of the target object is extracted before or during decoding, so the boundary of the target object in the target image frame has already been obtained before the target image frame is decoded and displayed. Therefore, there is no out-of-sync problem when the obtained boundary of the target object is synchronized to the target image frame.
Building on the above embodiment, the target object of the present application may be in motion, that is, a moving target. Please refer to fig. 4, which is a schematic flowchart of a second embodiment of the target boundary determining method provided in the present application; the method includes the following steps:
s210: code stream data of a target video is obtained, wherein the target video comprises a plurality of image frames, and the code stream data is obtained by coding the denoised image frames.
When the boundary of a moving target object needs to be determined, the target video in which the moving target object is located is obtained, and the code stream data of the target video is obtained after the plurality of image frames of the target video are encoded. Before the code stream data of the target video is acquired, the target video or its image frames may be denoised to reduce the interference of noise with the code stream information of the image frames of the actual moving target object. For example, a target video or target image frame shot at night contains a lot of noise and the image is unclear, so the acquired target video or target image frame can be denoised first. The denoising algorithm for the target video or its image frames can be chosen according to the specific application scenario, which is not limited in the present application. In some embodiments, the code stream data of the target video may be read and parsed to obtain the code stream information of each macro block of the target image frame.
S220: and analyzing the code stream data to obtain the code stream information of each macro block of the target image frame.
The acquired code stream data of the target video is parsed. In some application scenarios, the code stream data can be decoded by a decoder to obtain the code stream information of each macro block of the target image frame; the decoding mode of the code stream data is not limited in the present application.
S230: and taking at least two code stream information as target code stream information.
S240: and traversing each macro block in the target image frame to detect whether each target code stream information of the macro block meets a preset condition.
In step S240, if each piece of target code stream information of the macroblock meets the preset condition, step S250 is executed.
S250: and taking the macro block as a boundary macro block of which the code stream information meets the boundary code stream condition.
The specific implementation of steps S230 to S250 in this embodiment may refer to the implementation process of steps S121 to S123 in the above embodiment, and will not be described herein again.
S260: and finding out whether the boundary macro blocks of other boundary macro blocks exist in a preset range.
In step S260, if the boundary macro block with other boundary macro blocks in the preset range is found, step S270 is executed.
S270: and connecting the found boundary macro blocks to generate the boundary of the target object in the target image frame.
In the embodiment, for specific implementation of steps S260 and S270, reference may be made to implementation processes of steps S131 and S132 in the above embodiment, and details are not repeated here.
Different from the above embodiments, this embodiment can determine the boundary of a moving target object. Because the acquired code stream data is obtained by encoding denoised image frames, the influence of external noise on the parsed macro block code stream information is reduced, so this embodiment can be applied to target videos in various application environments to determine the boundary of the target object, and the obtained boundary is more accurate. The method directly uses the code stream information of the macro blocks of the target image frame to determine the boundary of the target object; the process has low complexity and consumes little time, which improves the efficiency of extracting the boundary of the target object, and the method has good real-time performance when used to detect and extract the target boundary in real time. In addition, compared with the prior art, the boundary of the moving target object extracted before or during decoding is obtained before the target video is decoded and displayed, so there is no out-of-sync problem when the obtained boundary of the moving target object is synchronized to the target image frames of the target video.
In addition, the target object of this embodiment is a moving target. Since the acquired code stream data of the target video is obtained by encoding a plurality of image frames of the target video, and the encoding of the image frames involves both intra-frame and inter-frame prediction modes, the image frames are strongly correlated and the code streams of the macro blocks of the image frames are continuous. Therefore, the boundary obtained for a target object in motion is more accurate than that of a stationary target object.
Based on the above embodiments, this embodiment takes the quantization parameter value and the number of coded bits as the target code stream information by way of example; the present application does not limit the target code stream information to these. Please refer to fig. 5, which is a schematic flowchart of a third embodiment of the target boundary determining method provided in the present application, including the following steps:
s310: code stream data of a target video is obtained, wherein the target video comprises a plurality of image frames, and the code stream data is obtained by coding the denoised image frames.
And reading code stream data of the target video to analyze the code stream data to obtain code stream information of each macro block of the target image frame.
S320: and analyzing the code stream data to obtain the code stream information of each macro block of the target image frame.
The acquired code stream data of the target video is parsed. In some application scenarios, the code stream data can be decoded by a decoder to obtain the code stream information of each macro block of the target image frame; the decoding mode of the code stream data is not limited in the present application.
S330: and taking the quantization parameter value and the coded bit number as target code stream information.
S340: and traversing each macro block in the target image frame to detect whether the quantization parameter value of the macro block and the coded bit number meet preset conditions.
Wherein the preset conditions are as follows:
|QPmax-Mqp(w,h)|>TH(qp);
and |BITmax - Mbit(w, h)| < TH(bit);
wherein TH(qp) = (QPmax + QPmin)/2, and TH(bit) = (BITmax + BITmin)/2.
In the above formulas, QPmax and QPmin are respectively the maximum and minimum of the quantization parameter values actually used for encoding in the target image frame, Mqp(w, h) is the quantization parameter value of the current macroblock, BITmax and BITmin are respectively the maximum and minimum of the number of coded bits of the macroblocks in the target image frame, Mbit(w, h) is the number of coded bits of the current macroblock, TH(qp) is the reference threshold of the quantization parameter, and TH(bit) is the reference threshold of the number of coded bits, where w represents the macroblock sequence number in the horizontal direction and h represents the macroblock sequence number in the vertical direction.
In some embodiments, the reference threshold TH(qp) of the quantization parameter may instead be the median of the quantization parameters of the macroblocks of the current frame, and the reference threshold TH(bit) of the number of coded bits may be the median of the numbers of coded bits of the macroblocks of the current frame, which is not limited in this application.
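A sketch of this embodiment's combined test for one macroblock, using the mean-based thresholds (all names are hypothetical):

```python
def is_boundary_macroblock(mqp, mbit, qp_max, qp_min, bit_max, bit_min):
    th_qp = (qp_max + qp_min) / 2.0     # TH(qp)
    th_bit = (bit_max + bit_min) / 2.0  # TH(bit)
    # Both conditions must hold at the same time:
    # |QPmax - Mqp(w, h)| > TH(qp)  and  |BITmax - Mbit(w, h)| < TH(bit)
    return abs(qp_max - mqp) > th_qp and abs(bit_max - mbit) < th_bit
```

For instance, with QPmax = 40, QPmin = 4, BITmax = 2000, and BITmin = 100, a macroblock with quantization parameter 10 and 1500 coded bits satisfies both conditions, while one with quantization parameter 38 does not.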
In this embodiment, other code stream information may also be used as the target code stream information, which is not limited in this application.
In step S340, if each target code stream information of the macroblock meets the preset condition, step S350 is executed.
S350: and taking the macro block as a boundary macro block of which the code stream information meets the boundary code stream condition.
If, while traversing the macro blocks of the current frame, both the quantization parameter value and the number of coded bits of the current macro block satisfy the preset conditions, the current macro block can be taken as a boundary macro block whose code stream information satisfies the boundary code stream condition, so that all boundary macro blocks of the boundary are found.
The specific implementation of steps S330 to S350 in this embodiment may refer to the implementation process of steps S121 to S123 in the above embodiment, and will not be described herein again.
S360: and finding out whether the boundary macro blocks of other boundary macro blocks exist in a preset range.
In step S360, if the boundary macro block with other boundary macro blocks in the preset range is found, step S370 is executed.
S370: and connecting the found boundary macro blocks to generate the boundary of the target object in the target image frame.
In the embodiment, reference may be made to the implementation processes of step S131 and step S132 in the above embodiment for specific implementation of steps S360 and S370, which are not described herein again.
In this embodiment, because the acquired code stream data of the target video is obtained by encoding denoised image frames, the influence of external noise on the parsed macro block code stream information is reduced and the obtained boundary is more accurate. In addition, the boundary is determined directly from the quantization parameter values and numbers of coded bits of the encoded target video, which improves the accuracy of the obtained boundary of the target object; the process has low complexity and consumes little time, which improves the efficiency of extracting the boundary of the target object, and the method has good real-time performance when used to detect and extract the target boundary in real time. Moreover, because the boundary of the target object is extracted before or during decoding, that is, the boundary of the target object in the target image frame has already been obtained before the target image frame is decoded and displayed, the boundary of the target object is not out of sync when the obtained boundary is synchronized to the target image frame.
For the above embodiments, the present application provides a target boundary determining apparatus, please refer to fig. 6, fig. 6 is a schematic structural diagram of the target boundary determining apparatus provided in the present application, and the target boundary determining apparatus 600 includes an obtaining module 610, an analyzing module 620, and a generating module 630.
The obtaining module 610 is configured to obtain code stream information of each macro block in the target image frame. The code stream information comprises at least one code stream information capable of reflecting the difference between the boundary and the non-boundary of the target object in the target image frame. For example, the code stream information of the macro block includes one or a combination of a quantization parameter value, a residual value, a coded bit number, a time domain texture complexity, a space domain texture complexity, a maximum division depth and the number of contained sub blocks.
In some embodiments, the target object is in a moving state, and the obtaining module 610 is configured to obtain code stream information of each macroblock in the target image frame, including: acquiring code stream data of a target video, wherein the target video comprises a plurality of image frames, and the code stream data is obtained by encoding the denoised image frames; and analyzing the code stream data to obtain the code stream information of each macro block of the target image frame.
The analysis module 620 is configured to select a boundary macroblock whose code stream information meets the requirement of the boundary code stream from the target image frame.
In some embodiments, the boundary code stream requirement is that at least one kind of code stream information of a boundary macro block satisfies: the code stream information is located in a target value region of the code stream information distribution corresponding to the target image frame, where the target value region is one of two value regions into which the code stream information distribution is divided by value. For example, the distribution may be divided by value into an upper value region of larger values and a lower value region of smaller values, one of which serves as the target value region.
In some embodiments, the analyzing module 620 is configured to select a boundary macroblock whose code stream information meets the boundary code stream condition from the target image frame, and includes: taking at least two code stream information as target code stream information; traversing each macro block in the target image frame to detect whether each target code stream information of the macro block meets a preset condition; the preset condition is that the difference value between the maximum value of target code stream information in the target image frame and the target code stream information of the macro block is larger than or smaller than a reference threshold corresponding to the target code stream information; the reference threshold corresponding to the target code stream information is as follows: and the average value or the intermediate value between the maximum value and the minimum value of the target code stream information in the target image frame. And if each target code stream information of the macro block meets the preset condition, taking the macro block as a boundary macro block of which the code stream information meets the boundary code stream condition.
The generating module 630 is configured to determine a boundary of the target object in the target image frame based on the selected boundary macro block.
In some embodiments, the generating module 630 is configured to determine the boundary of the target object in the target image frame based on the selected boundary macro blocks, including: finding out whether there are boundary macro blocks that have other boundary macro blocks within a preset range; and connecting the found boundary macro blocks to generate the boundary of the target object in the target image frame. The preset range includes the positions above left, above, above right, left, right, below left, below, and below right of the boundary macro block, spaced from the boundary macro block by fewer than a preset number of macro blocks. It can be understood that if the boundary macro block and the searched boundary macro blocks are adjacent macro blocks, the preset number is zero; of course, the preset number may be determined according to specific requirements, which is not limited in this application.
The specific implementation of this embodiment can refer to the implementation process of the above embodiment, and is not described herein again.
With respect to the above embodiments, the present application provides a computer device; please refer to fig. 7. The computer device 700 includes a processor 710 and a memory 720, wherein the processor 710 is connected to the memory 720, and the memory 720 is used for storing a computer program which is executed by the processor 710 to implement the above target boundary determining method.
In this embodiment, the processor 710 may also be referred to as a Central Processing Unit (CPU). Processor 710 may be an integrated circuit chip having signal processing capabilities. The processor 710 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor 710 may be any conventional processor or the like.
For the method of the above embodiment, it may exist in the form of a computer program, so that the present application provides a computer readable storage medium, please refer to fig. 8, where fig. 8 is a schematic structural diagram of the computer readable storage medium provided in the present application. The computer-readable storage medium 800 of the present embodiment stores therein a computer program 810 that can be executed by a processor to implement the method in the above-described embodiments.
The computer-readable storage medium 800 may be a medium that can store program instructions, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, or it may be a server that stores the program instructions; the server may send the stored program instructions to other devices for execution, or execute the stored program instructions itself.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It will be apparent to those skilled in the art that the modules or steps of the present application described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and they may alternatively be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, or fabricated separately as individual integrated circuit modules, or fabricated as a single integrated circuit module from multiple modules or steps. Thus, the present application is not limited to any specific combination of hardware and software.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (11)

1. A method for object boundary determination, the method comprising:
acquiring code stream information of each macro block in a target image frame;
selecting, from the target image frame, boundary macro blocks whose code stream information meets the boundary code stream requirement; and
determining the boundary of a target object in the target image frame based on the selected boundary macro blocks.
2. The method according to claim 1, wherein the codestream information comprises at least one codestream information capable of reflecting a difference between a boundary and a non-boundary of a target object in the target image frame.
3. The method according to claim 2, wherein the code stream information of the macroblock includes one or a combination of quantization parameter values, residual values, coded bit numbers, temporal texture complexity, spatial texture complexity, maximum partition depth, and the number of subblocks included.
4. The method of claim 3, wherein the boundary code stream requires that at least one code stream information of the boundary macro-block satisfies: the code stream information is located in a target value area of the code stream information distribution corresponding to the target image frame, and the target value area is as follows: dividing the code stream information distribution into two numerical value areas according to the numerical value, wherein one numerical value area is the target numerical value area.
5. The method according to claim 4, wherein the selecting, from the target image frame, boundary macro blocks whose code stream information meets the boundary code stream requirement includes:
taking at least two types of code stream information as target code stream information;
traversing each macro block in the target image frame to detect whether each piece of target code stream information of the macro block meets a preset condition, the preset condition being that the difference between the maximum value of the target code stream information in the target image frame and the macro block's value of that target code stream information is greater than, or is smaller than, a reference threshold corresponding to that target code stream information; and
if every piece of target code stream information of the macro block meets the preset condition, taking the macro block as a boundary macro block whose code stream information meets the boundary code stream requirement.
6. The method according to claim 5, wherein the reference threshold corresponding to the target code stream information is the average of, or an intermediate value between, the maximum value and the minimum value of that target code stream information in the target image frame.
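Claims 4 to 6 amount to a per-frame thresholding rule. The sketch below shows one possible reading; the data layout, the function name, and the use of the midpoint between the maximum and minimum as the reference threshold are illustrative assumptions, not the patented implementation:

```python
def select_boundary_macroblocks(frame_info, greater_than=True):
    """Select macro blocks whose code stream information meets the
    boundary requirement (an illustrative sketch of claims 4-6).

    frame_info maps a macro block index to a dict of target code
    stream information values, e.g. {"qp": 30, "bits": 120}.
    A macro block qualifies only if EVERY target information meets
    the preset condition.
    """
    names = list(next(iter(frame_info.values())).keys())
    # Per-information statistics over the whole frame.
    stats = {}
    for name in names:
        values = [mb[name] for mb in frame_info.values()]
        vmax, vmin = max(values), min(values)
        # Reference threshold: midpoint between max and min (claim 6).
        stats[name] = (vmax, (vmax + vmin) / 2)
    boundary = []
    for index, mb in frame_info.items():
        ok = True
        for name in names:
            vmax, threshold = stats[name]
            diff = vmax - mb[name]  # distance from the frame maximum
            # Claim 5 allows either direction of the comparison.
            ok = ok and (diff > threshold if greater_than else diff < threshold)
        if ok:
            boundary.append(index)
    return boundary
```

With a single information type, e.g. `{0: {"qp": 5}, 1: {"qp": 30}, 2: {"qp": 8}}`, the midpoint threshold is 17.5, so macro blocks 0 and 2 fall in the target value region while macro block 1 does not.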
7. The method of claim 1, wherein determining the boundary of the target object in the target image frame based on the selected boundary macro blocks comprises:
for each boundary macro block, finding whether other boundary macro blocks exist within a preset range of the boundary macro block; and
connecting the found boundary macro blocks to generate the boundary of the target object in the target image frame.
8. The method according to claim 7, wherein the preset range covers the left, right, upper, lower, upper-left, upper-right, lower-left, and lower-right directions of the boundary macro block, and extends from the boundary macro block by less than a preset number of macro blocks.
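The neighbourhood search of claims 7 and 8 can be sketched as follows. The (row, column) grid representation, the `max_gap` parameter, and returning linked pairs rather than a drawn contour are illustrative assumptions:

```python
def link_boundary_macroblocks(boundary_positions, max_gap=1):
    """Link boundary macro blocks that lie within the preset range of
    one another (an illustrative sketch of claims 7-8).

    boundary_positions is a set of (row, col) macro block coordinates.
    The preset range covers the eight neighbouring directions, up to
    max_gap macro blocks away; each qualifying pair is returned once.
    """
    links = set()
    for (row, col) in boundary_positions:
        for d_row in range(-max_gap, max_gap + 1):
            for d_col in range(-max_gap, max_gap + 1):
                if d_row == 0 and d_col == 0:
                    continue  # skip the macro block itself
                neighbour = (row + d_row, col + d_col)
                if neighbour in boundary_positions:
                    # Sort the pair so each link is stored only once.
                    links.add(tuple(sorted([(row, col), neighbour])))
    return links
```

Connecting the returned pairs traces the target object's boundary; an isolated boundary macro block with no neighbour within `max_gap` produces no link and can be treated as noise.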
9. The method of claim 1, wherein the target object is in motion; and/or
the acquiring code stream information of each macro block in the target image frame includes:
acquiring code stream data of a target video, wherein the target video comprises a plurality of image frames and the code stream data is obtained by encoding the denoised image frames; and
parsing the code stream data to obtain the code stream information of each macro block of the target image frame.
10. A computer device, characterized in that the computer device comprises a processor and a memory, the memory storing a computer program that is executed by the processor to implement the method of any one of claims 1 to 9.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which is executed by a processor to implement the method according to any one of claims 1 to 9.
CN202010674498.5A 2020-07-14 2020-07-14 Target boundary determining method, computer device and storage medium Pending CN112004090A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010674498.5A CN112004090A (en) 2020-07-14 2020-07-14 Target boundary determining method, computer device and storage medium


Publications (1)

Publication Number Publication Date
CN112004090A true CN112004090A (en) 2020-11-27

Family

ID=73466941

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010674498.5A Pending CN112004090A (en) 2020-07-14 2020-07-14 Target boundary determining method, computer device and storage medium

Country Status (1)

Country Link
CN (1) CN112004090A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020095761A * 2001-06-15 2002-12-28 LG Electronics Inc. Loop filtering method for video coder
CN1420690A * 2001-11-17 2003-05-28 LG Electronics Inc. Method and system for control of bit rate based on object
CN101589624A * 2007-01-22 2009-11-25 NEC Corporation Image re-encoding device, image re-encoding method and image encoding program
CN101621691A * 2009-08-10 2010-01-06 Zhejiang University Method for extracting edge characteristic from H.264 compressed code stream
US20130101031A1 * 2011-10-25 2013-04-25 Qualcomm Incorporated Determining quantization parameters for deblocking filtering for video coding
CN103931193A * 2011-09-21 2014-07-16 LG Electronics Inc. Method and an apparatus for encoding/decoding an image
CN105409222A * 2013-07-17 2016-03-16 Sony Corporation Image processing device and method
CN110062230A * 2019-04-29 2019-07-26 Hunan Goke Microelectronics Co., Ltd. Image encoding method and device
CN110225355A * 2019-06-22 2019-09-10 Quzhou Guangming Electric Power Investment Group Co., Ltd. Futeng Technology Branch High-performance video coding intra prediction optimization method based on region of interest


Similar Documents

Publication Publication Date Title
EP2782340B1 (en) Motion analysis method based on video compression code stream, code stream conversion method and apparatus thereof
CN112534818B (en) Machine learning based adaptation of coding parameters for video coding using motion and object detection
CN110290388B (en) Intra-frame prediction method, video encoding method, computer device and storage device
US7949053B2 (en) Method and assembly for video encoding, the video encoding including texture analysis and texture synthesis, and corresponding computer program and corresponding computer-readable storage medium
JP5496914B2 (en) How to assess perceptual quality
US20220058775A1 (en) Video denoising method and apparatus, and storage medium
CN105472205B (en) Real-time video noise reduction method and device in encoding process
US20120275524A1 (en) Systems and methods for processing shadows in compressed video images
TW201127064A (en) System and method to process motion vectors of video data
US11394997B2 (en) Method and apparatus for encoding or decoding video data with sub-pixel motion vector refinement
US20170041623A1 (en) Method and Apparatus for Intra Coding for a Block in a Coding System
CN105872556B (en) Video encoding method and apparatus
Yu et al. Detection of fake high definition for HEVC videos based on prediction mode feature
Xu et al. Detection of HEVC double compression with non-aligned GOP structures via inter-frame quality degradation analysis
CN102984524B (en) A kind of video coding-decoding method based on block layer decomposition
Lan et al. Exploiting non-local correlation via signal-dependent transform (SDT)
CN113382249B (en) Image/video encoding method, apparatus, system, and computer-readable storage medium
CN113422959A (en) Video encoding and decoding method and device, electronic equipment and storage medium
JP5178616B2 (en) Scene change detection device and video recording device
CN106878754B (en) A kind of 3D video depth image method for choosing frame inner forecast mode
CN109451306B (en) Method and device for selecting optimal prediction mode of chrominance component and electronic equipment
CN112004090A (en) Target boundary determining method, computer device and storage medium
CN112272297B (en) Image quality still frame detection method embedded in decoder
WO2022116119A1 (en) Inter-frame predication method, encoder, decoder and storage medium
CN109936741B (en) Video coding method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination