CN113596452A - Encoding method, encoding device, electronic equipment and storage medium - Google Patents
- Publication number
- CN113596452A (application number CN202110674333.2A)
- Authority
- CN
- China
- Prior art keywords
- prediction mode
- block
- current block
- encoding
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/157—Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
- H04N19/159—Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/176—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/567—Motion estimation based on rate distortion criteria
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
The invention provides an encoding method, an encoding device, electronic equipment and a storage medium, wherein the encoding method comprises the following steps: determining a prediction mode of a current block based on the prediction mode of a reference block of the current block and the prediction modes of neighborhood blocks of the current block, wherein the reference block of the current block is the block at the position corresponding to the current block in the previous frame of image, or the original matching block of the current block, and the neighborhood blocks of the current block are a preset number of blocks centered on the current block; and encoding the current block based on the prediction mode of the current block. The probability of blocking artifacts occurring can thereby be effectively reduced.
Description
Technical Field
The present invention relates to the field of video encoding and decoding technologies, and in particular, to an encoding method, an encoding device, an electronic device, and a storage medium.
Background
Current coding standards still follow a block-based hybrid coding framework. During encoding, the residual undergoes a block-based transform, so the correlation between blocks is ignored, and different quantization is applied to the transform coefficients, so different image blocks are processed differently; to the human eye, the resulting discontinuity between blocks, i.e. the blocking effect, is easily observed. Although coding standards have sought to further remove the blocking effect during their development (for example, the de-blocking filter introduced in H.264/AVC, SAO (Sample Adaptive Offset) introduced in H.265/HEVC, and ESAO (Enhanced Sample Adaptive Offset) introduced in H.266/VVC and AVS3), blocking artifacts still frequently occur in practical applications, and removing them remains one of the important problems for the industry to solve.
Disclosure of Invention
The invention provides a coding method, a coding device, electronic equipment and a storage medium.
In order to solve the above technical problems, a first technical solution provided by the present invention is an encoding method, including: determining a prediction mode of a current block based on the prediction mode of a reference block of the current block and the prediction modes of neighborhood blocks of the current block, wherein the reference block of the current block is the block at the position corresponding to the current block in the previous frame of image, or the original matching block of the current block, and the neighborhood blocks of the current block are a preset number of blocks centered on the current block; and encoding the current block based on the prediction mode of the current block.
Wherein the current block belongs to a non-motion region.
Wherein the step of determining the prediction mode of the current block based on the prediction mode of the reference block of the current block and the prediction mode of the neighborhood block of the current block comprises: in response to the prediction mode of the reference block being the same as the prediction mode of the neighborhood block, setting the prediction mode of the current block to be the same as the prediction modes of the reference block and the neighborhood block.
Wherein, in response to the prediction mode of the reference block being the same as the prediction mode of the neighborhood block, the step of setting the prediction mode of the current block to be the same as the prediction modes of the reference block and the neighborhood block comprises: in response to the prediction mode of the reference block being an inter-frame prediction mode and the prediction mode of the neighborhood block being an inter-frame prediction mode, determining the prediction mode of the current block to be the inter-frame prediction mode; in response to the prediction mode of the reference block being a skip prediction mode and the prediction mode of the neighborhood block being a skip prediction mode, determining the prediction mode of the current block to be the skip prediction mode; and in response to the prediction mode of the reference block being an intra-frame prediction mode and the prediction mode of the neighborhood block being an intra-frame prediction mode, determining the prediction mode of the current block to be the intra-frame prediction mode.
Wherein the step of determining the prediction mode of the current block based on the prediction mode of the reference block of the current block and the prediction mode of the neighborhood block of the current block comprises: in response to the prediction mode of the reference block being different from the prediction mode of the neighborhood block, calculating the prediction mode of the current block by rate-distortion cost calculation.
Wherein the step of determining the prediction mode of the current block based on the prediction mode of the reference block of the current block and the prediction mode of the neighborhood block of the current block comprises: in response to the prediction mode of the reference block being the same as that of the neighborhood block, encoding the current block using the prediction mode of the reference block and the neighborhood block to obtain a first encoding result, and encoding the current block using the original prediction mode of the current block to obtain a second encoding result, wherein the original prediction mode is the prediction mode of the current block obtained by rate-distortion cost calculation; and determining the prediction mode of the current block from among the prediction mode of the reference block and the neighborhood block and the original prediction mode, based on the first encoding result and the second encoding result.
Wherein, in response to the original prediction mode being an intra-frame prediction mode, the method comprises: in response to the prediction mode of the reference block being an inter-frame prediction mode and the prediction mode of the neighborhood block being an inter-frame prediction mode, encoding the current block using the inter-frame prediction mode to obtain a first encoding result, and encoding the current block using the intra-frame prediction mode to obtain a second encoding result; and in response to the prediction mode of the reference block being a skip prediction mode and the prediction mode of the neighborhood block being a skip prediction mode, encoding the current block using the skip prediction mode to obtain a first encoding result, and encoding the current block using the intra-frame prediction mode to obtain a second encoding result.
Wherein, in response to the original prediction mode being an inter-frame prediction mode, the method comprises: in response to the prediction mode of the reference block being an intra-frame prediction mode and the prediction mode of the neighborhood block being an intra-frame prediction mode, encoding the current block using the intra-frame prediction mode to obtain a first encoding result, and encoding the current block using the inter-frame prediction mode to obtain a second encoding result; and in response to the prediction mode of the reference block being a skip prediction mode and the prediction mode of the neighborhood block being a skip prediction mode, encoding the current block using the skip prediction mode to obtain a first encoding result, and encoding the current block using the inter-frame prediction mode to obtain a second encoding result.
Wherein, in response to the original prediction mode being a skip prediction mode, the method comprises: in response to the prediction mode of the reference block being an inter-frame prediction mode and the prediction mode of the neighborhood block being an inter-frame prediction mode, encoding the current block using the inter-frame prediction mode to obtain a first encoding result, and encoding the current block using the skip prediction mode to obtain a second encoding result; and in response to the prediction mode of the reference block being an intra-frame prediction mode and the prediction mode of the neighborhood block being an intra-frame prediction mode, encoding the current block using the intra-frame prediction mode to obtain a first encoding result, and encoding the current block using the skip prediction mode to obtain a second encoding result.
In order to solve the above technical problems, a second technical solution provided by the present invention is an encoding device, including: a mode determination module for determining a prediction mode of the current block based on the prediction mode of a reference block of the current block and/or the prediction modes of neighborhood blocks of the current block, wherein the reference block of the current block is the block at the position corresponding to the current block in the previous frame of image, or the original matching block of the current block, and the neighborhood blocks of the current block are a preset number of blocks centered on the current block; and an encoding module for encoding the current block based on the prediction mode of the current block.
In order to solve the above technical problems, a third technical solution provided by the present invention is: provided is an electronic device including: a memory storing program instructions and a processor retrieving the program instructions from the memory to perform any of the above methods.
In order to solve the above technical problems, a fourth technical solution provided by the present invention is: there is provided a computer readable storage medium storing a program file executable to implement the method of any of the above.
The beneficial effect is that, differing from the prior art, the encoding method determines the prediction mode of the current block based on the prediction mode of the reference block of the current block and the prediction modes of the neighborhood blocks of the current block, and encodes the current block based on that prediction mode. The reference block of the current block is the block at the position corresponding to the current block in the previous frame of image, or the original matching block of the current block, and the neighborhood blocks of the current block are a preset number of blocks centered on the current block. The probability of blocking artifacts occurring can thereby be effectively reduced.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without inventive effort, wherein:
FIG. 1 is a flowchart illustrating a first embodiment of an encoding method according to the present invention;
FIG. 2 is a flowchart illustrating a second embodiment of the encoding method according to the present invention;
FIG. 3 is a schematic structural diagram of an encoding apparatus according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an electronic device according to an embodiment of the invention;
FIG. 5 is a schematic structural diagram of an embodiment of a computer-readable storage medium according to the present invention.
Detailed Description
In the prior art, the blocking effect is generally dealt with only after it has already occurred. In practical applications, however, no matter which coding technology is used, services with extremely high storage and transmission requirements inevitably encounter blocking artifacts at extremely low bit rates; this is an unavoidable consequence of current coding principles. To avoid generating the blocking effect in the first place, the present application provides an encoding method that prevents blocking at the level of the coding technique itself, so as to improve the subjective quality of the image/video. The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them; all other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present application.
Please refer to fig. 1, which is a flowchart illustrating a first embodiment of the encoding method according to the present invention.
The method specifically comprises the following steps:
step S11: the prediction mode of the current block is determined based on the prediction mode of the reference block of the current block and the prediction mode of the neighbor block of the current block.
In this embodiment, the prediction mode of the current block is determined based on the prediction mode of the reference block of the current block and the prediction modes of the neighborhood blocks of the current block. Specifically, the original matching block of the current block is the block with the minimum cost after rate-distortion cost comparison, i.e. the original best matching block of the current block. The neighborhood blocks of the current block are a preset number of blocks centered on the current block. In one embodiment, the neighborhood blocks of the current block are the (N×N-1) already-encoded blocks centered on the current block.
The prediction mode of the current block is determined mainly from the spatio-temporal information of the current block: the reference block of the current block carries the temporal information, and the neighborhood blocks of the current block carry the spatial information. This improves the rationality and accuracy of the prediction-mode correction.
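As an illustrative sketch (not taken from the patent text, which leaves the window size N and the border handling unspecified), collecting the (N×N-1) neighborhood blocks around a current block could look like this; all names and the skip-at-border policy are assumptions:

```python
def neighborhood_blocks(grid_w, grid_h, bx, by, n=3):
    """Collect the (n*n - 1) block coordinates in the n-by-n window
    centered on block (bx, by), excluding the current block itself.
    Positions outside the frame are skipped, so border blocks get
    fewer neighbors."""
    half = n // 2
    out = []
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            if dx == 0 and dy == 0:
                continue  # the current block is not its own neighbor
            x, y = bx + dx, by + dy
            if 0 <= x < grid_w and 0 <= y < grid_h:
                out.append((x, y))
    return out
```

For an interior block of a 10x10 block grid this yields the eight surrounding blocks; at a corner only three neighbors remain.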
In a specific embodiment, the motion region of the current frame may be detected using a detection algorithm, such as the frame-difference method or a Gaussian-mixture detection algorithm, and the non-motion region is obtained from the motion region. Specifically, the motion region is cut out of the current frame image, and the remaining region is the non-motion region. The encoding method of the present application is mainly directed at non-motion regions, i.e. the current block belongs to a non-motion region.
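A minimal frame-difference sketch of this motion/non-motion split might look as follows; the per-pixel granularity and the threshold value are assumptions, since the text only names the frame-difference method as one option:

```python
import numpy as np

def non_motion_mask(prev_frame, cur_frame, threshold=10):
    """Mark a pixel as 'motion' when its absolute luma difference
    against the previous frame exceeds `threshold`; the non-motion
    region is the complement.  Returns a boolean mask that is True
    where the pixel belongs to the non-motion region."""
    # Cast to a signed type first so uint8 subtraction cannot wrap around.
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff <= threshold
```

In practice the mask would be evaluated per block (e.g. a block is non-motion when all, or most, of its pixels are), but the patent does not fix that aggregation rule.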
In a particular embodiment, in response to the prediction mode of the reference block being the same as the prediction modes of the neighborhood blocks, the prediction mode of the current block is set to be the same as theirs. Specifically, if the prediction mode of the block at the position corresponding to the current block in the previous frame of image is the same as the prediction modes of the (N×N-1) already-encoded blocks centered on the current block, the prediction mode of the current block is the same as the prediction modes of the reference block and the neighborhood blocks.
Specifically, in response to the prediction modes of the reference block and the neighborhood blocks being the inter-frame prediction mode, the prediction mode of the current block is determined to be the inter-frame prediction mode; in response to their being the skip prediction mode, the prediction mode of the current block is determined to be the skip prediction mode; and in response to their being the intra-frame prediction mode, the prediction mode of the current block is determined to be the intra-frame prediction mode.
In response to the prediction mode of the reference block being different from that of the neighborhood blocks, the prediction mode of the current block is calculated by rate-distortion cost calculation. That is, if the prediction mode of the block at the position corresponding to the current block in the previous frame of image differs from the prediction modes of the (N×N-1) already-encoded blocks centered on the current block, the prediction mode of the current block is obtained by rate-distortion cost calculation.
Specifically, if the prediction mode obtained for the current block by rate-distortion cost calculation is the intra-frame prediction mode, and the prediction mode of the reference block differs from that of the neighborhood blocks, the prediction mode of the current block is determined to be the intra-frame prediction mode. If the mode obtained by rate-distortion cost calculation is the inter-frame prediction mode and the prediction mode of the reference block differs from that of the neighborhood blocks, the prediction mode of the current block is determined to be the inter-frame prediction mode. And if the mode obtained by rate-distortion cost calculation is the skip prediction mode and the prediction mode of the reference block differs from that of the neighborhood blocks, the prediction mode of the current block is determined to be the skip prediction mode.
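The decision rule of this first embodiment can be sketched as follows (mode names are illustrative strings; the patent does not prescribe any data representation):

```python
def decide_mode(ref_mode, neighbor_modes, rd_mode):
    """If the temporal reference block and all spatial neighborhood
    blocks agree on one prediction mode, adopt that mode; otherwise
    fall back to `rd_mode`, the mode chosen by rate-distortion cost."""
    if neighbor_modes and all(m == ref_mode for m in neighbor_modes):
        return ref_mode
    return rd_mode
```

For example, `decide_mode("inter", ["inter"] * 8, "intra")` returns `"inter"`, while any disagreement among the blocks leaves the rate-distortion choice in force.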
In the method of this embodiment, the prediction mode of the current block is determined from the reference block and the neighborhood blocks of the current block, thereby alleviating the blocking-effect problem. Specifically, if the prediction modes of the reference block and the neighborhood blocks of the current block are the same, the prediction mode of the current block is set to the same mode, which reduces blocking artifacts.
Step S12: the current block is encoded based on a prediction mode of the current block.
Specifically, after the prediction mode of the current block is determined, the current block is encoded using the prediction mode of the current block. For example, if the prediction mode of the current block is the intra-frame prediction mode, the current block is encoded by using the intra-frame prediction mode; if the prediction mode of the current block is an inter-frame prediction mode, encoding the current block by using the inter-frame prediction mode; and if the prediction mode of the current block is the skip prediction mode, encoding the current block by using the skip prediction mode.
In the method of this embodiment, a suitable prediction mode is determined at the encoding stage, and encoding is performed based on it, so the blocking-effect problem can be reduced already during encoding and the subjective quality of the image/video can be improved.
Directly setting the prediction mode of the current block to the mode shared by the reference block and the neighborhood blocks may, however, give a worse coding result than the original prediction mode obtained by rate-distortion cost calculation. If the shared mode were adopted unconditionally, this error would propagate: subsequent blocks determine their prediction modes from the reference block and the neighborhood blocks, so the mistake would be carried forward through later coding blocks. To solve this error-propagation problem, the present application further provides a second embodiment of the encoding method, whose flow is shown in fig. 2. In this embodiment, steps S21 and S24 are the same as steps S11 and S12 of the first embodiment shown in fig. 1; between them, the method further includes:
step S22: in response to that the prediction mode of the reference block is the same as that of the neighborhood block, the current block is encoded by using the prediction modes of the reference block and the neighborhood block to obtain a first encoding result; and encoding the current block by using the original prediction mode of the current block to obtain a second encoding result.
Specifically, if the prediction mode of the reference block is the same as the prediction mode of the neighborhood blocks, the current block is encoded using that shared prediction mode to obtain a first encoding result. The original prediction mode of the current block is calculated by rate-distortion cost calculation, and the current block is encoded using the original prediction mode to obtain a second encoding result.
In one embodiment, the original prediction mode of the current block obtained by rate-distortion cost calculation is the intra-frame prediction mode. In that case, in response to the prediction mode of the reference block being the inter-frame prediction mode and the prediction mode of the neighborhood blocks being the inter-frame prediction mode, the current block is encoded using the inter-frame prediction mode to obtain a first encoding result, and using the intra-frame prediction mode to obtain a second encoding result. In response to the prediction mode of the reference block being the skip prediction mode and the prediction mode of the neighborhood blocks being the skip prediction mode, the current block is encoded using the skip prediction mode to obtain a first encoding result, and using the intra-frame prediction mode to obtain a second encoding result.
In one embodiment, the original prediction mode of the current block obtained by rate-distortion cost calculation is the inter-frame prediction mode. In that case, in response to the prediction mode of the reference block being the intra-frame prediction mode and the prediction mode of the neighborhood blocks being the intra-frame prediction mode, the current block is encoded using the intra-frame prediction mode to obtain a first encoding result, and using the inter-frame prediction mode to obtain a second encoding result. In response to the prediction mode of the reference block being the skip prediction mode and the prediction mode of the neighborhood blocks being the skip prediction mode, the current block is encoded using the skip prediction mode to obtain a first encoding result, and using the inter-frame prediction mode to obtain a second encoding result.
If the original prediction mode of the current block obtained by rate-distortion cost calculation is the skip prediction mode, then in response to the prediction mode of the reference block being the inter-frame prediction mode and the prediction mode of the neighborhood blocks being the inter-frame prediction mode, the current block is encoded using the inter-frame prediction mode to obtain a first encoding result, and using the skip prediction mode to obtain a second encoding result. In response to the prediction mode of the reference block being the intra-frame prediction mode and the prediction mode of the neighborhood blocks being the intra-frame prediction mode, the current block is encoded using the intra-frame prediction mode to obtain a first encoding result, and using the skip prediction mode to obtain a second encoding result.
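The dual-encoding step common to all the cases above can be sketched as follows; `encode` stands in for a trial encoding that returns the block's coding cost (a hypothetical callable, not an API named in the patent):

```python
def dual_encode(block, consensus_mode, original_mode, encode):
    """Encode the current block twice: once with the mode shared by
    the reference and neighborhood blocks (first encoding result)
    and once with the rate-distortion mode (second encoding result)."""
    first = encode(block, consensus_mode)
    second = encode(block, original_mode)
    return first, second
```

Both results are kept so that the next step can compare them rather than trusting either mode blindly.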
Step S23: the prediction mode of the current block is determined from the prediction mode of the reference block, the prediction mode of the neighbor block, and the original prediction mode based on the first encoding result and the second encoding result.
Specifically, the prediction mode suitable for the current block is selected from the prediction mode of the reference block and the neighborhood blocks and the original prediction mode based on the first encoding result and the second encoding result, so that the best prediction mode can be chosen for the current block, reducing the probability of blocking artifacts while avoiding the error-propagation problem.
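Under the assumption that the two encoding results are compared by a scalar cost (the patent does not fix the comparison criterion), the final selection of step S23 might be:

```python
def select_mode(consensus_mode, original_mode, first_cost, second_cost):
    """Keep the consensus mode only when its trial encoding is no
    worse than the rate-distortion mode's; this reduces blocking
    while preventing a poor consensus choice from propagating to
    later blocks.  The '<=' tie-break toward the consensus mode
    is an assumption."""
    return consensus_mode if first_cost <= second_cost else original_mode
```

For example, a cheaper first result keeps the consensus mode, while a cheaper second result restores the original rate-distortion mode.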
Fig. 3 is a schematic structural diagram of an encoding device according to an embodiment of the present invention, which specifically includes a mode determining module 31 and an encoding module 32.
The mode determining module 31 determines the prediction mode of the current block based on the spatio-temporal information of the current block: the reference block of the current block carries the temporal information, and the neighborhood blocks of the current block carry the spatial information. This improves the rationality and accuracy of the prediction-mode correction.
Specifically, the mode determining module 31 is configured to determine the prediction mode of the current block based on the prediction mode of the reference block of the current block and/or the prediction modes of the neighborhood blocks of the current block. The reference block of the current block is the block at the position corresponding to the current block in the previous frame of image, or the original matching block of the current block; specifically, the original matching block of the current block is the block with the minimum cost after rate-distortion cost comparison. The neighborhood blocks of the current block are a preset number of blocks centered on the current block, and the current block belongs to a non-motion region.
In an embodiment, the mode determination module 31 determines that the prediction mode of the current block is the same as the prediction modes of the reference block and the neighbor block when the prediction mode of the reference block is the same as the prediction mode of the neighbor block.
Specifically, in response to the prediction mode of the reference block being the inter-frame prediction mode and the prediction mode of the neighborhood blocks being the inter-frame prediction mode, the prediction mode of the current block is determined to be the inter-frame prediction mode; in response to the prediction mode of the reference block being the skip prediction mode and the prediction mode of the neighborhood blocks being the skip prediction mode, the prediction mode of the current block is determined to be the skip prediction mode; and in response to the prediction mode of the reference block being the intra-frame prediction mode and the prediction mode of the neighborhood blocks being the intra-frame prediction mode, the prediction mode of the current block is determined to be the intra-frame prediction mode.
In an embodiment, the mode determining module 31 is configured to calculate the prediction mode of the current block by using a calculation method of a distortion cost when the prediction mode of the reference block is different from the prediction mode of the neighbor block.
In an embodiment, the mode determining module 31 is configured to, when the prediction mode of the reference block is the same as that of the neighborhood block, encode the current block using the shared prediction mode of the reference block and the neighborhood block to obtain a first encoding result, and encode the current block using the original prediction mode of the current block to obtain a second encoding result; the original prediction mode is the prediction mode of the current block calculated using the rate-distortion cost.
Specifically, in response to the original prediction mode being an intra-frame prediction mode: in response to the prediction mode corresponding to the reference block being an inter-frame prediction mode and the prediction mode corresponding to the neighborhood block being an inter-frame prediction mode, the current block is encoded using the inter-frame prediction mode to obtain a first encoding result, and encoded using the intra-frame prediction mode to obtain a second encoding result; in response to the prediction mode corresponding to the reference block being a skip prediction mode and the prediction mode corresponding to the neighborhood block being a skip prediction mode, the current block is encoded using the skip prediction mode to obtain a first encoding result, and encoded using the intra-frame prediction mode to obtain a second encoding result.
In response to the original prediction mode being an inter-frame prediction mode: in response to the prediction mode corresponding to the reference block being an intra-frame prediction mode and the prediction mode corresponding to the neighborhood block being an intra-frame prediction mode, the current block is encoded using the intra-frame prediction mode to obtain a first encoding result, and encoded using the inter-frame prediction mode to obtain a second encoding result; in response to the prediction mode corresponding to the reference block being a skip prediction mode and the prediction mode corresponding to the neighborhood block being a skip prediction mode, the current block is encoded using the skip prediction mode to obtain a first encoding result, and encoded using the inter-frame prediction mode to obtain a second encoding result.
In response to the original prediction mode being a skip prediction mode: in response to the prediction mode corresponding to the reference block being an inter-frame prediction mode and the prediction mode corresponding to the neighborhood block being an inter-frame prediction mode, the current block is encoded using the inter-frame prediction mode to obtain a first encoding result, and encoded using the skip prediction mode to obtain a second encoding result; in response to the prediction mode corresponding to the reference block being an intra-frame prediction mode and the prediction mode corresponding to the neighborhood block being an intra-frame prediction mode, the current block is encoded using the intra-frame prediction mode to obtain a first encoding result, and encoded using the skip prediction mode to obtain a second encoding result.
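The two-pass comparison described in the preceding paragraphs can be sketched as follows. The encoder interface and the cost-based comparison of the two encoding results are assumptions, since the patent does not specify how the first and second encoding results are compared; all names here are hypothetical:

```python
# Hypothetical sketch: encode the current block twice and keep the cheaper
# result. `encode(mode)` stands in for encoding the current block in `mode`
# and returning a cost figure; this interface is an assumption.

def choose_mode(ref_mode, neighbor_mode, original_mode, encode):
    """Select the current block's prediction mode.

    If the reference and neighborhood modes disagree, fall back to the
    original (rate-distortion-derived) mode. If they agree but differ from
    the original mode, try both candidates and keep the cheaper one.
    """
    if ref_mode != neighbor_mode:
        return original_mode          # fall back to the RD-derived mode
    if ref_mode == original_mode:
        return original_mode          # all three agree; nothing to compare
    first_cost = encode(ref_mode)     # first encoding result
    second_cost = encode(original_mode)  # second encoding result
    return ref_mode if first_cost <= second_cost else original_mode
```

For instance, with `encode = lambda m: {"inter": 10, "intra": 20}[m]`, agreeing inter-mode neighbors beat an intra original mode because the first encoding result is cheaper.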
The mode determining module 31 is configured to determine a prediction mode of the current block from prediction modes of the reference block and the neighbor block and an original prediction mode based on the first encoding result and the second encoding result.
The encoding module 32 encodes the current block based on the prediction mode of the current block.
The encoding apparatus of this embodiment determines the prediction mode of the current block according to the reference block and the neighborhood block of the current block, thereby reducing blocking artifacts. Specifically, if the prediction modes of the reference block and the neighborhood block of the current block are the same, the prediction mode of the current block is set to that same mode, which reduces the occurrence of blocking artifacts.
The encoding apparatus of this embodiment determines the prediction mode suitable for the current block from among the prediction mode of the reference block, the prediction mode of the neighborhood block, and the original prediction mode based on the first encoding result and the second encoding result, so that an optimal prediction mode can be selected for the current block, reducing the probability of blocking artifacts while avoiding error propagation.
Referring to fig. 4, a schematic structural diagram of an electronic device according to an embodiment of the present invention is shown, where the electronic device includes a memory 202 and a processor 201 that are connected to each other.
The memory 202 is used to store program instructions implementing the method of any of the above.
The processor 201 is used to execute program instructions stored by the memory 202.
The processor 201 may also be referred to as a Central Processing Unit (CPU). The processor 201 may be an integrated circuit chip having signal processing capabilities. The processor 201 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 202 may be a memory bank, a TF card, or the like, and can store all information in the electronic device, including input raw data, computer programs, intermediate operation results, and final operation results. It stores and retrieves information based on the location specified by the controller. Only with the memory can the electronic device have a memory function and operate normally. Memories in electronic devices are classified by purpose into main memory (internal memory) and auxiliary memory (external memory). The external memory is usually a magnetic medium, an optical disc, or the like, which can store information for a long period of time. The internal memory refers to the storage component on the main board, which holds the data and programs currently being executed; it stores programs and data only temporarily, and the data is lost when the power is turned off.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application.
Please refer to fig. 5, which is a schematic structural diagram of a computer-readable storage medium according to the present invention. The storage medium of the present application stores a program file 203 capable of implementing all the methods described above, wherein the program file 203 may be stored in the storage medium in the form of a software product, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a mobile hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc, or terminal devices such as a computer, a server, a mobile phone, or a tablet.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (12)
1. A method of encoding, comprising:
determining a prediction mode of a current block based on a prediction mode of a reference block of the current block and a prediction mode of a neighborhood block of the current block; wherein the reference block of the current block is a block at a position corresponding to the current block in a previous frame image, or an original matching block of the current block, and the neighborhood blocks of the current block are a preset number of blocks centered on the current block;
encoding the current block based on a prediction mode of the current block.
2. The method of claim 1, wherein the current block is a non-motion region.
3. The method of claim 1, wherein the step of determining the prediction mode of the current block based on the prediction mode of the reference block of the current block and the prediction mode of the neighbor block of the current block comprises:
in response to the prediction mode of the reference block being the same as the prediction mode of the neighborhood block, determining the prediction mode of the current block to be the same as the prediction modes of the reference block and the neighborhood block.
4. The method according to claim 3, wherein said step of, in response to the prediction mode of the reference block being the same as the prediction mode of the neighbor block, the prediction mode of the current block being the same as the prediction modes of the reference block and the neighbor block comprises:
in response to the prediction mode corresponding to the reference block being an inter-frame prediction mode and the prediction mode corresponding to the neighborhood block being an inter-frame prediction mode, determining the prediction mode of the current block to be the inter-frame prediction mode;
in response to the prediction mode corresponding to the reference block being a skip prediction mode and the prediction mode corresponding to the neighborhood block being a skip prediction mode, determining the prediction mode of the current block to be the skip prediction mode;
and in response to the prediction mode corresponding to the reference block being an intra-frame prediction mode and the prediction mode corresponding to the neighborhood block being an intra-frame prediction mode, determining the prediction mode of the current block to be the intra-frame prediction mode.
5. The method of claim 3, wherein the step of determining the prediction mode of the current block based on the prediction mode of the reference block of the current block and the prediction mode of the neighbor block of the current block comprises:
and in response to the prediction mode of the reference block being different from that of the neighborhood block, calculating the prediction mode of the current block by using a calculation method of a distortion cost.
6. The method of claim 3, wherein the step of determining the prediction mode of the current block based on the prediction mode of the reference block of the current block and the prediction mode of the neighbor block of the current block comprises:
in response to the prediction mode of the reference block being the same as that of the neighborhood block, encoding the current block by using the prediction modes of the reference block and the neighborhood block to obtain a first encoding result, and encoding the current block by using the original prediction mode of the current block to obtain a second encoding result; wherein the original prediction mode is the prediction mode of the current block obtained by rate-distortion cost calculation;
determining a prediction mode of the current block from prediction modes of the reference block and the neighbor block and the original prediction mode based on the first encoding result and the second encoding result.
7. The method of claim 6, wherein in response to the original prediction mode being an intra prediction mode, the method comprises:
in response to the prediction mode corresponding to the reference block being an inter-frame prediction mode and the prediction mode corresponding to the neighborhood block being an inter-frame prediction mode, encoding the current block by using the inter-frame prediction mode to obtain a first encoding result, and encoding the current block by using the intra-frame prediction mode to obtain a second encoding result;
and in response to the prediction mode corresponding to the reference block being a skip prediction mode and the prediction mode corresponding to the neighborhood block being a skip prediction mode, encoding the current block by using the skip prediction mode to obtain a first encoding result, and encoding the current block by using the intra-frame prediction mode to obtain a second encoding result.
8. The method of claim 6, wherein in response to the original prediction mode being an inter prediction mode, the method comprises:
in response to the prediction mode corresponding to the reference block being an intra-frame prediction mode and the prediction mode corresponding to the neighborhood block being an intra-frame prediction mode, encoding the current block by using the intra-frame prediction mode to obtain a first encoding result, and encoding the current block by using the inter-frame prediction mode to obtain a second encoding result;
and in response to the prediction mode corresponding to the reference block being a skip prediction mode and the prediction mode corresponding to the neighborhood block being a skip prediction mode, encoding the current block by using the skip prediction mode to obtain a first encoding result, and encoding the current block by using the inter-frame prediction mode to obtain a second encoding result.
9. The method of claim 6, wherein in response to the original prediction mode being a skip prediction mode, the method comprises:
in response to the prediction mode corresponding to the reference block being an inter-frame prediction mode and the prediction mode corresponding to the neighborhood block being an inter-frame prediction mode, encoding the current block by using the inter-frame prediction mode to obtain a first encoding result, and encoding the current block by using the skip prediction mode to obtain a second encoding result;
and in response to the prediction mode corresponding to the reference block being an intra-frame prediction mode and the prediction mode corresponding to the neighborhood block being an intra-frame prediction mode, encoding the current block by using the intra-frame prediction mode to obtain a first encoding result, and encoding the current block by using the skip prediction mode to obtain a second encoding result.
10. An encoding apparatus, comprising:
a mode determination module, configured to determine a prediction mode of a current block based on a prediction mode of a reference block of the current block and/or a prediction mode of a neighborhood block of the current block; wherein the reference block of the current block is a block at a position corresponding to the current block in a previous frame image, or an original matching block of the current block, and the neighborhood blocks of the current block are a preset number of blocks centered on the current block;
an encoding module that encodes the current block based on a prediction mode of the current block.
11. An electronic device, comprising: a memory storing program instructions and a processor retrieving the program instructions from the memory to perform the method of any of claims 1-9.
12. A computer-readable storage medium, characterized in that a program file is stored, which program file can be executed to implement the method according to any one of claims 1-9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110674333.2A CN113596452B (en) | 2021-06-17 | 2021-06-17 | Encoding method, encoding device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113596452A true CN113596452A (en) | 2021-11-02 |
CN113596452B CN113596452B (en) | 2023-03-24 |
Family
ID=78244081
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110674333.2A Active CN113596452B (en) | 2021-06-17 | 2021-06-17 | Encoding method, encoding device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113596452B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117596392A (en) * | 2023-09-28 | 2024-02-23 | 书行科技(北京)有限公司 | Coding information determining method of coding block and related product |
CN117596392B (en) * | 2023-09-28 | 2024-10-22 | 书行科技(北京)有限公司 | Coding information determining method of coding block and related product |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101573985A (en) * | 2006-11-03 | 2009-11-04 | 三星电子株式会社 | Method and apparatus for video predictive encoding and method and apparatus for video predictive decoding |
CN101677406A (en) * | 2008-09-19 | 2010-03-24 | 华为技术有限公司 | Method and apparatus for video encoding and decoding |
CN101873500A (en) * | 2009-04-24 | 2010-10-27 | 华为技术有限公司 | Interframe prediction encoding method, interframe prediction decoding method and equipment |
CN103220506A (en) * | 2012-01-19 | 2013-07-24 | 华为技术有限公司 | Method and equipment of coding and decoding |
US20130287094A1 (en) * | 2010-12-28 | 2013-10-31 | Sk Telecom Co., Ltd. | Method and device for encoding/decoding image using feature vectors of surrounding blocks |
CN103384326A (en) * | 2013-03-20 | 2013-11-06 | 张新安 | Quick intra-frame prediction mode selection method for AVS-M video coding |
CN108307088A (en) * | 2017-10-09 | 2018-07-20 | 腾讯科技(深圳)有限公司 | A kind of image processing method, device, system and storage medium |
CN109005412A (en) * | 2017-06-06 | 2018-12-14 | 北京三星通信技术研究有限公司 | The method and apparatus that motion vector obtains |
CN111919447A (en) * | 2018-03-14 | 2020-11-10 | 韩国电子通信研究院 | Method and apparatus for encoding/decoding image and recording medium storing bitstream |
CN112218076A (en) * | 2020-10-17 | 2021-01-12 | 浙江大华技术股份有限公司 | Video coding method, device and system and computer readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113596452B (en) | 2023-03-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12058382B2 (en) | Systems and methods for reducing blocking artifacts | |
US20060013320A1 (en) | Methods and apparatus for spatial error concealment | |
AU2012206839B2 (en) | Image coding and decoding method, image data processing method and devices thereof | |
KR20140110008A (en) | Object detection informed encoding | |
US12022070B2 (en) | Video encoding and decoding methods and apparatuses, electronic device, and storage medium | |
JP2024099733A (en) | Prediction method and device for decoding, and computer storage medium | |
CN113597757A (en) | Shape adaptive discrete cosine transform with region number adaptive geometric partitioning | |
US20240015310A1 (en) | Multimedia data processing method, apparatus, device, computer-readable storage medium, and computer program product | |
US20240163470A1 (en) | Method for inter prediction method, video picture encoder and decoder | |
CN113596452B (en) | Encoding method, encoding device, electronic equipment and storage medium | |
CN113382249A (en) | Image/video encoding method, apparatus, system, and computer-readable storage medium | |
CN109544591B (en) | Motion estimation method and device, electronic equipment and storage medium | |
CN109561315B (en) | Motion estimation method and device, electronic equipment and storage medium | |
US7995653B2 (en) | Method for finding the prediction direction in intraframe video coding | |
CN104159106A (en) | Video encoding method and device, and video decoding method and device | |
US20210235084A1 (en) | Picture block splitting method and apparatus | |
US11350088B2 (en) | Intra prediction method and apparatus, and computer-readable storage medium | |
CN112738524B (en) | Image encoding method, image encoding device, storage medium, and electronic apparatus | |
WO2024216415A1 (en) | Video encoding method, video decoding method, decoder, encoder and computer-readable storage medium | |
AU2015268694B2 (en) | Image coding and decoding method, image data processing method, and devices thereof | |
CN113259669A (en) | Encoding method, encoding device, electronic equipment and computer readable storage medium | |
CN118233641A (en) | Block effect detection method and device, block effect processing method and device, and video encoder | |
CN117591051A (en) | Screen image coding method, device, electronic equipment and storage medium | |
CN115720269A (en) | Video encoding method and video decoding method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |