WO2010029850A1 - Image encoding device, image decoding device, image encoding method, image decoding method, and program - Google Patents

Image encoding device, image decoding device, image encoding method, image decoding method, and program

Info

Publication number
WO2010029850A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
encoding
area
image
processing order
Prior art date
Application number
PCT/JP2009/064825
Other languages
English (en)
Japanese (ja)
Inventor
達治 森吉
Original Assignee
日本電気株式会社 (NEC Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 (NEC Corporation)
Priority to JP2010528701A priority Critical patent/JP5246264B2/ja
Publication of WO2010029850A1 publication Critical patent/WO2010029850A1/fr


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/436 Coding/decoding characterised by implementation details or hardware specially adapted for video compression or decompression, using parallelised computational arrangements
    • H04N19/103 Adaptive coding characterised by the element, parameter or selection affected or controlled: selection of coding mode or of prediction mode
    • H04N19/109 Selection of coding mode or of prediction mode among a plurality of temporal predictive coding modes
    • H04N19/174 Adaptive coding characterised by the coding unit, the unit being an image region, the region being a slice, e.g. a line of blocks or a group of blocks
    • H04N19/176 Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N19/61 Transform coding in combination with predictive coding

Definitions

  • The present invention relates to an image encoding device, an image decoding device, an image encoding method, an image decoding method, an image encoding processing program, and an image decoding processing program, and more particularly to an image encoding apparatus that can prevent a decrease in encoding efficiency and image quality when encoding an image by parallel processing.
  • H.264, standardized by the ITU (International Telecommunication Union), is a technique for generating encoded data by encoding a moving image signal at a low bit rate, a high compression rate, and high image quality, and for decoding the encoded moving image.
  • MPEG-1, MPEG-2, MPEG-4, and the like, standardized by the ISO (International Organization for Standardization), are also widely used as international standards.
  • H.264 is a standard jointly developed by the ITU and the ISO (Non-Patent Document 1). H.264 is known to achieve further improvements in compression efficiency and image quality compared with conventional moving image encoding technology.
  • These video coding methods perform information compression by exploiting the correlation in texture and motion between a pixel block to be coded and nearby blocks in the screen.
  • In MPEG-1/2 and the like, vector prediction from nearby motion vector information is used.
  • H.263 and MPEG-4 further introduce spatial prediction (AC/DC prediction) of the DCT coefficients of motion-compensated residual images.
  • H.264 employs spatial prediction at the pixel value level (intra prediction), which contributes to improved coding efficiency.
  • In these standards, the encoding order of pixel blocks in the screen is fixed: encoding starts from the upper left of the screen and proceeds to the right; when the right end of the screen is reached, it moves down one block row and proceeds to the right again. A block of 16×16 pixels called a macroblock (hereinafter, MB) is used as the unit of encoding, and encoding is performed in raster-scan order in MB units as described above.
  • MB macro block
  • FIG. 16 shows an example of the operation of the related-art encoding method, for a moving image in which one screen consists of N×M MBs. As shown in FIG. 16, each MB in the screen is encoded in raster-scan order. Because encoding proceeds in this order, the MBs adjacent to the upper left, top, upper right, and left of an MB being encoded have already been encoded, so spatial prediction using the encoded information of these adjacent MBs becomes possible.
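The raster-scan order and the already-encoded neighbor set it implies can be sketched as follows (a minimal Python illustration; the function names are ours, not from the patent):

```python
def raster_scan_order(n_cols, n_rows):
    """Yield MB coordinates in raster-scan order: left to right
    within an MB row, rows from top to bottom."""
    for row in range(n_rows):
        for col in range(n_cols):
            yield (col, row)

def encoded_neighbors(col, row, n_cols):
    """Neighbors (upper-left, upper, upper-right, left) that are
    already encoded when (col, row) is reached in raster-scan order.
    At the top or left edge of the screen, some do not exist."""
    candidates = [(col - 1, row - 1), (col, row - 1),
                  (col + 1, row - 1), (col - 1, row)]
    return [(c, r) for c, r in candidates if 0 <= c < n_cols and r >= 0]
```

Note that `encoded_neighbors` returns an empty list for the upper-left MB, which is why spatial prediction degrades along the screen edges, as discussed below.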
  • FIG. 17 shows an example of motion vector spatial prediction in the H.264 system.
  • The MBs adjacent to the upper left, top, upper right, and left of the MB being encoded are already encoded, and their motion vectors 401 to 404 are fixed. The predicted motion vector 405 of the MB being encoded is therefore obtained by calculating the median (intermediate value) of the motion vectors 402, 403, and 404 of the adjacent MBs.
  • The difference between the motion vector to be encoded and the predicted motion vector 405 is encoded.
  • For MBs located at the top or left edge of the screen, some of the adjacent MBs (upper left, top, upper right, left) do not exist, and a zero vector or the like is substituted when calculating the predicted motion vector.
  • Consequently, the predicted motion vector of an MB located at the top or left edge of the screen has lower spatial prediction accuracy than usual, and the information-reduction effect often decreases.
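The median prediction just described can be illustrated as follows (a simplified Python sketch; the actual H.264 rule has additional special cases, e.g. for single available neighbors and partition sizes):

```python
def median(a, b, c):
    """Intermediate value of three numbers."""
    return sorted((a, b, c))[1]

def predict_mv(mv_left, mv_up, mv_upright):
    """Component-wise median of the left, upper, and upper-right
    neighbors' motion vectors. Missing neighbors (None), as at the
    top or left edge of the screen, are replaced by the zero vector,
    which lowers prediction accuracy there."""
    vs = [mv if mv is not None else (0, 0)
          for mv in (mv_left, mv_up, mv_upright)]
    return (median(vs[0][0], vs[1][0], vs[2][0]),
            median(vs[0][1], vs[1][1], vs[2][1]))
```

The encoder then transmits only the difference between the actual motion vector and this prediction.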
  • FIG. 18 illustrates an example of luminance-component intra prediction when the MB prediction mode is Intra_16x16 and the prediction mode is Intra_16x16_Vertical (vertical prediction).
  • The MBs adjacent to the upper left, top, upper right, and left of the MB being encoded are already encoded, and their pixel values are fixed. In vertical prediction, the pixel values in the bottom row of the upper adjacent MB are used as the predicted pixel values for the pixels at the same horizontal positions in the MB being encoded.
  • The difference between each pixel value to be encoded and its predicted pixel value is encoded.
  • The amount of information to be encoded can thus be reduced by encoding the difference from the predicted pixel value calculated by intra prediction.
  • However, the predicted pixel values of MBs located at the top or left edge of the screen often have lower spatial prediction accuracy than usual, and the information-reduction effect is often reduced.
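The vertical prediction and difference encoding just described can be sketched as follows (illustrative Python; a real encoder operates on clipped 8-bit samples and also signals the prediction mode):

```python
def intra16x16_vertical(upper_row):
    """Intra_16x16_Vertical prediction: every row of the 16x16 target
    MB is predicted from the bottom-row pixels of the MB above, so
    each column repeats that column's upper-neighbor pixel value."""
    assert len(upper_row) == 16
    return [list(upper_row) for _ in range(16)]

def residual(block, pred):
    """Difference values actually encoded: block minus prediction."""
    return [[b - p for b, p in zip(brow, prow)]
            for brow, prow in zip(block, pred)]
```

When the MB content continues the vertical structure of the MB above, the residual is small and cheap to encode; at the top edge of the screen no upper neighbor exists and this gain is lost.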
  • In total, the M + (N − 1) MBs located along the top and left edges of the screen have poor spatial prediction efficiency, and the information-reduction effect is often diminished.
  • Typical high-definition moving image formats include Hi-Vision (1920×1080 pixels), 4K×2K (4096×2048 pixels), and 8K×4K (8192×4096 pixels). Since the amount of computation required to encode and decode such high-definition video is very large, there is strong demand for parallel processing using multiple processors. To parallelize encoding and decoding of moving images, a screen is divided into partial areas called slices, and each slice is processed independently in parallel.
  • FIG. 19 shows an example of the related-art parallel processing operation by slice division, with the screen divided into four slices. When the screen is divided into slices, each slice is encoded independently, with no dependency on the coding results of other slices, so the encoding of the slices can be executed completely in parallel. Within each slice, encoding proceeds in raster-scan order in MB units as usual.
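The slice-parallel scheme of FIG. 19 can be sketched as follows (illustrative Python; `encode_row` stands in for the actual per-MB encoding of one MB row):

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_slices(n_mb_rows, n_slices):
    """Divide the MB rows of a screen into horizontal slices of
    near-equal size, as in related-art slice-parallel coding."""
    base, rem = divmod(n_mb_rows, n_slices)
    slices, start = [], 0
    for i in range(n_slices):
        size = base + (1 if i < rem else 0)
        slices.append(range(start, start + size))
        start += size
    return slices

def encode_slices_in_parallel(slices, encode_row):
    """Each slice is coded independently: no slice references another
    slice's coding results, so all slices can run concurrently."""
    def encode_slice(rows):
        return [encode_row(r) for r in rows]
    with ThreadPoolExecutor(max_workers=len(slices)) as pool:
        return list(pool.map(encode_slice, slices))
```

The independence that makes this parallelism possible is exactly what breaks spatial prediction at slice boundaries, which is the problem the invention addresses.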
  • Patent Document 1 and Patent Document 2 disclose techniques that suppress image-quality loss near slice boundaries, for example by controlling the quantization parameter to be smaller near the slice boundary than in other regions (paragraph 0091 of Patent Document 1), so that the boundary is encoded with relatively high image quality and the occurrence of unnatural coding noise is reduced.
  • In Patent Document 1, the rate control circuit of the encoding device assigns a smaller quantization parameter QP1, obtained by multiplying the quantization parameter QP by a weighting factor Q, to macroblocks located at a slice boundary.
  • A larger amount of code is thus allocated to such macroblocks, and quantization is performed more finely. Therefore, even if a macroblock and an adjacent macroblock belong to different slices and the macroblock is encoded using substitute data in place of the adjacent macroblock's data, the lossless encoding circuit can suppress the image-quality degradation caused by encoding that macroblock.
  • the frame memory outputs the image data of each area to the encoding circuit.
  • The encoding region detection circuit detects an end region for which the motion compensation range is set outside the region, and outputs a region flag generated at the timing immediately before the end-region timing.
  • the code amount control circuit increases the quantization width set in the encoding circuit when the region flag is generated.
  • The code amount control circuit outputs quantization width data that sets a large quantization width when the usage amount or the cumulative usage amount is large, and quantization width data that sets a small quantization width when the usage amount or the cumulative usage amount is small.
  • The allocated code amount is reduced during the addition period, i.e. the region-flag generation period, so that the code amount that can be allocated to the end region is increased.
  • An object of the present invention is to solve the above problems of the related art and to provide an image encoding device, an image decoding device, an image encoding method, an image decoding method, an image encoding processing program, and an image decoding processing program capable of effectively preventing a decrease in encoding efficiency and image quality when encoding an image by parallel processing.
  • To this end, the image encoding apparatus of the present invention includes encoding means for encoding an input image to generate a bitstream, and encoding control means for controlling the encoding means so as to control the encoding order within the image. The encoding control means divides the image into regions composed of blocks, sets the shape of each region, and sets a region processing order for each region.
  • The image decoding apparatus of the present invention includes decoding means for decoding an input bitstream to generate a decoded image, and decoding control means for controlling the decoding means so as to control the decoding order of the bitstream. The decoding control means acquires the shape information of the regions, set by dividing the image into regions composed of blocks, and the region processing order set for each region, and controls decoding so that the bitstream corresponding to each region is decoded based on the region processing order and the region shape information. When the regions are decoded according to their region processing orders, at least one region is decoded after the regions whose region processing order has higher priority than its own.
  • The image encoding method of the present invention is an image encoding method using an image encoding device, in which an input image is divided into regions composed of blocks, the shape of each region is set, a region processing order is set for each region, and each region is encoded based on the region processing order and the region shape information to generate a bitstream. The encoding control enforces a first condition, under which encoded information cannot be mutually used between regions having the same region processing order, and a second condition, under which at least one region may use the encoded information of regions whose region processing order has higher priority than its own.
  • The image decoding method of the present invention is an image decoding method using an image decoding device, in which the shape information of the regions, set by dividing the image into regions composed of blocks, and the region processing order set for each region are input; the region processing order and the region shape information are acquired from the input bitstream and analyzed; and decoding is controlled so that the bitstream corresponding to each region is decoded based on the region processing order and the region shape information to generate a decoded image. When the regions are decoded according to their region processing orders, at least one region is decoded after the regions whose region processing order has higher priority than its own.
  • The image encoding processing program of the present invention is a program for controlling image encoding that causes a computer to execute: a function of dividing an input image into regions composed of blocks, setting the shape of each region, and setting a region processing order for each region; and a function of controlling encoding so that each region is encoded based on the region processing order and the region shape information to generate a bitstream, under a first condition that encoded information cannot be mutually used between regions having the same region processing order and a second condition that at least one region may use the encoded information of regions whose region processing order has higher priority than its own.
  • The image decoding processing program of the present invention is a program for controlling decoding of an encoded image that causes a computer to execute: a function of acquiring and analyzing, from a bitstream, the shape information of the regions set by dividing the image into regions composed of blocks and the region processing order set for each region; and a function of controlling decoding so that the bitstream corresponding to each region is decoded based on the region processing order and the region shape information to generate a decoded image, wherein, when the regions are decoded according to their region processing orders, at least one region is decoded after the regions whose region processing order has higher priority than its own.
  • According to the present invention, when encoding an image by parallel processing, the occurrence of unnatural coding noise and the decrease in coding efficiency at slice boundaries can be effectively and reliably eliminated, and the image encoding process can be realized efficiently and with high accuracy.
  • The image encoding device 1 of the present invention includes an encoding module 10b as encoding means for encoding an input image and generating a bitstream, and an encoding control module 10a as encoding control means for controlling the encoding means so as to control the encoding order within the input image.
  • The encoding control module 10a divides the image into regions composed of blocks, sets the shape of each region, and sets a region processing order for each region. As control conditions for the regions, it has the function of controlling encoding so as to satisfy a first condition, under which encoded information cannot be mutually used between regions having the same region processing order, and a second condition, under which at least one region may use the encoded information of regions whose region processing order has higher priority than its own.
  • For example, the encoding control module 10a sets the region shapes (AR61, AR62-1 to AR62-6) and the region processing orders (level 0, level 1), and the encoding module 10b encodes the level 0 region AR61 first and then the level 1 regions (AR62-1 to AR62-6).
  • Between regions having the same region processing order (level 1), for example region AR62-1 and region AR62-2, the encoding control module 10a controls encoding so that the regions cannot mutually reference or use each other's encoding information. In this case, there is no dependency of encoding information between these regions when performing spatial prediction.
  • For a region with one region processing order, for example region AR62-1 (level 1), the encoding control module 10a controls encoding so that it references and uses the encoding information of the region AR61, whose region processing order (level 0) has higher priority. In this case, there is a dependency of encoding information between the regions when performing spatial prediction.
  • By performing encoding control under these control conditions, the encoding control module 10a can arbitrarily set and control the regions, including the presence or absence of dependencies. In other words, the encoding control module 10a sets both region boundaries that have a dependency and region boundaries that have none. At a region boundary with a dependency, neither facing region contains a portion with low spatial prediction efficiency, so coding noise at that region boundary is eliminated.
  • A region only needs to consist of "blocks". The unit of a "block" may be a macroblock or a pixel block. A "block" may also be a pixel block group, and this pixel block group may or may not be equivalent to a so-called macroblock.
  • The encoding control module 10a divides the image into a first region including at least the parallel-processing image division boundary (for example, region AR11 in FIG. 6) and other second regions (for example, regions AR21 and AR22 in FIG. 6), and has the function of controlling encoding so that, according to the region processing order assigned to each region (for example, level 0, level 1), at least the first region is encoded before the second regions, and the encoding information of the first region is used when encoding the second regions.
  • Accordingly, when encoding an image by parallel processing, side effects such as unnatural coding noise at slice boundaries and reduced coding efficiency are avoided, and the encoding process can be realized efficiently and with high accuracy.
  • In related-art slice division, by contrast, spatial prediction of the encoded information is discontinuous at slice boundaries, so the above problems cannot be avoided.
  • In the present invention, the first region, which includes at least the parallel-processing image division boundary, is encoded first. Because the second regions are encoded using the encoding information of the first region in spatial prediction, intra prediction and motion vectors retain spatial continuity. The coding noise caused by discontinuity at slice boundaries, which is the problem of related-art slice division, therefore does not occur.
  • FIG. 1 is a block diagram showing an example of an overall schematic configuration of the image encoding device according to the first embodiment of the present invention.
  • The image encoding device 1 includes an input unit 11 for inputting an image one frame at a time, an input image memory 12 that stores the image input from the input unit 11, and, in the encoding control module 10a described above, a division level assignment unit 13 (setting control means) that divides the image for one screen into partial regions and assigns the encoding processing order (region processing order) to each region as a level value (Level), and an encoding order control unit 14 (first encoding order control means) that controls the encoding processing order of the regions based on the level values.
  • The encoding module 10b of the image encoding device 1 has one or more encoding units 15 that encode the divided image of each region in units of pixel blocks, and the encoding control module 10a includes a scanning unit 16 (second encoding order control means) that controls the pixel block encoding order within each region (for example, raster-scan order).
  • The image encoding device 1 further includes an encoded image memory 17 that stores encoded images, i.e. images in macroblock units encoded by the encoding unit 15 that are referenced during subsequent encoding, and an output unit 18 that outputs the bitstream generated by the encoding unit 15.
  • The division level assignment unit 13 divides the input image supplied from the input image memory 12 into a plurality of partial regions and assigns the encoding processing order of each region as a level value based on the spatial prediction dependencies between the regions.
  • A level value of 0 has the highest priority, meaning that the region is encoded first.
  • The division level assignment unit 13 assigns level values so that an MB belonging to a region with level value 1 (level 1) may use the encoded information of MBs belonging to regions with level value 0 (level 0) for spatial prediction. For this reason, encoding of a level 1 region can start only after all encoding of the level 0 regions has been completed.
  • More generally, for level values 0, 1, ..., N−1, N (N is a natural number), the division level assignment unit 13 assigns level values so that an MB belonging to a region with level value N (level N) may use the encoded information of MBs belonging to regions with level value N−1 (level N−1) or lower for spatial prediction. For this reason, encoding of a level N region can start only after encoding of the regions of level N−1 and lower has been completed.
  • The division level assignment unit 13 also assigns level values so that there is no dependency between different regions having the same level. Different regions with the same level can therefore be encoded independently.
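The level-value scheduling rule above can be sketched as follows (illustrative Python; the region identifiers follow the AR6x naming used earlier in this description):

```python
def schedule_by_level(regions):
    """regions: dict mapping region id -> level value.
    Returns the processing waves: all level-0 regions first, then
    level 1, and so on. Regions within a wave have no mutual
    dependency and can be encoded in parallel; each wave may use the
    encoded information of all earlier (higher-priority) waves."""
    waves = {}
    for region_id, level in regions.items():
        waves.setdefault(level, []).append(region_id)
    return [sorted(waves[lv]) for lv in sorted(waves)]
```

A wave N may start only once every wave up to N−1 has finished, mirroring the rule that a level N region is encoded only after all regions of level N−1 or lower.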
  • Here, "having a dependency" means that when one region is encoded, the other region is referenced and its information (encoded information) is used; "having no dependency" means that when one region is encoded, the other region is not referenced and its information (encoded information) is not used. The same applies at the MB (macroblock) level: when one MB is encoded, another MB may be referenced and its encoding information used. When the spatial prediction dependency is low, spatial prediction efficiency is poor and spatial prediction accuracy is low; when the spatial prediction dependency is high, spatial prediction efficiency and spatial prediction accuracy are high.
  • The division level assignment unit 13 can be provided with a region shape setting function 13a that divides the image into a first region of block lines including at least the parallel-processing image division boundary (for example, AR1 in FIG. 3) and other second regions (for example, AR2 (AR2-1 to AR2-4) in FIG. 3) and sets the shape of each region, and a region processing order setting function 13b that sets a level value as the region processing order for each region so that at least the first region is encoded before the second regions.
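One possible realization of the region shape setting function 13a and the region processing order setting function 13b is sketched below, assuming the first region consists of whole MB rows containing the division boundaries (an assumption for illustration; the patent allows other region shapes, and the AR1/AR2-n names follow FIG. 3):

```python
def assign_levels(n_mb_rows, boundary_rows):
    """Level 0: MB rows containing a parallel-processing division
    boundary, plus the top row where the encoding scan starts (region
    AR1). Level 1: the independent row bands in between (sub-regions
    AR2-1, AR2-2, ...). Returns (region shapes, region levels)."""
    level0 = set(boundary_rows) | {0}
    regions = {"AR1": sorted(level0)}
    levels = {"AR1": 0}
    sub, current = 1, []
    for row in range(n_mb_rows):
        if row in level0:
            if current:  # close the band below the previous boundary
                regions[f"AR2-{sub}"] = current
                levels[f"AR2-{sub}"] = 1
                sub, current = sub + 1, []
        else:
            current.append(row)
    if current:  # band below the last boundary
        regions[f"AR2-{sub}"] = current
        levels[f"AR2-{sub}"] = 1
    return regions, levels
```

Because every AR2-n band is bounded above by already-encoded AR1 rows, each band can spatially predict from AR1 while remaining independent of its siblings.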
  • The encoding order control unit 14 controls the order in which the regions are encoded based on the per-region encoding processing order assigned by the division level assignment unit 13. That is, the encoding order control unit 14 controls encoding so that each region is encoded according to its region processing order and the second regions are encoded using the encoding information of the first region.
  • The encoding unit 15 encodes one MB (macroblock) using the input image supplied from the division level assignment unit 13 and the previously encoded image supplied from the encoded image memory 17, supplies the generated bitstream to the output unit 18, and supplies the encoding-result image to the encoded image memory 17.
  • For each MB, the encoding unit 15 executes a series of encoding processes: motion vector search, intra prediction mode determination, MB encoding mode determination, motion compensation or intra prediction, integer transform of the prediction residual signal, quantization, inverse quantization, inverse integer transform, entropy encoding, generation of a local decoded image, and deblocking filter processing. That is, the encoding unit 15 performs spatially predictive encoding of information, such as vector prediction from nearby motion vector information, spatial prediction (AC/DC prediction) of the DCT coefficients of motion-compensated residual images, and spatial prediction at the pixel value level (intra prediction).
  • In the present embodiment, the division level assignment unit 13, the encoding order control unit 14, and the scanning unit 16 constitute the encoding control module 10a, and one or more encoding units 15 constitute the encoding module 10b. The encoding control module 10a may be regarded as an example or a part of the "encoding control means", and the encoding module 10b as an example or a part of the "encoding means".
  • the encoding module 10b performs a process of encoding the input image and generating a bit stream.
  • the encoding control module 10a controls the encoding means in order to control the encoding order in the input image.
  • The encoding control module 10a divides the image into regions composed of blocks, sets the shape of each region, sets a region processing order for each region, and has the function of controlling encoding of each region according to its region processing order so as to satisfy, as control conditions for the regions, a first condition under which encoded information cannot be mutually used between regions having the same region processing order, and a second condition under which at least one region may use the encoded information of regions whose region processing order has higher priority than its own.
  • the encoding control module 10a has a function of controlling the shape information of the region, the region processing order, and the information regarding the block encoding order to be encoded and output in the bitstream.
  • the encoding control module 10a has a function of performing control so that the areas to which the same area processing order is assigned are encoded in parallel.
  • When applied to parallel processing, the encoding control module 10a may be provided with a function of dividing the image into a first region including at least a parallel-processing image division boundary and another, second region, controlling encoding so that at least the first region is encoded before the second region according to the region processing order assigned to each region, and using the encoding information of the first region when encoding the second region.
  • Further, the encoding control module 10a assigns the same region processing order to the two or more sub-regions (for example, AR2-1, AR2-2, AR2-3, and AR2-4 in FIG. 3) into which the second region (for example, AR2 in FIG. 3) is divided by the first region, and controls the sub-regions to be encoded in parallel.
  • Furthermore, when encoding the first region, the encoding control module 10a performs control so as to encode, among the peripheral portions of the image, the peripheral portions of at least the two sides related to the start of the encoding scan.
  • When the inside of the second region is encoded in units of blocks, a function of controlling the sub-regions to be encoded in the same block encoding order is provided.
  • The encoding control module 10a divides the image into two or more regions (different from parallel-processing slice division) according to the spatial prediction dependency between blocks, sets the shape of each region, sets the region processing order for each region according to the spatial prediction dependency in each region, and encodes each region based on the region processing order and the shape information of the regions.
  • It is also provided with a function of controlling so that a region of one region processing order is encoded before another region whose region processing order has a lower priority, without referring to that other region in spatial prediction.
  • the encoding control module 10a has a function of performing control so that the encoding processes of different areas to which the same area processing order is assigned are executed in parallel. Furthermore, the encoding control module 10a has a function of controlling the shape information of the region, the region processing order, and the information regarding the block encoding order to be encoded and output in the bitstream.
  • FIG. 2 is a flowchart showing an example of a processing procedure in the image encoding device of FIG.
  • The overall operation processing procedure in the image encoding according to the present embodiment is, as a basic procedure, as follows. When an image is input from the input image memory 12 to the encoding control module 10a, the encoding control module 10a divides the input image to be processed into regions composed of blocks and sets the shape of each region (step S102 shown in FIG. 2), sets the region processing order for each region (step S102 shown in FIG. 2), and performs encoding control for encoding each region to generate a bitstream (steps S103 to S109 shown in FIG. 2).
  • In this encoding control, as control conditions for the regions, the encoding control module 10a controls the encoding so as to satisfy at least a first condition that prohibits the use of encoding information between regions of the same region processing order, and a second condition that allows a region of one region processing order to use the encoding information of another region whose region processing order has a higher priority than the one region processing order.
  • When applied to parallel processing, in the encoding control, the encoding control module 10a may divide and set the shape of the regions into a first region including at least a parallel-processing image division boundary and another, second region.
  • the encoding control module 10a may set the area processing order for each of the areas so that at least the first area is encoded before the second area.
  • The encoding of the second region can be controlled to use the encoding information of the first region.
  • The encoding processes of the sub-regions at different positions to which the same region processing order is assigned can be controlled to be executed in parallel.
  • The encoding control module 10a performs control so that a region of one region processing order is encoded before another region whose region processing order has a lower priority, without referring to that other region in spatial prediction.
  • the encoding control module 10a performs control so as to refer to and use the encoding information of one region that has been encoded previously when performing spatial prediction in another region.
  • the encoding control module 10a performs control so as to execute in parallel the encoding processes of the different areas to which the same area processing order is assigned.
  • the input unit 11 inputs an image of one frame and stores it in the input image memory 12 (step S101).
  • the division level assignment unit 13 of the encoding control module 10a performs a process of dividing the input image for one screen supplied from the input image memory 12 into two or more areas (area shape setting process). Further, the division level assignment unit 13 performs processing (region processing order setting processing) for assigning a level value to each of the divided regions based on the above rules (step S102: setting control step or setting control function).
  • the division level assignment unit 13 divides an image into a level 0 area AR1 and a level 1 area AR2 (AR2-1 to AR2-4).
  • The level 0 area AR1 is an area including, in the peripheral portion of the image, the two sides on the side where the block scan is started, and the parallel-processing image division boundaries.
  • the level 1 area AR2 is another area of the level 0 area AR1, and includes a plurality of sub areas AR2-1, AR2-2, AR2-3, and AR2-4 divided by the level 0 area AR1.
  • the division level assignment unit 13 assigns “0” as the level value to the level 0 area AR1, and assigns “1” as the level value to the level 1 area AR2 (AR2-1 to AR2-4).
  • In the example of FIG. 3, the division level assignment unit 13 divides the screen into a level 0 area AR1, composed of the MB column at the left end of the screen and the MB (macroblock) rows that divide the screen into four, and four level 1 areas AR2 (AR2-1, AR2-2, AR2-3, AR2-4) separated by the level 0 area AR1, and assigns the level values to the divided areas.
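The FIG. 3 style division described above can be sketched as a per-MB level map. This is a minimal illustration under the assumption that level 0 consists of the left-edge MB column plus the MB rows that split the picture into equal bands (the function name and parameters are hypothetical):

```python
def build_level_map(mb_cols, mb_rows, num_parts=4):
    """Assign level 0 to the left-edge MB column and to the MB rows that
    split the picture into num_parts horizontal bands; everything else,
    i.e. the sub-regions AR2-1 .. AR2-n, gets level 1."""
    level = [[1] * mb_cols for _ in range(mb_rows)]
    for y in range(mb_rows):
        level[y][0] = 0              # left-edge MB column -> level 0
    step = mb_rows // num_parts
    for k in range(1, num_parts):
        for x in range(mb_cols):
            level[k * step][x] = 0   # band-boundary MB rows -> level 0
    return level
```

On an 8×8 MB picture this marks column 0 and rows 2, 4, 6 as level 0, leaving four level 1 bands.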
  • The coding order control unit 14 of the coding control module 10a controls the coding order for each area based on the level value of each divided area. More specifically, the encoding order control unit 14 first sets the processing target level value L to 0 (step S103 in FIG. 2). That is, as the order of encoding the divided areas, the encoding order control unit 14 performs control so that an area whose level value is 0 is encoded first.
  • If an unprocessed area whose level value is L exists (step S104), the scanning unit 16 and the encoding unit 15 execute the processing of steps S105 to S107 shown in FIG. 2 for that area based on the determination result by the encoding order control unit 14.
  • First, the encoding order control unit 14 determines that the area AR1 (an unprocessed area) whose level value L is 0 exists, and outputs the determination result to the scanning unit 16 and the encoding unit 15.
  • Upon receiving the determination result, the scanning unit 16 and the encoding unit 15 execute the processing of steps S105 to S107 in FIG. 2.
  • Thereafter, the encoding order control unit 14 determines that the area AR2 (an unprocessed area) whose level value L is 1 exists, and outputs the determination result to the scanning unit 16 and the encoding unit 15.
  • Upon receiving the determination result from the encoding order control unit 14, the scanning unit 16 and the encoding unit 15 execute the processing of steps S105 to S107 in FIG. 2.
  • In steps S105 to S107, the scanning unit 16 and the encoding unit 15 sequentially perform the encoding process of the MBs in each region. Specifically, the scanning unit 16 determines whether or not an unprocessed MB exists in the processing target area (step S105). If the scanning unit 16 determines that there is no unprocessed MB in the area to be processed, the process for that area is completed and the process returns to step S104. On the other hand, when the scanning unit 16 determines in step S105 that there is an unprocessed MB in the processing target area, the scanning unit 16 determines the processing target MB according to the raster scan order and notifies the encoding unit 15 of it (step S106).
  • The encoding unit 15 then encodes the notified MB (step S107: encoding step or encoding function).
  • the MB encoding processing in AR2-1, AR2-2, AR2-3, AR2-4 can be performed in parallel.
  • When the encoding order control unit 14 determines in step S104 described above that there is no unprocessed area having the level value L, the process proceeds to step S108.
  • In step S108, the encoding order control unit 14 determines whether any unprocessed area remains; if one exists, the value of L is incremented by 1 in step S109, and the process returns to step S104.
  • When the encoding order control unit 14 determines in step S108 that there is no unprocessed area, the encoding process for one frame is completed.
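The level-ordered loop of steps S103 to S109 can be sketched as follows. This is a minimal illustration (the function names and the thread-pool realization are assumptions, not the patent's implementation): regions sharing a level value are independent under the first condition, so they may be encoded in parallel, while levels are processed strictly in ascending order.

```python
from concurrent.futures import ThreadPoolExecutor

def encode_frame(regions_by_level, encode_region):
    """regions_by_level: dict mapping level value -> list of region ids.
    encode_region: callable that encodes one region (steps S105-S107)."""
    for level in sorted(regions_by_level):            # S103 / S109: L = 0, 1, ...
        regions = regions_by_level[level]             # S104: unprocessed areas at L
        # Same-level regions share no encoding information (first condition),
        # so they can be encoded concurrently.
        with ThreadPoolExecutor(max_workers=len(regions)) as pool:
            list(pool.map(encode_region, regions))    # S105-S107 per region
```

For the FIG. 3 example, level 0 (AR1) finishes before AR2-1 through AR2-4 start, and the four sub-regions run with up to four-way parallelism.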
  • The division level assignment unit 13 supplies the encoding unit 15 with information on the shapes of the divided areas and the level value of each area, and performs control to encode and output this information in the bitstream.
  • The configuration of FIG. 1 includes one or a plurality of encoding units 15.
  • the processing in steps S105 to S107 in the flowchart of FIG. 2 for different regions can be processed in parallel using different encoding units 15.
  • The steps consisting of the above steps S105 and S106 can be said to be an example of a second encoding order control processing step.
  • The steps consisting of the above steps S103, S104, S108, and S109 are an example of a first encoding order control processing step.
  • the encoding process of the level 0 area is first executed, and when this is completed, the encoding process of the four level 1 areas can be processed in a maximum of 4 in parallel.
  • Since the MB column at the left end of the screen belongs to the level 0 area and is not divided, vertical spatial prediction can be used there, and a decrease in the encoding efficiency can be suppressed.
  • Each block in the block diagram shown in FIG. 1 may be realized in software by causing a computer to execute a program recorded on a recording medium.
  • The physical configuration is, for example, one or a plurality of CPUs (or one or a plurality of CPUs and one or a plurality of memories); the software configuration of each unit (circuit/means) is a plurality of functions exhibited by the CPU under control of the program and expressed as components by a plurality of units (means).
  • While the program is executed, each unit (means) is configured in the CPU; in the static state in which the program is not executed, the entire program (or each program part included in the configuration of each unit) that realizes the configuration of each unit is stored in a storage area such as a memory.
  • Each of the units and modules (means) described above may be realized as a computer functionalized by a program together with the functions of the program, or may be configured as an apparatus consisting of a plurality of electronic circuit blocks permanently functionalized by specific hardware. Therefore, these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and are not limited to any one of them.
  • FIG. 5 is a block diagram showing an example of the overall configuration of an image encoding device according to the second embodiment of the present invention.
  • the second embodiment is different from the first embodiment in that a scanning unit corresponding to each encoding unit is provided, and a different scan can be performed for each area to be encoded.
  • As shown in FIG. 5, the image encoding device includes the input unit 11, the input image memory 12, the division level assignment unit 113, the encoding order control unit 114, the encoding units 15, the encoded image memory 17, and the output unit 18, which have the same configurations as those of the first embodiment shown in FIG. 1.
  • the image encoding device 100 according to the second embodiment includes a scanning unit 116 corresponding to each encoding unit 15.
  • the scanning unit 116 performs scanning in the MB scanning order suitable for each region to be encoded based on the image supplied from the division level assignment unit 13 and the information on the division state and the level assignment state.
  • The scanning order used by the scanning unit 116 is set so that the encoding process starts from the MB adjacent to an MB belonging to a region with a smaller level value than that of the encoding target region, so that spatial prediction using the encoding information of MBs belonging to such regions can be utilized.
  • For example, with respect to the left-right direction, if the encoding target region is not adjacent to such a region on its left side but is adjacent on its right side, it is scanned from right to left; otherwise it is scanned from left to right. Similarly, if the encoding target region is not adjacent on its upper side but is adjacent on its lower side, it is scanned from bottom to top; in other cases, it is scanned from top to bottom.
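The adjacency rule above can be sketched directly. This is a minimal illustration (hypothetical helper; `adjacent_sides` names the sides on which the target region touches an already-encoded, smaller-level region):

```python
def mb_scan(width, height, adjacent_sides):
    """Return the MB visiting order (region-local coordinates) for a region.
    Scan right-to-left only if the region is not adjacent on the left but
    is adjacent on the right; scan bottom-to-top only if not adjacent
    above but adjacent below. Otherwise, normal raster order."""
    xs = list(range(width))
    ys = list(range(height))
    if 'left' not in adjacent_sides and 'right' in adjacent_sides:
        xs.reverse()
    if 'top' not in adjacent_sides and 'bottom' in adjacent_sides:
        ys.reverse()
    return [(x, y) for y in ys for x in xs]
```

A region touching a level 0 area only on its lower side is thus scanned from its bottom row upward, starting adjacent to the already-encoded MBs.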
  • the division level assignment unit 113, the encoding order control unit 114, and the plurality of scanning units 116 of the present embodiment constitute an encoding control module 110a.
  • the encoding control module 110a can also be referred to as an example of “encoding control means”.
  • The encoding control module 110a has a function of controlling so as to change the pixel block encoding order (block encoding order) according to each region when encoding the regions in units of blocks. Further, the encoding control module 110a has a function of controlling the pixel block encoding order (block encoding order) so that it is determined according to the shape information of the regions and the region processing order. Still further, the encoding control module 110a has a function of controlling so that encoding in the block encoding order of one region starts from a block portion adjacent to another region, and so that the encoding information of the portion of that other region adjacent to the one region is used. That is, the encoding control module 110a is provided with a function of deciding to perform spatial prediction processing using the encoding information of another region encoded adjacent to one region, together with the block encoding order in the one region.
  • the encoding control module 110a has a function of controlling to change the block encoding order according to each sub-region when encoding the second region in units of blocks. Prepare. In addition, the encoding control module 110a has a function of controlling the block encoding order of the second area so as to start encoding from a block portion adjacent to the first area.
  • The processing flow of the second embodiment is the same as the flowchart of the first embodiment in FIG. 2, but differs from the first embodiment in that, when determining the processing target MB in step S106, each scanning unit 116 corresponding to each encoding unit 15 determines it based on the scanning order suitable for each coding area.
  • FIG. 6 shows an example in which the division level assignment unit 13 assigns the MB row at the center of the screen to the level 0 area AR11 and the upper and lower areas to the level 1 areas AR21 and AR22, respectively.
  • The encoding control module 110a controls the encoding module 110b so that the encoding process of the level 0 area AR11 is executed first; when this is completed, the encoding processes of the two level 1 areas AR21 and AR22 can be processed in parallel.
  • For the level 1 area AR22 below the level 0 area AR11, encoding using spatial prediction from the level 0 area AR11 can be processed by scanning in the normal raster scan order.
  • For the level 1 area AR21 above it, however, spatial prediction from the level 0 area AR11 cannot be used in normal forward raster scanning.
  • Therefore, the corresponding scanning unit 116 of the encoding control module 110a performs a scan (reverse raster scan order) in which the encoding process starts from the lower left of the level 1 area AR21, proceeds in the right direction, and, when the right end is reached, moves up one MB (macroblock) row and proceeds rightward again.
  • Since the encoding process proceeds from the lower MB (macroblock) rows, spatial prediction of the encoding information is performed using information on the MBs adjacent to the lower left, below, lower right, and left.
  • FIG. 8 shows an example in the case of motion vector prediction.
  • In the normal raster scan order, the median values of the motion vectors 402, 403, and 404 of the MBs adjacent above, above right, and to the left are used.
  • In the reverse raster scan order, the median values of the motion vectors 302, 303, and 304 of the MBs adjacent below, below right, and to the left are used, as shown in the figure.
  • Other spatial predictions such as intra prediction and AC / DC prediction can be similarly predicted from the lower MB.
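The median motion vector prediction described above, mirrored for reverse raster order, can be sketched as follows (a minimal illustration; the accessor `get_mv` and function names are hypothetical, and MB positions are given in MB units):

```python
def median_mv(mvs):
    """Component-wise median of three candidate motion vectors."""
    xs = sorted(v[0] for v in mvs)
    ys = sorted(v[1] for v in mvs)
    return (xs[1], ys[1])

def predict_mv(get_mv, x, y, reverse_raster=False):
    """Predict the MV of MB (x, y) from three already-encoded neighbors."""
    if reverse_raster:
        # Reverse raster order: below, below-right, and left neighbors.
        cands = [get_mv(x, y + 1), get_mv(x + 1, y + 1), get_mv(x - 1, y)]
    else:
        # Normal raster order: above, above-right, and left neighbors.
        cands = [get_mv(x, y - 1), get_mv(x + 1, y - 1), get_mv(x - 1, y)]
    return median_mv(cands)
```

The median keeps the predictor robust against one outlier among the three neighbors, in both scan directions.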
  • In this case, the number of MBs (macroblocks) with poor spatial prediction efficiency is the number M of MBs at the left end of the screen plus the number (N-1) of MBs belonging to the level 0 area, for a total of M + (N-1).
  • efficient parallelization can be realized by parallel execution of the encoding processing of the two level 1 regions while maintaining the same encoding efficiency as when the screen is not divided.
  • Since the two level 1 regions are encoded using spatial prediction from the level 0 region, the intra prediction and motion vector properties retain spatial continuity. Thereby, the coding noise resulting from discontinuity at slice boundaries, which is a problem of the slice division of the related art, does not occur.
  • In the example of FIG. 9, the division level assignment unit 13 sets a substantially cross-shaped area at the center of the screen as a level 0 area AR31, and the four areas divided by the level 0 area AR31 as level 1 areas AR41, AR42, AR43, and AR44, respectively.
  • the four scanning units 116, 116, 116, 116 of the encoding control module 110 a scan MBs (macroblocks) in different order in the four areas AR 41, AR 42, AR 43, AR 44.
  • The level 0 area AR31 is adjacent to the upper side and the right side of the level 1 area AR43.
  • Therefore, in order to utilize the encoding information of the level 0 area AR31 in spatial prediction, one scanning unit 116 starts the encoding process from the upper right of the level 1 area AR43 and advances it in the left direction.
  • When the left end of the screen is reached, the encoding process is performed in the first scanning order, which moves down by 1 MB (macroblock) row and proceeds to the left again.
  • For the other level 1 areas as well, the corresponding scanning units 116, 116, 116 start the encoding process from an MB (macroblock) adjacent to the level 0 area AR31 so that spatial prediction from the level 0 area AR31 can be used. That is, the second scanning unit 116 starts the encoding process from the lower right of the level 1 area AR41, as shown in FIG. 10, in order to utilize the encoding information of the level 0 area AR31 in spatial prediction, and advances the encoding process in the left direction. When the left end of the screen is reached, the encoding process is performed in the second scanning order, which moves up by 1 MB (macroblock) row and proceeds in the left direction again.
  • Similarly, the third scanning unit 116 starts the encoding process from the upper left of the level 1 area AR44 in order to utilize the encoding information of the level 0 area AR31 in spatial prediction, and advances it in the right direction.
  • When the right end of the screen is reached, the encoding process is performed in the third scanning order, which moves down by 1 MB (macroblock) row and proceeds to the right again.
  • Likewise, the fourth scanning unit 116 starts the encoding process from the lower left of the level 1 area AR42 in order to utilize the encoding information of the level 0 area AR31 in spatial prediction, and advances it in the right direction.
  • When the right end of the screen is reached, the encoding process is performed in the fourth scanning order, which moves up by 1 MB (macroblock) row and proceeds to the right again.
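The four scanning orders can be summarized as one starting corner and direction per quadrant. The sketch below is a hypothetical condensation of the FIG. 9/FIG. 10 description (the table and function names are assumptions; coordinates are region-local, with (0, 0) at each region's top left):

```python
# Quadrant -> (x step, y step), chosen so each scan starts at the corner
# touching the cross-shaped level 0 area AR31, as described in the text.
SCAN_DIRECTIONS = {
    'AR41': (-1, -1),  # upper-left quadrant: start lower right, go left, then up
    'AR42': (+1, -1),  # upper-right quadrant: start lower left, go right, then up
    'AR43': (-1, +1),  # lower-left quadrant: start upper right, go left, then down
    'AR44': (+1, +1),  # lower-right quadrant: start upper left, go right, then down
}

def quadrant_scan(width, height, x_step, y_step):
    """MB visiting order for one quadrant given its step directions."""
    xs = range(width) if x_step > 0 else range(width - 1, -1, -1)
    ys = range(height) if y_step > 0 else range(height - 1, -1, -1)
    return [(x, y) for y in ys for x in xs]
```

Each quadrant therefore begins encoding at an MB adjacent to AR31, so the first MB of every sub-region can already use spatial prediction.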
  • In this example, the number of MBs (macroblocks) with poor spatial prediction efficiency is the number M + (N-1) of MBs belonging to the level 0 area. This is the same number as in the example in which screen division is not performed according to the related technology.
  • the four level 1 areas AR41, AR42, AR43, and AR44 can be processed in parallel at a maximum of four.
  • Since all the level 1 areas AR41, AR42, AR43, and AR44 are encoded using the encoding information of the level 0 area AR31 in spatial prediction, the intra prediction and motion vector properties have spatial continuity. For this reason, the unnatural encoding noise at slice boundaries, which was a problem of the related art, does not occur.
  • In the operation processing procedure of the image encoding according to the present embodiment, the encoding control can control the block encoding order, in which the inside of each region is encoded in units of blocks, to be changed for each region.
  • the block encoding order can be controlled to be determined according to the shape information of the region and the region processing order.
  • Encoding in the block encoding order of one region can be controlled to start from a block portion adjacent to another region, and to use the encoding information of the portion of that other region adjacent to the one region.
  • the second region has a block encoding order for encoding the second region (level 1 region) in units of blocks. It can be controlled to change (or change for each sub-region) according to each sub-region divided by the first region (AR41, AR42, AR43, AR44 in the example of FIG. 9). Further, when performing the control in the encoding, the block encoding order can be controlled to be changed based on the shape information of the region and the region processing order. Furthermore, when performing the control in the encoding, the block encoding order in the second region can be controlled to start from a block portion adjacent to the first region.
  • In the example of FIG. 9, the sub-regions AR41, AR42, AR43, and AR44 constituting the second region are all encoded in different block encoding orders; however, a case where, for example, AR43 and AR44 are scanned in the same order and AR41 and AR42 are scanned in the same order may also be used.
  • the block coding order can be changed according to each sub-region.
  • The block coding scan order in each sub-region is not limited to the example shown in FIG. 9. Any order may be used as long as it starts from a block portion adjacent to the first region; if necessary, AR41 may first scan upward from the lower right toward the upper right, and various other scanning methods may be employed.
  • That is, it is possible to decide to perform spatial prediction processing using the encoding information of another region encoded adjacent to one region, together with the block encoding order in the one region.
  • In this way, the encoding process can be executed efficiently in parallel while eliminating the decrease in encoding efficiency and the generation of unnatural encoding noise that were problems in the parallel processing of the related technology.
  • FIG. 11 is a block diagram showing an example of the overall configuration of an image decoding apparatus according to the third embodiment of the present invention.
  • The image decoding apparatus 200 includes an input unit 211, a bit stream memory 212, a division level information analysis unit 213, a decoding order control unit 214, a plurality of decoding units 215, scanning units 216 provided corresponding to the respective decoding units 215, a decoded image memory 217, and an output unit 218.
  • the input unit 211 can input a bit stream including the shape information of each area and the area processing order according to the above-described embodiment.
  • the bit stream memory 212 stores the input bit stream input from the input unit 211.
  • the division level information analyzing unit 213 (analyzing means) acquires the shape information and the area processing order of each area in the bit stream stored in the bit stream memory 212 and analyzes it.
  • the decoding order control unit 214 (first decoding order control means) decodes the bitstream corresponding to each region based on the shape information and region processing order of each region. 215 is controlled.
  • One scanning unit 216 among the scanning units 216 controls the corresponding decoding unit 215 so as to sequentially decode the bitstream corresponding to the pixel blocks according to the pixel block decoding order for one region.
  • the other scanning unit 216 controls the corresponding other decoding unit 215 to sequentially decode the bitstream corresponding to the pixel block according to the pixel block decoding order for the other region.
  • the pixel block decoding order for one region and the pixel block decoding order for another region can be appropriately changed.
  • One decoding unit 215 performs a process of decoding a bit stream in units of pixel blocks and generating a decoded image.
  • Each decoding unit 215 is provided corresponding to each area that can be processed in parallel, and each decoding process of each bitstream corresponding to each area can be processed in parallel.
  • the decoded image memory 217 stores the decoded image decoded by the decoding unit 215.
  • the output unit 218 outputs the decoded image of the decoded image memory 217.
  • the decoding control module 210a can be configured by the division level information analysis unit 213, the decoding order control unit 214, and the plurality of scanning units 216 of the present embodiment.
  • the decryption module 210b may be configured by one or a plurality of decryption units 215.
  • the decryption control module 210a may be an example or a part of “decryption control means”.
  • the decryption module 210b can also be referred to as an example or a part of “decryption means”.
  • the decoding module 210b performs a process of decoding the input bit stream and generating a decoded image.
  • the decoding control module 210a has a function of controlling the operation of the decoding means in order to control the decoding order of the input bitstream.
  • the decoding control module 210a analyzes the bitstream including the shape information of each area set by dividing the image into areas including blocks and the area processing order set for each area. And a function of controlling to decode the bitstream corresponding to each region based on the region processing order and the shape information of the region.
  • When decoding each region according to its region processing order, the decoding control module 210a has a function of controlling so that a region of one region processing order is decoded after another region whose region processing order has a higher priority than the one region processing order.
  • Here, the bitstream is premised on having been encoded under control satisfying a first condition that encoding information cannot be used between regions having the same region processing order, and a second condition that allows a region of one region processing order to use the encoding information of another region whose region processing order has a higher priority than the one region processing order.
  • When applied to parallel processing, the decoding control module 210a analyzes the bitstream including the shape information of each region, set by dividing the image into a first region including at least a parallel-processing image division boundary and another, second region, and at least the region processing order set for each region so that the first region is encoded before the second region; it then has a function of controlling so as to decode the bitstream corresponding to each region based on the region processing order and the shape information of the regions.
  • When decoding the bitstream corresponding to a region in units of blocks, the decoding control module 210a has a function of controlling so as to change the block decoding order according to each region based on the region processing order and the shape information of the regions.
  • the decoding control module 210a divides the image into two or more regions (different from the parallel processing slice partitioning) that are set according to the spatial prediction dependency between the blocks, and sets the regions. Analyzing the bitstream including shape information and region processing order set for each region according to spatial prediction dependence in each region, and based on the region processing order and shape information of the region A function of controlling to decode the bitstream corresponding to each region is provided.
  • the decoding control module 210a performs block decoding according to each area based on the area processing order and the area shape information when decoding the bitstream corresponding to the area in units of pixel blocks. A function for controlling the order to be changed is provided. Further, the decoding control module 210a has a function of performing control so that decoding processes of different areas assigned with the same area processing order are executed in parallel.
  • FIG. 12 is a flowchart illustrating an example of a processing procedure in the image decoding apparatus in FIG. 11.
  • The overall operation processing procedure in the image decoding according to the present embodiment is as follows. The image decoding device inputs a bitstream including the shape information of each region, set by dividing the image into regions composed of blocks, and the region processing order set for each region (step S201 shown in FIG. 12); obtains and analyzes the region processing order and the region shape information from the input bitstream (step S202 shown in FIG. 12); and performs decoding control so as to generate a decoded image by decoding each bitstream portion corresponding to each region based on the region processing order and the shape information of the regions (steps S203 to S209 shown in FIG. 12).
  • In the decoding control, when decoding each region according to its region processing order, control can be performed so that a region of one region processing order is decoded after another region whose region processing order has a higher priority than the one region processing order.
  • As noted above, the bitstream is premised on having been encoded under control satisfying a first condition that encoding information cannot be used between regions having the same region processing order, and a second condition that allows a region of one region processing order to use the encoding information of another region whose region processing order has a higher priority.
  • When applied to parallel processing, the image decoding device can input a bitstream including the shape information of each region, set by dividing the image into a first region including at least a parallel-processing image division boundary and another, second region, and the region processing order set for each region so that at least the first region is encoded before the second region; it obtains the region processing order and the region shape information from the input bitstream, analyzes the contents, and performs decoding control so as to generate a decoded image by decoding each bitstream portion corresponding to each region based on the region processing order and the shape information of the regions.
  • Control is performed so that the bitstream portion corresponding to each region is decoded in units of blocks, and the pixel-block decoding order can be changed per region on the basis of the region processing orders and the region shape information.
  • The block decoding order within a region can also be obtained from the input bitstream and analyzed.
  • In step S201, the bitstream is input (bitstream input processing step or bitstream input processing function).
  • Next, the division level information analysis unit 213 analyzes the screen division shape and the level value of each divided area recorded in the bitstream (step S202: analysis step or analysis function).
  • The decoding order control unit 214 then sets 0 as the processing target level value L (step S203) and determines whether there is an unprocessed area whose level value is L (step S204).
  • If there is, the process proceeds to the per-area MB decoding of steps S205 to S207 for that unprocessed area.
  • In step S205, the scanning unit 216 determines whether there is an unprocessed MB in the processing target area. If so, the process proceeds to step S206.
  • In step S206, the scanning unit 216 determines the MB scanning order in the processing target area based on the screen division shape and the level value of each divided area supplied from the division level information analysis unit 213, and selects the processing target MB.
  • the MB scanning order determination method is the same as in the second embodiment.
  • The decoding unit 215 uses the bitstream supplied from the bitstream memory 212 and the previously decoded image supplied from the decoded image memory 217 to execute, for one MB, a series of decoding processes such as entropy decoding, inverse quantization, inverse integer transform, motion compensation or intra prediction, deblocking filtering, and local decoded image generation, and stores the decoding result in the decoded image memory 217 (step S207: decoding processing step or decoding processing function). The decoded image stored in the decoded image memory 217 is output from the output unit 218.
  • If the decoding order control unit 214 determines in step S204 described above that there is no unprocessed area whose level value is L, the process proceeds to step S208.
  • In step S208, the decoding order control unit 214 determines whether any unprocessed area remains. If so, the value of L is incremented by 1 in step S209 and the process returns to step S204. If, on the other hand, the decoding order control unit 214 determines in step S208 that no unprocessed area remains, the decoding of one frame is complete.
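The loop of steps S203, S204, S208, and S209 around the per-area decoding of steps S205 to S207 can be sketched as follows; this is an illustrative Python rendering only, with `decode_region` standing in for the MB-level decoding:

```python
def decode_frame(region_levels):
    """Decode all regions of one frame in level order.

    region_levels maps region id -> level value recorded in the bitstream.
    Returns the region ids in the order they were decoded.
    """
    decoded = []

    def decode_region(rid):            # stand-in for steps S205 to S207
        decoded.append(rid)

    pending = dict(region_levels)
    level = 0                          # step S203: L = 0
    while pending:                     # step S208: unprocessed area left?
        # step S204: every unprocessed area whose level value is L
        for rid in [r for r, lv in pending.items() if lv == level]:
            decode_region(rid)
            del pending[rid]
        level += 1                     # step S209: L = L + 1
    return decoded
```

For `region_levels = {"AR1": 0, "AR2": 1, "AR3": 1, "AR4": 2}`, the regions come out as AR1 first, then AR2 and AR3, then AR4.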
  • The configuration of FIG. 11 includes one or a plurality of decoding units 215.
  • For different regions, the processing of steps S205 to S207 in the flowchart of FIG. 9 can be executed in parallel using different decoding units 215.
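Because regions that share a level value carry no mutual dependencies, the per-region work of steps S205 to S207 can be fanned out across decoding units. A minimal sketch, with a thread pool standing in for the multiple decoding units 215 and `decode_region` a hypothetical per-region decoder:

```python
from concurrent.futures import ThreadPoolExecutor

def decode_level_in_parallel(region_ids, decode_region, workers=4):
    """Run the per-region decoder on every region of one level in
    parallel; this is safe because regions of the same level never
    reference each other's encoded information."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() returns the results in the input order of region_ids
        return list(pool.map(decode_region, region_ids))
```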
  • The steps consisting of steps S205 and S206 above can be said to be an example of a second decoding order control processing step.
  • The steps consisting of steps S203, S204, S208, and S209 above are an example of a first decoding order control processing step.
  • In this way, decoding can be executed efficiently in parallel while eliminating both the decrease in encoding efficiency and the unnatural encoding noise that were problems in the parallel processing of the related technology.
  • Examples of the image division pattern include those shown in FIGS. 13, 14, and 15.
  • FIG. 13 shows a case where the number of levels is 2 or more. That is, the image in the example of FIG. 13 includes one level 0 area AR51; two level 1 areas AR52-1 and AR52-2; four level 2 areas AR53-1, AR53-2, AR53-3, and AR53-4; and eight level 3 areas AR54-1, AR54-2, AR54-3, AR54-4, AR54-5, AR54-6, AR54-7, and AR54-8.
  • With this division, the number of areas that can be processed in parallel increases as 1 → 2 → 4 → 8.
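For this particular division, where each level contains twice as many areas as the previous one, the available parallelism can be read off directly; the helper below is an observation about the FIG. 13 example only, not a general rule of the scheme:

```python
def parallel_areas(level: int) -> int:
    """Areas decodable in parallel at a given level in the FIG. 13
    example: 1 at level 0, 2 at level 1, 4 at level 2, 8 at level 3."""
    return 2 ** level
```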
  • FIG. 14 shows a case where the upper-level area does not form a division boundary.
  • the image in the example of FIG. 14 includes one level 0 area AR61 and six level 1 areas AR62-1, AR62-2, AR62-3, AR62-4, AR62-5, and AR62-6.
  • The six level 1 areas AR62-1, AR62-2, AR62-3, AR62-4, AR62-5, and AR62-6 are separated by logical boundaries BL.
  • In such a case, encoding control is performed so that there is no dependency between these regions; in other words, encoded information cannot be referenced between regions of the same level.
  • FIG. 15 shows an example in which upper level MBs are distributed.
  • the image in the example of FIG. 15 includes nine level 0 areas AR71-1 and nine level 1 areas AR72-1.
  • Each level 1 area AR72-1 is formed around a level 0 area AR71-1.
  • the level 1 area AR72-1 can use the encoding information of the level 0 area AR71-1.
  • The level 1 areas AR72-1 are separated from one another by logical boundaries BL. In such a case, encoding control is performed so that there is no dependency between the level 1 regions; in other words, encoded information cannot be referenced between regions of the same level.
  • The block coding order used when coding the regions of the same level value in units of blocks is not limited to raster scanning; zigzag scanning may also be used.
  • The encoding control means may also perform raster scanning in one area and zigzag scanning in another area.
  • The raster scanning of each region is not limited to horizontal raster scanning; vertical raster scanning, or a combination of the two, may be used.
  • The scanning start position and scanning end position may then be any locations in the region, as long as the number of MBs with poor spatial prediction efficiency is reduced.
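The admissible block orders can be generated explicitly. The sketch below shows horizontal raster, vertical raster, and a serpentine reading of "zigzag" scanning; the text does not pin down the exact zigzag pattern, so that interpretation is an assumption:

```python
def raster_scan(width, height, vertical=False):
    """(x, y) block coordinates in raster order, row by row, or
    column by column when vertical=True."""
    if vertical:
        return [(x, y) for x in range(width) for y in range(height)]
    return [(x, y) for y in range(height) for x in range(width)]

def zigzag_scan(width, height):
    """Raster order whose direction alternates on every row, so the
    scan never jumps back across the region."""
    order = []
    for y in range(height):
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        order.extend((x, y) for x in xs)
    return order
```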
  • Although an application conforming to H.264 has been described, the present invention is not limited to this application. Further, although only limited examples have been described regarding the shape of image division, the present invention is not limited to the illustrated divisions and can be applied to various other image division shapes.
  • The shape of the image division need not be fixed; different divisions may be performed for each image, and the division shape may be changed dynamically.
  • Although the scanning order within an area is determined here from the shape of the area and its relationship to adjacent areas, the method of determining the scanning order is not limited to this; it is also conceivable to encode the scanning order information selected at the time of encoding into the bitstream and transmit it.
  • Although examples of two- and four-way parallel processing have been described, the present invention is not limited to these degrees of parallelism.
  • The present invention is not necessarily limited to the encoding and decoding of moving images; it can also be applied to methods that use spatial prediction in two-dimensional block-based processing, such as the encoding and decoding of still images.
  • The methods described above can also be realized by a computer that reads a program from a recording medium and executes it; that is, the above-described program may be recorded on an information recording medium.
  • The program of the present invention that realizes the functions of the above-described embodiments is a program corresponding to the processing units (processing means) and functions shown in the block diagrams and flowcharts of the above-described embodiments.
  • A program for realizing the image coding apparatus causes a computer to divide an input image into regions composed of blocks and to set the shape of each region.
  • Encoding is controlled so as to satisfy a first condition that encoded information cannot be used between regions having the same region processing order, and a second condition that a region with a given processing order is allowed to use the encoded information of the other regions whose processing orders have higher priority than that order.
  • When applied to parallel processing, the program causes the computer to execute a function of dividing an input image into a first region that includes at least the parallel-processing image division boundary and other second regions and setting the shape of each region; a function of setting a region processing order for each region so that at least the first region is encoded before the second regions; and a function of controlling encoding so that a bitstream is generated by encoding each region on the basis of the region processing orders and the region shape information. The encoding information of the first region is used when encoding the second regions.
  • A program for realizing the image decoding apparatus according to an embodiment of the present invention in software causes a computer to execute a function of acquiring, from a bitstream containing the shape information of each region set by dividing the image into regions composed of blocks and the region processing order set for each region, the region processing orders and the region shape information, and analyzing their contents; and a function of performing decoding control so that a decoded image is generated by decoding the bitstream portion corresponding to each region on the basis of the region processing orders and the region shape information.
  • At that time, a region with a given region processing order is decoded after the other regions whose processing orders have higher priority than that order.
  • The program also causes the computer to execute a function of acquiring, from a bitstream containing the shape information of each region, set by dividing the image into a first region that includes at least the parallel-processing image division boundary and other second regions, and the region processing order set for each region so that at least the first region is encoded before the second regions, the region processing orders and the region shape information, and analyzing their contents; and a function of performing decoding control so that a decoded image is generated by decoding the bitstream portion corresponding to each region on the basis of the region processing orders and the region shape information.
  • The encoding control means of the encoding device divides the image into a first region that includes at least the parallel-processing image division boundary and other second regions, sets the shape of each of the first and second regions, and sets a region processing order for each region so that at least the first region is encoded before the second regions. The regions are then encoded according to their region processing orders, and at this time control is performed so that the second regions are encoded using the encoding information of the first region.
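That control flow on the encoder side can be sketched as follows, assuming a hypothetical per-region `encode` callable that returns a bitstream fragment together with the encoding information later regions may reference:

```python
def encode_frame(first_region, second_regions, encode):
    """Encode the boundary-containing first region ahead of the second
    regions, then let every second region use the first region's
    encoding information (e.g. for prediction across the
    parallel-processing boundary)."""
    first_bits, first_info = encode(first_region, None)
    second_bits = [encode(r, first_info)[0] for r in second_regions]
    return [first_bits] + second_bits
```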
  • The first region includes a first end region, in which spatial prediction accuracy is lower than usual, extending along the boundary in a second direction that intersects the first direction and formed on one end side of the image in the first direction, and a second end region, in which spatial prediction accuracy is likewise lower than usual, extending in the first direction and formed on one end side of the image in the second direction.
  • When encoding the first region, the encoding control means controls encoding so that each of the first and second end regions is encoded.
  • The encoding control unit divides the image into two or more regions composed of block groups in accordance with the spatial prediction dependency between blocks, and sets the shape of each region.
  • A region processing order is then set for each region in accordance with the spatial prediction dependency.
  • The encoding control means controls encoding so that each region is encoded based on the region processing orders and the region shape information.
  • The encoding control means performs control so that the regions of a given processing order are encoded, without referring to other regions in spatial prediction, before the other regions whose processing orders have lower priority than that order.
  • When performing spatial prediction in another region, the encoding control means refers to and uses the encoding information of a region that has already been encoded.
  • An image is divided into two or more regions composed of block groups in accordance with the spatial prediction dependency between blocks, the shape of each region is set, and a region processing order indicating the encoding order is set for each region in accordance with the spatial prediction dependency within the image.
  • Based on the region processing orders and the region shape information, control is performed so that each region is encoded. At that time, control is performed so that the regions of a given processing order are encoded, without reference to other regions in spatial prediction, before the other regions whose processing orders have lower priority than that order.
  • In addition, reference is made to the encoding information of regions encoded earlier. As a result, spatial continuity can be maintained in the properties of intra prediction and motion vectors, and regions of the same region processing order can be processed independently in parallel.
  • The encoding control module 10a of the image encoding device can set the shape of each region by dividing the input image into two or more regions composed of block groups in accordance with the spatial prediction dependency between blocks (unlike the slice processing for parallel processing in the related art).
  • In the overall processing procedure of image decoding according to the embodiment, the computer provided in the image decoding apparatus can input a bitstream containing the shape information of each region, set by dividing the image into two or more regions each composed of block groups in accordance with the spatial prediction dependency between blocks, and the region processing order set for each region in accordance with the spatial prediction dependency within the image.
  • the above-described program (including an image encoding processing program and an image decoding processing program) may be recorded on an information recording medium.
  • The present invention may be applied to a system composed of a plurality of devices, or to an apparatus consisting of a single device.
  • The image encoding devices and image decoding devices may exist independently or may be used incorporated in some device (for example, an electronic device).
  • The inventive concept is not limited to the above and includes various aspects.
  • For example, part may be realized in software and part in hardware, or part may be stored on a storage medium and read out as needed.
  • the present invention is applicable to all image encoding devices and image decoding devices.
  • FIG. 6 is an explanatory diagram schematically illustrating an example of level assignment for encoding order control in the image encoding device of FIG. 5.
  • FIG. 6 is an explanatory diagram schematically illustrating a specific example of encoding order control in the image encoding device of FIG. 5. It is an explanatory diagram for explaining a specific example of the operation.
  • FIG. 6 is an explanatory diagram schematically illustrating another example of level assignment for encoding order control in the image encoding device of FIG. 5. It is an explanatory diagram schematically showing another specific example of encoding order control in the image encoding device of FIG. 5. It is a block diagram showing an example of the overall configuration of the image decoding apparatus according to the third embodiment of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention relates to an image encoding device for preventing degradation of encoding efficiency and image quality when an image is encoded in parallel. The image encoding device comprises encoding means (10b) for encoding an input image and generating a bitstream, and encoding control means (10a) for controlling the encoding means so as to control the encoding order of the image data. The encoding control means (10a) divides the image into regions comprising blocks, determines the shape of each region, and sets a region processing priority level for each region. The control condition applied when the regions are encoded according to their region processing priority levels is that the encoding information of a region cannot be used by other regions having the same region processing priority level, and that the encoding information of a region can be used by regions whose region processing priority levels are higher than that of the region.
PCT/JP2009/064825 2008-09-09 2009-08-26 Image encoding device, image decoding device, image encoding method, image decoding method, and program WO2010029850A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010528701A JP5246264B2 (ja) 2008-09-09 2009-08-26 Image encoding device, image decoding device, image encoding method, and image decoding method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008231359 2008-09-09
JP2008-231359 2008-09-09

Publications (1)

Publication Number Publication Date
WO2010029850A1 true WO2010029850A1 (fr) 2010-03-18

Family

ID=42005109

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/064825 WO2010029850A1 (fr) Image encoding device, image decoding device, image encoding method, image decoding method, and program

Country Status (2)

Country Link
JP (1) JP5246264B2 (fr)
WO (1) WO2010029850A1 (fr)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011125810A1 (fr) * 2010-04-09 2011-10-13 Sony Corporation Image processing apparatus and method
WO2011125188A1 (fr) * 2010-04-07 2011-10-13 Mitsubishi Electric Corporation Video encoding device, video decoding device, video encoding program, and video encoding method
JP2011217082A (ja) * 2010-03-31 2011-10-27 Jvc Kenwood Corp Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
JP2012028863A (ja) * 2010-07-20 2012-02-09 Hitachi Kokusai Electric Inc Moving image encoding device
JP2013098734A (ja) * 2011-10-31 2013-05-20 Fujitsu Ltd Moving image decoding device, moving image encoding device, moving image decoding method, and moving image encoding method
KR101336577B1 (ko) 2012-06-18 2013-12-03 Korea Aerospace University Industry-Academic Cooperation Foundation Motion vector derivation apparatus and derivation method
JP2014502481A (ja) * 2010-12-13 2014-01-30 Electronics And Telecommunications Research Institute Reference unit determination method and apparatus
JP2014143708A (ja) * 2014-03-10 2014-08-07 Sony Corp Image processing apparatus and method, program, and recording medium
JP2015185979A (ja) * 2014-03-24 2015-10-22 Fujitsu Ltd Moving image encoding device and moving image encoder
JP2016106483A (ja) * 2016-02-03 2016-06-16 Sony Corp Image processing apparatus and method, program, and recording medium
CN109040759A (zh) * 2018-07-27 2018-12-18 Xi'an Institute of Space Radio Technology Image parallel compression apparatus and method
WO2019031136A1 (fr) * 2017-08-07 2019-02-14 Panasonic Intellectual Property Corporation of America Encoding device, decoding device, encoding method, and decoding method
JP2022511809A (ja) * 2018-12-06 2022-02-01 Qualcomm Incorporated Spatial-temporal motion vector prediction patterns for video coding
US11303920B2 (en) 2017-07-07 2022-04-12 Samsung Electronics Co., Ltd. Apparatus and method for encoding motion vector determined using adaptive motion vector resolution, and apparatus and method for decoding motion vector
US11432003B2 (en) 2017-09-28 2022-08-30 Samsung Electronics Co., Ltd. Encoding method and apparatus therefor, and decoding method and apparatus therefor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004140473A (ja) * 2002-10-15 2004-05-13 Sony Corp Image information encoding device, decoding device, image information encoding method, and decoding method
JP2004235683A (ja) * 2003-01-28 2004-08-19 Sony Corp Image processing apparatus, encoding apparatus, and methods thereof
JP2006129284A (ja) * 2004-10-29 2006-05-18 Sony Corp Encoding and decoding apparatus and encoding and decoding method
WO2007055158A1 (fr) * 2005-11-08 2007-05-18 Matsushita Electric Industrial Co., Ltd. Moving picture encoding method, moving picture decoding method, and corresponding device
JP2008072647A (ja) * 2006-09-15 2008-03-27 Toshiba Corp Information processing apparatus, decoder, and operation control method for playback apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4647558B2 (ja) * 2006-07-27 2011-03-09 Nippon Telegraph And Telephone Corporation Video encoding parallel processing method, video encoding device, video encoding program, and recording medium therefor


Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011217082A (ja) * 2010-03-31 2011-10-27 Jvc Kenwood Corp Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
WO2011125188A1 (fr) * 2010-04-07 2011-10-13 Mitsubishi Electric Corporation Video encoding device, video decoding device, video encoding program, and video encoding method
US8824813B2 (en) 2010-04-09 2014-09-02 Sony Corporation Image processing device and method
CN104539955A (zh) * 2010-04-09 2015-04-22 Sony Corporation Image processing apparatus and method
CN105847802A (zh) * 2010-04-09 2016-08-10 Sony Corporation Image processing apparatus and method
CN102884795A (zh) * 2010-04-09 2013-01-16 Sony Corporation Image processing apparatus and method
CN109640085B (zh) * 2010-04-09 2021-08-24 Sony Corporation Image processing apparatus and method
US10659792B2 (en) 2010-04-09 2020-05-19 Sony Corporation Image processing device and method
WO2011125810A1 (fr) * 2010-04-09 2011-10-13 Sony Corporation Image processing apparatus and method
CN105847802B (zh) * 2010-04-09 2019-05-03 Sony Corporation Image processing apparatus and method
US10187645B2 (en) 2010-04-09 2019-01-22 Sony Corporation Image processing device and method
JP2011223358A (ja) * 2010-04-09 2011-11-04 Sony Corp Image processing apparatus and method
CN104661024A (zh) * 2010-04-09 2015-05-27 Sony Corporation Image processing apparatus and method
CN109640085A (zh) * 2010-04-09 2019-04-16 Sony Corporation Image processing apparatus and method
US9179152B2 (en) 2010-04-09 2015-11-03 Sony Corporation Image processing device and method
US9204152B2 (en) 2010-04-09 2015-12-01 Sony Corporation Image processing device and method
CN105847803B (zh) * 2010-04-09 2019-03-08 Sony Corporation Image processing apparatus and method
CN109302609A (zh) * 2010-04-09 2019-02-01 Sony Corporation Image processing apparatus and method
CN105847803A (zh) * 2010-04-09 2016-08-10 Sony Corporation Image processing apparatus and method
JP2012028863A (ja) * 2010-07-20 2012-02-09 Hitachi Kokusai Electric Inc Moving image encoding device
US10425653B2 (en) 2010-12-13 2019-09-24 Electronics And Telecommunications Research Institute Method and device for determining reference unit
US11843795B2 (en) 2010-12-13 2023-12-12 Electronics And Telecommunications Research Institute Method and device for determining reference unit
US11252424B2 (en) 2010-12-13 2022-02-15 Electronics And Telecommunications Research Institute Method and device for determining reference unit
US9288491B2 (en) 2010-12-13 2016-03-15 Electronics And Telecommunications Research Institute Method and device for determining reference unit
JP2014502481A (ja) * 2010-12-13 2014-01-30 Electronics And Telecommunications Research Institute Reference unit determination method and apparatus
JP2013098734A (ja) * 2011-10-31 2013-05-20 Fujitsu Ltd Moving image decoding device, moving image encoding device, moving image decoding method, and moving image encoding method
KR101336577B1 (ko) 2012-06-18 2013-12-03 Korea Aerospace University Industry-Academic Cooperation Foundation Motion vector derivation apparatus and derivation method
JP2014143708A (ja) * 2014-03-10 2014-08-07 Sony Corp Image processing apparatus and method, program, and recording medium
JP2015185979A (ja) * 2014-03-24 2015-10-22 Fujitsu Ltd Moving image encoding device and moving image encoder
JP2016106483A (ja) * 2016-02-03 2016-06-16 Sony Corp Image processing apparatus and method, program, and recording medium
US11303920B2 (en) 2017-07-07 2022-04-12 Samsung Electronics Co., Ltd. Apparatus and method for encoding motion vector determined using adaptive motion vector resolution, and apparatus and method for decoding motion vector
US11991383B2 (en) 2017-07-07 2024-05-21 Samsung Electronics Co., Ltd. Apparatus and method for encoding motion vector determined using adaptive motion vector resolution, and apparatus and method for decoding motion vector
WO2019031136A1 (fr) * 2017-08-07 2019-02-14 Panasonic Intellectual Property Corporation Of America Encoding device, decoding device, encoding method, and decoding method
US11432003B2 (en) 2017-09-28 2022-08-30 Samsung Electronics Co., Ltd. Encoding method and apparatus therefor, and decoding method and apparatus therefor
CN109040759A (zh) * 2018-07-27 2018-12-18 Xi'an Institute of Space Radio Technology Image parallel compression apparatus and method
CN109040759B (zh) * 2018-07-27 2021-11-16 Xi'an Institute of Space Radio Technology Image parallel compression apparatus and method
JP2022511809A (ja) * 2018-12-06 2022-02-01 Qualcomm Incorporated Spatial-temporal motion vector prediction patterns for video coding

Also Published As

Publication number Publication date
JPWO2010029850A1 (ja) 2012-02-02
JP5246264B2 (ja) 2013-07-24

Similar Documents

Publication Publication Date Title
JP5246264B2 (ja) Image encoding device, image decoding device, image encoding method, and image decoding method
US20230224477A1 (en) Image encoding device, image decoding device, image encoding method, image decoding method, and image prediction device
RU2703229C1 (ru) Image decoding device, image encoding device, image decoding method, and image encoding method
Chi et al. Parallel scalability and efficiency of HEVC parallelization approaches
JP5261376B2 (ja) Image encoding device and image decoding device
US8855203B2 (en) Video encoding apparatus and video decoding apparatus
US10645410B2 (en) Video decoding apparatus
JP4707118B2 (ja) Intra prediction scheme for moving image encoding device and moving image decoding device
RU2649775C1 (ru) Image encoding device, image decoding device, image encoding method, and image decoding method
US8107761B2 (en) Method for determining boundary strength
TW201724023A (zh) Image encoding device, image decoding device, image encoding method, image decoding method, and storage medium
US20150256827A1 (en) Video encoding device, video decoding device, video encoding method, and video decoding method
GB2539211A (en) Enhanced coding and decoding using intra block copy mode
JP2007013298A (ja) Image encoding device
JP2023018110A (ja) Method, apparatus, and system for encoding and decoding transformed blocks of video samples
CN113940065A (zh) Method, device, and system for encoding and decoding blocks of video samples
JP2008271068A (ja) Moving image encoding method, encoder for moving image parallel encoding, moving image parallel encoding method, moving image parallel encoding device, programs therefor, and computer-readable recording medium recording the programs
US20130107971A1 (en) Image encoding apparatus, image encoding method and program, image decoding apparatus, image decoding method and program
JPWO2013125171A1 (ja) Intra prediction mode determination device, intra prediction mode determination method, and intra prediction mode determination program
US20210160492A1 (en) Image processing apparatus and image processing method
JP2014135552A (ja) Moving image encoding device, moving image decoding device, moving image encoding method, and moving image decoding method
JP2011023842A (ja) Moving image encoding method and moving image encoding device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09812997

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2010528701

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09812997

Country of ref document: EP

Kind code of ref document: A1