WO2013001803A1 - Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program - Google Patents

Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program

Info

Publication number
WO2013001803A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion information
candidate
prediction
candidates
list
Prior art date
Application number
PCT/JP2012/004148
Other languages
French (fr)
Japanese (ja)
Inventor
英樹 竹原
博哉 中村
福島 茂
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2012143341A external-priority patent/JP5678924B2/en
Priority claimed from JP2012143340A external-priority patent/JP5807621B2/en
Priority to KR1020217000106A priority Critical patent/KR102279115B1/en
Priority to CN201280032664.5A priority patent/CN103636218B/en
Priority to KR1020227004619A priority patent/KR102464103B1/en
Priority to KR1020157019113A priority patent/KR20150088909A/en
Priority to KR1020187028735A priority patent/KR102004113B1/en
Application filed by 株式会社Jvcケンウッド
Priority to KR1020197020932A priority patent/KR102103682B1/en
Priority to KR1020147002407A priority patent/KR20140043242A/en
Priority to KR1020217021884A priority patent/KR102365353B1/en
Priority to KR1020207010416A priority patent/KR102200578B1/en
Publication of WO2013001803A1 publication Critical patent/WO2013001803A1/en
Priority to US14/109,629 priority patent/US9516314B2/en
Priority to US15/339,242 priority patent/US9686564B2/en
Priority to US15/422,656 priority patent/US9681149B1/en
Priority to US15/422,679 priority patent/US9693075B2/en
Priority to US15/422,694 priority patent/US9854266B2/en
Priority to US15/843,054 priority patent/US10009624B2/en

Links

Images

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
            • H04N19/10: using adaptive coding
              • H04N19/102: characterised by the element, parameter or selection affected or controlled by the adaptive coding
                • H04N19/103: Selection of coding mode or of prediction mode
                  • H04N19/105: Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
              • H04N19/134: characterised by the element, parameter or criterion affecting or controlling the adaptive coding
                • H04N19/146: Data rate or code amount at the encoder output
                  • H04N19/147: according to rate distortion criteria
                • H04N19/157: Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
              • H04N19/169: characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
                • H04N19/17: the unit being an image region, e.g. an object
                  • H04N19/172: the region being a picture, frame or field
                  • H04N19/176: the region being a block, e.g. a macroblock
              • H04N19/189: characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
                • H04N19/196: specially adapted for the computation of encoding parameters, e.g. by averaging previously computed encoding parameters
            • H04N19/42: characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
              • H04N19/423: characterised by memory arrangements
            • H04N19/46: Embedding additional information in the video signal during the compression process
            • H04N19/50: using predictive coding
              • H04N19/503: involving temporal prediction
                • H04N19/51: Motion estimation or motion compensation
                  • H04N19/513: Processing of motion vectors
                    • H04N19/517: Processing of motion vectors by encoding
                      • H04N19/52: by predictive encoding
                    • H04N19/521: Processing of motion vectors for estimating the reliability of the determined motion vectors or motion vector field, e.g. for smoothing the motion vector field or for correcting motion vectors
                  • H04N19/577: Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
            • H04N19/60: using transform coding
              • H04N19/61: in combination with predictive coding
            • H04N19/70: characterised by syntax aspects related to video coding, e.g. related to compression standards

Definitions

  • The present invention relates to moving-image coding techniques that use motion-compensated prediction, and in particular to an image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program that encode or decode the motion information used in motion-compensated prediction.
  • Motion-compensated prediction divides the target image into small blocks and, using a decoded image as the reference image, generates as the prediction signal the signal at the position displaced from the target block by the amount of motion indicated by a motion vector. Motion-compensated prediction may be unidirectional, using one motion vector, or bidirectional, using two motion vectors.
  • To improve compression efficiency, the motion vector of an encoded block adjacent to the processing-target block is used as a predicted motion vector (also simply called the "prediction vector"), the difference between the motion vector of the processing-target block and the prediction vector is computed, and this difference vector is transmitted as the coded vector.
  • MPEG-4 AVC improves the efficiency of motion-compensated prediction by using finer and more varied motion-compensation block sizes than MPEG-2.
  • As the block sizes become finer, however, the code amount of the coded vectors becomes a problem. In one approach, the motion vector of the block adjacent to the left of the processing-target block is simply used as the prediction vector; in another, the median of the motion vectors of several adjacent blocks is used as the prediction vector. Either way, the increase in the code amount of the coded vector is suppressed.
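As an illustrative sketch (not the standard's normative procedure), the median-based prediction vector and the resulting difference vector can be computed as follows; the neighbor positions and values are hypothetical:

```python
def median_mv(neighbors):
    """Component-wise median of neighboring motion vectors, given as
    (x, y) tuples; exact for an odd number of neighbors."""
    xs = sorted(mv[0] for mv in neighbors)
    ys = sorted(mv[1] for mv in neighbors)
    mid = len(neighbors) // 2
    return (xs[mid], ys[mid])

def difference_vector(mv, neighbors):
    """Difference between the block's motion vector and the median predictor;
    this is what would be transmitted as the coded vector."""
    pred = median_mv(neighbors)
    return (mv[0] - pred[0], mv[1] - pred[1])

# Left, above, and above-right neighbor motion vectors (hypothetical values)
neighbors = [(4, 0), (6, 2), (5, 1)]
print(median_mv(neighbors))                  # (5, 1)
print(difference_vector((7, 3), neighbors))  # (2, 2)
```

Because the difference vector is usually small when neighboring motion is coherent, it costs fewer bits to encode than the motion vector itself.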
  • Direct motion-compensated prediction is known in MPEG-4 AVC. It generates new motion vectors by scaling the motion vector of the block located at the same position as the processing-target block in another, already encoded image according to the temporal distances between the target image and the two reference images, thereby realizing motion-compensated prediction without transmitting a coded vector.
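A minimal sketch of this kind of scaling, in the spirit of temporal direct mode (the names and values are this sketch's own; floating-point is used for clarity, whereas real codecs specify fixed-point arithmetic):

```python
def scale_mv(mv_col, tb, td):
    """Scale the co-located block's motion vector by the ratio of temporal
    distances: tb is the distance from the target picture to its forward
    reference, td the distance spanned by the co-located motion vector."""
    return (round(mv_col[0] * tb / td), round(mv_col[1] * tb / td))

mv_col = (8, -4)                       # co-located motion vector (hypothetical)
mv_fwd = scale_mv(mv_col, tb=1, td=2)  # forward vector toward the past reference
mv_bwd = (mv_fwd[0] - mv_col[0],       # backward vector, AVC-style difference
          mv_fwd[1] - mv_col[1])
print(mv_fwd, mv_bwd)  # (4, -2) (-4, 2)
```

Both vectors are derived at the decoder from data it already has, which is why no coded vector needs to be transmitted.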
  • Motion-compensated prediction that uses the motion information of a block adjacent to the processing-target block, and thereby requires no coded vector to be transmitted, is also known (see, for example, Patent Document 1).
  • Direct motion-compensated prediction, which transmits no coded vector, exploits the continuity of motion between the processing-target block and the block at the same position in another, already encoded image. Patent Document 1, in turn, exploits the continuity of motion between the processing-target block and its adjacent blocks. In both cases, reusing the motion information of other blocks improves coding efficiency without encoding motion information that includes a difference vector.
  • However, when the motion of the processing-target block deviates from the motion of its adjacent blocks, or from the motion of blocks around the same position in another, already encoded image, motion information including a difference vector must be encoded, so the improvement in coding efficiency is not fully realized.
  • the present invention has been made in view of such circumstances, and an object thereof is to provide a technique for further improving the coding efficiency of motion information including a motion vector.
  • To solve this problem, an image encoding device according to one aspect of the present invention is an image encoding device that performs motion-compensated prediction, comprising: a candidate list generation unit (140) that selects, from a plurality of encoded blocks adjacent to the encoding-target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generates a candidate list containing motion information candidates used for motion-compensated prediction from the motion information of the selected blocks; a first motion information acquisition unit (161) that acquires motion information of a first prediction list from a first candidate included in the candidate list; a second motion information acquisition unit (162) that acquires motion information of a second prediction list from a second candidate included in the candidate list; and a selection candidate generation unit (163) that generates a new motion information candidate by combining the motion information of the first prediction list acquired by the first motion information acquisition unit (161) with the motion information of the second prediction list acquired by the second motion information acquisition unit (162).
  • The candidate list generation unit (140) may generate a candidate list that includes new candidates generated by the selection candidate generation unit (163) when the number of candidates is less than a set maximum number.
  • The candidate list generation unit (140) may generate a candidate list that includes one or more new candidates generated by the selection candidate generation unit (163) such that the number of candidates does not exceed the maximum number.
  • A code string generation unit (104) that encodes candidate specifying information for identifying, within the candidate list, the motion information candidate used for motion-compensated prediction may further be included.
  • The candidate list generation unit (140) may assign, to a new candidate generated by the selection candidate generation unit (163), candidate specifying information larger than that of the existing candidates.
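Candidate specifying information such as a merge index is typically signalled with a truncated unary code (the document's figure list mentions such a code string for the merge index), whose length grows with the index value; placing new candidates at larger indices therefore keeps them cheap to signal only when they are rarely chosen. A sketch of that binarization, not the patent's exact syntax:

```python
def truncated_unary(index, max_index):
    """Truncated unary binarization: `index` ones followed by a terminating
    zero, with the zero omitted for the largest possible index."""
    if index == max_index:
        return "1" * index
    return "1" * index + "0"

for i in range(4):
    print(i, truncated_unary(i, max_index=3))
# 0 0 / 1 10 / 2 110 / 3 111
```

Index 0 costs one bin while index 3 costs three, so frequently selected candidates should occupy the small indices.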
  • the first prediction list and the second prediction list may be different prediction lists.
  • the candidate list generation unit (140) may include motion information derived from motion information of a block of an image temporally different from the image including the encoding target block in the candidate list.
  • the first motion information acquisition unit (161) may search for the candidates according to a first priority order, and may select a valid candidate as the first candidate.
  • the second motion information acquisition unit (162) may search for the candidates according to a second priority order and set a valid candidate as the second candidate.
  • the first motion information acquisition unit (161) may use a predetermined candidate among the candidates as the first candidate.
  • the second motion information acquisition unit (162) may use another predetermined candidate among the candidates as the second candidate.
  • The selection candidate generation unit (163) may generate the new candidate when both the motion information of the first prediction list acquired by the first motion information acquisition unit (161) and the motion information of the second prediction list acquired by the second motion information acquisition unit (162) are valid.
  • the new candidate may have two pieces of motion information.
  • the new candidate may have one piece of motion information.
  • Another aspect of the present invention is an image encoding method.
  • This method is an image encoding method that performs motion-compensated prediction, comprising: selecting, from a plurality of encoded blocks adjacent to the encoding-target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generating a candidate list containing motion information candidates used for motion-compensated prediction from the motion information of the selected blocks; acquiring motion information of a first prediction list from a first candidate included in the candidate list; acquiring motion information of a second prediction list from a second candidate included in the candidate list; and generating a new motion information candidate by combining the motion information of the first prediction list with the motion information of the second prediction list.
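The combining step described above can be sketched roughly as follows; the `Candidate` structure, names, and values are this sketch's own assumptions, not the patent's actual data layout:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

MV = Tuple[int, int]

@dataclass
class Candidate:
    # Motion information per prediction list: (motion vector, reference index),
    # or None when the candidate does not use that list.
    l0: Optional[Tuple[MV, int]] = None
    l1: Optional[Tuple[MV, int]] = None

def add_combined_candidates(cand_list, max_num):
    """Append new bidirectional candidates built from the first-list (L0)
    motion information of one candidate and the second-list (L1) motion
    information of another, until the list reaches max_num."""
    pairs = [(i, j) for i in range(len(cand_list))
                    for j in range(len(cand_list)) if i != j]
    for i, j in pairs:
        if len(cand_list) >= max_num:
            break
        first, second = cand_list[i], cand_list[j]
        # Only combine when both pieces of motion information are valid.
        if first.l0 is not None and second.l1 is not None:
            new = Candidate(l0=first.l0, l1=second.l1)
            if new not in cand_list:   # skip duplicates
                cand_list.append(new)
    return cand_list

cands = [Candidate(l0=((4, 0), 0)), Candidate(l1=((-2, 1), 0))]
cands = add_combined_candidates(cands, max_num=5)
print(len(cands))  # 3: the two originals plus one combined bidirectional candidate
```

Note how the sketch reflects the optional claims above: a new candidate is generated only when both pieces of motion information are valid, and the list never exceeds the maximum number.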
  • An image decoding device according to yet another aspect of the present invention is an image decoding device that performs motion-compensated prediction, comprising: a candidate list generation unit (230) that selects, from a plurality of decoded blocks adjacent to the decoding-target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generates a candidate list containing motion information candidates used for motion-compensated prediction from the motion information of the selected blocks; a first motion information acquisition unit (161) that acquires motion information of a first prediction list from a first candidate included in the candidate list; a second motion information acquisition unit (162) that acquires motion information of a second prediction list from a second candidate included in the candidate list; and a selection candidate generation unit (163) that generates a new motion information candidate by combining the motion information of the first prediction list acquired by the first motion information acquisition unit (161) with the motion information of the second prediction list acquired by the second motion information acquisition unit (162).
  • The candidate list generation unit (230) may generate a candidate list that includes new candidates generated by the selection candidate generation unit (163) when the number of candidates is less than a set maximum number.
  • The candidate list generation unit (230) may generate a candidate list that includes one or more new candidates generated by the selection candidate generation unit (163) such that the number of candidates does not exceed the maximum number.
  • A selection unit (231) that selects one candidate from the candidates included in the candidate list generated by the candidate list generation unit (230) may further be provided.
  • The candidate list generation unit (230) may assign, to a new candidate generated by the selection candidate generation unit (163), candidate specifying information larger than that of the existing candidates.
  • the first prediction list and the second prediction list may be different prediction lists.
  • the candidate list generation unit (230) may include motion information derived from motion information of a block of an image temporally different from the image including the decoding target block in the candidate list.
  • the first motion information acquisition unit (161) may search for the candidates according to a first priority order, and may select a valid candidate as the first candidate.
  • the second motion information acquisition unit (162) may search for the candidates according to a second priority order and set a valid candidate as the second candidate.
  • the first motion information acquisition unit (161) may use a predetermined candidate among the candidates as the first candidate.
  • the second motion information acquisition unit (162) may use another predetermined candidate among the candidates as the second candidate.
  • The selection candidate generation unit (163) may generate the new candidate when both the motion information of the first prediction list acquired by the first motion information acquisition unit (161) and the motion information of the second prediction list acquired by the second motion information acquisition unit (162) are valid.
  • the new candidate may have two pieces of motion information.
  • the new candidate may have one piece of motion information.
  • Another aspect of the present invention is an image decoding method.
  • This method is an image decoding method that performs motion-compensated prediction, comprising: selecting, from a plurality of decoded blocks adjacent to the decoding-target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generating a candidate list containing motion information candidates used for motion-compensated prediction from the motion information of the selected blocks; acquiring motion information of a first prediction list from a first candidate included in the candidate list; acquiring motion information of a second prediction list from a second candidate included in the candidate list; and generating a new motion information candidate by combining the motion information of the first prediction list with the motion information of the second prediction list.
  • the encoding efficiency of motion information including motion vectors can be further improved.
  • FIGS. 2A and 2B are diagrams for explaining coding blocks.
  • FIGS. 3A to 3D are diagrams for explaining prediction blocks. A diagram for explaining prediction block sizes. A diagram for explaining predictive coding modes.
  • FIGS. 6A to 6D are diagrams for explaining the prediction directions of motion-compensated prediction. A diagram for explaining an example of the syntax of a prediction block.
  • FIGS. 8A to 8C are diagrams for explaining the truncated unary code string of a merge index. A diagram for explaining the configuration of the moving image encoding device according to Embodiment 1 of the present invention.
  • Diagrams for explaining the method of managing motion information in the motion information memory. A diagram for explaining the configuration of the motion information generation unit. A diagram for explaining the configuration of the difference vector calculation unit. A diagram for explaining the spatial candidate block group. A diagram for explaining the temporal candidate block group. A diagram for explaining the configuration of the combined motion information determination unit. A diagram for explaining the configuration of the combined motion information candidate generation unit.
  • FIGS. 19A and 19B are diagrams for explaining the conversion from a merge candidate number to a merge index. A flowchart for explaining the corresponding operation.
  • FIG. 10 is a flowchart for explaining the operation of the motion information generation unit in FIG. 9.
  • FIG. 12 is a flowchart for explaining the operation of the difference vector calculation unit in FIG. 11.
  • A flowchart for explaining the operation of the combined motion information determination unit. FIG. 18 is a flowchart for explaining the operation of the bidirectional combined motion information candidate list generation unit in FIG. 16. A flowchart for explaining a related operation.
  • FIGS. 31A to 31C are diagrams for explaining an extended example of determining the prediction direction of a bidirectional combined motion information candidate.
  • FIG. 33 is a diagram for explaining the configuration of the motion information reproducing unit in FIG. 32. A diagram for explaining the configuration of the motion vector reproducing unit.
  • FIG. 33 is a flowchart for explaining the operation of the motion information reproducing unit in FIG. 32. A flowchart for explaining a related operation.
  • FIGS. 40A and 40B are diagrams for explaining a candidate number management table according to Modification 1.
  • FIG. 10 is a diagram for explaining another candidate number management table according to Modification 1 of Embodiment 1. A flowchart for explaining a related derivation.
  • FIG. 10 is a flowchart for explaining the operation of the backward motion information determination unit according to Modification 3 of Embodiment 1.
  • FIG. 10 is a diagram for explaining the configuration of the combined motion information candidate generation unit according to Modification 4 of Embodiment 1. A diagram for explaining a related operation.
  • FIGS. 49(a) and 49(b) are diagrams for explaining predetermined combinations of BD0 and BD1 according to Modification 6 of Embodiment 1.
  • FIG. 10 is a diagram for explaining a predetermined combination of BD0 and BD1 according to Modification 6 of Embodiment 1.
  • FIG. 6 is a diagram (part 1) for explaining the effects of Embodiment 1.
  • FIG. 8 is a diagram (part 2) for explaining the effects of Embodiment 1.
  • FIG. 6 is a diagram (part 3) for explaining the effects of Embodiment 1.
  • FIGS. 53A and 53B are diagrams for explaining the syntax for encoding the candidate number management table of Embodiment 2 into the encoded stream.
  • FIG. 10 is a diagram for explaining a candidate number management table according to Embodiment 3.
  • FIG. 10 is a diagram for explaining the configuration of the combined motion information candidate generation unit according to Embodiment 3.
  • FIG. 10 is a flowchart for explaining the operation of the combined motion information candidate generation unit according to Embodiment 3.
  • FIG. 10 is a flowchart for explaining the operation of the candidate number management table changing unit according to Embodiment 3.
  • FIGS. 60A and 60B are diagrams for explaining the candidate number management table of the candidate number management table changing unit according to Modification 1 of Embodiment 3.
  • FIG. 15 is a flowchart for explaining the operation of the candidate number management table changing unit according to Modification 2 of Embodiment 3.
  • FIG. 22 is a flowchart for explaining the operation of the candidate number management table changing unit according to Modification 3 of Embodiment 3.
  • FIG. 10 is a flowchart for explaining the operation of the reference direction motion information determination unit according to Embodiment 4.
  • FIG. 25 is a diagram for explaining the configuration of the combined motion information candidate generation unit according to Embodiment 5.
  • FIG. 20 is a diagram for explaining a candidate number management table according to Embodiment 6.
  • FIG. 18 is a flowchart for explaining the operation of the reference direction determination unit according to Embodiment 6. A diagram for explaining the method of calculating the motion vector mvL0t of a temporal combined motion information candidate.
  • The MPEG-2 video (ISO/IEC 13818-2) coding scheme was established as a general-purpose video compression coding scheme, and is widely used in storage media applications such as DVDs and magnetic tapes for digital VTRs conforming to the D-VHS (registered trademark) standard, as well as in digital broadcasting.
  • In addition, a coding scheme called AVC (ISO/IEC 14496-10 in ISO/IEC, H.264 in ITU-T; hereinafter referred to as MPEG-4 AVC) has been established as an international standard.
  • In these schemes, an input image signal is divided into units of maximum coding blocks, as shown in FIG. 1, and the resulting coding blocks are processed in raster scan order.
  • the encoded block has a hierarchical structure, and can be made smaller encoded blocks by sequentially equally dividing into 4 in consideration of the encoding efficiency. Note that the encoded blocks divided into four are encoded in the zigzag scan order. An encoded block that cannot be further reduced is called a minimum encoded block.
  • A coding block is the unit of encoding, and the maximum coding block is itself a coding block when the number of divisions is 0. In this embodiment, the maximum coding block is 64 pixels × 64 pixels and the minimum coding block is 8 pixels × 8 pixels.
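As an illustrative sketch only (not part of the embodiment), the relation between the division count and the coding block size described above can be expressed as follows; the names `MAX_CU`, `MIN_CU`, and `cu_size` are hypothetical:

```python
MAX_CU = 64  # maximum coding block size in this embodiment (pixels)
MIN_CU = 8   # minimum coding block size in this embodiment (pixels)

def cu_size(division_count):
    """Each division splits a coding block into 4 equal parts,
    halving the edge length, so the size is 64 >> division_count."""
    size = MAX_CU >> division_count
    assert size >= MIN_CU, "cannot divide below the minimum coding block"
    return size
```

With these values, division counts 0 through 3 yield the four coding block sizes 64, 32, 16, and 8 pixels square.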
  • FIGS. 2A and 2B show an example of division of the maximum coding block.
  • In this example, the maximum coding block is divided into ten coding blocks.
  • CU0, CU1 and CU9 are 32 ⁇ 32 pixel coding blocks
  • CU2, CU3 and CU8 are 16 ⁇ 16 pixel coding blocks
  • CU4, CU5, CU6 and CU7 are 8 × 8 pixel coding blocks.
  • the encoded block is further divided into prediction blocks.
  • The prediction block division patterns are shown in FIGS. 3A to 3D. FIG. 3A shows 2N × 2N, in which the coding block is not divided; FIG. 3B shows 2N × N, divided horizontally; FIG. 3C shows N × 2N, divided vertically; and FIG. 3D shows N × N, divided both horizontally and vertically.
  • Prediction block sizes range from a maximum prediction block size of 64 pixels × 64 pixels at a CU division count of 0 to a minimum prediction block size of 4 pixels × 4 pixels at a CU division count of 3, giving 13 prediction block sizes in total.
  • the maximum encoding block is 64 pixels ⁇ 64 pixels and the minimum encoding block is 8 pixels ⁇ 8 pixels, but the present invention is not limited to this combination.
  • Although the prediction block division patterns are shown in FIGS. 3A to 3D, the division is not limited to these as long as the coding block is divided into one or more prediction blocks.
  • the prediction direction of motion compensation prediction and the number of encoded vectors can be switched by the block size of the prediction block.
  • an example of a predictive coding mode in which the prediction direction of motion compensation prediction and the number of coding vectors are associated will be briefly described with reference to FIG.
  • The predictive coding modes shown in FIG. 5 include a unidirectional mode (UniPred), in which the prediction direction of motion compensated prediction is unidirectional and the number of coded vectors is 1;
  • a bidirectional mode (BiPred), in which the prediction direction of motion compensated prediction is bidirectional and the number of coded vectors is 2;
  • a merge mode (MERGE), in which the prediction direction of motion compensated prediction is unidirectional or bidirectional and the number of coded vectors is 0;
  • and an intra mode (Intra), a predictive coding mode in which motion compensated prediction is not performed.
  • The reference image used in motion compensated prediction is encoded in the encoded stream as a reference image index, together with the coded vector.
  • the reference image index used in motion compensation prediction is a numerical value of 0 or more.
  • A plurality of reference images selectable by the reference image index are managed in a reference image list. If the prediction direction of motion compensated prediction is unidirectional, one reference image index is encoded; if it is bidirectional, a reference image index indicating the reference image in each prediction direction is encoded (see FIG. 5).
  • Prediction vector index: In HEVC, in order to improve the accuracy of the prediction vector, it has been proposed to select an optimal prediction vector from among a plurality of prediction vector candidates and to encode a prediction vector index indicating the selected prediction vector.
  • In HEVC, the prediction vector index described above is introduced. If the prediction direction of motion compensated prediction is unidirectional, one prediction vector index is encoded; if it is bidirectional, a prediction vector index indicating the prediction vector in each prediction direction is encoded (see FIG. 5).
  • Merge technique: In HEVC, the above-described merge index (merge technique) is introduced. As shown in FIG. 5, one merge index is encoded when the predictive coding mode is the merge mode. If the motion information is bidirectional, it includes motion vector information and reference image information for each prediction direction.
  • motion information possessed by a block that may be indicated by the merge index is referred to as a combined motion information candidate, and an aggregate of combined motion information candidates is referred to as a combined motion information candidate list.
  • FIG. 6A shows a case where the reference image (RefL0Pic) in the unidirectional direction and the L0 direction is at a time before the encoding target image (CurPic).
  • FIG. 6B shows a case where the reference image in the unidirectional direction and the L0 direction is at a time after the encoding target image.
  • the reference image in the L0 direction in FIGS. 6A and 6B may be replaced with a reference image in the L1 direction (RefL1Pic).
  • FIG. 6C shows a case where the reference image in the L0 direction is at a time before the encoding target image and the reference image in the L1 direction is at a time after the encoding target image.
  • FIG. 6D shows a case in which the reference image in the L0 direction and the reference image in the L1 direction are at a time before the encoding target image.
  • the reference image in the L0 direction in FIGS. 6C and 6D may be replaced with the reference image in the L1 direction (RefL1Pic), and the reference image in the L1 direction may be replaced with the reference image in the L0 direction.
  • As described above, the L0 direction and the L1 direction, which are the prediction directions of motion compensated prediction, can each point either forward or backward in time.
  • a plurality of reference images can exist in each of the L0 direction and the L1 direction, the reference image in the L0 direction is registered in the reference image list L0, and the reference image in the L1 direction is registered in the reference image list L1, The position of the reference image in the reference image list is designated by the reference image index in each prediction direction, and the reference image is determined.
  • A prediction direction of L0 means that motion information associated with a reference image registered in reference image list L0 is used;
  • a prediction direction of L1 means that motion information associated with a reference image registered in reference image list L1 is used.
  • The prediction block has a merge flag (merge_flag), a merge index (merge_idx), a direction of motion compensated prediction (inter_pred_type), reference indexes (ref_idx_l0 and ref_idx_l1), difference vectors (mvd_l0[0], mvd_l0[1], mvd_l1[0], mvd_l1[1]), and prediction vector indexes (mvp_idx_l0 and mvp_idx_l1). [0] of a difference vector indicates the horizontal component and [1] the vertical component.
  • ref_idx_l0 and mvd_l0 [0], mvd_l0 [1], and mvp_idx_l0 are information regarding the L0 direction
  • ref_idx_l1 and mvd_l1 [0], mvd_l1 [1], and mvp_idx_l1 are information regarding the L1 direction.
  • inter_pred_type takes one of Pred_L0 (unidirectional, L0 direction), Pred_L1 (unidirectional, L1 direction), and Pred_BI (bidirectional).
  • the merge mode can transmit motion information with one merge index. Therefore, if the prediction errors in the merge mode (merge flag is 1) and non-merge mode (merge flag is 0) are about the same, the merge mode can more efficiently encode motion information. In other words, the efficiency of motion information encoding can be improved by increasing the selection rate of the merge mode.
  • Here, it is assumed that motion information is encoded with less information in the merge mode than in the non-merge mode, but the present invention is not limited to this.
  • the motion information may be only a difference vector.
  • NumMergeCands(), a function that calculates the number of merge candidates before decoding (encoding) the merge index, and NumMvpCands(), a function that calculates the number of prediction vector candidates before decoding (encoding) the prediction vector index, are provided.
  • These functions are necessary because the number of merge candidates and the number of prediction vector candidates change for each prediction block depending on the validity of the motion information of adjacent blocks. Note that motion information of an adjacent block being valid means that the adjacent block is neither an out-of-area block nor in intra mode, and being invalid means that the adjacent block is an out-of-area block or in intra mode.
  • FIG. 8A shows the merge index code string using a Truncated Unary code string when the number of merge candidates is two;
  • FIG. 8B shows the merge index code string using a Truncated Unary code string when the number of merge candidates is three;
  • FIG. 8C shows the merge index code string using a Truncated Unary code string when the number of merge candidates is four.
  • In general, the coding efficiency of the merge index improves as the number of merge candidates decreases. That is, the coding efficiency of the merge index can be improved by keeping candidates with a high selection rate and reducing candidates with a low selection rate. In addition, when the number of candidates is the same, a smaller merge index has a smaller code amount, so coding efficiency can also be improved by assigning small merge indexes to candidates with a high selection rate.
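As an illustrative sketch only (not part of the embodiment), the Truncated Unary code strings of FIGS. 8A to 8C can be generated as follows; the function name `truncated_unary` is hypothetical:

```python
def truncated_unary(index, num_candidates):
    """Truncated Unary code string for a merge index.

    Index k is coded as k '1's followed by a terminating '0';
    the largest index (num_candidates - 1) omits the trailing '0',
    which is why fewer candidates means shorter code strings."""
    if index < num_candidates - 1:
        return "1" * index + "0"
    return "1" * index
```

For example, with four merge candidates the code strings are "0", "10", "110", and "111", matching FIG. 8C; with two candidates they shorten to "0" and "1".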
  • POC (Picture Order Count) is time information of an image.
  • POC is a counter indicating the display order of images, as defined in MPEG-4 AVC.
  • When the display order advances by one image, the POC also increases by 1. Therefore, the time difference (distance) between images can be obtained from the difference between their POCs.
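As an illustrative sketch only (not part of the embodiment), the inter-picture distance derived from POCs reduces to a simple difference; the function name `temporal_distance` is hypothetical:

```python
def temporal_distance(poc_a, poc_b):
    # POC advances by 1 per image in display order, so the time
    # difference (distance) between two images is the POC difference.
    return abs(poc_a - poc_b)
```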
  • the degree of correlation between the motion information of the processing target block and the motion information of the block adjacent to the processing target block is high when the processing target block and the adjacent block have the same motion. For example, this is a case where the region including the processing target block and the adjacent block is translated.
  • the degree of correlation between the motion information of the processing target block and the motion information of the adjacent block also depends on the length of contact between the processing target block and the adjacent block.
  • The degree of correlation between the motion information of the processing target block and the motion information of the block at the same position on another decoded image (hereinafter, the same-position block), generally used in the temporal direct mode or the spatial direct mode, is high when the same-position block and the processing target block are in a stationary state.
  • FIG. 9 shows a configuration of moving picture coding apparatus 100 according to Embodiment 1 of the present invention.
  • the moving image encoding apparatus 100 is an apparatus that encodes a moving image signal in units of prediction blocks for performing motion compensation prediction. It is assumed that the coding block division, the prediction block size determination, and the prediction encoding mode determination are determined by the higher-order encoding control unit.
  • the moving picture encoding apparatus 100 is realized by hardware such as an information processing apparatus including a CPU (Central Processing Unit), a frame memory, a hard disk, and the like.
  • the moving image encoding apparatus 100 realizes functional components described below by operating the above components. Note that the position information of the prediction block to be processed, the prediction block size, and the prediction direction of motion compensated prediction are assumed to be shared in the video encoding device 100 and are not shown.
  • the moving image encoding apparatus 100 includes a prediction block image acquisition unit 101, a subtraction unit 102, a prediction error encoding unit 103, a code string generation unit 104, a prediction error decoding unit 105, a motion compensation unit 106, and an addition unit. 107, a motion vector detection unit 108, a motion information generation unit 109, a frame memory 110, and a motion information memory 111.
  • The prediction block image acquisition unit 101 acquires the image signal of the prediction block to be processed from the image signal supplied from the terminal 10, based on the position information and the prediction block size of the prediction block, and supplies the image signal of the prediction block to the subtraction unit 102, the motion vector detection unit 108, and the motion information generation unit 109.
  • the subtraction unit 102 subtracts the image signal supplied from the prediction block image acquisition unit 101 and the prediction signal supplied from the motion compensation unit 106 to calculate a prediction error signal, and calculates the prediction error signal to the prediction error encoding unit 103. To supply.
  • The prediction error encoding unit 103 performs processing such as quantization and orthogonal transform on the prediction error signal supplied from the subtraction unit 102 to generate prediction error encoded data, and supplies the prediction error encoded data to the code string generation unit 104 and the prediction error decoding unit 105.
  • The code string generation unit 104 entropy-codes, according to the syntax, the prediction error encoded data supplied from the prediction error encoding unit 103 together with the merge flag, merge candidate number, prediction direction of motion compensated prediction, reference image index, difference vector, and prediction vector index supplied from the motion information generation unit 109, generating a code string, and supplies the code string to the terminal 11.
  • the merge candidate number is converted into a merge index to generate a code string.
  • the merge candidate number is a number indicating the selected combined motion information candidate. The conversion from the merge candidate number to the merge index will be described later.
  • Since a Truncated Unary code string is used for encoding the merge index and the prediction vector index as described above, the smaller the number of candidates, the fewer bits are needed to encode them.
  • the prediction error decoding unit 105 performs a process such as inverse quantization or inverse orthogonal transform on the prediction error encoded data supplied from the prediction error encoding unit 103 to generate a prediction error signal, and the prediction error signal Is supplied to the adder 107.
  • the motion compensation unit 106 performs motion compensation on the reference image in the frame memory 110 indicated by the reference image index supplied from the motion information generation unit 109 based on the motion vector supplied from the motion information generation unit 109, and generates a prediction signal. Generate. If the prediction direction is bidirectional, the prediction signal is obtained by averaging the prediction signals in the L0 direction and the L1 direction.
  • the addition unit 107 adds the prediction error signal supplied from the prediction error decoding unit 105 and the prediction signal supplied from the motion compensation unit 106 to generate a decoded image signal, and supplies the decoded image signal to the frame memory 110. To do.
  • the motion vector detection unit 108 detects a motion vector and a reference image index indicating the reference image from the image signal supplied from the prediction block image acquisition unit 101 and an image signal corresponding to a plurality of reference images, and the motion vector and the motion vector The reference image index is supplied to the motion information generation unit 109. If the prediction direction is bidirectional, motion vectors and reference image indexes in the L0 direction and the L1 direction are detected.
  • A general motion vector detection method calculates an error evaluation value between the image signal of the target image and the image signal of a region of the reference image displaced by a predetermined amount from the same position, and takes the displacement that minimizes the error evaluation value as the motion vector.
  • The error evaluation value may be, for example, the SAD (Sum of Absolute Differences) or the MSE (Mean Square Error).
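As an illustrative sketch only (not the claimed detection method), the general SAD-based full search described above can be written as follows; the function names `sad` and `search_motion_vector` and the search radius parameter are hypothetical:

```python
def sad(block_a, block_b):
    # Sum of Absolute Differences between two equally sized 2-D blocks
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def search_motion_vector(cur, ref, bx, by, size, radius):
    """Evaluate every displacement within +/-radius of the same position
    and keep the one whose error evaluation value (SAD) is smallest."""
    cur_block = [row[bx:bx + size] for row in cur[by:by + size]]
    best, best_cost = (0, 0), float('inf')
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + size > len(ref[0]) or y + size > len(ref):
                continue  # candidate block would fall outside the reference
            cand = [row[x:x + size] for row in ref[y:y + size]]
            cost = sad(cur_block, cand)
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best, best_cost
```

A real encoder would use a faster search pattern; this exhaustive version only illustrates "minimize the error evaluation value over candidate displacements."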
  • The motion information generation unit 109 determines a merge candidate number, or a difference vector and a prediction vector index, using the motion vector and reference image index supplied from the motion vector detection unit 108, the candidate block group supplied from the motion information memory 111, and the reference image indicated by the reference image index in the frame memory 110, and supplies the merge flag, merge candidate number, reference image index, difference vector, and prediction vector index to the code string generation unit 104, the motion compensation unit 106, and the motion information memory 111. A detailed configuration of the motion information generation unit 109 will be described later.
  • The frame memory 110 stores the decoded image signal supplied from the addition unit 107. In addition, decoded images for which decoding of the entire image has been completed are stored as reference images, up to a predetermined number of one or more images.
  • the frame memory 110 supplies the stored reference image signal to the motion compensation unit 106 and the motion information generation unit 109.
  • a storage area for storing the reference image is controlled by a FIFO (First In First Out) method.
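As an illustrative sketch only (not part of the embodiment), FIFO control of the reference image storage area can be modeled with a bounded queue; the names `MAX_REF`, `ref_buffer`, and `store_decoded_picture`, and the buffer size of 4, are hypothetical:

```python
from collections import deque

MAX_REF = 4  # predetermined number of reference images (assumed value)
ref_buffer = deque(maxlen=MAX_REF)  # FIFO: oldest reference evicted first

def store_decoded_picture(pic):
    """Once a whole picture has been decoded it enters the reference
    buffer; when the buffer is full the oldest reference is dropped."""
    ref_buffer.append(pic)
```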
  • the motion information memory 111 stores the motion information supplied from the motion information generation unit 109 for a predetermined number of images in units of the minimum predicted block size.
  • The motion information of blocks adjacent to the prediction block to be processed is set as a spatial candidate block group.
  • The motion information memory 111 sets, as a temporal candidate block group, the motion information of the block on ColPic at the same position as the prediction block to be processed and of its surrounding blocks.
  • the motion information memory 111 supplies the spatial candidate block group and the temporal candidate block group to the motion information generation unit 109 as candidate block groups.
  • the motion information memory 111 is synchronized with the frame memory 110 and is controlled by a FIFO (First In First Out) method.
  • ColPic is a decoded image different from the image containing the prediction block to be processed, and is stored in the frame memory 110 as a reference image.
  • In this embodiment, ColPic is the reference image decoded immediately before. ColPic is not limited to this, however; for example, the immediately preceding reference image in display order or the immediately following reference image in display order may be used, and ColPic can also be specified in the encoded stream.
  • FIG. 10 shows a state where the predicted block size to be processed is 16 pixels ⁇ 16 pixels.
  • the motion information of the prediction block is stored in 16 memory areas indicated by hatching in FIG.
  • If the predictive coding mode is the intra mode, (0, 0) is stored as the motion vector in the L0 and L1 directions, and “−1” is stored as the reference image index in the L0 and L1 directions.
  • The reference image index “−1” may be any value as long as it allows determining that the mode does not perform motion compensated prediction. Hereinafter, unless otherwise stated, the simple term “block” refers to the minimum prediction block unit. In the case of an out-of-area block, as in the intra mode, (0, 0) is stored as the motion vector in the L0 and L1 directions, and “−1” is stored as the reference image index in the L0 and L1 directions.
  • That the LX direction (X is 0 or 1) of motion information is valid means that the reference image index in the LX direction is 0 or more; that the LX direction is invalid means that the reference image index in the LX direction is “−1”.
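As an illustrative sketch only (not part of the embodiment), the stored motion information and the LX validity rule above can be modeled as follows; the type `MotionInfo` and function `lx_valid` are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MotionInfo:
    mv_l0: tuple       # motion vector (x, y) for the L0 direction
    mv_l1: tuple       # motion vector (x, y) for the L1 direction
    ref_idx_l0: int    # reference image index; -1 marks "no motion compensation"
    ref_idx_l1: int

# Stored for intra-mode and out-of-area blocks: (0, 0) vectors, index -1
INTRA = MotionInfo((0, 0), (0, 0), -1, -1)

def lx_valid(info, x):
    """The LX direction (X = 0 or 1) is valid iff its reference index >= 0."""
    return (info.ref_idx_l0 if x == 0 else info.ref_idx_l1) >= 0
```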
  • the motion information generation unit 109 includes a difference vector calculation unit 120, a combined motion information determination unit 121, and a predictive coding mode determination unit 122.
  • The terminal 12 is connected to the motion information memory 111, the terminal 13 to the motion vector detection unit 108, the terminal 14 to the frame memory 110, the terminal 15 to the prediction block image acquisition unit 101, the terminal 16 to the code string generation unit 104, the terminal 50 to the motion compensation unit 106, and the terminal 51 to the motion information memory 111.
  • The difference vector calculation unit 120 determines a prediction vector index from the candidate block group supplied from the terminal 12, the motion vector and reference image index supplied from the terminal 13, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15, and calculates a difference vector and a rate-distortion evaluation value. It then supplies the reference image index, the motion vector, the difference vector, the prediction vector index, and the rate-distortion evaluation value to the predictive coding mode determination unit 122.
  • the detailed configuration of the difference vector calculation unit 120 will be described later.
  • the combined motion information determination unit 121 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15. Then, the combined motion information determination unit 121 selects a combined motion information candidate from the generated combined motion information candidate list, determines a merge candidate number, calculates a rate distortion evaluation value, and combines the combined motion information candidate. Motion information, the merge candidate number, and the rate distortion evaluation value are supplied to the predictive coding mode determination unit 122. A detailed configuration of the combined motion information determination unit 121 will be described later.
  • the predictive coding mode determination unit 122 compares the rate distortion evaluation value supplied from the difference vector calculation unit 120 with the rate distortion evaluation value supplied from the combined motion information determination unit 121. If the former is less than the latter, the merge flag is set to “0”. The predictive coding mode determination unit 122 supplies the merge flag, the reference image index, the difference vector, and the prediction vector index supplied from the difference vector calculation unit 120 to the terminal 16, and the motion vector supplied from the difference vector calculation unit 120. The reference image index is supplied to the terminal 50 and the terminal 51.
  • Otherwise, the merge flag is set to “1”, and the predictive coding mode determination unit 122 supplies the merge flag and the merge candidate number supplied from the combined motion information determination unit 121 to the terminal 16, and supplies the motion vector and reference image index of the motion information supplied from the combined motion information determination unit 121 to the terminal 50 and the terminal 51.
  • the difference vector calculation unit 120 includes a prediction vector candidate list generation unit 130, a prediction vector determination unit 131, and a subtraction unit 132.
  • the terminal 17 is connected to the predictive coding mode determination unit 122.
  • The prediction vector candidate list generation unit 130 is also installed in the moving picture decoding apparatus 200, which decodes the code string generated by the moving picture encoding apparatus 100 according to the first embodiment, so that the moving picture encoding apparatus 100 and the moving picture decoding apparatus 200 generate consistent prediction vector candidate lists.
  • The prediction vector candidate list generation unit 130 deletes out-of-area candidate blocks and intra-mode candidate blocks from the candidate block group supplied from the terminal 12. Further, when a plurality of candidate blocks have overlapping motion vectors, one candidate block is kept and the others are deleted. The prediction vector candidate list generation unit 130 generates a prediction vector candidate list from the candidate blocks remaining after deletion and supplies the prediction vector candidate list to the prediction vector determination unit 131. The prediction vector candidate list generated in this way contains one or more prediction vector candidates that do not overlap. For example, when there is no candidate block having a motion vector, the vector (0, 0) is added to the prediction vector candidate list. If the prediction direction is bidirectional, a prediction vector candidate list is generated and supplied for each of the L0 and L1 directions.
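As an illustrative sketch only (not the claimed implementation), the list construction just described can be outlined as follows; the function name `build_mvp_candidate_list` and the dictionary shape of the candidate blocks are hypothetical:

```python
def build_mvp_candidate_list(candidate_blocks):
    """candidate_blocks: entries like {'mode': 'inter', 'mv': (x, y)},
    {'mode': 'intra'}, or None for out-of-area blocks (assumed shapes).

    Deletes out-of-area and intra-mode candidates, removes duplicate
    motion vectors keeping the first occurrence, and falls back to the
    (0, 0) vector when no candidate with a motion vector remains."""
    mvp_list = []
    for blk in candidate_blocks:
        if blk is None or blk['mode'] == 'intra':
            continue  # invalid motion information: out-of-area or intra
        if blk['mv'] not in mvp_list:
            mvp_list.append(blk['mv'])
    if not mvp_list:
        mvp_list.append((0, 0))
    return mvp_list
```

The resulting list always holds one or more non-overlapping prediction vector candidates, as the text requires.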
  • the prediction vector determination unit 131 selects an optimal prediction vector for the motion vector supplied from the terminal 13 from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 130.
  • The prediction vector determination unit 131 supplies the selected prediction vector to the subtraction unit 132, and supplies the reference image index and the prediction vector index, which is information indicating the selected prediction vector, to the terminal 17. If the prediction direction is bidirectional, an optimal prediction vector is selected and supplied for each of the L0 and L1 directions.
  • Here, for each prediction vector candidate, a prediction error amount is calculated from the reference image supplied from the terminal 14 and the image signal supplied from the terminal 15 based on the candidate's motion vector; a rate-distortion evaluation value is then calculated from the prediction error amount and the code amounts of the reference image index, the difference vector, and the prediction vector index, and the prediction vector candidate that minimizes the rate-distortion evaluation value is selected as the optimal prediction vector.
  • the subtraction unit 132 calculates a difference vector by subtracting the prediction vector supplied from the prediction vector determination unit 131 from the motion vector supplied from the terminal 13, and supplies the difference vector to the terminal 17. If the prediction direction is bidirectional, a difference vector is calculated and supplied for the L0 direction and the L1 direction.
  • the candidate block group includes a spatial candidate block group and a temporal candidate block group.
  • FIG. 13 shows adjacent blocks of the prediction block to be processed when the prediction block size to be processed is 16 pixels ⁇ 16 pixels.
  • The spatial candidate block group is assumed to be the five blocks A1, C, D, B1, and E shown in FIG. 13.
  • In this embodiment, the spatial candidate block group is the five blocks A1, C, D, B1, and E, but it may be any set of at least one processed block adjacent to the prediction block to be processed, and is not limited to these. For example, all of blocks A1, A2, A3, A4, B1, B2, B3, B4, C, D, and E may be spatial candidate blocks.
  • FIG. 14 shows a block in a prediction block on ColPic and its peripheral blocks at the same position as the prediction block to be processed when the prediction block size to be processed is 16 pixels ⁇ 16 pixels.
  • The temporal candidate block group consists of the two blocks H and I6 shown in FIG. 14.
  • In this embodiment, the temporal candidate block group is the two blocks H and I6 on ColPic, but it may be any set of at least zero blocks on a decoded image different from the image containing the prediction block to be processed, and is not limited to these. For example, all of blocks I1 to I16, A1 to A4, B1 to B4, C, D, E, F1 to F4, G1 to G4, and H on ColPic may be temporal candidate blocks.
  • Hereinafter, block A4 is referred to as block A, block B4 as block B, and blocks H and I6 as temporal blocks.
  • the combined motion information determination unit 121 includes a combined motion information candidate generation unit 140 and a combined motion information selection unit 141.
  • The combined motion information candidate generation unit 140 is also installed in the moving picture decoding apparatus 200, which decodes the code string generated by the moving picture encoding apparatus 100 according to the first embodiment, so that the moving picture encoding apparatus 100 and the moving picture decoding apparatus 200 generate identical, consistent combined motion information lists.
  • the combined motion information candidate generation unit 140 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12, and supplies the combined motion information candidate list to the combined motion information selection unit 141. A detailed configuration of the combined motion information candidate generation unit 140 will be described later.
  • The combined motion information selection unit 141 selects an optimal combined motion information candidate from the combined motion information candidate list supplied from the combined motion information candidate generation unit 140, and supplies the merge candidate number, which is information indicating the selected combined motion information candidate, to the terminal 17.
  • Here, for each combined motion information candidate, a prediction error amount is calculated from the image signal supplied from the terminal 15 and the reference image supplied from the terminal 14, obtained based on the candidate's motion vector and reference image index; a rate-distortion evaluation value is then calculated from the prediction error amount and the code amount of the merge candidate number, and the combined motion information candidate that minimizes the rate-distortion evaluation value is selected as the optimal candidate.
  • the candidate block group includes a spatial candidate block group and a temporal candidate block group.
  • The spatial candidate block group is assumed to be the four blocks A4, B4, C, and E shown in FIG. 13.
  • In this embodiment, the spatial candidate block group is the four blocks A4, B4, C, and E, but it may be any set of at least one processed block adjacent to the prediction block to be processed, and is not limited to these.
  • The temporal candidate block group consists of the two blocks H and I6 shown in FIG. 14.
  • The temporal candidate block group is the same as that supplied to the prediction vector candidate list generation unit 130, but it may be any set of at least zero blocks on a decoded image different from the image containing the prediction block to be processed, and is not limited to these.
  • FIG. 16 shows a configuration of the combined motion information candidate generation unit 140.
  • the terminal 18 is connected to the combined motion information selection unit 141.
  • the combined motion information candidate generation unit 140 includes a unidirectional combined motion information candidate list generation unit 150, a first combined motion information candidate list reduction unit 151, a bidirectional combined motion information candidate list generation unit 152, and a second combined motion information candidate list reduction unit 153.
  • the unidirectional combined motion information candidate list generation unit 150 generates a first combined motion information candidate list from the candidate block group supplied from the terminal 12, and supplies the first combined motion information candidate list to the first combined motion information candidate list reduction unit 151.
  • when a plurality of combined motion information candidates in the first combined motion information candidate list supplied from the unidirectional combined motion information candidate list generation unit 150 have identical motion information, the first combined motion information candidate list reduction unit 151 deletes all but one of those candidates to generate a second combined motion information candidate list, and supplies the second combined motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
  • the bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate list from the second combined motion information candidate list supplied from the first combined motion information candidate list reduction unit 151, combines the bidirectional combined motion information candidate list with the second combined motion information candidate list to generate a third combined motion information candidate list, and supplies the third combined motion information candidate list to the second combined motion information candidate list reduction unit 153.
  • a detailed configuration of the bidirectional combined motion information candidate list generation unit 152 will be described later.
  • the bidirectional combined motion information candidate list generation unit 152 is assumed to generate a bidirectional combined motion information candidate (BD0) whose reference direction is L0 and a bidirectional combined motion information candidate (BD1) whose reference direction is L1; therefore, BD0 and BD1 may be included in the bidirectional combined motion information candidate list described above.
  • when a plurality of combined motion information candidates in the third combined motion information candidate list supplied from the bidirectional combined motion information candidate list generation unit 152 have identical motion information, the second combined motion information candidate list reduction unit 153 deletes all but one of those candidates to generate the combined motion information candidate list, and supplies the combined motion information candidate list to the terminal 18.
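The duplicate removal performed by the two reduction units can be sketched as below; the tuple layout used to represent a candidate's motion information is an illustrative assumption.

```python
# Sketch of the first/second combined motion information candidate list
# reduction units: when several candidates carry identical motion
# information, only the first occurrence is kept.

def prune_duplicates(candidate_list):
    seen = set()
    pruned = []
    for cand in candidate_list:
        key = tuple(cand)  # the full motion information must match to count as a duplicate
        if key not in seen:
            seen.add(key)
            pruned.append(cand)
    return pruned

# Hypothetical encoding: (prediction direction, mvL0, refIdxL0, mvL1, refIdxL1)
first_list = [
    ("BI", (1, 2), 0, (3, 4), 0),
    ("L0", (1, 2), 0, None, None),
    ("BI", (1, 2), 0, (3, 4), 0),  # duplicate of the first entry; deleted
]
second_list = prune_duplicates(first_list)
```

Keeping the first occurrence preserves the merge-candidate-number ordering of the surviving candidates.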
  • the unidirectional combined motion information candidate is a motion information candidate of a candidate block used in a so-called merge technique, and is motion information obtained from one candidate block.
  • the bidirectional combined motion information candidate, which is a feature of the first embodiment, is motion information obtained by combining two pieces of motion information from two candidate blocks; in this embodiment, one piece of motion information in the L0 direction and one in the L1 direction are used.
  • FIG. 17 shows a configuration of the bidirectional combined motion information candidate list generation unit 152.
  • the terminal 19 is connected to the first combined motion information candidate list reduction unit 151, and the terminal 20 is connected to the second combined motion information candidate list reduction unit 153.
  • the bidirectional combined motion information candidate list generation unit 152 includes a reference direction determination unit 160, a reference direction motion information determination unit 161, a reverse direction motion information determination unit 162, and a bidirectional motion information determination unit 163.
  • the reference direction determination unit 160 determines the reference direction of the bidirectional combined motion information candidate from the second combined motion information candidate list, and sends the reference direction together with the second combined motion information candidate list supplied from the terminal 19 to the reference direction motion information determination unit 161.
  • the reference direction of the bidirectional combined motion information candidate (BD0) whose reference direction is L0 is the L0 direction, and the reference direction of the bidirectional combined motion information candidate (BD1) whose reference direction is L1 is the L1 direction.
  • the reference direction motion information determination unit 161 determines the reference direction motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction and the second combined motion information candidate list supplied from the reference direction determination unit 160. The reference direction, the motion vector in the reference direction, the reference image index, and the second combined motion information candidate list are sent to the backward direction motion information determination unit 162.
  • the backward direction motion information determination unit 162 determines the backward motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction, the reference direction motion vector and reference image index, and the second combined motion information candidate list supplied from the reference direction motion information determination unit 161.
  • the backward motion information determination unit 162 sends the motion vector in the reference direction and the reference image index, the backward motion vector and the reference image index, and the second combined motion information candidate list to the bidirectional motion information determination unit 163.
  • if the reference direction is the L0 direction, the reverse direction is the L1 direction; if the reference direction is the L1 direction, the reverse direction is the L0 direction.
  • the bidirectional motion information determination unit 163 determines a bidirectional combined motion information candidate from the reference direction motion vector and reference image index and the backward motion vector and reference image index supplied from the backward direction motion information determination unit 162. In addition, the bidirectional motion information determination unit 163 generates a third combined motion information candidate list from the second combined motion information candidate list, and sends the third combined motion information candidate list to the terminal 20.
  • merge candidate numbers 0 to 6 indicate, respectively, the combined motion information candidate (A) of block A, the combined motion information candidate (B) of block B, the combined motion information candidate (COL) of the temporal block, the combined motion information candidate (C) of block C, the combined motion information candidate (E) of block E, the bidirectional combined motion information candidate (BD0) with reference direction L0, and the bidirectional combined motion information candidate (BD1) with reference direction L1, all included in the combined motion information candidate list.
  • the maximum number of combined motion information candidates included in the combined motion information candidate list is 7 (the maximum value of the merge index is 6).
  • the merge candidate numbers of the bidirectional combined motion information candidate (BD0) with reference direction L0 and the bidirectional combined motion information candidate (BD1) with reference direction L1 are assigned to be larger than the merge candidate numbers of the unidirectional combined motion information candidates.
  • although the candidate number management table used in Embodiment 1 is shown in FIG. 18, it is only necessary that a smaller merge candidate number be assigned to a combined motion information candidate with a higher selection rate, and the table is not limited to this.
  • the maximum number of combined motion information candidates included in the candidate number management table and the combined motion information candidate list is assumed to be shared in the moving picture coding apparatus 100, and is not illustrated.
  • conversion from the merge candidate number to the merge index will be described with reference to FIGS.
  • FIG. 19A shows the case where the combined motion information candidate of block A, the combined motion information candidate of block B, the combined motion information candidate of the temporal block, the combined motion information candidate of block C, the combined motion information candidate of block E, the bidirectional combined motion information candidate with reference direction L0, and the bidirectional combined motion information candidate with reference direction L1 are all valid; in this case, the merge candidate number becomes the merge index as it is.
  • FIG. 19B shows the case where the combined motion information candidates include invalid blocks; the invalid merge candidate numbers are skipped, and merge indexes are assigned in ascending order of the remaining merge candidate numbers.
  • merge index 0 is converted to merge candidate number 0, merge index 1 to merge candidate number 2, merge index 2 to merge candidate number 3, merge index 3 to merge candidate number 5, and merge index 4 to merge candidate number 6.
  • the merge indexes of the bidirectional combined motion information candidate (BD0) with reference direction L0 and the bidirectional combined motion information candidate (BD1) with reference direction L1 are assigned to be larger than the merge indexes of the unidirectional combined motion information candidates.
  • in the moving picture decoding apparatus 200, the reverse conversion from the merge index to the merge candidate number is performed, so that the moving picture encoding apparatus 100 and the moving picture decoding apparatus 200 generate the same candidate number management table without contradiction.
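The candidate-number-to-merge-index conversion of FIGS. 19A and 19B can be sketched as follows; the validity flags are an illustrative assumption chosen to reproduce the example of FIG. 19B, and the same tables serve for the reverse conversion on the decoder side.

```python
# Sketch of the FIG. 19 conversion: invalid merge candidate numbers are
# skipped and merge indexes are assigned in ascending order of the
# remaining candidate numbers.

def build_index_tables(validity):
    """validity[n] is True when merge candidate number n is valid."""
    index_to_number = [n for n, ok in enumerate(validity) if ok]
    number_to_index = {n: i for i, n in enumerate(index_to_number)}
    return index_to_number, number_to_index

# Illustrative case matching FIG. 19B: candidate numbers 1 and 4 are invalid.
validity = [True, False, True, True, False, True, True]
index_to_number, number_to_index = build_index_tables(validity)
```

Because both encoder and decoder derive the tables from the same validity information, no extra signalling of the mapping is needed.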
  • the prediction block image acquisition unit 101 acquires the image signal of the prediction block to be processed from the image signal supplied from the terminal 10 based on the position information of the prediction block and the prediction block size (S100).
  • the motion vector detection unit 108 detects a reference image index indicating a motion vector and a reference image from the image signal supplied from the predicted block image acquisition unit 101 and image signals corresponding to a plurality of reference images (S101).
  • the motion information generation unit 109 generates a merge candidate number or a difference vector and a prediction vector index from the motion vector and reference image index supplied from the motion vector detection unit 108 and the candidate block group supplied from the motion information memory 111. (S102).
  • the motion compensation unit 106 performs motion compensation on the reference image indicated by the reference image index in the frame memory 110 based on the motion vector supplied from the motion vector detection unit 108 to generate a prediction signal. If the prediction direction is bidirectional, an average of the prediction signals in the L0 direction and the L1 direction is generated as a prediction signal (S103).
  • the subtraction unit 102 calculates a difference between the image signal supplied from the prediction block image acquisition unit 101 and the prediction signal supplied from the motion compensation unit 106 to calculate a prediction error signal (S104).
  • the prediction error encoding unit 103 performs processing such as quantization and orthogonal transformation on the prediction error signal supplied from the subtraction unit 102 to generate prediction error encoded data (S105).
  • the code string generation unit 104 entropy-codes, according to the syntax, the prediction error encoded data supplied from the prediction error encoding unit 103 together with the merge flag, merge candidate number, prediction direction, reference image index, difference vector, and prediction vector index supplied from the motion information generation unit 109 to generate a code string (S106).
  • the addition unit 107 adds the prediction error signal supplied from the prediction error decoding unit 105 and the prediction signal supplied from the motion compensation unit 106 to generate a decoded image signal (S107).
  • the frame memory 110 stores the decoded image signal supplied from the adding unit 107 (S108).
  • the motion information memory 111 stores the motion vectors supplied from the motion vector detection unit 108 for one image in units of the minimum prediction block size (S109).
  • the difference vector calculation unit 120 includes a candidate block group supplied from the terminal 12, a motion vector and reference image index supplied from the terminal 13, a reference image supplied from the terminal 14, and an image signal supplied from the terminal 15. A prediction vector index is determined, and a difference vector and a rate distortion evaluation value are calculated (S110).
  • the combined motion information determination unit 121 determines the merge candidate number and calculates the rate distortion evaluation value from the candidate block group supplied from the terminal 12, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15 (S111).
  • the predictive coding mode determination unit 122 compares the rate distortion evaluation value supplied from the difference vector calculation unit 120 with the rate distortion evaluation value supplied from the combined motion information determination unit 121, and sets the merge flag to “0” if the former is smaller than the latter, and to “1” otherwise (S112).
  • the prediction vector candidate list generation unit 130 excludes candidate blocks that are out of the region, candidate blocks that are in the intra mode, and candidate blocks that have overlapping motion vectors from the candidate block group supplied from the terminal 12. A prediction vector candidate list is generated. If the prediction direction is bidirectional, a prediction vector candidate list is generated for the L0 direction and the L1 direction (S120).
  • the prediction vector determination unit 131 selects an optimal prediction vector for the motion vector supplied from the terminal 13 from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 130. If the prediction direction is bidirectional, an optimal prediction vector is selected for the L0 direction and the L1 direction (S121). The subtraction unit 132 subtracts the prediction vector supplied from the prediction vector determination unit 131 from the motion vector supplied from the terminal 13 to calculate a difference vector. If the prediction direction is bidirectional, a difference vector is calculated for the L0 direction and the L1 direction (S122).
  • the combined motion information candidate generation unit 140 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12 (S130).
  • the combined motion information selection unit 141 determines the optimal combined motion information for the motion vector and reference image index supplied from the terminal 13 and the prediction direction, from the combined motion information candidate list supplied from the combined motion information candidate generation unit 140 (S131).
  • the unidirectional combined motion information candidate list generation unit 150 generates a spatial combined motion information candidate list from the candidate blocks obtained by excluding candidate blocks outside the region and candidate blocks in the intra mode from the spatial candidate block group supplied from the terminal 12 (S140). The detailed operation of generating the spatial combined motion information candidate list will be described later.
  • the unidirectional combined motion information candidate list generation unit 150 generates a temporal combined motion information candidate list from the candidate blocks obtained by excluding candidate blocks outside the region and candidate blocks in the intra mode from the temporal candidate block group supplied from the terminal 12 (S141). The detailed operation of generating the temporal combined motion information candidate list will be described later.
  • the unidirectional combined motion information candidate list generation unit 150 combines the spatial combined motion information candidate list and the temporal combined motion information candidate list in the order of merge candidate numbers to generate a first combined motion information candidate list (S142).
  • when a plurality of combined motion information candidates in the first combined motion information candidate list supplied from the unidirectional combined motion information candidate list generation unit 150 have identical motion information, the first combined motion information candidate list reduction unit 151 deletes all but one of those candidates to generate the second combined motion information candidate list (S143).
  • the bidirectional combined motion information candidate list generating unit 152 generates a bidirectional combined motion information candidate list from the second combined motion information candidate list supplied from the first combined motion information candidate list reducing unit 151 (S144). The detailed operation of generating the bidirectional combined motion information candidate list will be described later.
  • the bidirectional combined motion information candidate list generation unit 152 combines the second combined motion information candidate list and the bidirectional combined motion information candidate list in the order of merge candidate numbers to generate a third combined motion information candidate list (S145).
  • when a plurality of combined motion information candidates in the third combined motion information candidate list supplied from the bidirectional combined motion information candidate list generation unit 152 have identical motion information, the second combined motion information candidate list reduction unit 153 deletes all but one of those candidates to generate the combined motion information candidate list (S146).
  • the spatial combined motion information candidate list includes motion information of four or fewer candidate blocks.
  • the following processing is repeated for block A, block B, block C, and block E, the four candidate blocks included in the spatial candidate block group (S150 to S153).
  • the validity of the candidate block is checked (S151).
  • a candidate block is valid when the candidate block is not out of the region and is not in intra mode. If the candidate block is valid (YES in S151), the motion information of the candidate block is added to the spatially combined motion information candidate list (S152). If the candidate block is not valid (NO in S151), step S152 is skipped.
  • as described above, the spatial combined motion information candidate list includes motion information of four or fewer candidate blocks; the number of candidates in the list varies depending on the validity of the candidate blocks, and is not limited to these.
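Steps S150 to S153 can be sketched as below; the dictionary representation of a candidate block and its validity flags is an illustrative assumption.

```python
# Sketch of steps S150-S153: the spatial candidate blocks A, B, C and E are
# scanned in order, and the motion information of each valid block (inside
# the region and not intra coded) is appended, so the resulting list holds
# at most four entries.

def build_spatial_candidate_list(candidate_blocks):
    spatial_list = []
    for block in candidate_blocks:            # S150-S153 loop
        is_valid = not block["out_of_region"] and not block["intra"]
        if is_valid:                          # S151 validity check
            spatial_list.append(block["motion_info"])  # S152
    return spatial_list

# Illustrative blocks A, B, C, E; B is intra coded, C lies outside the region.
blocks = [
    {"out_of_region": False, "intra": False, "motion_info": "A"},
    {"out_of_region": False, "intra": True,  "motion_info": "B"},  # skipped
    {"out_of_region": True,  "intra": False, "motion_info": "C"},  # skipped
    {"out_of_region": False, "intra": False, "motion_info": "E"},
]
spatial_list = build_spatial_candidate_list(blocks)
```

The list length therefore varies with the validity of the candidate blocks, as noted above.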
  • the temporal combined motion information candidate list includes motion information of one or fewer candidate blocks.
  • the following processing is repeated for the two candidate blocks included in the temporal candidate block group (S160 to S166).
  • the validity of the candidate block is checked (S161); a candidate block is valid when it is not out of the region and is not in intra mode. If the candidate block is valid (YES in S161), a temporal combined motion information candidate is generated, the temporal combined motion information candidate is added to the list (steps S162 to S165), and the process ends. If the candidate block is not valid (NO in S161), the next candidate block is inspected (S166).
  • the prediction direction of the temporally combined motion information candidate is determined (S162).
  • the prediction direction of the combined motion information candidate is bidirectional.
  • the reference images in the L0 direction and the L1 direction of the time combination motion information candidate are determined (S163).
  • here, the reference image in the L0 direction is the reference image closest in distance to the processing target image among the reference images in the L0 direction, and the reference image in the L1 direction is the reference image closest in distance to the processing target image among the reference images in the L1 direction.
  • although the reference image in the L0 direction is set to the reference image closest in distance to the processing target image among the reference images in the L0 direction, and the reference image in the L1 direction to the reference image closest in distance to the processing target image among the reference images in the L1 direction, it is only necessary to be able to determine the reference image in the L0 direction and the reference image in the L1 direction, and the present invention is not limited to this.
  • for example, the reference images in the L0 direction and the L1 direction may be encoded in the encoded stream, the reference image indexes in the L0 direction and the L1 direction may be set to 0, or the reference image most frequently used in each of the L0 direction and the L1 direction by the blocks adjacent to the processing target block may be used as the reference image in that direction.
  • the motion vector of the time combination motion information candidate is calculated (S164).
  • the bidirectional motion information of the temporal combined motion information candidate is calculated based on the reference image ColRefPic and the motion vector mvCol of a valid prediction direction in the motion information of the candidate block.
  • if the prediction direction of the candidate block is unidirectional (the L0 direction or the L1 direction), the reference image and motion vector of that prediction direction are selected.
  • if the prediction direction of the candidate block is bidirectional, the reference image and motion vector of either the L0 direction or the L1 direction are selected.
  • for example, the reference image and motion vector in the same time direction as ColPic may be selected, the selection may be based on whichever of the reference images in the L0 direction or the L1 direction of the candidate block is closer in distance to ColPic, or the selection may be based on whichever of the motion vectors in the L0 direction or the L1 direction of the candidate block crosses the processing target image.
  • the temporal combined motion information candidates are generated as described above, but it is only necessary to be able to determine bidirectional motion information using motion information of another encoded image, and the present invention is not limited to this.
  • for example, a motion vector scaled according to the distance between the reference image in each direction and the processing target image, as in direct motion compensation, may be used as the bidirectional motion vector. If the candidate block is invalid (NO in S163), the next candidate block is inspected (S165).
  • as described above, the temporal combined motion information candidate list includes motion information of one or fewer candidate blocks; the number of candidates in the list varies depending on the validity of the candidate blocks, and is not limited to this.
  • the prediction direction, reference image, and motion vector determination method are not limited to these.
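The distance-based scaling mentioned above (as in direct motion compensation) can be sketched as follows; the use of POC-style picture counts to measure temporal distance and the integer division are illustrative assumptions.

```python
# Sketch of temporal motion vector scaling: the colocated motion vector
# mvCol is scaled by the ratio of temporal distances, so that the scaled
# vector spans the distance between the processing target image and its
# reference image.

def scale_motion_vector(mv_col, poc_col, poc_col_ref, poc_cur, poc_ref):
    """Scale mv_col from distance (poc_col - poc_col_ref) to (poc_cur - poc_ref)."""
    td = poc_col - poc_col_ref   # distance covered by the colocated block's vector
    tb = poc_cur - poc_ref       # distance to cover for the current prediction
    if td == 0:
        return mv_col            # degenerate case: no scaling possible
    return (mv_col[0] * tb // td, mv_col[1] * tb // td)

# Illustrative case: mvCol = (8, -4); the colocated picture at count 8
# references count 0, while the current picture at count 4 references count 0.
mv_l0 = scale_motion_vector((8, -4), poc_col=8, poc_col_ref=0, poc_cur=4, poc_ref=0)
```

Here the current distance is half the colocated distance, so the vector is halved.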
  • the reference direction determination unit 160 determines the reference direction of the bidirectional combined motion information candidate from the second combined motion information candidate list (S170).
  • the reference direction for the bidirectional combined motion information candidate (BD0) with the reference direction L0 is the L0 direction
  • the reference direction for the bidirectional combined motion information candidate (BD1) with the reference direction L1 is the L1 direction.
  • the reference direction motion information determination unit 161 determines the reference direction motion vector and the reference image index of the bidirectional combined motion information candidate from the reference direction and the second combined motion information candidate list supplied from the reference direction determination unit 160 (S171). The detailed operation of the reference direction motion information determination unit 161 will be described later.
  • the backward direction motion information determination unit 162 determines the backward motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction, the reference direction motion vector and reference image index, and the second combined motion information candidate list supplied from the reference direction motion information determination unit 161 (S172). The detailed operation of the backward motion information determination unit 162 will be described later.
  • the bidirectional motion information determination unit 163 determines the prediction direction of the bidirectional combined motion information candidate from the reference direction, the reference direction motion vector and reference image index, and the backward motion vector and reference image index supplied from the backward direction motion information determination unit 162 (S173). The detailed operation of determining the prediction direction of the bidirectional combined motion information candidate will be described later.
  • the bidirectional motion information determination unit 163 checks the validity of the prediction direction of the bidirectional combined motion information candidate (S174). If the prediction direction of the bidirectional combined motion information candidate is valid (YES in S174), the bidirectional motion information determination unit 163 adds the bidirectional combined motion information candidate to the bidirectional combined motion information candidate list (S175). If the prediction direction of the bidirectional combined motion information candidate is invalid (NO in S174), step S175 is skipped.
  • the operation of the reference direction motion information determination unit 161 will be described using the flowchart of FIG. Assume that the LX direction (X is 0 or 1) is selected as the reference direction of the bidirectional combined motion information candidate. The validity of the reference direction LX is set to “0” (S190). The following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S191 to S194). The validity of the LX direction of each combined motion information candidate is checked (S192).
  • if the LX direction of the combined motion information candidate is valid (YES in S192), the validity of the reference direction LX is set to “1”, the motion vector and reference index in the reference direction are set to the motion vector and reference index of the LX direction of the combined motion information candidate, and the process ends (S193). If the LX direction of the combined motion information candidate is invalid (NO in S192), the next candidate is examined (S194).
  • although all NCands combined motion information candidates are inspected here, the present invention is not limited to this.
  • for example, by fixing the number of inspections to a predetermined number such as 2 or 3, the amount of processing can be reduced; furthermore, reducing the possibility of generating redundant bidirectional combined motion information candidates can also reduce the code amount of the merge index.
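Steps S190 to S194 can be sketched as below, including the optional cap on the number of inspections mentioned above; the per-direction dictionary layout of a candidate is an illustrative assumption.

```python
# Sketch of steps S190-S194: the second combined motion information candidate
# list is scanned in order, and the first candidate with valid motion
# information in the LX direction supplies the reference-direction motion
# vector and reference index.

def find_reference_motion(candidate_list, lx, max_checks=None):
    """Return (found, mv, ref_idx) for the first candidate valid in direction lx."""
    n = len(candidate_list)
    if max_checks is not None:           # optional cap, e.g. 2 or 3 inspections
        n = min(max_checks, n)
    for i in range(n):                   # S191-S194 loop
        info = candidate_list[i].get(lx)  # S192: validity check of the LX direction
        if info is not None:
            mv, ref_idx = info
            return True, mv, ref_idx      # S193: adopt and stop
    return False, None, None              # reference direction LX stays invalid

# Illustrative candidates; the first has no L0 motion information.
cands = [
    {"L1": ((2, 2), 0)},
    {"L0": ((5, -1), 1), "L1": ((0, 0), 0)},
]
found, mv, ref_idx = find_reference_motion(cands, "L0")
```

The same scan, run over the LY direction, yields the backward motion information of steps S200 onward.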
  • the reverse direction of the reference direction is set as the reverse direction of the bidirectional combined motion information candidate. It is assumed that the LY direction (Y is 0 or 1) is selected as the reverse direction. The validity of the reverse direction LY is set to “0” (S200). The following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S201 to S205).
  • although all NCands combined motion information candidates are inspected here, the present invention is not limited to this.
  • for example, by fixing the number of inspections to a predetermined number such as 2 or 3, the amount of processing can be reduced; furthermore, reducing the possibility of generating redundant bidirectional combined motion information candidates can also reduce the code amount of the merge index.
  • by setting the candidate at which the inspection starts to the combined motion information candidate next to the combined motion information candidate selected in the reference direction, the possibility that BD0 and BD1 become identical can be eliminated, and the processing of step S202 can be reduced.
  • if both the LX direction and the LY direction are valid, the prediction direction is bidirectional (BI). If only the LX direction is valid, the prediction direction is the unidirectional LX direction; if only the LY direction is valid, the prediction direction is the unidirectional LY direction. If both the LX direction and the LY direction are invalid, the prediction direction becomes invalid. That is, when both the LX direction and the LY direction are valid, a new bidirectional combined motion information candidate is generated by combining the motion information in the LX direction of one combined motion information candidate with the motion information in the LY direction of another combined motion information candidate.
  • if the prediction direction of the combined motion information candidate having the valid LX prediction is bi-prediction, the prediction direction of that combined motion information candidate is converted to uni-prediction; likewise, if the prediction direction of the combined motion information candidate having the valid LY prediction is bi-prediction, it is converted to uni-prediction.
  • FIGS. 31(a) to 31(c) show extended examples of determining the prediction direction of the bidirectional combined motion information candidate. For example, as shown in FIG. 31(a), the prediction direction may be invalidated if at least one of the LX direction and the LY direction is invalid; alternatively, as shown in FIGS. 31(b) and 31(c), the prediction direction may be forced to be bidirectional.
  • in general, the prediction efficiency of bidirectional prediction is higher than that of unidirectional prediction. Therefore, in FIG. 31(a), when the LX direction and the LY direction are not both valid, the prediction direction of the bidirectional combined motion information candidate is invalidated; by reducing the number of combined motion information candidates, the code amount of the merge index can be reduced.
  • the adaptive processing may be performed so as to invalidate the prediction direction of the bidirectional combined motion information candidate.
  • In this case, the motion vector in the invalid prediction direction is set to (0, 0) and the reference index to "0".
  • In this way, the bidirectional combined motion information candidate can be forced to be bidirectional using the reference image at the shortest distance for the prediction signal. This is because the reference index "0" generally indicates the reference image closest to the processing target image, and the prediction signal from the shortest distance is the most reliable.
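The prediction-direction rule described above can be sketched as follows. This is a minimal illustration, not part of the embodiment; the function names, string labels, and dictionary layout are assumptions made for the example.

```python
def combine_prediction_direction(lx_valid, ly_valid):
    """Determine the prediction direction of a bidirectional combined
    motion information candidate from the validity of its two halves."""
    if lx_valid and ly_valid:
        return "BI"        # both directions valid -> bidirectional
    if lx_valid:
        return "LX"        # only LX valid -> unidirectional LX
    if ly_valid:
        return "LY"        # only LY valid -> unidirectional LY
    return "INVALID"       # neither direction valid


def force_bidirectional(motion, lx_valid, ly_valid):
    """FIG. 31(b)/(c) variant: an invalid direction is filled with motion
    vector (0, 0) and reference index 0, forcing bidirectional prediction."""
    if not lx_valid:
        motion["LX"] = {"mv": (0, 0), "ref_idx": 0}
    if not ly_valid:
        motion["LY"] = {"mv": (0, 0), "ref_idx": 0}
    return "BI", motion
```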
  • FIG. 32 shows a moving picture decoding apparatus 200 according to the first embodiment.
  • the video decoding device 200 is a device that generates a playback image by decoding the code string encoded by the video encoding device 100.
  • the video decoding device 200 is realized by hardware such as an information processing device including a CPU (Central Processing Unit), a frame memory, and a hard disk.
  • the moving picture decoding apparatus 200 realizes functional components described below by operating the above components. Note that the position information and the prediction block size of the prediction block to be decoded are shared in the video decoding device 200 and are not shown. Further, the maximum number of combined motion information candidates included in the candidate number management table and the combined motion information candidate list is assumed to be shared in the moving image decoding apparatus 200 and is not illustrated.
  • the moving picture decoding apparatus 200 includes a code string analysis unit 201, a prediction error decoding unit 202, an addition unit 203, a motion information reproduction unit 204, a motion compensation unit 205, a frame memory 206, and a motion information memory 207.
  • The code string analysis unit 201 decodes the code string supplied from the terminal 30 into the prediction error encoded data, the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index according to the syntax. It then supplies the prediction error encoded data to the prediction error decoding unit 202, and supplies the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index to the motion information reproduction unit 204 as motion information.
  • the merge candidate number is obtained by conversion from the merge index.
  • The prediction error decoding unit 202 performs processing such as inverse quantization and inverse orthogonal transform on the prediction error encoded data supplied from the code string analysis unit 201 to generate a prediction error signal, and supplies the prediction error signal to the addition unit 203.
  • The addition unit 203 adds the prediction error signal supplied from the prediction error decoding unit 202 and the prediction signal supplied from the motion compensation unit 205 to generate a decoded image signal, and supplies the decoded image signal to the frame memory 206 and the terminal 31.
  • The motion information reproduction unit 204 reproduces motion information from the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the code string analysis unit 201 and the candidate block group supplied from the motion information memory 207, and supplies the motion information to the motion compensation unit 205. A detailed configuration of the motion information reproduction unit 204 will be described later.
  • the motion compensation unit 205 performs motion compensation on the reference image indicated by the reference image index in the frame memory 206 based on the motion information supplied from the motion information reproduction unit 204, and generates a prediction signal. If the prediction direction is bidirectional, an average of the prediction signals in the L0 direction and the L1 direction is generated as a prediction signal, and the prediction signal is supplied to the adding unit 203.
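The bidirectional case described above, in which the prediction signal is the average of the L0-direction and L1-direction prediction signals, can be sketched sample by sample as follows. The `+1 >> 1` rounding is a common convention and an assumption here, as are the names; the embodiment itself only states that an average is taken.

```python
def bi_predict(pred_l0, pred_l1):
    """Average the L0 and L1 prediction signals sample by sample to form
    the bidirectional prediction signal (rounded integer average)."""
    return [(a + b + 1) >> 1 for a, b in zip(pred_l0, pred_l1)]
```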
  • the frame memory 206 and the motion information memory 207 have the same functions as the frame memory 110 and the motion information memory 111 of the moving picture coding apparatus 100.
  • FIG. 33 shows the configuration of the motion information playback unit 204.
  • the motion information playback unit 204 includes an encoding mode determination unit 210, a motion vector playback unit 211, and a combined motion information playback unit 212.
  • the terminal 32 is connected to the code string analysis unit 201, the terminal 33 is connected to the motion information memory 207, and the terminal 34 is connected to the motion compensation unit 205.
  • If the merge flag supplied from the code string analysis unit 201 is “0”, the encoding mode determination unit 210 supplies the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the code string analysis unit 201 to the motion vector reproduction unit 211. If the merge flag is “1”, the merge candidate number supplied from the code string analysis unit 201 is supplied to the combined motion information reproduction unit 212.
  • The motion vector reproduction unit 211 reproduces motion information from the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the encoding mode determination unit 210 and the candidate block group supplied from the terminal 33, and supplies the motion information to the terminal 34. A detailed configuration of the motion vector reproduction unit 211 will be described later.
  • the combined motion information reproduction unit 212 reproduces the motion information from the merge candidate number supplied from the encoding mode determination unit 210 and the candidate block group supplied from the terminal 33 and supplies the motion information to the terminal 34. A detailed configuration of the combined motion information reproducing unit 212 will be described later.
  • the motion vector reproduction unit 211 includes a prediction vector candidate list generation unit 220, a prediction vector determination unit 221, and an addition unit 222.
  • the terminal 35 is connected to the encoding mode determination unit 210.
  • the prediction vector candidate list generation unit 220 has the same function as the prediction vector candidate list generation unit 130 of the video encoding device 100.
  • the prediction vector determination unit 221 determines a prediction vector from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 220 and the prediction vector index supplied from the terminal 35, and supplies the prediction vector to the addition unit 222.
  • the addition unit 222 adds the difference vector supplied from the terminal 35 and the prediction vector supplied from the prediction vector determination unit 221 to calculate a motion vector, and supplies the motion vector to the terminal 34.
  • the combined motion information reproduction unit 212 includes a combined motion information candidate generation unit 230 and a combined motion information selection unit 231.
  • the combined motion information candidate generation unit 230 has the same function as the combined motion information candidate generation unit 140 shown in FIG. Based on the combined motion information candidate list supplied from the combined motion information candidate generation unit 230 and the merge candidate number supplied from the terminal 35, the combined motion information selection unit 231 selects motion information from the combined motion information candidate list. The motion information is selected and supplied to the terminal 34.
  • The code string analysis unit 201 decodes the code string supplied from the terminal 30 into the prediction error encoded data, the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index according to the syntax (S210).
  • The motion information reproduction unit 204 reproduces motion information from the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the code string analysis unit 201 and the candidate block group supplied from the motion information memory 207 (S211).
  • the motion compensation unit 205 performs motion compensation on the reference image indicated by the reference image index in the frame memory 206 based on the motion information supplied from the motion information reproduction unit 204, and generates a prediction signal. If the prediction direction is bidirectional, an average of the prediction signals in the L0 direction and the L1 direction is generated as a prediction signal (S212).
  • the prediction error decoding unit 202 performs a process such as inverse quantization or inverse orthogonal transform on the prediction error encoded data supplied from the code string analysis unit 201 to generate a prediction error signal (S213).
  • the adding unit 203 adds the prediction error signal supplied from the prediction error decoding unit 202 and the prediction signal supplied from the motion compensation unit 205 to generate a decoded image signal (S214).
  • the frame memory 206 stores the decoded image signal supplied from the adding unit 203 (S215).
  • the motion information memory 207 stores the motion vector supplied from the motion information reproducing unit 204 for one image in the minimum predicted block size unit (S216).
  • The encoding mode determination unit 210 determines whether the merge flag supplied from the code string analysis unit 201 is “0” or “1” (S220). If the merge flag is “1” (1 in S220), the combined motion information reproduction unit 212 reproduces motion information from the merge candidate number supplied from the encoding mode determination unit 210 and the candidate block group supplied from the terminal 33 (S221).
  • If the merge flag is “0” (0 in S220), the motion vector reproduction unit 211 reproduces motion information from the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the encoding mode determination unit 210 and the candidate block group supplied from the terminal 33 (S222).
  • the prediction vector candidate list generation unit 220 generates a prediction vector candidate list by the same operation as the prediction vector candidate list generation unit 130 of the video encoding device 100 (S300).
  • the prediction vector determination unit 221 selects a prediction vector candidate indicated by the prediction vector index supplied from the terminal 35 from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 220, and determines a prediction vector. (S301).
  • the adder 222 adds the difference vector supplied from the terminal 35 and the prediction vector supplied from the prediction vector determination unit 221 to calculate a motion vector (S302).
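Steps S301 and S302 above can be sketched as follows; the function name and the tuple representation of vectors are assumptions made for the illustration.

```python
def reproduce_motion_vector(pred_vector_candidates, pred_vector_index, diff_vector):
    """S301: select the prediction vector indicated by the prediction
    vector index from the prediction vector candidate list.
    S302: add the difference vector to obtain the motion vector."""
    pvx, pvy = pred_vector_candidates[pred_vector_index]  # S301
    dvx, dvy = diff_vector
    return (pvx + dvx, pvy + dvy)                         # S302
```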
  • the combined motion information candidate generation unit 230 generates a combined motion information candidate list by the same operation as the combined motion information candidate generation unit 140 of the video encoding device 100 (S310).
  • The combined motion information selection unit 231 selects the combined motion information candidate indicated by the merge candidate number supplied from the terminal 35 from the combined motion information candidate list supplied from the combined motion information candidate generation unit 230, and determines the combined motion information (S311).
  • the first embodiment can be modified as follows.
  • FIG. 18 is given as an example of the candidate number management table.
  • The maximum number of combined motion information candidates may be 1 or more, and as long as a smaller merge candidate number is assigned to a combined motion information candidate with a higher selection rate, the table is not limited to that shown in FIG. 18.
  • the maximum number of combined motion information candidates included in the combined motion information candidate list is 7 (the maximum value of the merge index is 6), but it may be 2 or more.
  • If the selection rate of the bidirectional combined motion information candidates is higher than the selection rate of the combined motion information candidates of block C and block E, the table may be as shown in FIGS. 40(a) and 40(b).
  • each bidirectional combined motion information candidate (BD0 to BD3) will be described.
  • the bidirectional combined motion information candidate (BD0) and the bidirectional combined motion information candidate (BD1) are assumed to be the same as those in the first embodiment.
  • The bidirectional combined motion information candidates (BD2) and (BD3), like (BD0) and (BD1), combine the motion vector and reference index in the reference direction of one combined motion information candidate with the motion vector and reference index in the reverse direction of another combined motion information candidate.
  • They differ from the bidirectional combined motion information candidate (BD0) and the bidirectional combined motion information candidate (BD1) in the method of determining the motion vector and reference index in the base direction.
  • FIG. 42 is a flowchart for explaining the derivation of the bidirectional combined motion information candidate (BD2).
  • FIG. 42 is obtained by replacing step S193 in the flowchart of FIG. 28 with steps S195 to S197.
  • Steps S195 to S197 will be described. Whether the validity of LX is “1” is checked (S195). If the validity of LX is not “1” (NO in S195), the validity of LX is set to “1” (S196), and the next candidate is examined (S194). If the validity of LX is “1” (YES in S195), the motion vector and reference index in the base direction are set as the motion vector and reference index in the LX direction of the combined motion information candidate (S197), and the process ends.
  • FIG. 43 is a flowchart for explaining the derivation of the bidirectional combined motion information candidate (BD3).
  • FIG. 43 is obtained by replacing step S204 in the flowchart of FIG. 29 with steps S206 to S208.
  • step S206 to step S208 will be described.
  • Whether the validity of LY is “1” is checked (S206). If the validity of LY is not “1” (NO in S206), the validity of LY is set to “1” (S207), and the next candidate is examined (S205). If the validity of LY is “1” (YES in S206), the motion vector and reference index in the base direction are set as the motion vector and reference index in the LY direction of the combined motion information candidate (S208), and the process is terminated.
  • As described above, the bidirectional combined motion information candidate (BD2) is a bidirectional combined motion information candidate that combines the motion vector and reference index in the reference direction of the combined motion information candidate that is the second valid one in the reference direction with the motion vector and reference index in the reverse direction of the combined motion information candidate that is the first valid one in the reverse direction and is not the same candidate as that in the reference direction.
  • Similarly, the bidirectional combined motion information candidate (BD3) is a bidirectional combined motion information candidate that combines the motion vector and reference index in the reference direction of the combined motion information candidate that is the first valid one in the reference direction with the motion vector and reference index in the reverse direction of the combined motion information candidate that is the second valid one in the reverse direction and is not the same candidate as that in the reference direction.
  • the number of combinations of bidirectional combined motion information candidates can be increased, the selection rate of combined motion information candidates can be increased, and the encoding efficiency of motion information can be improved.
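The derivations of BD0 to BD3 described above can be sketched as follows: each candidate combines the motion information of the n-th valid candidate in the reference direction with that of the n-th valid candidate in the reverse direction that is not the same candidate. The dictionary representation and the helper names are assumptions made for the illustration.

```python
def nth_valid(candidates, direction, n, exclude=None):
    """Return the index of the n-th candidate (1-based) whose motion
    information in `direction` is valid, skipping the `exclude` index."""
    count = 0
    for i, cand in enumerate(candidates):
        if i == exclude or not cand.get(direction):
            continue
        count += 1
        if count == n:
            return i
    return None


def derive_bd(candidates, ref_dir, rev_dir, ref_nth, rev_nth):
    """Combine the reference-direction motion information of the ref_nth
    valid candidate with the reverse-direction motion information of the
    rev_nth valid candidate that is not the same candidate.
    BD0/BD1: ref_nth=1, rev_nth=1; BD2: ref_nth=2, rev_nth=1;
    BD3: ref_nth=1, rev_nth=2 (per the description above)."""
    ref_i = nth_valid(candidates, ref_dir, ref_nth)
    if ref_i is None:
        return None
    rev_i = nth_valid(candidates, rev_dir, rev_nth, exclude=ref_i)
    if rev_i is None:
        # no usable reverse-direction candidate: unidirectional result
        return {ref_dir: candidates[ref_i][ref_dir]}
    return {ref_dir: candidates[ref_i][ref_dir],
            rev_dir: candidates[rev_i][rev_dir]}
```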
  • FIG. 29 is given as an example of the operation of the backward direction motion information determination unit 162, but it is only necessary to generate bidirectional combined motion information candidates, and the present invention is not limited to this.
  • step S240 may be added as shown in FIG. 44 in order to increase the effectiveness of the bidirectional combined motion information candidate, that is, to prevent the second combined motion information candidate list reduction unit 153 from deleting it.
  • In step S240, it is checked that a combined motion information candidate having the same motion information as the bidirectional combined motion information candidate formed from the motion vector and reference index in the reference direction and the motion vector and reference index in the reverse direction of the combined motion information candidate being inspected is not already in the second combined motion information candidate list (S240).
  • If it is not in the second combined motion information candidate list, step S205 is performed; if it is, the next candidate is examined (S206). In this case, the second combined motion information candidate list reduction unit 153 in FIG. 16 and step S146 in FIG. 24 can be omitted.
  • In this way, the second combined motion information candidate list reduction unit 153 does not reduce the bidirectional combined motion information candidates, so the selection rate of the combined motion information candidates can be increased and the motion information encoding efficiency can be improved.
  • FIG. 29 is given as an example of the operation of the backward direction motion information determination unit 162, but step S250 may be added as shown in FIG.
  • If they are not the same (YES in S250), step S205 is performed; if they are the same (NO in S250), the next candidate is inspected (S206).
  • In this way, the bidirectional combined motion information candidate does not become the same as the combined motion information candidate selected in the reference direction, so the effectiveness of the bidirectional combined motion information candidates and the selection rate of the combined motion information candidates are increased, and the encoding efficiency of the motion information can be improved.
  • FIG. 16 is given as an example of the configuration of the combined motion information candidate generation unit 140.
  • As illustrated, the first combined motion information candidate list reduction unit 151 may be omitted, and the deletion may be performed only by the second combined motion information candidate list reduction unit 153.
  • In this case, however, redundant combined motion information candidates are supplied to the bidirectional combined motion information candidate list generation unit 152: if the first two unidirectional combined motion information candidates are the same, the bidirectional combined motion information candidate (BD0) whose reference direction is L0 and the bidirectional combined motion information candidate (BD1) whose reference direction is L1 have the same motion information. Therefore, as shown in FIG. 47(b), the probability of generating the same bidirectional combined motion information candidate can be reduced by changing the inspection order of FIG. 28 and FIG. 29 depending on whether the reference direction is L0 or L1.
  • In the first embodiment, bidirectional combined motion information candidates are generated by searching for combined motion information candidate blocks that are valid in the direction opposite to the reference direction and using their motion information in that direction. Searching in the direction opposite to the reference direction improves the effectiveness of the bidirectional combined motion information candidates, but increases the processing amount.
  • In contrast, the bidirectional combined motion information candidates may be defined as combinations of predetermined combined motion information candidate blocks with higher reliability, thereby omitting the search process; the coding efficiency can then be improved by improving the selection rate of the bidirectional combined motion information candidates.
  • For example, the bidirectional combined motion information candidate (BD0) whose reference direction is L0 combines the motion information in the L0 direction of the most reliable candidate block A with the motion information in the L1 direction of the second most reliable candidate block B, and the prediction direction is defined as bidirectional prediction.
  • Likewise, the bidirectional combined motion information candidate (BD1) whose reference direction is L1 combines the motion information in the L1 direction of the most reliable candidate block A with the motion information in the L0 direction of the second most reliable candidate block B, and the prediction direction is defined as bidirectional prediction.
  • Alternatively, the bidirectional combined motion information candidate (BD0) whose reference direction is L0 may consist of the motion information in the L0 direction of the most reliable candidate block A, with the prediction direction defined as unidirectional prediction. Note that other combinations are possible as long as the combinations of candidate blocks have higher reliability.
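The search-free combination described above, which always pairs fixed candidate blocks, can be sketched as follows; the dictionary representation and names are assumptions made for the illustration.

```python
def bd_fixed(candidate_a, candidate_b, ref_dir):
    """Search-free variant: combine the reference-direction motion
    information of the most reliable block A with the opposite-direction
    motion information of the second most reliable block B."""
    rev_dir = "L1" if ref_dir == "L0" else "L0"
    return {ref_dir: candidate_a[ref_dir], rev_dir: candidate_b[rev_dir]}
```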
  • a small merge candidate number is assigned to the bidirectional combined motion information candidate (BD0) whose reference direction is L0.
  • the present invention is not limited to this.
  • For example, by assigning a small merge candidate number to a bidirectional combined motion information candidate for bidirectional prediction, which has high prediction efficiency, the encoding efficiency can also be improved.
  • When BD0 and BD1 are both bidirectional prediction, a smaller merge candidate number can be preferentially assigned to the bidirectional combined motion information candidate whose motion information in the reference direction is unidirectional. This is because, although bidirectional prediction has higher prediction efficiency than unidirectional prediction, the reliability of motion information is generally high when unidirectional prediction is selected.
  • Embodiment 1: Example of the effect of bidirectional combined motion information in bidirectional prediction
  • the motion vector in the L0 direction of the block N is mvL0N
  • the motion vector in the L1 direction is mvL1N
  • the reference image index in the L0 direction is refIdxL0N
  • the reference image index in the L1 direction is refIdxL1N
  • the difference vector in the L0 direction is dmvL0N
  • the difference vector in the L1 direction is represented as dmvL1N
  • the difference of the reference image index in the L0 direction is represented as drefIdxL0N
  • the difference of the reference image index in the L1 direction is represented as drefIdxL1N.
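The notation above can be collected into a simple data structure; this is purely illustrative and not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class MotionInfoN:
    """Motion information of a block N, using the notation above."""
    mvL0: Tuple[int, int]   # motion vector in the L0 direction (mvL0N)
    mvL1: Tuple[int, int]   # motion vector in the L1 direction (mvL1N)
    refIdxL0: int           # reference image index in the L0 direction
    refIdxL1: int           # reference image index in the L1 direction


def dmv(mv, pmv):
    """Difference vector: motion vector minus prediction vector."""
    return (mv[0] - pmv[0], mv[1] - pmv[1])
```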
  • the unidirectional combined motion information candidates are A, B, COL, C, and E in FIG.
  • Among these unidirectional combined motion information candidates, there is no motion information identical to the motion information that minimizes the prediction error for the processing target block (Z). Therefore, the unidirectional combined motion information candidate having the minimum rate distortion evaluation value is selected from these unidirectional combined motion information candidates. Then, that candidate's rate distortion evaluation value is compared with the rate distortion evaluation value calculated by the difference vector calculation unit 120, and the merge mode is used as the encoding mode only when the former is smaller than the latter.
  • When the merge mode is selected as the encoding mode, it is because the balance between the encoding efficiency of the motion information and the prediction error is optimal, even though the prediction error itself is not optimal. On the other hand, when the non-merge mode is selected as the encoding mode, the encoding efficiency of the motion information is not optimal.
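The candidate selection and mode decision described above can be sketched as follows; the cost values and names are assumptions made for the illustration.

```python
def best_merge_candidate(costs):
    """Select the combined motion information candidate with the minimum
    rate distortion evaluation value; returns (index, cost)."""
    return min(enumerate(costs), key=lambda ic: ic[1])


def choose_encoding_mode(merge_cost, non_merge_cost):
    """Use the merge mode only when the best merge-candidate cost is
    smaller than the cost from the difference vector calculation unit."""
    return "merge" if merge_cost < non_merge_cost else "non-merge"
```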
  • the bidirectional combined motion information candidates generated by the first embodiment are BD0 and BD1 in FIG.
  • the bidirectional combined motion information candidate (BD0) whose reference direction is L0 is a bidirectional combined motion information candidate including the motion information of the block A in the L0 direction and the motion information of the block B in the L1 direction.
  • the bidirectional combined motion information candidate (BD1) having the reference direction L1 is a bidirectional combined motion information candidate including the motion information of the block A in the L1 direction and the motion information of the block B in the L0 direction.
  • the bidirectional combined motion information candidate (BD0) whose reference direction is L0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting the bidirectional combined motion information candidate (BD0) whose reference direction is L0, it is possible to minimize the prediction error and optimize the encoding efficiency of the motion information.
  • Next, suppose that some unidirectional combined motion information candidates are invalid (x) and that the valid unidirectional combined motion information candidates A and E have motion information as shown in FIG. In this case as well, no unidirectional combined motion information candidate has the motion information that minimizes the prediction error for the processing target block (Z).
  • the bidirectional combined motion information candidates generated by the first embodiment are BD0 and BD1 in FIG.
  • Here, the bidirectional combined motion information candidate (BD0) whose reference direction is L0 is a combined motion information candidate consisting of the motion information of the block A in the L0 direction, and its prediction direction is unidirectional.
  • the bidirectional combined motion information candidate (BD1) whose reference direction is L1 is a bidirectional combined motion information candidate including the motion information of the block E in the L0 direction and the motion information of the block A in the L1 direction. It can be seen that the bidirectional combined motion information candidate (BD0) whose reference direction is L0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting the bidirectional combined motion information candidate (BD0) whose reference direction is L0, it is possible to minimize the prediction error and optimize the encoding efficiency of the motion information.
  • unidirectional combined motion information candidates A, COL, and C are invalid (x), and valid unidirectional combined motion information candidates B and E have motion information as shown in FIG. Also in this case, there is no motion information in the unidirectional combined motion information candidate that minimizes the prediction error for the processing target block (Z).
  • the bidirectional combined motion information candidates generated by the first embodiment are BD0 and BD1 in FIG.
  • the bidirectional combined motion information candidate (BD0) whose reference direction is L0 is a bidirectional combined motion information candidate including the motion information of the block B in the L0 direction and the motion information of the block E in the L1 direction, and BD1 is not generated.
  • the bidirectional combined motion information candidate (BD0) having the reference direction L0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting the bidirectional combined motion information candidate (BD0) whose reference direction is L0, it is possible to minimize the prediction error and optimize the encoding efficiency of the motion information.
  • Bidirectional motion information candidate: As described above, by generating the bidirectional combined motion information candidates using the motion information in the L0 direction and the L1 direction of the unidirectional combined motion information candidates, even if the motion of the processing target block deviates from the motion of the block located at the same position in another encoded image or from the motion of the blocks adjacent to the processing target block, the motion information can be encoded using only an index, without encoding the motion information itself. Therefore, it is possible to realize a moving image encoding device and a moving image decoding device that can optimize encoding efficiency and prediction efficiency.
  • In addition, the bidirectional combined motion information candidate list generation unit 152 can avoid the situation in which the bidirectional combined motion information candidate (BD0) whose reference direction is L0 and the bidirectional combined motion information candidate (BD1) whose reference direction is L1 have the same motion information, and can thus increase the effectiveness of the bidirectional combined motion information candidates and improve the encoding efficiency.
  • Furthermore, by generating the bidirectional combined motion information candidates using the motion information in each direction of the unidirectional combined motion information candidates, the number of combined motion information candidates can be increased without increasing the number of unidirectional combined motion information candidates. Therefore, in a moving picture encoding device and a moving picture decoding device using a general LSI, in which increasing the number of unidirectional combined motion information candidates increases the memory read time, the increase in memory read time can be suppressed.
  • Adaptive switching: As described above, by assigning a small merge candidate number to a bidirectional combined motion information candidate whose prediction direction is bidirectional, the selection rate of bidirectional combined motion information candidates with high prediction efficiency can be increased; and by preferentially assigning a small merge candidate number to a bidirectional combined motion information candidate whose motion information in the reference direction is unidirectional, the selection rate of bidirectional combined motion information candidates using highly reliable motion information can be increased, improving the coding efficiency.
  • A higher-level function of the moving picture encoding apparatus is to change the candidate number management table for each encoded stream or for each slice, which is a part of the encoded stream.
  • the code string generation unit 104 encodes the candidate number management table into an encoded stream as shown in FIGS. 53 (a) and 53 (b) and transmits the encoded stream.
  • FIGS. 53(a) and 53(b) show examples of the syntax for encoding the candidate number management table in the SPS (Sequence Parameter Set), for control in units of encoded streams, and in the Slice_header, for control in units of slices.
  • “modified_merge_index_flag” specifies whether to change the standard relationship between the merge candidate number and the combined motion information candidate, “max_no_of_merge_index_minus1” specifies the number to be redefined, and “merge_mode[i]” specifies the order of the candidate blocks included in the combined motion information candidate list.
  • "bd_merge_base_direction", information designating the reference direction of the bidirectional combined motion information candidate, can also be set.
  • FIGS. 53(a) and 53(b) are merely examples of syntax; it is sufficient that the merge candidate numbers to be assigned to bidirectional combined motion information candidates and the reference direction of the bidirectional combined motion information candidates can be specified in the encoded stream, and the present invention is not limited to this.
  • the configuration of the moving picture decoding apparatus according to the second embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the code string analysis unit 201.
  • the code string analysis unit 201 decodes the candidate number management table according to the syntaxes of FIGS. 53 (a) and 53 (b).
  • Embodiment 3 (Replacement of combined motion information candidates)
  • the configuration of the moving picture coding apparatus according to the third embodiment is the same as that of the moving picture coding apparatus 100 according to the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • The candidate number management table in Embodiment 3 is shown in FIG. 54. It differs from that of Embodiment 1 in that the maximum number of combined motion information candidates included in the combined motion information candidate list is 5 and that no merge candidate number is assigned to the bidirectional combined motion information candidates.
  • the difference between the combined motion information candidate generation unit 140 in Embodiment 3 and Embodiment 1 will be described with reference to FIG.
  • The candidate number management table changing unit 154 calculates the effective number of bidirectional combined motion information candidates from the second combined motion information candidate list supplied from the first combined motion information candidate list reduction unit 151. If the effective number of bidirectional combined motion information candidates is 1 or more, the candidate number management table is changed and the second combined motion information candidate list is supplied to the bidirectional combined motion information candidate list generation unit 152; if the effective number of bidirectional combined motion information candidates is 0, the second combined motion information candidate list is supplied to the terminal 18 as the combined motion information candidate list.
  • the flowchart of FIG. 56 has the following two steps added to the flowchart of FIG.
  • the candidate number management table changing unit 154 changes the candidate number management table (S260). It is checked whether the candidate number management table has been changed (S261). If the candidate number management table is changed (YES in S261), step S144 is performed. If the candidate number management table has not been changed (NO in S261), step S144 is skipped.
  • the candidate number management table changing unit 154 counts the number of invalid combined motion information candidates not included in the second combined motion information candidate list, and calculates the invalid number of combined motion information candidates (S270).
  • That is, the invalid number of combined motion information candidates is calculated as the number of combined motion information candidates not included in the second combined motion information candidate list. Any method capable of calculating the number of invalid combined motion information candidates may be used, and the calculation is not limited to this. For example, the number of invalid combined motion information candidates may be obtained by subtracting the number of valid combined motion information candidates included in the second combined motion information candidate list from 5, which is the sum of 4, the maximum number of spatially combined motion information candidates, and 1, the maximum number of temporally combined motion information candidates.
  • Alternatively, since the selection rate of the bidirectional combined motion information candidates is also considered to decrease when combined motion information with a high selection rate is invalid, only the invalid combined motion information candidates whose merge candidate number is 2 or more may be counted.
  • the candidate number management table changing unit 154 checks whether the invalid number of combined motion information candidates is 1 or more (S271). If the invalid number of combined motion information candidates is 1 or more (YES in S271), the subsequent processing is performed to change the candidate number management table. If the invalid number of combined motion information candidates is 0 (NO in S271), the process ends.
  • the candidate number management table changing unit 154 counts the number of effective bidirectional combined motion information candidates and calculates the effective number of bidirectional combined motion information candidates (S272). That is, if both BD0 and BD1 are valid, the effective number of bidirectional combined motion information candidates is 2, and if either BD0 or BD1 is valid, the effective number of bidirectional combined motion information candidates is 1, and BD0 And BD1 are both invalid, the effective number of bidirectional combined motion information candidates is zero.
  • the candidate number management table changing unit 154 sets the smaller one of the invalid number of combined motion information candidates and the effective number of bidirectional combined motion information candidates as the additional number of bidirectional combined motion information candidates (S273).
  • the candidate number management table changing unit 154 assigns invalid merge candidate numbers to the bidirectional combined motion information candidates for the added number of bidirectional combined motion information candidates (S274).
  • FIG. 58A shows an example in which the invalid number of combined motion information candidates is 1 and the effective number of bidirectional combined motion information candidates is 1 or more.
  • BD0 is assigned to the first invalid merge candidate number, 1. If BD1 is valid, BD1 may be assigned instead.
  • FIG. 58B shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 2.
  • BD0 is assigned to the first invalid merge candidate number, 2, and BD1 is assigned to the second invalid merge candidate number, 4.
  • FIG. 58C shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 1 (BD1 is valid).
  • BD1 is assigned to invalid merge candidate number 2.
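Steps S270 to S274, together with the FIG. 58 examples, can be sketched roughly as follows. This is a hypothetical illustration only; representing the candidate number management table as a list of labels indexed by merge candidate number is an assumption for the sketch, not the specification's data structure:

```python
# Hypothetical sketch of S270-S274: merge candidate numbers of invalid
# candidates are reassigned to valid bidirectional combined candidates.
def change_candidate_number_table(table, valid):
    """table: candidate labels indexed by merge candidate number.
    valid: labels present in the second combined motion information list."""
    invalid_slots = [i for i, c in enumerate(table) if c not in valid]  # S270
    valid_bd = [bd for bd in ("BD0", "BD1") if bd in valid]             # S272
    add = min(len(invalid_slots), len(valid_bd))                        # S273
    for slot, bd in zip(invalid_slots[:add], valid_bd):                 # S274
        table[slot] = bd
    return table

# FIG. 58(a)-style case: one invalid candidate (B), BD0 and BD1 valid.
table = change_candidate_number_table(
    ["A", "B", "COL", "C", "D"], {"A", "COL", "C", "D", "BD0", "BD1"})
print(table)  # ['A', 'BD0', 'COL', 'C', 'D']
```

When there are no invalid slots (S271: NO), the table is returned unchanged, matching the early exit in the flowchart.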
  • the configuration of the moving picture decoding apparatus according to the third embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • the combined motion information candidate generation unit 140 of the video decoding device of the third embodiment is the same as the combined motion information candidate generation unit 140 of the video encoding device of the third embodiment.
  • the third embodiment can be modified as follows.
  • FIG. 57 is given as an example of the operation of the candidate number management table changing unit 154.
  • In the changed candidate number management table, a smaller merge candidate number is assigned to a combined motion information candidate having a higher selection rate.
  • the present invention is not limited to this.
  • For example, step S275 may be added to the operation of the candidate number management table changing unit 154 as shown in FIG. 59. The flowchart of FIG. 59 is obtained by adding step S275 to the flowchart of FIG. 57.
  • The candidate number management table changing unit 154 packs the merge candidate numbers of the invalid combined motion information candidates (S275).
  • FIG. 60A shows an example in which the invalid number of combined motion information candidates is 1 and the effective number of bidirectional combined motion information candidates is 1 or more.
  • First, BD0 is assigned to the invalid merge candidate number 4. If BD1 is valid, BD1 may be assigned instead.
  • FIG. 60B shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 2.
  • BD0 is assigned to the first invalid merge candidate number, 3, and BD1 is assigned to the second invalid merge candidate number, 4.
  • In this way, merge candidate numbers larger than those of the unidirectional combined motion information candidates are assigned to the bidirectional combined motion information candidates.
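The packing variant of Modification 1 (FIG. 59, with the FIG. 60 examples) can be sketched as follows; as before, the list-of-labels table representation and the function name are assumptions for illustration:

```python
# Hypothetical sketch of Modification 1: valid candidates are packed to the
# smaller merge candidate numbers, and the bidirectional candidates take the
# larger, freed numbers at the tail of the table.
def pack_and_append_bd(table, valid):
    kept = [c for c in table if c in valid]                  # pack (S275)
    freed = len(table) - len(kept)
    bds = [bd for bd in ("BD0", "BD1") if bd in valid][:freed]
    return kept + bds + [None] * (freed - len(bds))          # None = unused number

# FIG. 60(b)-style case: two invalid candidates, BD0 and BD1 both valid.
print(pack_and_append_bd(["A", "B", "COL", "C", "D"],
                         {"A", "COL", "C", "BD0", "BD1"}))
# ['A', 'COL', 'C', 'BD0', 'BD1']
```

This realizes the property stated above: the bidirectional candidates always receive merge candidate numbers larger than those of the remaining unidirectional candidates.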
  • Modification 2 (Dependence on predetermined blocks)
  • The operation of the candidate number management table changing unit 154 can be further modified. In this modification, it is assumed that each predetermined bidirectional combined motion information candidate is associated with a predetermined block: BD0 is associated with block C, and BD1 is associated with block D.
  • Another modification of the operation of the candidate number management table changing unit 154 (S280 to S284) will be described with reference to FIG.
  • the following process is repeated for the number of associated blocks (S280 to S284). Whether the i-th predetermined block is invalid is checked (S281). If the i-th predetermined block is invalid (YES in S281), the subsequent processing is performed to change the candidate number management table. If the i-th predetermined block is not invalid (NO in S281), the next predetermined block is inspected.
  • The candidate number management table changing unit 154 assigns the bidirectional combined motion information candidate (BD0) to the first predetermined invalid merge candidate number, and assigns the bidirectional combined motion information candidate (BD1) to the second predetermined invalid merge candidate number (S282).
  • As described above, the bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate when a predetermined combined motion information candidate is invalid. Here, the predetermined combined motion information candidates are block C and block E; it is sufficient that a bidirectional combined motion information candidate is generated when a combined motion information candidate having a larger merge candidate number and a low selection rate is invalid.
  • the present invention is not limited to this.
  • the operation of the candidate number management table changing unit 154 can be further modified.
  • A modified example of the operation of the candidate number management table changing unit 154 will be described with reference to FIG. If the invalid number of combined motion information candidates is 0 (NO in S271), the candidate number management table changing unit 154 counts the number of combined motion information candidates included in the second combined motion information candidate list whose prediction direction is unidirectional (the L0 direction or the L1 direction), and calculates the number of unidirectional predictions (S290). Whether the number of unidirectional predictions is 1 or more is checked (S291). If the number of unidirectional predictions is 1 or more (YES in S291), the subsequent processing is performed to change the candidate number management table; if the number of unidirectional predictions is 0 (NO in S291), the process is terminated.
  • the candidate number management table changing unit 154 counts the number of bidirectional combined motion information candidates whose prediction direction is bidirectional, and calculates the effective number of bidirectional combined motion information candidates (S292).
  • The candidate number management table changing unit 154 reassigns the merge candidate numbers of combined motion information candidates whose prediction direction is unidirectional to bidirectional combined motion information candidates, for the additional number of bidirectional combined motion information candidates (S294). That is, the candidate number management table changing unit 154 assigns the last merge candidate number whose prediction direction is unidirectional to the bidirectional combined motion information candidate (BD0), and assigns the second merge candidate number from the last whose prediction direction is unidirectional to the bidirectional combined motion information candidate (BD1).
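The replacement in steps S290 to S294 can be sketched as follows. The names are hypothetical, and the `direction` map stands in for the prediction-direction information of each candidate:

```python
# Hypothetical sketch of S290-S294: when no candidate is invalid, the last
# merge candidate numbers whose prediction direction is unidirectional are
# reassigned to bidirectional combined candidates (BD0 first, from the end).
def replace_unidirectional(table, direction, bd_valid):
    """table: candidate labels by merge candidate number;
    direction: label -> 'uni' or 'bi'; bd_valid: valid BD candidates (0-2)."""
    uni_slots = [i for i, c in enumerate(table) if direction[c] == "uni"]  # S290
    add = min(len(uni_slots), bd_valid)                                    # S293
    for k in range(add):                                                   # S294
        table[uni_slots[-(k + 1)]] = f"BD{k}"  # last uni -> BD0, next -> BD1
    return table

table = ["A", "B", "COL", "C", "D"]
direction = {"A": "bi", "B": "uni", "COL": "bi", "C": "uni", "D": "uni"}
print(replace_unidirectional(table, direction, 2))
# ['A', 'B', 'COL', 'BD1', 'BD0']
```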
  • Here, the number of unidirectional predictions is calculated as the number of combined motion information candidates included in the second combined motion information candidate list whose prediction direction is unidirectional. However, any method may be used as long as a merge candidate number can be assigned to a bidirectional combined motion information candidate, with the sum of the invalid number of combined motion information candidates and the number of unidirectional predictions as the upper limit, and the present invention is not limited to this. For example, only the combined motion information candidates whose merge candidate number is 3 or more and whose prediction direction is unidirectional may be counted. Also, although the number of combined motion information candidates whose prediction direction is unidirectional is counted when the invalid number of combined motion information candidates is 0, this condition is not restrictive.
  • As described above, the bidirectional combined motion information candidate list generation unit 152 replaces a combined motion information candidate whose prediction direction is unidirectional with a bidirectional combined motion information candidate whose prediction direction is bidirectional. By using merge candidate numbers such that the merge candidate numbers of the bidirectional combined motion information candidates are larger than those of the unidirectional combined motion information candidates, the encoding efficiency of the merge index can be improved.
  • In this way, the combined motion information candidates of blocks with high reliability and high selection rates remain, and it is possible to adaptively switch between the combined motion information candidates of blocks with low selection rates and the bidirectional combined motion information candidates. Therefore, an increase in the code amount of the merge index due to an increase in merge candidate numbers can be suppressed, and the selection rate of the combined motion information candidates can be increased to improve the encoding efficiency. Furthermore, by replacing a combined motion information candidate whose prediction direction is unidirectional with a bidirectional combined motion information candidate whose prediction direction is bidirectional, the selection rate of the bidirectional combined motion information candidates, which have high prediction efficiency, can be increased.
  • Embodiment 4 (Priority is given to motion information for unidirectional prediction)
  • the configuration of the moving picture encoding apparatus according to the fourth embodiment is the same as that of the moving picture encoding apparatus 100 according to the first embodiment except for the function of the reference direction motion information determination unit 161.
  • the difference between the reference direction motion information determination unit 161 in the fourth embodiment and the first embodiment will be described.
  • the operation of the reference direction motion information determination unit 161 according to Embodiment 4 will be described with reference to FIG.
  • The flowchart of this operation is obtained by adding steps S320 to S323 to the flowchart of FIG. 28, and is characterized by step S321.
  • the validity of the reference direction LX is set to “0” (S190).
  • The following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S320 to S323).
  • Whether the combined motion information candidate is valid in the LX direction and is a unidirectional prediction is checked (S321). If the LX direction of the combined motion information candidate is valid and the candidate is a unidirectional prediction (YES in S321), the validity of the reference direction LX is set to "1", the motion vector and reference index in the reference direction are set to the motion vector and reference index in the LX direction of the combined motion information candidate, and the processing ends (S322). Otherwise (NO in S321), the next candidate is examined (S323).
  • Next, the following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S191 to S194).
  • the validity of the combined motion information candidate in the LX direction is checked (S192). If the LX direction of the combined motion information candidate is valid (YES in S192), the validity of the reference direction LX is set to “1”, and the motion vector and reference index in the reference direction are set to the LX direction of the combined motion information candidate. The process ends with the motion vector and the reference index (S193). If the LX direction of the combined motion information candidate is invalid (NO in S192), the next candidate is examined (S194).
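The two-pass search above (S190, S320 to S323, then S191 to S194) can be sketched as follows, with hypothetical names; candidates are represented as simple dictionaries with `None` marking an invalid direction:

```python
# Hypothetical sketch: candidates valid in direction LX *and* unidirectional
# are tried first (S320-S323); only if none exists does any candidate valid
# in LX qualify (S191-S194).
def ref_direction_motion_info(cands, lx):
    """cands: list of dicts with keys 'mv_l0', 'mv_l1' (None if invalid).
    lx: 'l0' or 'l1'. Returns (validity, motion vector) for direction LX."""
    other = "l1" if lx == "l0" else "l0"
    # First pass: valid in LX and unidirectional prediction.
    for c in cands:
        if c[f"mv_{lx}"] is not None and c[f"mv_{other}"] is None:
            return True, c[f"mv_{lx}"]
    # Second pass: any candidate valid in LX.
    for c in cands:
        if c[f"mv_{lx}"] is not None:
            return True, c[f"mv_{lx}"]
    return False, None  # validity stays "0" (S190)

cands = [{"mv_l0": (1, 1), "mv_l1": (2, 2)},   # bidirectional
         {"mv_l0": (5, 0), "mv_l1": None}]     # unidirectional, L0 only
print(ref_direction_motion_info(cands, "l0"))  # (True, (5, 0))
```

Note that the unidirectional candidate wins even though the bidirectional one comes first in the list, which is exactly the priority described above.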
  • the reference direction motion information determination unit 161 according to the fourth embodiment differs from the first embodiment in that priority is given to motion information that is unidirectional in determining the reference direction motion information.
  • the configuration of the moving picture decoding apparatus according to the fourth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the reference direction motion information determination unit 161.
  • the combined motion information candidate generation unit 140 of the video decoding device in the fourth embodiment is the same as the combined motion information candidate generation unit 140 of the video encoding device in the fourth embodiment.
  • the fourth embodiment can be modified as follows.
  • FIG. 63 is given as an example of the operation of the reference direction motion information determination unit 161.
  • Any operation that gives priority to unidirectional motion information in determining the reference direction motion information may be used, and the operation is not limited thereto.
  • step S191 to step S194 in FIG. 63 may be deleted, and the motion information in the reference direction may be limited to motion information in a single direction.
  • FIG. 63 is given as an example of the operation of the reference direction motion information determination unit 161.
  • It is sufficient that unidirectional motion information is given priority in determining the motion information, and the operation is not limited thereto. For example, priority may be given to unidirectional motion information, similarly to the reference direction motion information determination unit 161 of the fourth embodiment, or the selection may be limited to unidirectional motion information, similarly to the reference direction motion information determination unit 161 of the first modification of the fourth embodiment.
  • Embodiment 5 (Each direction deletion process)
  • the configuration of the moving picture encoding apparatus of the fifth embodiment is the same as that of the moving picture encoding apparatus 100 of the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • the difference between the combined motion information candidate generation unit 140 in Embodiment 5 and Embodiment 1 will be described.
  • Differences from the first embodiment in the configuration of the combined motion information candidate generation unit 140 according to the fifth embodiment will be described with reference to FIG. 64.
  • an L0 direction motion information candidate list generation unit 155 and an L1 direction motion information candidate list generation unit 156 are installed instead of the first combined motion information candidate list reduction unit 151 of FIG.
  • For the motion information candidates included in the first combined motion information candidate list, when there are a plurality of combined motion information candidates whose motion information in the L0 direction overlaps, the L0 direction motion information candidate list generation unit 155 leaves one such combined motion information candidate and deletes the others, generates an L0 direction motion information candidate list, and supplies the L0 direction motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
  • Similarly, for the motion information candidates included in the first combined motion information candidate list, when there are a plurality of combined motion information candidates whose motion information in the L1 direction overlaps, the L1 direction motion information candidate list generation unit 156 leaves one such combined motion information candidate and deletes the others, generates an L1 direction motion information candidate list, and supplies the L1 direction motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
  • The bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate list from the L0 direction motion information candidate list supplied from the L0 direction motion information candidate list generation unit 155 and the L1 direction motion information candidate list supplied from the L1 direction motion information candidate list generation unit 156.
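The per-direction duplicate removal of Embodiment 5 can be sketched as follows (hypothetical names; a candidate is a dictionary, with a direction's motion information given by its motion vector and reference index):

```python
# Hypothetical sketch of Embodiment 5: duplicates are removed separately per
# direction, producing distinct L0 and L1 candidate lists for the
# bidirectional combination step.
def direction_list(cands, lx):
    """Keep one candidate per distinct (mv, ref) pair in direction lx."""
    seen, out = set(), []
    for c in cands:
        key = (c.get(f"mv_{lx}"), c.get(f"ref_{lx}"))
        if key[0] is not None and key not in seen:
            seen.add(key)
            out.append(c)
    return out

cands = [
    {"mv_l0": (1, 0), "ref_l0": 0, "mv_l1": (2, 2), "ref_l1": 0},
    {"mv_l0": (1, 0), "ref_l0": 0, "mv_l1": (3, 3), "ref_l1": 1},  # dup in L0 only
    {"mv_l0": None, "ref_l0": None, "mv_l1": (2, 2), "ref_l1": 0}, # dup in L1 only
]
l0_list = direction_list(cands, "l0")
l1_list = direction_list(cands, "l1")
print(len(l0_list), len(l1_list))  # 1 2
```

A candidate that duplicates another in one direction may still survive in the other direction's list, which is the point of performing the deletion per direction.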
  • the configuration of the moving picture decoding apparatus according to the fifth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • the combined motion information candidate generation unit 140 of the video decoding device in the fifth embodiment is the same as the combined motion information candidate generation unit 140 of the video encoding device in the fifth embodiment.
  • Embodiment 6 (Selective use of bidirectional combined motion information candidates)
  • the configuration of the moving picture coding apparatus according to the sixth embodiment is the same as that of the moving picture coding apparatus 100 according to the first embodiment except for the function of the reference direction determination unit 160.
  • The candidate number management table in the sixth embodiment is shown in FIG. 65. It differs from that of Embodiment 1 in that the maximum number of combined motion information candidates included in the combined motion information candidate list is 6 and that only one merge candidate number is assigned to the bidirectional combined motion information candidate.
  • the difference between the reference direction determination unit 160 in the sixth embodiment and the first embodiment will be described.
  • the operation of the reference direction determination unit 160 according to the sixth embodiment will be described with reference to FIG.
  • the reference direction determination unit 160 repeats the following processing (S300 to S305) for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list.
  • the validity of the combined motion information candidate in the L0 direction is checked (S301). If the L0 direction of the combined motion information candidate is valid (YES in S301), the reference direction is set to L0 and the process ends (S302). If the L0 direction of the combined motion information candidate is invalid (NO in S301), the validity of the combined motion information candidate in the L1 direction is checked (S303). If the L1 direction of the combined motion information candidate is valid (YES in S303), the reference direction is set to L1 and the process ends (S304). If the L1 direction of the combined motion information candidate is invalid (NO in S303), the next candidate is examined (S305). If the reference direction cannot be set, no bidirectional combined motion information candidate is generated (S306).
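The scan in FIG. 66 (S300 to S306) can be sketched as follows, with hypothetical names:

```python
# Hypothetical sketch of S300-S306: the reference direction is the first
# valid direction found, scanning candidates in order and checking the L0
# direction before the L1 direction; if none is found, no bidirectional
# combined motion information candidate is generated.
def determine_reference_direction(cands):
    for c in cands:                       # S300-S305
        if c.get("mv_l0") is not None:    # S301
            return "L0"                   # S302
        if c.get("mv_l1") is not None:    # S303
            return "L1"                   # S304
    return None                           # S306: no candidate generated

print(determine_reference_direction([{"mv_l0": None, "mv_l1": (7, -4)}]))  # L1
print(determine_reference_direction([]))  # None
```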
  • the configuration of the moving picture decoding apparatus according to the sixth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the reference direction determination unit 160.
  • the reference direction determination unit 160 of the moving picture decoding apparatus according to the sixth embodiment is the same as the reference direction determination unit 160 of the moving picture encoding apparatus according to the sixth embodiment.
  • Data format: The moving image encoded stream output from the moving image encoding apparatus according to the first to sixth embodiments described above has a specific data format so that it can be decoded according to the encoding method used in the first to sixth embodiments, and a moving picture decoding apparatus corresponding to the moving picture encoding apparatus can decode the encoded stream of this specific data format.
  • A merge index indicating a bidirectional combined motion information candidate and the candidate number management table are encoded in the encoded stream. Alternatively, only the merge index indicating the bidirectional combined motion information candidate may be encoded in the encoded stream, with the candidate number management table shared in advance by the video encoding device and the video decoding device, so that the candidate number management table does not have to be encoded in the encoded stream.
  • When a wired or wireless network is used to exchange an encoded stream between the moving image encoding device and the moving image decoding device, the encoded stream may be converted into a data format suitable for the transmission form of the communication path and transmitted.
  • In that case, a moving image transmitting apparatus that converts the encoded stream output from the moving image encoding apparatus into encoded data in a data format suitable for the transmission form of the communication channel and transmits the encoded data to the network, and a moving image receiving apparatus that receives the encoded data from the network, restores the encoded stream, and supplies the encoded stream to the moving image decoding apparatus, are provided.
  • The moving image transmitting apparatus includes a memory that buffers the encoded stream output from the moving image encoding apparatus, a packet processing unit that packetizes the encoded stream, and a transmission unit that transmits the packetized encoded data via the network.
  • The moving image receiving apparatus includes a receiving unit that receives the packetized encoded data via the network, a memory that buffers the received encoded data, and a packet processing unit that performs packet processing on the received encoded data to generate an encoded stream and supplies the encoded stream to the moving image decoding apparatus.
  • The above encoding and decoding processes can of course be realized as transmission, storage, and reception devices using hardware, and can also be realized by firmware stored in a ROM (Read Only Memory), flash memory, or the like, or by software running on a computer or the like.
  • The firmware program and software program can be recorded on a computer-readable recording medium and provided, provided from a server through a wired or wireless network, or provided as a data broadcast of terrestrial or satellite digital broadcasting.
  • The temporally combined motion information candidate calculates bidirectional motion information based on the reference image ColRefPic and the motion vector ColMv of a valid prediction direction in the motion information of the candidate block.
  • If the prediction direction of the candidate block is unidirectional (the L0 direction or the L1 direction), the reference image and motion vector of that prediction direction are selected as the reference.
  • If the prediction direction of the candidate block is bidirectional, the reference image and motion vector of either the L0 direction or the L1 direction are selected as the reference.
  • After the reference image and motion vector serving as the reference for generating the bidirectional motion information are selected, the motion vectors of the temporally combined motion information candidate are calculated.
  • The inter-image distance between ColPic and ColRefPic is ColDist, the inter-image distance between the reference image ColL0Pic in the L0 direction of the temporally combined motion information candidate and the processing target image CurPic is CurL0Dist, and the inter-image distance between the reference image ColL1Pic in the L1 direction of the temporally combined motion information candidate and the processing target image CurPic is CurL1Dist. The motion vectors of Equation 1 below, obtained by scaling ColMv by the distance ratios of ColDist to CurL0Dist and CurL1Dist, are set as the motion vectors of the temporally combined motion information candidate.
  • The inter-image distance is calculated using the POC (Picture Order Count) and has a positive or negative sign.
  • The relationships among ColPic, ColRefPic, ColL0Pic, and ColL1Pic in FIG. 67 are examples, and other relationships may be used.
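Equation 1 itself is not reproduced in this text. Assuming it is the distance-ratio scaling the description implies (mvL0 = ColMv × CurL0Dist / ColDist, mvL1 = ColMv × CurL1Dist / ColDist), the calculation can be sketched as follows; the function name, rounding, and POC values are illustrative assumptions:

```python
# Hypothetical sketch of the distance-ratio scaling described for Equation 1:
# ColMv is scaled by CurL0Dist/ColDist and CurL1Dist/ColDist, with all
# inter-image distances computed as signed POC differences.
def scale_temporal_mv(col_mv, poc_col, poc_colref, poc_cur, poc_l0, poc_l1):
    col_dist = poc_colref - poc_col      # signed distance ColPic -> ColRefPic
    cur_l0_dist = poc_l0 - poc_cur       # signed distance CurPic -> ColL0Pic
    cur_l1_dist = poc_l1 - poc_cur       # signed distance CurPic -> ColL1Pic
    mv_l0 = tuple(round(v * cur_l0_dist / col_dist) for v in col_mv)
    mv_l1 = tuple(round(v * cur_l1_dist / col_dist) for v in col_mv)
    return mv_l0, mv_l1

# ColPic at POC 8 references ColRefPic at POC 4; CurPic at POC 6 has its
# L0 reference at POC 4 and its L1 reference at POC 8.
print(scale_temporal_mv((8, -4), 8, 4, 6, 4, 8))  # ((4, -2), (-4, 2))
```

Because the distances are signed, a backward reference automatically flips the sign of the scaled vector, as in the L1 result above.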
  • the present invention can be used for a moving picture encoding and decoding technique using motion compensated prediction.


Abstract

A candidate list generator selects, from a plurality of encoded blocks adjacent to a block to be encoded, a plurality of blocks individually having one or two items of motion information that include at least motion vector information and reference image information, and generates, from the motion information in the selected blocks, a candidate list including candidates of motion information used for motion-compensated prediction. A first motion information acquisition unit acquires motion information of a first prediction list from a first candidate included in the candidates. A second motion information acquisition unit acquires motion information of a second prediction list from a second candidate included in the candidates. A selected candidate generator combines the motion information of the first prediction list acquired from the first motion information acquisition unit and the motion information of the second prediction list acquired from the second motion information acquisition unit, and generates new candidates of motion information.

Description

Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
 The present invention relates to a moving image encoding technique using motion compensated prediction, and more particularly to an image encoding device, an image encoding method, an image encoding program, an image decoding device, an image decoding method, and an image decoding program that encode or decode motion information used in motion compensated prediction.
Motion-compensated prediction is generally used in video compression coding. Motion-compensated prediction divides the target picture into small blocks and, using an already-decoded picture as a reference picture, generates as the prediction signal the signal at the position in the reference picture reached by moving from the target block of the target picture by the amount of motion indicated by a motion vector. Motion-compensated prediction is performed either unidirectionally, using one motion vector, or bidirectionally, using two motion vectors.
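For illustration, the block-based motion compensation described above can be sketched as follows. This is a minimal integer-pel sketch with hypothetical names (`motion_compensate`, `bi_predict`) and pictures represented as 2-D lists of sample values; it is not the codec's actual interpolation process:

```python
def motion_compensate(reference, top, left, size, mv):
    """Fetch the prediction block for the target block at (top, left): the
    same-size block in the decoded reference picture, displaced by the
    motion vector mv = (dy, dx). Integer-pel positions only, for brevity."""
    dy, dx = mv
    return [row[left + dx:left + dx + size]
            for row in reference[top + dy:top + dy + size]]

def bi_predict(ref0, ref1, top, left, size, mv0, mv1):
    """Bidirectional prediction: rounded average of two motion-compensated
    blocks, one fetched from each reference picture with its own vector."""
    p0 = motion_compensate(ref0, top, left, size, mv0)
    p1 = motion_compensate(ref1, top, left, size, mv1)
    return [[(a + b + 1) >> 1 for a, b in zip(r0, r1)]
            for r0, r1 in zip(p0, p1)]
```

Real codecs add sub-pel interpolation and clipping at picture boundaries, which this sketch omits.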
As for the motion vector itself, compression efficiency is improved by taking the motion vector of a coded block adjacent to the target block as a motion vector predictor (also simply called a "prediction vector"), computing the difference between the target block's motion vector and the prediction vector, and transmitting that difference vector as the coded vector.
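The difference-vector signaling described above amounts to the following (hypothetical function names; a minimal sketch):

```python
def encode_mv(mv, pred_mv):
    """Encoder side: only the difference between the block's motion vector
    and its predictor is transmitted as the coded vector."""
    return (mv[0] - pred_mv[0], mv[1] - pred_mv[1])

def decode_mv(diff_mv, pred_mv):
    """Decoder side: the motion vector is reconstructed by adding the
    transmitted difference back to the same predictor."""
    return (diff_mv[0] + pred_mv[0], diff_mv[1] + pred_mv[1])
```

The scheme works because encoder and decoder derive the identical predictor from already-coded blocks, so only the (typically small) residual needs to be entropy coded.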
MPEG-4 AVC improves the efficiency of motion-compensated prediction over MPEG-2 by allowing smaller and more varied block sizes for motion compensation. On the other hand, the smaller block sizes increase the number of motion vectors, so the code amount spent on coded vectors becomes a problem.
Whereas MPEG-2 simply used the motion vector of the block adjacent to the left of the target block as the prediction vector, MPEG-4 AVC takes the median of the motion vectors of a plurality of adjacent blocks as the prediction vector, improving its accuracy and suppressing the increase in the code amount of coded vectors. MPEG-4 AVC also provides direct motion-compensated prediction. Direct motion-compensated prediction scales the motion vector of the block co-located with the target block in another, already coded picture by the distances between the target picture and the two reference pictures to generate a new motion vector, realizing motion-compensated prediction without transmitting a coded vector.
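The two techniques above can be sketched as follows: a component-wise median over neighboring motion vectors, and direct-mode style scaling of a co-located motion vector by the ratio of picture distances. The names, the use of picture-order distances, and the plain integer division are illustrative assumptions, not the normative rounding rules:

```python
def median_predictor(neighbor_mvs):
    """Component-wise median of the neighboring blocks' motion vectors
    (e.g. left, above, above-right), in the spirit of the MPEG-4 AVC
    predictor. Assumes an odd number of neighbors for simplicity."""
    xs = sorted(mv[0] for mv in neighbor_mvs)
    ys = sorted(mv[1] for mv in neighbor_mvs)
    mid = len(neighbor_mvs) // 2
    return (xs[mid], ys[mid])

def scale_mv(colocated_mv, dist_cur, dist_col):
    """Direct-mode style scaling: stretch the co-located block's motion
    vector by the ratio of picture distances (dist_cur and dist_col are
    assumed to be display-order distances between the pictures involved)."""
    return (colocated_mv[0] * dist_cur // dist_col,
            colocated_mv[1] * dist_cur // dist_col)
```

Because each component is taken from the median independently, the predicted vector need not equal any single neighbor's vector.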
A form of motion-compensated prediction is also known that uses the motion information of a block adjacent to the target block, again realizing motion-compensated prediction without transmitting a coded vector (see, for example, Patent Document 1).
Patent Document 1: Japanese Patent Application Laid-Open No. 10-276439 (JP-A-10-276439)
As described above, direct motion-compensated prediction, which transmits no coded vector, relies on the continuity of motion between the target block and the co-located block in another, already coded picture. Patent Document 1 likewise relies on the continuity of motion between the target block and its adjacent blocks. By reusing the motion information of other blocks, these techniques improve coding efficiency without coding motion information, including the difference vector, as a coded vector.
In conventional motion-compensated prediction, however, when the motion of the target block deviates from the motion of its adjacent blocks, or from the motion of blocks around the co-located position in another, already coded picture, the motion information including the difference vector must still be coded, and the improvement in coding efficiency is not fully realized.
The present invention has been made in view of such circumstances, and its object is to provide a technique that further improves the coding efficiency of motion information including motion vectors.
To solve the above problem, an image encoding device according to one aspect of the present invention is an image encoding device that performs motion-compensated prediction, including: a candidate list generation unit (140) that selects, from a plurality of coded blocks adjacent to a coding target block, a plurality of blocks each having one or two items of motion information including at least motion vector information and reference picture information, and generates, from the motion information of the selected blocks, a candidate list containing candidates of motion information to be used for motion-compensated prediction; a first motion information acquisition unit (161) that acquires motion information of a first prediction list from a first candidate among the candidates; a second motion information acquisition unit (162) that acquires motion information of a second prediction list from a second candidate among the candidates; and a selection candidate generation unit (163) that combines the motion information of the first prediction list acquired by the first motion information acquisition unit (161) and the motion information of the second prediction list acquired by the second motion information acquisition unit (162) to generate a new candidate of motion information.
When the number of candidates is less than a set maximum number, the candidate list generation unit (140) may generate a candidate list that includes the new candidate generated by the selection candidate generation unit (163).
The candidate list generation unit (140) may generate a candidate list that includes one or more new candidates generated by the selection candidate generation unit (163), such that the number of candidates does not exceed the maximum number.
The device may further include a code string generation unit (104) that encodes candidate specifying information for identifying, within the candidate list, the candidate of motion information to be used for motion-compensated prediction.
The candidate list generation unit (140) may assign, to the new candidate generated by the selection candidate generation unit (163), candidate specifying information larger than that of the existing candidates.
The first prediction list and the second prediction list may be different prediction lists.
The candidate list generation unit (140) may include in the candidate list motion information derived from the motion information of a block of a picture temporally different from the picture containing the coding target block.
The first motion information acquisition unit (161) may search the candidates in a first priority order and take a valid candidate found as the first candidate. The second motion information acquisition unit (162) may search the candidates in a second priority order and take a valid candidate found as the second candidate.
The first motion information acquisition unit (161) may take a predetermined candidate among the candidates as the first candidate. The second motion information acquisition unit (162) may take another predetermined candidate among the candidates as the second candidate.
The selection candidate generation unit (163) may generate the new candidate when both the motion information of the first prediction list acquired by the first motion information acquisition unit (161) and the motion information of the second prediction list acquired by the second motion information acquisition unit (162) are valid.
The new candidate may have two items of motion information, or may have one item of motion information.
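As a rough sketch of the candidate-combination mechanism described in the preceding paragraphs: each candidate carries per-prediction-list motion information, the first valid first-list (L0) information and first valid second-list (L1) information are found by scanning the candidates in priority order, and their combination is appended as a new bi-predictive candidate while the list is below its maximum size. The data model and names are hypothetical, not the normative derivation process:

```python
from dataclasses import dataclass
from typing import Optional, Tuple, List

@dataclass
class Candidate:
    # Per-prediction-list motion information: (motion vector, reference
    # picture index), or None when the candidate has no motion information
    # for that prediction list.
    l0: Optional[Tuple[Tuple[int, int], int]] = None
    l1: Optional[Tuple[Tuple[int, int], int]] = None

def first_valid(cands: List[Candidate], lst: int):
    """Scan the candidates in priority (list) order and return the first
    valid motion information for the given prediction list, or None."""
    for c in cands:
        info = c.l0 if lst == 0 else c.l1
        if info is not None:
            return info
    return None

def add_combined_candidate(cands: List[Candidate], max_num: int):
    """Combine the L0 information of a first candidate with the L1
    information of a second candidate into a new bi-predictive candidate,
    appended only while the list has room. A single-combination sketch;
    the actual scheme may iterate over several candidate pairs."""
    if len(cands) >= max_num:
        return cands
    l0 = first_valid(cands, 0)
    l1 = first_valid(cands, 1)
    if l0 is not None and l1 is not None:  # both lists must be valid
        cands.append(Candidate(l0=l0, l1=l1))
    return cands
```

Appending the combined candidate at the end corresponds to assigning it candidate specifying information larger than that of the existing candidates.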
Another aspect of the present invention is an image encoding method. This method is an image encoding method that performs motion-compensated prediction, including the steps of: selecting, from a plurality of coded blocks adjacent to a coding target block, a plurality of blocks each having one or two items of motion information including at least motion vector information and reference picture information, and generating, from the motion information of the selected blocks, a candidate list containing candidates of motion information to be used for motion-compensated prediction; acquiring motion information of a first prediction list from a first candidate included in the candidate list; acquiring motion information of a second prediction list from a second candidate included in the candidate list; and combining the motion information of the first prediction list and the motion information of the second prediction list to generate a new candidate of motion information.
An image decoding device according to one aspect of the present invention is an image decoding device that performs motion-compensated prediction, including: a candidate list generation unit (230) that selects, from a plurality of decoded blocks adjacent to a decoding target block, a plurality of blocks each having one or two items of motion information including at least motion vector information and reference picture information, and generates, from the motion information of the selected blocks, a candidate list containing candidates of motion information to be used for motion-compensated prediction; a first motion information acquisition unit (161) that acquires motion information of a first prediction list from a first candidate among the candidates; a second motion information acquisition unit (162) that acquires motion information of a second prediction list from a second candidate among the candidates; and a selection candidate generation unit (163) that combines the motion information of the first prediction list acquired by the first motion information acquisition unit (161) and the motion information of the second prediction list acquired by the second motion information acquisition unit (162) to generate a new candidate of motion information.
When the number of candidates is less than a set maximum number, the candidate list generation unit (230) may generate a candidate list that includes the new candidate generated by the selection candidate generation unit (163).
The candidate list generation unit (230) may generate a candidate list that includes one or more new candidates generated by the selection candidate generation unit (163), such that the number of candidates does not exceed the maximum number.
The device may further include: a code string analysis unit (201) that decodes candidate specifying information for identifying, within the candidate list, the candidate of motion information to be used for motion-compensated prediction; and a selection unit (231) that uses the decoded candidate specifying information to select one candidate from the selection candidates included in the candidate list generated by the candidate list generation unit (230).
The candidate list generation unit (230) may assign, to the new candidate generated by the selection candidate generation unit (163), candidate specifying information larger than that of the existing candidates.
The first prediction list and the second prediction list may be different prediction lists.
The candidate list generation unit (230) may include in the candidate list motion information derived from the motion information of a block of a picture temporally different from the picture containing the decoding target block.
The first motion information acquisition unit (161) may search the candidates in a first priority order and take a valid candidate found as the first candidate. The second motion information acquisition unit (162) may search the candidates in a second priority order and take a valid candidate found as the second candidate.
The first motion information acquisition unit (161) may take a predetermined candidate among the candidates as the first candidate. The second motion information acquisition unit (162) may take another predetermined candidate among the candidates as the second candidate.
The selection candidate generation unit (163) may generate the new candidate when both the motion information of the first prediction list acquired by the first motion information acquisition unit (161) and the motion information of the second prediction list acquired by the second motion information acquisition unit (162) are valid.
The new candidate may have two items of motion information, or may have one item of motion information.
Another aspect of the present invention is an image decoding method. This method includes the steps of: selecting, from a plurality of decoded blocks adjacent to a decoding target block, a plurality of blocks each having one or two items of motion information including at least motion vector information and reference picture information, and generating, from the motion information of the selected blocks, a candidate list containing candidates of motion information to be used for motion-compensated prediction; acquiring motion information of a first prediction list from a first candidate included in the candidate list; acquiring motion information of a second prediction list from a second candidate included in the candidate list; and combining the motion information of the first prediction list and the motion information of the second prediction list to generate a new candidate of motion information.
Any combination of the above constituent elements, and any conversion of the expression of the present invention between a method, a device, a system, a recording medium, a computer program, and the like, are also effective as aspects of the present invention.
According to the present invention, the coding efficiency of motion information including motion vectors can be further improved.
Brief Description of Drawings

FIG. 1 illustrates an example of dividing a picture into largest coding blocks.
FIGS. 2(a) and 2(b) illustrate coding blocks.
FIGS. 3(a) to 3(d) illustrate prediction blocks.
FIG. 4 illustrates prediction block sizes.
FIG. 5 illustrates prediction coding modes.
FIGS. 6(a) to 6(d) illustrate prediction directions of motion-compensated prediction.
FIG. 7 illustrates an example of the syntax of a prediction block.
FIGS. 8(a) to 8(c) illustrate Truncated Unary code strings of merge indices.
FIG. 9 illustrates the configuration of a moving picture encoding device according to Embodiment 1 of the present invention.
FIG. 10 illustrates the method of managing motion information in the motion information memory of FIG. 9.
FIG. 11 illustrates the configuration of the motion information generation unit of FIG. 9.
FIG. 12 illustrates the configuration of the difference vector calculation unit of FIG. 9.
FIG. 13 illustrates a spatial candidate block group.
FIG. 14 illustrates a temporal candidate block group.
FIG. 15 illustrates the configuration of the combined motion information determination unit of FIG. 11.
FIG. 16 illustrates the configuration of the combined motion information candidate generation unit of FIG. 15.
FIG. 17 illustrates the configuration of the bidirectional combined motion information candidate list generation unit of FIG. 16.
FIG. 18 illustrates a candidate number management table.
FIGS. 19(a) and 19(b) illustrate conversion from merge candidate numbers to merge indices.
FIG. 20 is a flowchart illustrating the encoding operation of the moving picture encoding device according to Embodiment 1 of the present invention.
FIG. 21 is a flowchart illustrating the operation of the motion information generation unit of FIG. 9.
FIG. 22 is a flowchart illustrating the operation of the difference vector calculation unit of FIG. 11.
FIG. 23 is a flowchart illustrating the operation of the combined motion information determination unit of FIG. 11.
FIG. 24 is a flowchart illustrating the operation of the bidirectional combined motion information candidate list generation unit of FIG. 16.
FIG. 25 is a flowchart illustrating generation of a spatial combined motion information candidate list.
FIG. 26 is a flowchart illustrating generation of a temporal combined motion information candidate list.
FIG. 27 is a flowchart illustrating generation of a bidirectional combined motion information candidate list.
FIG. 28 is a flowchart illustrating the operation of the reference direction motion information determination unit of FIG. 17.
FIG. 29 is a flowchart illustrating the operation of the reverse direction motion information determination unit of FIG. 17.
FIG. 30 illustrates determination of the prediction direction of a bidirectional combined motion information candidate.
FIGS. 31(a) to 31(c) illustrate extended examples of determining the prediction direction of a bidirectional combined motion information candidate.
FIG. 32 illustrates the configuration of a moving picture decoding device according to Embodiment 1 of the present invention.
FIG. 33 illustrates the configuration of the motion information reproduction unit of FIG. 32.
FIG. 34 illustrates the configuration of the motion vector reproduction unit of FIG. 33.
FIG. 35 illustrates the configuration of the combined motion information reproduction unit of FIG. 33.
FIG. 36 is a flowchart illustrating the decoding operation of the moving picture decoding device according to Embodiment 1 of the present invention.
FIG. 37 is a flowchart illustrating the operation of the motion information reproduction unit of FIG. 32.
FIG. 38 is a flowchart illustrating the operation of the motion vector reproduction unit of FIG. 33.
FIG. 39 is a flowchart illustrating the operation of the combined motion information reproduction unit of FIG. 33.
FIGS. 40(a) and 40(b) illustrate a candidate number management table according to Modification 1.
FIG. 41 illustrates another candidate number management table according to Modification 1 of Embodiment 1.
FIG. 42 is a flowchart illustrating derivation of a bidirectional combined motion information candidate (BD2).
FIG. 43 is a flowchart illustrating derivation of a bidirectional combined motion information candidate (BD3).
FIG. 44 is a flowchart illustrating the operation of the reverse direction motion information determination unit according to Modification 2 of Embodiment 1.
FIG. 45 is a flowchart illustrating the operation of the reverse direction motion information determination unit according to Modification 3 of Embodiment 1.
FIG. 46 illustrates the configuration of the combined motion information candidate generation unit according to Modification 4 of Embodiment 1.
FIG. 47 illustrates the operations of the reference direction motion information determination unit and the reverse direction motion information determination unit according to Modification 4 of Embodiment 1.
FIG. 48 illustrates combinations of motion information whose two prediction directions are identical, according to Modification 5 of Embodiment 1.
FIGS. 49(a) and 49(b) illustrate predetermined combinations of BD0 and BD1 according to Modification 6 of Embodiment 1.
FIG. 50 is a first diagram illustrating the effect of Embodiment 1.
FIG. 51 is a second diagram illustrating the effect of Embodiment 1.
FIG. 52 is a third diagram illustrating the effect of Embodiment 1.
FIGS. 53(a) and 53(b) illustrate syntax for encoding the candidate number management table of Embodiment 2 into an encoded stream.
FIG. 54 illustrates a candidate number management table of Embodiment 3.
FIG. 55 illustrates the configuration of the combined motion information candidate generation unit of Embodiment 3.
FIG. 56 is a flowchart illustrating the operation of the combined motion information candidate generation unit of Embodiment 3.
FIG. 57 is a flowchart illustrating the operation of the candidate number management table changing unit of Embodiment 3.
FIGS. 58(a) to 58(c) illustrate examples of changes to the candidate number management table by the candidate number management table changing unit of Embodiment 3.
FIG. 59 is a flowchart illustrating the operation of the candidate number management table changing unit according to Modification 1 of Embodiment 3.
FIGS. 60(a) and 60(b) illustrate candidate number management tables of the candidate number management table changing unit according to Modification 1 of Embodiment 3.
FIG. 61 is a flowchart illustrating the operation of the candidate number management table changing unit according to Modification 2 of Embodiment 3.
FIG. 62 is a flowchart illustrating the operation of the candidate number management table changing unit according to Modification 3 of Embodiment 3.
FIG. 63 is a flowchart illustrating the operation of the reference direction motion information determination unit of Embodiment 4.
FIG. 64 illustrates the configuration of the combined motion information candidate generation unit of Embodiment 5.
FIG. 65 illustrates a candidate number management table of Embodiment 6.
FIG. 66 is a flowchart illustrating the operation of the reference direction determination unit of Embodiment 6.
FIG. 67 illustrates a method of calculating the motion vectors mvL0t and mvL1t of a temporal combined motion information candidate.
First, the technology underlying the embodiments of the present invention will be described.
Devices and systems conforming to coding schemes such as MPEG (Moving Picture Experts Group) are now in widespread use. Such coding schemes handle a sequence of pictures on the time axis as digital signal information and, for the purpose of efficient broadcasting, transmission, or storage, compress it using motion-compensated prediction, which exploits redundancy in the temporal direction, and orthogonal transforms such as the discrete cosine transform, which exploit redundancy in the spatial direction.
In 1995, the MPEG-2 video (ISO/IEC 13818-2) coding scheme was established as a general-purpose video compression coding scheme, and it is widely used in applications such as storage media, including DVD and magnetic tape for digital VTRs under the D-VHS (registered trademark) standard, and digital broadcasting.
Furthermore, in 2003, a coding scheme called MPEG-4 AVC/H.264 (numbered 14496-10 by ISO/IEC and H.264 by ITU-T; hereinafter referred to as MPEG-4 AVC) was established as an international standard through the joint work of the Joint Technical Committee (ISO/IEC) of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), and the International Telecommunication Union Telecommunication Standardization Sector (ITU-T).
Standardization of a coding scheme called HEVC is currently under study through the joint work of the Joint Technical Committee (ISO/IEC) of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), and the International Telecommunication Union Telecommunication Standardization Sector (ITU-T).
(Coding Block)
In the embodiments of the present invention, an input image signal is divided into largest-coding-block units as shown in FIG. 1, and the resulting coding blocks are processed in raster scan order. A coding block has a hierarchical structure and can be made into smaller coding blocks by successively dividing it into four equal parts in consideration of coding efficiency and other factors. The four coding blocks obtained by a division are coded in zigzag scan order. A coding block that cannot be divided any further is called a minimum coding block. The coding block is the unit of coding; the largest coding block is itself a coding block when its division count is zero. In these embodiments, the largest coding block is 64 × 64 pixels, and the minimum coding block is 8 × 8 pixels.
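The hierarchical division just described can be sketched as a recursive quadtree that visits the four sub-blocks in zigzag (z-scan) order; the `should_split` callback is a stand-in for the encoder's actual mode decision, and all names here are illustrative:

```python
def traverse_cu(x, y, size, should_split, visit, min_size=8):
    """Recursively divide a coding block into four equal sub-blocks.
    Sub-blocks are visited in zigzag scan order: top-left, top-right,
    bottom-left, bottom-right. Blocks at min_size are never split."""
    if size > min_size and should_split(x, y, size):
        half = size // 2
        for dy in (0, half):          # top row of sub-blocks first
            for dx in (0, half):      # left sub-block before right
                traverse_cu(x + dx, y + dy, half, should_split, visit, min_size)
    else:
        visit(x, y, size)             # leaf: an actual coding block
```

With a 64 × 64 largest coding block and an 8 × 8 minimum, this yields exactly the block hierarchy described above.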
 FIGS. 2(a) and 2(b) show an example of dividing a maximum coding block. In the example of FIG. 2(a), the coding block is divided into ten blocks. CU0, CU1, and CU9 are 32 × 32-pixel coding blocks; CU2, CU3, and CU8 are 16 × 16-pixel coding blocks; and CU4, CU5, CU6, and CU7 are 8 × 8-pixel coding blocks.
 (Prediction block)
 In the embodiment of the present invention, a coding block is further divided into prediction blocks. The prediction block partition patterns are shown in FIGS. 3(a) to 3(d). FIG. 3(a) shows 2N × 2N, which leaves the coding block undivided; FIG. 3(b) shows 2N × N, which divides it horizontally; FIG. 3(c) shows N × 2N, which divides it vertically; and FIG. 3(d) shows N × N, which divides it both horizontally and vertically. Consequently, as shown in FIG. 4, there are 13 prediction block sizes, ranging from 64 × 64 pixels (the maximum prediction block size, with a CU division count of 0) down to 4 × 4 pixels (the minimum prediction block size, with a CU division count of 3).
 In the embodiment of the present invention, the maximum coding block is 64 × 64 pixels and the minimum coding block is 8 × 8 pixels, but the invention is not limited to this combination. Likewise, although the prediction block partition patterns are those of FIGS. 3(a) to 3(d), the invention is not limited to these, as long as a coding block is divided into one or more prediction blocks.
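As a non-normative sketch, the 13 prediction block sizes of FIG. 4 can be enumerated from the constraints stated above (coding block sizes from 64 × 64 down to 8 × 8; partitions 2N × 2N, 2N × N, and N × 2N at every level, and N × N only for the minimum coding block). The function name and defaults are illustrative, not part of the embodiment.

```python
# Illustrative enumeration of prediction block sizes implied by FIGS. 3 and 4.
def prediction_block_sizes(max_cu=64, min_cu=8):
    sizes = []
    cu = max_cu
    while cu >= min_cu:
        n = cu // 2
        sizes.append((cu, cu))   # 2N x 2N: coding block left undivided
        sizes.append((cu, n))    # 2N x N: horizontal split
        sizes.append((n, cu))    # N x 2N: vertical split
        if cu == min_cu:
            sizes.append((n, n))  # N x N: allowed only for the minimum coding block
        cu //= 2
    return sizes

sizes = prediction_block_sizes()
print(len(sizes))  # 13 sizes, from (64, 64) down to (4, 4)
```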
 (Predictive coding mode)
 In the embodiment of the present invention, the prediction direction of motion compensated prediction and the number of coded vectors can be switched according to the block size of the prediction block. Here, an example of predictive coding modes that associate the prediction direction of motion compensated prediction with the number of coded vectors will be briefly described with reference to FIG. 5.
 The predictive coding modes shown in FIG. 5 include a unidirectional mode (UniPred), in which the prediction direction of motion compensated prediction is unidirectional and the number of coded vectors is 1; a bidirectional mode (BiPred), in which the prediction direction of motion compensated prediction is bidirectional and the number of coded vectors is 2; and a merge mode (MERGE), in which the prediction direction of motion compensated prediction is unidirectional or bidirectional and the number of coded vectors is 0. There is also an intra mode (Intra), a predictive coding mode in which motion compensated prediction is not performed.
 (Reference image index)
 In the embodiment of the present invention, to improve the accuracy of motion compensated prediction, an optimal reference image can be selected from among a plurality of reference images in motion compensated prediction. The reference image used in motion compensated prediction is therefore coded into the coded stream together with the coded vector, as a reference image index. The reference image index used in motion compensated prediction is a numerical value of 0 or greater. The plurality of reference images selectable by the reference image index are managed in a reference index list. If the prediction direction of motion compensated prediction is unidirectional, one reference image index is coded; if it is bidirectional, a reference image index indicating the reference image for each prediction direction is coded (see FIG. 5).
 (Prediction vector index)
 In HEVC, to improve the accuracy of the prediction vector, it is being considered to select an optimal prediction vector from among a plurality of prediction vector candidates and to code a prediction vector index for indicating the selected prediction vector. The embodiment of the present invention introduces this prediction vector index. If the prediction direction of motion compensated prediction is unidirectional, one prediction vector index is coded; if it is bidirectional, a prediction vector index indicating the prediction vector for each prediction direction is coded (see FIG. 5).
 (Merge index)
 In HEVC, to further improve coding efficiency, it is being considered to select an optimal block from among a plurality of adjacent block candidates and a block located at the same position as the block being processed in another, already coded image, and to code and decode a merge index indicating the selected block. This is a motion compensated prediction technique (the merge technique) in which the motion information of the block indicated by the selected merge index, consisting of the prediction direction of motion compensated prediction, motion vector information, and reference image information, is used for the block being processed. The embodiment of the present invention introduces this merge index (merge technique). As shown in FIG. 5, one merge index is coded when the predictive coding mode is the merge mode. If the motion information is bidirectional, it includes motion vector information and reference image information for each prediction direction.
 Hereinafter, the motion information held by a block that can be indicated by the merge index is called a combined motion information candidate, and the collection of combined motion information candidates is called a combined motion information candidate list.
 (Prediction direction)
 In the embodiment of the present invention, two prediction directions, the L0 direction and the L1 direction, are defined for motion compensated prediction. Here, the prediction directions of motion compensated prediction will be briefly described with reference to FIGS. 6(a) to 6(d). When the prediction direction of motion compensated prediction is unidirectional, either the L0 direction or the L1 direction is used. FIG. 6(a) shows a unidirectional case in which the L0-direction reference image (RefL0Pic) is at a time earlier than the image being coded (CurPic). FIG. 6(b) shows a unidirectional case in which the L0-direction reference image is at a time later than the image being coded. The L0-direction reference images in FIGS. 6(a) and 6(b) may be replaced with an L1-direction reference image (RefL1Pic).
 In the bidirectional case, both the L0 and L1 directions are used, which is expressed as the BI direction. FIG. 6(c) shows a bidirectional case in which the L0-direction reference image is at a time earlier than the image being coded and the L1-direction reference image is at a time later than the image being coded. FIG. 6(d) shows a bidirectional case in which both the L0-direction and L1-direction reference images are at times earlier than the image being coded. The L0-direction reference images in FIGS. 6(c) and 6(d) may be replaced with an L1-direction reference image (RefL1Pic), and the L1-direction reference images with an L0-direction reference image. As described above, the L0 and L1 directions, which are the prediction directions of motion compensated prediction, can each point either forward or backward in time. A plurality of reference images can exist in each of the L0 and L1 directions; the L0-direction reference images are registered in reference image list L0 and the L1-direction reference images in reference image list L1, the position of a reference image within its reference image list is specified by the reference image index for that prediction direction, and the reference image is thereby determined. Hereinafter, saying that the prediction direction is the L0 direction means a prediction direction that uses motion information associated with a reference image registered in reference image list L0, and saying that the prediction direction is the L1 direction means a prediction direction that uses motion information associated with a reference image registered in reference image list L1.
 (Syntax)
 An example of the syntax of a prediction block according to the embodiment of the present invention will be described with reference to FIG. 7. Whether a prediction block is intra or inter is specified by the coding block above it; FIG. 7 shows the syntax when the prediction block is inter. The prediction block carries a merge flag (merge_flag), a merge index (merge_idx), the motion compensated prediction direction (inter_pred_type), reference indices (ref_idx_l0 and ref_idx_l1), difference vectors (mvd_l0[0], mvd_l0[1], mvd_l1[0], mvd_l1[1]), and prediction vector indices (mvp_idx_l0 and mvp_idx_l1). In a difference vector, [0] indicates the horizontal component and [1] the vertical component.
 Here, ref_idx_l0, mvd_l0[0], mvd_l0[1], and mvp_idx_l0 are information about the L0 direction, while ref_idx_l1, mvd_l1[0], mvd_l1[1], and mvp_idx_l1 are information about the L1 direction. inter_pred_type takes one of three values: Pred_L0 (unidirectional, L0 direction), Pred_L1 (unidirectional, L1 direction), and Pred_BI (bidirectional, BI).
 (Code amount of motion information)
 As can be seen from the syntax of FIG. 7, the merge mode can transmit motion information with a single merge index. Therefore, if the prediction errors of the merge mode (merge flag of 1) and the non-merge mode (merge flag of 0) are comparable, the merge mode codes the motion information more efficiently. In other words, raising the selection rate of the merge mode can improve the coding efficiency of the motion information.
 Although the syntax of the prediction block according to the embodiment of the present invention is set as in FIG. 7, according to the embodiment it suffices that the merge mode can code motion information with less information than the non-merge mode, and the syntax is not limited to this. For example, the motion information may consist of only a difference vector.
 (Characteristics of the merge index)
 In FIG. 7, NumMergeCands(), a function that calculates the number of merge candidates, is placed before the decoding (coding) of the merge index, and NumMvpCands(), a function that calculates the number of prediction vector candidates, is placed before the decoding (coding) of the prediction vector index. These functions are needed to obtain the candidate counts, because the number of merge candidates and the number of prediction vector candidates change from prediction block to prediction block depending on the validity of the motion information of the adjacent blocks. The motion information of an adjacent block being valid means that the adjacent block is neither outside the region nor in intra mode; the motion information of an adjacent block being invalid means that the adjacent block is outside the region or in intra mode.
 When the number of merge candidates is 1, the merge index is not decoded (coded). This is because, when there is only one merge candidate, it can be determined uniquely without being specified. The same applies to the prediction vector index.
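The candidate count described above can be sketched as follows. This is an illustrative, non-normative implementation based on the validity definition given above (a candidate is invalid if it lies outside the region or is intra-coded, which this embodiment marks with a reference image index of -1); the function and field names are hypothetical.

```python
# Illustrative sketch of a candidate-counting function such as NumMergeCands():
# count candidate blocks whose motion information is valid.
def num_merge_cands(candidate_blocks):
    """candidate_blocks: list of dicts with 'ref_idx_l0'/'ref_idx_l1',
    or None for a block outside the region. A reference image index of
    -1 marks an unavailable direction (intra blocks have -1 in both)."""
    count = 0
    for blk in candidate_blocks:
        if blk is None:
            continue  # outside the region: motion information is invalid
        if blk['ref_idx_l0'] >= 0 or blk['ref_idx_l1'] >= 0:
            count += 1  # at least one direction holds valid motion information
    return count

blocks = [
    None,                                  # outside the picture
    {'ref_idx_l0': -1, 'ref_idx_l1': -1},  # intra: no valid motion information
    {'ref_idx_l0': 0, 'ref_idx_l1': -1},   # unidirectional L0
    {'ref_idx_l0': 1, 'ref_idx_l1': 0},    # bidirectional
]
print(num_merge_cands(blocks))  # 2
```

When this count is 1, the merge index can be omitted from the stream, as described above.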
 The code strings of the merge index will now be described with reference to FIGS. 8(a) to 8(c). In the embodiment of the present invention, a Truncated Unary code string is used as the code string of the merge index. FIG. 8(a) shows the Truncated Unary code strings for the merge index when there are two merge candidates, FIG. 8(b) when there are three merge candidates, and FIG. 8(c) when there are four merge candidates.
 FIGS. 8(a) to 8(c) show that, even when the same merge index value is coded, the smaller the number of merge candidates, the smaller the number of code bits assigned to the merge index. For example, when the merge index is 1, it is coded as the single bit '1' if there are two merge candidates, but as the two bits '10' if there are three merge candidates.
 As described above, the smaller the number of merge candidates, the better the coding efficiency of the merge index. That is, the coding efficiency of the merge index can be improved by keeping candidates with high selection rates and removing candidates with low selection rates. Furthermore, when the number of candidates is the same, a smaller merge index requires a smaller code amount, so coding efficiency can be improved by assigning small merge indices to candidates with high selection rates.
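The Truncated Unary code strings described above can be generated mechanically: index i is coded as i '1' bits followed by a terminating '0', except that the last index needs no terminator because it is implied by the candidate count. The following sketch reproduces the examples given above (it is an illustration, not the normative binarization; the function name is ours).

```python
# Illustrative Truncated Unary binarization for a merge index.
def truncated_unary(index, num_cands):
    """Code string for 'index' given 'num_cands' merge candidates."""
    if index < num_cands - 1:
        return '1' * index + '0'
    return '1' * index  # the last candidate needs no terminating zero

print(truncated_unary(1, 2))  # '1'  (two candidates: one bit suffices)
print(truncated_unary(1, 3))  # '10' (three candidates: two bits)
print([truncated_unary(i, 4) for i in range(4)])  # ['0', '10', '110', '111']
```

Note that the code length shrinks as the candidate count shrinks, which is exactly why pruning low-selection-rate candidates improves merge index coding efficiency.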
 (POC)
 In the embodiment of the present invention, POC (Picture Order Count) is used as the time information (distance information) of an image. POC is a counter, defined in MPEG-4 AVC, that indicates the display order of images. When the display order of an image increases by 1, its POC also increases by 1. Therefore, the time difference (distance) between images can be obtained from the POC difference between them.
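A minimal illustration of the relationship stated above: since POC increases by 1 per picture in display order, the temporal distance between two pictures is simply their POC difference. The variable values here are arbitrary examples.

```python
# Temporal distance between two pictures, derived from their POC values.
def poc_distance(poc_a, poc_b):
    """Signed distance in display order from picture b to picture a."""
    return poc_a - poc_b

print(poc_distance(8, 5))  # 3: the reference picture is 3 pictures earlier
print(poc_distance(5, 8))  # -3: the reference picture is 3 pictures later
```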
 (Characteristics of the motion information of adjacent blocks)
 In general, the correlation between the motion information of the block being processed and the motion information of a block adjacent to it (hereinafter, an adjacent block) is high when the block being processed and the adjacent block have the same motion, for example, when a region including both the block being processed and the adjacent block is translating. In general, the degree of correlation also depends on the length over which the block being processed and the adjacent block are in contact.
 (Characteristics of the motion information of another image)
 On the other hand, for a block located at the same position as the block being processed in another, already decoded image (hereinafter, a same-position block), as generally used in the temporal direct mode and the spatial direct mode, the correlation with the block being processed is high when the same-position block and the block being processed are in a stationary state.
 Preferred embodiments of a moving picture coding device, a moving picture coding method, and a moving picture coding program according to the present invention will now be described in detail with reference to the drawings. In the description of the drawings, identical elements are given identical reference numerals and duplicate description is omitted.
 [Embodiment 1]
 (Configuration of the moving picture coding device 100)
 FIG. 9 shows the configuration of a moving picture coding device 100 according to Embodiment 1 of the present invention. The moving picture coding device 100 is a device that codes a moving picture signal in units of the prediction blocks in which motion compensated prediction is performed. It is assumed that the division into coding blocks, the determination of the prediction block size, and the determination of the predictive coding mode are decided by a higher-level coding control unit.
 The moving picture coding device 100 is realized by hardware such as an information processing device including a CPU (Central Processing Unit), a frame memory, a hard disk, and so on. Through the operation of these components, the moving picture coding device 100 realizes the functional components described below. The position information of the prediction block being processed, the prediction block size, and the prediction direction of motion compensated prediction are assumed to be shared within the moving picture coding device 100 and are not shown.
 The moving picture coding device 100 of Embodiment 1 includes a prediction block image acquisition unit 101, a subtraction unit 102, a prediction error coding unit 103, a code string generation unit 104, a prediction error decoding unit 105, a motion compensation unit 106, an addition unit 107, a motion vector detection unit 108, a motion information generation unit 109, a frame memory 110, and a motion information memory 111.
 (Functions of the moving picture coding device 100)
 The function of each unit will now be described. The prediction block image acquisition unit 101 acquires the image signal of the prediction block being processed from the image signal supplied from terminal 10, based on the position information and prediction block size of the prediction block, and supplies the image signal of the prediction block to the subtraction unit 102, the motion vector detection unit 108, and the motion information generation unit 109.
 The subtraction unit 102 subtracts the prediction signal supplied from the motion compensation unit 106 from the image signal supplied from the prediction block image acquisition unit 101 to calculate a prediction error signal, and supplies the prediction error signal to the prediction error coding unit 103.
 The prediction error coding unit 103 generates prediction error coded data by performing processing such as quantization and orthogonal transformation on the prediction error signal supplied from the subtraction unit 102, and supplies the prediction error coded data to the code string generation unit 104 and the prediction error decoding unit 105.
 The code string generation unit 104 entropy-codes, according to the syntax, the prediction error coded data supplied from the prediction error coding unit 103, and the merge flag, merge candidate number, motion compensated prediction direction, reference image index, difference vector, and prediction vector index supplied from the motion information generation unit 109, together with the prediction direction of motion compensated prediction, to generate a code string, and supplies the code string to terminal 11.
 Here, the merge candidate number is converted into a merge index to generate the code string. The merge candidate number is a number indicating the selected combined motion information candidate. The conversion from the merge candidate number to the merge index will be described later. In Embodiment 1, a Truncated Unary code string is used for coding the merge index and the prediction vector index as described above, but the code string is not limited to this, as long as a smaller number of candidates can be coded with fewer bits.
 The prediction error decoding unit 105 generates a prediction error signal by performing processing such as inverse quantization and inverse orthogonal transformation on the prediction error coded data supplied from the prediction error coding unit 103, and supplies the prediction error signal to the addition unit 107.
 The motion compensation unit 106 generates a prediction signal by motion-compensating the reference image in the frame memory 110 indicated by the reference image index supplied from the motion information generation unit 109, based on the motion vector supplied from the motion information generation unit 109. If the prediction direction is bidirectional, the average of the L0-direction and L1-direction prediction signals is used as the prediction signal.
 The addition unit 107 generates a decoded image signal by adding the prediction error signal supplied from the prediction error decoding unit 105 and the prediction signal supplied from the motion compensation unit 106, and supplies the decoded image signal to the frame memory 110.
 The motion vector detection unit 108 detects a motion vector and a reference image index indicating a reference image from the image signal supplied from the prediction block image acquisition unit 101 and image signals corresponding to a plurality of reference images, and supplies the motion vector and the reference image index to the motion information generation unit 109. If the prediction direction is bidirectional, motion vectors and reference image indices are detected for the L0 and L1 directions.
 A general method of detecting a motion vector calculates an error evaluation value between the image signal of the target image and the image signal corresponding to a reference image displaced by a predetermined amount of movement from the same position, and takes the amount of movement that minimizes the error evaluation value as the motion vector. As the error evaluation value, SAD (Sum of Absolute Difference), which indicates the sum of absolute differences, MSE (Mean Square Error), which indicates the mean square error, or the like can be used.
 The motion information generation unit 109 generates a merge candidate number, or a difference vector and a prediction vector index, from the motion vector and reference image index supplied from the motion vector detection unit 108, the candidate block groups supplied from the motion information memory 111, and the reference image in the frame memory 110 indicated by the reference image index, and supplies the merge flag, merge candidate number, reference image index, difference vector, and prediction vector index, as necessary, to the code string generation unit 104, the motion compensation unit 106, and the motion information memory 111. The detailed configuration of the motion information generation unit 109 will be described later.
 The frame memory 110 stores the decoded image signal supplied from the addition unit 107. It also stores, as reference images, a predetermined number (one or more) of decoded images whose decoding has been completed over the entire image. The frame memory 110 supplies the stored reference image signals to the motion compensation unit 106 and the motion information generation unit 109. The storage area for the reference images is controlled by a FIFO (First In First Out) scheme.
 The motion information memory 111 stores the motion information supplied from the motion information generation unit 109 for a predetermined number of images, in units of the minimum prediction block size. The motion information of the blocks adjacent to the prediction block being processed forms the spatial candidate block group.
 The motion information memory 111 also treats the motion information of the block on ColPic at the same position as the prediction block being processed, and of its surrounding blocks, as the temporal candidate block group. The motion information memory 111 supplies the spatial candidate block group and the temporal candidate block group to the motion information generation unit 109 as the candidate block groups. The motion information memory 111 is synchronized with the frame memory 110 and is controlled by a FIFO (First In First Out) scheme.
 Here, ColPic is a decoded image other than the one containing the prediction block being processed, and refers to an image stored in the frame memory 110 as a reference image. In Embodiment 1, ColPic is the reference image decoded immediately before. Although ColPic is the reference image decoded immediately before in Embodiment 1, it may be any already coded image; for example, it may be the immediately preceding reference image in display order or the immediately following reference image in display order, and it can also be specified in the coded stream.
 The method of managing motion information in the motion information memory 111 will now be described with reference to FIG. 10. Motion information is stored in each memory area in units of the minimum prediction block. FIG. 10 shows the situation when the prediction block size being processed is 16 × 16 pixels. In this case, the motion information of this prediction block is stored in the 16 memory areas shown hatched in FIG. 10.
 When the predictive coding mode is the intra mode, (0, 0) is stored as the L0-direction and L1-direction motion vectors, and "-1" is stored as the L0-direction and L1-direction reference image indices. The reference image index "-1" may be any value that allows the mode to be identified as one in which motion-compensated prediction is not performed. Hereinafter, unless otherwise noted, the term "block" by itself refers to the minimum prediction block unit. For a block outside the picture area, as in the intra mode, (0, 0) is stored as the L0- and L1-direction motion vectors and "-1" as the L0- and L1-direction reference image indices. The LX direction (X is 0 or 1) is valid when the LX-direction reference image index is 0 or greater, and is invalid (not valid) when the LX-direction reference image index is "-1".
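 The storage rules above can be summarized in a small sketch. This is an illustrative sketch only, not the patent's implementation; the record and field names (`MotionInfo`, `mv_l0`, `ref_l0`, and so on) are assumptions.

```python
from dataclasses import dataclass

@dataclass
class MotionInfo:
    mv_l0: tuple = (0, 0)   # L0-direction motion vector
    mv_l1: tuple = (0, 0)   # L1-direction motion vector
    ref_l0: int = -1        # L0-direction reference image index
    ref_l1: int = -1        # L1-direction reference image index

    def is_valid(self, lx: int) -> bool:
        # An LX direction is valid iff its reference image index is 0 or greater
        return (self.ref_l0 if lx == 0 else self.ref_l1) >= 0
```

 Under this sketch, an intra-mode or out-of-area block is simply `MotionInfo()`, whose defaults store the (0, 0) vectors and "-1" indices, so both directions report invalid.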
 Next, a detailed configuration of the motion information generation unit 109 will be described with reference to FIG. 11. FIG. 11 shows the configuration of the motion information generation unit 109. The motion information generation unit 109 includes a difference vector calculation unit 120, a combined motion information determination unit 121, and a predictive coding mode determination unit 122. The terminal 12 is connected to the motion information memory 111, the terminal 13 to the motion vector detection unit 108, the terminal 14 to the frame memory 110, the terminal 15 to the prediction block image acquisition unit 101, the terminal 16 to the code string generation unit 104, the terminal 50 to the motion compensation unit 106, and the terminal 51 to the motion information memory 111.
 The function of each unit will now be described. The difference vector calculation unit 120 determines a prediction vector index from the candidate block group supplied from the terminal 12, the motion vector and reference image index supplied from the terminal 13, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15, and calculates a difference vector and a rate-distortion evaluation value. It then supplies the reference image index, the motion vector, the difference vector, the prediction vector index, and the rate-distortion evaluation value to the predictive coding mode determination unit 122. The detailed configuration of the difference vector calculation unit 120 will be described later.
 The combined motion information determination unit 121 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15. The combined motion information determination unit 121 then selects a combined motion information candidate from the generated combined motion information candidate list, determines a merge candidate number, calculates a rate-distortion evaluation value, and supplies the motion information of the combined motion information candidate, the merge candidate number, and the rate-distortion evaluation value to the predictive coding mode determination unit 122. The detailed configuration of the combined motion information determination unit 121 will be described later.
 The predictive coding mode determination unit 122 compares the rate-distortion evaluation value supplied from the difference vector calculation unit 120 with the rate-distortion evaluation value supplied from the combined motion information determination unit 121. If the former is less than the latter, the merge flag is set to "0". The predictive coding mode determination unit 122 supplies the merge flag and the reference image index, difference vector, and prediction vector index supplied from the difference vector calculation unit 120 to the terminal 16, and supplies the motion vector and reference image index supplied from the difference vector calculation unit 120 to the terminals 50 and 51.
 If the former is greater than or equal to the latter, the merge flag is set to "1". The predictive coding mode determination unit 122 supplies the merge flag and the merge candidate number supplied from the combined motion information determination unit 121 to the terminal 16, and supplies the motion vector and reference image index of the motion information supplied from the combined motion information determination unit 121 to the terminals 50 and 51. Since the specific method of calculating the rate-distortion evaluation value is not the focus of the present invention, its details are omitted; it is an evaluation value with the property that the smaller it is, the higher the coding efficiency.
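 The comparison rule can be stated as a small hypothetical helper; note the tie-break implied above, where equal evaluation values select merge mode.

```python
def decide_merge_flag(rd_mvp: float, rd_merge: float) -> int:
    # merge flag 0: the motion-vector-prediction path wins only on a
    # strictly smaller rate-distortion evaluation value; ties and larger
    # values select merge mode (flag 1), per the text above.
    return 0 if rd_mvp < rd_merge else 1
```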
 Next, a detailed configuration of the difference vector calculation unit 120 will be described with reference to FIG. 12. FIG. 12 shows the configuration of the difference vector calculation unit 120. The difference vector calculation unit 120 includes a prediction vector candidate list generation unit 130, a prediction vector determination unit 131, and a subtraction unit 132. The terminal 17 is connected to the predictive coding mode determination unit 122.
 The prediction vector candidate list generation unit 130 is also provided in the moving picture decoding apparatus 200, which decodes the code string generated by the moving picture encoding apparatus 100 according to Embodiment 1, so that the moving picture encoding apparatus 100 and the moving picture decoding apparatus 200 generate consistent prediction vector candidate lists.
 The function of each unit will now be described. The prediction vector candidate list generation unit 130 deletes candidate blocks outside the area and candidate blocks in the intra mode from the candidate block group supplied from the terminal 12. Furthermore, when a plurality of candidate blocks have identical motion vectors, one candidate block is kept and the others are deleted. The prediction vector candidate list generation unit 130 generates a prediction vector candidate list from the remaining candidate blocks and supplies the prediction vector candidate list to the prediction vector determination unit 131. The prediction vector candidate list generated in this way contains one or more distinct prediction vector candidates. For example, when no candidate block has a motion vector, the vector (0, 0) is added to the prediction vector candidate list. If the prediction direction is bidirectional, prediction vector candidate lists are generated and supplied for both the L0 direction and the L1 direction.
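 A minimal sketch of this list construction follows. It assumes, per the storage rules given earlier, that an intra candidate carries a reference index of -1 and that an out-of-area candidate is represented as absent (`None`); all names are illustrative, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class CandidateBlock:
    mv: tuple      # motion vector of the candidate block
    ref_idx: int   # reference image index; -1 marks intra mode

def build_mvp_candidate_list(candidate_blocks):
    candidates = []
    for blk in candidate_blocks:
        if blk is None or blk.ref_idx < 0:   # outside the area, or intra mode
            continue
        if blk.mv not in candidates:         # keep one of any duplicate vectors
            candidates.append(blk.mv)
    if not candidates:                       # no candidate block had a motion vector
        candidates.append((0, 0))
    return candidates
```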
 The prediction vector determination unit 131 selects the optimal prediction vector for the motion vector supplied from the terminal 13 from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 130. The prediction vector determination unit 131 supplies the selected prediction vector to the subtraction unit 132, and supplies the reference image index and the prediction vector index, which is information indicating the selected prediction vector, to the terminal 17. If the prediction direction is bidirectional, the optimal prediction vector is selected and supplied for each of the L0 direction and the L1 direction.
 Here, to select the optimal prediction vector, a prediction error amount is calculated from the reference image supplied from the terminal 14 and the image signal supplied from the terminal 15 based on the motion vector of each prediction vector candidate. A rate-distortion evaluation value is then calculated from the code amounts of the reference image index, the difference vector, and the prediction vector index, together with the above prediction error amount, and the prediction vector candidate that minimizes the rate-distortion evaluation value is selected.
 The subtraction unit 132 calculates a difference vector by subtracting the prediction vector supplied from the prediction vector determination unit 131 from the motion vector supplied from the terminal 13, and supplies the difference vector to the terminal 17. If the prediction direction is bidirectional, difference vectors are calculated and supplied for the L0 direction and the L1 direction.
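 The selection and subtraction steps of units 131 and 132 can be sketched together as follows. `rd_cost` is a hypothetical callback standing in for the rate-distortion evaluation described above, since this excerpt deliberately leaves its exact computation open.

```python
def select_best_mvp(mv, candidates, rd_cost):
    # rd_cost(index, diff_vector) is assumed to return the rate-distortion
    # evaluation value (code amounts plus prediction error, per the text).
    best = None
    for idx, pmv in enumerate(candidates):
        diff = (mv[0] - pmv[0], mv[1] - pmv[1])  # subtraction unit 132
        cost = rd_cost(idx, diff)
        if best is None or cost < best[0]:       # keep the minimum-RD candidate
            best = (cost, idx, diff)
    return best[1], best[2]  # prediction vector index, difference vector
```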
 (Candidate block group supplied to the prediction vector candidate list generation unit 130)
 Here, the candidate block group supplied to the prediction vector candidate list generation unit 130 will be described with reference to FIG. 13 and FIG. 14. The candidate block group includes a spatial candidate block group and a temporal candidate block group.
 FIG. 13 shows the blocks adjacent to the prediction block being processed when the prediction block size is 16 pixels × 16 pixels. In Embodiment 1, the spatial candidate block group consists of the five blocks A1, C, D, B1, and E shown in FIG. 13. Although the spatial candidate block group here consists of these five blocks, it may be any set of at least one processed block adjacent to the prediction block being processed, and is not limited to these. For example, all of blocks A1, A2, A3, A4, B1, B2, B3, B4, C, D, and E may be used as spatial candidate blocks.
 Next, the temporal candidate block group will be described with reference to FIG. 14. FIG. 14 shows the blocks within the prediction block on the ColPic located at the same position as the prediction block being processed, and their neighboring blocks, when the prediction block size is 16 pixels × 16 pixels. In Embodiment 1, the temporal candidate block group consists of the two blocks H and I6 shown in FIG. 14.
 Although the temporal candidate block group here consists of the two blocks H and I6 on the ColPic, it may be any set of at least one block on a decoded image other than the image containing the prediction block being processed, and is not limited to these. For example, all of blocks I1 to I16, blocks A1 to A4, blocks B1 to B4, block C, block D, block E, blocks F1 to F4, blocks G1 to G4, and block H on the ColPic may be used as temporal candidate blocks. Hereinafter, unless otherwise noted, block A4 is written as block A and block B4 as block B, and blocks H and I6 are referred to as temporal blocks.
 (Configuration of the combined motion information determination unit 121)
 Next, a detailed configuration of the combined motion information determination unit 121 will be described with reference to FIG. 15. FIG. 15 shows the configuration of the combined motion information determination unit 121. The combined motion information determination unit 121 includes a combined motion information candidate generation unit 140 and a combined motion information selection unit 141. The combined motion information candidate generation unit 140 is also provided in the moving picture decoding apparatus 200, which decodes the code string generated by the moving picture encoding apparatus 100 according to Embodiment 1, so that the moving picture encoding apparatus 100 and the moving picture decoding apparatus 200 generate the same, consistent combined motion information list.
 The function of each unit will now be described. The combined motion information candidate generation unit 140 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12, and supplies the combined motion information candidate list to the combined motion information selection unit 141. The detailed configuration of the combined motion information candidate generation unit 140 will be described later.
 The combined motion information selection unit 141 selects the optimal combined motion information candidate from the combined motion information candidate list supplied from the combined motion information candidate generation unit 140, and supplies the merge candidate number, which is information indicating the selected combined motion information candidate, to the terminal 17.
 Here, to select the optimal combined motion information candidate, a prediction error amount is calculated from the image signal supplied from the terminal 15 and the reference image supplied from the terminal 14, the latter being obtained based on the prediction direction, motion vector, and reference image index of each combined motion information candidate. A rate-distortion evaluation value is calculated from the code amount of the merge candidate number and the prediction error amount, and the combined motion information candidate that minimizes the rate-distortion evaluation value is selected.
 (Candidate block group supplied to the combined motion information candidate generation unit 140)
 Here, the candidate block group supplied to the combined motion information candidate generation unit 140 will be described with reference to FIG. 13 and FIG. 14. The candidate block group includes a spatial candidate block group and a temporal candidate block group. In Embodiment 1, the spatial candidate block group consists of the four blocks A4, B4, C, and E shown in FIG. 13. Although the spatial candidate block group here consists of these four blocks, it may be any set of at least one processed block adjacent to the prediction block being processed, and is not limited to these.
 Next, the temporal candidate block group will be described with reference to FIG. 14. In Embodiment 1, the temporal candidate block group consists of the two blocks H and I6 shown in FIG. 14. Although the temporal candidate block group here is the same as that supplied to the prediction vector candidate list generation unit 130, it may be any set of zero or more blocks on a decoded image other than the image containing the prediction block being processed, and is not limited to these.
 (Configuration of the combined motion information candidate generation unit 140)
 Next, a detailed configuration of the combined motion information candidate generation unit 140, which characterizes Embodiment 1, will be described with reference to FIG. 16. FIG. 16 shows the configuration of the combined motion information candidate generation unit 140. The terminal 18 is connected to the combined motion information selection unit 141. The combined motion information candidate generation unit 140 includes a unidirectional combined motion information candidate list generation unit 150, a first combined motion information candidate list reduction unit 151, a bidirectional combined motion information candidate list generation unit 152, and a second combined motion information candidate list reduction unit 153.
 The function of each unit will now be described. The unidirectional combined motion information candidate list generation unit 150 generates a first combined motion information candidate list from the candidate block group supplied from the terminal 12, and supplies the first combined motion information candidate list to the first combined motion information candidate list reduction unit 151.
 When a plurality of combined motion information candidates in the first combined motion information candidate list supplied from the unidirectional combined motion information candidate list generation unit 150 have identical motion information, the first combined motion information candidate list reduction unit 151 keeps one such combined motion information candidate and deletes the others, thereby generating a second combined motion information candidate list, and supplies the second combined motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
 The bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate list from the second combined motion information candidate list supplied from the first combined motion information candidate list reduction unit 151, combines the bidirectional combined motion information candidate list with the above second combined motion information candidate list to generate a third combined motion information candidate list, and supplies the third combined motion information candidate list to the second combined motion information candidate list reduction unit 153. The detailed configuration of the bidirectional combined motion information candidate list generation unit 152 will be described later.
 In Embodiment 1, the bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate (BD0) whose reference direction is L0 and a bidirectional combined motion information candidate (BD1) whose reference direction is L1. Therefore, the above bidirectional combined motion information candidate list may contain BD0 and BD1.
 When a plurality of combined motion information candidates in the third combined motion information candidate list supplied from the bidirectional combined motion information candidate list generation unit 152 have identical motion information, the second combined motion information candidate list reduction unit 153 keeps one such combined motion information candidate and deletes the others, thereby generating the combined motion information candidate list, and supplies the combined motion information candidate list to the terminal 18.
 Here, a unidirectional combined motion information candidate is the motion information candidate of a candidate block used in the so-called merge technique, that is, motion information obtained from a single candidate block. In contrast, bidirectional combined motion information, a technique characterizing Embodiment 1, is motion information obtained by using two pieces of motion information from two candidate blocks. In this embodiment, the two pieces of motion information are one for the L0 direction and one for the L1 direction.
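 The three-stage pipeline of units 151 to 153 (de-duplicate, append bidirectional candidates, de-duplicate again) can be sketched as follows. The candidate record and `make_bidirectional` callback are illustrative stand-ins, not the patent's structures.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MergeCand:
    mv_l0: tuple
    ref_l0: int
    mv_l1: tuple
    ref_l1: int

def dedupe(cands):
    # Keep the first of any candidates carrying identical motion information
    seen, out = set(), []
    for c in cands:
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out

def build_merge_list(uni_cands, make_bidirectional):
    second = dedupe(uni_cands)                    # reduction unit 151
    third = second + make_bidirectional(second)   # generation unit 152
    return dedupe(third)                          # reduction unit 153
```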
 (Bidirectional combined motion information candidate list generation unit 152)
 Next, a detailed configuration of the bidirectional combined motion information candidate list generation unit 152 will be described with reference to FIG. 17. FIG. 17 shows the configuration of the bidirectional combined motion information candidate list generation unit 152. The terminal 19 is connected to the first combined motion information candidate list reduction unit 151, and the terminal 20 to the second combined motion information candidate list reduction unit 153. The bidirectional combined motion information candidate list generation unit 152 includes a reference direction determination unit 160, a reference direction motion information determination unit 161, a reverse direction motion information determination unit 162, and a bidirectional motion information determination unit 163.
 The function of each unit will now be described. The reference direction determination unit 160 determines the reference direction of the bidirectional combined motion information candidate from the second combined motion information candidate list, and sends the reference direction and the second combined motion information candidate list supplied from the terminal 19 to the reference direction motion information determination unit 161. The reference direction is the L0 direction for the bidirectional combined motion information candidate (BD0) whose reference direction is L0, and the L1 direction for the bidirectional combined motion information candidate (BD1) whose reference direction is L1.
 The reference direction motion information determination unit 161 determines the reference-direction motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction and the second combined motion information candidate list supplied from the reference direction determination unit 160, and sends the reference direction, the reference-direction motion vector and reference image index, and the second combined motion information candidate list to the reverse direction motion information determination unit 162.
 The reverse direction motion information determination unit 162 determines the reverse-direction motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction, the reference-direction motion vector and reference image index, and the second combined motion information candidate list supplied from the reference direction motion information determination unit 161. The reverse direction motion information determination unit 162 sends the reference-direction motion vector and reference image index, the reverse-direction motion vector and reference image index, and the second combined motion information candidate list to the bidirectional motion information determination unit 163. In Embodiment 1, if the reference direction is the L0 direction, the reverse direction is the L1 direction, and if the reference direction is the L1 direction, the reverse direction is the L0 direction.
 The bidirectional motion information determination unit 163 determines a bidirectional combined motion information candidate from the reference-direction motion vector and reference image index and the reverse-direction motion vector and reference image index supplied from the reverse direction motion information determination unit 162. The bidirectional motion information determination unit 163 also generates a third combined motion information candidate list from the second combined motion information candidate list, and sends the third combined motion information candidate list to the terminal 20.
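 A sketch of how one bidirectional candidate (BD0 or BD1) might be assembled from base- and reverse-direction motion information follows. This excerpt does not specify which candidate in the list supplies each direction, so "first candidate valid in that direction" is purely an illustrative assumption, as are the names.

```python
from dataclasses import dataclass

@dataclass
class BiCand:
    mv: tuple    # (L0 motion vector, L1 motion vector)
    ref: tuple   # (L0 reference index, L1 reference index); -1 = invalid

def make_bd_candidate(second_list, base_dir):
    # base_dir: 0 for BD0 (reference direction L0), 1 for BD1 (L1).
    rev_dir = 1 - base_dir  # L0 <-> L1, as stated for Embodiment 1
    base = next((c for c in second_list if c.ref[base_dir] >= 0), None)
    rev = next((c for c in second_list if c.ref[rev_dir] >= 0), None)
    if base is None or rev is None:  # no valid source for one direction
        return None
    mv, ref = [None, None], [None, None]
    mv[base_dir], ref[base_dir] = base.mv[base_dir], base.ref[base_dir]
    mv[rev_dir], ref[rev_dir] = rev.mv[rev_dir], rev.ref[rev_dir]
    return BiCand(tuple(mv), tuple(ref))
```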
 (Candidate number management table)
 Here, the candidate number management table, which indicates the relationship between merge candidate numbers and combined motion information candidates used in Embodiment 1, will be described with reference to FIG. 18. Merge candidate numbers 0 to 6 respectively indicate the combined motion information candidate of block A (A), the combined motion information candidate of block B (B), the combined motion information candidate of the temporal block (COL), the combined motion information candidate of block C (C), the combined motion information candidate of block E (E), the bidirectional combined motion information candidate whose reference direction is L0 (BD0), and the bidirectional combined motion information candidate whose reference direction is L1 (BD1), all included in the combined motion information candidate list. The maximum number of combined motion information candidates included in the combined motion information candidate list is 7 (the maximum merge index is 6). As described above, the merge candidate numbers of the bidirectional combined motion information candidates BD0 and BD1 are here assigned so as to be larger than those of the unidirectional combined motion information candidates. Although FIG. 18 shows the candidate number management table used in Embodiment 1, the assignment is not limited to this as long as smaller merge candidate numbers are assigned to combined motion information candidates with higher selection rates.
 Here, the candidate number management table and the maximum number of combined motion information candidates included in the combined motion information candidate list are assumed to be shared within the moving picture encoding apparatus 100 and are not illustrated. The conversion from merge candidate numbers to merge indices will now be described with reference to FIGS. 19(a) and 19(b).
FIG. 19(a) shows that, when the combined motion information candidates of block A, block B, the temporal block, block C and block E, the bidirectional combined motion information candidate whose reference direction is L0, and the bidirectional combined motion information candidate whose reference direction is L1 are all valid, each merge candidate number becomes the merge index as it is.
FIG. 19(b) shows the case where the combined motion information candidates include invalid blocks: the invalid merge candidate numbers are removed, and merge indexes are then assigned in ascending order of merge candidate number. As in FIG. 19(b), when the combined motion information candidates of block B (merge candidate number 1) and block E (merge candidate number 4) are invalid, merge index 0 is mapped to merge candidate number 0, merge index 1 to merge candidate number 2, merge index 2 to merge candidate number 3, merge index 3 to merge candidate number 5, and merge index 4 to merge candidate number 6. As above, the merge indexes of the bidirectional combined motion information candidate whose reference direction is L0 (BD0) and the bidirectional combined motion information candidate whose reference direction is L1 (BD1) are thus assigned so as to be larger than those of the unidirectional combined motion information candidates.
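The conversion of FIG. 19(b) can be sketched as follows. This is a minimal illustration using the example above, not a normative implementation; the candidate names and validity flags are taken from the description:

```python
# Candidate number management table of FIG. 18: merge candidate
# numbers 0..6 in order (A, B, COL, C, E, BD0, BD1).
CANDIDATES = ["A", "B", "COL", "C", "E", "BD0", "BD1"]

def merge_index_to_candidate_number(valid):
    """Skip invalid merge candidate numbers and assign merge
    indexes in ascending order of the remaining numbers."""
    return [num for num, ok in enumerate(valid) if ok]

# Example of FIG. 19(b): block B (number 1) and block E (number 4)
# are invalid, so merge indexes 0..4 map to numbers 0, 2, 3, 5, 6.
table = merge_index_to_candidate_number(
    [True, False, True, True, False, True, True])
print(table)  # -> [0, 2, 3, 5, 6]
```

When all seven candidates are valid, the mapping is the identity, as in FIG. 19(a).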
In the moving picture decoding apparatus 200, which decodes the code string generated by the moving picture coding apparatus 100 according to Embodiment 1, the inverse conversion from merge index to merge candidate number is performed, so that the moving picture coding apparatus 100 and the moving picture decoding apparatus 200 generate the same, mutually consistent candidate number management table.
(Operation of moving picture coding apparatus 100)
Next, the encoding operation of the moving picture coding apparatus 100 according to Embodiment 1 will be described with reference to the flowchart of FIG. 20. The prediction block image acquisition unit 101 acquires the image signal of the prediction block to be processed from the image signal supplied from the terminal 10, based on the position information and the size of the prediction block (S100).
The motion vector detection unit 108 detects a motion vector and a reference image index indicating a reference image from the image signal supplied from the prediction block image acquisition unit 101 and image signals corresponding to a plurality of reference images (S101).
The motion information generation unit 109 generates a merge candidate number, or a difference vector and a prediction vector index, from the motion vector and reference image index supplied from the motion vector detection unit 108 and the candidate block group supplied from the motion information memory 111 (S102).
The motion compensation unit 106 generates a prediction signal by motion-compensating the reference image in the frame memory 110 indicated by the reference image index, based on the motion vector supplied from the motion vector detection unit 108. If the prediction direction is bidirectional, the average of the L0-direction and L1-direction prediction signals is generated as the prediction signal (S103).
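The bidirectional averaging of S103 can be sketched as follows over hypothetical one-dimensional sample arrays; the exact rounding rule is not specified in the text, so round-to-nearest is an assumption here:

```python
def bi_predict(pred_l0, pred_l1):
    """Average the L0-direction and L1-direction prediction
    signals sample by sample (round-to-nearest is assumed)."""
    return [(a + b + 1) >> 1 for a, b in zip(pred_l0, pred_l1)]

print(bi_predict([100, 102, 104], [104, 102, 100]))  # -> [102, 102, 102]
```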
The subtraction unit 102 calculates the difference between the image signal supplied from the prediction block image acquisition unit 101 and the prediction signal supplied from the motion compensation unit 106 to obtain a prediction error signal (S104). The prediction error encoding unit 103 applies processing such as quantization and orthogonal transformation to the prediction error signal supplied from the subtraction unit 102 to generate prediction error encoded data (S105).
The code string generation unit 104 entropy-codes the prediction error encoded data supplied from the prediction error encoding unit 103, together with the merge flag, merge candidate number, reference image index, difference vector and prediction vector index supplied from the motion information generation unit 109 and the prediction direction, according to the syntax, to generate a code string (S106).
The addition unit 107 adds the prediction error signal supplied from the prediction error decoding unit 105 and the prediction signal supplied from the motion compensation unit 106 to generate a decoded image signal (S107). The frame memory 110 stores the decoded image signal supplied from the addition unit 107 (S108). The motion information memory 111 stores the motion vector supplied from the motion vector detection unit 108 for one picture, in units of the minimum prediction block size (S109).
Next, the operation of the motion information generation unit 109 will be described with reference to the flowchart of FIG. 21. The difference vector calculation unit 120 determines a prediction vector index and calculates a difference vector and a rate-distortion evaluation value from the candidate block group supplied from the terminal 12, the motion vector and reference image index supplied from the terminal 13, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15 (S110).
The combined motion information determination unit 121 determines a merge candidate number and calculates a rate-distortion evaluation value from the candidate block group supplied from the terminal 12, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15 (S111).
The predictive coding mode determination unit 122 compares the rate-distortion evaluation value supplied from the difference vector calculation unit 120 with the rate-distortion evaluation value supplied from the combined motion information determination unit 121; if the former is smaller than the latter, it sets the merge flag to "0", and otherwise sets the merge flag to "1" (S112).
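The decision of S112 amounts to a two-way rate-distortion comparison; a minimal sketch (the evaluation values themselves are computed elsewhere, and note that a tie selects the merge mode):

```python
def decide_merge_flag(rd_diff_vector, rd_merge):
    """S112: merge flag is 0 when the difference-vector mode has
    the strictly smaller rate-distortion evaluation value,
    otherwise 1 (ties favor the merge mode)."""
    return 0 if rd_diff_vector < rd_merge else 1

print(decide_merge_flag(10.0, 12.5))  # -> 0
print(decide_merge_flag(12.5, 12.5))  # -> 1
```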
Next, the operation of the difference vector calculation unit 120 will be described with reference to the flowchart of FIG. 22. The prediction vector candidate list generation unit 130 generates a prediction vector candidate list from the candidate block group supplied from the terminal 12, excluding candidate blocks that are outside the region, candidate blocks in intra mode, and candidate blocks having duplicate motion vectors. If the prediction direction is bidirectional, prediction vector candidate lists are generated for the L0 direction and the L1 direction (S120).
The prediction vector determination unit 131 selects, from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 130, the prediction vector optimal for the motion vector supplied from the terminal 13. If the prediction direction is bidirectional, an optimal prediction vector is selected for each of the L0 direction and the L1 direction (S121). The subtraction unit 132 subtracts the prediction vector supplied from the prediction vector determination unit 131 from the motion vector supplied from the terminal 13 to calculate a difference vector. If the prediction direction is bidirectional, a difference vector is calculated for each of the L0 direction and the L1 direction (S122).
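Steps S121 and S122 can be sketched as follows. The cost used to pick the "optimal" predictor is an assumption for illustration (here, the sum of absolute difference-vector components as a proxy for code length):

```python
def select_prediction_vector(candidates, mv):
    """S121: pick the candidate minimizing an illustrative cost,
    here the L1 norm of the resulting difference vector."""
    return min(candidates,
               key=lambda p: abs(mv[0] - p[0]) + abs(mv[1] - p[1]))

def difference_vector(mv, pred):
    """S122: subtract the prediction vector from the motion
    vector, component by component."""
    return (mv[0] - pred[0], mv[1] - pred[1])

pred = select_prediction_vector([(0, 0), (3, 1), (8, 8)], (4, 2))
print(pred, difference_vector((4, 2), pred))  # -> (3, 1) (1, 1)
```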
(Operation of combined motion information determination unit 121)
Next, the operation of the combined motion information determination unit 121 will be described in detail with reference to the flowchart of FIG. 23. The combined motion information candidate generation unit 140 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12 (S130). The combined motion information selection unit 141 determines, from the combined motion information candidate list supplied from the combined motion information candidate generation unit 140, the combined motion information optimal for the motion vector and reference image index supplied from the terminal 13 and the prediction direction (S131).
(Operation of combined motion information candidate generation unit 140)
Next, the operation of the combined motion information candidate generation unit 140 will be described in detail with reference to the flowchart of FIG. 24. The unidirectional combined motion information candidate list generation unit 150 generates a spatial combined motion information candidate list from the spatial candidate block group supplied from the terminal 12, excluding candidate blocks that are outside the region and candidate blocks in intra mode (S140). The detailed operation of generating the spatial combined motion information candidate list will be described later.
The unidirectional combined motion information candidate list generation unit 150 generates a temporal combined motion information candidate list from the temporal candidate block group supplied from the terminal 12, excluding candidate blocks that are outside the region and candidate blocks in intra mode (S141). The detailed operation of generating the temporal combined motion information candidate list will be described later.
The unidirectional combined motion information candidate list generation unit 150 combines the spatial combined motion information candidate list and the temporal combined motion information candidate list in merge candidate number order to generate a first combined motion information candidate list (S142).
If the first combined motion information candidate list supplied from the unidirectional combined motion information candidate list generation unit 150 contains a plurality of combined motion information candidates having duplicate motion information, the first combined motion information candidate list reduction unit 151 keeps one such candidate and deletes the others to generate a second combined motion information candidate list (S143).
The bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate list from the second combined motion information candidate list supplied from the first combined motion information candidate list reduction unit 151 (S144). The detailed operation of generating the bidirectional combined motion information candidate list will be described later.
The bidirectional combined motion information candidate list generation unit 152 combines the second combined motion information candidate list and the bidirectional combined motion information candidate list in merge candidate number order to generate a third combined motion information candidate list (S145).
If the third combined motion information candidate list supplied from the bidirectional combined motion information candidate list generation unit 152 contains a plurality of combined motion information candidates having duplicate motion information, the second combined motion information candidate list reduction unit 153 keeps one such candidate and deletes the others to generate the combined motion information candidate list (S146).
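The list construction of S142 and S143 can be sketched end to end. Motion information is represented here as (prediction direction, motion data) tuples, which is an illustrative encoding rather than the normative one:

```python
def dedup_keep_first(candidates):
    """S143/S146: when several candidates carry identical motion
    information, keep only the first and delete the rest."""
    seen, out = [], []
    for c in candidates:
        if c not in seen:
            seen.append(c)
            out.append(c)
    return out

spatial = [("L0", (1, 0)), ("BI", (2, 1)), ("L0", (1, 0))]
temporal = [("BI", (0, 0))]
first_list = spatial + temporal              # S142: merge candidate number order
second_list = dedup_keep_first(first_list)   # S143: remove duplicates
print(second_list)  # -> [('L0', (1, 0)), ('BI', (2, 1)), ('BI', (0, 0))]
```

The same deduplication is applied again in S146 after the bidirectional candidates are appended.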
Next, the detailed operation of generating the spatial combined motion information candidate list will be described with reference to the flowchart of FIG. 25. In Embodiment 1, the spatial combined motion information candidate list is assumed to contain the motion information of at most four candidate blocks.
The following processing is repeated for block A, block B, block C and block E, the four candidate blocks included in the spatial candidate block group (S150 to S153). The validity of each candidate block is checked (S151). A candidate block is valid when it is not outside the region and is not in intra mode. If the candidate block is valid (YES in S151), its motion information is added to the spatial combined motion information candidate list (S152). If the candidate block is not valid (NO in S151), step S152 is skipped.
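The scan of S150 to S153 can be sketched as follows. This is a minimal illustration; candidate blocks are modeled as dictionaries with hypothetical `outside`, `intra` and `motion` fields:

```python
def spatial_candidate_list(blocks):
    """S150-S153: scan blocks A, B, C, E in order and keep the
    motion information of each valid block (inside the region
    and not intra-coded)."""
    out = []
    for b in blocks:
        if not b["outside"] and not b["intra"]:   # S151 validity check
            out.append(b["motion"])               # S152
    return out

blocks = [
    {"outside": False, "intra": False, "motion": "A"},
    {"outside": False, "intra": True,  "motion": "B"},  # intra: skipped
    {"outside": True,  "intra": False, "motion": "C"},  # outside: skipped
    {"outside": False, "intra": False, "motion": "E"},
]
print(spatial_candidate_list(blocks))  # -> ['A', 'E']
```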
In Embodiment 1, the spatial combined motion information candidate list contains the motion information of at most four candidate blocks; however, the number of entries in the spatial combined motion information candidate list only needs to vary with the validity of the candidate blocks, and is not limited to this.
Next, the detailed operation of generating the temporal combined motion information candidate list will be described with reference to the flowchart of FIG. 26. In Embodiment 1, the temporal combined motion information candidate list is assumed to contain the motion information of at most one candidate block.
The following processing is repeated for the temporal blocks, the two candidate blocks included in the temporal candidate block group (S160 to S166). The validity of each candidate block is checked (S161). A candidate block is valid when it is not outside the region and is not in intra mode. If the candidate block is valid (YES in S161), a temporal combined motion information candidate is generated and added to the list (steps S162 to S165), and the processing ends. If the candidate block is not valid (NO in S161), the next candidate block is checked (S166).
If the candidate block is valid, the prediction direction of the temporal combined motion information candidate is determined (S162). In Embodiment 1, the prediction direction of the combined motion information candidate is bidirectional. Next, the L0-direction and L1-direction reference images of the temporal combined motion information candidate are determined (S163). In Embodiment 1, the L0-direction reference image is the one closest in distance to the processing target image among the L0-direction reference images, and the L1-direction reference image is the one closest in distance to the processing target image among the L1-direction reference images. The reference images are chosen in this way here, but any method that can determine an L0-direction reference image and an L1-direction reference image will do, and the choice is not limited to this. For example, the L0-direction and L1-direction reference images may be encoded in the encoded stream; the L0-direction and L1-direction reference image indexes may be set to 0; or, among the L0-direction and L1-direction reference images used by the blocks adjacent to the processing target block, the most frequently used ones may be taken as the reference images referenced in the L0 direction and the L1 direction, respectively.
Next, the motion vector of the temporal combined motion information candidate is calculated (S164). In this embodiment, the bidirectional motion information of the temporal combined motion information candidate is calculated based on the reference image ColRefPic and the motion vector mvCol of a valid prediction direction in the motion information of the candidate block. When the prediction direction of the candidate block is unidirectional (the L0 direction or the L1 direction), the reference image and motion vector of that prediction direction are selected as the basis. When the prediction direction of the candidate block is bidirectional, the reference image and motion vector of either the L0 direction or the L1 direction are selected as the basis. For example, the reference image and motion vector present in the same temporal direction as ColPic may be selected; of the candidate block's L0-direction and L1-direction reference images, the one with the shorter inter-picture distance to ColPic may be selected; or the one of the candidate block's L0-direction and L1-direction motion vectors that crosses the processing target image may be selected. Once the reference image and motion vector serving as the basis for generating the bidirectional motion information have been selected, the motion vector of the temporal combined motion information candidate is calculated.
Here, the temporal combined motion information candidate is generated as described above; however, any method that can determine bidirectional motion information using the motion information of another encoded image will do, and the method is not limited to this. For example, as is done in direct motion compensation, motion vectors scaled according to the distance between the reference image in each direction and the processing target image may be used as the bidirectional motion vectors. If the candidate block is invalid (NO in S161), the next candidate block is checked (S166).
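The distance-based scaling mentioned above can be sketched as follows. Integer picture distances (e.g. picture order count differences) and simple truncating division are assumptions made for illustration, not the normative arithmetic:

```python
def scale_motion_vector(mv_col, dist_col, dist_cur):
    """Scale the collocated vector mvCol by the ratio of the
    current picture-to-reference distance (dist_cur) to the
    collocated picture-to-reference distance (dist_col)."""
    return (mv_col[0] * dist_cur // dist_col,
            mv_col[1] * dist_cur // dist_col)

# Collocated vector (8, -4) spanning a distance of 4 pictures,
# rescaled to a distance of 2 pictures:
print(scale_motion_vector((8, -4), 4, 2))  # -> (4, -2)
```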
Here, the temporal combined motion information candidate list contains the motion information of at most one candidate block; however, the number of entries in the temporal combined motion information candidate list only needs to vary with the validity of the candidate blocks, and is not limited to this. The methods for determining the prediction direction, the reference images and the motion vectors are likewise not limited to these.
(Generation of the bidirectional combined motion information candidate list)
Next, the detailed operation of generating the bidirectional combined motion information candidate list will be described with reference to the flowchart of FIG. 27. The bidirectional combined motion information candidate list is assumed to be empty. The reference direction determination unit 160 determines the reference direction of the bidirectional combined motion information candidate from the second combined motion information candidate list (S170). For the bidirectional combined motion information candidate whose reference direction is L0 (BD0), the reference direction is the L0 direction; for the bidirectional combined motion information candidate whose reference direction is L1 (BD1), the reference direction is the L1 direction.
The reference direction motion information determination unit 161 determines the reference-direction motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction supplied from the reference direction determination unit 160 and the second combined motion information candidate list (S171). The detailed operation of the reference direction motion information determination unit 161 will be described later.
The reverse direction motion information determination unit 162 determines the reverse-direction motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction, reference-direction motion vector and reference image index supplied from the reference direction motion information determination unit 161, and the second combined motion information candidate list (S172). The detailed operation of the reverse direction motion information determination unit 162 will be described later.
The bidirectional motion information determination unit 163 determines the prediction direction of the bidirectional combined motion information candidate from the reference direction, the reference-direction motion vector and reference image index, and the reverse-direction motion vector and reference image index supplied from the reverse direction motion information determination unit 162 (S173). The detailed operation of determining the prediction direction of the bidirectional combined motion information candidate will be described later.
The bidirectional motion information determination unit 163 checks the validity of the prediction direction of the bidirectional combined motion information candidate (S174). If the prediction direction of the bidirectional combined motion information candidate is valid (YES in S174), the bidirectional motion information determination unit 163 adds the bidirectional combined motion information candidate to the bidirectional combined motion information candidate list (S175). If the prediction direction of the bidirectional combined motion information candidate is invalid (NO in S174), step S175 is skipped.
Next, the detailed operation of the reference direction motion information determination unit 161 will be described with reference to the flowchart of FIG. 28. Assume that the LX direction (X is 0 or 1) has been selected as the reference direction of the bidirectional combined motion information candidate. The validity of LX, the reference direction, is set to "0" (S190). The following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S191 to S194). The validity of the LX direction of the combined motion information candidate is checked (S192). If the LX direction of the combined motion information candidate is valid (YES in S192), the validity of LX, the reference direction, is set to "1", the LX-direction motion vector and reference index of the combined motion information candidate are taken as the reference-direction motion vector and reference index, and the processing ends (S193). If the LX direction of the combined motion information candidate is invalid (NO in S192), the next candidate is checked (S194).
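The scan of S190 to S194 can be sketched as follows. Each candidate is modeled here as a dictionary mapping the direction (0 for L0, 1 for L1) to either a (motion vector, reference index) pair or None, which is an illustrative representation:

```python
def reference_direction_motion(candidates, x):
    """S190-S194: return the index of the first candidate whose
    LX direction is valid, together with its motion vector and
    reference index; (None, None) when no candidate qualifies."""
    for i, cand in enumerate(candidates):
        if cand[x] is not None:      # S192: LX direction is valid
            return i, cand[x]        # S193: validity becomes 1
    return None, None                # validity stays 0

cands = [{0: None, 1: ((5, 5), 0)},
         {0: ((2, 0), 1), 1: None}]
print(reference_direction_motion(cands, 0))  # -> (1, ((2, 0), 1))
```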
Here, up to the number of combined motion information candidates (NCands) included in the second combined motion information candidate list are checked; however, any method that can determine the reference-direction motion information of the bidirectional combined motion information candidate will do, and the method is not limited to this. For example, when bidirectional combined motion information candidates are generated only from combined motion information candidates with high selection rates, the number of checks may be fixed at a predetermined number such as 2 or 3, which reduces the amount of processing and also reduces the code amount of the merge index by lowering the likelihood of generating redundant bidirectional combined motion information candidates.
Next, the detailed operation of the reverse direction motion information determination unit 162 will be described with reference to the flowchart of FIG. 29. The direction opposite to the reference direction is set as the reverse direction of the bidirectional combined motion information candidate. Assume that the LY direction (Y is 0 or 1) has been selected as the reverse direction. The validity of LY, the reverse direction, is set to "0" (S200). The following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S201 to S205).
It is checked that the candidate is not the combined motion information candidate selected for the reference direction (S202). If it is not the candidate selected for the reference direction (YES in S202), the validity of the LY direction of the combined motion information candidate is checked (S203). If the LY direction of the combined motion information candidate is valid (YES in S203), the validity of LY, the reverse direction, is set to "1", the LY-direction motion vector and reference index of the combined motion information candidate are taken as the reverse-direction motion vector and reference index, and the processing ends (S204). If the candidate is the one selected for the reference direction (NO in S202), or if the LY direction of the combined motion information candidate is invalid (NO in S203), the next candidate is checked (S205).
 Here, all of the combined motion information candidates (NCands) included in the second combined motion information candidate list are examined, but it suffices that the reverse-direction motion information of the bidirectional combined motion information candidate can be determined, and the method is not limited to this. For example, when bidirectional combined motion information candidates are generated only from combined motion information candidates with a high selection rate, the number of examinations may be fixed to a predetermined number such as 2 or 3; this reduces the amount of processing and, by lowering the possibility of generating redundant bidirectional combined motion information candidates, also reduces the code amount of the merge index. In addition, by starting the examination from the combined motion information candidate following the one selected for the reference direction, the possibility that BD0 and BD1 become identical is eliminated and step S202 can be omitted.
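 The reverse-direction search of FIG. 29 (steps S200 to S205) can be sketched as follows. The candidate representation (dictionaries keyed by list direction 0/1 with `valid`, `mv`, and `ref_idx` fields) and the optional `max_checks` cap of the variation described above are illustrative assumptions, not part of the specification.

```python
def determine_reverse_direction(cand_list, base_index, ly, max_checks=None):
    """Scan the second combined motion information candidate list for the first
    candidate, other than the one chosen for the reference direction, whose LY
    direction is valid, and reuse its LY motion vector / reference index.
    Returns (validity, motion_vector, reference_index)."""
    n = len(cand_list) if max_checks is None else min(max_checks, len(cand_list))
    for i in range(n):               # S201-S205: loop over candidates
        if i == base_index:          # S202: skip the reference-direction candidate
            continue
        cand = cand_list[i]
        if cand["valid"][ly]:        # S203: LY direction of this candidate is valid
            # S204: validity becomes 1; adopt this candidate's LY data
            return True, cand["mv"][ly], cand["ref_idx"][ly]
    return False, None, None         # validity of LY stays 0
```

With `max_checks=2` or `3`, the sketch matches the fixed-examination-count variation mentioned above.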
 Next, the detailed operation of determining the prediction direction of a bidirectional combined motion information candidate will be described with reference to the table of FIG. 30. If both the LX and LY directions are valid, the prediction direction is bidirectional (BI); if only the LX direction is valid, the prediction direction is the unidirectional LX direction; if only the LY direction is valid, the prediction direction is the unidirectional LY direction; and if neither the LX nor the LY direction is valid, the prediction direction is invalid. That is, when both the LX and LY directions are valid, the combined motion information candidate having the LX-direction motion information and a different combined motion information candidate having the LY-direction motion information are combined to generate a new bidirectional combined motion information candidate. When only the LX direction is valid, if the prediction direction of the combined motion information candidate providing that valid LX prediction is bi-predictive, the prediction direction of the resulting candidate is converted to uni-prediction. Similarly, when only the LY direction is valid, if the prediction direction of the combined motion information candidate providing that valid LY prediction is bi-predictive, the prediction direction of the resulting candidate is converted to uni-prediction.
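 The mapping of FIG. 30 can be sketched as a small truth table; the string labels returned here are illustrative, not terms from the specification.

```python
def decide_prediction_direction(lx_valid, ly_valid):
    """FIG. 30: both directions valid -> bidirectional (BI); exactly one valid
    -> that unidirectional prediction; neither valid -> candidate is invalid."""
    if lx_valid and ly_valid:
        return "BI"
    if lx_valid:
        return "LX"
    if ly_valid:
        return "LY"
    return "INVALID"
```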
 Here, the prediction direction of the bidirectional combined motion information candidate is determined as in FIG. 30, but it suffices that the prediction direction can be determined, and the method is not limited to this. FIGS. 31(a) to 31(c) show extended examples of determining the prediction direction of the bidirectional combined motion information candidate. For example, as in FIG. 31(a), the prediction direction may be invalidated when at least one of the LX and LY directions is invalid, or the prediction direction may be forced to be bidirectional as in FIG. 31(b) or FIG. 31(c).
 In general, when the accuracy of the motion vectors is relatively high, bidirectional prediction yields higher prediction efficiency than unidirectional prediction. Therefore, in FIG. 31(a), when the LX and LY directions are not both valid, the prediction direction of the bidirectional combined motion information candidate is invalidated; reducing the number of combined motion information candidates in this way reduces the code amount of the merge index. Here, for example, adaptive processing is also possible in which the prediction direction of the bidirectional combined motion information candidate is invalidated when a bi-predictive candidate exists among the unidirectional combined motion information candidates.
 In FIG. 31(b), the motion vector of the invalid prediction direction is set to (0, 0) and its reference index to "0". In this way, the bidirectional combined motion information candidate can be forced to be bidirectional, using the reference picture at the shortest distance as the prediction signal. This works because reference index "0" generally indicates the reference picture closest to the picture being processed, and the prediction signal from the shortest-distance reference picture has the highest reliability.
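 The forced-bidirectional variant of FIG. 31(b) can be sketched as follows; the per-direction dictionaries are an illustrative data layout, not the specified one.

```python
def force_bidirectional(valid, mv, ref_idx):
    """FIG. 31(b) variant: any invalid direction is filled with motion vector
    (0, 0) and reference index 0 (generally the reference picture closest to
    the current picture), so the candidate always ends up bi-predictive.
    `valid`, `mv`, `ref_idx` map list direction (0 for L0, 1 for L1) to data."""
    for d in (0, 1):
        if not valid[d]:
            mv[d] = (0, 0)      # zero motion toward the nearest reference
            ref_idx[d] = 0      # shortest-distance reference picture
            valid[d] = True
    return valid, mv, ref_idx
```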
 (Configuration of moving picture decoding apparatus 200)
 Next, the moving picture decoding apparatus according to Embodiment 1 will be described. FIG. 32 shows the moving picture decoding apparatus 200 of Embodiment 1. The moving picture decoding apparatus 200 is a device that decodes the code string encoded by the moving picture encoding apparatus 100 and generates a reproduced picture.
 The moving picture decoding apparatus 200 is realized by hardware such as an information processing apparatus including a CPU (Central Processing Unit), a frame memory, and a hard disk. By the operation of these components, the moving picture decoding apparatus 200 realizes the functional components described below. The position information and prediction block size of the prediction block to be decoded are assumed to be shared within the moving picture decoding apparatus 200 and are not illustrated. Likewise, the candidate number management table and the maximum number of combined motion information candidates included in the combined motion information candidate list are assumed to be shared within the moving picture decoding apparatus 200 and are not illustrated.
 The moving picture decoding apparatus 200 of Embodiment 1 includes a code string analysis unit 201, a prediction error decoding unit 202, an addition unit 203, a motion information reproduction unit 204, a motion compensation unit 205, a frame memory 206, and a motion information memory 207.
 (Functions of moving picture decoding apparatus 200)
 The functions of each unit are described below. The code string analysis unit 201 decodes the code string supplied from terminal 30 into the prediction error encoded data, the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference picture index, the difference vector, and the prediction vector index, according to the syntax. It then supplies the prediction error encoded data to the prediction error decoding unit 202, and supplies the merge flag, merge candidate number, prediction direction of motion compensation prediction, reference picture index, difference vector, and prediction vector index to the motion information reproduction unit 204. The merge candidate number is obtained by conversion from the merge index.
 The prediction error decoding unit 202 generates a prediction error signal by performing processing such as inverse quantization and inverse orthogonal transform on the prediction error encoded data supplied from the code string analysis unit 201, and supplies the prediction error signal to the addition unit 203.
 The addition unit 203 adds the prediction error signal supplied from the prediction error decoding unit 202 and the prediction signal supplied from the motion compensation unit 205 to generate a decoded picture signal, and supplies the decoded picture signal to the frame memory 206 and terminal 31.
 The motion information reproduction unit 204 reproduces motion information from the merge flag, merge candidate number, prediction direction of motion compensation prediction, reference picture index, difference vector, and prediction vector index supplied from the code string analysis unit 201, together with the candidate block group supplied from the motion information memory 207, and supplies the motion information to the motion compensation unit 205. The detailed configuration of the motion information reproduction unit 204 will be described later.
 Based on the motion information supplied from the motion information reproduction unit 204, the motion compensation unit 205 performs motion compensation on the reference picture indicated by the reference picture index in the frame memory 206 according to the motion vector, and generates a prediction signal. If the prediction direction is bidirectional, the average of the L0-direction and L1-direction prediction signals is generated as the prediction signal, and the prediction signal is supplied to the addition unit 203.
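 The bidirectional case above averages the two motion compensated signals sample by sample. A minimal sketch follows; the round-half-up integer rounding is an assumption for illustration (the text only states that the two signals are averaged, and a real codec specifies the exact rounding and bit depth).

```python
def bipred_average(pred_l0, pred_l1):
    """Average the L0 and L1 prediction signals sample by sample to form the
    bidirectional prediction signal. Inputs are flat lists of sample values."""
    # (a + b + 1) >> 1 rounds the average to the nearest integer (assumed).
    return [(a + b + 1) >> 1 for a, b in zip(pred_l0, pred_l1)]
```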
 The frame memory 206 and the motion information memory 207 have the same functions as the frame memory 110 and the motion information memory 111 of the moving picture encoding apparatus 100.
 (Detailed configuration of motion information reproduction unit 204)
 Next, the detailed configuration of the motion information reproduction unit 204, which characterizes Embodiment 1, will be described with reference to FIG. 33. FIG. 33 shows the configuration of the motion information reproduction unit 204. The motion information reproduction unit 204 includes an encoding mode determination unit 210, a motion vector reproduction unit 211, and a combined motion information reproduction unit 212. Terminal 32 is connected to the code string analysis unit 201, terminal 33 to the motion information memory 207, and terminal 34 to the motion compensation unit 205.
 The functions of each unit are described below. If the merge flag supplied from the code string analysis unit 201 is "0", the encoding mode determination unit 210 supplies the prediction direction of motion compensation prediction, the reference picture index, the difference vector, and the prediction vector index supplied from the code string analysis unit 201 to the motion vector reproduction unit 211. If the merge flag is "1", it supplies the merge candidate number supplied from the code string analysis unit 201 to the combined motion information reproduction unit 212.
 The motion vector reproduction unit 211 reproduces motion information from the prediction direction of motion compensation prediction, the reference picture index, the difference vector, and the prediction vector index supplied from the encoding mode determination unit 210, together with the candidate block group supplied from terminal 33, and supplies the motion information to terminal 34. The detailed configuration of the motion vector reproduction unit 211 will be described later.
 The combined motion information reproduction unit 212 reproduces motion information from the merge candidate number supplied from the encoding mode determination unit 210 and the candidate block group supplied from terminal 33, and supplies the motion information to terminal 34. The detailed configuration of the combined motion information reproduction unit 212 will be described later.
 Next, the detailed configuration of the motion vector reproduction unit 211 will be described with reference to FIG. 34. FIG. 34 shows the configuration of the motion vector reproduction unit 211. The motion vector reproduction unit 211 includes a prediction vector candidate list generation unit 220, a prediction vector determination unit 221, and an addition unit 222. Terminal 35 is connected to the encoding mode determination unit 210.
 The functions of each unit are described below. The prediction vector candidate list generation unit 220 has the same function as the prediction vector candidate list generation unit 130 of the moving picture encoding apparatus 100. The prediction vector determination unit 221 determines a prediction vector from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 220 and the prediction vector index supplied from terminal 35, and supplies the prediction vector to the addition unit 222.
 The addition unit 222 calculates a motion vector by adding the difference vector supplied from terminal 35 and the prediction vector supplied from the prediction vector determination unit 221, and supplies the motion vector to terminal 34.
 Next, the detailed configuration of the combined motion information reproduction unit 212 will be described with reference to FIG. 35. FIG. 35 shows the configuration of the combined motion information reproduction unit 212. The combined motion information reproduction unit 212 includes a combined motion information candidate generation unit 230 and a combined motion information selection unit 231.
 The functions of each unit are described below. The combined motion information candidate generation unit 230 has the same function as the combined motion information candidate generation unit 140 shown in FIG. 15. Based on the combined motion information candidate list supplied from the combined motion information candidate generation unit 230 and the merge candidate number supplied from terminal 35, the combined motion information selection unit 231 selects motion information from the combined motion information candidate list and supplies the motion information to terminal 34.
 (Operation of moving picture decoding apparatus 200)
 Next, the decoding operation of the moving picture decoding apparatus 200 of Embodiment 1 will be described with reference to the flowchart of FIG. 36. The code string analysis unit 201 decodes the code string supplied from terminal 30 into the prediction error encoded data, the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference picture index, the difference vector, and the prediction vector index, according to the syntax (S210).
 The motion information reproduction unit 204 reproduces motion information from the merge flag, merge candidate number, prediction direction of motion compensation prediction, reference picture index, difference vector, and prediction vector index supplied from the code string analysis unit 201, together with the candidate block group supplied from the motion information memory 207 (S211).
 Based on the motion information supplied from the motion information reproduction unit 204, the motion compensation unit 205 performs motion compensation on the reference picture indicated by the reference picture index in the frame memory 206 according to the motion vector, and generates a prediction signal. If the prediction direction is bidirectional, the average of the L0-direction and L1-direction prediction signals is generated as the prediction signal (S212).
 The prediction error decoding unit 202 generates a prediction error signal by performing processing such as inverse quantization and inverse orthogonal transform on the prediction error encoded data supplied from the code string analysis unit 201 (S213). The addition unit 203 adds the prediction error signal supplied from the prediction error decoding unit 202 and the prediction signal supplied from the motion compensation unit 205 to generate a decoded picture signal (S214).
 The frame memory 206 stores the decoded picture signal supplied from the addition unit 203 (S215). The motion information memory 207 stores one picture's worth of the motion vectors supplied from the motion information reproduction unit 204, in units of the minimum prediction block size (S216).
 Next, the operation of the motion information reproduction unit 204 will be described with reference to the flowchart of FIG. 37. The encoding mode determination unit 210 determines whether the merge flag supplied from the code string analysis unit 201 is "0" or "1" (S220). If the merge flag is "1" ("1" in S220), the combined motion information reproduction unit 212 reproduces motion information from the merge candidate number supplied from the encoding mode determination unit 210 and the candidate block group supplied from terminal 33 (S221).
 If the merge flag is "0" ("0" in S220), the motion vector reproduction unit 211 reproduces motion information from the prediction direction of motion compensation prediction, the reference picture index, the difference vector, and the prediction vector index supplied from the encoding mode determination unit 210, together with the candidate block group supplied from terminal 33 (S222).
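 The branch of FIG. 37 (S220 to S222) can be sketched as a simple dispatch on the decoded merge flag; the two callbacks stand in for the combined motion information reproduction unit 212 and the motion vector reproduction unit 211, and the argument names are illustrative.

```python
def reproduce_motion_info(merge_flag, merge_candidate_number, mv_syntax,
                          reproduce_merge, reproduce_mv):
    """S220: branch on the merge flag. S221: merge flag 1 -> motion information
    is rebuilt from the merge candidate number. S222: merge flag 0 -> motion
    information is rebuilt from the explicitly coded syntax elements
    (prediction direction, reference picture index, difference vector,
    prediction vector index), bundled here as `mv_syntax`."""
    if merge_flag == 1:
        return reproduce_merge(merge_candidate_number)   # S221
    return reproduce_mv(mv_syntax)                       # S222
```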
 Next, the operation of the motion vector reproduction unit 211 will be described with reference to the flowchart of FIG. 38. The prediction vector candidate list generation unit 220 generates a prediction vector candidate list by the same operation as the prediction vector candidate list generation unit 130 of the moving picture encoding apparatus 100 (S300).
 The prediction vector determination unit 221 determines a prediction vector by selecting, from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 220, the prediction vector candidate indicated by the prediction vector index supplied from terminal 35 (S301). The addition unit 222 calculates a motion vector by adding the difference vector supplied from terminal 35 and the prediction vector supplied from the prediction vector determination unit 221 (S302).
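 Steps S301 and S302 amount to an index lookup followed by a componentwise addition; the sketch below assumes motion vectors are simple (x, y) tuples.

```python
def reproduce_motion_vector(pred_vector_list, pred_vector_index, diff_vector):
    """S301: select the prediction vector indicated by the decoded index.
    S302: add the decoded difference vector componentwise to recover the
    motion vector."""
    mvp = pred_vector_list[pred_vector_index]          # S301
    return (mvp[0] + diff_vector[0], mvp[1] + diff_vector[1])  # S302
```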
 Next, the operation of the combined motion information reproduction unit 212 will be described with reference to the flowchart of FIG. 39. The combined motion information candidate generation unit 230 generates a combined motion information candidate list by the same operation as the combined motion information candidate generation unit 140 of the moving picture encoding apparatus 100 (S310). The combined motion information selection unit 231 determines combined motion information by selecting, from the combined motion information candidate list supplied from the combined motion information candidate generation unit 230, the combined motion information candidate indicated by the merge candidate number supplied from terminal 35 (S311).
 (Modifications of Embodiment 1)
 Embodiment 1 can be modified as follows.
 (Modification 1: order of merge candidate numbers)
 In Embodiment 1 described above, FIG. 18 is given as an example of the candidate number management table, but the maximum number of combined motion information candidates need only be 1 or more, and it suffices that smaller merge candidate numbers are assigned to combined motion information candidates with higher selection rates; the table is not limited to FIG. 18. Also, the maximum number of combined motion information candidates included in the combined motion information candidate list was set to 7 (so the maximum merge index is 6), but any number of 2 or more may be used. For example, when the selection rate of the bidirectional combined motion information candidates is higher than that of the combined motion information candidates of block C and block E, the tables of FIG. 40(a) or FIG. 40(b) may be used.
 The number of bidirectional combined motion information candidates can also be increased as in FIG. 41. Each bidirectional combined motion information candidate (BD0 to BD3) is described below. The bidirectional combined motion information candidates BD0 and BD1 are assumed to be the same as in Embodiment 1. The bidirectional combined motion information candidates BD2 and BD3 differ from BD0 and BD1 in the method of determining the reference-direction motion vector and reference index and the reverse-direction motion vector and reference index.
 FIG. 42 is a flowchart explaining the derivation of the bidirectional combined motion information candidate BD2. FIG. 42 is obtained by replacing step S193 of the flowchart of FIG. 28 with steps S195 to S197. Steps S195 to S197 are described below. It is checked whether the validity of LX is "1" (S195). If the validity of LX is not "1" (NO in S195), the validity of LX is set to "1" (S196) and the next candidate is examined (S194). If the validity of LX is "1" (YES in S195), the LX-direction motion vector and reference index of the combined motion information candidate are adopted as the reference-direction motion vector and reference index (S197), and the processing ends.
 FIG. 43 is a flowchart explaining the derivation of the bidirectional combined motion information candidate BD3. FIG. 43 is obtained by replacing step S204 of the flowchart of FIG. 29 with steps S206 to S208. Steps S206 to S208 are described below. It is checked whether the validity of LY is "1" (S206). If the validity of LY is not "1" (NO in S206), the validity of LY is set to "1" (S207) and the next candidate is examined (S205). If the validity of LY is "1" (YES in S206), the LY-direction motion vector and reference index of the combined motion information candidate are adopted as the reverse-direction motion vector and reference index (S208), and the processing ends.
 That is, the bidirectional combined motion information candidate BD2 uses the reference-direction motion vector and reference index of the second combined motion information candidate that becomes valid in the reference direction, together with the reverse-direction motion vector and reference index of the first combined motion information candidate that becomes valid in the reverse direction and is not the same candidate as the one used for the reference direction.
 The bidirectional combined motion information candidate BD3 combines the reference-direction motion vector and reference index of the first combined motion information candidate that becomes valid in the reference direction with the reverse-direction motion vector and reference index of the second combined motion information candidate that becomes valid in the reverse direction and is not the same candidate as the one used for the reference direction.
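 BD0 to BD3 share one selection pattern: take the n-th candidate whose direction of interest is valid, optionally skipping the candidate already chosen for the reference direction. A sketch of that shared helper follows; the candidate representation (a `valid` dictionary keyed by list direction) is an illustrative assumption.

```python
def nth_valid(cand_list, direction, n, skip_index=None):
    """Return the index of the n-th candidate (1-based) whose given direction
    (0 or 1) is valid, skipping `skip_index` if given. BD0/BD1 use n=1 in both
    directions; BD2 uses n=2 in the reference direction; BD3 uses n=2 in the
    reverse direction. Returns None when no such candidate exists."""
    count = 0
    for i, cand in enumerate(cand_list):
        if i == skip_index:              # skip the reference-direction choice
            continue
        if cand["valid"][direction]:
            count += 1
            if count == n:
                return i
    return None
```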
 Increasing the number of combinations of bidirectional combined motion information candidates in this way raises the selection rate of the combined motion information candidates and improves the coding efficiency of the motion information.
 (Modification 2: identity check for bidirectional combined motion information candidates)
 In Embodiment 1 described above, FIG. 29 is given as an example of the operation of the reverse-direction motion information determination unit 162, but it suffices that a bidirectional combined motion information candidate can be generated, and the operation is not limited to this. For example, step S240 may be added as in FIG. 44 in order to increase the validity of the bidirectional combined motion information candidate, that is, to prevent it from being deleted by the second combined motion information candidate list reduction unit 153.
 It is checked that no combined motion information candidate in the second combined motion information candidate list has the same motion information as the bidirectional combined motion information candidate formed from the reference-direction motion vector and reference index and the reverse-direction motion vector and reference index of the combined motion information candidate under examination (S240). If no identical combined motion information candidate exists (YES in S240), step S205 is performed. If an identical combined motion information candidate exists (NO in S240), the next candidate is examined (S206). In this case, the second combined motion information candidate list reduction unit 153 of FIG. 16 and step S146 of FIG. 24 can also be omitted.
 As a result, bidirectional combined motion information candidates are no longer deleted by the second combined motion information candidate list reduction unit 153, and the selection rate of the combined motion information candidates can be raised to improve the coding efficiency of the motion information.
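 The added check of step S240 can be sketched as follows; the per-direction dictionaries for motion vectors and reference indices are an illustrative data layout, not the specified one.

```python
def is_new_bidirectional_candidate(cand_list, bi_cand):
    """Modification 2 (step S240): accept the newly formed bidirectional
    combined motion information candidate only if no candidate already in the
    second combined motion information candidate list carries the same motion
    vectors and reference indices, so the later duplicate-removal pass can be
    omitted."""
    return all(c["mv"] != bi_cand["mv"] or c["ref_idx"] != bi_cand["ref_idx"]
               for c in cand_list)
```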
(Modification 3: Identity check against the reference direction of the bidirectional combined motion information candidate)
In Embodiment 1 described above, FIG. 29 is given as an example of the operation of the backward direction motion information determination unit 162, but step S250 may be added as shown in FIG. 45.
It is checked that the reverse-direction motion vector and reference index of the combined motion information candidate selected in the reference direction are not the same as the reverse-direction motion vector and reference index of the combined motion information candidate under inspection (S250). If they are not the same (YES in S250), step S205 is performed. If they are the same (NO in S250), the next candidate is inspected (S206).
As a result, the bidirectional combined motion information candidate can never be identical to the combined motion information candidate selected in the reference direction; this increases the effectiveness of the bidirectional combined motion information candidates, and the encoding efficiency of motion information can be improved by raising the selection rate of the combined motion information candidates.
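The S250 check can be sketched in the same illustrative style, assuming each candidate is a dictionary with per-direction vectors and reference indices; these field names are assumptions, not patent terminology.

```python
# Hypothetical fields: mvL0/refIdxL0 and mvL1/refIdxL1 per candidate.

def passes_s250(selected, inspected, base_direction):
    """S250: return True when the reverse-direction halves differ, so the
    resulting bidirectional candidate cannot coincide with the candidate
    already selected in the reference direction."""
    if base_direction == "L0":    # reverse direction is L1
        return (selected["mvL1"], selected["refIdxL1"]) != \
               (inspected["mvL1"], inspected["refIdxL1"])
    else:                         # reverse direction is L0
        return (selected["mvL0"], selected["refIdxL0"]) != \
               (inspected["mvL0"], inspected["refIdxL0"])
```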
(Modification 4: Unification of the deletion process)
In Embodiment 1 described above, FIG. 16 is given as an example of the configuration of the combined motion information candidate generation unit 140. As a simpler configuration, the first combined motion information candidate list reduction unit 151 may be removed as shown in FIG. 46, leaving only the second combined motion information candidate list reduction unit 153, so that the deletion process is consolidated into a single unit.
However, a problem in this case is that redundant combined motion information candidates are supplied to the bidirectional combined motion information candidate list generation unit 152: if the first two unidirectional combined motion information candidates are identical, the bidirectional combined motion information candidate whose reference direction is L0 (BD0) and the one whose reference direction is L1 (BD1) end up with the same motion information. Therefore, as shown in FIG. 47(b), the probability of generating identical bidirectional combined motion information candidates can be reduced by changing the inspection order of FIG. 28 and FIG. 29 depending on whether the reference direction is L0 or L1.
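The order swap of FIG. 47(b) can be sketched as follows; scanning the candidate list in opposite directions for the two reference directions is one plausible realization, and the concrete ordering rule here is an illustrative assumption.

```python
def inspection_order(candidates, base_direction):
    """Return the index order in which candidates are inspected: forward
    for reference direction L0, backward for L1, lowering the chance
    that BD0 and BD1 pick the same partner candidate."""
    order = list(range(len(candidates)))
    return order if base_direction == "L0" else order[::-1]
```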
(Modification 5: Use of the same direction)
In Embodiment 1 described above, regarding the reverse direction used by the backward direction motion information determination unit 162, an example was given in which the reverse direction is L1 when the reference direction is L0, and L0 when the reference direction is L1. Alternatively, the reverse direction may be set to L0 when the reference direction is L0, and to L1 when the reference direction is L1. As shown in FIG. 48, when the combined motion information candidates included in the second combined motion information candidate list contain motion information for only one and the same prediction direction, this raises the probability of generating a bidirectional combined motion information candidate and raises the selection rate of combined motion information candidates, thereby improving the encoding efficiency of motion information.
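The direction mapping of Modification 5 reduces to a small rule, sketched here; the flag name is a hypothetical illustration.

```python
def reverse_direction(base_direction, same_direction_mode):
    """Pick the list direction used as the 'reverse' side: normally the
    opposite list (L0 <-> L1), but the same list when the mode of
    Modification 5 is enabled."""
    if same_direction_mode:
        return base_direction          # L0 -> L0, L1 -> L1
    return "L1" if base_direction == "L0" else "L0"
```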
(Modification 6: Predetermined combinations)
In Embodiment 1 described above, a bidirectional combined motion information candidate is generated by searching for a combined motion information candidate block that is valid in the direction opposite to the reference direction and using its motion information in that opposite direction. Searching in the direction opposite to the reference direction improves the effectiveness of the bidirectional combined motion information candidate, but increases the amount of processing.
Therefore, as shown in FIGS. 49(a) and 49(b), by defining the bidirectional combined motion information candidates in advance as combinations of combined motion information candidate blocks with higher reliability, the search process can be omitted, the selection rate of the bidirectional combined motion information candidates can be raised, and the encoding efficiency can be improved.
FIG. 49(a) is an example in which the bidirectional combined motion information candidate whose reference direction is L0 (BD0) is defined by combining the L0-direction motion information of candidate block A, which has the highest reliability, with the L1-direction motion information of candidate block B, which has the second-highest reliability; the bidirectional combined motion information candidate whose reference direction is L1 (BD1) is defined by combining the L1-direction motion information of candidate block A with the L0-direction motion information of candidate block B; and the prediction direction is defined as bidirectional prediction.
FIG. 49(b) is an example in which the bidirectional combined motion information candidate whose reference direction is L0 (BD0) is defined as the L0-direction motion information of candidate block A, which has the highest reliability, the bidirectional combined motion information candidate whose reference direction is L1 (BD1) is defined as the L1-direction motion information of candidate block A, and the prediction direction is defined as unidirectional prediction. Any other combination may be used as long as the candidate blocks in the combination have higher reliability.
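The two predetermined combinations of FIG. 49 can be sketched without any search, assuming each candidate block carries per-direction halves under keys "L0" and "L1"; this representation is an assumption for illustration.

```python
def predetermined_bd_candidates(block_a, block_b, bidirectional=True):
    """Return (BD0, BD1) directly from the two most reliable blocks.

    FIG. 49(a) style (bidirectional=True):
      BD0 = A's L0 half + B's L1 half, BD1 = B's L0 half + A's L1 half.
    FIG. 49(b) style (bidirectional=False):
      BD0 = A's L0 half only, BD1 = A's L1 half only (unidirectional)."""
    if bidirectional:
        bd0 = {"L0": block_a["L0"], "L1": block_b["L1"]}
        bd1 = {"L0": block_b["L0"], "L1": block_a["L1"]}
    else:
        bd0 = {"L0": block_a["L0"], "L1": None}
        bd1 = {"L0": None, "L1": block_a["L1"]}
    return bd0, bd1
```

Since no candidate scan is performed, the per-block search cost described in the text disappears entirely.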
(Modification 7: BD0/BD1 adaptation)
In Embodiment 1 described above, a small merge candidate number is assigned to the bidirectional combined motion information candidate whose reference direction is L0 (BD0), but the present invention is not limited to this. For example, by preferentially assigning a small merge candidate number to a bidirectional combined motion information candidate whose prediction direction is bidirectional, which has high prediction efficiency, the encoding efficiency can be improved. Furthermore, when both BD0 and BD1 use bidirectional prediction, a small merge candidate number can be preferentially assigned to the bidirectional combined motion information candidate whose reference-direction motion information is unidirectional. This is because, when unidirectional prediction is selected even though bidirectional prediction generally has higher prediction efficiency, the reliability of that motion information tends to be high.
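The priority rule of Modification 7 amounts to a two-level sort, sketched here; the field names are illustrative assumptions about how a candidate records its prediction direction and the nature of its reference-direction source.

```python
def merge_number_order(bd_candidates):
    """Order BD candidates so that smaller merge candidate numbers go first
    to bidirectional-prediction candidates, and among those, to candidates
    whose reference-direction motion information was unidirectional."""
    def key(c):
        return (0 if c["prediction"] == "BI" else 1,
                0 if c["base_was_unidirectional"] else 1)
    return sorted(bd_candidates, key=key)
```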
(Effects of Embodiment 1)
(Example of the effect of bidirectional combined motion information in bidirectional prediction)
The effects of Embodiment 1 will be described with reference to FIG. 50. Hereinafter, for a block N, the L0-direction motion vector is denoted mvL0N, the L1-direction motion vector mvL1N, the L0-direction reference image index refIdxL0N, the L1-direction reference image index refIdxL1N, the L0-direction difference vector dmvL0N, the L1-direction difference vector dmvL1N, the difference of the L0-direction reference image index drefIdxL0N, and the difference of the L1-direction reference image index drefIdxL1N.
Assume that the motion information that minimizes the prediction error for the processing target block (Z) has a bidirectional (BI) prediction direction, with mvL0Z = (2, 8), mvL1Z = (4, 2), refIdxL0Z = 0, and refIdxL1Z = 0.
Assume that the unidirectional combined motion information candidates at this time are A, B, COL, C, and E in FIG. 50. None of these unidirectional combined motion information candidates has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). Therefore, the unidirectional combined motion information candidate with the smallest rate-distortion evaluation value is selected from among them. That candidate's rate-distortion evaluation value is then compared with the rate-distortion evaluation value calculated by the difference vector calculation unit 120, and the merge mode is used as the encoding mode only when the former is smaller than the latter.
When the merge mode is selected as the encoding mode, it is because the balance between the encoding efficiency of the motion information and the prediction error is optimal; the prediction error itself is not optimal. Conversely, when the non-merge mode is selected as the encoding mode, the encoding efficiency of the motion information is not optimal.
Here, the bidirectional combined motion information candidates generated by Embodiment 1 are BD0 and BD1 in FIG. 50. The bidirectional combined motion information candidate whose reference direction is L0 (BD0) is formed from the L0-direction motion information of block A and the L1-direction motion information of block B. The bidirectional combined motion information candidate whose reference direction is L1 (BD1) is formed from the L1-direction motion information of block A and the L0-direction motion information of block B. It can be seen that BD0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting BD0, the prediction error can be minimized and the encoding efficiency of the motion information can be optimized.
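The FIG. 50 example can be rechecked numerically. The vectors below are the ones given in the text (block A's L0 half and block B's L1 half follow from BD0 matching the optimal motion information); the dictionary layout is an illustrative assumption.

```python
# Reference-direction half from block A, reverse-direction half from block B.
block_a = {"mvL0": (2, 8), "refIdxL0": 0}
block_b = {"mvL1": (4, 2), "refIdxL1": 0}

bd0 = {"mvL0": block_a["mvL0"], "refIdxL0": block_a["refIdxL0"],
       "mvL1": block_b["mvL1"], "refIdxL1": block_b["refIdxL1"]}

# Motion information that minimizes the prediction error for block Z.
optimal = {"mvL0": (2, 8), "refIdxL0": 0, "mvL1": (4, 2), "refIdxL1": 0}

assert bd0 == optimal  # BD0 reproduces the prediction-error-minimizing info
```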
(Example of the effect of bidirectional combined motion information in unidirectional prediction)
Next, the effect of Embodiment 1 on unidirectional prediction will be described with reference to FIG. 51. Assume that the motion information that minimizes the prediction error for the processing target block (Z) has a unidirectional (UNI) prediction direction, with mvL0Z = (0, 8) and refIdxL0Z = 2.
Assume that the unidirectional combined motion information candidates B, C, and COL are invalid (×), and that the valid unidirectional combined motion information candidates A and E have the motion information shown in FIG. 51. In this case as well, none of the unidirectional combined motion information candidates has the motion information that minimizes the prediction error for the processing target block (Z).
Here too, the bidirectional combined motion information candidates generated by Embodiment 1 are BD0 and BD1 in FIG. 51. The bidirectional combined motion information candidate whose reference direction is L0 (BD0) is a candidate whose prediction direction is unidirectional, formed from the L0-direction motion information of block A. The bidirectional combined motion information candidate whose reference direction is L1 (BD1) is formed from the L0-direction motion information of block E and the L1-direction motion information of block A. BD0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting BD0, the prediction error can be minimized and the encoding efficiency of the motion information can be optimized.
(Example of the effect of bidirectional combined motion information formed from a combination of unidirectional predictions)
Next, the effect of Embodiment 1 when combining motion information whose prediction directions are unidirectional will be described with reference to FIG. 52. Assume that the motion information that minimizes the prediction error for the processing target block (Z) has a bidirectional (BI) prediction direction, with mvL0Z = (2, 2), refIdxL0Z = 0, mvL1Z = (-2, 2), and refIdxL1Z = 0.
Assume that the unidirectional combined motion information candidates A, COL, and C are invalid (×), and that the valid unidirectional combined motion information candidates B and E have the motion information shown in FIG. 52. In this case as well, none of the unidirectional combined motion information candidates has the motion information that minimizes the prediction error for the processing target block (Z).
Here too, the bidirectional combined motion information candidates generated by Embodiment 1 are BD0 and BD1 in FIG. 52. The bidirectional combined motion information candidate whose reference direction is L0 (BD0) is formed from the L0-direction motion information of block B and the L1-direction motion information of block E, and BD1 is not generated. BD0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting BD0, the prediction error can be minimized and the encoding efficiency of the motion information can be optimized.
(Bidirectional combined motion information candidates)
As described above, by generating bidirectional combined motion information candidates using the L0-direction and L1-direction motion information of the unidirectional combined motion information candidates, even when the motion of the processing target block deviates from the motion of the co-located block in another already-encoded picture or from that of the blocks adjacent to the processing target block, the motion information can be encoded with an index alone, without encoding the motion information itself. Therefore, a moving picture encoding device and a moving picture decoding device that can optimize encoding efficiency and prediction efficiency can be realized.
(Unidirectional combined motion information candidates)
The same effect as generating a bidirectional combined motion information candidate from the L0-direction and L1-direction motion information of the unidirectional combined motion information candidates is also obtained when generating a new unidirectional combined motion information candidate from that motion information.
(Bidirectional combined motion information using the same direction)
Likewise, the same effect as generating a bidirectional combined motion information candidate from the L0-direction and L1-direction motion information of the unidirectional combined motion information candidates is obtained when generating a bidirectional combined motion information candidate from motion information of the same prediction direction of the unidirectional combined motion information candidates.
(Simplification of the moving picture decoding process)
As described above, by generating bidirectional combined motion information candidates using the motion information in each direction of the unidirectional combined motion information candidates, even when the motion of the processing target block deviates from the motion of the co-located block in another already-encoded picture or from that of the adjacent blocks, decoding of the prediction direction, the reference index, and the difference vector, as well as the addition of the prediction vector and the difference vector, become unnecessary, so the processing of the moving picture decoding device can be reduced.
(Deletion process)
As described above, by providing the first combined motion information candidate list reduction unit 151, the bidirectional combined motion information candidate list generation unit 152 can avoid the situation in which the bidirectional combined motion information candidate whose reference direction is L0 (BD0) and the one whose reference direction is L1 (BD1) have the same motion information, which increases the effectiveness of the bidirectional combined motion information candidates and improves the encoding efficiency.
(Merge candidate number assignment in order of selection rate)
As described above, by assigning smaller merge candidate numbers to combined motion information candidates with higher selection rates, the selection rate of the more probable motion information in each direction is raised, and highly accurate bidirectional combined motion information candidates can be generated using accurate motion information in each direction. In addition, the search process can be simplified, and the drop in encoding efficiency can be suppressed even when the number of search operations is limited.
(Memory read time)
As described above, by generating bidirectional combined motion information candidates using the motion information in each direction of the unidirectional combined motion information candidates, the number of combined motion information candidates can be increased without increasing the number of unidirectional combined motion information candidates. Therefore, in a moving picture encoding device and a moving picture decoding device using a typical LSI in which the memory read time grows with the number of unidirectional combined motion information candidates, the increase in memory read time caused by increasing that number can be suppressed.
(Adaptive switching)
As described above, by preferentially assigning a small merge candidate number to a bidirectional combined motion information candidate whose prediction direction is bidirectional, the selection rate of bidirectional-prediction candidates with high prediction efficiency is raised; and by preferentially assigning a small merge candidate number to a bidirectional combined motion information candidate whose reference-direction motion information is unidirectional, the selection rate of candidates built from highly reliable motion information is raised, improving the encoding efficiency.
[Embodiment 2]
(Syntax)
The configuration of the moving picture encoding device of Embodiment 2 is the same as that of the moving picture encoding device 100 of Embodiment 1 except for the higher-level function of the device and the function of the code string generation unit 104. The differences from Embodiment 1 in the higher-level function of the moving picture encoding device and in the function of the code string generation unit 104 of Embodiment 2 are described below.
The higher-level function of the moving picture encoding device of Embodiment 2 has a function of changing the candidate number management table per encoded stream or per slice, a slice being a part of the encoded stream. The code string generation unit 104 encodes the candidate number management table into the encoded stream as shown in FIGS. 53(a) and 53(b) and transmits it. FIGS. 53(a) and 53(b) show examples of syntax that encode the candidate number management table in the SPS (Sequence Parameter Set) for control per encoded stream and in the slice_header for control per slice.
 "modified_merge_index_flag"でマージ候補番号と結合動き情報候補の標準の関係を変更するかどうかを指定し、"max_no_of_merge_index_minus1"で再定義する個数を指定し、"merge_mode[i]" で結合動き情報候補リストに含まれる候補ブロックの順序を指定する。また、双方向結合動き情報候補の基準方向を指定するための情報である"bd_merge_base_direction"を設置することもできる。 "modified_merge_index_flag" specifies whether to change the standard relationship between the merge candidate number and the combined motion information candidate, specifies the number to be redefined with "max_no_of_merge_index_minus1", and enters the combined motion information candidate list with "merge_mode [i]" Specifies the order of candidate blocks included. In addition, “bd_merge_base_direction” that is information for designating the reference direction of the bidirectional combined motion information candidate can be set.
For example, if the standard relationship between merge candidate numbers and combined motion information candidates is as in FIG. 18 and the candidate number management table to be redefined is as in FIG. 40(a), "modified_merge_index_flag" is set to "1", "max_no_of_merge_index_minus1" is set to "6", and "merge_mode[i]" is set to "0", "1", "2", "5", "6", "3", and "4", respectively.
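The syntax elements above can be illustrated with a toy serializer. This sketch only lists the fields in syntax order as name/value pairs; it performs no entropy coding, and the representation is an assumption for illustration, not the actual SPS or slice_header bit layout.

```python
def write_candidate_table(modified, merge_modes, base_direction=None):
    """Emit the candidate-number-management-table fields in syntax order.

    `modified`        -> modified_merge_index_flag
    `merge_modes`     -> merge_mode[i] list (its length - 1 gives
                         max_no_of_merge_index_minus1)
    `base_direction`  -> optional bd_merge_base_direction value
    """
    fields = [("modified_merge_index_flag", 1 if modified else 0)]
    if modified:
        fields.append(("max_no_of_merge_index_minus1", len(merge_modes) - 1))
        for i, m in enumerate(merge_modes):
            fields.append((f"merge_mode[{i}]", m))
    if base_direction is not None:
        fields.append(("bd_merge_base_direction", base_direction))
    return fields
```

Called with the values of the example above, `write_candidate_table(True, [0, 1, 2, 5, 6, 3, 4])` reproduces the flag, the count "6", and the seven merge_mode entries in order.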
FIGS. 53(a) and 53(b) are examples of the syntax; it suffices that the merge candidate numbers assigned to the bidirectional combined motion information candidates can be specified in the encoded stream and that the reference direction of the bidirectional combined motion information candidates can be determined, and the syntax is not limited to these examples.
The configuration of the moving picture decoding device of Embodiment 2 is the same as that of the moving picture decoding device 200 of Embodiment 1 except for the function of the code string analysis unit 201. The difference from Embodiment 1 in the function of the code string analysis unit 201 is described below. The code string analysis unit 201 decodes the candidate number management table according to the syntax of FIGS. 53(a) and 53(b).
(Effect of Embodiment 2)
By sharing the optimal relationship between merge candidate numbers and combined motion information candidates between the moving picture encoding device and the moving picture decoding device of Embodiment 2 per stream or per slice, the encoding efficiency of the merge index can be improved in cases where the motion characteristics change from stream to stream or from slice to slice.
[Embodiment 3]
(Replacement of combined motion information candidates)
The configuration of the moving picture encoding device of Embodiment 3 is the same as that of the moving picture encoding device 100 of Embodiment 1 except for the function of the combined motion information candidate generation unit 140. First, the candidate number management table in Embodiment 3 is as shown in FIG. 54, and the maximum number of combined motion information candidates included in the combined motion information candidate list is 5. The differences from Embodiment 1 are that this maximum number is 5 and that no merge candidate number is assigned to the bidirectional combined motion information candidates. The differences of the combined motion information candidate generation unit 140 of Embodiment 3 from Embodiment 1 are described below with reference to FIG. 55.
The combined motion information candidate generation unit 140 of FIG. 55 has a configuration in which a candidate number management table changing unit 154 is added to the combined motion information candidate generation unit 140 of FIG. 16. The function of the candidate number management table changing unit 154 is described below. The candidate number management table changing unit 154 calculates the effective number of bidirectional combined motion information candidates from the second combined motion information candidate list supplied by the first combined motion information candidate list reduction unit 151. If the effective number of bidirectional combined motion information candidates is 1 or more, it changes the candidate number management table and supplies the second combined motion information candidate list to the bidirectional combined motion information candidate list generation unit 152. If the effective number of bidirectional combined motion information candidates is 0, it supplies the second combined motion information candidate list to the terminal 18 as the combined motion information candidate list.
Subsequently, the operation of the combined motion information candidate generation unit 140 according to the third embodiment will be described with reference to FIG. 56, focusing on the differences from the first embodiment. The flowchart of FIG. 56 adds the following two steps to the flowchart of FIG. 24. The candidate number management table changing unit 154 changes the candidate number management table (S260). It is then checked whether the candidate number management table has been changed (S261). If the candidate number management table has been changed (YES in S261), step S144 is performed. If the candidate number management table has not been changed (NO in S261), step S144 is skipped.
Hereinafter, the operation of the candidate number management table changing unit 154 will be described with reference to FIG. 57. First, the candidate number management table changing unit 154 counts the number of invalid combined motion information candidates not included in the second combined motion information candidate list, thereby calculating the invalid number of combined motion information candidates (S270). In the third embodiment, the invalid number of combined motion information candidates is defined as the number of invalid combined motion information candidates not included in the second combined motion information candidate list; however, any method that can calculate the number of invalid combined motion information candidates may be used, and the calculation is not limited to this. For example, the number of invalid combined motion information candidates may be obtained by subtracting the number of valid combined motion information candidates included in the second combined motion information candidate list from 5, the sum of 4, the maximum number of spatial combined motion information candidates, and 1, the maximum number of temporal combined motion information candidates. Alternatively, since the selection rate of the bidirectional combined motion information candidates is also expected to decrease when combined motion information with a high selection rate is invalid, the number of invalid combined motion information candidates having a merge candidate number of 2 or more may be counted instead.
The candidate number management table changing unit 154 checks whether the invalid number of combined motion information candidates is 1 or more (S271). If the invalid number of combined motion information candidates is 1 or more (YES in S271), the subsequent processing is performed to change the candidate number management table. If the invalid number of combined motion information candidates is 0 (NO in S271), the process ends.
The candidate number management table changing unit 154 counts the number of valid bidirectional combined motion information candidates and calculates the effective number of bidirectional combined motion information candidates (S272). That is, if both BD0 and BD1 are valid, the effective number of bidirectional combined motion information candidates is 2; if either BD0 or BD1 is valid, it is 1; and if both BD0 and BD1 are invalid, it is 0.
The candidate number management table changing unit 154 sets the smaller of the invalid number of combined motion information candidates and the effective number of bidirectional combined motion information candidates as the additional number of bidirectional combined motion information candidates (S273). The candidate number management table changing unit 154 then assigns invalid merge candidate numbers to as many bidirectional combined motion information candidates as the additional number (S274).
Hereinafter, examples of changes to the candidate number management table by the candidate number management table changing unit 154 will be described with reference to FIGS. 58(a) to 58(c). FIG. 58(a) shows an example in which the invalid number of combined motion information candidates is 1 and the effective number of bidirectional combined motion information candidates is 1 or more. BD0 is assigned to the first invalid merge candidate number, 1. If BD1 is valid, BD1 may be assigned instead. FIG. 58(b) shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 2. BD0 is assigned to the first invalid merge candidate number, 2, and BD1 is assigned to the second invalid merge candidate number, 4. FIG. 58(c) shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 1 (BD1 is valid). BD1 is assigned to the first invalid merge candidate number, 2.
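The table-change procedure of steps S270 to S274 can be sketched in Python as follows. The data layout is an assumption made purely for illustration (the description does not prescribe one): `table` maps merge candidate numbers to candidate names, with `None` marking a number whose candidate is invalid (absent from the second combined motion information candidate list), and `valid_bd` names the valid bidirectional candidates among BD0 and BD1.

```python
def change_candidate_table(table, valid_bd):
    # S270: count the invalid merge candidate numbers
    invalid_slots = [i for i, name in enumerate(table) if name is None]
    if not invalid_slots:          # S271: nothing to change
        return table
    # S272: effective number of bidirectional candidates (BD0, BD1)
    bd = [b for b in ("BD0", "BD1") if b in valid_bd]
    # S273: additional number = min(invalid count, effective BD count)
    n_add = min(len(invalid_slots), len(bd))
    # S274: assign BD candidates to the invalid merge candidate numbers,
    # smallest invalid number first
    new_table = list(table)
    for slot, name in zip(invalid_slots[:n_add], bd[:n_add]):
        new_table[slot] = name
    return new_table
```

Applied to hypothetical tables shaped like those of FIGS. 58(a) to 58(c), this sketch reproduces the assignments described above.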
The configuration of the moving picture decoding apparatus according to the third embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the combined motion information candidate generation unit 140. The combined motion information candidate generation unit 140 of the moving picture decoding apparatus of the third embodiment is identical to the combined motion information candidate generation unit 140 of the moving picture coding apparatus of the third embodiment.
(Modification of Embodiment 3)
The third embodiment can be modified as follows.
(Modification 1: Priority to unidirectional combined motion information candidates)
In the third embodiment described above, FIG. 57 was given as an example of the operation of the candidate number management table changing unit 154. However, it suffices that, in the changed candidate number management table, a smaller merge candidate number is assigned to a combined motion information candidate with a higher selection rate; the operation is not limited to this example.
For example, when the reliability of the existing unidirectional combined motion information candidates is sufficiently high, the following step S275 may be added to the operation of the candidate number management table changing unit 154, as shown in FIG. 59. The flowchart of FIG. 59 is obtained by adding step S275 to the flowchart of FIG. 57. The candidate number management table changing unit 154 packs the merge candidate numbers of the invalid combined motion information candidates (S275).
Hereinafter, examples of changes to the candidate number management table by the candidate number management table changing unit 154 will be described with reference to FIGS. 60(a) and 60(b). FIG. 60(a) shows an example in which the invalid number of combined motion information candidates is 1 and the effective number of bidirectional combined motion information candidates is 1 or more. After the invalid merge candidate number (merge candidate number 1) is packed, BD0 is assigned to the first invalid merge candidate number, 4. If BD1 is valid, BD1 may be assigned instead. FIG. 60(b) shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 2. After the invalid merge candidate number (merge candidate number 2) is packed, BD0 is assigned to the first invalid merge candidate number, 3, and BD1 is assigned to the second invalid merge candidate number, 4. In this way, the bidirectional combined motion information candidates receive merge candidate numbers larger than those of the unidirectional combined motion information candidates.
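Modification 1 can be sketched as follows, under an assumed data layout chosen for illustration: `table` maps merge candidate numbers to candidate names, `None` marks an invalid entry, and `valid_bd` names the valid bidirectional candidates. The valid entries are packed toward the small merge candidate numbers, and the bidirectional candidates are then appended at the tail so they receive the largest numbers.

```python
def change_table_unidirectional_priority(table, valid_bd):
    # Pack valid entries toward the smaller merge candidate numbers,
    # dropping the invalid ones (the added packing step of FIG. 59).
    packed = [name for name in table if name is not None]
    # Append BD0/BD1 at the tail: the bidirectional candidates thereby get
    # larger merge candidate numbers than the unidirectional candidates.
    bd = [b for b in ("BD0", "BD1") if b in valid_bd]
    n_add = min(len(table) - len(packed), len(bd))
    tail = bd[:n_add]
    return packed + tail + [None] * (len(table) - len(packed) - len(tail))
```

With hypothetical tables shaped like those of FIGS. 60(a) and 60(b), this yields BD0 at merge candidate number 4 and BD0/BD1 at numbers 3 and 4, respectively.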
(Modification 2: Dependence on predetermined blocks)
The operation of the candidate number management table changing unit 154 can be modified further. In this modification, predetermined bidirectional combined motion information candidates are associated with predetermined blocks: BD0 is associated with block C, and BD1 is associated with block D. Hereinafter, another modification of the operation of the candidate number management table changing unit 154 will be described with reference to FIG. 61. The following processing is repeated for the number of associated blocks (S280 to S284). It is checked whether the i-th predetermined block is invalid (S281). If the i-th predetermined block is invalid (YES in S281), the subsequent processing is performed to change the candidate number management table. If the i-th predetermined block is not invalid (NO in S281), the next predetermined block is checked.
In Modification 2, the predetermined combined motion information candidates are assumed to be the two candidates of block C with merge candidate number 3 and block E with merge candidate number 4. The candidate number management table changing unit 154 therefore assigns the bidirectional combined motion information candidate (BD0) to the first predetermined invalid merge candidate number, and assigns the bidirectional combined motion information candidate (BD1) to the second predetermined invalid merge candidate number (S282).
As described above, in the bidirectional combined motion information candidate list generation unit 152 according to Modification 2, a bidirectional combined motion information candidate becomes valid when the corresponding predetermined combined motion information candidate is invalid. Here, the predetermined combined motion information candidates are block C and block E, but it suffices that a bidirectional combined motion information candidate is generated when a combined motion information candidate with a larger merge candidate number and a lower selection rate is invalid; the choice is not limited to these blocks.
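The association of Modification 2 can be sketched minimally as follows. The block-to-candidate mapping is taken from the example above (BD0 tied to block C, BD1 to block E); representing the set of invalid blocks by their names is an assumption for illustration.

```python
# Each bidirectional candidate is tied to a predetermined block and is
# enabled only when that block's combined motion information candidate
# is invalid (loop S280 to S284).
ASSOCIATED_BLOCK = {"BD0": "C", "BD1": "E"}

def enabled_bd_candidates(invalid_blocks):
    return [bd for bd, block in ASSOCIATED_BLOCK.items()
            if block in invalid_blocks]
```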
(Modification 3: Replacement of unidirectionally predicted combined motion information candidates)
The operation of the candidate number management table changing unit 154 can be modified further. Hereinafter, a modified operation of the candidate number management table changing unit 154 will be described with reference to FIG. 62. If the invalid number of combined motion information candidates is 0 (NO in S271), the candidate number management table changing unit 154 counts the number of combined motion information candidates in the second combined motion information candidate list whose prediction direction is unidirectional (the L0 direction or the L1 direction), thereby calculating the unidirectional prediction number (S290). It is checked whether the unidirectional prediction number is 1 or more (S291). If the unidirectional prediction number is 1 or more (YES in S291), the subsequent processing is performed to change the candidate number management table. If the unidirectional prediction number is 0 (NO in S291), the process ends. The candidate number management table changing unit 154 counts the number of bidirectional combined motion information candidates whose prediction direction is bidirectional and calculates the effective number of bidirectional combined motion information candidates (S292). The candidate number management table changing unit 154 then assigns the merge candidate numbers of combined motion information candidates whose prediction direction is unidirectional to as many bidirectional combined motion information candidates as the additional number of bidirectional combined motion information candidates (S294).
As a specific example, if the prediction direction of the bidirectional combined motion information candidate (BD0) is bidirectional, the candidate number management table changing unit 154 assigns to it the last merge candidate number whose prediction direction is unidirectional. Likewise, if the motion compensation prediction direction of the bidirectional combined motion information candidate (BD1) is bidirectional, the candidate number management table changing unit 154 assigns to it the second-to-last merge candidate number whose prediction direction is unidirectional. In Modification 3 of the third embodiment, the unidirectional prediction number is calculated as the number of combined motion information candidates in the second combined motion information candidate list whose prediction direction is unidirectional; however, any method that can calculate the unidirectional prediction number may be used, and the calculation is not limited to this. For example, since combined motion information with a high selection rate is considered highly reliable even when its prediction direction is unidirectional, the number of unidirectionally predicted combined motion information candidates with a merge candidate number of 3 or more may be counted instead. Also, although the number of combined motion information candidates whose prediction direction is unidirectional is counted when the invalid number of combined motion information candidates is 0, it suffices that merge candidate numbers can be assigned to bidirectional combined motion information candidates up to the sum of the invalid number of combined motion information candidates and the unidirectional prediction number; the operation is not limited to this.
As described above, the bidirectional combined motion information candidate list generation unit 152 according to Modification 3 replaces combined motion information candidates whose prediction direction is unidirectional with bidirectional combined motion information candidates whose prediction direction is bidirectional.
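The replacement rule of Modification 3 can be sketched as follows, again under an assumed layout chosen for illustration: `table` maps merge candidate numbers to candidate names, `pred_dir` gives each candidate's prediction direction (`"uni"` or `"bi"`), and `valid_bd` names the bidirectional candidates whose prediction direction is bidirectional. BD0 takes the last unidirectional merge candidate number, BD1 the second-to-last, as in the specific example above.

```python
def replace_unidirectional_candidates(table, pred_dir, valid_bd):
    # Merge candidate numbers whose candidate is unidirectionally predicted
    uni_slots = [i for i, name in enumerate(table)
                 if name is not None and pred_dir.get(name) == "uni"]
    new_table = list(table)
    # BD0 replaces the last unidirectional candidate, BD1 the one before it
    bd = [b for b in ("BD0", "BD1") if b in valid_bd]
    for bd_name, slot in zip(bd, reversed(uni_slots)):
        new_table[slot] = bd_name
    return new_table
```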
(Effects of Embodiment 3)
As described above, by using merge candidate numbers that would otherwise become invalid as the merge candidate numbers of bidirectional combined motion information candidates, an increase in the code amount of the merge index due to an increase in the number of merge candidate numbers is suppressed, and the selection rate of the combined motion information candidates is raised, improving the coding efficiency.
As described above, when the reliability of the temporal and spatial motion information is high, by assigning merge candidate numbers such that the merge candidate numbers of the bidirectional combined motion information candidates are larger than those of the unidirectional combined motion information candidates, the coding efficiency of the merge index can be improved.
As described above, by associating combined motion information candidates having large merge candidate numbers with bidirectional combined motion information candidates, the combined motion information candidates of blocks with high reliability and high selection rates are retained, while the combined motion information candidates of blocks with low selection rates and the bidirectional combined motion information candidates are switched adaptively. Therefore, an increase in the code amount of the merge index due to an increase in the number of merge candidate numbers is suppressed, and the selection rate of the combined motion information candidates is raised, improving the coding efficiency.
As described above, by replacing combined motion information candidates whose prediction direction is unidirectional with bidirectional combined motion information candidates whose prediction direction is bidirectional, the number of highly prediction-efficient bidirectional combined motion information candidates is increased, the selection rate of the combined motion information candidates is raised, and the coding efficiency is improved.
[Embodiment 4]
(Priority to unidirectionally predicted motion information)
The configuration of the moving picture coding apparatus according to the fourth embodiment is the same as that of the moving picture coding apparatus 100 according to the first embodiment except for the function of the reference direction motion information determination unit 161. Hereinafter, the differences between the reference direction motion information determination unit 161 of the fourth embodiment and that of the first embodiment will be described. The operation of the reference direction motion information determination unit 161 of the fourth embodiment will be described with reference to FIG. 63.
The flowchart of FIG. 63 is obtained by adding steps S320 to S323 to the flowchart of FIG. 28, and is characterized by step S321. First, the validity of LX, the reference direction, is set to "0" (S190). The following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S320 to S323). It is checked whether the combined motion information candidate is valid in the LX direction and is unidirectionally predicted (S321). If the LX direction of the combined motion information candidate is valid and the candidate is unidirectionally predicted (YES in S321), the validity of LX, the reference direction, is set to "1", the motion vector and reference index of the reference direction are set to the LX-direction motion vector and reference index of the combined motion information candidate, and the process ends (S322). Otherwise (NO in S321), the next candidate is checked (S323).
If there is no combined motion information candidate that is valid in the LX direction and unidirectionally predicted, the following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S191 to S194). The validity of the combined motion information candidate in the LX direction is checked (S192). If the LX direction of the combined motion information candidate is valid (YES in S192), the validity of LX, the reference direction, is set to "1", the motion vector and reference index of the reference direction are set to the LX-direction motion vector and reference index of the combined motion information candidate, and the process ends (S193). If the LX direction of the combined motion information candidate is invalid (NO in S192), the next candidate is checked (S194). In this way, the reference direction motion information determination unit 161 of the fourth embodiment differs from that of the first embodiment in that it gives priority to unidirectional motion information when determining the motion information of the reference direction.
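The two-pass scan of FIG. 63 (steps S320 to S323 followed by S191 to S194) can be sketched as follows. Representing each candidate as a dict with "L0"/"L1" entries that are either a (motion vector, reference index) pair or `None` when invalid is an assumption made for illustration.

```python
def opposite(lx):
    return "L1" if lx == "L0" else "L0"

def reference_direction_motion_info(candidates, lx):
    # First pass (S320 to S323): prefer a candidate that is valid in the
    # LX direction and unidirectionally predicted (other direction invalid)
    for cand in candidates:
        if cand[lx] is not None and cand[opposite(lx)] is None:
            return cand[lx]
    # Second pass (S191 to S194): fall back to any candidate valid in LX
    for cand in candidates:
        if cand[lx] is not None:
            return cand[lx]
    return None  # validity of the reference direction remains "0"
```

Deleting the second loop gives Modification 1 (selection limited to unidirectional motion information).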
The configuration of the moving picture decoding apparatus according to the fourth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the reference direction motion information determination unit 161. The reference direction motion information determination unit 161 of the moving picture decoding apparatus of the fourth embodiment is identical to the reference direction motion information determination unit 161 of the moving picture coding apparatus of the fourth embodiment.
(Modification of Embodiment 4)
The fourth embodiment can be modified as follows.
(Modification 1: Restriction to unidirectional prediction)
In the fourth embodiment described above, FIG. 63 was given as an example of the operation of the reference direction motion information determination unit 161, but it suffices that unidirectional motion information is given priority when determining the motion information, and the operation is not limited to this example. For example, steps S191 to S194 may be deleted from FIG. 63 so that the motion information of the reference direction is selected only from unidirectional motion information.
(Modification 2: Unidirectional priority in the reverse direction)
In the fourth embodiment described above, FIG. 63 was given as an example of the operation of the reference direction motion information determination unit 161, but it suffices that unidirectional motion information is given priority when determining the motion information, and the operation is not limited to this example. For example, when the reverse direction motion information determination unit 162 determines the motion information of the reverse direction, unidirectional motion information may also be given priority, as in the reference direction motion information determination unit 161 of the fourth embodiment. Alternatively, when the reverse direction motion information determination unit 162 determines the motion information of the reverse direction, the selection may be limited to unidirectional motion information, as in the reference direction motion information determination unit 161 of Modification 1 of the fourth embodiment.
(Effects of Embodiment 4)
In the fourth embodiment, by giving priority to unidirectional motion information when determining the motion information of the reference direction, highly reliable motion information can be used as the motion information of the reference direction, and the selection rate of the bidirectional combined motion information candidates is raised, improving the coding efficiency.
[Embodiment 5]
(Per-direction deletion process)
The configuration of the moving picture coding apparatus according to the fifth embodiment is the same as that of the moving picture coding apparatus 100 according to the first embodiment except for the function of the combined motion information candidate generation unit 140. Hereinafter, the differences between the combined motion information candidate generation unit 140 of the fifth embodiment and that of the first embodiment will be described.
The configuration of the combined motion information candidate generation unit 140 of the fifth embodiment will be described with reference to FIG. 64, focusing on the differences from the first embodiment. In FIG. 64, an L0 direction motion information candidate list generation unit 155 and an L1 direction motion information candidate list generation unit 156 are provided in place of the first combined motion information candidate list reduction unit 151 of FIG. 16.
The function of the combined motion information candidate generation unit 140 of the fifth embodiment will now be described. For the motion information candidates included in the first combined motion information candidate list, when there are a plurality of combined motion information candidates whose L0-direction motion information is identical, the L0 direction motion information candidate list generation unit 155 retains one of those combined motion information candidates and deletes the rest to generate the L0 direction motion information candidate list, and supplies the L0 direction motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
Similarly, for the motion information candidates included in the first combined motion information candidate list, when there are a plurality of combined motion information candidates whose L1-direction motion information is identical, the L1 direction motion information candidate list generation unit 156 retains one of those combined motion information candidates and deletes the rest to generate the L1 direction motion information candidate list, and supplies the L1 direction motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
The bidirectional combined motion information candidate list generation unit 152 generates the bidirectional combined motion information candidate list from the L0 direction motion information candidate list supplied from the L0 direction motion information candidate list generation unit 155 and the L1 direction motion information candidate list supplied from the L1 direction motion information candidate list generation unit 156.
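The per-direction duplicate removal can be sketched as follows; the candidate representation (dicts with "L0"/"L1" motion information entries, `None` when invalid) is an assumption for illustration. The same helper builds both the L0 and the L1 direction motion information candidate list.

```python
def direction_candidate_list(candidates, lx):
    # Among candidates carrying the same LX-direction motion information,
    # keep only the first and delete the rest.
    seen, result = set(), []
    for cand in candidates:
        info = cand[lx]
        if info is not None and info not in seen:
            seen.add(info)
            result.append(cand)
    return result
```

The bidirectional combined motion information candidate list generation unit 152 would then draw L0-direction motion information from `direction_candidate_list(cands, "L0")` and L1-direction motion information from `direction_candidate_list(cands, "L1")`, so that identical bidirectional combinations are never produced twice.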
The configuration of the moving picture decoding apparatus according to the fifth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the combined motion information candidate generation unit 140. The combined motion information candidate generation unit 140 of the moving picture decoding apparatus of the fifth embodiment is identical to the combined motion information candidate generation unit 140 of the moving picture coding apparatus of the fifth embodiment.
(Effects of Embodiment 5)
In the fifth embodiment, by reducing the redundancy of the motion information in the L0 direction and the L1 direction in advance, generation of identical bidirectional combined motion information is suppressed, the effectiveness of the bidirectional combined motion information candidates is raised, and the coding efficiency is improved.
 [実施の形態6]
 (双方向結合動き情報候補の選択的利用)
 実施の形態6の動画像符号化装置の構成は、基準方向決定部160の機能を除いて実施の形態1の動画像符号化装置100の構成と同一である。最初に、実施の形態6における候補番号管理テーブルを図65とし、結合動き情報候補リストに含まれる結合動き情報候補の最大数は6であるとする。結合動き情報候補リストに含まれる結合動き情報候補の最大数が6であること、双方向結合動き情報候補にマージ候補番号が1つしか割り当てられていないことが異なる。以下、実施の形態6における基準方向決定部160について実施の形態1との相違を説明する。実施の形態6の基準方向決定部160の動作について図66を用いて説明する。
[Embodiment 6]
(Selective use of bidirectional combined motion information candidates)
The configuration of the moving picture coding apparatus according to the sixth embodiment is the same as that of the moving picture coding apparatus 100 according to the first embodiment except for the function of the reference direction determination unit 160. First, assume that the candidate number management table of the sixth embodiment is as shown in FIG. 65 and that the maximum number of combined motion information candidates included in the combined motion information candidate list is 6. The differences are that the maximum number of combined motion information candidates included in the combined motion information candidate list is 6 and that only one merge candidate number is assigned to the bidirectional combined motion information candidate. Hereinafter, the differences of the reference direction determination unit 160 of the sixth embodiment from the first embodiment will be described. The operation of the reference direction determination unit 160 of the sixth embodiment will be described with reference to FIG. 66.
 基準方向決定部160は、第2結合動き情報候補リストに含まれる結合動き情報候補の数(NCands)、以下の処理を繰り返す(S300からS305)。結合動き情報候補のL0方向の有効性を検査する(S301)。結合動き情報候補のL0方向が有効であれば(S301のYES)、基準方向をL0に設定して処理を終了する(S302)。結合動き情報候補のL0方向が無効であれば(S301のNO)、結合動き情報候補のL1方向の有効性を検査する(S303)。結合動き情報候補のL1方向が有効であれば(S303のYES)、基準方向をL1に設定して処理を終了する(S304)。結合動き情報候補のL1方向が無効であれば(S303のNO)、次の候補を検査する(S305)。なお、基準方向が設定できなければ、双方向結合動き情報候補は生成しない(S306)。 The reference direction determination unit 160 repeats the following processing (S300 to S305) for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list. The validity of the combined motion information candidate in the L0 direction is checked (S301). If the L0 direction of the combined motion information candidate is valid (YES in S301), the reference direction is set to L0 and the process ends (S302). If the L0 direction of the combined motion information candidate is invalid (NO in S301), the validity of the combined motion information candidate in the L1 direction is checked (S303). If the L1 direction of the combined motion information candidate is valid (YES in S303), the reference direction is set to L1 and the process ends (S304). If the L1 direction of the combined motion information candidate is invalid (NO in S303), the next candidate is examined (S305). If the reference direction cannot be set, no bidirectional combined motion information candidate is generated (S306).
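The candidate scan of steps S300 to S306 above can be sketched as below. The candidate representation (a dict with optional `'L0'`/`'L1'` entries, where `None` means the direction is invalid) is an assumption made for illustration.

```python
def determine_reference_direction(candidates):
    """Return 'L0' or 'L1' from the first candidate with a valid direction,
    checking L0 before L1 for each candidate (S300-S305); None means no
    reference direction could be set, so no bidirectional combined motion
    information candidate is generated (S306)."""
    for cand in candidates:                 # S300..S305: loop over NCands candidates
        if cand.get('L0') is not None:      # S301: is the L0 direction valid?
            return 'L0'                     # S302: set reference direction to L0
        if cand.get('L1') is not None:      # S303: is the L1 direction valid?
            return 'L1'                     # S304: set reference direction to L1
    return None                             # S306: no reference direction set
```

For example, a list whose first candidate is L1-only yields `'L1'`, matching the fallback order of the flowchart.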
 実施の形態6の動画像復号装置の構成は、基準方向決定部160の機能を除いて実施の形態1の動画像復号装置200の構成と同一である。実施の形態6の動画像復号装置の基準方向決定部160は実施の形態6の動画像符号化装置の基準方向決定部160と同一である。 The configuration of the moving picture decoding apparatus according to the sixth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the reference direction determination unit 160. The reference direction determination unit 160 of the moving picture decoding apparatus according to the sixth embodiment is the same as the reference direction determination unit 160 of the moving picture encoding apparatus according to the sixth embodiment.
 (実施の形態6の効果)
 実施の形態6では、結合動き情報候補リストに含まれる結合動き情報候補の予測方向によって基準方向をL0方向とするかL1方向とするかを判定することで、双方向結合動き情報候補が1つだけ有効である場合に、双方向結合動き情報候補の有効性を高め、双方向結合動き情報候補の選択性を高めて符号化効率を向上させることができる。
(Effect of Embodiment 6)
In the sixth embodiment, by determining whether the reference direction is set to the L0 direction or the L1 direction according to the prediction directions of the combined motion information candidates included in the combined motion information candidate list, the effectiveness of the bidirectional combined motion information candidate can be increased when only one bidirectional combined motion information candidate is valid, and the selectivity of the bidirectional combined motion information candidate can be raised, improving coding efficiency.
 以上述べた実施の形態1から6の動画像符号化装置が出力する動画像の符号化ストリームは、実施の形態1から6で用いられた符号化方法に応じて復号することができるように特定のデータフォーマットを有している。当該動画像符号化装置に対応する動画像復号装置がこの特定のデータフォーマットの符号化ストリームを復号することができる。 The encoded stream of moving images output by the moving picture encoding apparatuses of the first to sixth embodiments described above has a specific data format so that it can be decoded according to the encoding method used in the first to sixth embodiments. A moving picture decoding apparatus corresponding to the moving picture encoding apparatus can decode an encoded stream of this specific data format.
 具体的には、双方向結合動き情報候補を示すマージインデックスや、候補番号管理テーブルを符号化ストリーム中に符号化している。また、双方向結合動き情報候補を示すマージインデックスのみを符号化ストリーム中に符号化し、候補番号管理テーブルを動画像符号化装置と動画像復号装置で共有することで候補番号管理テーブルを符号化ストリーム中に符号化しなくてもよい。 Specifically, a merge index indicating a bidirectional combined motion information candidate and a candidate number management table are encoded in the encoded stream. Alternatively, only the merge index indicating the bidirectional combined motion information candidate may be encoded in the encoded stream, and the candidate number management table need not be encoded in the encoded stream if it is shared between the moving picture encoding apparatus and the moving picture decoding apparatus.
 動画像符号化装置と動画像復号装置の間で符号化ストリームをやりとりするために、有線または無線のネットワークが用いられる場合、符号化ストリームを通信路の伝送形態に適したデータ形式に変換して伝送してもよい。その場合、動画像符号化装置が出力する符号化ストリームを通信路の伝送形態に適したデータ形式の符号化データに変換してネットワークに送信する動画像送信装置と、ネットワークから符号化データを受信して符号化ストリームに復元して動画像復号装置に供給する動画像受信装置とが設けられる。 When a wired or wireless network is used to exchange the encoded stream between the moving picture encoding apparatus and the moving picture decoding apparatus, the encoded stream may be converted into a data format suited to the transmission form of the communication channel and then transmitted. In that case, a moving picture transmitting apparatus that converts the encoded stream output from the moving picture encoding apparatus into encoded data in a data format suited to the transmission form of the communication channel and transmits it to the network, and a moving picture receiving apparatus that receives the encoded data from the network, restores it to the encoded stream, and supplies it to the moving picture decoding apparatus, are provided.
 動画像送信装置は、動画像符号化装置が出力する符号化ストリームをバッファするメモリと、符号化ストリームをパケット化するパケット処理部と、パケット化された符号化データをネットワークを介して送信する送信部とを含む。動画像受信装置は、パケット化された符号化データをネットワークを介して受信する受信部と、受信された符号化データをバッファするメモリと、符号化データをパケット処理して符号化ストリームを生成し、動画像復号装置に提供するパケット処理部とを含む。 The moving picture transmitting apparatus includes a memory that buffers the encoded stream output from the moving picture encoding apparatus, a packet processing unit that packetizes the encoded stream, and a transmitting unit that transmits the packetized encoded data via the network. The moving picture receiving apparatus includes a receiving unit that receives the packetized encoded data via the network, a memory that buffers the received encoded data, and a packet processing unit that performs packet processing on the encoded data to generate the encoded stream and provides it to the moving picture decoding apparatus.
 以上の符号化および復号に関する処理は、ハードウェアを用いた伝送、蓄積、受信装置として実現することができるのは勿論のこと、ROM(Read Only Memory)やフラッシュメモリなどに記憶されているファームウェアや、コンピュータなどのソフトウェアによっても実現することができる。そのファームウェアプログラム、ソフトウェアプログラムをコンピュータなどで読み取り可能な記録媒体に記録して提供することも、有線あるいは無線のネットワークを通してサーバから提供することも、地上波あるいは衛星ディジタル放送のデータ放送として提供することも可能である。 The above processing related to encoding and decoding can of course be realized as transmission, storage, and reception apparatuses using hardware, and can also be realized by firmware stored in a ROM (Read Only Memory), a flash memory, or the like, or by software running on a computer or the like. The firmware program or software program can be provided by recording it on a computer-readable recording medium, provided from a server through a wired or wireless network, or provided as a data broadcast of terrestrial or satellite digital broadcasting.
 以上、本発明を実施の形態をもとに説明した。実施の形態は例示であり、それらの各構成要素や各処理プロセスの組合せにいろいろな変形例が可能なこと、またそうした変形例も本発明の範囲にあることは当業者に理解されるところである。 The present invention has been described above based on the embodiments. The embodiments are exemplary, and it will be understood by those skilled in the art that various modifications can be made to combinations of their constituent elements and processing processes, and that such modifications are also within the scope of the present invention.
 上述した実施の形態1において、図26のフローチャートを用いて、時間結合動き情報候補リストの生成の動作を説明した。そのフローチャートでは、時間結合動き情報候補の動きベクトルを算出する処理がある(S164)。時間結合動き情報候補は、候補ブロックの動き情報で有効な予測方向である参照画像ColRefPicと動きベクトルmvColを基準に、双方向の動き情報を算出する。候補ブロックの予測方向がL0方向もしくはL1方向の単方向の場合には、その予測方向の参照画像と動きベクトルを基準として選択する。候補ブロックの予測方向が双方向である場合には、L0方向或いはL1方向のいずれか一方の参照画像と動きベクトルを基準として選択する。双方向動き情報生成の基準とする参照画像と動きベクトルが選択されたら、時間結合動き情報候補の動きベクトルを算出する。 In the first embodiment described above, the operation of generating the temporally combined motion information candidate list was described with reference to the flowchart of FIG. 26. That flowchart includes a process of calculating the motion vectors of the temporally combined motion information candidate (S164). The temporally combined motion information candidate calculates bidirectional motion information based on a reference image ColRefPic and a motion vector mvCol taken from a valid prediction direction of the motion information of the candidate block. When the prediction direction of the candidate block is unidirectional (the L0 direction or the L1 direction), the reference image and motion vector of that prediction direction are selected as the basis. When the prediction direction of the candidate block is bidirectional, the reference image and motion vector of either the L0 direction or the L1 direction are selected as the basis. Once the reference image and motion vector serving as the basis for generating the bidirectional motion information have been selected, the motion vectors of the temporally combined motion information candidate are calculated.
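The basis selection of S164 described above can be sketched as follows. The data layout is an assumption, and for a bidirectional candidate block this sketch arbitrarily prefers L0, since the text leaves the choice between L0 and L1 open.

```python
def select_col_basis(cand_block):
    """Pick the (reference picture, motion vector) pair used as the basis
    for the temporally combined candidate (S164). For a bidirectional
    candidate block this sketch arbitrarily prefers L0; the text leaves
    the choice between L0 and L1 open."""
    if cand_block.get('L0') is not None and cand_block.get('L1') is None:
        return cand_block['L0']            # unidirectional L0 candidate block
    if cand_block.get('L1') is not None and cand_block.get('L0') is None:
        return cand_block['L1']            # unidirectional L1 candidate block
    if cand_block.get('L0') is not None:   # bidirectional: pick one side
        return cand_block['L0']
    return None                            # no valid motion information
```

The returned pair (ColRefPic, mvCol) then feeds the scaling of Equation 1 below.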
 ここで、双方向動き情報生成の基準とする動きベクトルmvColと参照画像ColRefPicからの時間結合動き情報候補の動きベクトルmvL0t、mvL1tの算出手法について図67を用いて説明する。 Here, a method for calculating the motion vectors mvL0t and mvL1t of the temporally combined motion information candidate from the motion vector mvCol and the reference image ColRefPic serving as the basis for generating the bidirectional motion information will be described with reference to FIG. 67.
 ColPicとColRefPicの画像間距離をColDistとし、時間結合動き情報候補のL0方向の参照画像ColL0Picと処理対象画像CurPicの画像間距離をCurL0Dist、時間結合動き情報候補のL1方向の参照画像ColL1Picと処理対象画像CurPicの画像間距離をCurL1Distとすると、mvColをColDistとCurL0Dist、CurL1Distの距離比率でスケーリングした下記式1の動きベクトルを、時間結合動き情報候補の動きベクトルとする。なお、画像間距離の算出はPOCを用いて行われ、正負の符号を有する。
mvL0t=mvCol×CurL0Dist/ColDist
mvL1t=mvCol×CurL1Dist/ColDist  ・・・(式1)
 なお、図67のColPic、ColRefPic、ColL0Pic、ColL1Picは一例であってこれ以外の関係であってもよい。
Let ColDist be the inter-picture distance between ColPic and ColRefPic, CurL0Dist the inter-picture distance between the processing target picture CurPic and the reference picture ColL0Pic in the L0 direction of the temporally combined motion information candidate, and CurL1Dist the inter-picture distance between CurPic and the reference picture ColL1Pic in the L1 direction. The motion vectors of the temporally combined motion information candidate are then obtained by scaling mvCol by the distance ratios of CurL0Dist and CurL1Dist to ColDist, as in Equation 1 below. The inter-picture distances are calculated using POC and carry a positive or negative sign.
mvL0t = mvCol × CurL0Dist / ColDist
mvL1t = mvCol × CurL1Dist / ColDist    ... (Equation 1)
Note that ColPic, ColRefPic, ColL0Pic, and ColL1Pic in FIG. 67 are examples; other relationships may be used.
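Equation 1 can be evaluated directly from POC values. The sketch below is illustrative only: the sign convention (distance = current POC minus reference POC) and plain floating-point arithmetic are assumptions, and the fixed-point rounding an actual codec would apply is not reproduced.

```python
def scale_temporal_mv(mv_col, col_pic_poc, col_ref_poc,
                      cur_pic_poc, cur_l0_ref_poc, cur_l1_ref_poc):
    """Scale mvCol by the signed POC distance ratios of Equation 1.
    Sign convention (assumed): distance = POC of picture - POC of its reference."""
    col_dist = col_pic_poc - col_ref_poc        # ColDist (signed)
    cur_l0_dist = cur_pic_poc - cur_l0_ref_poc  # CurL0Dist (signed)
    cur_l1_dist = cur_pic_poc - cur_l1_ref_poc  # CurL1Dist (signed)
    mv_l0t = tuple(c * cur_l0_dist / col_dist for c in mv_col)  # mvL0t
    mv_l1t = tuple(c * cur_l1_dist / col_dist for c in mv_col)  # mvL1t
    return mv_l0t, mv_l1t

# Example: mvCol = (8, -4), ColPic at POC 16, ColRefPic at POC 8,
# CurPic at POC 12 with its L0 reference at POC 8 and L1 reference at POC 16.
mv_l0t, mv_l1t = scale_temporal_mv((8, -4), 16, 8, 12, 8, 16)
```

Because CurL1Dist is negative in this example (the L1 reference lies after CurPic in POC order), mvL1t points in the opposite direction from mvL0t, as expected for a bidirectional candidate.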
 100 動画像符号化装置、 101 予測ブロック画像取得部、 102 減算部、 103 予測誤差符号化部、 104 符号列生成部、 105 予測誤差復号部、 106 動き補償部、 107 加算部、 108 動きベクトル検出部、 109 動き情報生成部、 110 フレームメモリ、 111 動き情報メモリ、 120 差分ベクトル算出部、 121 結合動き情報決定部、 122 予測符号化モード決定部、 130 予測ベクトル候補リスト生成部、 131 予測ベクトル決定部、 132 減算部、 140 結合動き情報候補生成部、 141 結合動き情報選択部、 150 単方向動き情報候補リスト生成部、 151 第1結合動き情報候補リスト削減部、 152 双方向結合動き情報候補リスト生成部、 153 第2結合動き情報候補リスト削減部、 154 候補番号管理テーブル変更部、 155 L0方向動き情報候補リスト生成部、 156 L1方向動き情報候補リスト生成部、 160 基準方向決定部、 161 基準方向動き情報決定部、 162 逆方向動き情報決定部、 163 双方向動き情報決定部、 200 動画像復号装置、 201 符号列解析部、 202 予測誤差復号部、 203 加算部、 204 動き情報再生部、 205 動き補償部、 206 フレームメモリ、 207 動き情報メモリ、 210 符号化モード判定部、 211 動きベクトル再生部、 212 結合動き情報再生部、 220 予測ベクトル候補リスト生成部、 221 予測ベクトル決定部、 222 加算部、 230 結合動き情報候補生成部、 231 結合動き情報選択部。 DESCRIPTION OF SYMBOLS 100 moving image encoder, 101 prediction block image acquisition part, 102 subtraction part, 103 prediction error encoding part, 104 code sequence generation part, 105 prediction error decoding part, 106 motion compensation part, 107 addition part, 108 motion vector detection Unit, 109 motion information generation unit, 110 frame memory, 111 motion information memory, 120 difference vector calculation unit, 121 combined motion information determination unit, 122 prediction coding mode determination unit, 130 prediction vector candidate list generation unit, 131 prediction vector determination , 132 subtraction unit, 140 combined motion information candidate generation unit, 141 combined motion information selection unit, 150 unidirectional motion information candidate list generation unit, 151 first combined motion information candidate list reduction unit, 152 bidirectional combined motion information Complementary list generation unit, 153, second combined motion information candidate list reduction unit, 154, candidate number management table change unit, 155, L0 direction motion information candidate list generation unit, 156, L1 direction motion information candidate list generation unit, 160, reference direction determination unit, 161 
Reference direction motion information determination unit, 162 Reverse direction motion information determination unit, 163 Bidirectional motion information determination unit, 200 Video decoding device, 201 Code sequence analysis unit, 202 Prediction error decoding unit, 203 Addition unit, 204 Motion information reproduction , 205 motion compensation unit, 206 frame memory, 207 motion information memory, 210 encoding mode determination unit, 211 motion vector playback unit, 212 combined motion information playback unit, 220 prediction vector candidate list generation unit, 221 prediction vector determination unit, 222 adding unit 230 coupled motion information candidate generating unit, 231 coupled motion information selector.
 本発明は、動き補償予測を利用した動画像符号化および復号技術に利用できる。 The present invention can be used for a moving picture encoding and decoding technique using motion compensated prediction.

Claims (28)

  1.  動き補償予測を行う画像符号化装置であって、
     符号化対象ブロックに隣接する複数の符号化済みのブロックから、動きベクトルの情報と参照画像の情報とを少なくとも含む動き情報をそれぞれ1つまたは2つ持つ複数のブロックを選択して、選択されたブロックの動き情報から、動き補償予測に用いる動き情報の候補を含む候補リストを生成する候補リスト生成部と、
     前記候補に含まれる第1の候補から第1の予測リストの動き情報を取得する第1の動き情報取得部と、
     前記候補に含まれる第2の候補から第2の予測リストの動き情報を取得する第2の動き情報取得部と、
     前記第1の動き情報取得部により取得された前記第1の予測リストの動き情報と、前記第2の動き情報取得部により取得された前記第2の予測リストの動き情報を組み合わせて、動き情報の新たな候補を生成する選択候補生成部と、
     を備えることを特徴とする画像符号化装置。
    An image encoding device that performs motion compensation prediction, comprising:
    a candidate list generation unit that selects, from a plurality of encoded blocks adjacent to an encoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generates, from the motion information of the selected blocks, a candidate list including motion information candidates to be used for motion compensation prediction;
    a first motion information acquisition unit that acquires motion information of a first prediction list from a first candidate included in the candidates;
    a second motion information acquisition unit that acquires motion information of a second prediction list from a second candidate included in the candidates; and
    a selection candidate generation unit that generates a new candidate of motion information by combining the motion information of the first prediction list acquired by the first motion information acquisition unit and the motion information of the second prediction list acquired by the second motion information acquisition unit.
  2.  前記候補リスト生成部は、前記候補の数が、設定された最大数に満たない場合、前記選択候補生成部により生成された新たな候補を含めた候補リストを生成する、
     ことを特徴とする請求項1に記載の画像符号化装置。
    The candidate list generation unit generates a candidate list including a new candidate generated by the selection candidate generation unit when the number of candidates is less than a set maximum number.
    The image coding apparatus according to claim 1.
  3.  前記候補リスト生成部は、前記候補の数が、前記最大数を超えないように、前記選択候補生成部により生成された1つ以上の新たな候補を含めた候補リストを生成する、
     ことを特徴とする請求項2に記載の画像符号化装置。
    The candidate list generation unit generates a candidate list including one or more new candidates generated by the selection candidate generation unit so that the number of candidates does not exceed the maximum number;
    The image coding apparatus according to claim 2.
  4.  動き補償予測に用いる動き情報の候補を前記候補リスト内で特定するための候補特定情報を符号化する符号列生成部と、
     をさらに備えることを特徴とする請求項1から3のいずれかに記載の画像符号化装置。
    The image encoding device according to any one of claims 1 to 3, further comprising a code string generation unit that encodes candidate specifying information for specifying, in the candidate list, a motion information candidate to be used for motion compensation prediction.
  5.  前記候補リスト生成部は、前記選択候補生成部により生成された新たな候補に前記候補よりも大きな候補特定情報を割り当てる、
     ことを特徴とする請求項4に記載の画像符号化装置。
    The image encoding device according to claim 4, wherein the candidate list generation unit assigns, to the new candidate generated by the selection candidate generation unit, candidate specifying information larger than that of the existing candidates.
  6.  前記第1の予測リストおよび前記第2の予測リストは、異なる予測リストである、
     ことを特徴とする請求項1から5のいずれかに記載の画像符号化装置。
    The image encoding device according to any one of claims 1 to 5, wherein the first prediction list and the second prediction list are different prediction lists.
  7.  前記候補リスト生成部は、前記符号化対象ブロックを含む画像と時間的に異なる画像のブロックの動き情報から導出した動き情報を候補リストに含める、
     ことを特徴とする請求項1から6のいずれかに記載の画像符号化装置。
    The image encoding device according to any one of claims 1 to 6, wherein the candidate list generation unit includes, in the candidate list, motion information derived from motion information of a block of an image temporally different from the image including the encoding target block.
  8.  前記第1の動き情報取得部は、前記候補を第1の優先順に従って検索し、有効となる候補を前記第1の候補とし、
     前記第2の動き情報取得部は、前記候補を第2の優先順に従って検索し、有効となる候補を前記第2の候補とする、
     ことを特徴とする請求項1から7のいずれかに記載の画像符号化装置。
    The image encoding device according to any one of claims 1 to 7, wherein the first motion information acquisition unit searches the candidates in a first priority order and sets a valid candidate as the first candidate, and the second motion information acquisition unit searches the candidates in a second priority order and sets a valid candidate as the second candidate.
  9.  前記第1の動き情報取得部は、前記候補の中の予め定められた候補を前記第1の候補とし、
     前記第2の動き情報取得部は、前記候補の中の予め定められた別の候補を前記第2の候補とする、
     ことを特徴とする請求項1から7のいずれかに記載の画像符号化装置。
    The image encoding device according to any one of claims 1 to 7, wherein the first motion information acquisition unit sets a predetermined candidate among the candidates as the first candidate, and the second motion information acquisition unit sets another predetermined candidate among the candidates as the second candidate.
  10.  前記選択候補生成部は、前記第1の動き情報取得部および前記第2の動き情報取得部により取得された、前記第1の予測リストの動き情報および前記第2の予測リストの動き情報の両方が有効である場合、前記新たな候補を生成する、
     ことを特徴とする請求項1から9のいずれかに記載の画像符号化装置。
    The image encoding device according to any one of claims 1 to 9, wherein the selection candidate generation unit generates the new candidate when both the motion information of the first prediction list and the motion information of the second prediction list, acquired by the first motion information acquisition unit and the second motion information acquisition unit respectively, are valid.
  11.  前記新たな候補は、2つの動き情報を持つことを特徴とする請求項1から10のいずれかに記載の画像符号化装置。 The image encoding device according to any one of claims 1 to 10, wherein the new candidate has two pieces of motion information.
  12.  前記新たな候補は、1つの動き情報を持つ、
     ことを特徴とする請求項1から9のいずれかに記載の画像符号化装置。
    The image encoding device according to any one of claims 1 to 9, wherein the new candidate has one piece of motion information.
  13.  動き補償予測を行う画像符号化方法であって、
     符号化対象ブロックに隣接する複数の符号化済みのブロックから、動きベクトルの情報と参照画像の情報とを少なくとも含む動き情報をそれぞれ1つまたは2つ持つ複数のブロックを選択して、選択されたブロックの動き情報から、動き補償予測に用いる動き情報の候補を含む候補リストを生成するステップと、
     前記候補リストに含まれる第1の候補から第1の予測リストの動き情報を取得するステップと、
     前記候補リストに含まれる第2の候補から第2の予測リストの動き情報を取得するステップと、
     前記第1の予測リストの動き情報と、前記第2の予測リストの動き情報を組み合わせて、動き情報の新たな候補を生成するステップと、
     を備えることを特徴とする画像符号化方法。
    An image encoding method for performing motion compensation prediction, comprising:
    a step of selecting, from a plurality of encoded blocks adjacent to an encoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generating, from the motion information of the selected blocks, a candidate list including motion information candidates to be used for motion compensation prediction;
    a step of acquiring motion information of a first prediction list from a first candidate included in the candidate list;
    a step of acquiring motion information of a second prediction list from a second candidate included in the candidate list; and
    a step of generating a new candidate of motion information by combining the motion information of the first prediction list and the motion information of the second prediction list.
  14.  動き補償予測を行う画像符号化プログラムであって、
     符号化対象ブロックに隣接する複数の符号化済みのブロックから、動きベクトルの情報と参照画像の情報とを少なくとも含む動き情報をそれぞれ1つまたは2つ持つ複数のブロックを選択して、選択されたブロックの動き情報から、動き補償予測に用いる動き情報の候補を含む候補リストを生成する処理と、
     前記候補リストに含まれる第1の候補から第1の予測リストの動き情報を取得する処理と、
     前記候補リストに含まれる第2の候補から第2の予測リストの動き情報を取得する処理と、
     前記第1の予測リストの動き情報と、前記第2の予測リストの動き情報を組み合わせて、動き情報の新たな候補を生成する処理と、
     をコンピュータに実行させることを特徴とする画像符号化プログラム。
    An image encoding program for performing motion compensation prediction, the program causing a computer to execute:
    a process of selecting, from a plurality of encoded blocks adjacent to an encoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generating, from the motion information of the selected blocks, a candidate list including motion information candidates to be used for motion compensation prediction;
    a process of acquiring motion information of a first prediction list from a first candidate included in the candidate list;
    a process of acquiring motion information of a second prediction list from a second candidate included in the candidate list; and
    a process of generating a new candidate of motion information by combining the motion information of the first prediction list and the motion information of the second prediction list.
  15.  動き補償予測を行う画像復号装置であって、
     復号対象ブロックに隣接する複数の復号済みのブロックから、動きベクトルの情報と参照画像の情報とを少なくとも含む動き情報をそれぞれ1つまたは2つ持つ複数のブロックを選択して、選択されたブロックの動き情報から、動き補償予測に用いる動き情報の候補を含む候補リストを生成する候補リスト生成部と、
     前記候補に含まれる第1の候補から第1の予測リストの動き情報を取得する第1の動き情報取得部と、
     前記候補に含まれる第2の候補から第2の予測リストの動き情報を取得する第2の動き情報取得部と、
     前記第1の動き情報取得部により取得された前記第1の予測リストの動き情報と、前記第2の動き情報取得部により取得された前記第2の予測リストの動き情報を組み合わせて、動き情報の新たな候補を生成する選択候補生成部と、
     を備えることを特徴とする画像復号装置。
    An image decoding device that performs motion compensation prediction, comprising:
    a candidate list generation unit that selects, from a plurality of decoded blocks adjacent to a decoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generates, from the motion information of the selected blocks, a candidate list including motion information candidates to be used for motion compensation prediction;
    a first motion information acquisition unit that acquires motion information of a first prediction list from a first candidate included in the candidates;
    a second motion information acquisition unit that acquires motion information of a second prediction list from a second candidate included in the candidates; and
    a selection candidate generation unit that generates a new candidate of motion information by combining the motion information of the first prediction list acquired by the first motion information acquisition unit and the motion information of the second prediction list acquired by the second motion information acquisition unit.
  16.  前記候補リスト生成部は、前記候補の数が、設定された最大数に満たない場合、前記選択候補生成部により生成された新たな候補を含めた候補リストを生成する、
     ことを特徴とする請求項15に記載の画像復号装置。
    The candidate list generation unit generates a candidate list including a new candidate generated by the selection candidate generation unit when the number of candidates is less than a set maximum number.
    The image decoding apparatus according to claim 15.
  17.  前記候補リスト生成部は、前記候補の数が、前記最大数を超えないように、前記選択候補生成部により生成された1つ以上の新たな候補を含めた候補リストを生成する、
     ことを特徴とする請求項16に記載の画像復号装置。
    The candidate list generation unit generates a candidate list including one or more new candidates generated by the selection candidate generation unit so that the number of candidates does not exceed the maximum number;
    The image decoding apparatus according to claim 16.
  18.  動き補償予測に用いる動き情報の候補を前記候補リスト内で特定するための候補特定情報を復号する符号列解析部と、
     復号された前記候補特定情報を用いて、前記候補リストに含まれる候補の中から1つの候補を選択する選択部と、
     をさらに備えることを特徴とする請求項15から17のいずれかに記載の画像復号装置。
    The image decoding device according to any one of claims 15 to 17, further comprising:
    a code string analysis unit that decodes candidate specifying information for specifying, in the candidate list, a motion information candidate to be used for motion compensation prediction; and
    a selection unit that selects one candidate from the candidates included in the candidate list using the decoded candidate specifying information.
  19.  前記候補リスト生成部は、前記選択候補生成部により生成された新たな候補に前記候補よりも大きな候補特定情報を割り当てる、
     ことを特徴とする請求項18に記載の画像復号装置。
    The image decoding device according to claim 18, wherein the candidate list generation unit assigns, to the new candidate generated by the selection candidate generation unit, candidate specifying information larger than that of the existing candidates.
  20.  前記第1の予測リストおよび前記第2の予測リストは、異なる予測リストである、
     ことを特徴とする請求項15から19のいずれかに記載の画像復号装置。
    The image decoding device according to any one of claims 15 to 19, wherein the first prediction list and the second prediction list are different prediction lists.
  21.  前記候補リスト生成部は、前記復号対象ブロックを含む画像と時間的に異なる画像のブロックの動き情報から導出した動き情報を候補リストに含める、
     ことを特徴とする請求項15から20のいずれかに記載の画像復号装置。
    The image decoding device according to any one of claims 15 to 20, wherein the candidate list generation unit includes, in the candidate list, motion information derived from motion information of a block of an image temporally different from the image including the decoding target block.
  22.  前記第1の動き情報取得部は、前記候補を第1の優先順に従って検索し、有効となる候補を前記第1の候補とし、
     前記第2の動き情報取得部は、前記候補を第2の優先順に従って検索し、有効となる候補を前記第2の候補とする、
     ことを特徴とする請求項15から21のいずれかに記載の画像復号装置。
    The image decoding device according to any one of claims 15 to 21, wherein the first motion information acquisition unit searches the candidates in a first priority order and sets a valid candidate as the first candidate, and the second motion information acquisition unit searches the candidates in a second priority order and sets a valid candidate as the second candidate.
  23.  前記第1の動き情報取得部は、前記候補の中の予め定められた候補を前記第1の候補とし、
     前記第2の動き情報取得部は、前記候補の中の予め定められた別の候補を前記第2の候補とする、
     ことを特徴とする請求項15から21のいずれかに記載の画像復号装置。
    The image decoding device according to any one of claims 15 to 21, wherein the first motion information acquisition unit sets a predetermined candidate among the candidates as the first candidate, and the second motion information acquisition unit sets another predetermined candidate among the candidates as the second candidate.
  24.  前記選択候補生成部は、前記第1の動き情報取得部および前記第2の動き情報取得部により取得された、前記第1の予測リストの動き情報および前記第2の予測リストの動き情報の両方が有効である場合、前記新たな候補を生成する、
     ことを特徴とする請求項15から23のいずれかに記載の画像復号装置。
    The image decoding device according to any one of claims 15 to 23, wherein the selection candidate generation unit generates the new candidate when both the motion information of the first prediction list and the motion information of the second prediction list, acquired by the first motion information acquisition unit and the second motion information acquisition unit respectively, are valid.
  25.  前記新たな候補は、2つの動き情報を持つことを特徴とする請求項15から24のいずれかに記載の画像復号装置。 The image decoding device according to any one of claims 15 to 24, wherein the new candidate has two pieces of motion information.
  26.  前記新たな候補は、1つの動き情報を持つ、
     ことを特徴とする請求項15から23のいずれかに記載の画像復号装置。
    The image decoding device according to any one of claims 15 to 23, wherein the new candidate has one piece of motion information.
  27.  動き補償予測を行う画像復号方法であって、
     復号対象ブロックに隣接する複数の復号済みのブロックから、動きベクトルの情報と参照画像の情報とを少なくとも含む動き情報をそれぞれ1つまたは2つ持つ複数のブロックを選択して、選択されたブロックの動き情報から、動き補償予測に用いる動き情報の候補を含む候補リストを生成するステップと、
     前記候補に含まれる第1の候補から第1の予測リストの動き情報を取得するステップと、
     前記候補に含まれる第2の候補から第2の予測リストの動き情報を取得するステップと、
     前記第1の予測リストの動き情報と、前記第2の予測リストの動き情報を組み合わせて、動き情報の新たな候補を生成するステップと、
     を備えることを特徴とする画像復号方法。
    An image decoding method for performing motion compensation prediction, comprising:
    a step of selecting, from a plurality of decoded blocks adjacent to a decoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generating, from the motion information of the selected blocks, a candidate list including motion information candidates to be used for motion compensation prediction;
    a step of acquiring motion information of a first prediction list from a first candidate included in the candidates;
    a step of acquiring motion information of a second prediction list from a second candidate included in the candidates; and
    a step of generating a new candidate of motion information by combining the motion information of the first prediction list and the motion information of the second prediction list.
28.  An image decoding program for performing motion-compensated prediction, the program causing a computer to execute:
    a process of selecting, from a plurality of decoded blocks adjacent to the decoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generating, from the motion information of the selected blocks, a candidate list containing motion information candidates to be used for motion-compensated prediction;
    a process of acquiring motion information of a first prediction list from a first candidate included in the candidates;
    a process of acquiring motion information of a second prediction list from a second candidate included in the candidates; and
    a process of combining the motion information of the first prediction list and the motion information of the second prediction list to generate a new motion information candidate.
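The combining step recited in the claims above, pairing the first-prediction-list motion information of one candidate with the second-prediction-list motion information of another to form a new bi-predictive candidate, can be sketched as follows. This is an illustrative sketch under assumed data structures (`MotionInfo`, `Candidate`, `combine_bi_predictive` are names introduced here), not the patent's normative procedure.

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass(frozen=True)
class MotionInfo:
    mv: tuple      # motion vector (mvx, mvy)
    ref_idx: int   # reference image index

@dataclass(frozen=True)
class Candidate:
    l0: Optional[MotionInfo]  # motion information of the first prediction list
    l1: Optional[MotionInfo]  # motion information of the second prediction list

def combine_bi_predictive(candidates: List[Candidate],
                          max_candidates: int) -> List[Candidate]:
    """Append new bi-predictive candidates formed by taking the first
    prediction list (L0) from one existing candidate and the second
    prediction list (L1) from another, while the list has room."""
    result = list(candidates)
    n = len(candidates)
    for i in range(n):
        for j in range(n):
            if i == j or len(result) >= max_candidates:
                continue
            first, second = candidates[i], candidates[j]
            if first.l0 is None or second.l1 is None:
                continue  # the pair cannot supply both prediction lists
            new_candidate = Candidate(l0=first.l0, l1=second.l1)
            if new_candidate not in result:  # avoid duplicating an existing candidate
                result.append(new_candidate)
    return result
```

Because the new candidate reuses motion information already in the list, no extra motion data needs to be signaled in the bitstream; the decoder can derive the same combined candidates as the encoder.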
PCT/JP2012/004148 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program WO2013001803A1 (en)

Priority Applications (15)

Application Number Priority Date Filing Date Title
KR1020217021884A KR102365353B1 (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
KR1020207010416A KR102200578B1 (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
KR1020147002407A KR20140043242A (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
CN201280032664.5A CN103636218B (en) 2011-06-30 2012-06-27 Picture decoding apparatus and picture decoding method
KR1020227004619A KR102464103B1 (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
KR1020157019113A KR20150088909A (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
KR1020187028735A KR102004113B1 (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
KR1020217000106A KR102279115B1 (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
KR1020197020932A KR102103682B1 (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
US14/109,629 US9516314B2 (en) 2011-06-30 2013-12-17 Picture decoding device with a motion compensation prediction
US15/339,242 US9686564B2 (en) 2011-06-30 2016-10-31 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/422,694 US9854266B2 (en) 2011-06-30 2017-02-02 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/422,679 US9693075B2 (en) 2011-06-30 2017-02-02 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/422,656 US9681149B1 (en) 2011-06-30 2017-02-02 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/843,054 US10009624B2 (en) 2011-06-30 2017-12-15 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2011-146066 2011-06-30
JP2011146066 2011-06-30
JP2011-146065 2011-06-30
JP2011146065 2011-06-30
JP2012-143341 2012-06-26
JP2012143341A JP5678924B2 (en) 2011-06-30 2012-06-26 Image decoding apparatus, image decoding method, and image decoding program, and receiving apparatus, receiving method, and receiving program
JP2012143340A JP5807621B2 (en) 2011-06-30 2012-06-26 Image encoding device, image encoding method, image encoding program, transmission device, transmission method, and transmission program
JP2012-143340 2012-06-26

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/109,629 Continuation US9516314B2 (en) 2011-06-30 2013-12-17 Picture decoding device with a motion compensation prediction

Publications (1)

Publication Number Publication Date
WO2013001803A1 true WO2013001803A1 (en) 2013-01-03

Family

ID=47423722

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/004148 WO2013001803A1 (en) 2011-06-30 2012-06-27 Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program

Country Status (2)

Country Link
KR (2) KR102365353B1 (en)
WO (1) WO2013001803A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014514814A (en) * 2011-03-21 2014-06-19 クゥアルコム・インコーポレイテッド Bi-predictive merge mode based on uni-predictive neighbor in video coding
JP2016015787A (en) * 2011-04-12 2016-01-28 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Method for decoding moving pictures
US9456217B2 (en) 2011-05-24 2016-09-27 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US9456214B2 (en) 2011-08-03 2016-09-27 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus, and moving picture coding and decoding apparatus
US9560373B2 (en) 2011-05-31 2017-01-31 Sun Patent Trust Image coding method and apparatus with candidate motion vectors
US9609356B2 (en) 2011-05-31 2017-03-28 Sun Patent Trust Moving picture coding method and apparatus with candidate motion vectors
US9615107B2 (en) 2011-05-27 2017-04-04 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9723322B2 (en) 2011-05-27 2017-08-01 Sun Patent Trust Decoding method and apparatus with candidate motion vectors
US10887585B2 (en) 2011-06-30 2021-01-05 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
CN112514395A (en) * 2018-12-13 2021-03-16 Jvc建伍株式会社 Image decoding device, image decoding method, and image decoding program
US11218708B2 (en) 2011-10-19 2022-01-04 Sun Patent Trust Picture decoding method for decoding using a merging candidate selected from a first merging candidate derived using a first derivation process and a second merging candidate derived using a second derivation process
US11838514B2 (en) 2018-08-06 2023-12-05 Electronics And Telecommunications Research Institute Image encoding/decoding method and device, and recording medium storing bitstream

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP3637996B2 (en) 1997-03-28 2005-04-13 シャープ株式会社 Video encoding / decoding device using motion-compensated interframe prediction method capable of region integration
HUE044616T2 (en) * 2002-04-19 2019-11-28 Panasonic Ip Corp America Motion vector calculating method

Non-Patent Citations (4)

Title
HIDEKI TAKEHARA ET AL.: "Bi-derivative merge candidate", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-F372, 6TH MEETING, July 2011 (2011-07-01), TORINO, IT, pages 1 - 5 *
J. JUNG ET AL.: "Temporal MV predictor modification for MV-Comp", SKIP, DIRECT AND MERGE SCHEMES, JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-D164, 4TH MEETING, January 2011 (2011-01-01), DAEGU, KR, pages 1 - 5 *
JIAN-LIANG LIN ET AL.: "Improved Advanced Motion Vector Prediction", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-D125_R2, 4TH MEETING, January 2011 (2011-01-01), DAEGU, KR, pages 1 - 8 *
YUNFEI ZHENG ET AL.: "Extended Motion Vector Prediction for Bi predictive Mode", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-E343, 5TH MEETING, March 2011 (2011-03-01), GENEVA, pages 1 - 4 *

Cited By (58)

Publication number Priority date Publication date Assignee Title
JP2014514814A (en) * 2011-03-21 2014-06-19 クゥアルコム・インコーポレイテッド Bi-predictive merge mode based on uni-predictive neighbor in video coding
US9648334B2 (en) 2011-03-21 2017-05-09 Qualcomm Incorporated Bi-predictive merge mode based on uni-predictive neighbors in video coding
US9872036B2 (en) 2011-04-12 2018-01-16 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
JP2016015787A (en) * 2011-04-12 2016-01-28 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Method for decoding moving pictures
US9445120B2 (en) 2011-04-12 2016-09-13 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11917186B2 (en) 2011-04-12 2024-02-27 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11356694B2 (en) 2011-04-12 2022-06-07 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11012705B2 (en) 2011-04-12 2021-05-18 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10609406B2 (en) 2011-04-12 2020-03-31 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10536712B2 (en) 2011-04-12 2020-01-14 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10382774B2 (en) 2011-04-12 2019-08-13 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10178404B2 (en) 2011-04-12 2019-01-08 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11228784B2 (en) 2011-05-24 2022-01-18 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US10129564B2 (en) 2011-05-24 2018-11-13 Velos Media, LCC Decoding method and apparatuses with candidate motion vectors
US9456217B2 (en) 2011-05-24 2016-09-27 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US10484708B2 (en) 2011-05-24 2019-11-19 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US9826249B2 (en) 2011-05-24 2017-11-21 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US9615107B2 (en) 2011-05-27 2017-04-04 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11076170B2 (en) 2011-05-27 2021-07-27 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US10034001B2 (en) 2011-05-27 2018-07-24 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10200714B2 (en) 2011-05-27 2019-02-05 Sun Patent Trust Decoding method and apparatus with candidate motion vectors
US10212450B2 (en) 2011-05-27 2019-02-19 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US9838695B2 (en) 2011-05-27 2017-12-05 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11575930B2 (en) 2011-05-27 2023-02-07 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US11895324B2 (en) 2011-05-27 2024-02-06 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US11115664B2 (en) 2011-05-27 2021-09-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11979582B2 (en) 2011-05-27 2024-05-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9723322B2 (en) 2011-05-27 2017-08-01 Sun Patent Trust Decoding method and apparatus with candidate motion vectors
US10595023B2 (en) 2011-05-27 2020-03-17 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11570444B2 (en) 2011-05-27 2023-01-31 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9883199B2 (en) 2011-05-27 2018-01-30 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US10721474B2 (en) 2011-05-27 2020-07-21 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10708598B2 (en) 2011-05-27 2020-07-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9900613B2 (en) 2011-05-31 2018-02-20 Sun Patent Trust Image coding and decoding system using candidate motion vectors
US10652573B2 (en) 2011-05-31 2020-05-12 Sun Patent Trust Video encoding method, video encoding device, video decoding method, video decoding device, and video encoding/decoding device
US10645413B2 (en) 2011-05-31 2020-05-05 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US10951911B2 (en) 2011-05-31 2021-03-16 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US9609356B2 (en) 2011-05-31 2017-03-28 Sun Patent Trust Moving picture coding method and apparatus with candidate motion vectors
US11057639B2 (en) 2011-05-31 2021-07-06 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US11949903B2 (en) 2011-05-31 2024-04-02 Sun Patent Trust Image decoding method and image decoding apparatus using candidate motion vectors
US11917192B2 (en) 2011-05-31 2024-02-27 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US9819961B2 (en) 2011-05-31 2017-11-14 Sun Patent Trust Decoding method and apparatuses with candidate motion vectors
US9560373B2 (en) 2011-05-31 2017-01-31 Sun Patent Trust Image coding method and apparatus with candidate motion vectors
US10412404B2 (en) 2011-05-31 2019-09-10 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US11368710B2 (en) 2011-05-31 2022-06-21 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US11509928B2 (en) 2011-05-31 2022-11-22 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US10887585B2 (en) 2011-06-30 2021-01-05 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US11553202B2 (en) 2011-08-03 2023-01-10 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US9456214B2 (en) 2011-08-03 2016-09-27 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus, and moving picture coding and decoding apparatus
US10440387B2 (en) 2011-08-03 2019-10-08 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US10284872B2 (en) 2011-08-03 2019-05-07 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US11979598B2 (en) 2011-08-03 2024-05-07 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US10129561B2 (en) 2011-08-03 2018-11-13 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US11647208B2 (en) 2011-10-19 2023-05-09 Sun Patent Trust Picture coding method, picture coding apparatus, picture decoding method, and picture decoding apparatus
US11218708B2 (en) 2011-10-19 2022-01-04 Sun Patent Trust Picture decoding method for decoding using a merging candidate selected from a first merging candidate derived using a first derivation process and a second merging candidate derived using a second derivation process
US11838514B2 (en) 2018-08-06 2023-12-05 Electronics And Telecommunications Research Institute Image encoding/decoding method and device, and recording medium storing bitstream
CN112514395B (en) * 2018-12-13 2023-07-21 Jvc建伍株式会社 Image decoding device and method, and image encoding device and method
CN112514395A (en) * 2018-12-13 2021-03-16 Jvc建伍株式会社 Image decoding device, image decoding method, and image decoding program

Also Published As

Publication number Publication date
KR102365353B1 (en) 2022-02-23
KR20220025216A (en) 2022-03-03
KR20210091356A (en) 2021-07-21
KR102464103B1 (en) 2022-11-04

Similar Documents

Publication Publication Date Title
KR102200578B1 (en) Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
WO2013001803A1 (en) Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program
JP6135750B2 (en) Image encoding device, image encoding method, image encoding program, transmission device, transmission method, and transmission program
JP5720751B2 (en) Image decoding apparatus, image decoding method, and image decoding program, and receiving apparatus, receiving method, and receiving program
JP2013021613A (en) Image decoder, image decoding method and image decoding program
JP2013021612A (en) Image encoder, image encoding method and image encoding program
JP2013021572A (en) Image encoder, image encoding method, and image encoding program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12804320

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20147002407

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 12804320

Country of ref document: EP

Kind code of ref document: A1