WO2013001803A1 - Image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program


Info

Publication number
WO2013001803A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion information
candidate
prediction
candidates
list
Prior art date
Application number
PCT/JP2012/004148
Other languages
English (en)
Japanese (ja)
Inventor
Hideki Takehara (竹原 英樹)
Hiroya Nakamura (中村 博哉)
Shigeru Fukushima (福島 茂)
Original Assignee
JVC KENWOOD Corporation (株式会社JVCケンウッド)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2012143341A external-priority patent/JP5678924B2/ja
Priority claimed from JP2012143340A external-priority patent/JP5807621B2/ja
Priority to KR1020217000106A priority Critical patent/KR102279115B1/ko
Priority to KR1020147002407A priority patent/KR20140043242A/ko
Priority to KR1020217021884A priority patent/KR102365353B1/ko
Priority to KR1020227004619A priority patent/KR102464103B1/ko
Priority to KR1020187028735A priority patent/KR102004113B1/ko
Application filed by JVC KENWOOD Corporation
Priority to KR1020207010416A priority patent/KR102200578B1/ko
Priority to CN201280032664.5A priority patent/CN103636218B/zh
Priority to KR1020157019113A priority patent/KR20150088909A/ko
Priority to KR1020197020932A priority patent/KR102103682B1/ko
Publication of WO2013001803A1 publication Critical patent/WO2013001803A1/fr
Priority to US14/109,629 priority patent/US9516314B2/en
Priority to US15/339,242 priority patent/US9686564B2/en
Priority to US15/422,679 priority patent/US9693075B2/en
Priority to US15/422,656 priority patent/US9681149B1/en
Priority to US15/422,694 priority patent/US9854266B2/en
Priority to US15/843,054 priority patent/US10009624B2/en


Classifications

    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals, in particular:
    • H04N19/577 Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
    • H04N19/521 Processing of motion vectors for estimating the reliability of the determined motion vectors or motion vector field, e.g. for smoothing the motion vector field or for correcting motion vectors
    • H04N19/157 Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • H04N19/105 Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H04N19/147 Data rate or code amount at the encoder output according to rate distortion criteria
    • H04N19/176 Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N19/196 Adaptive coding characterised by the adaptation method, adaptation tool or adaptation type, being specially adapted for the computation of encoding parameters, e.g. by averaging previously computed encoding parameters
    • H04N19/423 Implementation details or hardware specially adapted for video compression or decompression, characterised by memory arrangements
    • H04N19/46 Embedding additional information in the video signal during the compression process
    • H04N19/513 Processing of motion vectors
    • H04N19/52 Processing of motion vectors by encoding by predictive encoding
    • H04N19/61 Transform coding in combination with predictive coding
    • H04N19/70 Characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H04N19/172 Adaptive coding characterised by the coding unit, the unit being an image region, the region being a picture, frame or field

Definitions

  • The present invention relates to moving image coding techniques that use motion compensated prediction, and in particular to an image encoding device, image encoding method, image encoding program, image decoding device, image decoding method, and image decoding program that encode or decode the motion information used in motion compensated prediction.
  • Motion compensated prediction divides the target image into fine blocks and, using a decoded image as a reference image, generates as a prediction signal the signal of the reference block reached by moving from the target block of the target image by the amount of motion indicated by a motion vector. Motion compensated prediction may be unidirectional, using one motion vector, or bidirectional, using two motion vectors.
  • To reduce the code amount of the motion vector, the motion vector of an encoded block adjacent to the processing target block is used as a prediction motion vector (also simply called a "prediction vector"), the difference between the motion vector of the processing target block and the prediction vector is obtained, and this difference vector is transmitted as the encoded vector, which improves compression efficiency.
  • MPEG-4 AVC improves the efficiency of motion compensated prediction over MPEG-2 by making the block sizes for motion compensated prediction finer and more diverse.
  • With smaller block sizes, however, the number of motion vectors increases, and the code amount of the encoded vectors becomes a problem.
  • Rather than simply using the motion vector of the block adjacent to the left of the processing target block as the prediction vector, the prediction vector is obtained as the median of the motion vectors of a plurality of adjacent blocks, which suppresses the increase in the code amount of the encoded vectors.
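  • As a rough illustration of this median rule, the prediction vector can be computed component-wise over three neighboring motion vectors. This is a simplified sketch: the actual MPEG-4 AVC derivation also handles unavailable neighbors and special partition cases, which are omitted here.

```python
def median_predictor(mv_a, mv_b, mv_c):
    """Component-wise median of three neighboring motion vectors
    (simplified sketch of the MPEG-4 AVC prediction vector rule)."""
    def median3(a, b, c):
        return sorted((a, b, c))[1]
    return (median3(mv_a[0], mv_b[0], mv_c[0]),
            median3(mv_a[1], mv_b[1], mv_c[1]))

def difference_vector(mv, pred):
    """The difference vector actually transmitted as the encoded vector."""
    return (mv[0] - pred[0], mv[1] - pred[1])
```

With neighbors (4, 0), (2, 3), and (8, 1), the prediction vector is (4, 1), so a block motion vector of (5, 2) is transmitted as the small difference (1, 1).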
  • Direct motion compensated prediction is known in MPEG-4 AVC. Direct motion compensated prediction generates a new motion vector by scaling the motion vector of the block at the same position as the processing target block in another, already encoded image according to the temporal distances between the target image and the two reference images, and thereby realizes motion compensated prediction without transmitting an encoded vector.
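  • The scaling step of direct motion compensated prediction can be sketched as follows. The fixed-point arithmetic and clipping used by real codecs are simplified to floating point here, and the distance parameters are illustrative, not the patent's notation.

```python
def scale_motion_vector(mv_col, tb, td):
    """Scale the co-located block's motion vector by the ratio of
    temporal distances, in the spirit of direct motion compensated
    prediction. tb: distance from the current picture to its reference;
    td: distance spanned by the co-located vector."""
    return (round(mv_col[0] * tb / td), round(mv_col[1] * tb / td))
```

For example, a co-located vector (8, -4) spanning two pictures is scaled to (4, -2) when the current picture is one picture away from its reference, so no encoded vector need be transmitted.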
  • Motion compensated prediction that uses the motion information of a block adjacent to the processing target block, so that no encoded vector need be transmitted, is also known (see, for example, Patent Document 1).
  • Direct motion compensated prediction, which does not transmit an encoded vector, exploits the continuity of motion between the processing target block and the block at the same position in another, already encoded image. Patent Document 1 likewise exploits the continuity of motion between the processing target block and a block adjacent to it. In this way, by reusing the motion information of other blocks, coding efficiency is improved without encoding motion information that includes a difference vector.
  • However, when the motion of the processing target block deviates from the motion of the adjacent blocks, or from the motion of the blocks around the same position in another, already encoded image, the motion information including the difference vector must still be encoded, and the improvement in coding efficiency is not fully realized.
  • The present invention has been made in view of these circumstances, and its object is to provide a technique for further improving the coding efficiency of motion information including motion vectors.
  • One aspect of the present invention is an image encoding device that performs motion compensated prediction. It includes: a candidate list generation unit (140) that selects, from a plurality of encoded blocks adjacent to the encoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generates a candidate list including motion information candidates used for motion compensated prediction from the motion information of the selected blocks; a first motion information acquisition unit (161) that acquires the motion information of a first prediction list from a first candidate included in the candidate list; a second motion information acquisition unit (162) that acquires the motion information of a second prediction list from a second candidate included in the candidate list; and a selection candidate generation unit (163) that generates a new motion information candidate by combining the first prediction list motion information acquired by the first motion information acquisition unit (161) and the second prediction list motion information acquired by the second motion information acquisition unit (162).
  • The candidate list generation unit (140) may generate a candidate list that includes the new candidates generated by the selection candidate generation unit (163) when the number of candidates is less than a set maximum number.
  • The candidate list generation unit (140) may generate a candidate list that includes one or more new candidates generated by the selection candidate generation unit (163), such that the number of candidates does not exceed the maximum number.
  • A code string generation unit (104) that encodes candidate specifying information for specifying the motion information candidate used for motion compensated prediction in the candidate list may be further included.
  • The candidate list generation unit (140) may assign candidate specifying information larger than that of the existing candidates to the new candidates generated by the selection candidate generation unit (163).
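  • A truncated unary code string, as mentioned in connection with FIGS. 8(a) to 8(c), gives shorter codewords to smaller indices, which is consistent with assigning the new candidates larger candidate specifying information. A minimal sketch, not the patent's exact binarization:

```python
def truncated_unary(index, max_candidates):
    """Truncated unary code string for a candidate index: `index` ones
    followed by a terminating '0', except that the largest index
    (max_candidates - 1) omits the terminating '0'."""
    if index == max_candidates - 1:
        return "1" * index
    return "1" * index + "0"
```

With five candidates, index 0 costs one bit while index 4 costs four, so frequently chosen candidates should receive the smaller indices.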
  • the first prediction list and the second prediction list may be different prediction lists.
  • the candidate list generation unit (140) may include motion information derived from motion information of a block of an image temporally different from the image including the encoding target block in the candidate list.
  • The first motion information acquisition unit (161) may search the candidates according to a first priority order and set a valid candidate as the first candidate.
  • The second motion information acquisition unit (162) may search the candidates according to a second priority order and set a valid candidate as the second candidate.
  • the first motion information acquisition unit (161) may use a predetermined candidate among the candidates as the first candidate.
  • the second motion information acquisition unit (162) may use another predetermined candidate among the candidates as the second candidate.
  • The selection candidate generation unit (163) may generate the new candidate when both the first prediction list motion information acquired by the first motion information acquisition unit (161) and the second prediction list motion information acquired by the second motion information acquisition unit (162) are valid.
  • the new candidate may have two pieces of motion information.
  • the new candidate may have one piece of motion information.
  • Another aspect of the present invention is an image encoding method.
  • This method is an image encoding method for performing motion compensation prediction, and each of motion information including at least motion vector information and reference image information from a plurality of encoded blocks adjacent to an encoding target block. Selecting a plurality of blocks having one or two and generating a candidate list including motion information candidates used for motion compensation prediction from the motion information of the selected blocks; and a first included in the candidate list Obtaining the motion information of the first prediction list from the candidates, obtaining the motion information of the second prediction list from the second candidates included in the candidate list, and the motion of the first prediction list Combining the information and the motion information of the second prediction list to generate a new candidate for motion information.
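  • The steps of this method can be sketched as follows. The data layout (dicts with 'L0'/'L1' entries) and the priority order of index pairs are illustrative assumptions for this sketch, not the patent's actual structures.

```python
def generate_combined_candidates(candidates, pairs, max_candidates):
    """Append new bidirectional candidates that take the first prediction
    list (L0) motion information of one existing candidate and the second
    prediction list (L1) motion information of another. A candidate is a
    dict {'L0': mv or None, 'L1': mv or None}; `pairs` is an assumed
    priority order of (first, second) index combinations."""
    out = list(candidates)
    for i0, i1 in pairs:
        if len(out) >= max_candidates:           # respect the maximum number
            break
        if i0 >= len(candidates) or i1 >= len(candidates):
            continue
        mv0, mv1 = candidates[i0]['L0'], candidates[i1]['L1']
        if mv0 is not None and mv1 is not None:  # both lists must be valid
            out.append({'L0': mv0, 'L1': mv1})
    return out
```

Starting from one L0-only and one L1-only candidate, the sketch produces a third, bidirectional candidate, so a bidirectional prediction can be selected without transmitting any new motion vector.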
  • Still another aspect of the present invention is an image decoding device that performs motion compensated prediction. It includes: a candidate list generation unit (230) that selects, from a plurality of decoded blocks adjacent to the decoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generates a candidate list including motion information candidates used for motion compensated prediction from the motion information of the selected blocks; a first motion information acquisition unit (161) that acquires the motion information of a first prediction list from a first candidate included in the candidate list; a second motion information acquisition unit (162) that acquires the motion information of a second prediction list from a second candidate included in the candidate list; and a selection candidate generation unit (163) that generates a new motion information candidate by combining the first prediction list motion information acquired by the first motion information acquisition unit (161) and the second prediction list motion information acquired by the second motion information acquisition unit (162).
  • The candidate list generation unit (230) may generate a candidate list that includes the new candidates generated by the selection candidate generation unit (163) when the number of candidates is less than a set maximum number.
  • The candidate list generation unit (230) may generate a candidate list that includes one or more new candidates generated by the selection candidate generation unit (163), such that the number of candidates does not exceed the maximum number.
  • A selection unit (231) that selects one candidate from the candidates included in the candidate list generated by the candidate list generation unit (230) may be further provided.
  • The candidate list generation unit (230) may assign candidate specifying information larger than that of the existing candidates to the new candidates generated by the selection candidate generation unit (163).
  • the first prediction list and the second prediction list may be different prediction lists.
  • the candidate list generation unit (230) may include motion information derived from motion information of a block of an image temporally different from the image including the decoding target block in the candidate list.
  • The first motion information acquisition unit (161) may search the candidates according to a first priority order and set a valid candidate as the first candidate.
  • The second motion information acquisition unit (162) may search the candidates according to a second priority order and set a valid candidate as the second candidate.
  • the first motion information acquisition unit (161) may use a predetermined candidate among the candidates as the first candidate.
  • the second motion information acquisition unit (162) may use another predetermined candidate among the candidates as the second candidate.
  • The selection candidate generation unit (163) may generate the new candidate when both the first prediction list motion information acquired by the first motion information acquisition unit (161) and the second prediction list motion information acquired by the second motion information acquisition unit (162) are valid.
  • the new candidate may have two pieces of motion information.
  • the new candidate may have one piece of motion information.
  • Another aspect of the present invention is an image decoding method.
  • This method is an image decoding method for performing motion compensated prediction, comprising: selecting, from a plurality of decoded blocks adjacent to the decoding target block, a plurality of blocks each having one or two pieces of motion information including at least motion vector information and reference image information, and generating a candidate list including motion information candidates used for motion compensated prediction from the motion information of the selected blocks; acquiring the motion information of a first prediction list from a first candidate included in the candidate list; acquiring the motion information of a second prediction list from a second candidate included in the candidate list; and generating a new motion information candidate by combining the motion information of the first prediction list and the motion information of the second prediction list.
  • According to the present invention, the coding efficiency of motion information including motion vectors can be further improved.
  • FIGS. 2(a) and 2(b) are diagrams for explaining coding blocks.
  • FIGS. 3(a) to 3(d) are diagrams for explaining prediction blocks. A diagram for explaining prediction block sizes. A diagram for explaining the predictive coding modes.
  • FIGS. 6(a) to 6(d) are diagrams for explaining the prediction directions of motion compensated prediction. A diagram for explaining an example of the syntax of a prediction block.
  • FIGS. 8(a) to 8(c) are diagrams for explaining the truncated unary code string of a merge index. A diagram for explaining the configuration of the moving image encoding device according to Embodiment 1 of the present invention.
  • Diagrams for explaining the method of managing motion information in the motion information memory. A diagram for explaining the configuration of the motion information generation unit. A diagram for explaining the configuration of the difference vector calculation unit. A diagram for explaining the spatial candidate block group. A diagram for explaining the temporal candidate block group. A diagram for explaining the configuration of the combined motion information determination unit. A diagram for explaining the configuration of the combined motion information candidate generation unit.
  • FIGS. 19(a) and 19(b) are diagrams for explaining the conversion from a merge candidate number to a merge index. A flowchart for explaining the operation.
  • A flowchart for explaining the operation of the motion information generation unit in FIG. 9.
  • A flowchart for explaining the operation of the difference vector calculation unit in FIG. 11.
  • A flowchart for explaining the operation of the combined motion information determination unit. FIG. 18 is a flowchart for explaining the operation of the bidirectional combined motion information candidate list generation unit in FIG. 16. A flowchart for explaining the operation.
  • FIGS. 31(a) to 31(c) are diagrams for explaining an extended example of determining the prediction direction of a bidirectional combined motion information candidate.
  • FIG. 33 is a diagram for explaining the configuration of the motion information reproduction unit in FIG. 32. A diagram for explaining the configuration of the motion vector reproduction unit.
  • A flowchart for explaining the operation of the motion information reproduction unit in FIG. 32. A flowchart for explaining the operation.
  • FIGS. 40(a) and 40(b) are diagrams for explaining the candidate number management table according to Modification 1.
  • A diagram for explaining another candidate number management table according to Modification 1 of Embodiment 1. A flowchart for explaining a derivation.
  • A flowchart for explaining the operation of the backward motion information determination unit according to Modification 3 of Embodiment 1.
  • A diagram for explaining the configuration of the combined motion information candidate generation unit according to Modification 4 of Embodiment 1. A diagram for explaining its operation.
  • FIGS. 49(a) and 49(b) are diagrams for explaining predetermined combinations of BD0 and BD1 according to Modification 6 of Embodiment 1.
  • A diagram for explaining a predetermined combination of BD0 and BD1 according to Modification 6 of Embodiment 1.
  • Diagrams (parts 1 to 3) for explaining the effects of Embodiment 1.
  • FIGS. 53(a) and 53(b) are diagrams for explaining the syntax for encoding the candidate number management table of Embodiment 2 into the encoded stream.
  • A diagram for explaining the candidate number management table according to Embodiment 3.
  • A diagram for explaining the configuration of the combined motion information candidate generation unit according to Embodiment 3.
  • A flowchart for explaining the operation of the combined motion information candidate generation unit according to Embodiment 3.
  • A flowchart for explaining the operation of the candidate number management table changing unit according to Embodiment 3.
  • FIGS. 60(a) and 60(b) are diagrams for explaining the candidate number management table of the candidate number management table changing unit according to Modification 1 of Embodiment 3.
  • A flowchart for explaining the operation of the candidate number management table changing unit according to Modification 2 of Embodiment 3.
  • A flowchart for explaining the operation of the candidate number management table changing unit according to Modification 3 of Embodiment 3.
  • A flowchart for explaining the operation of the reference direction motion information determination unit according to Embodiment 4.
  • A diagram for explaining the configuration of the combined motion information candidate generation unit according to Embodiment 5.
  • A diagram for explaining the candidate number management table according to Embodiment 6.
  • A flowchart for explaining the operation of the reference direction determination unit according to Embodiment 6. A diagram for explaining the method of calculating the motion vector mvL0t of a temporal combined motion information candidate.
  • The MPEG-2 video (ISO/IEC 13818-2) coding system was established as a general-purpose video compression coding system and is widely used in applications for storage media such as DVD and D-VHS (registered trademark) standard digital VTR magnetic tape, as well as in digital broadcasting.
  • In addition, H.264 (ISO/IEC 14496-10 in ISO/IEC, H.264 in ITU-T; hereinafter referred to as MPEG-4 AVC) has been established as an international standard.
  • an input image signal is divided into maximum coding block units as shown in FIG. 1, and the divided coding blocks are processed in a raster scan order.
  • A coding block has a hierarchical structure and can be turned into smaller coding blocks by successively dividing it into four equal parts in consideration of coding efficiency. The coding blocks produced by such a four-way division are encoded in zigzag scan order. A coding block that cannot be divided any further is called a minimum coding block.
  • An encoded block is the unit of encoding, and the maximum encoded block with a division count of zero is also an encoded block. In this embodiment, the maximum encoded block is 64 pixels × 64 pixels, and the minimum encoded block is 8 pixels × 8 pixels.
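As an illustration of the hierarchy described above, the edge length of an encoded block follows directly from the number of quad-divisions. The following sketch is our own illustration of that relationship, not code from the embodiment:

```python
# Edge length of an encoded block after repeated quad-division of the
# 64x64 maximum encoded block, down to the 8x8 minimum encoded block.
MAX_CB = 64  # maximum encoded block edge, in pixels
MIN_CB = 8   # minimum encoded block edge, in pixels

def coding_block_size(divisions):
    """Edge length after `divisions` successive divisions into four."""
    size = MAX_CB >> divisions  # each quad-division halves both edges
    if size < MIN_CB:
        raise ValueError("cannot divide below the minimum encoded block")
    return size
```

With this convention, division counts 0 through 3 yield edge lengths 64, 32, 16, and 8, matching the block sizes in FIGS. 2A and 2B.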
  • FIGS. 2A and 2B show an example of division of the maximum coding block.
  • the maximum encoded block is divided into ten encoded blocks.
  • CU0, CU1 and CU9 are 32 × 32 pixel encoded blocks,
  • CU2, CU3 and CU8 are 16 × 16 pixel encoded blocks, and
  • CU4, CU5, CU6 and CU7 are 8 × 8 pixel encoded blocks.
  • the encoded block is further divided into prediction blocks.
  • The prediction block division patterns are shown in FIGS. 3A to 3D. FIG. 3A shows 2N × 2N, in which the encoded block is not divided; FIG. 3B shows 2N × N, divided horizontally; FIG. 3C shows N × 2N, divided vertically; and FIG. 3D shows N × N, divided both horizontally and vertically.
  • The prediction block sizes range from a maximum of 64 pixels × 64 pixels (CU division count 0) to a minimum of 4 pixels × 4 pixels (CU division count 3), for a total of 13 prediction block sizes.
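The 13 prediction block sizes can be enumerated by combining each CU division count with the four division patterns of FIGS. 3A to 3D; the sketch below is illustrative only and simply reproduces that count:

```python
# Enumerate the distinct prediction block sizes obtained by applying the
# 2Nx2N, 2NxN, Nx2N, NxN division patterns at CU division counts 0..3.
def prediction_block_sizes():
    sizes = set()
    for div in range(4):      # CU division count 0..3
        n = 64 >> div         # encoded block edge (2N)
        h = n // 2            # half edge (N)
        for pattern in [(n, n), (n, h), (h, n), (h, h)]:
            sizes.add(pattern)
    return sorted(sizes)
```

Adjacent division counts share one size (e.g. 32 × 32 appears as N × N at count 0 and as 2N × 2N at count 1), which is why the total is 13 rather than 16.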
  • In this embodiment, the maximum encoded block is 64 pixels × 64 pixels and the minimum encoded block is 8 pixels × 8 pixels, but the present invention is not limited to this combination.
  • The prediction block division patterns are shown in FIGS. 3A to 3D, but the division is not limited to these as long as the encoded block is divided into one or more prediction blocks.
  • the prediction direction of motion compensation prediction and the number of encoded vectors can be switched by the block size of the prediction block.
  • an example of a predictive coding mode in which the prediction direction of motion compensation prediction and the number of coding vectors are associated will be briefly described with reference to FIG.
  • The predictive coding modes shown in FIG. 5 include a unidirectional mode (UniPred), in which the prediction direction of motion compensation prediction is unidirectional and the number of encoded vectors is 1,
  • a bidirectional mode (BiPred), in which the prediction direction of motion compensation prediction is bidirectional and the number of encoded vectors is 2,
  • a merge mode (MERGE), in which the prediction direction of motion compensation prediction is unidirectional or bidirectional and the number of encoded vectors is 0,
  • and an intra mode (Intra), a predictive coding mode in which motion compensation prediction is not performed.
  • the reference image used in the motion compensation prediction is encoded in the encoded stream together with the encoded vector as a reference image index.
  • the reference image index used in motion compensation prediction is a numerical value of 0 or more.
  • A plurality of reference images selectable by the reference image index are managed in a reference image list. If the prediction direction of motion compensation prediction is unidirectional, one reference image index is encoded; if it is bidirectional, a reference image index indicating the reference image in each prediction direction is encoded (see FIG. 5).
  • In HEVC, in order to improve the accuracy of the prediction vector, selecting an optimal prediction vector from among a plurality of prediction vector candidates and encoding a prediction vector index indicating the selected prediction vector is under consideration.
  • the prediction vector index is introduced. If the prediction direction of motion compensation prediction is unidirectional, one prediction vector index is encoded. If the prediction direction of motion compensation prediction is bidirectional, a prediction vector index indicating a prediction vector in each prediction direction is encoded. (See FIG. 5).
  • In HEVC, the above-described merge index (merge technique) is introduced. As shown in FIG. 5, one merge index is encoded when the predictive coding mode is the merge mode. If the motion information is bidirectional, the motion information includes motion vector information and reference image information for each prediction direction.
  • motion information possessed by a block that may be indicated by the merge index is referred to as a combined motion information candidate, and an aggregate of combined motion information candidates is referred to as a combined motion information candidate list.
  • FIG. 6A shows a case where the reference image (RefL0Pic) in the unidirectional direction and the L0 direction is at a time before the encoding target image (CurPic).
  • FIG. 6B shows a case where the reference image in the unidirectional direction and the L0 direction is at a time after the encoding target image.
  • the reference image in the L0 direction in FIGS. 6A and 6B may be replaced with a reference image in the L1 direction (RefL1Pic).
  • FIG. 6C shows a case where the reference image in the L0 direction is at a time before the encoding target image and the reference image in the L1 direction is at a time after the encoding target image.
  • FIG. 6D shows a case in which the reference image in the L0 direction and the reference image in the L1 direction are at a time before the encoding target image.
  • the reference image in the L0 direction in FIGS. 6C and 6D may be replaced with the reference image in the L1 direction (RefL1Pic), and the reference image in the L1 direction may be replaced with the reference image in the L0 direction.
  • the L0 direction and the L1 direction which are prediction directions of motion compensation prediction, can be indicated in either the forward direction or the backward direction in terms of time.
  • a plurality of reference images can exist in each of the L0 direction and the L1 direction, the reference image in the L0 direction is registered in the reference image list L0, and the reference image in the L1 direction is registered in the reference image list L1, The position of the reference image in the reference image list is designated by the reference image index in each prediction direction, and the reference image is determined.
  • The L0 direction is a prediction direction that uses motion information associated with a reference image registered in the reference image list L0, and the L1 direction is a prediction direction that uses motion information associated with a reference image registered in the reference image list L1.
  • For the prediction block, a merge flag (merge_flag), a merge index (merge_idx), a direction of motion compensation prediction (inter_pred_type), reference indices (ref_idx_l0 and ref_idx_l1), difference vectors (mvd_l0[0], mvd_l0[1], mvd_l1[0], mvd_l1[1]), and prediction vector indexes (mvp_idx_l0 and mvp_idx_l1) are encoded. [0] of a difference vector indicates the horizontal component and [1] the vertical component.
  • ref_idx_l0 and mvd_l0 [0], mvd_l0 [1], and mvp_idx_l0 are information regarding the L0 direction
  • ref_idx_l1 and mvd_l1 [0], mvd_l1 [1], and mvp_idx_l1 are information regarding the L1 direction.
  • inter_pred_type takes one of Pred_L0 (unidirectional, L0 direction), Pred_L1 (unidirectional, L1 direction), and Pred_BI (bidirectional).
  • the merge mode can transmit motion information with one merge index. Therefore, if the prediction errors in the merge mode (merge flag is 1) and non-merge mode (merge flag is 0) are about the same, the merge mode can more efficiently encode motion information. In other words, the efficiency of motion information encoding can be improved by increasing the selection rate of the merge mode.
  • In this embodiment, motion information is encoded with less information in the merge mode than in the non-merge mode, but the encoding is not limited to this.
  • the motion information may be only a difference vector.
  • NumMergeCands(), a function for calculating the number of merge candidates before merge index decoding (encoding), and NumMvpCands(), a function for calculating the number of prediction vector candidates before prediction vector index decoding (encoding), are provided. These functions are needed to obtain the number of candidates because the number of merge candidates and the number of prediction vector candidates change for each prediction block depending on the validity of the motion information of adjacent blocks. Note that the motion information of an adjacent block being valid means that the adjacent block is neither a block outside the region nor in intra mode, and the motion information of an adjacent block being invalid means that the adjacent block is a block outside the region or in intra mode.
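A candidate-count function in the spirit of NumMergeCands() might be sketched as follows. The block records and field names here are hypothetical, using the convention (described later for the motion information memory) that a reference image index of −1 marks a block without motion compensation prediction, and None models a block outside the region:

```python
# Count adjacent blocks that carry valid motion information. Blocks outside
# the region (None) and intra-mode blocks (both reference indices -1)
# contribute no candidates. Duplicate removal among candidates is omitted
# here for simplicity.
def is_motion_info_valid(block):
    return block is not None and (
        block["ref_idx_l0"] >= 0 or block["ref_idx_l1"] >= 0
    )

def num_merge_cands(candidate_blocks):
    return sum(1 for b in candidate_blocks if is_motion_info_valid(b))
```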
  • FIG. 8A shows the merge index code string using a Truncated Unary code string when the number of merge candidates is two,
  • FIG. 8B shows the merge index code string using a Truncated Unary code string when the number of merge candidates is three, and
  • FIG. 8C shows the merge index code string using a Truncated Unary code string when the number of merge candidates is four.
  • the encoding efficiency of the merge index improves as the number of merge candidates decreases. That is, the coding efficiency of the merge index can be improved by leaving candidates with high selectivity and reducing candidates with low selectivity. In addition, when the number of candidates is the same, the code amount of the smaller merge index is smaller, so that the encoding efficiency can be improved by assigning a small merge index to a candidate with a high selection rate.
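The code strings of FIGS. 8A to 8C can be generated with the standard Truncated Unary scheme; the sketch below is our illustration of that scheme, not code taken from the embodiment:

```python
# Truncated Unary code string for a merge index: `index` ones followed by a
# terminating zero, except that the last candidate omits the terminator
# (it is implied by the known number of candidates).
def truncated_unary(index, num_candidates):
    if not 0 <= index < num_candidates:
        raise ValueError("index out of range")
    max_index = num_candidates - 1
    if index == max_index:
        return "1" * index  # last candidate: no terminating "0" needed
    return "1" * index + "0"
```

Note that with a single candidate the code string is empty (no bits need to be sent), which is consistent with the observation above that fewer candidates improve merge index coding efficiency.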
  • POC (Picture Order Count) is used as the time information of an image.
  • POC is a counter indicating the display order of images, defined in MPEG-4 AVC.
  • When the display order of an image advances by 1, its POC also increases by 1. Therefore, the time difference (distance) between images can be obtained from the difference between their POCs.
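A minimal illustration of this property (the function name is ours):

```python
# The temporal distance between two images follows from their POC
# difference, since POC increases by 1 per display-order step.
def poc_distance(poc_a, poc_b):
    return abs(poc_a - poc_b)
```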
  • the degree of correlation between the motion information of the processing target block and the motion information of the block adjacent to the processing target block is high when the processing target block and the adjacent block have the same motion. For example, this is a case where the region including the processing target block and the adjacent block is translated.
  • the degree of correlation between the motion information of the processing target block and the motion information of the adjacent block also depends on the length of contact between the processing target block and the adjacent block.
  • The degree of correlation between the motion information of the processing target block and the motion information of a block at the same position as the processing target block on another decoded image (hereinafter referred to as the same-position block), as generally used in the temporal direct mode or the spatial direct mode, is high when the same-position block and the processing target block are in a stationary state.
  • FIG. 9 shows a configuration of moving picture coding apparatus 100 according to Embodiment 1 of the present invention.
  • the moving image encoding apparatus 100 is an apparatus that encodes a moving image signal in units of prediction blocks for performing motion compensation prediction. It is assumed that the coding block division, the prediction block size determination, and the prediction encoding mode determination are determined by the higher-order encoding control unit.
  • the moving picture encoding apparatus 100 is realized by hardware such as an information processing apparatus including a CPU (Central Processing Unit), a frame memory, a hard disk, and the like.
  • the moving image encoding apparatus 100 realizes functional components described below by operating the above components. Note that the position information of the prediction block to be processed, the prediction block size, and the prediction direction of motion compensated prediction are assumed to be shared in the video encoding device 100 and are not shown.
  • the moving image encoding apparatus 100 includes a prediction block image acquisition unit 101, a subtraction unit 102, a prediction error encoding unit 103, a code string generation unit 104, a prediction error decoding unit 105, a motion compensation unit 106, and an addition unit. 107, a motion vector detection unit 108, a motion information generation unit 109, a frame memory 110, and a motion information memory 111.
  • The prediction block image acquisition unit 101 acquires the image signal of the prediction block to be processed from the image signal supplied from the terminal 10, based on the position information and prediction block size of the prediction block, and supplies the image signal of the prediction block to the subtraction unit 102, the motion vector detection unit 108, and the motion information generation unit 109.
  • The subtraction unit 102 calculates a prediction error signal by subtracting the prediction signal supplied from the motion compensation unit 106 from the image signal supplied from the prediction block image acquisition unit 101, and supplies the prediction error signal to the prediction error encoding unit 103.
  • The prediction error encoding unit 103 performs processing such as quantization and orthogonal transformation on the prediction error signal supplied from the subtraction unit 102 to generate prediction error encoded data, and supplies the prediction error encoded data to the code string generation unit 104 and the prediction error decoding unit 105.
  • The code string generation unit 104 entropy-encodes, according to the syntax, the prediction error encoded data supplied from the prediction error encoding unit 103 together with the merge flag, merge candidate number, prediction direction of motion compensation prediction, reference image index, difference vector, and prediction vector index supplied from the motion information generation unit 109 to generate a code string, and supplies the code string to the terminal 11.
  • the merge candidate number is converted into a merge index to generate a code string.
  • the merge candidate number is a number indicating the selected combined motion information candidate. The conversion from the merge candidate number to the merge index will be described later.
  • As described above, a Truncated Unary code string is used for encoding the merge index and the prediction vector index, so the code string can be encoded with fewer bits as the number of candidates decreases.
  • The prediction error decoding unit 105 performs processing such as inverse quantization and inverse orthogonal transformation on the prediction error encoded data supplied from the prediction error encoding unit 103 to generate a prediction error signal, and supplies the prediction error signal to the addition unit 107.
  • the motion compensation unit 106 performs motion compensation on the reference image in the frame memory 110 indicated by the reference image index supplied from the motion information generation unit 109 based on the motion vector supplied from the motion information generation unit 109, and generates a prediction signal. Generate. If the prediction direction is bidirectional, the prediction signal is obtained by averaging the prediction signals in the L0 direction and the L1 direction.
  • the addition unit 107 adds the prediction error signal supplied from the prediction error decoding unit 105 and the prediction signal supplied from the motion compensation unit 106 to generate a decoded image signal, and supplies the decoded image signal to the frame memory 110. To do.
  • The motion vector detection unit 108 detects a motion vector, and a reference image index indicating the reference image, from the image signal supplied from the prediction block image acquisition unit 101 and image signals corresponding to a plurality of reference images, and supplies the motion vector and the reference image index to the motion information generation unit 109. If the prediction direction is bidirectional, a motion vector and a reference image index are detected for each of the L0 and L1 directions.
  • In a general motion vector detection method, an error evaluation value is calculated between the image signal of the target image and the image signal of the reference image displaced by a predetermined amount from the same position, and the displacement that minimizes the error evaluation value is taken as the motion vector.
  • As the error evaluation value, SAD (Sum of Absolute Differences), MSE (Mean Square Error), or the like is used.
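A full-search block-matching sketch along these lines, using SAD as the error evaluation value, might look as follows. Images are modeled as plain 2-D lists of pixel values, and all names are illustrative rather than part of the embodiment:

```python
# Exhaustive motion search: for each candidate displacement within a window,
# compute the SAD between the target block and the displaced reference block,
# and keep the displacement minimizing it.
def sad(block_a, block_b):
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def extract(img, x, y, w, h):
    # w x h block whose top-left corner is at (x, y)
    return [row[x:x + w] for row in img[y:y + h]]

def motion_search(target, ref, x, y, w, h, search_range):
    best = None
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            rx, ry = x + dx, y + dy
            if not (0 <= rx <= len(ref[0]) - w and 0 <= ry <= len(ref) - h):
                continue  # displaced block falls outside the reference image
            cost = sad(extract(target, x, y, w, h),
                       extract(ref, rx, ry, w, h))
            if best is None or cost < best[0]:
                best = (cost, (dx, dy))
    return best[1]  # displacement (motion vector) minimizing the SAD
```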
  • The motion information generation unit 109 generates a merge flag and a merge candidate number, or a difference vector and a prediction vector index, from the motion vector and reference image index supplied from the motion vector detection unit 108, the candidate block group supplied from the motion information memory 111, and the reference image indicated by the reference image index in the frame memory 110, and supplies the merge flag, merge candidate number, reference image index, difference vector, and prediction vector index to the code string generation unit 104, the motion compensation unit 106, and the motion information memory 111 as required. A detailed configuration of the motion information generation unit 109 will be described later.
  • The frame memory 110 stores the decoded image signal supplied from the addition unit 107. In addition, a predetermined number (one or more) of decoded images whose entire image has been decoded are stored as reference images.
  • the frame memory 110 supplies the stored reference image signal to the motion compensation unit 106 and the motion information generation unit 109.
  • a storage area for storing the reference image is controlled by a FIFO (First In First Out) method.
  • the motion information memory 111 stores the motion information supplied from the motion information generation unit 109 for a predetermined number of images in units of the minimum predicted block size.
  • the motion information of the adjacent block of the prediction block to be processed is set as a space candidate block group.
  • the motion information memory 111 uses the motion information of the block on the ColPic and the surrounding blocks at the same position as the prediction block to be processed as a time candidate block group.
  • the motion information memory 111 supplies the spatial candidate block group and the temporal candidate block group to the motion information generation unit 109 as candidate block groups.
  • the motion information memory 111 is synchronized with the frame memory 110 and is controlled by a FIFO (First In First Out) method.
  • ColPic is a decoded image different from the image containing the prediction block to be processed, and is stored in the frame memory 110 as a reference image. In this embodiment, ColPic is the reference image decoded immediately before.
  • ColPic is not limited to this as long as it is an encoded image; for example, the immediately preceding reference image in display order or the immediately following reference image in display order may be used. It can also be specified in the encoded stream.
  • FIG. 10 shows a state where the prediction block size to be processed is 16 pixels × 16 pixels.
  • the motion information of the prediction block is stored in the 16 memory areas indicated by hatching in FIG. 10.
  • When the predictive coding mode is the intra mode, (0, 0) is stored as the motion vector in the L0 and L1 directions, and “−1” is stored as the reference image index in the L0 and L1 directions.
  • The reference image index “−1” may be any value as long as it can be determined that the mode does not perform motion compensation prediction. Hereinafter, unless otherwise stated, the term “block” simply refers to the minimum prediction block unit. In the case of a block outside the region, as in the intra mode, (0, 0) is stored as the motion vector in the L0 and L1 directions, and “−1” is stored as the reference image index in the L0 and L1 directions.
  • Hereinafter, the LX direction (X is 0 or 1) being valid means that the reference image index in the LX direction is 0 or more, and the LX direction being invalid means that the reference image index in the LX direction is “−1”.
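These conventions can be captured in a small sketch; the record layout and field names below are hypothetical, chosen only to mirror the description above:

```python
# Motion information record for one minimum prediction block unit.
# An intra-mode block, or a block outside the region, stores motion vector
# (0, 0) and reference image index -1 in both the L0 and L1 directions.
INTRA_OR_OUTSIDE = {
    "mv_l0": (0, 0), "mv_l1": (0, 0),
    "ref_idx_l0": -1, "ref_idx_l1": -1,
}

def is_lx_valid(info, x):
    """The LX direction is valid when its reference image index is >= 0."""
    return info["ref_idx_l%d" % x] >= 0
```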
  • the motion information generation unit 109 includes a difference vector calculation unit 120, a combined motion information determination unit 121, and a predictive coding mode determination unit 122.
  • The terminal 12 is connected to the motion information memory 111, the terminal 13 to the motion vector detection unit 108, the terminal 14 to the frame memory 110, the terminal 15 to the prediction block image acquisition unit 101, the terminal 16 to the code string generation unit 104, the terminal 50 to the motion compensation unit 106, and the terminal 51 to the motion information memory 111.
  • The difference vector calculation unit 120 determines a prediction vector index from the candidate block group supplied from the terminal 12, the motion vector and reference image index supplied from the terminal 13, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15, and calculates a difference vector and a rate distortion evaluation value. It then supplies the reference image index, motion vector, difference vector, prediction vector index, and rate distortion evaluation value to the predictive coding mode determination unit 122.
  • the detailed configuration of the difference vector calculation unit 120 will be described later.
  • the combined motion information determination unit 121 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15. Then, the combined motion information determination unit 121 selects a combined motion information candidate from the generated combined motion information candidate list, determines a merge candidate number, calculates a rate distortion evaluation value, and combines the combined motion information candidate. Motion information, the merge candidate number, and the rate distortion evaluation value are supplied to the predictive coding mode determination unit 122. A detailed configuration of the combined motion information determination unit 121 will be described later.
  • The predictive coding mode determination unit 122 compares the rate distortion evaluation value supplied from the difference vector calculation unit 120 with the rate distortion evaluation value supplied from the combined motion information determination unit 121. If the former is less than the latter, the merge flag is set to “0”; otherwise, the merge flag is set to “1”. When the merge flag is “0”, the predictive coding mode determination unit 122 supplies the merge flag and the reference image index, difference vector, and prediction vector index supplied from the difference vector calculation unit 120 to the terminal 16, and supplies the motion vector and reference image index supplied from the difference vector calculation unit 120 to the terminal 50 and the terminal 51.
  • When the merge flag is “1”, the predictive coding mode determination unit 122 supplies the merge flag and the merge candidate number supplied from the combined motion information determination unit 121 to the terminal 16, and supplies the motion vector and reference image index of the motion information supplied from the combined motion information determination unit 121 to the terminal 50 and the terminal 51.
  • the difference vector calculation unit 120 includes a prediction vector candidate list generation unit 130, a prediction vector determination unit 131, and a subtraction unit 132.
  • the terminal 17 is connected to the predictive coding mode determination unit 122.
  • The prediction vector candidate list generation unit 130 is also installed in the moving picture decoding device 200, which decodes the code string generated by the moving picture encoding device 100 according to the first embodiment, so that the moving picture encoding device 100 and the moving picture decoding device 200 generate identical, consistent prediction vector candidate lists.
  • The prediction vector candidate list generation unit 130 deletes candidate blocks outside the region and candidate blocks in the intra mode from the candidate block group supplied from the terminal 12. Further, when there are a plurality of candidate blocks with identical motion vectors, one candidate block is kept and the rest are deleted. The prediction vector candidate list generation unit 130 generates a prediction vector candidate list from the candidate blocks remaining after deletion, and supplies the prediction vector candidate list to the prediction vector determination unit 131. The prediction vector candidate list generated in this way contains one or more prediction vector candidates without duplicates. For example, when there is no candidate block having a motion vector, the vector (0, 0) is added to the prediction vector candidate list. If the prediction direction is bidirectional, a prediction vector candidate list is generated and supplied for each of the L0 and L1 directions.
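The list construction just described (drop out-of-region and intra candidate blocks, keep one of each duplicated motion vector, fall back to (0, 0)) can be sketched as follows, modeling each candidate block simply as a motion vector or None; this is an illustration, not the embodiment's exact implementation:

```python
# Build a prediction vector candidate list for one direction.
# None models a candidate block outside the region or in intra mode.
def make_prediction_vector_candidates(candidate_blocks):
    candidates = []
    for mv in candidate_blocks:
        if mv is None:
            continue                 # no motion vector available
        if mv not in candidates:
            candidates.append(mv)    # keep only one of duplicate vectors
    if not candidates:
        candidates.append((0, 0))    # guarantee at least one candidate
    return candidates
```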
  • the prediction vector determination unit 131 selects an optimal prediction vector for the motion vector supplied from the terminal 13 from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 130.
  • The prediction vector determination unit 131 supplies the selected prediction vector to the subtraction unit 132, and supplies the reference image index and a prediction vector index, which is information indicating the selected prediction vector, to the terminal 17. If the prediction direction is bidirectional, an optimal prediction vector is selected and supplied for each of the L0 and L1 directions.
  • To select the optimal prediction vector, for each prediction vector candidate, a prediction error amount is calculated from the reference image supplied from the terminal 14 and the image signal supplied from the terminal 15, based on the motion vector. A rate distortion evaluation value is then calculated from the code amounts of the reference image index, the difference vector, and the prediction vector index together with the prediction error amount, and the prediction vector candidate that minimizes the rate distortion evaluation value is selected.
  • the subtraction unit 132 calculates a difference vector by subtracting the prediction vector supplied from the prediction vector determination unit 131 from the motion vector supplied from the terminal 13, and supplies the difference vector to the terminal 17. If the prediction direction is bidirectional, a difference vector is calculated and supplied for the L0 direction and the L1 direction.
  • the candidate block group includes a spatial candidate block group and a temporal candidate block group.
  • FIG. 13 shows the adjacent blocks of the prediction block to be processed when the prediction block size to be processed is 16 pixels × 16 pixels.
  • the space candidate block group is assumed to be five blocks of block A1, block C, block D, block B1, and block E shown in FIG.
  • Here, the spatial candidate block group is the five blocks of block A1, block C, block D, block B1, and block E, but the spatial candidate block group may be any set of at least one processed block adjacent to the prediction block to be processed and is not limited to these. For example, all of block A1, block A2, block A3, block A4, block B1, block B2, block B3, block B4, block C, block D, and block E may be spatial candidate blocks.
  • FIG. 14 shows the block in the prediction block on ColPic at the same position as the prediction block to be processed, and its peripheral blocks, when the prediction block size to be processed is 16 pixels × 16 pixels.
  • the time candidate block group includes two blocks, block H and block I6 shown in FIG.
  • Here, the time candidate block group is the two blocks, block H and block I6, on ColPic, but the time candidate block group may be any set of at least one block on a decoded image different from the image containing the prediction block to be processed and is not limited to these. For example, all of blocks I1 to I16, blocks A1 to A4, blocks B1 to B4, block C, block D, block E, blocks F1 to F4, blocks G1 to G4, and block H on ColPic may be time candidate blocks.
  • block A4 is referred to as block A
  • block B4 is referred to as block B.
  • blocks H and I6 are referred to as time blocks.
  • the combined motion information determination unit 121 includes a combined motion information candidate generation unit 140 and a combined motion information selection unit 141.
  • The combined motion information candidate generation unit 140 is also installed in the moving picture decoding device 200, which decodes the code string generated by the moving picture encoding device 100 according to the first embodiment, so that the moving picture encoding device 100 and the moving picture decoding device 200 generate identical, consistent combined motion information lists.
  • the combined motion information candidate generation unit 140 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12, and supplies the combined motion information candidate list to the combined motion information selection unit 141. A detailed configuration of the combined motion information candidate generation unit 140 will be described later.
  • The combined motion information selection unit 141 selects an optimal combined motion information candidate from the combined motion information candidate list supplied from the combined motion information candidate generation unit 140, and supplies the merge candidate number, which is information indicating the selected combined motion information candidate, to the terminal 17.
  • To select the optimal combined motion information candidate, for each combined motion information candidate, a prediction error amount is calculated from the image signal supplied from the terminal 15 and the reference image supplied from the terminal 14, obtained based on the motion vector and reference image index of the candidate. A rate distortion evaluation value is then calculated from the code amount of the merge candidate number and the prediction error amount, and the combined motion information candidate that minimizes the rate distortion evaluation value is selected.
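The selection rule can be sketched as a rate-distortion minimization; the additive cost model and the lambda weight below are illustrative, not the embodiment's exact evaluation:

```python
# Pick the merge candidate minimizing distortion + lambda * rate, where
# prediction_errors[i] is the prediction error amount of candidate i and
# index_bit_costs[i] the code amount of its merge candidate number
# (smaller merge indices cost fewer bits under Truncated Unary coding).
def select_merge_candidate(prediction_errors, index_bit_costs, lam=1.0):
    costs = [d + lam * r for d, r in zip(prediction_errors, index_bit_costs)]
    return min(range(len(costs)), key=costs.__getitem__)
```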
  • the candidate block group includes a spatial candidate block group and a temporal candidate block group.
  • the space candidate block group is assumed to be four blocks of block A4, block B4, block C, and block E shown in FIG.
  • Here, the spatial candidate block group is the four blocks of block A4, block B4, block C, and block E, but the spatial candidate block group may be any set of at least one processed block adjacent to the prediction block to be processed and is not limited to these.
  • the time candidate block group includes two blocks, block H and block I6 shown in FIG.
  • Here, the time candidate block group is the same as the time candidate block group supplied to the prediction vector candidate list generation unit 130, but the time candidate block group may be any set of zero or more blocks on a decoded image different from the image containing the prediction block to be processed and is not limited to these.
  • FIG. 16 shows a configuration of the combined motion information candidate generation unit 140.
  • the terminal 18 is connected to the combined motion information selection unit 141.
  • The combined motion information candidate generation unit 140 includes a unidirectional combined motion information candidate list generation unit 150, a first combined motion information candidate list reduction unit 151, a bidirectional combined motion information candidate list generation unit 152, and a second combined motion information candidate list reduction unit 153.
  • The unidirectional combined motion information candidate list generation unit 150 generates a first combined motion information candidate list from the candidate block group supplied from the terminal 12, and supplies the first combined motion information candidate list to the first combined motion information candidate list reduction unit 151.
  • The first combined motion information candidate list reduction unit 151 generates a second combined motion information candidate list by deleting all but one of any combined motion information candidates with overlapping motion information from the first combined motion information candidate list supplied from the unidirectional combined motion information candidate list generation unit 150, and supplies the second combined motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
  • The bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate list from the second combined motion information candidate list supplied from the first combined motion information candidate list reduction unit 151, combines the bidirectional combined motion information candidate list with the second combined motion information candidate list described above to generate a third combined motion information candidate list, and supplies the third combined motion information candidate list to the second combined motion information candidate list reduction unit 153.
  • a detailed configuration of the bidirectional combined motion information candidate list generation unit 152 will be described later.
  • The bidirectional combined motion information candidate list generation unit 152 is assumed to generate a bidirectional combined motion information candidate (BD0) whose reference direction is L0 and a bidirectional combined motion information candidate (BD1) whose reference direction is L1. Therefore, both BD0 and BD1 may be included in the bidirectional combined motion information candidate list described above.
  • The second combined motion information candidate list reduction unit 153 generates the combined motion information candidate list by deleting all but one of any combined motion information candidates with overlapping motion information from the third combined motion information candidate list supplied from the bidirectional combined motion information candidate list generation unit 152, and supplies the combined motion information candidate list to the terminal 18.
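The pipeline of units 150 to 153 can be sketched as below. This is an illustrative model only: candidates are represented as hashable tuples, the duplicate test is plain equality, and the bidirectional-generation step is stubbed out; none of these representations come from the specification.

```python
def remove_duplicates(candidates):
    """Reduction units 151 and 153: among candidates whose motion
    information overlaps, keep only the first occurrence."""
    seen, reduced = set(), []
    for cand in candidates:
        if cand not in seen:
            seen.add(cand)
            reduced.append(cand)
    return reduced

def build_combined_candidate_list(unidirectional_list, make_bidirectional):
    """Chain the four stages of FIG. 16."""
    first_list = list(unidirectional_list)                # unit 150 output
    second_list = remove_duplicates(first_list)           # unit 151
    bidirectional_list = make_bidirectional(second_list)  # unit 152
    third_list = second_list + bidirectional_list
    return remove_duplicates(third_list)                  # unit 153

# Toy example: candidates are (mvL0, mvL1) tuples; the second entry
# duplicates the first and is pruned by unit 151.
uni = [("A", None), ("A", None), (None, "B")]
combined = build_combined_candidate_list(
    uni, lambda lst: [("A", "B")]  # stand-in for unit 152 pairing L0 with L1
)
# combined keeps one copy of each unidirectional candidate plus the pair
```

The design point visible here is that deduplication runs twice: once before bidirectional generation (so pairs are built from distinct motion information) and once after (so a generated pair that equals an existing candidate is removed).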
  • A unidirectional combined motion information candidate is a motion information candidate of a candidate block used in the so-called merge technique, and is motion information obtained from one candidate block.
  • Bidirectional combined motion information is a feature of Embodiment 1: it is motion information obtained by combining two pieces of motion information from two candidate blocks. In this embodiment, one piece of motion information in the L0 direction and one piece in the L1 direction are used.
  • FIG. 17 shows a configuration of the bidirectional combined motion information candidate list generation unit 152.
  • the terminal 19 is connected to the first combined motion information candidate list reduction unit 151, and the terminal 20 is connected to the second combined motion information candidate list reduction unit 153.
  • the bidirectional combined motion information candidate list generation unit 152 includes a reference direction determination unit 160, a reference direction motion information determination unit 161, a reverse direction motion information determination unit 162, and a bidirectional motion information determination unit 163.
  • The reference direction determination unit 160 determines the reference direction of the bidirectional combined motion information candidate from the second combined motion information candidate list, and sends the reference direction and the second combined motion information candidate list supplied from the terminal 19 to the reference direction motion information determination unit 161.
  • The reference direction of the bidirectional combined motion information candidate (BD0) with the reference direction L0 is the L0 direction, and the reference direction of the bidirectional combined motion information candidate (BD1) with the reference direction L1 is the L1 direction.
  • the reference direction motion information determination unit 161 determines the reference direction motion vector and the reference image index of the bidirectional combined motion information candidate from the reference direction and the second combined motion information candidate list supplied from the reference direction determination unit 160, The reference direction, the motion vector in the reference direction, the reference image index, and the second combined motion information candidate list are sent to the backward direction motion information determination unit 162.
  • the backward direction motion information determination unit 162 determines the bidirectional combined motion information candidate from the reference direction, the reference direction motion vector and the reference image index, and the second combined motion information candidate list supplied from the reference direction motion information determination unit 161. A backward motion vector and a reference image index are determined.
  • the backward motion information determination unit 162 sends the motion vector in the reference direction and the reference image index, the backward motion vector and the reference image index, and the second combined motion information candidate list to the bidirectional motion information determination unit 163.
  • If the reference direction is the L0 direction, the reverse direction is the L1 direction; if the reference direction is the L1 direction, the reverse direction is the L0 direction.
  • The bidirectional motion information determination unit 163 determines a bidirectional combined motion information candidate from the reference direction motion vector and reference image index and the backward direction motion vector and reference image index supplied from the backward direction motion information determination unit 162. In addition, the bidirectional motion information determination unit 163 generates a third combined motion information candidate list from the second combined motion information candidate list, and sends the third combined motion information candidate list to the terminal 20.
  • Merge candidate numbers 0 to 6 correspond, respectively, to the combined motion information candidate (A) of block A, the combined motion information candidate (B) of block B, the combined motion information candidate (COL) of the time block, the combined motion information candidate (C) of block C, the combined motion information candidate (E) of block E, the bidirectional combined motion information candidate (BD0) with reference direction L0, and the bidirectional combined motion information candidate (BD1) with reference direction L1, which are included in the combined motion information candidate list.
  • the maximum number of combined motion information candidates included in the combined motion information candidate list is 7 (the maximum value of the merge index is 6).
  • The merge candidate numbers of the bidirectional combined motion information candidate (BD0) with reference direction L0 and the bidirectional combined motion information candidate (BD1) with reference direction L1 are assigned so as to be larger than the merge candidate numbers of the unidirectional combined motion information candidates.
  • Although the candidate number management table used in Embodiment 1 is shown in FIG. 18, it is only necessary that smaller merge candidate numbers be assigned to combined motion information candidates with higher selection rates, and the table is not limited to this.
  • the maximum number of combined motion information candidates included in the candidate number management table and the combined motion information candidate list is assumed to be shared in the moving picture coding apparatus 100, and is not illustrated.
  • conversion from the merge candidate number to the merge index will be described with reference to FIGS.
  • FIG. 19(a) shows the case in which the combined motion information candidate of block A, the combined motion information candidate of block B, the combined motion information candidate of the time block, the combined motion information candidate of block C, the combined motion information candidate of block E, the bidirectional combined motion information candidate with reference direction L0, and the bidirectional combined motion information candidate with reference direction L1 are all valid; in this case the merge candidate number becomes the merge index as it is.
  • FIG. 19(b) shows the case in which the combined motion information candidates include invalid blocks: the invalid merge candidate numbers are skipped, and merge indexes are assigned in ascending order of merge candidate number.
  • In this case, merge index 0 is converted to merge candidate number 0, merge index 1 to merge candidate number 2, merge index 2 to merge candidate number 3, merge index 3 to merge candidate number 5, and merge index 4 to merge candidate number 6.
  • The merge indexes of the bidirectional combined motion information candidate (BD0) with reference direction L0 and the bidirectional combined motion information candidate (BD1) with reference direction L1 are assigned so as to be larger than the merge indexes of the unidirectional combined motion information candidates.
  • In the moving picture decoding apparatus 200, the reverse conversion from the merge index to the merge candidate number is performed, so that the moving picture encoding apparatus 100 and the moving picture decoding apparatus 200 generate the same candidate number management table without contradiction.
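The skip-and-reassign conversion of FIG. 19 can be sketched as follows. The list-of-booleans representation of candidate validity is an assumption made only for this illustration.

```python
# Hypothetical sketch of the merge-candidate-number to merge-index
# conversion: invalid candidate numbers are skipped, and merge indexes
# are assigned in ascending order of merge candidate number.

def candidate_number_to_index(validity):
    """validity[n] is True when merge candidate number n is valid.
    Returns a dict mapping merge candidate number -> merge index."""
    mapping, next_index = {}, 0
    for number, valid in enumerate(validity):
        if valid:
            mapping[number] = next_index
            next_index += 1
    return mapping

# FIG. 19(b)-style example: candidate numbers 1 and 4 are invalid, so
# merge indexes 0..4 map to candidate numbers 0, 2, 3, 5, 6.
mapping = candidate_number_to_index([True, False, True, True, False, True, True])
```

Because the decoder derives the same validity from decoded data, inverting this mapping reproduces the merge candidate number from the merge index without any extra signaling.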
  • the prediction block image acquisition unit 101 acquires the image signal of the prediction block to be processed from the image signal supplied from the terminal 10 based on the position information of the prediction block and the prediction block size (S100).
  • the motion vector detection unit 108 detects a reference image index indicating a motion vector and a reference image from the image signal supplied from the predicted block image acquisition unit 101 and image signals corresponding to a plurality of reference images (S101).
  • the motion information generation unit 109 generates a merge candidate number or a difference vector and a prediction vector index from the motion vector and reference image index supplied from the motion vector detection unit 108 and the candidate block group supplied from the motion information memory 111. (S102).
  • the motion compensation unit 106 performs motion compensation on the reference image indicated by the reference image index in the frame memory 110 based on the motion vector supplied from the motion vector detection unit 108 to generate a prediction signal. If the prediction direction is bidirectional, an average of the prediction signals in the L0 direction and the L1 direction is generated as a prediction signal (S103).
  • the subtraction unit 102 calculates a difference between the image signal supplied from the prediction block image acquisition unit 101 and the prediction signal supplied from the motion compensation unit 106 to calculate a prediction error signal (S104).
  • the prediction error encoding unit 103 performs processing such as quantization and orthogonal transformation on the prediction error signal supplied from the subtraction unit 102 to generate prediction error encoded data (S105).
  • The code string generation unit 104 entropy-codes the prediction error encoded data supplied from the prediction error encoding unit 103 and the merge flag, merge candidate number, reference image index, difference vector, and prediction vector index supplied from the motion information generation unit 109, together with the prediction direction, according to the syntax to generate a code string (S106).
  • the addition unit 107 adds the prediction error signal supplied from the prediction error decoding unit 105 and the prediction signal supplied from the motion compensation unit 106 to generate a decoded image signal (S107).
  • the frame memory 110 stores the decoded image signal supplied from the adding unit 107 (S108).
  • The motion information memory 111 stores the motion vector supplied from the motion vector detection unit 108 for one image in units of the minimum prediction block size (S109).
  • The difference vector calculation unit 120 determines a prediction vector index from the candidate block group supplied from the terminal 12, the motion vector and reference image index supplied from the terminal 13, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15, and calculates a difference vector and a rate distortion evaluation value (S110).
  • The combined motion information determination unit 121 determines the merge candidate number from the candidate block group supplied from the terminal 12, the reference image supplied from the terminal 14, and the image signal supplied from the terminal 15, and calculates the rate distortion evaluation value (S111).
  • The predictive coding mode determination unit 122 compares the rate distortion evaluation value supplied from the difference vector calculation unit 120 with the rate distortion evaluation value supplied from the combined motion information determination unit 121, sets the merge flag to "0" if the former is smaller than the latter, and sets the merge flag to "1" otherwise (S112).
  • the prediction vector candidate list generation unit 130 excludes candidate blocks that are out of the region, candidate blocks that are in the intra mode, and candidate blocks that have overlapping motion vectors from the candidate block group supplied from the terminal 12. A prediction vector candidate list is generated. If the prediction direction is bidirectional, a prediction vector candidate list is generated for the L0 direction and the L1 direction (S120).
  • the prediction vector determination unit 131 selects an optimal prediction vector for the motion vector supplied from the terminal 13 from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 130. If the prediction direction is bidirectional, an optimal prediction vector is selected for the L0 direction and the L1 direction (S121). The subtraction unit 132 subtracts the prediction vector supplied from the prediction vector determination unit 131 from the motion vector supplied from the terminal 13 to calculate a difference vector. If the prediction direction is bidirectional, a difference vector is calculated for the L0 direction and the L1 direction (S122).
  • the combined motion information candidate generation unit 140 generates a combined motion information candidate list from the candidate block group supplied from the terminal 12 (S130).
  • the combined motion information selection unit 141 selects from the combined motion information candidate list supplied from the combined motion information candidate generation unit 140, the motion vector and reference image index supplied from the terminal 13, and the combined motion information optimal for the prediction direction. Is determined (S131).
  • The unidirectional combined motion information candidate list generation unit 150 generates a spatial combined motion information candidate list from the candidate blocks obtained by excluding out-of-region candidate blocks and intra-mode candidate blocks from the spatial candidate block group supplied from the terminal 12 (S140). The detailed operation of generating the spatial combined motion information candidate list will be described later.
  • The unidirectional combined motion information candidate list generation unit 150 generates a temporal combined motion information candidate list from the candidate blocks obtained by excluding out-of-region candidate blocks and intra-mode candidate blocks from the temporal candidate block group supplied from the terminal 12 (S141). The detailed operation of generating the temporal combined motion information candidate list will be described later.
  • the unidirectional combined motion information candidate list generation unit 150 combines the spatial combined motion information candidate list and the temporal combined motion information candidate list in the order of merge candidate numbers to generate a first combined motion information candidate list (S142).
  • The first combined motion information candidate list reduction unit 151 generates a second combined motion information candidate list by deleting all but one of any combined motion information candidates with overlapping motion information from the first combined motion information candidate list supplied from the unidirectional combined motion information candidate list generation unit 150 (S143).
  • the bidirectional combined motion information candidate list generating unit 152 generates a bidirectional combined motion information candidate list from the second combined motion information candidate list supplied from the first combined motion information candidate list reducing unit 151 (S144). The detailed operation of generating the bidirectional combined motion information candidate list will be described later.
  • The bidirectional combined motion information candidate list generation unit 152 combines the second combined motion information candidate list and the bidirectional combined motion information candidate list in the order of merge candidate numbers to generate a third combined motion information candidate list (S145).
  • The second combined motion information candidate list reduction unit 153 generates the combined motion information candidate list by deleting all but one of any combined motion information candidates with overlapping motion information from the third combined motion information candidate list supplied from the bidirectional combined motion information candidate list generation unit 152 (S146).
  • The spatial combined motion information candidate list includes motion information of at most four candidate blocks.
  • The following processing is repeated for block A, block B, block C, and block E, the four candidate blocks included in the spatial candidate block group (S150 to S153).
  • the validity of the candidate block is checked (S151).
  • a candidate block is valid when the candidate block is not out of the region and is not in intra mode. If the candidate block is valid (YES in S151), the motion information of the candidate block is added to the spatially combined motion information candidate list (S152). If the candidate block is not valid (NO in S151), step S152 is skipped.
  • As described above, the spatial combined motion information candidate list includes motion information of at most four candidate blocks.
  • The number of candidates in the spatial combined motion information candidate list may vary depending on the validity of the candidate blocks, and is not limited to this.
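The loop of S150 to S153 can be sketched as follows. The dictionary representation of a candidate block and its field names are illustrative assumptions; the real check tests whether the block is outside the region or coded in intra mode.

```python
# Hypothetical sketch of spatial candidate list construction (S150-S153).

def build_spatial_list(candidate_blocks):
    """Append the motion information of each valid candidate block
    (inside the region and not intra) in block order A, B, C, E."""
    spatial_list = []
    for block in candidate_blocks:                                 # S150-S153
        valid = not block["out_of_region"] and not block["intra"]  # S151
        if valid:
            spatial_list.append(block["motion_info"])              # S152
    return spatial_list

# Toy blocks A, B, C, E: B is intra and C is out of the region,
# so only the motion information of A and E enters the list.
blocks = [
    {"out_of_region": False, "intra": False, "motion_info": "mvA"},
    {"out_of_region": False, "intra": True,  "motion_info": "mvB"},
    {"out_of_region": True,  "intra": False, "motion_info": "mvC"},
    {"out_of_region": False, "intra": False, "motion_info": "mvE"},
]
spatial = build_spatial_list(blocks)
```

This shows why the list length varies with block validity: the output holds between zero and four entries.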
  • The temporal combined motion information candidate list includes motion information of at most one candidate block.
  • The following processing is repeated for the time blocks, the two candidate blocks included in the temporal candidate block group (S160 to S166).
  • The validity of the candidate block is checked (S161). A candidate block is valid when it is not out of the region and is not in intra mode. If the candidate block is valid (YES in S161), a temporal combined motion information candidate is generated and added to the list (steps S162 to S165), and the process ends. If the candidate block is not valid (NO in S161), the next candidate block is inspected (S166).
  • The prediction direction of the temporal combined motion information candidate is determined (S162).
  • Here, the prediction direction of the temporal combined motion information candidate is assumed to be bidirectional.
  • The reference images in the L0 direction and the L1 direction of the temporal combined motion information candidate are determined (S163).
  • Here, the reference image in the L0 direction is the reference image closest to the processing target image among the reference images in the L0 direction, and the reference image in the L1 direction is the reference image closest to the processing target image among the reference images in the L1 direction.
  • Although the reference image in the L0 direction and the reference image in the L1 direction are determined in this way here, it is only necessary to be able to determine a reference image in the L0 direction and a reference image in the L1 direction, and the present invention is not limited to this.
  • For example, the reference images in the L0 direction and the L1 direction may be encoded in the encoded stream, the reference image indexes in the L0 direction and the L1 direction may be set to 0, or the reference images most frequently used in the L0 direction and the L1 direction by the blocks adjacent to the processing target block may be used as the reference images in the respective directions.
  • The motion vector of the temporal combined motion information candidate is calculated (S164).
  • The temporal combined motion information candidate calculates bidirectional motion information on the basis of the reference image ColRefPic and the motion vector mvCol of a valid prediction direction in the motion information of the candidate block.
  • If the prediction direction of the candidate block is unidirectional (the L0 direction or the L1 direction), the reference image and motion vector of that prediction direction are selected as the basis.
  • If the prediction direction of the candidate block is bidirectional, the reference image and motion vector of either the L0 direction or the L1 direction are selected as the basis.
  • For example, the reference image and motion vector in the same time direction as ColPic may be selected as the basis, the selection may be based on whichever of the candidate block's reference images in the L0 direction or the L1 direction is closer in distance to ColPic, or the selection may be based on the direction in which the candidate block's motion vector in the L0 direction or the L1 direction intersects the processing target image.
  • Although the temporal combined motion information candidates are generated as described above, it is only necessary to be able to determine bidirectional motion information using the motion information of another encoded image, and the present invention is not limited to this.
  • For example, a motion vector scaled according to the distance between the reference image in each direction and the processing target image, as in direct motion compensation, may be used as the bidirectional motion vector. If the candidate block is invalid (NO in S163), the next candidate block is inspected (S165).
  • As described above, the temporal combined motion information candidate list includes motion information of at most one candidate block.
  • The number of candidates in the temporal combined motion information candidate list may vary depending on the validity of the candidate blocks, and is not limited to this.
  • the prediction direction, reference image, and motion vector determination method are not limited to these.
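The direct-mode-style scaling mentioned above can be sketched with picture order counts (POCs). The POC-distance formulation below is a common convention assumed here purely for illustration; it is not text taken from the specification.

```python
# Hypothetical sketch of temporal motion vector scaling: scale the
# collocated motion vector mvCol by the ratio of temporal distances
# tb (current picture to its reference) over td (collocated picture
# to the collocated reference ColRefPic), both measured in POC units.

def scale_temporal_mv(mv_col, poc_cur, poc_ref, poc_col, poc_col_ref):
    """Return mvCol scaled by tb/td; pass mvCol through when td == 0."""
    tb = poc_cur - poc_ref      # distance: current image -> its reference
    td = poc_col - poc_col_ref  # distance: ColPic -> ColRefPic
    if td == 0:
        return mv_col
    return (mv_col[0] * tb / td, mv_col[1] * tb / td)

# Collocated MV (8, -4) over distance td = 4, scaled to distance tb = 2.
mv = scale_temporal_mv((8, -4), poc_cur=6, poc_ref=4, poc_col=8, poc_col_ref=4)
```

In a real codec the division is usually replaced by fixed-point arithmetic with clipping, but the proportionality is the same.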
  • the reference direction determination unit 160 determines the reference direction of the bidirectional combined motion information candidate from the second combined motion information candidate list (S170).
  • The reference direction of the bidirectional combined motion information candidate (BD0) with the reference direction L0 is the L0 direction, and the reference direction of the bidirectional combined motion information candidate (BD1) with the reference direction L1 is the L1 direction.
  • the reference direction motion information determination unit 161 determines the reference direction motion vector and the reference image index of the bidirectional combined motion information candidate from the reference direction and the second combined motion information candidate list supplied from the reference direction determination unit 160 ( S171). The detailed operation of the reference direction motion information determination unit 161 will be described later.
  • The backward direction motion information determination unit 162 determines the backward direction motion vector and reference image index of the bidirectional combined motion information candidate from the reference direction, the reference direction motion vector and reference image index, and the second combined motion information candidate list supplied from the reference direction motion information determination unit 161 (S172). The detailed operation of the backward direction motion information determination unit 162 will be described later.
  • The bidirectional motion information determination unit 163 determines the prediction direction of the bidirectional combined motion information candidate from the reference direction, the reference direction motion vector and reference image index, and the backward direction motion vector and reference image index supplied from the backward direction motion information determination unit 162 (S173). The detailed operation of determining the prediction direction of the bidirectional combined motion information candidate will be described later.
  • the bidirectional motion information determination unit 163 checks the validity of the prediction direction of the bidirectional combined motion information candidate (S174). If the prediction direction of the bidirectional combined motion information candidate is valid (YES in S174), the bidirectional motion information determination unit 163 adds the bidirectional combined motion information candidate to the bidirectional combined motion information candidate list (S175). If the prediction direction of the bidirectional combined motion information candidate is invalid (NO in S174), step S175 is skipped.
  • The operation of the reference direction motion information determination unit 161 will be described using the flowchart of FIG. Assume that the LX direction (X is 0 or 1) is selected as the reference direction of the bidirectional combined motion information candidate. First, the validity of the reference direction LX is set to "0" (S190). Then, the following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S191 to S194). The validity of the LX direction of each combined motion information candidate is checked (S192).
  • If the LX direction of the combined motion information candidate is valid (YES in S192), the validity of the reference direction LX is set to "1", the motion vector and reference index in the reference direction are set to the motion vector and reference index of the LX direction of the combined motion information candidate, and the process ends (S193). If the LX direction of the combined motion information candidate is invalid (NO in S192), the next candidate is examined (S194).
  • Although all NCands combined motion information candidates are inspected here, the present invention is not limited to this.
  • For example, by fixing the number of inspections to a predetermined number such as 2 or 3, the amount of processing can be reduced, and by reducing the possibility of generating redundant bidirectional combined motion information candidates, the code amount of the merge index can also be reduced.
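The scan of S190 to S194, including the optional fixed inspection count, can be sketched as follows. The dictionary representation of per-direction motion information is an assumption made for this illustration only.

```python
# Hypothetical sketch of the reference direction scan (S190-S194):
# find the first candidate whose LX-direction motion information is valid.

def find_reference_direction_motion(second_list, lx, max_checks=None):
    """second_list[i][lx] is (mv, ref_idx), or None when the LX direction
    of candidate i is invalid. Returns (valid, mv, ref_idx, position)."""
    n = len(second_list) if max_checks is None else min(max_checks, len(second_list))
    for i in range(n):                   # S191-S194 loop
        entry = second_list[i].get(lx)   # S192: check validity of LX direction
        if entry is not None:
            mv, ref_idx = entry
            return True, mv, ref_idx, i  # S193: stop at the first valid hit
    return False, None, None, -1         # validity of LX stays "0"

cands = [{"L0": None,        "L1": ((1, 2), 0)},
         {"L0": ((3, 4), 1), "L1": None}]
result = find_reference_direction_motion(cands, "L0")
```

Passing `max_checks=2` or `3` models the fixed-inspection-count variant described above: fewer candidates are examined, trading completeness for lower processing cost.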
  • The reverse of the reference direction is set as the backward direction of the bidirectional combined motion information candidate. Assume that the LY direction (Y is 0 or 1) is selected as the backward direction. First, the validity of the backward direction LY is set to "0" (S200). Then, the following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S201 to S205).
  • Although all NCands combined motion information candidates are inspected here, the present invention is not limited to this.
  • For example, by fixing the number of inspections to a predetermined number such as 2 or 3, the amount of processing can be reduced, and by reducing the possibility of generating redundant bidirectional combined motion information candidates, the code amount of the merge index can also be reduced.
  • By setting the block at which the inspection starts to the combined motion information candidate next to the combined motion information candidate selected for the reference direction, the possibility that BD0 and BD1 are the same can be eliminated, and step S202 can be omitted.
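The backward-direction scan of S200 to S205, with the start-offset option just described, can be sketched as follows. As before, the per-direction dictionary representation is an illustrative assumption.

```python
# Hypothetical sketch of the backward direction scan (S200-S205): find
# the first candidate whose LY-direction motion information is valid.
# Starting at `start` (the position after the reference-direction hit)
# models the optimization that keeps BD0 and BD1 from coinciding.

def find_backward_direction_motion(second_list, ly, start=0):
    """second_list[i][ly] is (mv, ref_idx), or None when the LY direction
    of candidate i is invalid. Returns (valid, mv, ref_idx)."""
    for i in range(start, len(second_list)):
        entry = second_list[i].get(ly)  # check validity of LY direction
        if entry is not None:
            mv, ref_idx = entry
            return True, mv, ref_idx    # stop at the first valid hit
    return False, None, None            # validity of LY stays "0"

cands = [{"L0": ((1, 1), 0), "L1": ((2, 2), 0)},
         {"L0": None,        "L1": ((5, 6), 1)}]
# Suppose the reference direction L0 was satisfied by candidate 0;
# starting the LY scan at position 1 skips candidate 0's L1 entry.
found = find_backward_direction_motion(cands, "L1", start=1)
```

With `start=0` the scan would instead return candidate 0's L1 motion information, which illustrates what the duplicate check of step S202 would otherwise have to catch.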
  • If both the LX direction and the LY direction are valid, the prediction direction is bidirectional (BI). If only the LX direction is valid, the prediction direction is the unidirectional LX direction. If only the LY direction is valid, the prediction direction is the unidirectional LY direction. If both the LX direction and the LY direction are invalid, the prediction direction is invalid. That is, when both the LX direction and the LY direction are valid, a new bidirectional combined motion information candidate is generated by combining a combined motion information candidate having motion information in the LX direction with another combined motion information candidate having motion information in the LY direction.
  • If the prediction direction of the combined motion information candidate having the valid LX direction is bi-prediction, its prediction direction is converted to uni-prediction.
  • Likewise, if the prediction direction of the combined motion information candidate having the valid LY direction is bi-prediction, its prediction direction is converted to uni-prediction.
  • FIGS. 31(a) to 31(c) show extended examples of determining the prediction direction of the bidirectional combined motion information candidate. For example, if at least one of the LX direction and the LY direction is invalid, the prediction direction may be invalidated as shown in FIG. 31(a), or the prediction direction may be forced to be bidirectional as shown in FIGS. 31(b) and 31(c).
  • In general, the prediction efficiency of bidirectional prediction is higher than that of unidirectional prediction. Therefore, as in FIG. 31(a), when the LX direction and the LY direction are not both valid, the prediction direction of the bidirectional combined motion information candidate is invalidated, and the code amount of the merge index can be reduced by reducing the number of combined motion information candidates.
  • the adaptive processing may be performed so as to invalidate the prediction direction of the bidirectional combined motion information candidate.
  • In FIGS. 31 (b) and 31 (c), the motion vector in the invalid prediction direction is set to (0, 0) and the reference index is set to "0".
  • In this way, the bidirectional combined motion information candidate can be forced to be bidirectional, using the reference image at the shortest distance as a prediction signal. This is because the reference index "0" generally indicates the reference image closest to the processing target image, and the prediction signal from the shortest distance has the highest reliability.
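The prediction-direction rules above can be summarized in a short sketch. This is an illustrative Python reconstruction under the stated rules, not code from the embodiment; the function name and return convention are assumptions.

```python
# Hedged sketch of the prediction-direction decision for a bidirectional
# combined motion information candidate (FIG. 31 variants). All names
# are illustrative, not taken from any reference implementation.

def decide_direction(lx_valid, ly_valid, force_bidirectional=False):
    """Return (direction, fill_lx, fill_ly): direction is 'BI', 'LX',
    'LY' or None; fill_* tells the caller to substitute motion vector
    (0, 0) with reference index 0 in that direction."""
    if lx_valid and ly_valid:
        return ('BI', False, False)
    if force_bidirectional:
        # FIGS. 31 (b) and 31 (c): force bi-prediction, filling each
        # invalid direction with MV (0, 0) and reference index 0.
        return ('BI', not lx_valid, not ly_valid)
    if lx_valid:
        return ('LX', False, False)
    if ly_valid:
        return ('LY', False, False)
    # FIG. 31 (a): both directions invalid -> candidate is invalid.
    return (None, False, False)
```

Filling with reference index 0 corresponds to the nearest reference image mentioned above, which is why the forced bi-prediction signal remains reliable.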
  • FIG. 32 shows a moving picture decoding apparatus 200 according to the first embodiment.
  • the video decoding device 200 is a device that generates a playback image by decoding the code string encoded by the video encoding device 100.
  • the video decoding device 200 is realized by hardware such as an information processing device including a CPU (Central Processing Unit), a frame memory, and a hard disk.
  • the moving picture decoding apparatus 200 realizes functional components described below by operating the above components. Note that the position information and the prediction block size of the prediction block to be decoded are shared in the video decoding device 200 and are not shown. Further, the maximum number of combined motion information candidates included in the candidate number management table and the combined motion information candidate list is assumed to be shared in the moving image decoding apparatus 200 and is not illustrated.
  • the moving picture decoding apparatus 200 includes a code string analysis unit 201, a prediction error decoding unit 202, an addition unit 203, a motion information reproduction unit 204, a motion compensation unit 205, a frame memory 206, and a motion information memory 207.
  • The code string analysis unit 201 decodes the code string supplied from the terminal 30 into prediction error encoded data, a merge flag, a merge candidate number, a prediction direction of motion compensation prediction, a reference image index, a difference vector, and a prediction vector index according to the syntax. It then supplies the prediction error encoded data to the prediction error decoding unit 202, and supplies the merge flag, the merge candidate number, the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index, as motion information, to the motion information reproduction unit 204.
  • the merge candidate number is obtained by conversion from the merge index.
  • the prediction error decoding unit 202 performs a process such as inverse quantization or inverse orthogonal transform on the prediction error encoded data supplied from the code string analysis unit 201 to generate a prediction error signal, and the prediction error signal is It supplies to the addition part 203.
  • the adding unit 203 adds the prediction error signal supplied from the prediction error decoding unit 202 and the prediction signal supplied from the motion compensation unit 205 to generate a decoded image signal, and the decoded image signal is stored in the frame memory 206 and Supply to terminal 31.
  • The motion information reproduction unit 204 reproduces motion information from the merge flag, merge candidate number, prediction direction of motion compensation prediction, reference image index, difference vector, and prediction vector index supplied from the code string analysis unit 201, and from the candidate block group supplied from the motion information memory 207, and supplies the motion information to the motion compensation unit 205. A detailed configuration of the motion information reproduction unit 204 will be described later.
  • the motion compensation unit 205 performs motion compensation on the reference image indicated by the reference image index in the frame memory 206 based on the motion information supplied from the motion information reproduction unit 204, and generates a prediction signal. If the prediction direction is bidirectional, an average of the prediction signals in the L0 direction and the L1 direction is generated as a prediction signal, and the prediction signal is supplied to the adding unit 203.
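The bidirectional averaging performed by the motion compensation unit 205 can be sketched as follows. Plain lists stand in for sample blocks, and the rounding rule shown is an assumption (the document only says "average").

```python
# Minimal sketch of bi-prediction signal generation: when the prediction
# direction is bidirectional, the prediction signal is the sample-wise
# average of the L0 and L1 motion-compensated signals.

def bi_prediction(pred_l0, pred_l1):
    # Integer average with rounding (the exact rounding is assumed here).
    return [(a + b + 1) >> 1 for a, b in zip(pred_l0, pred_l1)]
```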
  • the frame memory 206 and the motion information memory 207 have the same functions as the frame memory 110 and the motion information memory 111 of the moving picture coding apparatus 100.
  • FIG. 33 shows the configuration of the motion information playback unit 204.
  • the motion information playback unit 204 includes an encoding mode determination unit 210, a motion vector playback unit 211, and a combined motion information playback unit 212.
  • the terminal 32 is connected to the code string analysis unit 201, the terminal 33 is connected to the motion information memory 207, and the terminal 34 is connected to the motion compensation unit 205.
  • If the merge flag supplied from the code string analysis unit 201 is "0", the encoding mode determination unit 210 supplies the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the code string analysis unit 201 to the motion vector reproduction unit 211. If the merge flag is "1", it supplies the merge candidate number supplied from the code string analysis unit 201 to the combined motion information reproduction unit 212.
  • The motion vector reproduction unit 211 reproduces motion information from the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the encoding mode determination unit 210, and from the candidate block group supplied from the terminal 33, and supplies the motion information to the terminal 34. A detailed configuration of the motion vector reproduction unit 211 will be described later.
  • the combined motion information reproduction unit 212 reproduces the motion information from the merge candidate number supplied from the encoding mode determination unit 210 and the candidate block group supplied from the terminal 33 and supplies the motion information to the terminal 34. A detailed configuration of the combined motion information reproducing unit 212 will be described later.
  • the motion vector reproduction unit 211 includes a prediction vector candidate list generation unit 220, a prediction vector determination unit 221, and an addition unit 222.
  • the terminal 35 is connected to the encoding mode determination unit 210.
  • the prediction vector candidate list generation unit 220 has the same function as the prediction vector candidate list generation unit 130 of the video encoding device 100.
  • the prediction vector determination unit 221 determines a prediction vector from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 220 and the prediction vector index supplied from the terminal 35, and supplies the prediction vector to the addition unit 222.
  • the addition unit 222 adds the difference vector supplied from the terminal 35 and the prediction vector supplied from the prediction vector determination unit 221 to calculate a motion vector, and supplies the motion vector to the terminal 34.
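The addition performed by the adding unit 222 amounts to a component-wise vector sum; a minimal sketch with illustrative names:

```python
# Sketch of motion vector reproduction: the decoded difference vector is
# added to the prediction vector selected by the prediction vector index.

def reproduce_motion_vector(pred_vector_list, pred_vector_index, diff_vector):
    pvx, pvy = pred_vector_list[pred_vector_index]
    dvx, dvy = diff_vector
    return (pvx + dvx, pvy + dvy)
```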
  • the combined motion information reproduction unit 212 includes a combined motion information candidate generation unit 230 and a combined motion information selection unit 231.
  • the combined motion information candidate generation unit 230 has the same function as the combined motion information candidate generation unit 140 shown in FIG. Based on the combined motion information candidate list supplied from the combined motion information candidate generation unit 230 and the merge candidate number supplied from the terminal 35, the combined motion information selection unit 231 selects motion information from the combined motion information candidate list. The motion information is selected and supplied to the terminal 34.
  • The code string analysis unit 201 decodes the code string supplied from the terminal 30 into prediction error encoded data, a merge flag, a merge candidate number, a prediction direction of motion compensation prediction, a reference image index, a difference vector, and a prediction vector index according to the syntax (S210).
  • The motion information reproduction unit 204 reproduces motion information from the merge flag, merge candidate number, prediction direction of motion compensation prediction, reference image index, difference vector, and prediction vector index supplied from the code string analysis unit 201, and from the candidate block group supplied from the motion information memory 207 (S211).
  • the motion compensation unit 205 performs motion compensation on the reference image indicated by the reference image index in the frame memory 206 based on the motion information supplied from the motion information reproduction unit 204, and generates a prediction signal. If the prediction direction is bidirectional, an average of the prediction signals in the L0 direction and the L1 direction is generated as a prediction signal (S212).
  • the prediction error decoding unit 202 performs a process such as inverse quantization or inverse orthogonal transform on the prediction error encoded data supplied from the code string analysis unit 201 to generate a prediction error signal (S213).
  • the adding unit 203 adds the prediction error signal supplied from the prediction error decoding unit 202 and the prediction signal supplied from the motion compensation unit 205 to generate a decoded image signal (S214).
  • the frame memory 206 stores the decoded image signal supplied from the adding unit 203 (S215).
  • the motion information memory 207 stores the motion vector supplied from the motion information reproducing unit 204 for one image in the minimum predicted block size unit (S216).
  • The encoding mode determination unit 210 determines whether the merge flag supplied from the code string analysis unit 201 is "0" or "1" (S220). If the merge flag is "1" (1 in S220), the combined motion information reproduction unit 212 reproduces the motion information from the merge candidate number supplied from the encoding mode determination unit 210 and the candidate block group supplied from the terminal 33 (S221).
  • If the merge flag is "0" (0 in S220), the motion vector reproduction unit 211 reproduces motion information from the prediction direction of motion compensation prediction, the reference image index, the difference vector, and the prediction vector index supplied from the encoding mode determination unit 210, and from the candidate block group supplied from the terminal 33 (S222).
  • the prediction vector candidate list generation unit 220 generates a prediction vector candidate list by the same operation as the prediction vector candidate list generation unit 130 of the video encoding device 100 (S300).
  • the prediction vector determination unit 221 selects a prediction vector candidate indicated by the prediction vector index supplied from the terminal 35 from the prediction vector candidate list supplied from the prediction vector candidate list generation unit 220, and determines a prediction vector. (S301).
  • the adder 222 adds the difference vector supplied from the terminal 35 and the prediction vector supplied from the prediction vector determination unit 221 to calculate a motion vector (S302).
  • the combined motion information candidate generation unit 230 generates a combined motion information candidate list by the same operation as the combined motion information candidate generation unit 140 of the video encoding device 100 (S310).
  • The combined motion information selection unit 231 selects the combined motion information candidate indicated by the merge candidate number supplied from the terminal 35 from the combined motion information candidate list supplied from the combined motion information candidate generation unit 230, and determines the combined motion information (S311).
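Step S311 reduces to indexing the candidate list with the decoded merge candidate number; because the encoder and the decoder build identical lists, the number alone recovers the full motion information. A hedged sketch with illustrative names:

```python
# Sketch of combined motion information selection (S311): the decoder
# indexes the combined motion information candidate list with the merge
# candidate number decoded from the stream.

def select_merge_candidate(candidate_list, merge_candidate_number):
    if merge_candidate_number >= len(candidate_list):
        raise ValueError("merge candidate number out of range")
    return candidate_list[merge_candidate_number]
```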
  • the first embodiment can be modified as follows.
  • FIG. 18 is given as an example of the candidate number management table. However, it suffices that the maximum number of combined motion information candidates is 1 or more and that a smaller merge candidate number is assigned to a combined motion information candidate with a higher selection rate; the table is not limited to that shown in FIG. 18.
  • the maximum number of combined motion information candidates included in the combined motion information candidate list is 7 (the maximum value of the merge index is 6), but it may be 2 or more.
  • If the selection rate of the bidirectional combined motion information candidates is higher than the selection rate of the combined motion information candidates of block C and block E, the table may be as shown in FIGS. 40 (a) and 40 (b).
  • each bidirectional combined motion information candidate (BD0 to BD3) will be described.
  • the bidirectional combined motion information candidate (BD0) and the bidirectional combined motion information candidate (BD1) are assumed to be the same as those in the first embodiment.
  • The bidirectional combined motion information candidate (BD2) and the bidirectional combined motion information candidate (BD3) also combine a motion vector and reference index in the reference direction with a motion vector and reference index in the reverse direction, but differ from the bidirectional combined motion information candidate (BD0) and the bidirectional combined motion information candidate (BD1) in the method of determining the motion vector and reference index in the base direction.
  • FIG. 42 is a flowchart for explaining the derivation of the bidirectional combined motion information candidate (BD2).
  • FIG. 42 is obtained by replacing step S193 in the flowchart of FIG. 28 with steps S195 to S197.
  • Steps S195 to S197 will be described. Whether the validity of LX is "1" is checked (S195). If the validity of LX is not "1" (NO in S195), the validity of LX is set to "1" (S196), and the next candidate is examined (S194). If the validity of LX is "1" (YES in S195), the motion vector and reference index in the base direction are set as the motion vector and reference index in the LX direction of the combined motion information candidate (S197), and the process ends.
  • FIG. 43 is a flowchart for explaining the derivation of the bidirectional combined motion information candidate (BD3).
  • FIG. 43 is obtained by replacing step S204 in the flowchart of FIG. 29 with steps S206 to S208.
  • step S206 to step S208 will be described.
  • Whether the validity of LY is “1” is checked (S206). If the validity of LY is not “1” (NO in S206), the validity of LY is set to “1” (S207), and the next candidate is examined (S205). If the validity of LY is “1” (YES in S206), the motion vector and reference index in the base direction are set as the motion vector and reference index in the LY direction of the combined motion information candidate (S208), and the process is terminated.
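The BD2/BD3 derivations described above both amount to scanning the candidate list in a fixed order for the n-th candidate that is valid in a given direction (BD2 and BD3 take the second valid candidate where BD0 and BD1 take the first). A hedged sketch, with hypothetical candidate structures:

```python
# Illustrative sketch of the BD2/BD3-style search: scan the combined
# motion information candidate list in order and return the n-th
# candidate that is valid in the given direction ('L0' or 'L1').
# Candidate dictionaries and field names are hypothetical.

def nth_valid_in_direction(candidates, direction, n):
    count = 0
    for cand in candidates:
        if cand.get(direction) is not None:  # valid in this direction
            count += 1
            if count == n:
                return cand
    return None  # fewer than n valid candidates
```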
  • That is, the bidirectional combined motion information candidate (BD2) is a bidirectional combined motion information candidate that combines the motion vector and reference index in the reference direction of the combined motion information candidate that is second valid in the reference direction with the motion vector and reference index in the reverse direction of the combined motion information candidate that is first valid in the reverse direction and is not the same candidate.
  • The bidirectional combined motion information candidate (BD3) is a bidirectional combined motion information candidate that combines the motion vector and reference index in the reference direction of the combined motion information candidate that is first valid in the reference direction with the motion vector and reference index in the reverse direction of the combined motion information candidate that is second valid in the reverse direction and is not the same candidate.
  • the number of combinations of bidirectional combined motion information candidates can be increased, the selection rate of combined motion information candidates can be increased, and the encoding efficiency of motion information can be improved.
  • FIG. 29 is given as an example of the operation of the backward direction motion information determination unit 162, but it is only necessary to generate bidirectional combined motion information candidates, and the present invention is not limited to this.
  • Step S240 may be added as shown in FIG. 44 in order to increase the effectiveness of the bidirectional combined motion information candidate, that is, to prevent it from being deleted by the second combined motion information candidate list reduction unit 153.
  • It is checked that a combined motion information candidate having the same motion information as the bidirectional combined motion information candidate, formed from the motion vector and reference index in the reference direction and the motion vector and reference index in the reverse direction of the combined motion information candidate under inspection, is not already in the second combined motion information candidate list (S240). If it is not in the list, step S205 is performed; if it is, the next candidate is examined (S206). In this case, the second combined motion information candidate list reduction unit 153 in FIG. 16 and step S146 in FIG. 24 can be omitted.
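The S240 check amounts to appending a newly formed bidirectional candidate only when its motion information is not already in the list; a minimal sketch with illustrative structures:

```python
# Sketch of the S240 redundancy check: append a new bidirectional
# combined motion information candidate only if identical motion
# information is not already present in the list.

def add_if_unique(candidate_list, new_candidate):
    if any(existing == new_candidate for existing in candidate_list):
        return False          # duplicate: examine the next candidate
    candidate_list.append(new_candidate)
    return True
```

Performing the check at insertion time is what makes the later list-reduction step (S146) unnecessary.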
  • As a result, the second combined motion information candidate list reduction unit 153 does not reduce the bidirectional combined motion information candidates, so the selection rate of the combined motion information candidates can be increased and the encoding efficiency of motion information can be improved.
  • FIG. 29 is given as an example of the operation of the backward direction motion information determination unit 162, but step S250 may be added as shown in FIG.
  • If the bidirectional combined motion information candidate is not the same as the combined motion information candidate selected in the reference direction (YES in S250), step S205 is performed; if they are the same (NO in S250), the next candidate is inspected (S206). As a result, the bidirectional combined motion information candidate does not become the same as the combined motion information candidate selected in the reference direction, so the effectiveness of the bidirectional combined motion information candidate is increased, the selection rate of combined motion information candidates is increased, and the encoding efficiency of motion information can be improved.
  • FIG. 16 is given as an example of the configuration of the combined motion information candidate generation unit 140.
  • However, as illustrated in FIG., the first combined motion information candidate list reduction unit 151 may be omitted so that deletion is performed only by the second combined motion information candidate list reduction unit 153.
  • A problem in this case is that a redundant combined motion information candidate may be supplied to the bidirectional combined motion information candidate list generation unit 152: if the first two unidirectional combined motion information candidates are the same, the bidirectional combined motion information candidate (BD0) whose reference direction is L0 and the bidirectional combined motion information candidate (BD1) whose reference direction is L1 have the same motion information. Therefore, as shown in FIG. 47 (b), the probability of generating the same bidirectional combined motion information candidate can be reduced by changing the inspection order of FIG. 28 and FIG. 29 depending on whether the reference direction is L0 or L1.
  • In the above, bidirectional combined motion information candidates are generated by searching for combined motion information candidate blocks that are valid in the direction opposite to the reference direction and using their motion information in that opposite direction. Searching in the direction opposite to the reference direction improves the effectiveness of the bidirectional combined motion information candidate, but increases the processing amount.
  • Therefore, the bidirectional combined motion information candidates may instead be defined as combinations of predetermined combined motion information candidate blocks with higher reliability, thereby omitting the search process; the coding efficiency can be improved by improving the selection rate of the bidirectional combined motion information candidates.
  • For example, the bidirectional combined motion information candidate (BD0) whose reference direction is L0 combines the motion information in the L0 direction of candidate block A, which has the highest reliability, with the motion information in the L1 direction of candidate block B, which has the second highest reliability.
  • Likewise, the bidirectional combined motion information candidate (BD1) whose reference direction is L1 combines the motion information in the L1 direction of the most reliable candidate block A with the motion information in the L0 direction of the second most reliable candidate block B. These are examples in which the prediction direction is defined as bidirectional prediction.
  • Alternatively, the bidirectional combined motion information candidate (BD0) whose reference direction is L0 may consist only of the motion information in the L0 direction of candidate block A having the highest reliability, with the prediction direction defined as unidirectional prediction. Note that other combinations are possible as long as candidate blocks with higher reliability are combined.
  • In this example, a small merge candidate number is assigned to the bidirectional combined motion information candidate (BD0) whose reference direction is L0, but the present invention is not limited to this. The encoding efficiency can also be improved by assigning a small merge candidate number to a bidirectional combined motion information candidate for bidirectional prediction, which has high prediction efficiency.
  • For example, when BD0 and BD1 are both bidirectional prediction, a smaller merge candidate number can be preferentially assigned to a bidirectional combined motion information candidate whose motion information in the reference direction is unidirectional. This is because, although bidirectional prediction has higher prediction efficiency than unidirectional prediction, the reliability of motion information is generally high when unidirectional prediction has been selected.
  • Embodiment 1: Example of the effect of bidirectional combined motion information in bidirectional prediction
  • the motion vector in the L0 direction of the block N is mvL0N
  • the motion vector in the L1 direction is mvL1N
  • the reference image index in the L0 direction is refIdxL0N
  • the reference image index in the L1 direction is refIdxL1N
  • the difference vector in the L0 direction is dmvL0N
  • the difference vector in the L1 direction is represented as dmvL1N
  • the difference between the reference image indexes in the L0 direction is represented as drefIdxL0N
  • the difference between the reference image indexes in the L1 direction is represented as drefIdxL1N.
  • the unidirectional combined motion information candidates are A, B, COL, C, and E in FIG.
  • Among these unidirectional combined motion information candidates, there is no motion information identical to the motion information that minimizes the prediction error for the processing target block (Z). Therefore, the unidirectional combined motion information candidate having the minimum rate distortion evaluation value is selected from these unidirectional combined motion information candidates. Then, that candidate's rate distortion evaluation value is compared with the rate distortion evaluation value calculated by the difference vector calculation unit 120, and the merge mode is used as the encoding mode only when the former is smaller than the latter.
  • When the merge mode is selected as the encoding mode, the balance between the encoding efficiency of motion information and the prediction error is optimal, but the prediction error itself is not optimal. On the other hand, when the non-merge mode is selected as the encoding mode, the encoding efficiency of motion information is not optimal.
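The mode decision described here compares rate distortion evaluation values. The sketch below uses the classic J = D + λ·R Lagrangian form as an assumed cost model; the document does not specify the exact formula, so both functions are illustrative.

```python
# Hedged sketch of the encoding-mode decision: merge mode is chosen only
# when its rate-distortion evaluation value is smaller than that of the
# non-merge (difference vector) path. The cost model is an assumption.

def rd_cost(distortion, rate_bits, lam):
    # Classic Lagrangian evaluation J = D + lambda * R.
    return distortion + lam * rate_bits

def choose_mode(merge_rd_cost, non_merge_rd_cost):
    return 'merge' if merge_rd_cost < non_merge_rd_cost else 'non_merge'
```

The merge path typically has a very small rate (an index only) but possibly larger distortion, which is exactly the trade-off the comparison resolves.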
  • the bidirectional combined motion information candidates generated by the first embodiment are BD0 and BD1 in FIG.
  • the bidirectional combined motion information candidate (BD0) whose reference direction is L0 is a bidirectional combined motion information candidate including the motion information of the block A in the L0 direction and the motion information of the block B in the L1 direction.
  • the bidirectional combined motion information candidate (BD1) having the reference direction L1 is a bidirectional combined motion information candidate including the motion information of the block A in the L1 direction and the motion information of the block B in the L0 direction.
  • the bidirectional combined motion information candidate (BD0) whose reference direction is L0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting the bidirectional combined motion information candidate (BD0) whose reference direction is L0, it is possible to minimize the prediction error and optimize the encoding efficiency of the motion information.
  • unidirectional combined motion information candidates are invalid (x), and valid unidirectional combined motion information candidates A and E have motion information as shown in FIG. Also in this case, there is no motion information in the unidirectional combined motion information candidates that minimizes the prediction error for the processing target block (Z).
  • the bidirectional combined motion information candidates generated by the first embodiment are BD0 and BD1 in FIG.
  • the bidirectional combined motion information candidate (BD0) whose reference direction is L0 is a combined motion information candidate consisting of the motion information of block A in the L0 direction, whose prediction direction is unidirectional.
  • the bidirectional combined motion information candidate (BD1) whose reference direction is L1 is a bidirectional combined motion information candidate including the motion information of the block E in the L0 direction and the motion information of the block A in the L1 direction. It can be seen that the bidirectional combined motion information candidate (BD0) whose reference direction is L0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting the bidirectional combined motion information candidate (BD0) whose reference direction is L0, it is possible to minimize the prediction error and optimize the encoding efficiency of the motion information.
  • unidirectional combined motion information candidates A, COL, and C are invalid (x), and valid unidirectional combined motion information candidates B and E have motion information as shown in FIG. Also in this case, there is no motion information in the unidirectional combined motion information candidate that minimizes the prediction error for the processing target block (Z).
  • the bidirectional combined motion information candidates generated by the first embodiment are BD0 and BD1 in FIG.
  • the bidirectional combined motion information candidate (BD0) whose reference direction is L0 is a bidirectional combined motion information candidate including the motion information of the block B in the L0 direction and the motion information of the block E in the L1 direction, and BD1 is not generated.
  • the bidirectional combined motion information candidate (BD0) having the reference direction L0 has the same motion information as the motion information that minimizes the prediction error for the processing target block (Z). That is, by selecting the bidirectional combined motion information candidate (BD0) whose reference direction is L0, it is possible to minimize the prediction error and optimize the encoding efficiency of the motion information.
  • As described above, by generating the bidirectional combined motion information candidates using the motion information in the L0 direction and the L1 direction of the unidirectional combined motion information candidates, even if the motion of the processing target block deviates from the motion of the block located at the same position in another encoded image or from the motion of a block adjacent to the processing target block, the motion information can be encoded using only an index, without encoding the motion information itself. Therefore, a moving image encoding device and a moving image decoding device that can optimize encoding efficiency and prediction efficiency can be realized.
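Forming a bidirectional candidate such as BD0 from two unidirectional candidates can be sketched as below; the dictionary fields are illustrative stand-ins for the motion information structure, not the embodiment's data layout.

```python
# Sketch of combining two unidirectional combined motion information
# candidates into one bidirectional candidate (e.g. BD0: L0 motion of
# block A combined with L1 motion of block B).

def combine_bidirectional(l0_source, l1_source):
    return {
        'mvL0': l0_source['mvL0'], 'refIdxL0': l0_source['refIdxL0'],
        'mvL1': l1_source['mvL1'], 'refIdxL1': l1_source['refIdxL1'],
        'direction': 'BI',
    }
```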
  • In addition, in the bidirectional combined motion information candidate list generation unit 152, it can be avoided that the bidirectional combined motion information candidate (BD0) whose reference direction is L0 and the bidirectional combined motion information candidate (BD1) whose reference direction is L1 have the same motion information, and the effectiveness of the bidirectional combined motion information candidate can be increased to improve the encoding efficiency.
  • Furthermore, by generating the bidirectional combined motion information candidates using the motion information in each direction of the unidirectional combined motion information candidates, the number of combined motion information candidates can be increased without increasing the number of unidirectional combined motion information candidates. Therefore, in a moving picture encoding apparatus and a moving picture decoding apparatus using a general LSI, in which increasing the number of unidirectional combined motion information candidates increases the memory read time, this increase in memory read time can be suppressed.
  • Adaptive switching: As described above, by assigning a small merge candidate number to a bidirectional combined motion information candidate whose prediction direction is bidirectional, the selection rate of bidirectional combined motion information candidates with high prediction efficiency can be increased. Furthermore, by preferentially assigning a small merge candidate number to a bidirectional combined motion information candidate whose motion information in the reference direction is unidirectional, the selection rate of bidirectional combined motion information candidates using highly reliable motion information can be increased, improving the coding efficiency.
  • the higher-order function of the moving picture encoding apparatus has a function of changing the candidate number management table for each encoded stream unit or for each slice that is a part of the encoded stream.
  • the code string generation unit 104 encodes the candidate number management table into an encoded stream as shown in FIGS. 53 (a) and 53 (b) and transmits the encoded stream.
  • FIGS. 53 (a) and 53 (b) show examples of the syntax for encoding the candidate number management table in the SPS (Sequence Parameter Set) for control in units of encoded streams, and in the slice_header for control in units of slices.
  • "modified_merge_index_flag" specifies whether to change the standard relationship between the merge candidate number and the combined motion information candidate, "max_no_of_merge_index_minus1" specifies the number of entries to be redefined, and "merge_mode[i]" specifies the order of the candidate blocks included in the combined motion information candidate list.
  • bd_merge_base_direction that is information for designating the reference direction of the bidirectional combined motion information candidate can be set.
  • 53 (a) and 53 (b) are examples of syntax, and merge candidate numbers to be assigned to bidirectional combined motion information candidates can be specified in the encoded stream, and the reference direction of bidirectional combined motion information candidates can be determined.
  • the present invention is not limited to this.
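As an illustration only, decoding of such a candidate number management table override might look as sketched below. The field names (modified_merge_index_flag, max_no_of_merge_index_minus1, merge_mode[i], bd_merge_base_direction) come from the description above, but the bit widths and the reader interface are hypothetical; the real field widths and entropy coding are defined by the syntax of FIGS. 53(a) and 53(b).

```python
# Illustrative sketch only: read_flag/read_uint are placeholders for the
# actual bitstream reader; field widths here are assumptions.

def read_candidate_table_override(read_flag, read_uint):
    table = None
    if read_flag():                               # modified_merge_index_flag
        n = read_uint(3) + 1                      # max_no_of_merge_index_minus1
        table = [read_uint(3) for _ in range(n)]  # merge_mode[i]
    base_direction = read_flag()                  # bd_merge_base_direction
    return table, base_direction

# Toy input: flag=1, max_no_of_merge_index_minus1=2, merge_mode=[0,1,2],
# bd_merge_base_direction=1.
bits = iter([1, 2, 0, 1, 2, 1])
print(read_candidate_table_override(lambda: next(bits), lambda n: next(bits)))
# -> ([0, 1, 2], 1)
```

When modified_merge_index_flag is 0, no table entries are read and the standard relationship between merge candidate numbers and combined motion information candidates is kept.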
  • The configuration of the moving picture decoding apparatus of Embodiment 2 is the same as that of the moving picture decoding apparatus 200 of Embodiment 1 except for the function of the code string analysis unit 201.
  • The code string analysis unit 201 decodes the candidate number management table according to the syntaxes of FIGS. 53(a) and 53(b).
  • Embodiment 3 (Replacement of combined motion information candidates)
  • the configuration of the moving picture coding apparatus according to the third embodiment is the same as that of the moving picture coding apparatus 100 according to the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • The candidate number management table of Embodiment 3 is shown in FIG. 54; the maximum number of combined motion information candidates included in the combined motion information candidate list is 5.
  • Embodiment 3 differs from Embodiment 1 in that the maximum number of combined motion information candidates included in the combined motion information candidate list is 5 and in that no merge candidate number is initially assigned to the bidirectional combined motion information candidates.
  • The differences between the combined motion information candidate generation unit 140 of Embodiment 3 and that of Embodiment 1 will be described with reference to FIG.
  • The candidate number management table changing unit 154 calculates the effective number of bidirectional combined motion information candidates from the second combined motion information candidate list supplied from the first combined motion information candidate list reduction unit 151. If the effective number of bidirectional combined motion information candidates is 1 or more, it changes the candidate number management table and supplies the second combined motion information candidate list to the bidirectional combined motion information candidate list generation unit 152. If the effective number of bidirectional combined motion information candidates is 0, it supplies the second combined motion information candidate list to terminal 18 as the combined motion information candidate list.
  • The flowchart of FIG. 56 adds the following two steps to the flowchart of FIG.
  • First, the candidate number management table changing unit 154 changes the candidate number management table (S260). Whether the candidate number management table has been changed is then checked (S261). If the candidate number management table has been changed (YES in S261), step S144 is performed; if it has not been changed (NO in S261), step S144 is skipped.
  • First, the candidate number management table changing unit 154 counts the invalid combined motion information candidates not included in the second combined motion information candidate list and calculates the invalid number of combined motion information candidates (S270).
  • Here, the invalid number of combined motion information candidates is calculated as the number of invalid combined motion information candidates not included in the second combined motion information candidate list, but any method capable of calculating the number of invalid combined motion information candidates may be used, and the calculation is not limited to this.
  • For example, the invalid number of combined motion information candidates may be obtained by subtracting the number of valid combined motion information candidates included in the second combined motion information candidate list from 5, which is the sum of 4, the maximum number of spatial combined motion information candidates, and 1, the maximum number of temporal combined motion information candidates.
  • Alternatively, since the invalidity of a combined motion information candidate with a high selection rate suggests that the selection rate of the bidirectional combined motion information candidates will also decrease, only the invalid combined motion information candidates whose merge candidate number is 2 or more may be counted.
  • The candidate number management table changing unit 154 checks whether the invalid number of combined motion information candidates is 1 or more (S271). If the invalid number of combined motion information candidates is 1 or more (YES in S271), the subsequent processing is performed to change the candidate number management table; if the invalid number of combined motion information candidates is 0 (NO in S271), the process ends.
  • The candidate number management table changing unit 154 counts the valid bidirectional combined motion information candidates and calculates the effective number of bidirectional combined motion information candidates (S272). That is, if both BD0 and BD1 are valid, the effective number of bidirectional combined motion information candidates is 2; if only one of BD0 and BD1 is valid, the effective number is 1; and if both BD0 and BD1 are invalid, the effective number is 0.
  • The candidate number management table changing unit 154 sets the smaller of the invalid number of combined motion information candidates and the effective number of bidirectional combined motion information candidates as the additional number of bidirectional combined motion information candidates (S273).
  • The candidate number management table changing unit 154 assigns invalid merge candidate numbers to bidirectional combined motion information candidates, up to the additional number of bidirectional combined motion information candidates (S274).
  • FIG. 58(a) shows an example in which the invalid number of combined motion information candidates is 1 and the effective number of bidirectional combined motion information candidates is 1 or more.
  • BD0 is assigned to the first invalid merge candidate number, 1. If BD1 is valid, BD1 may be assigned instead.
  • FIG. 58(b) shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 2.
  • BD0 is assigned to the first invalid merge candidate number, 2, and BD1 is assigned to the second invalid merge candidate number, 4.
  • FIG. 58(c) shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 1 (only BD1 is valid).
  • BD1 is assigned to the invalid merge candidate number 2.
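The assignment of steps S270 to S274 and the examples of FIG. 58 can be modeled with the short sketch below. The list representation of the candidate number management table and the candidate names are illustrative only, not part of the specification.

```python
# Sketch of S270-S274: fill invalid merge candidate numbers with valid
# bidirectional combined motion information candidates.
def change_candidate_number_table(table, bd_valid):
    """table: list mapping merge candidate number -> candidate, with None
    marking an invalid combined motion information candidate.
    bd_valid: valid bidirectional candidates, e.g. ["BD0", "BD1"]."""
    invalid_numbers = [i for i, c in enumerate(table) if c is None]  # S270
    if not invalid_numbers:                                          # S271
        return table
    effective_bd = len(bd_valid)                                     # S272
    additional = min(len(invalid_numbers), effective_bd)             # S273
    for k in range(additional):                                      # S274
        table[invalid_numbers[k]] = bd_valid[k]
    return table

# FIG. 58(b): invalid merge candidate numbers 2 and 4, BD0 and BD1 valid.
print(change_candidate_number_table(["A", "B", None, "C", None],
                                    ["BD0", "BD1"]))
# -> ['A', 'B', 'BD0', 'C', 'BD1']
```

With only BD1 valid and the same invalid numbers, BD1 takes number 2 and number 4 stays unused, matching FIG. 58(c).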
  • the configuration of the moving picture decoding apparatus according to the third embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • the combined motion information candidate generation unit 140 of the video decoding device of the third embodiment is the same as the combined motion information candidate generation unit 140 of the video encoding device of the third embodiment.
  • the third embodiment can be modified as follows.
  • FIG. 57 was given as an example of the operation of the candidate number management table changing unit 154, in which smaller merge candidate numbers are assigned in the changed candidate number management table to combined motion information candidates with higher selection rates.
  • However, the present invention is not limited to this.
  • For example, step S275 may be added to the operation of the candidate number management table changing unit 154, as shown in FIG. 59. The flowchart of FIG. 59 adds step S275 to the flowchart of FIG. 57. In this step, the candidate number management table changing unit 154 packs the merge candidate numbers of the invalid combined motion information candidates (S275).
  • FIG. 60(a) shows an example in which the invalid number of combined motion information candidates is 1 and the effective number of bidirectional combined motion information candidates is 1 or more.
  • After packing, BD0 is assigned to the invalid merge candidate number 4. If BD1 is valid, BD1 may be assigned instead.
  • FIG. 60(b) shows an example in which the invalid number of combined motion information candidates is 2 and the effective number of bidirectional combined motion information candidates is 2.
  • BD0 is assigned to the first invalid merge candidate number, 3, and BD1 is assigned to the second invalid merge candidate number, 4.
  • In this way, merge candidate numbers larger than those of the unidirectional combined motion information candidates are assigned to the bidirectional combined motion information candidates.
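The packing variant (S275) illustrated in FIG. 60 can be sketched as follows; the list representation of the candidate number management table is illustrative only. Packing shifts the valid candidates to the smallest merge candidate numbers, so the bidirectional candidates end up with the largest numbers.

```python
# Sketch of FIG. 59/FIG. 60: pack valid candidates first (S275), then
# append bidirectional candidates at the now-free largest numbers.
def pack_and_append_bd(table, bd_valid):
    valid = [c for c in table if c is not None]      # packing (S275)
    additional = min(table.count(None), len(bd_valid))
    packed = valid + bd_valid[:additional]
    packed += [None] * (len(table) - len(packed))    # keep the table size
    return packed

# FIG. 60(b): two invalid entries, BD0 and BD1 both valid.
print(pack_and_append_bd(["A", None, "B", None, "C"], ["BD0", "BD1"]))
# -> ['A', 'B', 'C', 'BD0', 'BD1']
```

The bidirectional candidates receive merge candidate numbers 3 and 4, larger than those of all unidirectional candidates, as stated above.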
  • Modification 2 (Dependence on a predetermined block)
  • The operation of the candidate number management table changing unit 154 can be further modified. In this modification, it is assumed that each predetermined bidirectional combined motion information candidate is associated with a predetermined block: BD0 is associated with block C, and BD1 is associated with block D.
  • Another modification of the operation of the candidate number management table changing unit 154 (S280 to S284) will be described with reference to FIG.
  • The following processing is repeated for the number of associated blocks (S280 to S284). Whether the i-th predetermined block is invalid is checked (S281). If the i-th predetermined block is invalid (YES in S281), the subsequent processing is performed to change the candidate number management table; if the i-th predetermined block is not invalid (NO in S281), the next predetermined block is inspected.
  • The candidate number management table changing unit 154 assigns the bidirectional combined motion information candidate (BD0) to the first predetermined invalid merge candidate number, and assigns the bidirectional combined motion information candidate (BD1) to the second predetermined invalid merge candidate number (S282).
  • In this way, when a predetermined combined motion information candidate is invalid, the bidirectional combined motion information candidate list generation unit 152 generates a valid bidirectional combined motion information candidate in its place.
  • Here, the predetermined combined motion information candidates are block C and block E; that is, a bidirectional combined motion information candidate is generated when a combined motion information candidate with a relatively large merge candidate number and a low selection rate is invalid. However, the present invention is not limited to this.
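Under the stated assumption that BD0 and BD1 are associated with predetermined blocks, the block-dependent assignment (S280 to S284) can be sketched as below. The data representation and block names are illustrative only.

```python
# Sketch of Modification 2: when an associated predetermined block is
# invalid, put the associated bidirectional candidate at its merge number.
def assign_bd_for_blocks(table, association, validity):
    """association: (block, bd) pairs in priority order;
    validity: block -> bool; table: merge number -> block name."""
    for block, bd in association:             # loop S280-S284
        if not validity.get(block, False):    # S281: is the block invalid?
            idx = table.index(block)          # its merge candidate number
            table[idx] = bd                   # S282
    return table

# Block C is invalid, so BD0 takes its merge candidate number.
table = ["A", "B", "C", "D", "Col"]
validity = {"A": True, "B": True, "C": False, "D": True, "Col": True}
print(assign_bd_for_blocks(table, [("C", "BD0"), ("D", "BD1")], validity))
# -> ['A', 'B', 'BD0', 'D', 'Col']
```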
  • the operation of the candidate number management table changing unit 154 can be further modified.
  • A modified example of the operation of the candidate number management table changing unit 154 will be described with reference to FIG. If the invalid number of combined motion information candidates is 0 (NO in S271), the candidate number management table changing unit 154 counts the combined motion information candidates included in the second combined motion information candidate list whose prediction direction is unidirectional (the L0 direction or the L1 direction), and calculates the number of unidirectional predictions (S290). Whether the number of unidirectional predictions is 1 or more is then checked (S291).
  • If the number of unidirectional predictions is 1 or more (YES in S291), the subsequent processing is performed to change the candidate number management table; if the number of unidirectional predictions is 0 (NO in S291), the process ends.
  • The candidate number management table changing unit 154 counts the bidirectional combined motion information candidates whose prediction direction is bidirectional and calculates the effective number of bidirectional combined motion information candidates (S292).
  • The candidate number management table changing unit 154 assigns the merge candidate numbers of combined motion information candidates whose prediction direction is unidirectional to bidirectional combined motion information candidates, up to the additional number of bidirectional combined motion information candidates (S294).
  • Specifically, the candidate number management table changing unit 154 assigns the merge candidate number of the last combined motion information candidate whose prediction direction is unidirectional to the bidirectional combined motion information candidate (BD0), and assigns the merge candidate number of the second-to-last combined motion information candidate whose prediction direction is unidirectional to the bidirectional combined motion information candidate (BD1).
  • Here, the number of unidirectional predictions is calculated as the number of combined motion information candidates included in the second combined motion information candidate list whose prediction direction is unidirectional, but the present invention is not limited to this.
  • For example, only the combined motion information candidates whose merge candidate number is 3 or more and whose prediction direction is unidirectional may be counted. Also, although the combined motion information candidates whose prediction direction is unidirectional are counted here only when the invalid number of combined motion information candidates is 0, any method may be used as long as merge candidate numbers can be assigned to bidirectional combined motion information candidates within the upper limit given by the sum of the invalid number of combined motion information candidates and the number of unidirectional predictions.
  • In this way, the bidirectional combined motion information candidate list generation unit 152 replaces combined motion information candidates whose prediction direction is unidirectional with bidirectional combined motion information candidates whose prediction direction is bidirectional.
  • The replacement is performed so that the merge candidate numbers of the bidirectional combined motion information candidates are larger than those of the unidirectional combined motion information candidates, and by using such merge candidate numbers, the encoding efficiency of the merge index can be improved.
  • As described above, the combined motion information candidates of highly reliable blocks with high selection rates remain, while the combined motion information candidates of blocks with low selection rates and the bidirectional combined motion information candidates can be switched adaptively. Therefore, an increase in the code amount of the merge index caused by an increase in the number of merge candidates can be suppressed, and the encoding efficiency can be improved by increasing the selection rate of the combined motion information candidates.
  • Furthermore, by replacing combined motion information candidates whose prediction direction is unidirectional with bidirectional combined motion information candidates whose prediction direction is bidirectional, the selection rate of the bidirectional combined motion information candidates, which have high prediction efficiency, can be increased.
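The replacement of unidirectional candidates described above can be sketched as follows; the representation of prediction directions and candidate names is illustrative only, and the step-number comments are indicative.

```python
# Sketch of Modification 3: give bidirectional candidates the merge
# candidate numbers of the last unidirectional candidates in the table.
def replace_unidirectional(table, directions, bd_valid):
    """directions: candidate -> 'bi' or 'uni'; bd_valid: e.g. ['BD0','BD1']."""
    uni_numbers = [i for i, c in enumerate(table)
                   if directions.get(c) == "uni"]       # count (cf. S290)
    additional = min(len(uni_numbers), len(bd_valid))
    # BD0 takes the last unidirectional number, BD1 the second-to-last (S294)
    for k in range(additional):
        table[uni_numbers[-1 - k]] = bd_valid[k]
    return table

table = ["A", "B", "C", "D", "Col"]
directions = {"A": "bi", "B": "uni", "C": "bi", "D": "uni", "Col": "bi"}
print(replace_unidirectional(table, directions, ["BD0", "BD1"]))
# -> ['A', 'BD1', 'C', 'BD0', 'Col']
```

Note that the bidirectional candidates keep the relatively large merge candidate numbers of the replaced unidirectional candidates, so the candidates with small numbers and high selection rates are untouched.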
  • Embodiment 4 (Priority is given to motion information for unidirectional prediction)
  • the configuration of the moving picture encoding apparatus according to the fourth embodiment is the same as that of the moving picture encoding apparatus 100 according to the first embodiment except for the function of the reference direction motion information determination unit 161.
  • The differences between the reference direction motion information determination unit 161 of Embodiment 4 and that of Embodiment 1 will be described.
  • The operation of the reference direction motion information determination unit 161 of Embodiment 4 will be described with reference to FIG. 63.
  • The flowchart of FIG. 63 adds steps S320 to S323 to the flowchart of FIG. 28 and is characterized by step S321.
  • First, the validity of the reference direction LX is set to "0" (S190).
  • The following processing is then repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S320 to S323).
  • Whether the LX direction of the combined motion information candidate is valid and the candidate is unidirectionally predicted is checked (S321). If the LX direction of the combined motion information candidate is valid and the candidate is unidirectionally predicted (YES in S321), the validity of the reference direction LX is set to "1", the motion vector and reference index of the reference direction are set to the motion vector and reference index of the LX direction of the combined motion information candidate, and the process ends (S322). Otherwise (NO in S321), the next candidate is examined (S323).
  • Next, the following processing is repeated for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list (S191 to S194).
  • The validity of the LX direction of the combined motion information candidate is checked (S192). If the LX direction of the combined motion information candidate is valid (YES in S192), the validity of the reference direction LX is set to "1", the motion vector and reference index of the reference direction are set to the motion vector and reference index of the LX direction of the combined motion information candidate, and the process ends (S193). If the LX direction of the combined motion information candidate is invalid (NO in S192), the next candidate is examined (S194).
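The two-pass determination of FIG. 63 (unidirectional candidates first, then any candidate valid in LX) can be sketched as below; the candidate representation is illustrative only.

```python
# Sketch of Embodiment 4: prefer unidirectional motion information when
# determining the reference direction motion information.
def determine_reference_motion_info(candidates, lx):
    """candidates: dicts with keys 'L0'/'L1' holding (mv, ref_idx) or None.
    lx: 'L0' or 'L1', the reference direction."""
    other = "L1" if lx == "L0" else "L0"
    # First pass (S320-S323): valid in LX and unidirectionally predicted.
    for cand in candidates:
        if cand[lx] is not None and cand[other] is None:   # S321
            return True, cand[lx]                          # S322
    # Second pass (S191-S194): any candidate valid in LX.
    for cand in candidates:
        if cand[lx] is not None:                           # S192
            return True, cand[lx]                          # S193
    return False, None                                     # validity stays 0

cands = [{"L0": ((1, 2), 0), "L1": ((3, 4), 0)},   # bidirectional
         {"L0": ((5, 6), 1), "L1": None}]          # unidirectional in L0
print(determine_reference_motion_info(cands, "L0"))
# -> (True, ((5, 6), 1))  # the unidirectional candidate is preferred
```

Deleting the second pass yields Modification 1 below, in which the reference direction motion information is limited to unidirectional motion information.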
  • As described above, the reference direction motion information determination unit 161 of Embodiment 4 differs from that of Embodiment 1 in that motion information of unidirectional prediction is given priority in determining the reference direction motion information.
  • the configuration of the moving picture decoding apparatus according to the fourth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the reference direction motion information determination unit 161.
  • the combined motion information candidate generation unit 140 of the video decoding device in the fourth embodiment is the same as the combined motion information candidate generation unit 140 of the video encoding device in the fourth embodiment.
  • the fourth embodiment can be modified as follows.
  • FIG. 63 was given as an example of the operation of the reference direction motion information determination unit 161, but any operation that gives priority to unidirectional motion information in determining the reference direction motion information may be used, and the operation is not limited to this.
  • For example, steps S191 to S194 of FIG. 63 may be deleted so that the motion information of the reference direction is limited to unidirectional motion information.
  • FIG. 63 was given as an example of the operation of the reference direction motion information determination unit 161, but the operation is not limited to this as long as unidirectional motion information can be given priority in determining the motion information.
  • For example, in determining the other motion information as well, priority may be given to unidirectional motion information, as in the reference direction motion information determination unit 161 of Embodiment 4, or the selection may be limited to unidirectional motion information, as in the reference direction motion information determination unit 161 of Modification 1 of Embodiment 4.
  • Embodiment 5 (Deletion processing for each direction)
  • the configuration of the moving picture encoding apparatus of the fifth embodiment is the same as that of the moving picture encoding apparatus 100 of the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • The differences between the combined motion information candidate generation unit 140 of Embodiment 5 and that of Embodiment 1 will be described.
  • Differences from Embodiment 1 in the configuration of the combined motion information candidate generation unit 140 of Embodiment 5 will be described with reference to FIG. 64.
  • In FIG. 64, an L0 direction motion information candidate list generation unit 155 and an L1 direction motion information candidate list generation unit 156 are installed in place of the first combined motion information candidate list reduction unit 151 of FIG.
  • When a plurality of combined motion information candidates included in the first combined motion information candidate list have overlapping motion information in the L0 direction, the L0 direction motion information candidate list generation unit 155 deletes all but one of them, generates an L0 direction motion information candidate list, and supplies the L0 direction motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
  • Likewise, when a plurality of combined motion information candidates included in the first combined motion information candidate list have overlapping motion information in the L1 direction, the L1 direction motion information candidate list generation unit 156 deletes all but one of them, generates an L1 direction motion information candidate list, and supplies the L1 direction motion information candidate list to the bidirectional combined motion information candidate list generation unit 152.
  • The bidirectional combined motion information candidate list generation unit 152 generates a bidirectional combined motion information candidate list from the L0 direction motion information candidate list supplied from the L0 direction motion information candidate list generation unit 155 and the L1 direction motion information candidate list supplied from the L1 direction motion information candidate list generation unit 156.
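The per-direction duplicate deletion performed by units 155 and 156 can be sketched as below; the candidate representation is illustrative only. Note that deduplication is done independently per direction: two candidates that differ in L1 may still be duplicates in L0.

```python
# Sketch of Embodiment 5: build a per-direction candidate list, keeping
# only one candidate for each distinct motion information in that direction.
def direction_candidate_list(candidates, lx):
    """candidates: dicts with keys 'L0'/'L1' holding (mv, ref_idx) or None."""
    seen, out = set(), []
    for cand in candidates:
        info = cand.get(lx)
        if info is None or info in seen:
            continue              # invalid in LX, or a duplicate: delete
        seen.add(info)
        out.append(info)
    return out

cands = [{"L0": ((1, 2), 0), "L1": ((3, 4), 0)},
         {"L0": ((1, 2), 0), "L1": ((7, 8), 1)},   # duplicate in L0 only
         {"L0": None,        "L1": ((3, 4), 0)}]   # duplicate in L1 only
print(direction_candidate_list(cands, "L0"))  # -> [((1, 2), 0)]
print(direction_candidate_list(cands, "L1"))  # -> [((3, 4), 0), ((7, 8), 1)]
```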
  • the configuration of the moving picture decoding apparatus according to the fifth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the combined motion information candidate generation unit 140.
  • the combined motion information candidate generation unit 140 of the video decoding device in the fifth embodiment is the same as the combined motion information candidate generation unit 140 of the video encoding device in the fifth embodiment.
  • Embodiment 6 (Selective use of bidirectional combined motion information candidates)
  • the configuration of the moving picture coding apparatus according to the sixth embodiment is the same as that of the moving picture coding apparatus 100 according to the first embodiment except for the function of the reference direction determination unit 160.
  • The candidate number management table of Embodiment 6 is shown in FIG. 65; the maximum number of combined motion information candidates included in the combined motion information candidate list is 6.
  • Embodiment 6 differs from Embodiment 1 in that the maximum number of combined motion information candidates included in the combined motion information candidate list is 6 and in that only one merge candidate number is assigned to a bidirectional combined motion information candidate.
  • The differences between the reference direction determination unit 160 of Embodiment 6 and that of Embodiment 1 will be described.
  • The operation of the reference direction determination unit 160 of Embodiment 6 will be described with reference to FIG.
  • The reference direction determination unit 160 repeats the following processing (S300 to S305) for the number of combined motion information candidates (NCands) included in the second combined motion information candidate list.
  • The validity of the L0 direction of the combined motion information candidate is checked (S301). If the L0 direction of the combined motion information candidate is valid (YES in S301), the reference direction is set to L0 and the process ends (S302). If the L0 direction of the combined motion information candidate is invalid (NO in S301), the validity of the L1 direction of the combined motion information candidate is checked (S303). If the L1 direction of the combined motion information candidate is valid (YES in S303), the reference direction is set to L1 and the process ends (S304). If the L1 direction of the combined motion information candidate is invalid (NO in S303), the next candidate is examined (S305). If the reference direction cannot be set, no bidirectional combined motion information candidate is generated (S306).
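The reference direction determination of steps S300 to S306 can be sketched as below; the candidate representation is illustrative only.

```python
# Sketch of Embodiment 6: scan candidates in order, preferring L0; if no
# direction is valid in any candidate, no bidirectional candidate is made.
def determine_reference_direction(candidates):
    """candidates: dicts with keys 'L0'/'L1' holding (mv, ref_idx) or None.
    Returns 'L0' or 'L1', or None when the direction cannot be set (S306)."""
    for cand in candidates:                   # loop S300-S305
        if cand.get("L0") is not None:        # S301
            return "L0"                       # S302
        if cand.get("L1") is not None:        # S303
            return "L1"                       # S304
    return None                               # S306

print(determine_reference_direction([{"L0": None, "L1": ((3, 4), 0)}]))
# -> L1
```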
  • the configuration of the moving picture decoding apparatus according to the sixth embodiment is the same as that of the moving picture decoding apparatus 200 according to the first embodiment except for the function of the reference direction determination unit 160.
  • the reference direction determination unit 160 of the moving picture decoding apparatus according to the sixth embodiment is the same as the reference direction determination unit 160 of the moving picture encoding apparatus according to the sixth embodiment.
  • Data format
  • The encoded moving picture stream output from the moving picture encoding apparatus of Embodiments 1 to 6 described above has a specific data format so that it can be decoded according to the encoding method used in Embodiments 1 to 6, and a moving picture decoding apparatus corresponding to the moving picture encoding apparatus can decode an encoded stream of this specific data format.
  • A merge index indicating a bidirectional combined motion information candidate and the candidate number management table are encoded in the encoded stream. Alternatively, only the merge index indicating the bidirectional combined motion information candidate may be encoded in the encoded stream; if the candidate number management table is shared by the moving picture encoding apparatus and the moving picture decoding apparatus, the candidate number management table need not be encoded in the encoded stream.
  • When a wired or wireless network is used to exchange an encoded stream between the moving picture encoding apparatus and the moving picture decoding apparatus, the encoded stream may be converted into a data format suitable for the transmission form of the communication path and transmitted.
  • In this case, a moving picture transmitting apparatus that converts the encoded stream output from the moving picture encoding apparatus into encoded data in a data format suitable for the transmission form of the communication path and transmits it to the network, and a moving picture receiving apparatus that receives the encoded data from the network, restores the encoded stream, and supplies it to the moving picture decoding apparatus, are provided.
  • The moving picture transmitting apparatus includes a memory that buffers the encoded stream output from the moving picture encoding apparatus, a packet processing unit that packetizes the encoded stream, and a transmitting unit that transmits the packetized encoded data via the network.
  • The moving picture receiving apparatus includes a receiving unit that receives the packetized encoded data via the network, a memory that buffers the received encoded data, and a packet processing unit that performs packet processing on the received encoded data to generate an encoded stream and provides it to the moving picture decoding apparatus.
  • The encoding and decoding processes described above can of course be realized as transmission, storage, and reception devices using hardware, and can also be realized by firmware stored in ROM (Read Only Memory), flash memory, or the like, or by software running on a computer or the like.
  • The firmware program or software program can be recorded on a computer-readable recording medium and provided, provided from a server through a wired or wireless network, or provided as a data broadcast of terrestrial or satellite digital broadcasting.
  • The temporally combined motion information candidate calculates bidirectional motion information based on the reference image ColRefPic and the motion vector mvCol of a valid prediction direction in the motion information of the candidate block.
  • If the prediction direction of the candidate block is unidirectional (the L0 direction or the L1 direction), the reference image and motion vector of that prediction direction are selected.
  • If the prediction direction of the candidate block is bidirectional, the reference image and motion vector of either the L0 direction or the L1 direction are selected.
  • After the reference image and motion vector serving as the reference for generating the bidirectional motion information are selected, the motion vectors of the temporally combined motion information candidate are calculated.
  • Let ColDist be the inter-image distance between ColPic and ColRefPic, let CurL0Dist be the inter-image distance between the reference image ColL0Pic in the L0 direction of the temporally combined motion information candidate and the processing target image CurPic, and let CurL1Dist be the inter-image distance between the reference image ColL1Pic in the L1 direction of the temporally combined motion information candidate and CurPic.
  • The motion vectors of Equation 1 below, obtained by scaling ColMv by the distance ratios of ColDist, CurL0Dist, and CurL1Dist, are used as the motion vectors of the temporally combined motion information candidate:

  mvL0 = ColMv × CurL0Dist / ColDist
  mvL1 = ColMv × CurL1Dist / ColDist   (Equation 1)

  • Note that the inter-image distances are calculated using POC and have positive or negative signs.
  • ColPic, ColRefPic, ColL0Pic, and ColL1Pic in FIG. 67 are examples, and other relationships may be used.
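The scaling of Equation 1 with signed POC-based distances can be sketched as below; the sign conventions and the picture order counts used in the example are illustrative assumptions, not taken from FIG. 67.

```python
# Sketch of temporal motion vector scaling: ColMv is scaled by signed
# POC distance ratios to obtain the L0 and L1 motion vectors.
def scale_temporal_mv(col_mv, poc_cur, poc_col, poc_col_ref,
                      poc_l0_ref, poc_l1_ref):
    col_dist = poc_col - poc_col_ref      # ColDist (signed)
    cur_l0_dist = poc_cur - poc_l0_ref    # CurL0Dist (signed)
    cur_l1_dist = poc_cur - poc_l1_ref    # CurL1Dist (signed)
    mv_l0 = tuple(v * cur_l0_dist / col_dist for v in col_mv)
    mv_l1 = tuple(v * cur_l1_dist / col_dist for v in col_mv)
    return mv_l0, mv_l1

# Assumed example: ColPic at POC 8 refers to ColRefPic at POC 4
# (ColDist = 4); CurPic at POC 6 uses L0 reference POC 4, L1 reference POC 8.
print(scale_temporal_mv((8, -4), poc_cur=6, poc_col=8, poc_col_ref=4,
                        poc_l0_ref=4, poc_l1_ref=8))
# -> ((4.0, -2.0), (-4.0, 2.0))
```

The opposite signs of the two results reflect that the L0 and L1 references lie on opposite temporal sides of CurPic in this example.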
  • the present invention can be used for a moving picture encoding and decoding technique using motion compensated prediction.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

According to the invention, a candidate list generator selects, from a plurality of coded blocks adjacent to a block to be coded, a plurality of blocks each having one or two pieces of motion information that include at least motion vector information and reference image information, and generates, from the motion information of the selected blocks, a candidate list including motion information candidates used for motion compensated prediction. A first motion information acquisition unit acquires motion information of a first prediction list from a first candidate included in the candidates. A second motion information acquisition unit acquires motion information of a second prediction list from a second candidate included in the candidates. A selected candidate generator combines the motion information of the first prediction list acquired from the first motion information acquisition unit and the motion information of the second prediction list acquired from the second motion information acquisition unit, and generates new motion information candidates.
PCT/JP2012/004148 2011-06-30 2012-06-27 Dispositif de codage d'image, procédé de codage d'image, programme de codage d'image, dispositif de décodage d'image, procédé de décodage d'image et programme de décodage d'image WO2013001803A1 (fr)

Priority Applications (15)

Application Number Priority Date Filing Date Title
KR1020157019113A KR20150088909A (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
KR1020197020932A KR102103682B1 (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
CN201280032664.5A CN103636218B (zh) 2011-06-30 2012-06-27 图像解码装置和图像解码方法
KR1020147002407A KR20140043242A (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
KR1020217021884A KR102365353B1 (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
KR1020227004619A KR102464103B1 (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
KR1020187028735A KR102004113B1 (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
KR1020217000106A KR102279115B1 (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
KR1020207010416A KR102200578B1 (ko) 2011-06-30 2012-06-27 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
US14/109,629 US9516314B2 (en) 2011-06-30 2013-12-17 Picture decoding device with a motion compensation prediction
US15/339,242 US9686564B2 (en) 2011-06-30 2016-10-31 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/422,694 US9854266B2 (en) 2011-06-30 2017-02-02 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/422,656 US9681149B1 (en) 2011-06-30 2017-02-02 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/422,679 US9693075B2 (en) 2011-06-30 2017-02-02 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program
US15/843,054 US10009624B2 (en) 2011-06-30 2017-12-15 Picture encoding device, picture encoding method, picture encoding program, picture decoding device, picture decoding method, and picture decoding program

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2011-146065 2011-06-30
JP2011-146066 2011-06-30
JP2011146066 2011-06-30
JP2011146065 2011-06-30
JP2012-143341 2012-06-26
JP2012143341A JP5678924B2 (ja) 2011-06-30 2012-06-26 画像復号装置、画像復号方法、及び画像復号プログラム、並びに、受信装置、受信方法、及び受信プログラム
JP2012143340A JP5807621B2 (ja) 2011-06-30 2012-06-26 画像符号化装置、画像符号化方法、画像符号化プログラム、送信装置、送信方法および送信プログラム
JP2012-143340 2012-06-26

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/109,629 Continuation US9516314B2 (en) 2011-06-30 2013-12-17 Picture decoding device with a motion compensation prediction

Publications (1)

Publication Number Publication Date
WO2013001803A1 true WO2013001803A1 (fr) 2013-01-03

Family

ID=47423722

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/004148 WO2013001803A1 (fr) 2011-06-30 2012-06-27 Dispositif de codage d'image, procédé de codage d'image, programme de codage d'image, dispositif de décodage d'image, procédé de décodage d'image et programme de décodage d'image

Country Status (2)

Country Link
KR (2) KR102464103B1 (fr)
WO (1) WO2013001803A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3637996B2 (ja) 1997-03-28 2005-04-13 シャープ株式会社 領域統合が可能な動き補償フレーム間予測方式を用いた動画像符号化・復号化装置
KR100931750B1 (ko) * 2002-04-19 2009-12-14 파나소닉 주식회사 움직임 벡터 계산방법

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HIDEKI TAKEHARA ET AL.: "Bi-derivative merge candidate", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-F372, 6TH MEETING, July 2011 (2011-07-01), TORINO, IT, pages 1 - 5 *
J. JUNG ET AL.: "Temporal MV predictor modification for MV-Comp", SKIP, DIRECT AND MERGE SCHEMES, JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-D164, 4TH MEETING, January 2011 (2011-01-01), DAEGU, KR, pages 1 - 5 *
JIAN-LIANG LIN ET AL.: "Improved Advanced Motion Vector Prediction", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-D125_R2, 4TH MEETING, January 2011 (2011-01-01), DAEGU, KR, pages 1 - 8 *
YUNFEI ZHENG ET AL.: "Extended Motion Vector Prediction for Bi predictive Mode", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, JCTVC-E343, 5TH MEETING, March 2011 (2011-03-01), GENEVA, pages 1 - 4 *

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014514814A (ja) * 2011-03-21 2014-06-19 クゥアルコム・インコーポレイテッド ビデオコーディングにおける単予測ネイバーに基づく双予測マージモード
US9648334B2 (en) 2011-03-21 2017-05-09 Qualcomm Incorporated Bi-predictive merge mode based on uni-predictive neighbors in video coding
US9872036B2 (en) 2011-04-12 2018-01-16 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
JP2016015787A (ja) * 2011-04-12 2016-01-28 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America 動画像復号化方法
US9445120B2 (en) 2011-04-12 2016-09-13 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11917186B2 (en) 2011-04-12 2024-02-27 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11356694B2 (en) 2011-04-12 2022-06-07 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11012705B2 (en) 2011-04-12 2021-05-18 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10609406B2 (en) 2011-04-12 2020-03-31 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10536712B2 (en) 2011-04-12 2020-01-14 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10382774B2 (en) 2011-04-12 2019-08-13 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10178404B2 (en) 2011-04-12 2019-01-08 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11228784B2 (en) 2011-05-24 2022-01-18 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US10129564B2 (en) 2011-05-24 2018-11-13 Velos Media, LCC Decoding method and apparatuses with candidate motion vectors
US10484708B2 (en) 2011-05-24 2019-11-19 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US9456217B2 (en) 2011-05-24 2016-09-27 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US9826249B2 (en) 2011-05-24 2017-11-21 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US9615107B2 (en) 2011-05-27 2017-04-04 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11076170B2 (en) 2011-05-27 2021-07-27 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US10034001B2 (en) 2011-05-27 2018-07-24 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10200714B2 (en) 2011-05-27 2019-02-05 Sun Patent Trust Decoding method and apparatus with candidate motion vectors
US10212450B2 (en) 2011-05-27 2019-02-19 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US9838695B2 (en) 2011-05-27 2017-12-05 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11575930B2 (en) 2011-05-27 2023-02-07 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US11895324B2 (en) 2011-05-27 2024-02-06 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US11115664B2 (en) 2011-05-27 2021-09-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9883199B2 (en) 2011-05-27 2018-01-30 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US9723322B2 (en) 2011-05-27 2017-08-01 Sun Patent Trust Decoding method and apparatus with candidate motion vectors
US10595023B2 (en) 2011-05-27 2020-03-17 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11570444B2 (en) 2011-05-27 2023-01-31 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11979582B2 (en) 2011-05-27 2024-05-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10721474B2 (en) 2011-05-27 2020-07-21 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10708598B2 (en) 2011-05-27 2020-07-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10645413B2 (en) 2011-05-31 2020-05-05 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US10652573B2 (en) 2011-05-31 2020-05-12 Sun Patent Trust Video encoding method, video encoding device, video decoding method, video decoding device, and video encoding/decoding device
US11917192B2 (en) 2011-05-31 2024-02-27 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US10951911B2 (en) 2011-05-31 2021-03-16 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US9609356B2 (en) 2011-05-31 2017-03-28 Sun Patent Trust Moving picture coding method and apparatus with candidate motion vectors
US11057639B2 (en) 2011-05-31 2021-07-06 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US11949903B2 (en) 2011-05-31 2024-04-02 Sun Patent Trust Image decoding method and image decoding apparatus using candidate motion vectors
US9900613B2 (en) 2011-05-31 2018-02-20 Sun Patent Trust Image coding and decoding system using candidate motion vectors
US9819961B2 (en) 2011-05-31 2017-11-14 Sun Patent Trust Decoding method and apparatuses with candidate motion vectors
US9560373B2 (en) 2011-05-31 2017-01-31 Sun Patent Trust Image coding method and apparatus with candidate motion vectors
US10412404B2 (en) 2011-05-31 2019-09-10 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US11368710B2 (en) 2011-05-31 2022-06-21 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US11509928B2 (en) 2011-05-31 2022-11-22 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US10887585B2 (en) 2011-06-30 2021-01-05 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
US11553202B2 (en) 2011-08-03 2023-01-10 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US9456214B2 (en) 2011-08-03 2016-09-27 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus, and moving picture coding and decoding apparatus
US10440387B2 (en) 2011-08-03 2019-10-08 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US10284872B2 (en) 2011-08-03 2019-05-07 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US10129561B2 (en) 2011-08-03 2018-11-13 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US11979598B2 (en) 2011-08-03 2024-05-07 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US11647208B2 (en) 2011-10-19 2023-05-09 Sun Patent Trust Picture coding method, picture coding apparatus, picture decoding method, and picture decoding apparatus
US11218708B2 (en) 2011-10-19 2022-01-04 Sun Patent Trust Picture decoding method for decoding using a merging candidate selected from a first merging candidate derived using a first derivation process and a second merging candidate derived using a second derivation process
US11838514B2 (en) 2018-08-06 2023-12-05 Electronics And Telecommunications Research Institute Image encoding/decoding method and device, and recording medium storing bitstream
CN112514395B (zh) * 2018-12-13 2023-07-21 Jvc建伍株式会社 图像解码装置和方法、以及图像编码装置和方法
CN112514395A (zh) * 2018-12-13 2021-03-16 Jvc建伍株式会社 图像解码装置、图像解码方法以及图像解码程序

Also Published As

Publication number Publication date
KR20210091356A (ko) 2021-07-21
KR102464103B1 (ko) 2022-11-04
KR102365353B1 (ko) 2022-02-23
KR20220025216A (ko) 2022-03-03

Similar Documents

Publication Publication Date Title
KR102200578B1 (ko) 화상 부호화 장치, 화상 부호화 방법, 화상 부호화 프로그램, 화상 복호 장치, 화상 복호 방법 및 화상 복호 프로그램
WO2013001803A1 (fr) Dispositif de codage d'image, procédé de codage d'image, programme de codage d'image, dispositif de décodage d'image, procédé de décodage d'image et programme de décodage d'image
JP6135750B2 (ja) 画像符号化装置、画像符号化方法、及び画像符号化プログラム、並びに、送信装置、送信方法、及び送信プログラム
JP5720751B2 (ja) 画像復号装置、画像復号方法、及び画像復号プログラム、並びに、受信装置、受信方法、及び受信プログラム
JP2013021613A (ja) 画像復号装置、画像復号方法及び画像復号プログラム
JP2013021612A (ja) 画像符号化装置、画像符号化方法及び画像符号化プログラム
JP2013021572A (ja) 画像符号化装置、画像符号化方法および画像符号化プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12804320

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20147002407

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 12804320

Country of ref document: EP

Kind code of ref document: A1