WO2013073138A1 - Motion vector coding apparatus, method and program - Google Patents


Info

Publication number
WO2013073138A1
Authority
WO
WIPO (PCT)
Prior art keywords: motion vector, plane, coding, unit, decoding
Application number
PCT/JP2012/007171
Other languages
French (fr)
Inventor
Mitsuru Maeda
Original Assignee
Canon Kabushiki Kaisha
Application filed by Canon Kabushiki Kaisha
Publication of WO2013073138A1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10: using adaptive coding
    • H04N19/169: characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186: the unit being a colour or a chrominance component
    • H04N19/17: the unit being an image region, e.g. an object
    • H04N19/176: the region being a block, e.g. a macroblock
    • H04N19/50: using predictive coding
    • H04N19/503: involving temporal prediction
    • H04N19/51: Motion estimation or motion compensation
    • H04N19/513: Processing of motion vectors
    • H04N19/517: Processing of motion vectors by encoding
    • H04N19/52: by predictive encoding
    • H04N19/189: characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
    • H04N19/196: specially adapted for the computation of encoding parameters, e.g. by averaging previously computed encoding parameters
    • H04N19/198: including smoothing of a sequence of encoding parameters, e.g. by averaging, by choice of the maximum, minimum or median value

Abstract

A motion vector coding apparatus includes a motion vector generation unit configured to generate a motion vector of a coding target block in a coding target color plane, an inter-plane motion vector extraction unit configured to extract a motion vector of an inter-plane reference block which is a block in a color plane different from the coding target color plane and at the same position as that of the coding target block; a predicted motion vector calculation unit configured to calculate a predicted motion vector from the motion vector of the inter-plane reference block, a motion vector error calculation unit configured to calculate a motion vector error from the motion vector of the coding target block and the predicted motion vector, and a motion vector difference coding unit configured to code the motion vector error.

Description

[Title established by the ISA under Rule 37.2] MOTION VECTOR CODING APPARATUS, METHOD AND PROGRAM
The present invention relates to coding and decoding of a motion vector when motion compensation is performed in coding and decoding of an image. In particular, the present invention relates to image data including a plurality of color signals.
H.264/MPEG-4 Advanced Video Coding (AVC) (hereinafter abbreviated as H.264) is known as a coding method for compressed recording of moving images (International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) 14496-10:2010 Information technology -- Coding of audio-visual objects -- Part 10: Advanced Video Coding). Like earlier coding methods, H.264 performs motion compensation by referring to other pictures to improve coding efficiency. Motion vectors can be coded in units of macroblocks (16 x 16 pixels) and smaller blocks (8 x 8 pixels and 16 x 8 pixels). To code a motion vector, a predicted motion vector is calculated as the median of the motion vectors of the neighboring (left, above, and upper-right) blocks, and the error between the predicted motion vector and the coding target motion vector is coded.
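The median-based prediction described above can be sketched as follows. This is a minimal illustration with hypothetical helper names, not the normative H.264 procedure: the predicted motion vector is the component-wise median of the three neighboring motion vectors, and only the difference from the actual motion vector would be entropy-coded.

```python
def median_mv(mv_left, mv_above, mv_above_right):
    """Component-wise median of three (x, y) motion vectors."""
    xs = sorted([mv_left[0], mv_above[0], mv_above_right[0]])
    ys = sorted([mv_left[1], mv_above[1], mv_above_right[1]])
    return (xs[1], ys[1])

def mv_error(mv, predicted):
    """Error coded in place of the raw motion vector."""
    return (mv[0] - predicted[0], mv[1] - predicted[1])

# Example: neighbors (2,1), (4,3), (3,-1) predict (3,1); a target motion
# vector of (3,2) then needs only the small error (0,1) to be coded.
pred = median_mv((2, 1), (4, 3), (3, -1))
```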
H.264 also supports hierarchical coding. International standardization activities for a coding method of even higher efficiency, as a successor to H.264, have recently started. For this purpose, the Joint Collaborative Team on Video Coding (JCT-VC) has been established between the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) and the International Telecommunication Union Telecommunication Standardization Sector (ITU-T). JCT-VC promotes standardization of the High Efficiency Video Coding scheme (hereinafter abbreviated as HEVC). In HEVC, Advanced Motion Vector Prediction (hereinafter abbreviated as AMVP) has been discussed as a new motion vector coding method (JCT-VC contribution JCTVC-A124_r2.doc, available on the Internet at <http://wftp3.itu.int/av-arch/jctvc-site/2010_04_A_Dresden/>).
In AMVP, a median of the motion vectors of neighboring blocks is used as in H.264, but the motion vectors of the neighboring blocks themselves are also used as reference motion vectors. Besides the motion vectors of the neighboring blocks, the motion vector of the block at the same position in the previous picture in coding order (hereinafter referred to as a temporal direction predicted motion vector) is also included among the predicted motion vectors. Predicted motion vectors having identical components are merged to reduce the number of candidate motion vectors. In addition, to improve error resilience and throughput performance, an alternative motion vector is inserted when the number of predicted motion vectors is smaller than a predetermined number. The motion vector closest to the motion vector to be coded is selected from these candidates, and a code identifying the selected motion vector (hereinafter referred to as a predicted vector index code) and the prediction error resulting from the prediction are coded. Coding efficiency is thereby improved.
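The candidate handling described above (merge duplicates, pad the list, pick the closest candidate) can be sketched as follows. The names and the L1 distance metric are illustrative assumptions, not taken from the HEVC draft:

```python
def build_candidates(predictors, min_count=2, alternate=(0, 0)):
    """Merge identical predictors and pad up to min_count for resilience."""
    unique = []
    for mv in predictors:
        if mv not in unique:          # merge candidates with identical components
            unique.append(mv)
    while len(unique) < min_count:    # insert an alternative vector if too few
        unique.append(alternate)
    return unique

def select_predictor(candidates, target):
    """Pick the candidate closest to the target; return its index and error."""
    def dist(mv):                     # L1 distance to the target motion vector
        return abs(mv[0] - target[0]) + abs(mv[1] - target[1])
    index = min(range(len(candidates)), key=lambda i: dist(candidates[i]))
    error = (target[0] - candidates[index][0], target[1] - candidates[index][1])
    return index, error               # both the index and the error are coded
```

For example, with predictors [(1,1), (1,1), (4,0)] the duplicate (1,1) is merged, and for a target of (3,0) the candidate (4,0) is chosen with error (-1,0).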
Further, H.264 has standardized techniques for handling the 4:4:4 format, in which no sub-sampling is performed on the red (R), green (G), and blue (B) colors or on the Y, Cb, and Cr colors; examples include the High 4:4:4 Predictive profile and the High 4:4:4 Intra profile. JCT-VC has discussed coding that targets video signals shown on displays, in addition to coding of conventional 4:4:4 natural video (JCT-VC contribution JCTVC-F013_v3.docx, available on the Internet at <http://phenix.int-evry.fr/jct/>).
In particular, coding targets include screen signals such as computer-generated presentations and computer graphics, and RGB colors are often used for such video signals. In the above-described profiles, a method for converting RGB colors not only into conventional YCbCr colors but also into YCgCo colors by simple color conversion, and a method for coding the colors independently, have been added.
When High Efficiency Video Coding (HEVC) implements the 4:4:4 format, in which no sub-sampling is performed, RGB colors may be converted into YCbCr colors, as in H.264; in that case, however, the color conversion is indispensable. When the R, G, and B colors are coded independently, the strong correlation among the R, G, and B signals is not exploited, so coding efficiency does not increase. As a method for solving this issue, Japanese Patent Application Laid-Open No. 2010-110006 discusses a coding and decoding technique that uses the correlation among color signals.
The technique discussed in Japanese Patent Application Laid-Open No. 2010-110006 individually determines a prediction mode for each color component. It further selects either prediction mode information from the vicinity of the image area to be predicted in the same color component, or prediction mode information from the same position in the screen in a different color component, and determines a prediction value of the prediction mode to code the prediction mode information.
In Japanese Patent Application Laid-Open No. 2010-110006, however, a motion vector is coded independently for each set of color components (hereinafter referred to as a color plane). The correlation among color signals therefore cannot be taken into account, and coding efficiency is not improved.
Japanese Patent Application Laid-Open No. 2010-110006
JCT-VC contribution JCTVC-A124_r2.doc, <http://wftp3.itu.int/av-arch/jctvc-site/2010_04_A_Dresden/>
JCT-VC contribution JCTVC-F013_v3.docx, <http://phenix.int-evry.fr/jct/>
The present invention is directed to a technique for improving coding efficiency in coding of a motion vector among color planes.
According to an aspect of the present invention, a motion vector coding apparatus includes a motion vector generation unit configured to generate a motion vector of a coding target block in a coding target color plane, an inter-plane motion vector extraction unit configured to extract a motion vector of an inter-plane reference block which is a block in a color plane different from the coding target color plane and at the same position as that of the coding target block, a predicted motion vector calculation unit configured to calculate a predicted motion vector from the motion vector of the inter-plane reference block, a motion vector error calculation unit configured to calculate a motion vector error from the motion vector of the coding target block and the predicted motion vector, and a motion vector difference coding unit configured to code the motion vector error.
According to the present invention, a reference to a motion vector of a neighboring block in the color plane to be coded, a reference to a motion vector of a block in a previously coded picture, and a reference to a motion vector of another plane are handled equally. Thus, motion_prediction_flag and identification information can be integrated, so that coding efficiency can be improved.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram illustrating a configuration of an image coding apparatus according to a first exemplary embodiment.
Fig. 2 is a detailed block diagram of a plane coding unit in the image coding apparatus according to the first exemplary embodiment.
Fig. 3 is a flowchart illustrating processing for coding a motion vector in a non-reference plane, which is coded without referring to another plane, in the image coding apparatus according to the first exemplary embodiment.
Fig. 4 is a flowchart illustrating processing for coding a motion vector in a reference plane, which is coded by referring to another plane, in the image coding apparatus according to the first exemplary embodiment.
Figs. 5A to 5D illustrate examples of states of coding indexes of a reference motion vector group.
Fig. 6 is a block diagram illustrating a configuration of an image coding apparatus according to a second exemplary embodiment.
Fig. 7 is a detailed block diagram of a plane coding unit in the image coding apparatus according to the second exemplary embodiment.
Fig. 8 is a flowchart illustrating processing for coding a motion vector in a reference plane in the image coding apparatus according to the second exemplary embodiment.
Fig. 9 is a block diagram illustrating a configuration of an image decoding apparatus according to a third exemplary embodiment.
Fig. 10 is a detailed block diagram of a plane decoding unit in the image decoding apparatus according to the third exemplary embodiment.
Fig. 11 is a flowchart illustrating processing for decoding a motion vector in a referenced plane in the image decoding apparatus according to the third exemplary embodiment.
Fig. 12 is a flowchart illustrating processing for decoding a motion vector in a reference plane in the image decoding apparatus according to the third exemplary embodiment.
Fig. 13 is a block diagram illustrating a configuration of an image decoding apparatus according to a fourth exemplary embodiment.
Fig. 14 is a detailed block diagram of a plane decoding unit in the image decoding apparatus according to the fourth exemplary embodiment.
Fig. 15 is a flowchart illustrating processing for decoding a motion vector in a reference plane in the image decoding apparatus according to the fourth exemplary embodiment.
Fig. 16 illustrates an example of blocks and motion vectors according to the first and third exemplary embodiments.
Fig. 17 illustrates an example of blocks and motion vectors according to the second and fourth exemplary embodiments.
Fig. 18 is a block diagram illustrating an example of a hardware configuration of a computer applicable to an image coding apparatus and an image decoding apparatus according to the present invention.
Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
A first exemplary embodiment of the present invention will be described below with reference to the drawings. Fig. 1 is a block diagram illustrating an image coding apparatus according to the present exemplary embodiment. In Fig. 1, image data is input to a terminal 101. To facilitate the understanding of the description, it is assumed in the present exemplary embodiment that image data including red, green, and blue (R, G, and B) color planes in the 4:4:4 format is coded. A frame memory 102 temporarily stores the image data to be coded. Plane coding units 103, 104, and 105 respectively code the color planes.
The plane coding unit 103 codes a first color plane of the input image data to generate first color plane coded data. While the first color plane is described as the G color plane, the present exemplary embodiment is not limited to this example. The plane coding unit 104 codes a second color plane of the input image data to generate second color plane coded data; the second color plane is described as the B color plane. The plane coding unit 105 codes a third color plane of the input image data to generate third color plane coded data; the third color plane is described as the R color plane.
An integration coding unit 106 integrates the coded data pieces respectively generated by the plane coding units 103, 104, and 105, generates header information required as a sequence and header information required as a frame, and integrates the header information pieces to form a bit stream. A terminal 107 outputs the bit stream which is generated by the integration coding unit 106 to the exterior.
An image coding operation in the above image coding apparatus will be described below.
The coding method is similar to that of H.264. The integration coding unit 106 generates the header information required for the sequence and the header information required for each frame prior to the coding processing in units of blocks. The header information required for a frame is always generated before the coding processing relating to that frame. According to the present exemplary embodiment, RGB color planes in the 4:4:4 format are coded, and information about the structure of the image data is coded as header information. Part of the header of the sequence (Sequence Parameter Set, hereinafter referred to as SPS) is illustrated in Table 1.
[Table 1]
In Table 1, a profile_idc code and a level_idc code respectively define the profile and the level of the sequence. A seq_parameter_set_id code is a number specific to the SPS. A chroma_format_idc code represents the sub-sampling of chrominance. A separate_colour_plane_idc code represents the dependence relationship among the color planes at the time of coding. The values taken by these codes are shown in Table 2.
[Table 2]
All rows except the lowermost one are the same as in H.264. If the chroma_format_idc code is 3 and the separate_colour_plane_idc code is 2, each plane is coded using the correlation among the color planes.
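The condition above can be expressed as a simple check. The following sketch is based only on the combination stated in the text (chroma_format_idc of 3 with separate_colour_plane_idc of 2); the labels returned for the other combinations are assumptions for illustration:

```python
def plane_coding_mode(chroma_format_idc, separate_colour_plane_idc):
    """Interpret the two SPS codes described in Table 2 (illustrative)."""
    if chroma_format_idc != 3:
        # Sub-sampled formats: planes are coded jointly (assumed label).
        return "joint"
    if separate_colour_plane_idc == 2:
        # 4:4:4 with inter-plane correlation exploited, per the text.
        return "separate_with_inter_plane_reference"
    # Other values: planes coded without inter-plane reference (assumed).
    return "separate_independent"
```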
Image data corresponding to one frame is input from the terminal 101 and stored in the frame memory 102. The image data of the G color plane, the image data of the B color plane, and the image data of the R color plane are respectively input to the plane coding unit 103, the plane coding unit 104, and the plane coding unit 105. Each of the plane coding units codes the corresponding color plane. The plane coding unit 104 and the plane coding unit 105 obtain information at the time of coding by the plane coding unit 103, and respectively code the corresponding color planes. The integration coding unit 106 integrates respective outputs of the plane coding units, and adds required header information to the integrated outputs.
Fig. 2 is a detailed block diagram of the plane coding unit. While the above-described AMVP is used as a motion vector coding method, the present exemplary embodiment is not limited to this method. Image data is input to a terminal 201. It is assumed that the image data is input in units of blocks to facilitate the understanding of the description.
An intra prediction unit 202 refers to a value of a pixel, which has been decoded, in a frame, to make a prediction. Intra prediction in H.264 will be described as an example to facilitate the understanding of the description. The intra prediction unit 202 outputs a method for the prediction and a generated prediction error to the succeeding stage. An inter prediction unit 203 refers to a value of a pixel, which has been decoded, in another frame, to perform motion compensation prediction. The inter prediction unit 203 outputs information for specifying a reference frame and a motion vector, and also outputs a prediction vector error generated by the motion compensation prediction.
A prediction determination unit 204 determines prediction information such as a block coding mode based on the prediction errors respectively output from the intra prediction unit 202 and the inter prediction unit 203. A transform and quantization unit 205 subjects the prediction errors to orthogonal transform to calculate an orthogonal transform coefficient, and further quantizes the orthogonal transform coefficient to calculate a quantization coefficient.
A coefficient coding unit 206 codes the calculated quantization coefficient. While a coding method is not particularly limited in the present invention, Huffman coding, arithmetic coding, Golomb coding, and the like may be used. An inverse quantization and inverse transform unit 207 performs inverse quantization on the quantization coefficient, which is reverse processing to the quantization performed by the transform and quantization unit 205, to generate a transform coefficient, and performs inverse transform to generate a prediction error. An image prediction reproduction unit 208 acquires a prediction method from the prediction determination unit 204 to generate image data from a value of a pixel that has been coded.
A frame memory 209 stores the generated image data so that pixels that have been decoded can be referred to. An intra prediction coding unit 211 codes intra prediction information, such as the intra prediction mode generated by the intra prediction unit 202, to generate intra prediction coding information code data. While the coding method is not particularly limited, Huffman coding, arithmetic coding, Golomb coding, and the like may be used.
Information such as a motion vector is input to a terminal 210 from the plane coding unit that codes the other color plane. A motion vector storage unit 212 stores a motion vector at the time of inter prediction coding. The motion vector stored in the motion vector storage unit 212 is output to the exterior from a terminal 214 according to a request from the exterior.
A reference motion vector group generation unit 213 generates a reference motion vector group, described below, from the motion vector input from the terminal 210 or the motion vector stored in the motion vector storage unit 212. A predicted motion vector selection and error calculation unit 215 determines a predicted motion vector of a motion vector to be coded from the reference motion vector group and calculates a prediction error of the determined predicted motion vector.
A motion vector coding unit 216 codes information about the determined predicted motion vector and the motion vector prediction error to generate motion vector code data. While the coding method is not particularly limited, Huffman coding, arithmetic coding, Golomb coding, and the like may be used. An inter prediction coding unit 217 codes inter prediction information such as an inter prediction mode generated by the inter prediction unit 203 and generates inter prediction coding information code data from the coded inter prediction information and the generated motion vector code data.
A prediction information coding unit 218 codes the prediction information such as the block coding mode output from the prediction determination unit 204 to generate prediction information code data. A selector 219 selects an input according to the block coding mode.
A header coding unit 220 codes information about a block. A code data formation unit 221 collects respective outputs of the header coding unit 220, the prediction information coding unit 218, the selector 219, and the coefficient coding unit 206, and generates a bit stream. The bit stream is output to the exterior from a terminal 222. According to the present exemplary embodiment, the terminal 210 is not required in the plane coding unit 103, and the terminal 214 is not required in the plane coding units 104 and 105. The terminals may be omitted.
An operation for coding each of the color planes in the above-described configuration will be described below.
Image data input in units of blocks from the terminal 201 is supplied to the intra prediction unit 202 and the inter prediction unit 203. The intra prediction unit 202 reads values of already-coded pixels around the coding target block in the same frame from the frame memory 209 to perform intra prediction, and outputs an intra prediction mode and an intra prediction error of the pixel values. The intra prediction mode is input to and coded by the intra prediction coding unit 211 to generate intra prediction code data. The prediction error is input to the prediction determination unit 204.
The inter prediction unit 203 searches for a motion vector to perform motion compensation from a value of a pixel around a position of a coding target block in a frame that is previously coded, and determines a motion vector which minimizes a prediction error of the pixel value and a motion compensation mode. While the motion compensation mode is not particularly limited, a plurality of frames may be referred to, similar to H.264. Alternatively, forward prediction, backward prediction, and bidirectional prediction may be used, similar to MPEG-1, -2, and -4. The determined motion vector is input to the motion vector storage unit 212, the predicted motion vector selection and error calculation unit 215, and the image prediction reproduction unit 208. The minimum prediction error of the pixel value is output to the prediction determination unit 204.
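The motion search performed by the inter prediction unit can be illustrated with a toy exhaustive search. This sketch uses the sum of absolute differences as the matching cost; the window size and cost function are assumptions, since the text does not fix a search method:

```python
def sad(block, ref, bx, by):
    """Sum of absolute differences between block and ref at offset (bx, by)."""
    return sum(abs(block[y][x] - ref[by + y][bx + x])
               for y in range(len(block)) for x in range(len(block[0])))

def motion_search(block, ref, x0, y0, search_range=2):
    """Exhaustively test displacements around (x0, y0); return the motion
    vector minimizing the prediction error, and that minimum error."""
    best = None
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            cost = sad(block, ref, x0 + dx, y0 + dy)
            if best is None or cost < best[0]:
                best = (cost, (dx, dy))
    return best[1], best[0]
```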
The prediction determination unit 204 compares the prediction errors of the pixel values which are respectively input from the intra prediction unit 202 and the inter prediction unit 203. If the prediction error input from the intra prediction unit 202 is small, the prediction determination unit 204 determines that the coding mode for the coding target block is an intra coding mode. Otherwise, the prediction determination unit 204 determines that the coding mode for the coding target block is an inter coding mode. The prediction determination unit 204 outputs the determined coding mode to the image prediction reproduction unit 208, the motion vector storage unit 212, the selector 219, and the prediction information coding unit 218.
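The mode decision above amounts to a single comparison. A minimal sketch follows; the tie-breaking choice (equal errors select inter coding) reflects the text's "otherwise" but is otherwise an assumption:

```python
def decide_coding_mode(intra_error, inter_error):
    """Pick the coding mode with the smaller pixel-value prediction error."""
    return "intra" if intra_error < inter_error else "inter"
```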
The prediction information coding unit 218 codes the input prediction information, generates prediction information code data, and inputs the generated prediction information code data to the code data formation unit 221. The smallest of the prediction errors of the pixel values used for the comparison is input to the transform and quantization unit 205. If the coding mode is the inter coding mode, the motion vector storage unit 212 stores the motion vector and the motion compensation mode in units of blocks, for reference in the succeeding stages and from the exterior.
The prediction error of the pixel value input to the transform and quantization unit 205 is orthogonally transformed and quantized to generate a quantization coefficient. The generated quantization coefficient is coded by the coefficient coding unit 206 to generate quantization coefficient code data, which is input to the code data formation unit 221. The quantization coefficient is also input to the inverse quantization and inverse transform unit 207, which generates a prediction error of a pixel value via inverse quantization and inverse orthogonal transform. This prediction error is input to the image prediction reproduction unit 208, which calculates a prediction value of the pixel from the frame memory 209 according to the coding mode and either the intra prediction mode or the motion compensation mode and motion vector, and reproduces the pixel value of the coding target block from the calculated prediction value and the generated prediction error. The reproduced pixel value is stored in the frame memory 209.
The reference motion vector group generation unit 213 first reads out the motion vectors of the blocks above, to the left of, and on the upper right of the coding target block from the motion vector storage unit 212. Such a motion vector is hereinafter referred to as a spatial predicted motion vector. The motion vector of the block at the same position as the coding target block, in the frame coded immediately before the coding target frame, is then read out from the motion vector storage unit 212. Such a motion vector is hereinafter referred to as a temporal predicted motion vector. In the plane coding unit 103, the above-described motion vectors are input.
Fig. 16 illustrates the motion vectors input to the reference motion vector group generation unit 213. In Fig. 16, the three motion vectors described below are input as spatial predicted motion vectors for a coding target block 1601 in the coding target plane. More specifically, a motion vector 1609 of a block 1602 above the coding target block 1601, a motion vector 1610 of a block 1603 on the upper right of the coding target block 1601, and a motion vector 1611 of a block 1604 to the left of the coding target block 1601 are input as the spatial predicted motion vectors. Further, a motion vector 1612 of a block 1605 at the same position as the coding target block 1601, in the frame coded immediately before the coding target frame, is input as the temporal predicted motion vector.
The plane coding units 104 and 105 further receive, from the terminal 210, the motion vector of the block at the same position as the coding target block generated by the plane coding unit 103. In Fig. 16, for the plane coding units 104 and 105, reference plane 1 is the first color plane, and they receive a motion vector 1613 of a block 1606 at the same position. A motion vector obtained by referring to another color plane is hereinafter referred to as an inter-plane predicted motion vector.
The plane coding unit 105 further receives, from the terminal 210, a motion vector of a block at the same position as that of the coding target block generated by the plane coding unit 104. In Fig. 16, for the plane coding unit 105, the reference plane 1 is the first color plane, and a reference plane 2 is the second color plane. The plane coding unit 105 can receive an inter-plane predicted motion vector 1614 of a block 1607 at the same position.
Thus, the reference motion vector group generation unit 213 receives the motion vectors used for prediction from the motion vector storage unit 212 and the terminal 210, based on the position of the coding target block. The motion vectors are also output from the motion vector storage unit 212 via the terminal 214.
The reference motion vector group generation unit 213 collects the motion vectors into a reference motion vector group. The reference motion vector group generation unit 213 may further newly calculate a motion vector for prediction from an average value or a median value of the motion vectors. Further, since motion vectors having the same components are redundant, only one of them may be retained to reduce the number of motion vectors. The motion vectors are then respectively assigned indexes and regarded as a reference motion vector group.
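The collection, deduplication, and indexing described above can be sketched as follows. This is a minimal illustration, not the apparatus itself: the function name, the (x, y) tuple representation, and the candidate ordering (inter-plane before temporal before spatial, as suggested for Figs. 5A to 5D) are assumptions.

```python
def build_reference_group(spatial, temporal=None, inter_plane=None):
    """Collect candidate motion vectors, drop duplicates, assign indexes.

    Each motion vector is an (x, y) tuple. Inter-plane and temporal
    candidates are optional; for the first color plane there is no
    inter-plane candidate.
    """
    candidates = []
    if inter_plane is not None:
        candidates.append(inter_plane)   # assumed highest priority
    if temporal is not None:
        candidates.append(temporal)
    candidates.extend(spatial)

    group = []
    for mv in candidates:
        if mv not in group:              # identical components are redundant
            group.append(mv)
    # map index -> motion vector, in priority order
    return {idx: mv for idx, mv in enumerate(group)}
```

For example, two spatial candidates with identical components collapse into one entry, so the group (and therefore the index to be coded) becomes smaller.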
Figs. 5A to 5D illustrate how the reference motion vector groups are formed. In coding of the first color plane, there is no inter-plane predicted motion vector. Therefore, the reference motion vector group can include a maximum of five motion vectors in the present exemplary embodiment, as illustrated in Fig. 5C. For example, if the spatial predicted motion vector 1 has the same components as the spatial predicted motion vector 4, and the spatial predicted motion vector 2 has the same components as the spatial predicted motion vector 3, the number of motion vectors in the reference motion vector group can be reduced, as illustrated in Fig. 5D. While the order of indexes is not particularly limited, the spatial predicted motion vectors may be assigned indexes in the order of the positions of their blocks, and the temporal predicted motion vector may be assigned an index higher in the order than those assigned to the spatial predicted motion vectors.
In the plane coding unit 104, the reference motion vector group is as illustrated in Fig. 5B. The inter-plane predicted motion vector is read out from the plane coding unit 103 and is incorporated into the reference motion vector group. While the order of indexes is not particularly limited, the inter-plane predicted motion vector may be assigned an index higher in the order than that assigned to the temporal predicted motion vector.
Further, in the plane coding unit 105, the reference motion vector group is as illustrated in Fig. 5A. One inter-plane predicted motion vector is obtained from each of the reference planes. The inter-plane predicted motion vectors, the temporal predicted motion vector, and the spatial predicted motion vectors having the same components can be reduced in number without being distinguished from one another.
Referring to Fig. 2 again, the coding target motion vector determined by the inter prediction unit 203 is input to the predicted motion vector selection and error calculation unit 215. The reference motion vector group is also input from the reference motion vector group generation unit 213. The predicted motion vector selection and error calculation unit 215 calculates a prediction error between the input coding target motion vector and each of the motion vectors in the reference motion vector group, and selects the motion vector which minimizes a motion vector prediction error as a predicted motion vector.
The index assigned to the predicted motion vector (see Figs. 5A to 5D) and the prediction error between the motion vector of the coding target block and the predicted motion vector are output. The index and the prediction error are input to the motion vector coding unit 216, and are coded to generate motion vector code data. The inter prediction coding unit 217 integrates the motion compensation mode generated by the inter prediction unit 203 and the motion vector code data output from the motion vector coding unit 216 to generate inter prediction code data.
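The selection performed by the predicted motion vector selection and error calculation unit 215 can be sketched as follows. This is a minimal illustration; the sum-of-absolute-differences cost and the tuple representation are assumptions, since the description does not fix a particular error measure.

```python
def select_predictor(target, group):
    """Pick the index whose candidate minimizes the prediction error.

    `target` is the coding target block's motion vector (x, y);
    `group` maps indexes to candidate motion vectors, as produced by a
    reference motion vector group. Returns the selected index and the
    per-component prediction error to be coded.
    """
    best_idx = min(
        group,
        key=lambda i: abs(target[0] - group[i][0]) + abs(target[1] - group[i][1]),
    )
    pred = group[best_idx]
    error = (target[0] - pred[0], target[1] - pred[1])
    return best_idx, error
```

The returned index plays the role of the selection information, and the returned pair is the motion vector prediction error passed to the motion vector coding unit 216.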
The intra prediction code data from the intra prediction coding unit 211 and the inter prediction code data from the inter prediction coding unit 217 are input to the selector 219. The selector 219 selects the output of the intra prediction coding unit 211 if the prediction information from the prediction determination unit 204 indicates the intra coding mode, selects the output of the inter prediction coding unit 217 if it indicates the inter coding mode, and inputs the selected output to the code data formation unit 221. Prediction information code data and header code data are input to the code data formation unit 221, respectively from the prediction information coding unit 218 and the header coding unit 220. The code data formation unit 221 forms code data in units of blocks, and outputs the formed code data from the terminal 222.
Referring to Fig. 1 again, the integration coding unit 106 receives the first color plane coded data, the second color plane coded data, and the third color plane coded data, respectively from the plane coding unit 103, the plane coding unit 104, and the plane coding unit 105. The integration coding unit 106 collects the code data in units of blocks to constitute code data in a frame unit, forms the code data according to a format of a bit stream, and outputs the formed code data from the terminal 107.
Figs. 3 and 4 are flowcharts illustrating motion vector coding processing in the image coding apparatus according to the first exemplary embodiment. Fig. 3 illustrates processing for coding the first color plane, and Fig. 4 illustrates processing for coding the second color plane or the third color plane.
The processing for coding the first color plane will be described with reference to Fig. 3. In step S301, the image coding apparatus first inputs a motion vector of a coding target block, and stores the input motion vector so that it can be referred to when other motion vectors are predicted.
In step S302, the image coding apparatus calculates spatial predicted motion vectors from blocks around the coding target block and a temporal predicted motion vector from a block in the frame coded immediately before the coding target frame. In step S303, the image coding apparatus generates a reference motion vector group from the predicted motion vectors calculated in step S302. When the reference motion vector group is generated, redundant motion vectors are reduced in number and the remaining motion vectors are respectively assigned indexes.
In step S304, the image coding apparatus compares the motion vectors in the reference motion vector group with a motion vector of the coding target block, determines a predicted motion vector which minimizes a motion vector prediction error, and generates an index assigned thereto as selection information. The minimum motion vector prediction error is regarded as a motion vector prediction error of the block. In step S305, the image coding apparatus codes the index serving as the selection information to generate selection information code data.
In step S306, the image coding apparatus codes the motion vector prediction error of the coding target block.
Then, processing for coding the second color plane or the third color plane will be described below with reference to Fig. 4. In Fig. 4, steps implementing similar functions to those implemented by the steps illustrated in Fig. 3 are assigned the same reference numerals, and hence description thereof is not repeated.
In step S401, the image coding apparatus specifies a frame most recently coded, and specifies an inter-plane reference block which is a block at the same position as that of the coding target block. In step S402, the image coding apparatus extracts a motion vector of the specified inter-plane reference block.
In step S403, the image coding apparatus generates a reference motion vector group from the spatial predicted motion vectors and the temporal predicted motion vector extracted in step S302 and the inter-plane predicted motion vector extracted in step S402. When the reference motion vector group is generated, redundant motion vectors are reduced in number and the remaining motion vectors are respectively assigned indexes. In steps S304 to S306, the image coding apparatus determines the selection information, calculates a motion vector prediction error, and codes the selection information and the motion vector prediction error.
According to the above-described configuration and operation, prediction coding of the motion vector is performed among the color planes, so that the motion vector can be efficiently coded using a correlation among the color planes. Further, the inter-plane predicted motion vector is assigned an index higher in the order within the reference motion vector group, so that the correlation can be used more effectively.
While input image data is temporarily stored in the frame memory 102 according to the present exemplary embodiment, the present exemplary embodiment is not limited to this configuration. For example, the frame memory 102 may be replaced with line memories. While motion vectors of blocks at the positions illustrated in Fig. 16 are included in the reference motion vector group, the present exemplary embodiment is not limited to this example. For example, a motion vector of a block on the right of and adjacent to the block 1601 may be added. A motion vector obtained from a median value of the motion vectors may also be added. Further, the inter-plane reference is not limited to a motion vector of a block at the same position in another plane. A motion vector of a block around the block 1606 illustrated in Fig. 16, or a motion vector of another block, may be referred to.
While image data including the RGB color planes is described as an example according to the present exemplary embodiment, the present exemplary embodiment is not limited to this example. The image data may include other color planes, such as the above-described YCbCr color planes and YCgCo color planes. Further, while the 4 : 4 : 4 format is used for the description, the present exemplary embodiment is not limited to this example. Coding may be performed in a 4 : 4 : 4 : 4 format by adding an alpha plane used to composite an image of the 4 : 4 : 4 format.
While AMVP is used as the motion vector prediction coding method in the present exemplary embodiment, the present exemplary embodiment is not limited to this example. Motion vectors to be referred to may include a temporal predicted motion vector and an inter-plane predicted motion vector, similar to H.264.
According to a second exemplary embodiment, as a motion vector coding method, a prediction error between a motion vector and an inter-plane predicted motion vector is coded.
Fig. 6 is a block diagram illustrating an image coding apparatus according to the present exemplary embodiment. In Fig. 6, it is assumed that image data including RGB color planes in a proportion of 4 : 4 : 4 is coded to facilitate the understanding of the description in the present exemplary embodiment. A plane order determination unit 608 determines the order of color planes in the coding. A plane order coding unit 609 codes the determined order of color planes. Plane coding units 603, 604, and 605 respectively code the color planes.
The plane coding unit 603 codes a first color plane of the input image data to generate first color plane coded data. While the first color plane is described as a G color plane, the present exemplary embodiment is not limited to this example. The plane coding unit 604 codes a second color plane of the input image data to generate second color plane coded data. The second color plane is described as a B color plane. The plane coding unit 605 codes a third color plane of the input image data to generate third color plane coded data. The third color plane is described as an R color plane.
An integration coding unit 606 integrates the coded data respectively generated by the plane coding units 603, 604, and 605, generates header information required for a sequence and header information required for a frame, and integrates the generated header information pieces to form a bit stream. The bit stream generated by the integration coding unit 606 is output to the exterior from a terminal 607.
An image coding operation in the above image coding apparatus will be described below.
Prior to the coding, the integration coding unit 606 sets a chroma_format_idc code and a separate_colour_plane_flag code to 3 and 2, respectively, and includes the codes in the SPS.
The input image data is stored in a frame memory 102. A G color plane, a B color plane, and an R color plane of the image data are respectively input to the plane coding units 603, 604, and 605.
The plane order determination unit 608 determines the coding order of the color planes from states of the color planes of the image data in the frame memory 102. A method for determining the coding order is not limited. For example, the sum of absolute values of differences among the color planes may be calculated to determine the coding order of the color planes in the descending order of the sums. Alternatively, standard errors in the respective color planes may be calculated to determine the coding order in the descending order of the errors. According to the present exemplary embodiment, the G color plane, the B color plane, and the R color plane are coded in this order, and the first color plane, the second color plane, and the third color plane are respectively the G color plane, the B color plane, and the R color plane. The order is input to each of the plane order coding unit 609 and the plane coding units 603, 604, and 605.
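The order determination described above might be sketched as follows, using per-plane sample variance as a stand-in activity measure. The function name and data layout are illustrative; the description mentions sums of absolute inter-plane differences or standard errors as possible criteria, so the variance used here is only one plausible choice.

```python
def determine_plane_order(planes):
    """Order color planes in descending order of an activity measure.

    `planes` maps a plane name (e.g. "G", "B", "R") to a flat list of
    pixel samples. The plane listed first is coded first.
    """
    def variance(samples):
        mean = sum(samples) / len(samples)
        return sum((s - mean) ** 2 for s in samples) / len(samples)

    return sorted(planes, key=lambda name: variance(planes[name]), reverse=True)
```

A plane with more variation is coded earlier, on the assumption that its motion vectors serve as better predictors for the remaining planes.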
The plane order coding unit 609 codes the G color plane, the B color plane, and the R color plane as the first color plane, the second color plane, and the third color plane, respectively, and generates plane order code data. While a coding method is not particularly limited in the present invention, Huffman coding, arithmetic coding, Golomb coding, and the like may be used. For example, the values of the orders may be coded: the order 3 of the R plane, the order 1 of the G plane, and the order 2 of the B plane. In this case, once the respective orders of the R plane and the G plane are assigned, the order of the B plane is uniquely determined. Therefore, the coding of the order of the B plane may be omitted. Alternatively, a Video Usability Information (VUI) parameter can also be used, similar to H.264. An example in which a matrix_coefficients code in the VUI parameter is expanded is illustrated in Table 3.
[Table 3]
Table 3 illustrates the value of the code and the combination of color planes to be taken. The combination is determined by setting the value of the code to 0 or to a value from 9 to 13. The color planes are arranged in coding order from the left.
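The observation that the last plane's order need not be coded can be illustrated as follows. This is a hypothetical sketch: the order values 1 to 3 follow the example above, and the plain dictionary stands in for the actual entropy coding (Huffman, arithmetic, Golomb, or the like).

```python
def code_plane_order(order):
    """Code the color-plane order, omitting one plane whose position
    is uniquely determined once the others are known.

    `order` is the coding order, e.g. ["G", "B", "R"]. The order of the
    B plane is omitted here, matching the example in the description.
    """
    return {p: order.index(p) + 1 for p in ("R", "G")}

def decode_plane_order(coded):
    """Recover the full coding order from the coded R and G positions."""
    positions = dict(coded)
    positions["B"] = ({1, 2, 3} - set(positions.values())).pop()
    inverse = {v: k for k, v in positions.items()}
    return [inverse[i] for i in (1, 2, 3)]
```

With three planes and the orders of two of them transmitted, the remaining position is the one value in {1, 2, 3} not yet used, so the decoder reconstructs the order without the omitted code.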
The plane coding unit 603 codes the G color plane. Since the plane order determination unit 608 has designated the G color plane as the first color plane, it is coded without referring to the other color planes. The plane coding unit 604 codes the B color plane. Since the B color plane is designated as the second color plane, it is coded by referring to only the first color plane. The plane coding unit 605 codes the R color plane. Since the R color plane is designated as the third color plane, it is coded by referring to the first color plane and the second color plane.
Fig. 7 is a detailed block diagram of the plane coding units 603, 604, and 605. In Fig. 7, blocks implementing similar functions to those implemented by the blocks illustrated in Fig. 2 are assigned the same reference numerals, and hence description thereof is not repeated. The order of the color planes determined by the plane order determination unit 608 is input to a terminal 701 of each of the plane coding units 603, 604, and 605.
A motion vector in another plane is input to a terminal 702. A terminal 210 has the same function. These terminals are connected to different plane coding units. Terminals 704 and 705 are also respectively connected to different plane coding units, and output the motion vectors to the exterior. For example, the terminal 210 and the terminal 704 of the plane coding unit 603 are respectively connected to the terminal 705 and the terminal 702 of the plane coding unit 604. The terminal 210 and the terminal 704 of the plane coding unit 604 are respectively connected to the terminal 705 and the terminal 702 of the plane coding unit 605. The terminal 210 and the terminal 704 of the plane coding unit 605 are respectively connected to the terminal 705 and the terminal 702 of the plane coding unit 603.
A motion vector input/output control unit 703 controls input and output of the motion vectors according to the order of color planes input from the terminal 701. If the plane coding unit codes the first color plane, the terminal 210 and the terminal 702 do not operate, so that no motion vector is input from the other plane coding units. If the second color plane is coded, the one of the terminal 210 and the terminal 702 that is connected to the plane coding unit of the third color plane does not operate, so that no motion vector is input from that unit. If the third color plane is coded, the one of the terminal 210 and the terminal 702 that is connected to the plane coding unit of the second color plane does not operate, so that the motion vector is input only from the plane coding unit of the first color plane. However, the present exemplary embodiment is not limited to this example. If the third color plane is coded, the motion vector in the second color plane may also be input. Alternatively, a predicted motion vector may be calculated from the motion vectors in the first color plane and the second color plane.
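The basic control described above might be sketched as follows for the fixed G, B, R assignment of the plane coding units 603 to 605. The function shape and string returns are illustrative assumptions; the terminal numbers follow Fig. 7, and the routing assumes the basic operation in which the second and third planes both reference the first color plane.

```python
def operating_input_terminal(plane_order_index, unit):
    """Return the motion-vector input terminal that operates for the
    given coding order, or None if neither terminal operates.

    `plane_order_index` is 0, 1, or 2 (first, second, third in coding
    order); `unit` names the plane coding unit ("603", "604", "605").
    """
    if plane_order_index == 0:
        return None        # first plane: terminals 210 and 702 disabled
    if unit == "604":      # second plane: fed from unit 603 via terminal 702
        return "702"
    return "210"           # third plane (unit 605): fed from unit 603 via 210
```

This mirrors the later description in which the unit 604 does not receive input from its terminal 210, and the unit 605 does not receive input from its terminal 702.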
A motion vector storage unit 712 differs from the motion vector storage unit 212 illustrated in Fig. 2 in that a motion vector is not directly output to the exterior. A motion vector error calculation unit 713 calculates a prediction error between a predicted motion vector input from the motion vector input/output control unit 703 and a motion vector of an input coding target block. A motion vector coding unit 716 differs from the motion vector coding unit 216 illustrated in Fig. 2 in that reference information about a reference motion vector group is neither input nor coded.
An operation for coding each of the color planes in the above-described configuration will be described below. The integration coding unit 606 illustrated in Fig. 6 adds and outputs required header information before starting the coding, similarly to the integration coding unit 106 in the first exemplary embodiment. The plane order determination unit 608 may previously determine the order of color planes, or may dynamically determine the order of color planes to match input image data. For example, the order of color planes may be fixed to the G color plane, the B color plane, and the R color plane, similar to the first exemplary embodiment. Alternatively, the order of color planes may be determined from the characteristic of a frame image, as described above.
The determined order is input to the plane order coding unit 609. The plane order coding unit 609 codes the respective orders of the color planes and generates color plane order code data. The generated color plane order code data is input to the integration coding unit 606. The order of color planes may be fixed for the whole sequence. Alternatively, the order of color planes may be changed to match a scene change.
Processing in units of blocks will be described below. In Fig. 7, image data in units of blocks is input from the terminal 201 to perform intra prediction and inter prediction. In the inter prediction unit 203, motion compensation is performed, similar to the first exemplary embodiment, to calculate a motion vector of a coding target block. The calculated motion vector is input to the motion vector storage unit 712 and the motion vector error calculation unit 713.
If the plane coding unit is the plane coding unit 603, the input image data of the G color plane is coded in units of blocks as the first color plane. The motion vector input/output control unit 703 receives, from the terminal 701, input indicating that the first color plane is coded, and does not receive input from the terminal 210 or the terminal 702. Motion vectors around the coding target block in the coding target frame are input from the motion vector storage unit 712. In this case, spatial predicted motion vectors of the coding target block are extracted from the motion vector storage unit 712, similar to H.264. A predicted motion vector is calculated from a median value of the extracted spatial predicted motion vectors, and is input to the motion vector error calculation unit 713.
If the plane coding unit is the plane coding unit 604, the input image data of the B color plane is coded in units of blocks as the second color plane. The motion vector input/output control unit 703 receives, from the terminal 701, input indicating that the second color plane is coded, and does not receive input from the terminal 210 connected to the plane coding unit 605 that codes the third color plane at this time. A motion vector of a block of the G color plane at the same position as that of the coding target block is input to the motion vector input/output control unit 703 from the motion vector storage unit 712 in the plane coding unit 603 via the terminal 704 of the plane coding unit 603 and the terminal 702 of the plane coding unit 604. The motion vector input/output control unit 703 inputs the received motion vector as a predicted motion vector to the motion vector error calculation unit 713.
If the plane coding unit is the plane coding unit 605, the input image data of the R color plane is coded in units of blocks as the third color plane. The motion vector input/output control unit 703 receives, from the terminal 701, input indicating that the third color plane is coded, and does not receive input from the terminal 702 connected to the plane coding unit 604 that codes the second color plane at this time. A motion vector of a block in the G color plane at the same position as that of the coding target block is input to the motion vector input/output control unit 703 from the motion vector storage unit 712 in the plane coding unit 603 via the terminal 705 of the plane coding unit 603 and the terminal 210 of the plane coding unit 605.
The motion vector input/output control unit 703 inputs the received motion vector as a predicted motion vector to the motion vector error calculation unit 713. The motion vector error calculation unit 713 receives the predicted motion vector and the motion vector of the coding target block generated by the inter prediction unit 203, and calculates a motion vector prediction error. For example, the motion vector prediction error may be calculated by subtracting each component of the predicted motion vector from the corresponding component of the motion vector of the coding target block.
Fig. 17 illustrates how the prediction error is calculated. In Fig. 17, if the coding target plane is a first color plane 1701, a motion vector obtained from a median value of motion vectors of blocks around a coding target block 1704, which has a motion vector 1705, is regarded as a predicted motion vector. More specifically, a motion vector obtained from a median value of a motion vector 1708 of a block 1706 above the coding target block 1704 and a motion vector 1709 of a block 1707 to the left thereof is regarded as the predicted motion vector. A prediction error is calculated from the predicted motion vector. If the coding target plane is a second color plane 1702, a motion vector 1712 having the same components as those of the motion vector 1705 in the first color plane 1701 is regarded as a predicted motion vector, and a prediction error 1713 is calculated with respect to a motion vector 1711 of a coding target block 1710. If the coding target plane is a third color plane 1703, a motion vector 1716 having the same components as those of the motion vector 1705 in the first color plane 1701 is regarded as a predicted motion vector, and a prediction error 1717 is calculated with respect to a motion vector 1715 of a coding target block 1714.
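The two prediction paths of Fig. 17 can be sketched together as follows. This is an illustrative function, not the apparatus itself: the names, the (x, y) tuples, and the component-wise median over however many neighboring vectors are supplied are assumptions consistent with the description above.

```python
def predict_and_error(target_mv, plane_index, spatial_mvs=None, first_plane_mv=None):
    """Compute the motion vector prediction error of the second embodiment.

    For the first color plane (plane_index 0) the predictor is the
    component-wise median of the neighboring spatial motion vectors;
    for the second and third planes it is the co-located motion vector
    of the first color plane. The error is a per-component difference.
    """
    if plane_index == 0:
        xs = sorted(mv[0] for mv in spatial_mvs)
        ys = sorted(mv[1] for mv in spatial_mvs)
        pred = (xs[len(xs) // 2], ys[len(ys) // 2])   # component-wise median
    else:
        pred = first_plane_mv                          # inter-plane predictor
    return (target_mv[0] - pred[0], target_mv[1] - pred[1])
```

Only the resulting error pair (and no selection index) needs to be coded, which is what distinguishes this embodiment from the reference-group approach of the first embodiment.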
The motion vector coding unit 716 codes the generated prediction error and generates motion vector code data. The motion vector of the coding target block generated by the inter prediction unit 203 is stored in the motion vector storage unit 712. The motion vector code data, thus generated by coding the prediction error calculated using a predictor that may be an inter-plane predicted motion vector, is output to the exterior from the terminal 222 via the inter prediction coding unit 217 and the code data formation unit 221.
Referring to Fig. 6 again, the integration coding unit 606 receives the plane order code data from the plane order coding unit 609 and the code data of each color plane from the corresponding plane coding unit. The integration coding unit 606 integrates the plane order code data with a header in a sequence to form a bit stream and outputs the formed bit stream via the terminal 607.
Fig. 8 is a flowchart illustrating motion vector coding processing in the image coding apparatus according to the second exemplary embodiment. The processing for coding the first color plane illustrated in Fig. 8 is similar to that illustrated in Fig. 3 in the first exemplary embodiment, and hence description thereof is not repeated. However, a temporal predicted motion vector is not extracted in step S302 and it is not added to a reference motion vector group in step S303. Fig. 8 illustrates processing for coding a second color plane or a third color plane.
The processing for coding the second color plane will be described with reference to Fig. 8. In Fig. 8, steps implementing similar functions to those implemented by the steps illustrated in Fig. 3 are assigned the same reference numerals, and hence description thereof is not repeated. In step S801, the image coding apparatus specifies a frame most recently coded, and specifies an inter-plane reference block which is a block at the same position as that of the coding target block.
In step S802, the image coding apparatus extracts a motion vector of the specified inter-plane reference block as a predicted motion vector. In step S803, the image coding apparatus calculates a motion vector prediction error from the motion vector of the coding target block which is input in step S301 and the predicted motion vector which is extracted in step S802.
According to the above-described configuration and operation, prediction coding of a motion vector is performed among color planes by ordering the color planes and adaptively referring to motion vectors of the other color planes, so that the motion vector can be efficiently coded using a correlation among the color planes in a simple configuration.
While the order of color planes is determined before the sequence is coded according to the present exemplary embodiment, the present exemplary embodiment is not limited to this example. For example, the order of color planes may be changed by detecting a scene change or the like. In this case, the order of color planes can be changed by including a code representing the orders of the color planes in a picture parameter set in H.264.
According to the present exemplary embodiment, a method for generating a predicted motion vector is not limited to the above-described example. For example, in the coding of the third color plane, a predicted motion vector may be calculated from the motion vectors respectively input from the terminal 210 and the terminal 702. For example, the average of the motion vectors may be regarded as a predicted motion vector. While the order of the color planes is individually coded according to the coding order thereof, there is also a method of sharing a combination of the orders between the coding side and the decoding side and coding identification information of the combination.
Fig. 9 is a block diagram illustrating a configuration of an image decoding apparatus according to a third exemplary embodiment of the present invention. According to the present exemplary embodiment, decoding of coded data which is generated according to the first exemplary embodiment will be described as an example. A bit stream is input to a terminal 901. An integration decoding unit 902 decodes header information required as a sequence and header information required as a frame, separates the bit stream into code data pieces of respective color planes, and outputs the separated code data pieces to the succeeding stages.
A plane decoding unit 903 decodes input code data of a first color plane to generate image data of the first color plane. While the first color plane is described as a G color plane, the present exemplary embodiment is not limited to this example. A plane decoding unit 904 decodes input code data of a second color plane to generate image data of the second color plane. The second color plane is described as a B color plane. A plane decoding unit 905 decodes input code data of a third color plane to generate image data of the third color plane. The third color plane is described as an R color plane.
A frame memory 906 temporarily stores the image data. The image data which is decoded and reproduced is output to the exterior from a terminal 907.
An image decoding operation in the above-described image decoding apparatus will be described below.
In Fig. 9, a bit stream input from the terminal 901 is input to the integration decoding unit 902. The integration decoding unit 902 decodes code data of the header information required as the sequence to reproduce a configuration of the color planes. In the first exemplary embodiment, a chroma_format_idc code is 3, and a separate_colour_plane_flag code is 2. At this time, the integration decoding unit 902 identifies that the color planes include G, B, and R color planes and decoding is performed using a correlation among the color planes. The integration decoding unit 902 separates the code data pieces of the G color plane, the B color plane, and the R color plane from the bit stream to be subsequently input and outputs the code data pieces respectively to the plane decoding units 903, 904, and 905.
Fig. 10 is a detailed block diagram of the plane decoding unit. Code data is input to a terminal 100. It is assumed that the code data is input in units of blocks to facilitate the understanding of the description. A code data separation unit 1002 separates the input code data into code data required in the succeeding stage, and outputs the separated code data. A header decoding unit 1003 decodes code data of a header and reproduces information about a block.
A quantization coefficient decoding unit 1004 decodes quantization coefficient code data to reproduce a quantization coefficient. An inverse quantization and inverse transform unit 1005 subjects the quantization coefficient to inverse quantization and inverse transform to reproduce a prediction error. A prediction information decoding unit 1009 reproduces prediction information such as a block coding mode.
A selector 1010 selects an output according to the block coding mode. An inter prediction decoding unit 1011 decodes and generates inter prediction information such as an inter prediction mode. A motion vector decoding unit 1016 decodes motion vector code data and reproduces information about a predicted motion vector and a motion vector prediction error.
Information such as a motion vector is input to a terminal 1017 from the plane decoding unit in the other color plane. A motion vector storage unit 1019 stores a motion vector subjected to inter prediction coding. A reference motion vector group generation unit 1018 generates a reference motion vector group similarly to the reference motion vector group generation unit 213 in the first exemplary embodiment from the motion vector input from the terminal 1017 and the motion vector stored in the motion vector storage unit 1019.
The motion vector stored in the motion vector storage unit 1019 is output to the exterior from a terminal 1020 according to a request from the exterior. A predicted motion vector selection unit 1021 selects a predicted motion vector from the reference motion vector group generated by the reference motion vector group generation unit 1018. A motion vector reproduction unit 1022 reproduces a motion vector of a decoding target block from the motion vector prediction error reproduced by the motion vector decoding unit 1016 and the predicted motion vector selected by the predicted motion vector selection unit 1021.
An inter prediction unit 1012 refers to a value of a decoded pixel in another frame according to the reproduced motion vector to perform motion compensation prediction. An intra prediction decoding unit 1013 decodes intra prediction coding information code data to generate intra prediction information such as an intra prediction mode. An intra prediction unit 1014 refers to a value of a decoded pixel in a frame according to the generated intra prediction information such as the intra prediction mode and makes a prediction.
A selector 1015 selects an input according to a block coding mode, similarly to the selector 1010. An image prediction reproduction unit 1006 receives a prediction value generated by the intra prediction unit 1014 or the inter prediction unit 1012 and reproduces image data of the decoding target block using the prediction error input from the inverse quantization and inverse transform unit 1005. A frame memory 1007 stores the reproduced image data. The reproduced image data is output to the exterior from a terminal 1008.
An image data decoding operation in the above-described configuration will be described below. Code data of a decoding target block input from the terminal 1001 is input to the code data separation unit 1002 and is classified into respective code data pieces. Code data of header information is input to the header decoding unit 1003, a motion vector code is input to the motion vector decoding unit 1016, and prediction information code data is input to the prediction information decoding unit 1009.
The prediction information decoding unit 1009 first decodes the prediction information, generates a block coding mode, and inputs the generated coding mode to the selector 1010 and the selector 1015. The selector 1010 determines an output destination of the input code data according to the coding mode output from the prediction information decoding unit 1009. If the coding mode is an intra prediction mode, the input code data is output to the intra prediction decoding unit 1013. Otherwise, the input code data is output to the inter prediction decoding unit 1011.
The selector 1015 determines an input source according to the coding mode output from the prediction information decoding unit 1009. If the coding mode is an intra prediction mode, the prediction value input from the intra prediction unit 1014 is selected. Otherwise, the prediction value input from the inter prediction unit 1012 is selected.
At the time of the intra prediction mode, the code data from the selector 1010 is input to the intra prediction decoding unit 1013, is decoded, and is input to the intra prediction unit 1014. The intra prediction decoding unit 1013 reproduces a coding mode of the decoding target block, and inputs the intra prediction mode to the intra prediction unit 1014. The intra prediction unit 1014 refers to a value of a decoded pixel in a frame to make a prediction, and calculates a prediction value.
On the other hand, the code data separation unit 1002 separates the motion vector code data from the code data, and inputs the separated motion vector code data to the motion vector decoding unit 1016. The motion vector decoding unit 1016 decodes the input motion vector code data to reproduce an index (see Fig. 5) of a predicted motion vector and a prediction error of the decoding target block. The index and the prediction error are respectively input to the predicted motion vector selection unit 1021 and the motion vector reproduction unit 1022. In parallel therewith, a spatial predicted motion vector, a temporal predicted motion vector, and an inter-plane predicted motion vector are respectively calculated from blocks around the decoding target block, a block in a previously decoded frame, and a motion vector in the other color plane.
The reference motion vector group generation unit 1018 receives the motion vector of the other color plane from the terminal 1017. In the case where the first color plane is decoded, no inter-plane predicted motion vector is input from the terminal 1017. In the case where the second color plane is decoded, the motion vector of the first color plane is input from the terminal 1017. In the case where the third color plane is decoded, the motion vector of the second color plane and the motion vector of the first color plane are input from the terminal 1017. These motion vectors are read out via the terminal 1020 of the corresponding plane decoding unit.
The reference motion vector group generation unit 1018 receives a spatial predicted motion vector and a temporal predicted motion vector from the motion vector storage unit 1019. These motion vectors are collected into a reference motion vector group. Further, since motion vectors having the same components are redundant, only one of them may be retained to reduce the number of motion vectors, similar to the first exemplary embodiment.
Then, the reference motion vector group generation unit 1018 assigns an index to each of the motion vectors, and collects the motion vectors into a reference motion vector group. The calculated reference motion vector group is input to the predicted motion vector selection unit 1021. The predicted motion vector selection unit 1021 selects a predicted motion vector from the reference motion vector group according to the input index. The motion vector reproduction unit 1022 reproduces a motion vector of the decoding target block from the obtained predicted motion vector and motion vector prediction error.
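The construction and use of the reference motion vector group described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the (x, y) tuple representation and the function names are assumptions. The inter-plane candidates are placed first, reflecting the later remark that arranging the inter-plane predicted motion vector at the head of the group uses the correlation more effectively.

```python
def build_reference_group(spatial, temporal, inter_plane):
    """Collect candidate motion vectors, drop duplicates, and return an
    indexed list. `inter_plane` may be empty when the first color plane
    is decoded, since no other plane's motion vector is available yet."""
    group = []
    for mv in list(inter_plane) + list(spatial) + list(temporal):
        if mv is not None and mv not in group:  # keep only one of identical vectors
            group.append(mv)
    return group

def select_predicted_mv(group, index):
    # The decoded index directly selects the predicted motion vector.
    return group[index]

group = build_reference_group(spatial=[(2, 1), (2, 1)], temporal=[(0, 0)],
                              inter_plane=[(3, 2)])
print(group)                          # [(3, 2), (2, 1), (0, 0)]
print(select_predicted_mv(group, 0))  # (3, 2)
```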
At the time of the inter prediction mode, the code data from the selector 1010 is input to the inter prediction decoding unit 1011, is decoded, and is input to the inter prediction unit 1012. The inter prediction unit 1012 refers to decoded pixel data according to the inter prediction mode based on the reproduced motion vector and calculates a prediction value.
The selector 1015 determines an output destination of the input code data according to the coding mode output from the prediction information decoding unit 1009. If the coding mode is the intra prediction mode, an input from the intra prediction unit 1014 is output to the image prediction reproduction unit 1006. Otherwise, an input from the inter prediction unit 1012 is output to the image prediction reproduction unit 1006.
The image prediction reproduction unit 1006 reproduces image data of the decoding target block from the prediction value input from the selector 1015 and the prediction error input from the inverse quantization and inverse transform unit 1005. The reproduced image data is stored at a predetermined position in the frame memory 1007. The stored image data is used by the inter prediction unit 1012 and the intra prediction unit 1014. The reproduced image data is output from the terminal 1008 to the exterior, e.g., to the frame memory 906 illustrated in Fig. 9 according to the present exemplary embodiment.
Referring to Fig. 9 again, the image data of the color plane output from each of the plane decoding units is stored in the frame memory 906, and is output from the terminal 907 as image data in a frame.
Figs. 11 and 12 are flowcharts illustrating motion vector decoding processing in the image decoding apparatus according to the third exemplary embodiment. Fig. 11 illustrates processing for decoding a motion vector of a first color plane, and Fig. 12 illustrates processing for decoding motion vectors of a second color plane and a third color plane.
In Fig. 11, in step S1101, the image decoding apparatus first decodes selection information code data to reproduce an index serving as selection information. In step S1102, the image decoding apparatus decodes motion vector code data to reproduce a prediction error between a motion vector of the decoding target block and a predicted motion vector.
In step S1103, the image decoding apparatus extracts a spatial predicted motion vector from blocks around the decoding target block and a temporal predicted motion vector from a block decoded immediately before the decoding target block, similar to the processing in step S302 illustrated in Fig. 3 in the first exemplary embodiment. In step S1104, the image decoding apparatus generates a reference motion vector group from the extracted motion vectors, similar to the processing in step S303 illustrated in Fig. 3 in the first exemplary embodiment. In this processing, redundant motion vectors are reduced in number, and the motion vectors are assigned indexes.
In step S1105, the image decoding apparatus selects the predicted motion vector from the reference motion vector group according to the index reproduced in step S1101. In step S1106, the image decoding apparatus reproduces a motion vector of the decoding target block from the prediction error reproduced in step S1102 and the predicted motion vector selected in step S1105. Further, the reproduced motion vector is stored for reference in the succeeding stage, and the processing for decoding the motion vector in the first color plane is then ended.
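The reproduction step S1106 above amounts to a component-wise addition of the selected predicted motion vector and the decoded prediction error. The following sketch is illustrative; the function name and tuple representation are assumptions, not taken from the patent.

```python
def reproduce_motion_vector(predicted_mv, prediction_error):
    """Reproduce the decoding target block's motion vector (step S1106):
    predicted motion vector plus decoded prediction error, per component."""
    px, py = predicted_mv
    ex, ey = prediction_error
    return (px + ex, py + ey)

# E.g., a predicted vector (4, -1) and a decoded prediction error (-1, 3):
print(reproduce_motion_vector((4, -1), (-1, 3)))  # (3, 2)
```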
In Fig. 12, steps implementing similar functions to those illustrated in Fig. 11 are assigned the same reference numerals, and hence detailed description thereof is omitted. In step S1201, the image decoding apparatus specifies a frame most recently decoded, and specifies an inter-plane reference block which is a block at the same position as that of the decoding target block.
In step S1202, the image decoding apparatus extracts a motion vector of an inter-plane reference block as an inter-plane predicted motion vector from the motion vectors stored by the motion vector decoding processing in other planes. In decoding of the second color plane, the motion vector of the inter-plane reference block in the first color plane is regarded as the inter-plane predicted motion vector. In decoding of the third color plane, motion vectors of the inter-plane reference blocks in the first color plane and in the second color plane are used as the inter-plane predicted motion vector.
In step S1203, the image decoding apparatus generates a reference motion vector group from the spatial predicted motion vector and the temporal predicted motion vector extracted in step S1103, and the inter-plane predicted motion vector extracted in step S1202. In this processing, redundant motion vectors are reduced in number, and the motion vectors are assigned indexes.
In step S1204, the image decoding apparatus selects a predicted motion vector from the reference motion vector group generated in step S1203 according to the index reproduced in step S1101.
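Steps S1201 and S1202 above rely on the inter-plane reference block being the block at the same position as the decoding target block, so its motion vector can be looked up directly in the other plane's stored motion vector field. The sketch below is illustrative only; the dictionary-based storage keyed by block position is an assumption.

```python
def extract_inter_plane_mv(other_plane_mv_field, block_x, block_y):
    """Return the motion vector stored for the co-located block in another
    color plane (the inter-plane reference block), or None if absent."""
    return other_plane_mv_field.get((block_x, block_y))

# Hypothetical motion vector field of the already-decoded first color plane:
first_plane = {(0, 0): (1, 1), (1, 0): (5, -2)}
print(extract_inter_plane_mv(first_plane, 1, 0))  # (5, -2)
```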
According to the above-described configuration and operation, motion vector code data that has been efficiently coded using a correlation among the color planes can be decoded.
Further, the inter-plane predicted motion vector is arranged at the head of the reference motion vector group so that the correlation can be used more effectively. While the decoded image data is stored once in the frame memory 906 according to the present exemplary embodiment, the present exemplary embodiment is not limited to this configuration. For example, the frame memory 906 may be replaced with a line memory. Alternatively, the image data may be output in pixel units in a dot sequential system.
While image data including the RGB color planes is described as an example according to the present exemplary embodiment, the present exemplary embodiment is not limited to this example. The image data may include color planes such as the above-described YCbCr color planes and YCgCo color planes, or other color planes. Further, while a 4:4:4 format is used for description, the present exemplary embodiment is not limited to this example. Decoding may be performed on a bit stream in a 4:4:4:4 format by adding an alpha plane required to synthesize an image of the 4:4:4 format.
While AMVP is used as the motion vector coding method in the present exemplary embodiment, the present exemplary embodiment is not limited to this example. Motion vectors to be referred to may include a temporal predicted motion vector and an inter-plane predicted motion vector, similar to H.264.
Fig. 13 is a block diagram illustrating a configuration of an image decoding apparatus according to a fourth exemplary embodiment of the present invention. According to the present exemplary embodiment, decoding of coded data which is generated in the second exemplary embodiment will be described as an example. In Fig. 13, blocks implementing similar functions to those in the image decoding apparatus illustrated in Fig. 9 are assigned the same reference numerals, and hence detailed description thereof is omitted. An integration decoding unit 1302 decodes header information required as a sequence and header information required as a frame, separates the bit stream into plane order code data and code data pieces of respective color planes, and outputs the separated code data pieces to the succeeding stages. A plane order decoding unit 1301 decodes the plane order code data to reproduce the order of color planes. Plane decoding units 1303, 1304, and 1305 respectively decode the color planes.
An image decoding operation in the above-described image decoding apparatus will be described below.
In Fig. 13, a bit stream input from a terminal 901 is input to the integration decoding unit 1302.
The integration decoding unit 1302 decodes code data of the header information required as the sequence to reproduce a configuration of the color planes. As described in the second exemplary embodiment, the configuration of the color plane can be known from a matrix_coefficients code in a VUI parameter. The integration decoding unit 1302 identifies that the color planes include G, B, and R color planes. Since a chroma_format_idc code and a separate_colour_plane_flag code in SPS are respectively 3 and 2, the integration decoding unit 1302 identifies that decoding is performed using a correlation among the color planes. The integration decoding unit 1302 separates the plane order code data and the code data pieces of the G color plane, the B color plane, and the R color plane from the bit stream to be subsequently input. The plane order code data is input to the plane order decoding unit 1301, and the code data pieces of the respective color planes are respectively input to the plane decoding units 1303, 1304, and 1305.
The plane order decoding unit 1301 respectively inputs the orders of the color planes to the plane decoding units 1303, 1304, and 1305. According to the present exemplary embodiment, the plane decoding unit 1303 decodes the G color plane as a first color plane, the plane decoding unit 1304 decodes the B color plane as a second color plane, and the plane decoding unit 1305 decodes the R color plane as a third color plane.
Fig. 14 is a detailed block diagram of the plane decoding units 1303, 1304, and 1305. In Fig. 14, blocks implementing similar functions to those implemented by the blocks illustrated in Fig. 10 are assigned the same reference numerals, and hence description thereof is not repeated.
The order of color planes respectively decoded by the plane decoding units is input to a terminal 1401 from the plane order decoding unit 1301. Motion vectors of the other planes are respectively input to terminals 1402 and 1403. Terminals 1404 and 1405 are respectively connected to different plane decoding units. The motion vectors are respectively output to the exterior from the terminals 1404 and 1405.
For example, the terminal 1402 and the terminal 1404 of the plane decoding unit 1303 are respectively connected to the terminals 1403 and 1405 of the plane decoding unit 1304. The terminals 1402 and 1404 of the plane decoding unit 1304 are respectively connected to the terminals 1403 and 1405 of the plane decoding unit 1305. The terminals 1402 and 1404 of the plane decoding unit 1305 are respectively connected to the terminals 1403 and 1405 of the plane decoding unit 1303.
A motion vector input/output control unit 1451 controls input and output of the motion vectors according to the order of color planes input from the terminal 1401. If the order input from the terminal 1401 indicates that the plane decoding unit 1303 decodes the first color plane, the terminals 1402 and 1403 do not operate, inhibiting motion vectors from being input from the other plane decoding units. If the second color plane is decoded, whichever of the terminal 1402 and the terminal 1403 is connected to the plane decoding unit 1305 that decodes the third color plane does not operate, inhibiting the motion vector from being input from that unit. If the third color plane is decoded, whichever of the terminal 1402 and the terminal 1403 is connected to the plane decoding unit 1304 that decodes the second color plane does not operate, inhibiting the motion vector from being input from that unit. However, the present exemplary embodiment is not limited to this configuration, similar to the second exemplary embodiment. If the third color plane is decoded, the motion vector in the second color plane may be input. Alternatively, a predicted motion vector may be calculated from the motion vectors in the first color plane and the second color plane.
A motion vector storage unit 1419 differs from the motion vector storage unit 1019 illustrated in Fig. 10 in that a motion vector is not directly output to the exterior. A motion vector decoding unit 1416 differs from the motion vector decoding unit 1016 illustrated in Fig. 10 in that selection information including an index is not output.
An image data decoding operation in the above-described configuration will be described below. Code data of a decoding target block input from the terminal 1001 is input to the code data separation unit 1002 and is classified into respective code data pieces. A motion vector code is input to the motion vector decoding unit 1416. The motion vector decoding unit 1416 decodes motion vector code data to reproduce a prediction error between a motion vector of the decoding target block and a predicted motion vector. The reproduced prediction error is input to a motion vector reproduction unit 1022.
If the plane decoding unit is the plane decoding unit 1303, the input image data is decoded in units of blocks of the G color plane as the first color plane. The motion vector input/output control unit 1451 does not receive input from the terminal 1402 or the terminal 1403. Motion vectors around the decoding target block in the decoding target frame are input from the motion vector storage unit 1419 to extract spatial predicted motion vectors. The motion vector input/output control unit 1451 further calculates a predicted motion vector as a median of the extracted spatial predicted motion vectors, and inputs the calculated predicted motion vector to the motion vector reproduction unit 1022.
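The median calculation for the first color plane described above can be sketched as a component-wise median of neighboring motion vectors. The specific neighbor choice (left, above, above-right) is the common H.264-style pattern and is an assumption here; the patent text only states that a median of the extracted spatial predicted motion vectors is used.

```python
def median_predictor(mv_left, mv_above, mv_above_right):
    """Component-wise median of three neighboring motion vectors,
    in the style of H.264 median motion vector prediction."""
    def med3(a, b, c):
        return sorted((a, b, c))[1]  # middle of three values
    return (med3(mv_left[0], mv_above[0], mv_above_right[0]),
            med3(mv_left[1], mv_above[1], mv_above_right[1]))

print(median_predictor((1, 4), (3, 0), (2, 2)))  # (2, 2)
```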
The motion vector input/output control unit 1451 decodes the input image data in units of blocks of the B color plane as a second color plane if the plane decoding unit is the plane decoding unit 1304. The motion vector input/output control unit 1451 does not receive input from the terminal 1402 connected to the plane decoding unit 1305 that decodes the third color plane at this time. A motion vector of a block in the G color plane at the same position as that of the decoding target block is input to the motion vector input/output control unit 1451 from the motion vector storage unit 1419 in the plane decoding unit 1303 via the terminal 1404 and the terminal 1402 of the plane decoding unit 1304. The motion vector input/output control unit 1451 inputs the received motion vector as a predicted motion vector to the motion vector reproduction unit 1022.
Further, the motion vector input/output control unit 1451 decodes the input image data in units of blocks of the R color plane as a third color plane if the plane decoding unit is the plane decoding unit 1305. The motion vector input/output control unit 1451 does not receive input from the terminal 1403 connected to the plane decoding unit 1304 that decodes the second color plane at this time. A motion vector of a block in the G color plane at the same position as that of the decoding target block is input to the motion vector input/output control unit 1451 from the motion vector storage unit 1419 in the plane decoding unit 1303 via the terminal 1405 and the terminal 1402 of the plane decoding unit 1305. The motion vector input/output control unit 1451 inputs the received motion vector as a predicted motion vector to the motion vector reproduction unit 1022. The motion vector reproduction unit 1022 reproduces a motion vector of the decoding target block from the obtained predicted motion vector and motion vector prediction error, similar to the third exemplary embodiment.
Fig. 15 is a flowchart illustrating processing for decoding motion vectors in a second color plane and a third color plane in the image decoding apparatus according to the fourth exemplary embodiment. The existing method can be used for processing for decoding a motion vector in the first color plane. For example, a motion vector decoding method in H.264 can be used. In Fig. 15, in step S1102, the image decoding apparatus decodes motion vector code data to generate a motion vector prediction error. In step S1501, the image decoding apparatus refers to a motion vector in the other color plane to extract an inter-plane predicted motion vector. In step S1106, the image decoding apparatus reproduces a motion vector in the decoding target block from the prediction error reproduced in step S1102 and the inter-plane predicted motion vector calculated in step S1501. Further, the reproduced motion vector is stored for a reference in the succeeding stage, and then the processing for decoding the motion vector in the second color plane or the third color plane is ended.
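The flow of Fig. 15 for the second and third color planes can be sketched as follows: the predicted motion vector is simply the co-located motion vector from the first color plane, so no selection index is decoded, and the motion vector is recovered by adding the decoded prediction error. This is an illustrative sketch under assumed names, not the patent's implementation.

```python
def decode_dependent_plane_mv(first_plane_mv, decoded_error):
    """Reproduce a second- or third-plane motion vector (steps S1501 and
    S1106): the inter-plane predicted motion vector taken from the first
    color plane's co-located block, plus the decoded prediction error."""
    return (first_plane_mv[0] + decoded_error[0],
            first_plane_mv[1] + decoded_error[1])

# E.g., G-plane co-located vector (5, -2) and decoded error (0, 1):
print(decode_dependent_plane_mv((5, -2), (0, 1)))  # (5, -1)
```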
According to the above-described configuration and operation, the order of color planes is determined, and the motion vector which is subjected to difference coding is decoded among the color planes, so that the coded motion vector can be efficiently reproduced using a correlation among the color planes.
While the motion vectors in the different color planes are simply used as the motion vector prediction method according to the present exemplary embodiment, the present exemplary embodiment is not limited to this example. Motion vectors to be referred to may include a temporal predicted motion vector and an inter-plane predicted motion vector, similar to H.264. AMVP may also be used.
Each of the processing units illustrated in Figs. 1, 2, 6, 7, 9, 10, 13, and 14 is described as including hardware according to the present exemplary embodiment. However, processing performed by each of the processing units illustrated in these figures may be realized by a computer program.
Fig. 18 is a block diagram illustrating an example of a hardware configuration of a computer that is applicable to an image processing apparatus according to each of the above-described exemplary embodiments.
A central processing unit (CPU) 1801 controls the entire computer using a computer program and data stored in a random access memory (RAM) 1802 and a read-only memory (ROM) 1803, and performs the above-described processing of the image processing apparatus according to each of the above-described exemplary embodiments. More specifically, the CPU 1801 functions as each of the processing units illustrated in Figs. 1, 2, 6, 7, 9, 10, 13, and 14.
The RAM 1802 includes an area for temporarily storing a computer program and data which are loaded from an external storage device 1806, and data which is obtained from the exterior via an interface (I/F) 1807. Further, the RAM 1802 includes a work area to be used when the CPU 1801 performs various types of processing. More specifically, the RAM 1802 can be assigned as a frame memory, for example, and can be provided with various types of other areas, as needed.
The ROM 1803 stores setting data, a boot program, and the like for the computer. An operation unit 1804 includes a keyboard and a mouse, and can input various instructions to the CPU 1801 when operated by a user of the computer. A display unit 1805 displays processing results from the CPU 1801. The display unit 1805 includes a liquid crystal display, for example.
The external storage device 1806 is a large-capacity information storage device represented by a hard disk drive device. The external storage device 1806 stores an operating system (OS), and a computer program for causing the CPU 1801 to implement the function of each of the units illustrated in Figs. 1, 2, 6, 7, 9, 10, 13, and 14. Further, the external storage device 1806 may store image data serving as a processing target.
The computer program and the data stored in the external storage device 1806 are loaded into the RAM 1802 as needed, under the control of the CPU 1801, and are handled as targets to be processed by the CPU 1801. Networks such as a local area network (LAN) and the Internet, and other devices such as a projection device and a display device, can be connected to an I/F 1807. The computer can obtain and transmit various types of information via the I/F 1807. A bus 1808 connects the above-described units.
The CPU 1801 mainly controls the operations described in the flowcharts in the above-described configuration.
The present invention can also be achieved by providing a storage medium which stores a computer program code for implementing functions of the above-described exemplary embodiments, to a system or an apparatus. The computer program code stored in the storage medium can be read and executed by the system. In this case, the code itself of the computer program read out from the storage medium implements the functions of the above-described embodiments, and the storage medium storing the computer program code constitutes the present invention. The present invention also includes a case where an OS operating on a computer performs a part or the whole of actual processing based on an instruction from the computer program code, and the processing implements the above-described functions.
Further, the present invention may be implemented in the following form. More specifically, a computer program code read out from a storage medium is written into a function expansion card inserted into a computer or a memory included in a function expansion unit connected to the computer. The present invention may also include a case where a CPU included in the function expansion card or the function expansion unit performs a part or the whole of actual processing based on an instruction from the computer program code, to implement the above-described functions.
If the present invention is applied to the above-described storage medium, the storage medium stores a computer program code corresponding to the flowcharts previously described.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
This application claims priority from Japanese Patent Application No. 2011-252923 filed November 18, 2011, which is hereby incorporated by reference herein in its entirety.

Claims (8)

  1. A motion vector coding apparatus comprising:
    a motion vector generation unit configured to generate a motion vector of a coding target block in a coding target color plane;
    an inter-plane motion vector extraction unit configured to extract a motion vector of an inter-plane reference block which is a block in a color plane different from the coding target color plane and at the same position as that of the coding target block;
    a predicted motion vector calculation unit configured to calculate a predicted motion vector from the motion vector of the inter-plane reference block;
    a motion vector error calculation unit configured to calculate a motion vector error from the motion vector of the coding target block and the predicted motion vector; and
    a motion vector difference coding unit configured to code the motion vector error.
  2. The motion vector coding apparatus according to claim 1, wherein the predicted motion vector calculation unit regards the motion vector of the inter-plane reference block as a predicted motion vector.
  3. The motion vector coding apparatus according to claim 1, wherein the predicted motion vector calculation unit includes a reference motion vector generation unit configured to generate a reference motion vector group including a motion vector of a block around the coding target block and the motion vector of the inter-plane reference block, and a predicted vector selection unit configured to select a predicted vector of the motion vector of the coding target block from the reference motion vector group and output selection information, and
    the motion vector coding apparatus further comprising a selection information coding unit configured to code the selection information to generate a selection information code.
  4. A motion vector decoding apparatus comprising:
    a code data input unit configured to input code data of image data including a plurality of color planes;
    an inter-plane motion vector extraction unit configured to extract a motion vector of an inter-plane reference block which is a block in a color plane different from a decoding target color plane and at the same position as that of a decoding target block;
    a motion vector difference decoding unit configured to decode a motion vector error from the code data; and
    a motion vector reproduction unit configured to reproduce a motion vector of the decoding target block from the motion vector of the inter-plane reference block which is extracted by the inter-plane motion vector extraction unit and the motion vector error.
  5. A method for coding a motion vector in a motion vector coding apparatus, the method comprising:
    generating a motion vector of a coding target block in a coding target color plane;
    extracting a motion vector of an inter-plane reference block which is a block in a color plane different from the coding target color plane and at the same position as that of the coding target block;
    calculating a predicted motion vector from the motion vector of the inter-plane reference block;
    calculating a motion vector error from the motion vector of the coding target block and the predicted motion vector; and
    coding the motion vector error.
  6. A method for decoding a motion vector in a motion vector decoding apparatus, the method comprising:
    inputting code data of image data including a plurality of color planes;
    extracting a motion vector of an inter-plane reference block which is a block in a color plane different from a decoding target color plane and at the same position as that of a decoding target block;
    decoding a motion vector error from the code data; and
    reproducing a motion vector of the decoding target block from the motion vector of the inter-plane reference block and the motion vector error.
  7. A program that, when read out and executed by a computer, causes the computer to function as the motion vector coding apparatus according to claim 1.
  8. A program that, when read out and executed by a computer, causes the computer to function as the motion vector decoding apparatus according to claim 4.
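The encode/decode round trip described in method claims 5 and 6 can be sketched as follows. This is a minimal illustration, not the claimed implementation: all function and variable names are hypothetical, motion vectors are simple (x, y) tuples, and the entropy coding of the error (the "coding the motion vector error" step) is omitted.

```python
# Sketch of claims 5 and 6: the motion vector of the co-located block in
# another colour plane serves as the predictor, so only the prediction
# error needs to be coded.

def code_motion_vector(target_mv, inter_plane_mv):
    """Claim 5: compute the motion vector error between the coding target
    block's motion vector and the inter-plane predicted motion vector."""
    predicted = inter_plane_mv  # predicted motion vector from the inter-plane reference block
    return (target_mv[0] - predicted[0],
            target_mv[1] - predicted[1])  # motion vector error (would be entropy-coded)

def decode_motion_vector(mv_error, inter_plane_mv):
    """Claim 6: reproduce the decoding target block's motion vector by
    adding the decoded error back onto the same inter-plane predictor."""
    return (inter_plane_mv[0] + mv_error[0],
            inter_plane_mv[1] + mv_error[1])

# Example: a G-plane block's motion vector predicted from the co-located
# R-plane block (illustrative values).
mv_g = (5, -3)   # motion vector of the coding target block (G plane)
mv_r = (4, -2)   # motion vector of the inter-plane reference block (R plane)

error = code_motion_vector(mv_g, mv_r)
reconstructed = decode_motion_vector(error, mv_r)
assert reconstructed == mv_g  # lossless round trip
```

Because the colour planes of one frame typically share motion, the inter-plane predictor keeps the error small, which is what makes its coded representation compact.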
PCT/JP2012/007171 2011-11-18 2012-11-08 Motion vector coding apparatus, method and program WO2013073138A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011252923A JP2013110517A (en) 2011-11-18 2011-11-18 Motion vector encoding device, motion vector encoding method and program, motion vector decoding device, motion vector decoding method and program
JP2011-252923 2011-11-18

Publications (1)

Publication Number Publication Date
WO2013073138A1 (en) 2013-05-23

Family

ID=47351901

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/007171 WO2013073138A1 (en) 2011-11-18 2012-11-08 Motion vector coding apparatus, method and program

Country Status (2)

Country Link
JP (1) JP2013110517A (en)
WO (1) WO2013073138A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104581186A (en) * 2013-10-14 2015-04-29 上海天荷电子信息有限公司 Method for encoding and decoding intra-frame moving vector during image compression

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6410513B2 (en) * 2014-08-07 2018-10-24 キヤノン株式会社 Image coding apparatus and method


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010110006A (en) 2005-07-22 2010-05-13 Mitsubishi Electric Corp Image encoder, image decoder, image encoding method, image decoding method, image encoding program, image decoding program, and computer readable recording medium recording image encoding program, and computer readable recording medium recording image decoding program
EP2230849A1 (en) * 2009-03-20 2010-09-22 Mitsubishi Electric R&D Centre Europe B.V. Encoding and decoding video data using motion vectors
JP2011077761A (en) * 2009-09-30 2011-04-14 Sony Corp Image processor and processing method
US20120219216A1 (en) * 2009-09-30 2012-08-30 Kazushi Sato Image processing device and method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
J-L LIN ET AL: "Improved Advanced Motion Vector Prediction", 4. JCT-VC MEETING; 95. MPEG MEETING; 20-1-2011 - 28-1-2011; DAEGU;(JOINT COLLABORATIVE TEAM ON VIDEO CODING OF ISO/IEC JTC1/SC29/WG11AND ITU-T SG.16 ), no. JCTVC-D125, 15 January 2011 (2011-01-15), XP030008165, ISSN: 0000-0015 *
KEN MCCANN ET AL: "Samsung's Response to the Call for Proposals on Video Compression Technology", ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, DOCUMENT JCTVC-A124_R2, 1ST JCT-VC MEETING, 15-23 APRIL 2010, DRESDEN, DE, 1 June 2010 (2010-06-01), XP055008873, Retrieved from the Internet <URL:http://wftp3.itu.int/av-arch/jctvc-site/2010_04_A_Dresden/JCTVC-A124_r2.doc> [retrieved on 20111006] *
UNKNOWN: "JCT-VC contribution JCTVC-A124_r2.doc", Retrieved from the Internet <URL:http://wftp3.itu.int/av-arch/jctvc-site/2010_04_A_Dresden/>
UNKNOWN: "JCT-VC contribution JCTVC-F013_v3.docx", Retrieved from the Internet <URL:http://phenix.int-evry.fr/jct/>


Also Published As

Publication number Publication date
JP2013110517A (en) 2013-06-06

Similar Documents

Publication Publication Date Title
US10939117B2 (en) Sample sets and new down-sampling schemes for linear component sample prediction
US11711539B2 (en) Image encoding apparatus, image encoding method and program, image decoding apparatus, and image decoding method and program
US8260069B2 (en) Color image encoding and decoding method and apparatus using a correlation between chrominance components
JP5047995B2 (en) Video intra prediction encoding and decoding method and apparatus
KR102419112B1 (en) Residual sign prediction method and apparatus in transform domain
US9591325B2 (en) Special case handling for merged chroma blocks in intra block copy prediction mode
US9510011B2 (en) Video encoding device, video decoding device, video encoding method, video decoding method, and program
US8977048B2 (en) Method medium system encoding and/or decoding an image using image slices
US10986333B2 (en) Motion vector coding apparatus, method and program for coding motion vector, motion vector decoding apparatus, and method and program for decoding motion vector
US20190306515A1 (en) Coding apparatus, coding method, decoding apparatus, and decoding method
US20180048903A1 (en) Image encoding apparatus, image encoding method, and recording medium; and image decoding apparatus, image decoding method, and recording medium
JP7454633B2 (en) Encoding device, decoding device and corresponding method using palette encoding
TW201251468A (en) Image processing device and image processing method
KR20210088693A (en) Encoders, decoders and corresponding methods using IBC search range optimization for arbitrary CTU sizes
KR20160031506A (en) Image coding apparatus, image coding method, and program, and image decoding apparatus, image decoding method and program
US20140153642A1 (en) Image coding apparatus, image coding method, and program
WO2013073138A1 (en) Motion vector coding apparatus, method and program
JP2009260494A (en) Image coding apparatus and its control method
JP6618578B2 (en) Image encoding apparatus and image decoding apparatus
JP6469277B2 (en) Image encoding device, image encoding method and program, image decoding device, image decoding method and program
US20120320965A1 (en) Apparatus and method for encoding/decoding a multi-plane image, and recording medium for same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12799326

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12799326

Country of ref document: EP

Kind code of ref document: A1