KR20170040542A - A method and an apparatus for playing a video image using additional information - Google Patents

A method and an apparatus for playing a video image using additional information Download PDF

Info

Publication number
KR20170040542A
Authority
KR
South Korea
Prior art keywords
line
image
pixel value
deleted
additional data
Prior art date
Application number
KR1020150139659A
Other languages
Korean (ko)
Inventor
권영달
김건태
유형수
한상훈
Original Assignee
주식회사 케이티
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 케이티 filed Critical 주식회사 케이티
Priority to KR1020150139659A priority Critical patent/KR20170040542A/en
Publication of KR20170040542A publication Critical patent/KR20170040542A/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor
    • H04N5/93Regeneration of the television signal or of selected parts thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0135Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/646Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

According to an aspect of the present invention, there is provided a method for reproducing an image, the method comprising: reconstructing a current image using encoding information from a bitstream; separating the reconstructed current image into a first image composed of odd lines and a second image composed of even lines; performing interpolation on each of the first image and the second image; and sequentially outputting the interpolated first and second images. The step of separating the reconstructed current image may include generating additional data used for restoring a line to be deleted and applying the additional data to the current image.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention [0001] The present invention relates to a method and an apparatus for reproducing image data using additional data.

The present invention relates to a method and an apparatus for reproducing an image.

An interlaced scanning method and a progressive scanning method are used as scanning methods to express an image on a display screen. Interlaced scanning is widely used to reduce the amount of data when storing and transmitting images.

An object of the present invention is to solve the problem that some data lost in interlaced-scan image processing cannot be restored using existing deinterlacing algorithms.

An image reproducing method and apparatus of the present invention reconstructs a current image using coding information from a bitstream, separates the reconstructed current image into a first image composed of odd lines and a second image composed of even lines, performs interpolation on each of the separated first and second images, and sequentially outputs the interpolated first and second images.

In the image reproducing method and apparatus of the present invention, the separating step may include marking a deletion target line in the current image, generating additional data used for restoration of the deletion target line, applying the generated additional data to the current image, and separating the current image to which the additional data is applied into the first image and the second image.

In the image reproducing method and apparatus of the present invention, the deletion target line is a line deleted from the current image in the process of separating the first image or the process of separating the second image.

In the image reproducing method and apparatus of the present invention, the deletion target line is a line including an object image in the current image, and is a line adjacent to a boundary between the background image included in the current image and the object image.

In the image reproducing method and apparatus of the present invention, the additional data is generated by using a pixel value of the line to be deleted and a pixel value of a line adjacent to at least one of the upper and lower ends of the line to be deleted.

In the image reproducing method and apparatus of the present invention, when the deletion target line is located at the uppermost end of the object image, the additional data may be generated using the pixel value of the line to be deleted and the pixel value of the line adjacent to the lower end of the deletion target line.

In the image reproducing method and apparatus of the present invention, when the deletion target line is located at the lowermost end of the object image, the additional data is generated using the pixel value of the line to be deleted and the pixel value of the line adjacent to the upper end of the line to be deleted.

In the image reproducing method and apparatus of the present invention, in the step of applying the additional data, the line to which the additional data is applied is determined based on the similarity between the pixel value of the line adjacent to the deletion target line and the pixel value of the background image in the current image.

In the image reproducing method and apparatus of the present invention, when the difference between the pixel value of the line adjacent to the upper end of the deletion target line and the pixel value of the background image is smaller than the difference between the pixel value of the line adjacent to the lower end of the deletion target line and the pixel value of the background image, the additional data is applied to the line adjacent to the upper end of the line to be deleted.

When the difference between the pixel value of the line adjacent to the upper end of the deletion target line and the pixel value of the background image is larger than the difference between the pixel value of the line adjacent to the lower end of the deletion target line and the pixel value of the background image, the additional data is applied to the line adjacent to the lower end of the line to be deleted.

In the image reproducing method and apparatus of the present invention, performing the interpolation includes applying an interpolation filter to a plurality of consecutive odd lines in the first image to generate the pixel value of a line located in the middle of the plurality of odd lines, and applying an interpolation filter to a plurality of consecutive even lines in the second image to generate the pixel value of a line located in the middle of the plurality of even lines.

In the image reproducing method and apparatus of the present invention, the interpolation filter is one selected from a plurality of preset interpolation filters based on an index, signaled from an image encoder, that identifies the type of interpolation filter.

According to the present invention, it is possible to restore some data lost in an interlaced scanning image process, similar to an original image.

According to the present invention, by generating additional data using reconstruction data in the original image, additional bit allocation required for coding additional data can be avoided.

According to the present invention, an image similar to an original image can be restored by applying an interpolation filter to an image separated according to the interlaced scanning method.

According to the present invention, by using the interpolation filter selectively, the accuracy of restoration can be improved and the error with the original image can be minimized.

FIG. 1 shows a schematic configuration of an image reproducing apparatus 100 according to an embodiment to which the present invention is applied.
FIG. 2 is a schematic block diagram of an image restoring unit 110 according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a process in which the image reproducing apparatus 100 outputs an encoded image, according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating a process of processing an original image for interlaced scanning in the interlaced scanning mode processing unit 130 according to an embodiment of the present invention.
FIG. 5 illustrates a method of generating an image for interlaced scanning using additional data according to an embodiment of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements in describing each drawing.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items or any one of the plurality of related listed items.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. The singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" are used to specify the presence of the features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. Hereinafter, the same reference numerals will be used for the same constituent elements in the drawings, and redundant explanations for the same constituent elements will be omitted.

FIG. 1 shows a schematic configuration of an image reproducing apparatus 100 according to an embodiment to which the present invention is applied.

Referring to FIG. 1, an image reproducing apparatus 100 according to the present invention restores an encoded image and outputs the reconstructed image according to a predetermined scanning method. The apparatus includes an image restoring unit 110, a progressive scan processing unit 120, an interlaced scanning processing unit 130, and an image output unit 140.

The image reconstructing unit 110 may reconstruct the image using the encoding information about the encoded image. Here, the encoding information may include encoded residual data, a prediction mode, motion information, a block partition, an intra prediction mode, in-loop filtering information, and the like. A method of restoring an image using the encoded information will be described in detail with reference to FIG.

The progressive scan processing unit 120 may be a component for outputting the reconstructed image output from the image reconstruction unit 110 in a progressive scanning manner. The progressive scan method is an image display method that sequentially displays the odd lines and even lines of an image. If the reconstructed image is a progressively scanned image, the progressive scan processing unit 120 rearranges the reconstructed images in temporal order and transmits them to the image output unit 140. Here, the temporal order of the reconstructed images may be determined using the picture order count (POC) allocated to each reconstructed image. On the other hand, when the reconstructed image is an interlaced-scanned image, the progressive scan processing unit 120 combines a half frame composed of odd lines and a half frame composed of even lines to generate one full frame, and transmits it to the image output unit 140.
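As a rough illustration of the weave step just described, the following sketch models each field as a list of pixel rows; the function name and list-of-rows representation are illustrative assumptions, not terminology from the patent.

```python
def weave_fields(odd_field, even_field):
    """Combine a half frame of odd lines and a half frame of even lines
    into one full frame (a 'weave' recombination, as described for the
    progressive scan processing unit 120)."""
    full = []
    for top, bottom in zip(odd_field, even_field):
        full.append(top)     # odd line of the full frame
        full.append(bottom)  # even line of the full frame
    return full

# A toy 4-line frame split into two 2-line fields, then rewoven.
odd = [[10, 10], [30, 30]]   # lines 1 and 3
even = [[20, 20], [40, 40]]  # lines 2 and 4
assert weave_fields(odd, even) == [[10, 10], [20, 20], [30, 30], [40, 40]]
```

The sketch assumes both fields have the same number of rows; a real decoder would also respect POC ordering when pairing fields.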

The interlaced scanning processing unit 130 may be a component for outputting the restored image output from the image restoring unit 110 in an interlaced scanning manner. The interlaced scanning method is an image display method in which an image is divided into odd lines and even lines that are displayed alternately. For this, the interlaced scanning processing unit 130 may divide one image into a first image composed of odd lines and a second image composed of even lines, and then rearrange the plurality of separated first and second images in temporal order. Alternatively, the reconstructed images may first be rearranged in temporal order and then separated into first and second images. The first image and the second image thus separated may be transmitted to the image output unit 140 so as to be output in an interlaced manner.
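The odd/even split performed by the interlaced scanning processing unit 130 can be sketched with simple slicing; as above, the frame is modeled as a list of rows for illustration.

```python
def split_fields(frame):
    """Separate one reconstructed image into a first image of odd lines
    and a second image of even lines (lines numbered from 1, as in the
    patent; Python indices are 0-based)."""
    first = frame[0::2]   # lines 1, 3, 5, ...
    second = frame[1::2]  # lines 2, 4, 6, ...
    return first, second

frame = [[1], [2], [3], [4], [5]]
first, second = split_fields(frame)
assert first == [[1], [3], [5]]
assert second == [[2], [4]]
```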

The video output unit 140 is a component for outputting the restored video either in a progressive scanning mode or an interlaced scanning mode. For example, the video output unit 140 may include a display screen, a screen, a monitor, and the like of the terminal.

FIG. 2 is a schematic block diagram of an image restoring unit 110 according to an embodiment of the present invention.

Referring to FIG. 2, the image reconstruction unit 110 may include an entropy decoding unit 210, a reordering unit 215, an inverse quantization unit 220, an inverse transform unit 225, prediction units 230 and 235, a filter unit 240, and a memory 245.

The entropy decoding unit 210 can perform entropy decoding in a procedure opposite to that in which entropy encoding is performed in the entropy encoding unit of the image encoder. For example, various methods such as Exponential Golomb, Context-Adaptive Variable Length Coding (CAVLC), and Context-Adaptive Binary Arithmetic Coding (CABAC) may be applied in accordance with the method performed by the image encoder.

The entropy decoding unit 210 may decode information related to intra prediction and inter prediction performed in the encoder.

The reordering unit 215 may rearrange the bitstream entropy-decoded by the entropy decoding unit 210 based on the rearrangement method used by the encoder. Coefficients expressed in one-dimensional vector form can be reconstructed and rearranged into two-dimensional block form. The reordering unit 215 can perform reordering by receiving information related to the coefficient scanning performed by the encoder and performing an inverse scan based on the scanning order used by the encoder.
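The inverse-scan rearrangement can be sketched as follows, assuming a JPEG-style zigzag scan order purely for illustration (the patent does not fix a particular scan order, and HEVC-era codecs also use diagonal scans).

```python
def zigzag_coords(n):
    """(row, col) coordinates of an n x n block in zigzag scan order:
    anti-diagonals are traversed alternately up-right and down-left."""
    coords = []
    for s in range(2 * n - 1):
        diag = [(r, s - r) for r in range(n) if 0 <= s - r < n]
        if s % 2 == 0:
            diag.reverse()  # even anti-diagonals run bottom-left to top-right
        coords.extend(diag)
    return coords

def inverse_scan(coeffs_1d, n):
    """Rearrange a 1-D coefficient vector back into an n x n block,
    as the reordering unit does for entropy-decoded coefficients."""
    block = [[0] * n for _ in range(n)]
    for value, (r, c) in zip(coeffs_1d, zigzag_coords(n)):
        block[r][c] = value
    return block

assert inverse_scan([0, 1, 2, 3], 2) == [[0, 1], [2, 3]]
```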

The inverse quantization unit 220 can perform inverse quantization based on the quantization parameters provided by the encoder and the coefficient values of the re-arranged blocks.

The inverse transform unit 225 may perform an inverse DCT, inverse DST, or inverse KLT on the quantization result, corresponding to the DCT, DST, or KLT performed by the transform unit of the image encoder. The inverse transform may be performed based on the transform unit determined by the image encoder. In the inverse transform unit 225 of the image decoder, a transform technique (e.g., DCT, DST, KLT) may be selectively applied according to information such as the prediction method, the size of the current block, and the prediction direction.

The prediction units 230 and 235 can generate a prediction block based on the prediction-block generation information provided by the entropy decoding unit 210 and the previously decoded block or picture information provided by the memory 245.

As described above, when intra prediction is performed in the same manner as in the image encoder, if the size of the prediction unit is the same as the size of the transform unit, intra prediction is performed on the prediction unit using the pixels on its left, upper-left, and top. However, if the size of the prediction unit differs from the size of the transform unit when performing intra prediction, intra prediction is performed using reference pixels based on the transform unit. In addition, intra prediction using N x N partitioning may be used only for the minimum coding unit.

The prediction units 230 and 235 may include a prediction unit determination unit, an inter prediction unit, and an intra prediction unit. The prediction unit determination unit receives various information such as prediction unit information input from the entropy decoding unit 210, prediction mode information for the intra prediction method, and motion prediction information for the inter prediction method, identifies the prediction units in the current coding unit, and determines whether each prediction unit performs inter prediction or intra prediction. The inter prediction unit 230 may perform inter prediction on the current prediction unit based on information included in at least one of the previous picture or the subsequent picture of the current picture containing the current prediction unit, using information necessary for inter prediction provided by the image encoder. Alternatively, inter prediction may be performed based on information of a partial region already reconstructed in the current picture containing the current prediction unit.

In order to perform inter prediction, it may be determined, on a coding-unit basis, whether the motion prediction method of the prediction unit included in the coding unit is skip mode, merge mode, AMVP mode, or intra block copy mode.

The intra prediction unit 235 can generate a prediction block based on pixel information in the current picture. If the prediction unit is one that performs intra prediction, the intra prediction can be performed based on the intra prediction mode information of the prediction unit provided by the image encoder. The intra prediction unit 235 may include an AIS (Adaptive Intra Smoothing) filter, a reference pixel interpolator, and a DC filter. The AIS filter performs filtering on the reference pixels of the current block, and whether to apply the filter can be determined according to the prediction mode of the current prediction unit. AIS filtering can be performed on the reference pixels of the current block using the prediction mode of the prediction unit and the AIS filter information provided by the image encoder. When the prediction mode of the current block is a mode in which AIS filtering is not performed, the AIS filter may not be applied.

The reference pixel interpolator may interpolate the reference pixels to generate reference pixels at sub-integer pixel positions when the prediction mode of the prediction unit is a mode that performs intra prediction based on interpolated reference pixel values. When the prediction mode of the current prediction unit is a mode that generates the prediction block without interpolating the reference pixels, the reference pixels may not be interpolated. The DC filter can generate a prediction block through filtering when the prediction mode of the current block is the DC mode.
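As an illustration of the DC mode just mentioned, the following sketch fills a prediction block with the rounded average of the reference pixels above and to the left; the function name and integer-rounding convention are assumptions for illustration.

```python
def dc_intra_prediction(top_refs, left_refs, size):
    """Generate a DC-mode intra prediction block: every sample is the
    (rounded) average of the reference pixels above and to the left
    of the current block."""
    refs = top_refs + left_refs
    dc = (sum(refs) + len(refs) // 2) // len(refs)  # integer rounding
    return [[dc] * size for _ in range(size)]

block = dc_intra_prediction([100, 100], [104, 104], 2)
assert block == [[102, 102], [102, 102]]
```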

The restored block or picture may be provided to the filter unit 240. The filter unit 240 may include a deblocking filter, an offset correction unit, and an ALF.

Information on whether a deblocking filter has been applied to the corresponding block or picture, and, if so, whether a strong filter or a weak filter was applied, may be provided by the image encoder. The deblocking filter of the video decoder receives the deblocking filter information provided by the video encoder, and the video decoder can perform deblocking filtering on the corresponding block.

The offset correction unit may perform offset correction on the reconstructed image based on the type of offset correction applied to the image and the offset value information during encoding.

The ALF can be applied to an encoding unit on the basis of ALF application information and ALF coefficient information provided from an encoder. Such ALF information may be provided in a specific parameter set.

The memory 245 may store the reconstructed picture or block to be used as a reference picture or a reference block, and may also provide the reconstructed picture to the output unit.

As described above, in the embodiments of the present invention the term coding unit is used as an encoding unit for convenience of explanation, but it may also be a unit that performs decoding as well as encoding.

FIG. 3 is a flowchart illustrating a process in which the image reproducing apparatus 100 outputs an encoded image, according to an embodiment of the present invention.

Referring to FIG. 3, a current image can be reconstructed using encoding information from an input bitstream (S300).

The current image can be divided and restored in predetermined block units. Here, the predetermined block unit may mean a coding unit, a prediction unit, a transform unit, or the like. Specifically, the coding unit is a basic unit for processing a coded video image, the prediction unit is a basic unit on which intra prediction or inter prediction is performed, and the transform unit is a basic unit for performing transform processing based on a transform technique such as DCT, DST, or KLT. In addition, the current image can be partitioned into predetermined fragment units for parallel decoding. A fragment unit may have a square, rectangular, or other asymmetric shape, and information on at least one of the shape, size, and number of fragment units may be encoded by the image encoder and signaled to the image reconstruction unit 110.

Also, as shown in FIG. 2, a prediction block may be generated through intra prediction or inter prediction, and the current block may be reconstructed using the prediction block and a residual block. Here, the residual block may be derived by performing at least one of inverse quantization and inverse transform on the residual coefficients extracted from the input bitstream. In-loop filtering may then be performed on the restored block to generate the final restored block. Specifically, at least one of deblocking filtering for removing artifacts at block boundaries, SAO filtering for adding or subtracting a predetermined offset for each sample in the block, and adaptive loop filtering (ALF) may optionally be used.
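The reconstruction step above (prediction plus residual, clipped to the valid sample range before in-loop filtering) can be sketched as follows; the 8-bit clipping range is an illustrative assumption, since the patent does not state a bit depth.

```python
def reconstruct_block(pred, residual, bit_depth=8):
    """Reconstruct a block as prediction + residual, clipping each
    sample to the valid range for the given bit depth. In-loop
    filtering (deblocking, SAO, ALF) would follow this step."""
    lo, hi = 0, (1 << bit_depth) - 1
    return [[min(hi, max(lo, p + r)) for p, r in zip(prow, rrow)]
            for prow, rrow in zip(pred, residual)]

pred = [[120, 130], [250, 5]]
resid = [[10, -40], [20, -10]]
# 250 + 20 overflows to 255; 5 - 10 underflows to 0.
assert reconstruct_block(pred, resid) == [[130, 90], [255, 0]]
```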

 The reconstructed image generated in operation S300 may be output through the image output unit 140 according to a scanning method.

For example, when the restored image is output in a progressive scan mode, the restored image may be sequentially displayed on the odd-numbered line and the even-numbered line through the image output unit 140 (S310).

In contrast, when the restored image is output in the interlaced scanning mode, the restored image can be separated into a first image composed of odd lines and a second image composed of even lines (S320), and interpolation may be performed on each of the separated first and second images (S330).

Specifically, a first image may be generated by extracting only odd lines from the reconstructed image, and a second image may be generated by extracting only even lines.

An interpolation filter (e.g., a bi-linear interpolation filter) may be applied to two consecutive odd lines in the first image to generate the pixel value of the line located in the middle of the two odd lines. However, the present invention is not limited to the use of only two consecutive odd lines; an interpolation filter (for example, a 4-tap or 6-tap filter) that takes the pixel values of two or more odd lines as inputs may also be used. The interpolation filter used in this case may be a vertical interpolation filter. The filter coefficients and the number of filter taps of the interpolation filter may be fixed to one configuration, or may be one of a plurality of interpolation filters preset in the image reproducing apparatus 100. For this purpose, an index identifying the type of interpolation filter may be encoded and signaled by the image encoder. Alternatively, the filter coefficients, the number of filter taps, and the like may be variably determined according to the position and/or pixel values of the line to be interpolated.
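The two-line bi-linear case can be sketched as follows: each missing line is generated as the rounded average of the lines above and below it. The rounding convention is an illustrative assumption.

```python
def interpolate_field(field):
    """Deinterlace one field by vertical bi-linear interpolation:
    between every two consecutive lines of the field, generate the
    missing middle line as their (rounded) per-pixel average."""
    out = []
    for i, line in enumerate(field):
        out.append(line)
        if i + 1 < len(field):
            nxt = field[i + 1]
            out.append([(a + b + 1) // 2 for a, b in zip(line, nxt)])
    return out

field = [[10, 20], [30, 40], [50, 60]]
assert interpolate_field(field) == [
    [10, 20], [20, 30], [30, 40], [40, 50], [50, 60]]
```

A 4-tap or 6-tap vertical filter would instead weight two or three lines on each side of the missing line, as the paragraph above allows.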

Interpolation may be performed on the second image in the same manner as interpolation of the first image, and a detailed description thereof will be omitted.

However, in the process of separating one reconstructed image into a first image and a second image, some data required for the interpolation calculation may be lost. That is, an object image may exist in only one of the separated first and second images and be absent from the other. To solve this problem, the reconstructed image generated in step S300 may be corrected by applying additional data to it, and steps S320 to S330 described above may then be performed on the corrected image. This will be described in detail with reference to FIGS. 4 and 5.

The first image and the second image interpolated in step S330 may be output sequentially (S340).

FIG. 4 is a diagram illustrating a process of processing an original image for interlaced scanning in the interlaced scanning mode processing unit 130 according to an embodiment of the present invention.

Referring to FIG. 4, processing the original image for interlaced output may include an interlacing step and a deinterlacing step. Here, the original image may refer to the reconstructed image generated by the image reconstructing unit 110; the interlacing step may include separating the original image into odd lines and even lines, and the deinterlacing step may include performing interpolation on each of the separated images.

Specifically, in the interlace processing step, the first image and the second image can be generated by extracting the odd line and the even line of the original image, respectively.

Then, in the deinterlacing processing step, pixel values of lines located in the middle of two odd lines can be generated by interpolating two adjacent odd lines in the first image composed of odd lines. Likewise, a pixel value of a line located in the middle of two even lines can be generated by interpolating two adjacent even lines in a second image composed of even lines. Accordingly, the first image and the second image can be corrected to an image similar to the original image.

If the X line shown in FIG. 4 is assumed to be an odd line, the X line is present only in the first image and not in the second image. When the original image is divided into the first image and the second image, some data necessary for interpolation is lost, and it is therefore difficult to restore the lost data in the deinterlacing process. To minimize such data loss, a method of applying additional data to the original image can be used, which will be described in detail with reference to FIG. 5.

FIG. 5 illustrates a method of generating an image for interlaced scanning using additional data according to an embodiment of the present invention.

Referring to FIG. 5, a line to be deleted may be marked in the original image (S500).

The deletion target line of the present invention may mean a line deleted in the process of separating, from the original image, a first image composed of odd lines or a second image composed of even lines. In particular, the deletion target line may be a line including an object image, and may be a line adjacent to the boundary between the background image and the object image.

Additional data used for restoration of the line to be deleted may be generated (S510).

More specifically, the additional data serves to restore the pixel values of the line to be deleted while minimizing its data loss. The additional data may be generated using a pixel value of the line to be deleted (for example, an RGB value) and a pixel value of a line adjacent to at least one of the upper and lower ends of that line. For example, if the deletion target line is located at the top of the object image, the additional data may be generated using the pixel value of the line to be deleted and the pixel value of the line adjacent to the lower end of the deletion target line. On the other hand, when the deletion target line is located at the bottom of the object image, the additional data can be generated using the pixel value of the line to be deleted and the pixel value of the line adjacent to the upper end of the line to be deleted.

For example, when the image reproducing apparatus 100 of the present invention supports an interpolation filter based on an averaging operation, the additional data may be derived as shown in the following Equation (1).

[Equation (1) — rendered only as an image in the original document]
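Since Equation (1) is available only as an image, its exact form cannot be reproduced here. Purely as a hypothetical sketch consistent with the averaging-filter assumption above: one choice of additional data d satisfies (adjacent + d) / 2 = deleted, i.e. d = 2 * deleted - adjacent, so that averaging the adjacent line with the additional data recovers the deleted line. This formula is an illustration, not the patent's claimed derivation.

```python
def additional_data(deleted_line, adjacent_line):
    """Hypothetical per-pixel additional data d (step S510 sketch):
    chosen so that averaging d with the adjacent line reproduces the
    deleted line, i.e. (adjacent + d) / 2 == deleted."""
    return [2 * x - a for x, a in zip(deleted_line, adjacent_line)]

deleted = [100, 120]
adjacent = [90, 110]
d = additional_data(deleted, adjacent)
assert d == [110, 130]
# Averaging the additional data with the adjacent line recovers the
# deleted line exactly:
assert [(a + v) // 2 for a, v in zip(adjacent, d)] == deleted
```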

The additional data generated in step S510 may be applied to the original image (S520).

Specifically, the generated additional data can be selectively applied to the line adjacent to either the upper end or the lower end of the line to be deleted. The line to which the additional data is applied can be determined based on the similarity between the pixel value of the line adjacent to the deletion target line and the pixel value of the background image. That is, the additional data can be applied to whichever line adjacent to the deletion target line has a pixel value more similar to that of the background image.

For example, when the difference between the pixel value of the line adjacent to the upper end of the deletion target line and the pixel value of the background image is smaller than the difference between the pixel value of the line adjacent to the lower end and the pixel value of the background image, the additional data may be applied to the line adjacent to the top of the line to be deleted.

On the other hand, if the difference between the pixel value of the line adjacent to the upper end of the deletion target line and the pixel value of the background image is larger than the difference between the pixel value of the line adjacent to the lower end of the deletion target line and the pixel value of the background image, the additional data may be applied to the line adjacent to the lower end of the line to be deleted.
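The selection rule above can be sketched as follows. The function name and the use of mean absolute difference as the similarity measure are assumptions; the patent only speaks of the "difference" between pixel values of the adjacent lines and the background image.

```python
def choose_target_line(top_line, bottom_line, background):
    """Pick which neighbor of the deleted line receives the additional data.

    The line whose pixel values are closer to the background image
    (smaller mean absolute difference, used here as the similarity measure)
    is selected, matching the rule described in the text.
    """
    def mad(line, ref):
        return sum(abs(a - b) for a, b in zip(line, ref)) / len(line)

    if mad(top_line, background) < mad(bottom_line, background):
        return "top"
    return "bottom"

# The line above the deleted line nearly matches the background,
# so it is chosen as the carrier of the additional data.
choice = choose_target_line([10, 10], [200, 200], [12, 12])
```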

The original image to which the additional data was applied in step S520 may then be divided into a first image including the odd lines and a second image including the even lines (S530).
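The line split of step S530 can be sketched as below; the function name is illustrative, and a frame is modeled simply as a list of pixel rows.

```python
def split_fields(frame):
    """Split a frame (a list of pixel rows) into two field images.

    Lines are 1-indexed as in the text: the 1st, 3rd, 5th, ... rows
    (odd lines) form the first image, and the 2nd, 4th, 6th, ... rows
    (even lines) form the second image.
    """
    first = frame[0::2]   # odd lines (1st, 3rd, 5th, ...)
    second = frame[1::2]  # even lines (2nd, 4th, 6th, ...)
    return first, second

# A 4-line frame splits into two 2-line field images.
first, second = split_fields([[1], [2], [3], [4]])
```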

In step S540, an image for interlaced-scan output may be generated by performing interpolation on the first image and the second image separated in step S530. The method of generating an image for interlaced-scan output was described above with reference to FIG. 4, so a detailed description is omitted here.
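The interpolation of step S540 generates, between consecutive lines of a field, a new line "located in the middle" of them (claims 8 and 17). A minimal sketch, assuming a two-tap averaging filter as one of the filter types the text mentions:

```python
def interpolate_field(field):
    """Rebuild a fuller picture from one field image.

    Between every pair of consecutive field lines, insert their
    element-wise average — a two-tap averaging interpolation filter,
    producing the pixel values of the line located in the middle of them.
    """
    out = [field[0]]
    for prev, nxt in zip(field, field[1:]):
        mid = [(a + b) // 2 for a, b in zip(prev, nxt)]
        out.append(mid)  # interpolated in-between line
        out.append(nxt)
    return out

# Two field lines yield one interpolated middle line between them.
picture = interpolate_field([[10, 40], [30, 20]])
```

Which filter is actually applied is signaled by an index from the image encoder (claim 9); the averaging form here is only one of the pre-set candidates.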

Claims (18)

1. A method of playing a video image using additional information, the method comprising:
Reconstructing a current image using encoding information from a bitstream;
Separating the reconstructed current image into a first image composed of odd lines and a second image composed of even lines;
Performing interpolation on each of the separated first and second images; and
Sequentially outputting the interpolated first image and second image.
2. The method of claim 1, further comprising:
Marking a line to be deleted in the current image, wherein the deletion target line is a line deleted in the process of separating the first image or the second image from the current image;
Generating additional data used for restoration of the line to be deleted;
Applying the generated additional data to the current image; and
Separating the current image to which the additional data is applied into the first image and the second image.
3. The method of claim 2,
Wherein the deletion target line is a line including the object image in the current image and is adjacent to a boundary between the background image included in the current image and the object image.
The method of claim 3,
Wherein the additional data is generated using a pixel value of the line to be deleted and a pixel value of a line adjacent to at least one of an upper end or a lower end of the deletion target line.
5. The method of claim 4,
Wherein, when the deletion target line is located at the uppermost end of the object image, the additional data is generated using the pixel value of the deletion target line and the pixel value of the line adjacent to the lower end of the deletion target line, and
Wherein, when the deletion target line is located at the lowermost end of the object image, the additional data is generated using the pixel value of the deletion target line and the pixel value of the line adjacent to the upper end of the deletion target line.
3. The method of claim 2, wherein applying the additional data comprises:
Determining a line to which the additional data is applied based on the similarity between the pixel value of the line adjacent to the deletion target line and the pixel value of the background image in the current image.
The method according to claim 6,
Wherein the additional data is applied to the line adjacent to the upper end of the deletion target line when the difference between the pixel value of the line adjacent to the upper end of the deletion target line and the pixel value of the background image is smaller than the difference between the pixel value of the line adjacent to the lower end of the deletion target line and the pixel value of the background image, and
Wherein the additional data is applied to the line adjacent to the lower end of the deletion target line when the former difference is larger than the latter.
2. The method of claim 1, wherein, in performing the interpolation,
An interpolation filter is applied to a plurality of consecutive odd lines in the first image to generate a pixel value of a line located in the middle of the plurality of odd lines, and an interpolation filter is applied to a plurality of consecutive even lines in the second image to generate a pixel value of a line located in the middle of the plurality of even lines.
9. The method of claim 8,
Wherein the interpolation filter is any one selected from a plurality of pre-set interpolation filters based on an index for identifying a type of an interpolation filter signaled from an image encoder.
10. An apparatus for playing a video image using additional information, the apparatus comprising:
An image reconstruction unit for reconstructing a current image using encoding information from a bitstream;
An interlaced scanning method processing unit for dividing the reconstructed current image into a first image composed of odd lines and a second image composed of even lines, and performing interpolation on each of the separated first and second images; and
An image output unit for sequentially outputting the interpolated first and second images.
The apparatus according to claim 10,
Wherein the interlaced scanning method processing unit marks a line to be deleted in the current image, generates additional data used for restoring the line to be deleted, applies the generated additional data to the current image, and separates the current image to which the additional data is applied into the first image and the second image, and
Wherein the deletion target line is a line deleted in the process of separating the first image or the second image from the current image.
12. The method of claim 11,
Wherein the deletion target line is a line including an object image in the current image and is a line adjacent to a boundary between the background image included in the current image and the object image.
13. The apparatus according to claim 12, wherein the interlaced scanning method processing unit generates the additional data using the pixel value of the line to be deleted and the pixel value of a line adjacent to at least one of the upper and lower ends of the line to be deleted.
14. The apparatus according to claim 13, wherein the interlaced scanning method processing unit generates the additional data using the pixel value of the deletion target line and the pixel value of the line adjacent to the lower end of the deletion target line when the deletion target line is located at the uppermost end of the object image, and generates the additional data using the pixel value of the deletion target line and the pixel value of the line adjacent to the upper end of the deletion target line when the deletion target line is located at the lowermost end of the object image.
12. The apparatus according to claim 11, wherein the interlaced scanning method processing unit determines the line to which the additional data is to be applied based on the degree of similarity between the pixel value of a line adjacent to the deletion target line and the pixel value of the background image in the current image.
16. The apparatus according to claim 15,
Wherein the additional data is applied to the line adjacent to the upper end of the deletion target line when the difference between the pixel value of the line adjacent to the upper end of the deletion target line and the pixel value of the background image is smaller than the difference between the pixel value of the line adjacent to the lower end of the deletion target line and the pixel value of the background image, and
Wherein the additional data is applied to the line adjacent to the lower end of the deletion target line when the former difference is larger than the latter.
The apparatus according to claim 10, wherein the interlaced scanning method processing unit applies an interpolation filter to a plurality of consecutive odd lines in the first image to generate a pixel value of a line located in the middle of the plurality of odd lines, and applies an interpolation filter to a plurality of consecutive even lines in the second image to generate a pixel value of a line located in the middle of the plurality of even lines.
18. The apparatus according to claim 17,
Wherein the interpolation filter is any one selected from among a plurality of pre-set interpolation filters based on an index, signaled from an image encoder, for identifying the type of interpolation filter.
KR1020150139659A 2015-10-05 2015-10-05 A method and an apparatus for for playing an video image using addition information KR20170040542A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150139659A KR20170040542A (en) 2015-10-05 2015-10-05 A method and an apparatus for for playing an video image using addition information

Publications (1)

Publication Number Publication Date
KR20170040542A true KR20170040542A (en) 2017-04-13

Family

ID=58579742

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150139659A KR20170040542A (en) 2015-10-05 2015-10-05 A method and an apparatus for for playing an video image using addition information

Country Status (1)

Country Link
KR (1) KR20170040542A (en)