KR20160125704A - Apparatus and method for processing hybrid moving picture - Google Patents

Apparatus and method for processing hybrid moving picture

Info

Publication number
KR20160125704A
Authority
KR
South Korea
Prior art keywords
image
unit
prediction
block
elements
Prior art date
Application number
KR1020150056480A
Other languages
Korean (ko)
Inventor
유승진
박수창
박주혁
Original Assignee
유승진
Priority date
Filing date
Publication date
Application filed by 유승진
Priority to KR1020150056480A
Publication of KR20160125704A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/625: using transform coding, using discrete cosine transform [DCT]
    • H04N 19/124: Quantisation
    • H04N 19/13: Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
    • H04N 19/70: characterised by syntax aspects related to video coding, e.g. related to compression standards

Abstract

The present invention relates to a hybrid moving picture processing apparatus and method. A hybrid moving picture processing apparatus according to the present invention includes an encoding prediction unit that performs coding prediction to remove spatial or temporal redundancy of a digital image and outputs a prediction block; a transform coding unit that converts the difference block, generated as the difference between the current block of the image and the prediction block output from the encoding prediction unit, into the frequency domain through a DCT part in the case of an optical image and through a DHT part in the case of an infrared image; a quantization unit that quantizes the transform coefficients of the difference block transformed by the transform coding unit; and an entropy encoding unit that performs context-based adaptive binary arithmetic coding on syntax elements having binary values and outputs a bitstream. With this configuration, both the optical image and the infrared image can be processed appropriately according to the frequency characteristics of the image signal.


Description

Apparatus and method for processing hybrid moving picture

The present invention relates to an apparatus and method for processing a hybrid moving image, and more particularly, to a hybrid moving image processing apparatus and method capable of minimizing image loss and increasing compression efficiency by processing an input optical image and an infrared image according to frequency characteristics of an image.

A moving image processing apparatus mounted on an airplane captures a moving image on a flight path, processes the captured information using a compression technique, and then transmits the moving image data to the ground. Currently, the video processing device mounted on the aircraft processes video by applying AVC / H.264 video compression technology established as a standard in 2003.

On the other hand, high definition (HD) images with higher resolution are required for UAVs used for reconnaissance at high altitudes. However, when HD video is compressed using AVC / H.264 video compression technology, there is a limitation in transmitting the data in real time due to the bandwidth limit of the data channel.

Accordingly, the present inventors have been developing a high efficiency video coding (HEVC)/H.265 image compression technology, which has a compression rate roughly twice that of AVC/H.264, for an image processing apparatus to be mounted on an aircraft. Unmanned aerial vehicles are also required to carry an infrared camera, so the optical image and the infrared image must be processed by a single image processing apparatus and then transmitted to the ground.

However, because the frequency characteristics of the optical image and the infrared image differ, directly applying the HEVC/H.265 algorithm to process both images in one image processing apparatus results in low compression efficiency.

In order to solve the above-described problems, it is an object of the present invention to provide a hybrid moving picture processing apparatus and method capable of minimizing image loss and increasing compression efficiency by processing an input optical image and an input infrared image according to the frequency characteristics of the image.

It is another object of the present invention to provide a hybrid video processing apparatus and method capable of shortening the encryption time in encrypting moving pictures in real time.

According to an aspect of the present invention, there is provided a hybrid moving picture processing apparatus comprising: an encoding prediction unit that performs coding prediction to remove spatial or temporal redundancy of a digital image and outputs a prediction block; a transform coding unit that converts the difference block, generated as the difference between the current block of the digital image and the prediction block output from the encoding prediction unit, into the frequency domain through a DCT part in the case of an optical image and through a DHT part in the case of an infrared image; a quantization unit that quantizes the transform coefficients of the difference block transformed by the transform coding unit; and an entropy encoding unit that performs context-based adaptive binary arithmetic coding on syntax elements having binary values and outputs a bitstream.

The quantization unit may include an optical image scanning part for scanning the difference block of the quantized optical image and an infrared image scanning part for scanning the difference block of the quantized infrared image.

The encoding predicting unit may include a plurality of inter picture prediction parts to process motion estimation and prediction in parallel.

The encryption in the entropy encoding unit may encrypt, among the syntax element header elements, the macroblock layer related syntax element header elements, the intra picture prediction related syntax element header elements, and the inter picture prediction related syntax element header elements.

The encryption in the entropy encoding unit may include encryption of the syntax image data elements corresponding to the syntax element header elements to be encrypted.

The entropy encoding unit may map syntax elements that do not have binary values to binary sequences, perform encryption, and then perform context-based adaptive binary arithmetic coding on the syntax elements having encrypted binary values to output a bitstream.

According to another aspect of the present invention, there is provided a hybrid moving picture processing method comprising: outputting a prediction block by performing coding prediction to remove spatial or temporal redundancy of a digital image; converting the difference block, generated as the difference between the current block of the digital image and the output prediction block, into the frequency domain through a DCT part in the case of an optical image and through a DHT part in the case of an infrared image; quantizing the transform coefficients of the transformed difference block; and performing context-based adaptive binary arithmetic coding on syntax elements having binary values to output a bitstream. The above-described object can be achieved by this method.

According to the above-described configuration, both the optical image and the infrared image can be appropriately processed according to the frequency characteristics of the image signal at minimal cost.

In addition, the present invention can shorten the encryption time without complex hardware design in encrypting the moving picture in real time.

FIG. 1 is a block diagram of a hybrid moving picture processing apparatus according to an embodiment of the present invention.
FIG. 2 is a detailed block diagram of the image compression unit shown in FIG. 1.
FIGS. 3A, 3B and 3C are diagrams showing scanning methods used in the optical image scanning part shown in FIG. 2.
FIG. 4 is a diagram showing an example of a scanning method used in the infrared image scanning part shown in FIG. 2.
FIG. 5 is a diagram showing the entropy encoding unit shown in FIG. 2 in more detail.
FIG. 6 is a flowchart illustrating a hybrid moving picture processing method according to another embodiment of the present invention.

Best Mode for Carrying Out the Invention
Hereinafter, preferred embodiments of a hybrid moving picture processing apparatus and method according to the present invention will be described with reference to the accompanying drawings. In the following description, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the technical scope of the present invention.

FIG. 1 is a block diagram of a hybrid moving picture processing apparatus according to an embodiment of the present invention, FIG. 2 is a detailed block diagram of the image compression unit shown in FIG. 1, FIGS. 3A, 3B and 3C are diagrams showing scanning methods used in the optical image scanning part shown in FIG. 2, and FIG. 4 is a diagram showing an example of a scanning method used in the infrared image scanning part shown in FIG. 2.

As shown in FIG. 1, the hybrid moving picture processing apparatus includes an image capturing unit 110, an image compressing unit 120, and a video transmitting unit 130.

The image capturing unit 110 includes an optical camera unit 112 and an infrared camera unit 114 for capturing an image and outputting the captured image as digital image data.

As shown in FIG. 2, the image compression unit 120 includes an intra-picture prediction unit 210, a plurality of inter-picture prediction units 220, a transform coding unit 230, a quantization unit 240, an inverse quantization unit 250, an inverse transform unit 260, a loop filter unit 270, an entropy encoding unit 280, a performance measurement unit 290, and a control unit 295.

The intra-picture prediction unit 210 performs prediction using the correlation between neighboring pixels within one picture. Because the amount of data that can be referred to is generally small, its prediction efficiency is lower than that of inter-picture prediction, but each block is encoded independently and the encoding speed is fast. In the case of HEVC/H.265, the DC mode, the planar mode and 33 directional prediction modes can be used for intra-picture coding.
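
As an illustration only (not taken from the patent), the following minimal Python sketch shows how the DC and planar intra-prediction modes mentioned above can be computed for a square block from its reconstructed top and left neighbours; the function names and the simplified planar interpolation are assumptions.

```python
import numpy as np

def intra_dc(top, left):
    """DC mode: predict every pixel as the mean of the reconstructed
    top-row and left-column neighbour samples."""
    dc = int(round((top.sum() + left.sum()) / (len(top) + len(left))))
    return np.full((len(left), len(top)), dc, dtype=np.int32)

def intra_planar(top, left):
    """Planar mode: bilinear blend of a horizontal and a vertical gradient,
    a simplified version of HEVC-style planar interpolation."""
    n = len(top)
    pred = np.zeros((n, n), dtype=np.int32)
    for y in range(n):
        for x in range(n):
            h = (n - 1 - x) * left[y] + (x + 1) * top[n - 1]
            v = (n - 1 - y) * top[x] + (y + 1) * left[n - 1]
            pred[y, x] = (h + v + n) // (2 * n)
    return pred
```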

The inter-picture prediction unit 220 compensates for motion using motion estimation. First, the inter-picture prediction unit 220 searches previously encoded or decoded reference pictures for the optimal block that is most similar to the current block to be coded. For more precise motion estimation, the reconstructed picture can be interpolated according to the type of video codec, and motion estimation can then be performed pixel by pixel on the interpolated image. Motion compensation generates a prediction block on the basis of the motion information (motion vector, reference picture index, etc.) of the optimal prediction block found in the motion estimation process.
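
For readers unfamiliar with motion estimation, the following hedged sketch illustrates a full-search block-matching routine of the kind an inter-picture prediction part performs; the function name, the SAD cost and the +/-8 search range are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def full_search_me(cur_block, ref_frame, bx, by, search_range=8):
    """Full-search block matching: scan a +/-search_range window around the
    block position (bx, by) in the reference frame and return the motion
    vector (dx, dy) with the smallest sum of absolute differences (SAD)."""
    n = cur_block.shape[0]                     # square block assumed
    h, w = ref_frame.shape
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + n > h or x + n > w:
                continue                        # candidate outside the frame
            cand = ref_frame[y:y + n, x:x + n]
            sad = int(np.abs(cur_block.astype(np.int32)
                             - cand.astype(np.int32)).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dx, dy)
    return best_mv, best_sad
```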

As shown in FIG. 2, the inter-picture prediction unit 220 may include a first inter-picture prediction part and a second inter-picture prediction part. In general, the calculation in the inter-picture prediction unit 220 accounts for 60 to 70% of the total execution time, so the plurality of inter-picture prediction parts can process motion estimation and prediction in parallel in order to reduce the calculation time in the inter-picture prediction unit 220.

The transform coding unit 230 is used to reduce spatial redundancy by mapping the pixels of an image to the frequency domain. The video signal in the spatial domain is converted into the frequency domain because quantization and entropy encoding are more effective for data compression in the frequency domain than in the spatial domain.

The difference between the current block and the prediction block obtained from the intra-picture prediction unit 210 or the inter-picture prediction unit 220 is calculated, and the result is referred to as a difference block. Most of the redundancy is removed by the intra-picture prediction unit 210 or the inter-picture prediction unit 220, but blocks that are not predicted well still contain a large amount of information to be encoded. Therefore, a frequency transform is applied to the difference block to increase the compression ratio achieved in the quantization unit 240.

The transform coding unit 230 is required to process the optical image and the infrared image according to their frequency characteristics. It therefore includes a DCT (Discrete Cosine Transform) part 232, which is generally used for processing the optical image, and a DHT (Discrete High Transform) part 234, which passes high-frequency signals, for processing the infrared image.
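
The next sketch illustrates the frequency transform step: an orthonormal 2-D DCT applied to a residual (difference) block, with the infrared path left as a stub because the patent does not define the internals of the DHT part. All names are illustrative assumptions.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix of size n x n."""
    c = np.zeros((n, n))
    for k in range(n):
        for i in range(n):
            alpha = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
            c[k, i] = alpha * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    return c

def forward_transform(diff_block, is_infrared=False):
    """Map a residual block to the frequency domain. Optical residuals go
    through the 2-D DCT (C * X * C^T); the infrared transform of the DHT
    part is not specified in this description, so it is left as a stub."""
    if is_infrared:
        raise NotImplementedError("infrared transform (DHT part) not specified here")
    c = dct_matrix(diff_block.shape[0])
    return c @ diff_block.astype(np.float64) @ c.T
```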

The quantization unit 240 quantizes the transform coefficients of the difference block that the transform coding unit 230 has converted to the frequency domain; in the process of mapping a range of input values to discrete values, the transform coefficient values are approximated. To this end, the quantization unit 240 determines the quantization rate according to a quantization parameter and divides the transform coefficients of the difference block by the quantization rate. The quantization unit 240 is used to reduce the amount of information so that a high compression efficiency can be obtained in the entropy encoding unit 280, and it is the stage of moving picture compression in which information loss occurs.
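
A minimal quantization sketch follows. The derivation of the quantization step from the QP (doubling every 6 QP values) follows the usual HEVC/H.264 convention and is an assumption here, since the description only states that the coefficients are divided by a quantization rate.

```python
import numpy as np

def quantize(coeffs, qp):
    """Uniform quantization of transform coefficients: divide by a step
    size derived from the quantization parameter (QP) and round."""
    qstep = 2.0 ** ((qp - 4) / 6.0)          # assumed HEVC-style step size
    return np.round(coeffs / qstep).astype(np.int32)

def dequantize(levels, qp):
    """Approximate inverse quantization, as performed by the inverse
    quantization unit to recover transform coefficients."""
    qstep = 2.0 ** ((qp - 4) / 6.0)
    return levels.astype(np.float64) * qstep
```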

The quantization unit 240 is also required to process the optical image and the infrared image according to their frequency characteristics. For this purpose, the quantization unit 240 includes an optical image scanning part 242 for scanning the optical image and an infrared image scanning part 244 for scanning the infrared image.

The optical image scanning part 242 may adaptively scan the optical image by using the zigzag scanning method of FIG. 3A, the horizontal priority scanning method of FIG. 3B, and the vertical priority scanning method of FIG. 3C.

However, these scanning methods of the optical image scanning part 242 are prescribed by the standard for compressing optical images that mainly contain low-frequency components. Unlike a general optical image, an infrared image contains a large amount of high-frequency components, and therefore requires a different scanning method when it is compressed.

The infrared scanning method is a scanning method for utilizing the high frequency component of the infrared image, that is, the edge component. When the scanning schemes shown in FIGS. 3A, 3B, and 3C are applied to an infrared image, the compression rate is remarkably reduced. An example of the infrared image scanning method according to the present invention is shown in FIG. The infrared image scanning method shown in FIG. 4 is similar to the reverse order of the vertical priority scanning method of FIG. 3C.
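
To make the scan-order discussion concrete, the sketch below builds a zig-zag order in the style of FIG. 3A and a reverse-vertical order as an assumed stand-in for the infrared scan of FIG. 4; it is illustrative only and not the patent's exact scan.

```python
import numpy as np

def zigzag_order(n):
    """Zig-zag scan order (FIG. 3A style) for an n x n block: visit
    anti-diagonals in order, alternating direction."""
    return sorted(((y, x) for y in range(n) for x in range(n)),
                  key=lambda p: (p[0] + p[1],
                                 p[1] if (p[0] + p[1]) % 2 == 0 else p[0]))

def reverse_vertical_order(n):
    """Visit columns from last to first, bottom-up: roughly the reverse of
    the vertical-priority scan of FIG. 3C, used here as an assumed stand-in
    for the infrared scan of FIG. 4."""
    return [(y, x) for x in range(n - 1, -1, -1) for y in range(n - 1, -1, -1)]

def scan(block, order):
    """Serialize a 2-D block of quantized levels along the given order."""
    return np.array([block[y, x] for y, x in order])
```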

The inverse quantization unit 250 is used to obtain the transform coefficients of the difference block from the quantized data; the transform coefficients of the difference block can be recovered by multiplying the quantized data by the quantization rate used in the quantization unit 240. The inverse transform unit 260 reconstructs the difference block from the transform coefficients produced by the transform coding unit 230.

The loop filter unit 270 is configured to mitigate the image quality degradation caused by quantization error, and can be composed of two filters: a deblocking filter and a sample adaptive offset (SAO) filter. The deblocking filter is mainly used to remove blocking artifacts in the reconstructed picture, and the SAO filter compensates for the ringing artifacts that can appear in the edge areas of the input image when a large QP value is used during encoding.

The entropy encoding unit 280 uses the statistical characteristics of the information to be transmitted to reduce the amount of data needed to represent it, minimizing the amount of data by assigning variable-length codes to symbols according to their probability of occurrence.

For a video decoding apparatus to reconstruct the image from the bitstream, information on the entire image, information on each frame, motion vector information on each block, reference index information, and the quantized transform coefficients are all required. The entropy encoding unit 280 generates a bitstream by expressing all of this information as variable-length sequences of 0s and 1s based on a probability model. The bitstream is composed of a sequence of syntax elements, and each syntax element may have different statistical characteristics, so different entropy encoding methods may be used for each syntax element; that is, a mapping table reflecting the occurrence probability of each syntax element can be defined and used separately. The entropy encoding unit 280 can also shorten the encryption time by encrypting only the important syntax elements among these syntax elements.

The performance measuring unit 290 performs peak signal-to-noise ratio (PSNR) measurement, compression ratio measurement, and coding time measurement. The performance measuring unit 290 can quantify image quality with the PSNR, which expresses the ratio of the maximum signal value to the distortion using the mean error between the original image and the degraded image. The performance measuring unit 290 can also compute the compression ratio as the ratio of the data size of the original image to that of the compressed image.
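
The two quality metrics named above can be computed as in the following sketch; PSNR and compression ratio follow their standard definitions, while the function names are illustrative.

```python
import numpy as np

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between the original frame and the
    reconstructed (compressed then decoded) frame."""
    mse = np.mean((original.astype(np.float64)
                   - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """Ratio of the raw image size to the coded bitstream size."""
    return original_bytes / compressed_bytes
```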

The performance measuring unit 290 also measures the total coding time of HEVC/H.265 when only the first inter-picture prediction part is used. If the total coding time is exceeded and real-time compression is impossible, the second inter-picture prediction part is also operated and the two parts are processed in parallel, thereby reducing the coding time.

The control unit 295 can control whether to drive the optical camera unit 112 or the infrared camera unit 114 by using, for example, the amount of light input from an optical sensor (not shown). When the control unit 295 selects the driving and output of the optical camera unit 112 or the infrared camera unit 114, prediction of the picture data is performed in the intra-picture prediction unit 210 and the inter-picture prediction unit 220, and the result is provided to the transform coding unit 230.

In the case of an optical image, the control unit 295 provides the predicted optical image to the DCT part 232 of the transform coding unit 230; in the case of an infrared image, it provides the predicted infrared image to the DHT part 234 of the transform coding unit 230. Likewise, in the case of an optical image the control unit 295 scans the quantized optical image through the optical image scanning part 242 of the quantization unit 240, and in the case of an infrared image it scans the quantized infrared image through the infrared image scanning part 244.
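
The routing performed by the control unit 295 can be pictured with the sketch below, which reuses the transform, quantization and scan helpers from the earlier sketches in this description. The dispatch logic and names are assumptions made for illustration, and the DCT is reused as a placeholder for the unspecified DHT part.

```python
def route_block(diff_block, source, qp):
    """Dispatch a residual block along the optical or infrared path:
    DCT and a standard scan for optical input, a separate transform and a
    reverse-vertical-style scan for infrared input."""
    n = diff_block.shape[0]
    if source == "optical":
        coeffs = forward_transform(diff_block)   # DCT part 232
        order = zigzag_order(n)                  # optical image scanning part 242
    else:
        coeffs = forward_transform(diff_block)   # placeholder for the DHT part 234,
                                                 # whose transform is not specified here
        order = reverse_vertical_order(n)        # infrared image scanning part 244
    levels = quantize(coeffs, qp)
    return scan(levels, order)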

The video transmitting unit 130 transmits the bit stream output from the video compressing unit 120 to the ground video receiving apparatus.

Although FIG. 2 shows the DCT part 232 and the DHT part 234 of the transform coding unit 230 separately, and likewise shows the optical image scanning part 242 and the infrared image scanning part 244 of the quantization unit 240 separately, each pair can be realized as a single part in which only the parameters are adjusted.

FIG. 5 is a diagram showing the entropy encoding unit shown in FIG. 2 in more detail.

As shown in FIG. 5, the entropy encoding unit 280 performs context-based adaptive binary arithmetic coding (CABAC) and includes a binarization part 310, an encryption part 320, a context modeling part 330, and a binary arithmetic coding part 340.

The binarization part 310 maps a syntax element output from the quantization unit 240 to a binarized sequence when the syntax element does not have a binary value. If a syntax element already having a binary value is given as the input signal, it is provided to the encryption part 320 without passing through the binarization part 310.
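
As an example of binarization, the following sketch shows a zeroth-order Exp-Golomb mapping of a non-negative syntax element value to a binary string. Exp-Golomb is a common CABAC binarization scheme, though the patent does not specify which binarization the binarization part 310 uses.

```python
def exp_golomb(value):
    """Zeroth-order Exp-Golomb binarization of a non-negative syntax
    element value: a prefix of leading zeros followed by the binary
    representation of (value + 1)."""
    bits = bin(value + 1)[2:]            # binary representation of value + 1
    return "0" * (len(bits) - 1) + bits  # unary-length prefix + suffix

# e.g. values 0..4 map to '1', '010', '011', '00100', '00101'
```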

The encryption part 320 encrypts and outputs some of the syntax elements having binary values, for example the intra-picture prediction syntax elements and the inter-picture prediction syntax elements. The encryption method may be DES (Data Encryption Standard) or AES (Advanced Encryption Standard), but 256-bit AES is preferably used.
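
A hedged sketch of 256-bit AES encryption of a binarized syntax element is shown below, using the Python `cryptography` package. The CTR mode, the bit-to-byte packing and the key handling are assumptions, since the description only names 256-bit AES as the preferred method.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_syntax_bits(bin_string, key):
    """Encrypt a binarized syntax element with AES-256 in CTR mode.
    The bit string is packed into bytes before encryption."""
    data = int(bin_string, 2).to_bytes((len(bin_string) + 7) // 8, "big")
    nonce = os.urandom(16)
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return nonce, encryptor.update(data) + encryptor.finalize()

key = os.urandom(32)                         # 32 bytes = 256-bit AES key
nonce, ciphertext = encrypt_syntax_bits("00101", key)
```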

The syntax structure constituting the bitstream is composed of a plurality of syntax elements. The syntax elements can be classified into syntax element header elements, which carry the header information of the syntax, and syntax image data elements, which carry the image data corresponding to the syntax element header elements. The encryption part 320 may encrypt only the syntax element header elements, rather than both the syntax element header elements and the syntax image data elements. In this case, the encryption time can be reduced to about 1/10.

The encryption part 320 can also encrypt only the macroblock layer related syntax element header elements, the intra-picture prediction related syntax element header elements, and the inter-picture prediction related syntax element header elements among the syntax element header elements. In this case, the encryption time may be reduced to between one half and one fifth of the time needed to encrypt all of the syntax element header elements.

The macroblock layer related syntax element header elements may be, for example, mb_type, coded_block_pattern (luma), coded_block_pattern (chroma), and mb_qp_delta; the intra-picture prediction related syntax element header elements may be, for example, prev_intra4x4_pred_mode_flag, rem_intra4x4_pred_mode, and intra_chroma_pred_mode; and the inter-picture prediction related syntax element header elements may be, for example, mvd (horizontal) and mvd (vertical).

The context modeling part 330 selects a probability model corresponding to the encrypted current binary value according to the previously encoded syntax elements or binary values, and the binary arithmetic coding part 340 performs binary arithmetic coding using the probability model determined by the context modeling part 330 and the given binary value, and outputs the entropy-encoded bitstream.

In FIG. 5, encryption is performed on syntax elements having binary values, but encryption can also be performed at the output of the binary arithmetic coding part 340.

FIG. 6 is a flowchart illustrating a hybrid moving picture processing method according to another embodiment of the present invention.

In accordance with the amount of light input from an optical sensor, the control unit 295 selects either the optical image of the optical camera unit 112 or the infrared image of the infrared camera unit 114 as the input to the image compression unit 120 (S602).

The intra-picture prediction unit 210 or the inter-picture prediction unit 220 performs coding prediction to remove spatial or temporal redundancy of the digital image data (S604). The difference between the current block and the prediction block output by the coding prediction is then calculated to generate a difference block (S606).

Transform coding converts the image pixels of the difference block into the frequency domain to reduce the spatial redundancy of the image (S608). In the case of an optical image, the control unit 295 provides the predicted optical image to the DCT part 232 of the transform coding unit 230; in the case of an infrared image, it provides the predicted infrared image to the DHT part 234 of the transform coding unit 230.

The quantization unit 240 quantizes the transform coefficients of the difference block (S610). In the case of an optical image, the control unit 295 scans the quantized optical image through the optical image scanning part 242 of the quantization unit 240; in the case of an infrared image, it scans the quantized infrared image through the infrared image scanning part 244 of the quantization unit 240.

The binarization part 310 maps a syntax element output from the quantization unit 240 to a binarized sequence when the syntax element does not have a binary value (S612). When a syntax element already having a binary value is given as the input signal, it is provided to the encryption part 320 without passing through the binarization part 310.

The encryption part 320 encrypts some of the syntax elements having binary values, for example the intra-picture prediction syntax elements and the inter-picture prediction syntax elements (S614).

The context modeling part 330 and the binary arithmetic coding part 340 perform context-based adaptive binary arithmetic coding to output an entropy-encoded bitstream (S616).
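
Tying the steps of FIG. 6 together, the following sketch chains the earlier helpers from this description into one pass over a single block: motion estimation, residual calculation, transform with quantization and scanning on the optical or infrared path, binarization and encryption. The final context modeling and binary arithmetic coding stage is omitted, and all names are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np

def encode_block(cur_block, ref_frame, bx, by, source, qp, key):
    """One pass over the FIG. 6 steps for a single square block, using the
    helper functions defined in the earlier sketches."""
    (dx, dy), _ = full_search_me(cur_block, ref_frame, bx, by)        # S604
    pred = ref_frame[by + dy:by + dy + cur_block.shape[0],
                     bx + dx:bx + dx + cur_block.shape[1]]
    diff = cur_block.astype(np.int32) - pred.astype(np.int32)         # S606
    levels = route_block(diff, source, qp)                            # S608-S610
    bits = "".join(exp_golomb(abs(int(v))) for v in levels)           # S612
    return encrypt_syntax_bits(bits, key)                             # S614
```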

Although the present invention was conceived in the process of developing HEVC/H.265 image compression technology for mounting on an aircraft, as described above, those skilled in the art will understand that the present invention is not limited to HEVC/H.265 image compression technology.

The embodiments of the present invention described above are merely illustrative of the technical idea of the present invention, and the scope of protection of the present invention should be interpreted according to the claims. It will be apparent to those skilled in the art that various modifications and variations can be made without departing from the spirit and scope of the invention, and such modifications and variations should be interpreted as falling within the scope of the present invention.

110: image capturing unit 120: image compression unit
130: video transmission unit 210: intra-picture prediction unit
220: inter-picture prediction unit 230: transform coding unit
232: DCT part 234: DHT part
240: quantization unit 242: optical image scanning part
244: infrared image scanning part 250: inverse quantization unit
260: inverse transform unit 270: loop filter unit
280: entropy encoding unit 290: performance measurement unit
295: control unit 310: binarization part
320: encryption part 330: context modeling part
340: binary arithmetic coding part

Claims (8)

A hybrid moving picture processing apparatus comprising:
an encoding prediction unit which performs coding prediction to remove spatial redundancy or temporal redundancy of a digital image and outputs a prediction block;
a transform coding unit which converts the difference block, generated as the difference between the current block of the digital image and the prediction block output from the encoding prediction unit, into the frequency domain through a DCT part in the case of an optical image and through a DHT part in the case of an infrared image;
a quantization unit which quantizes the transform coefficients of the difference block transformed by the transform coding unit; and
an entropy encoding unit which performs context-based adaptive binary arithmetic coding on syntax elements having binary values to output a bitstream.
The apparatus according to claim 1,
wherein the quantization unit includes an optical image scanning part for scanning the difference block of the quantized optical image and an infrared image scanning part for scanning the difference block of the quantized infrared image.
3. The apparatus according to claim 2,
wherein the encoding prediction unit includes a plurality of inter-picture prediction parts to process motion estimation and prediction in parallel.
4. The apparatus according to any one of claims 1 to 3,
wherein the encryption in the entropy encoding unit encrypts, among the syntax element header elements, the macroblock layer related syntax element header elements, the intra-picture prediction related syntax element header elements, and the inter-picture prediction related syntax element header elements.
4. The apparatus according to any one of claims 1 to 3,
wherein the encryption in the entropy encoding unit includes encryption of the syntax image data elements corresponding to the syntax element header elements to be encrypted.
5. The apparatus according to claim 4,
wherein the entropy encoding unit maps syntax elements that do not have binary values to binary sequences, performs encryption, and performs context-based adaptive binary arithmetic coding on the syntax elements having encrypted binary values to output a bitstream.
A hybrid moving picture processing method comprising:
outputting a prediction block by performing coding prediction to remove spatial redundancy or temporal redundancy of a digital image;
converting the difference block, generated as the difference between the current block of the digital image and the prediction block output in the outputting step, into the frequency domain through a DCT part in the case of an optical image and through a DHT part in the case of an infrared image;
quantizing the transform coefficients of the difference block transformed into the frequency domain; and
performing context-based adaptive binary arithmetic coding on syntax elements having binary values to output a bitstream.
8. The method according to claim 7,
wherein the quantizing step includes scanning the difference block of the quantized optical image through an optical image scanning part in the case of the optical image, and scanning the difference block of the quantized infrared image through an infrared image scanning part in the case of the infrared image.
KR1020150056480A 2015-04-22 2015-04-22 Apparatus and method for processing hybrid moving picture KR20160125704A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150056480A KR20160125704A (en) 2015-04-22 2015-04-22 Apparatus and method for processing hybrid moving picture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150056480A KR20160125704A (en) 2015-04-22 2015-04-22 Apparatus and method for processing hybrid moving picture

Publications (1)

Publication Number Publication Date
KR20160125704A true KR20160125704A (en) 2016-11-01

Family

ID=57484741

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150056480A KR20160125704A (en) 2015-04-22 2015-04-22 Apparatus and method for processing hybrid moving picture

Country Status (1)

Country Link
KR (1) KR20160125704A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107819573A (en) * 2017-10-17 2018-03-20 东北大学 High dimension safety arithmetic coding method
CN108702511A (en) * 2017-07-28 2018-10-23 深圳市大疆创新科技有限公司 Method, equipment and the system of transmission of video
GB2621912A (en) * 2022-06-16 2024-02-28 Mbda Uk Ltd Method for image encoding

Similar Documents

Publication Publication Date Title
KR102442894B1 (en) Method and apparatus for image encoding/decoding using prediction of filter information
KR102294733B1 (en) Methods of determination for chroma quantization parameter and apparatuses for using the same
EP2324638B1 (en) System and method for video encoding using adaptive loop filter
EP2705667B1 (en) Lossless coding and associated signaling methods for compound video
CN106170092B (en) Fast coding method for lossless coding
US20180139452A1 (en) Image coding device, image decoding device, image coding method, and image decoding method
KR20180074000A (en) Method of decoding video data, video decoder performing the same, method of encoding video data, and video encoder performing the same
US11146829B2 (en) Quantization parameter signaling in video processing
KR102324462B1 (en) Method and apparatus for scalable video coding using intra prediction mode
KR101646072B1 (en) Encryption apparatus and method for moving picture data
JP7297918B2 (en) Color conversion for video coding
US11039166B2 (en) Devices and methods for using base layer intra prediction mode for enhancement layer intra mode prediction
KR102480967B1 (en) Method and apparatus for image encoding/decoding
WO2012044093A2 (en) Method and apparatus for video-encoding/decoding using filter information prediction
KR20160125704A (en) Apparatus and method for processing hybrid moving picture
WO2016194380A1 (en) Moving image coding device, moving image coding method and recording medium for storing moving image coding program
CN114830642A (en) Image encoding method and image decoding method
US20130195180A1 (en) Encoding an image using embedded zero block coding along with a discrete cosine transformation
Kim et al. Improved H. 264/AVC lossless intra coding with two-layered residual coding (TRC)
KR20180113868A (en) Image Reencoding Method based on Decoding Data of Image of Camera and System thereof
CN114521325A (en) Image encoding method and image decoding method
CN114788270A (en) Image encoding method and image decoding method
CN114830650A (en) Image encoding method and image decoding method
KR20150117854A (en) Method and apparatus for applying Sample Adaptive Offset filtering
CN114830643A (en) Image encoding method and image decoding method

Legal Events

Date Code Title Description
A201 Request for examination
E601 Decision to refuse application