WO2003013144A1 - Moving object based motion estimation wavelet picture compression and decompression system - Google Patents

Moving object based motion estimation wavelet picture compression and decompression system

Info

Publication number
WO2003013144A1
Authority
WO
WIPO (PCT)
Prior art keywords
picture
wavelet
outputted
frame
moving
Prior art date
Application number
PCT/KR2001/001612
Other languages
French (fr)
Inventor
Yoengmin Kim
Myungok Lee
Original Assignee
Hichips Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hichips Co., Ltd.
Publication of WO2003013144A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/513Processing of motion vectors
    • H04N19/517Processing of motion vectors by encoding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H04N19/23Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding with coding of regions that are present throughout a whole video segment, e.g. sprites, background or mosaic
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/57Motion estimation characterised by a search window with variable size or shape
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/61Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/63Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding using sub-band based transform, e.g. wavelets

Abstract

The invention relates to a moving object based motion estimation wavelet picture compression and decompression system. The object of the present invention is to provide a moving object based motion estimation wavelet picture compression and decompression system, for significantly increasing the picture compression rate through extracting the background information separately, comparing it with the previously transmitted background information and transmitting only the differences, for significantly reducing the number of moving vectors and their sizes through a movement estimation of the same object in the previous picture for each wavelet band, and for enabling flexible changes of image size through separating only the moving objects for wavelet transformation.

Description

MOVING OBJECT BASED MOTION ESTIMATION WAVELET PICTURE COMPRESSION AND DECOMPRESSION SYSTEM
Technical Field The present invention relates to picture compression and decompression systems.
More particularly, the invention relates to a moving object based motion estimation wavelet picture compression and decompression system for improving the compression rate and processing speed, as well as minimizing memory usage, by performing wavelet transformation and compression on an error picture obtained by, first, storing the difference between the original picture and the next picture, calculated on the basis of the temporal overlap between pictures at the time of picture compression/decompression, secondly, calculating the difference between that next picture and the ensuing picture, and then performing a logical AND operation on the two picture differences so obtained.
Background Art Generally, in order to view a realistic moving picture such as that seen on TV or in movies, information at a rate of 30-50 frames per second is required. As a result, the differences between consecutive frames of a movie film or videotape are almost unnoticeable. Accordingly, as a method for reducing the redundancy between neighboring frames, the picture data is digitized and compressed.
The most commonly used picture data compression method is the Block-DCT processing method, which spatially compresses the information through a Discrete Cosine Transform (DCT) after dividing a frame into 8 x 8 pixel blocks. In this case, the subjects of the DCT are the brightness signal and the color difference signal.
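To make the Block-DCT step concrete, the following is a minimal sketch (not taken from the patent): it applies an 8 x 8 block DCT to a grayscale frame and quantizes the coefficients with a uniform step. The q_step value and the use of scipy are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dctn

def block_dct_quantize(frame, block=8, q_step=16.0):
    """Spatial compression of one frame by 8x8 Block-DCT followed by
    uniform quantization (q_step is an assumed, illustrative value)."""
    h, w = frame.shape
    out = np.zeros((h, w), dtype=np.float64)
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            coeffs = dctn(frame[y:y + block, x:x + block].astype(np.float64),
                          norm='ortho')
            out[y:y + block, x:x + block] = np.round(coeffs / q_step)
    return out
```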
This Block-DCT transformation method is effective for local picture estimation/compensation; however, it is incapable of reducing the time needed for global estimation/compensation and processing.
In order to overcome this problem, conventionally, a wavelet transform method is used instead of the Block-DCT transformation method. The wavelet transform method can flexibly determine the size of a block using the characteristics of the wavelet coefficients and, at the same time, can reduce the calculation time by using only the four lowest-frequency bands. It also increases the effectiveness of the global estimation/compensation. As a prior art, Korean Pat. No. 0223647 discloses a method and system for an effective moving estimation/compensation in a picture compression/decompression system.
FIG. 1 is a block configuration diagram of the picture compression/decompression apparatus according to the conventional technology.
If a picture is inputted to a picture input terminal IN, the picture is inputted to a first synthesizer 101 and a picture estimation/compensation section 125. The first synthesizer 101 synthesizes the inputted picture with a compensation value estimated by the picture estimation/compensation section 125 and inputs the result to a wavelet transformer 201. In the wavelet transformer 201, the output of the first synthesizer 101 is wavelet transformed for sub-band coding and inputted to a quantizer 103. The quantizer 103 quantizes the signal outputted from the wavelet transformer 201 according to the rate, and the result is inputted to an entropy coder 107 in order to code the undetermined amount of information. After the output of the quantizer 103 is reverse-quantized in a reverse-quantizer 115, it is inputted to a reverse-wavelet transformer 203 to be reverse-wavelet transformed.
Second to fourth synthesizers 205, 207, 209, synthesize the output of the reverse- wavelet transformer 203 with a picture estimation value from the picture estimation/ compensation section 125.
Both of the vertical and horizontal components of the output of the second synthesizer 205 are filtered in the first filter 211 at a low frequency. Each of the vertical and horizontal components of the output of the third synthesizer 207 is filtered in the second filter 213 at a low and high frequency respectively. Each of the vertical and horizontal components of the output of the fourth synthesizer 209 is filtered in the third filter 215 at a high and low frequency respectively.
The picture estimation/compensation section 125 estimates and compensates the inputted picture signal using the signals filtered in the first to third filters 211, 213, 215.
The output signals from the entropy coder 107 and picture estimation/ compensation section 125 are multiplex outputted through a multiplexer (MUX) 109.
The signal outputted from the multiplexer 109 is outputted through an output buffer 111. In the meantime, a rate control section 113 generates a rate control signal for the quantization of the quantizer 103 according to the signal outputted from the output buffer 111.
As explained so far, the conventional technology uses a wavelet transformation.
This wavelet transform method can flexibly determine the size of the blocks using the characteristics of the wavelet coefficients and, at the same time, can reduce the calculation time by using only the four lowest-frequency bands. It also increases the effectiveness of the global estimation/compensation. The picture compression/decompression system according to the conventional technology is capable of multi-resolution coding and decoding; however, it has the problems that most of the bit stream is concentrated in one level and that its effectiveness is low due to the absence of any time gain during decoding.
Disclosure of Invention
The present invention is designed to overcome the above problems of prior art.
The object of the present invention is to provide a moving object based motion estimation wavelet picture compression and decompression system, for significantly increasing the picture compression rate through extracting the background information separately and comparing the background information with the previously transmitted background information and transmitting only the differences, and for significantly reducing the number of moving vectors and their sizes through a movement estimation of the same object in the previous picture for each wavelet band, and for enabling flexible changes of image size through only separating moving objects for wavelet transformation.
Brief Description of Drawings
FIG. 1 is a block configuration diagram of the picture compression/decompression apparatus according to the conventional technology. FIG. 2 is a block diagram which shows an overall configuration of the moving object based motion estimation wavelet picture compression and decompression system according to one embodiment of the present invention.
FIG. 3 is an illustrative diagram which shows the bounding box extraction section in FIG. 2. FIG. 4 is an extraction process diagram for moving object separation mask for each band in FIG. 2.
FIG. 5a and FIG. 5b are diagrams which illustrate the moving estimation process by the moving estimation section.
FIG. 6 is an illustrative diagram which shows the order of preference in the scan direction for zero searching algorithm.
FIG. 7 is a diagram which shows the data formation of bit stream outputted from the second multiplexing section in FIG. 2.
FIG. 8 is a block diagram which shows the configuration of the moving object based motion estimation wavelet picture decompression system according to another embodiment of the present invention.
<Description of the numerals on the main parts of the drawings>
301: frame memory
302: frame difference calculation section
303: binarization section
304: bounding box extraction section
310, 320: first and second wavelet processing sections
311, 321: first and second common parts calculation sections
312, 322: first and second wavelet transformation sections
313, 323: first and second wavelet quantization sections
314, 324: first and second wavelet memories
330: synthesis section
340: moving object mask creation section
341: binarization section
342: mask memory
343, 352, 353: first to third AND calculation sections
350: background picture extracting section
351: reverse section
361: moving object separation section
362: moving estimation section
370, 390: first and second DCT/Quantization sections
381: background memory
382: comparing section
371, 391: first and second DCT sections
372, 392: first and second quantization sections
410: wavelet transformation/quantization section
421-425: first to fifth entropy coding sections
430: first multiplexing section
501: reverse multiplexing section
510, 530: first and second frame processing sections
521, 541: second and third multiplexing sections
522, 542: first and second frame memories
550: picture decompression section
560: error information decompression section
570: background image decompression section
580: wavelet processing section
Best Mode for Carrying Out the Invention
In order to achieve the stated objects of the present invention, the moving object based motion estimation wavelet picture compression and decompression system according to the present invention comprises: a frame difference calculation means for detecting two consecutive picture frames among the inputted picture frames and for calculating the detected picture differences; a binarization means for binarizing the frame differences outputted from said frame difference calculation means; a bounding box extracting means for extracting a rectangular bounding box by searching periphery blocks to objects through the frame differences outputted from said binarization means; first and second wavelet processing means for wavelet transforming and quantizing each frame picture which exists in the rectangular bounding box extracted by said bounding box extracting means; a synthesizing means for synthesizing wavelet quantization data outputted from each of said first and second wavelet processing means; a moving object mask creation means for creating moving object masks by calculating the overlapping parts of moving objects of the frames according to the synthesis results outputted from said synthesizing means; a moving object separation means for separating the moving objects among the frames outputted by the second wavelet processing means through the moving object masks outputted from said moving object mask creation means; a moving estimation means for calculating moving faults or moving vectors with reference to the frame data outputted from said first wavelet processing means and the moving object frames outputted from said moving object separation means; a first DCT/Quantization means for discrete cosine transforming and quantizing the moving faults which are the results of moving estimation, outputted from said moving estimation means; a background picture extracting means for separately extracting the background parts of a picture which have been covered by the moving objects outputted from said moving object mask creation means; a comparing means for calculating the differences by comparing between the picture outputted from said background picture extracting means and the background picture which is already stored; a second DCT/Quantization means for discrete cosine transforming and quantizing the moving faults outputted through said comparing means; a wavelet transformation/quantization means for wavelet transforming and quantizing the second frame of said input picture; first to fifth entropy coding means for entropy coding each data outputted from said moving estimation means, said first and second discrete cosine transforming means, said bounding box extracting means, and wavelet transformation/quantization means; and a second multiplexing means for outputting encoded bit streams by multiplexing the signals outputted from said first to fifth entropy coding means.
In order to achieve the other objective of the present invention, the moving object based motion estimation wavelet picture compression and decompression system according to the present invention comprises: a reverse multiplexing means for reverse multiplexing the inputted bit streams according to each subject; a frame memory for storing and outputting the bounding box information outputted through said reverse multiplexing means; a first frame processing means for Variable Length Decoding (VLD), Inverse Wavelet Quantizing (IWQ) and Inverse Discrete Wavelet Transforming (IDWT) the I-pictures (intra-pictures) of the initial two frames outputted from said reverse multiplexing means; a second frame processing means for Variable Length Decoding and movement compensating the motion vector components of the pictures inputted from the third frame which is outputted from said reverse multiplexing means; an error information decompression means for decompressing the codified error components outputted from said reverse multiplexing means through VLD, Run Length Decoding (RLD), Inverse Quantization (IQ) and Inverse Discrete Cosine Transform (IDCT) and transmitting them with said motion vectors to the second frame processing means; a background image decompression means for decompressing the background image information which is covered by the moving objects outputted from said reverse multiplexing means through VLD, RLD, IQ and IDCT; a second multiplexing means for multiplex outputting the picture outputted from said second frame processing means and said background image decompression means; a second memory for storing the picture frames outputted from said second multiplexing means; a picture decompression means for decompressing the stored picture in said second memory to a real picture through IWQ and IDWT; a first multiplexing means for storing the outputs from said picture decompression means and said first frame processing means in said first memory through multiplexing; and a wavelet processing means for outputting the previously stored frames in said first memory to said second multiplexing means through DWT and WQ.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. FIG. 2 is a block diagram which shows an overall configuration of the moving object based motion estimation wavelet picture compression and decompression system according to one embodiment of the present invention. According to the present invention, the system comprises: a frame difference calculation section 302 for separating the consecutive frames of the input picture into picture frames F1 and F2 through a frame memory 301 and for calculating the picture frame differences of the extracted picture frames; a binarization section 303 for binarizing the frame differences outputted from said frame difference calculation section; a bounding box extracting section 304 for extracting a rectangular bounding box by searching periphery blocks through the frame differences outputted from said binarization section; first and second wavelet processing sections 310, 320 for wavelet transforming and quantizing each of the frame pictures which exist in the bounding boxes extracted by said bounding box extracting section 304; a synthesizing section 330 for synthesizing the wavelet quantization data FW1box, FW2box outputted from each of said first and second wavelet processing sections; a mask creation section 340 for creating masks of the moving objects by calculating the overlapping parts of the moving object frames according to the synthesis result outputted from said synthesizing section; a moving object separation section 361 for separating the moving objects from the frames outputted from the second wavelet processing section 320 through the moving object masks outputted from said moving object mask creation section 340; a moving estimation section 362 for calculating moving faults or moving vectors with reference to the frame data outputted from said first wavelet processing section 310 and the moving object frames outputted from said moving object separation section 361; a first DCT/Quantization section 370 for discrete cosine transforming and quantizing the moving faults which are the results of moving estimation, outputted from said moving estimation section 362; a background picture extracting section 350 for separately extracting the background parts of a picture which have been covered by the moving objects outputted from said moving object mask creation section 340; a comparing section 382 for calculating the differences by comparing between the picture outputted from said background picture extraction section and the background picture which is already stored in the background memory 381; a second DCT/Quantization section 390 for discrete cosine transforming and quantizing the moving faults outputted through said comparing section 382; a wavelet transformation/quantization section 410 for wavelet transforming and quantizing the second frame of said input picture; first to fifth entropy coding sections 421-425 for entropy coding each data outputted from said moving estimation section 362, said first and second DCT/Quantization sections 370, 390, said bounding box extracting section 304, and said wavelet transformation/quantization section 410; and a first multiplexing section 430 for outputting encoded bit streams by multiplexing the signals outputted from said first to fifth entropy coding sections 421-425.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to FIG. 2 to FIG. 7.
First of all, the present invention performs wavelet transformation and quantization after extracting bounding boxes of the moving objects of the inputted picture. The direction of movement is searched and estimated by separating the wavelet quantized moving objects.
FIG. 2 is a block diagram which shows an overall configuration of the moving object based motion estimation wavelet picture compression and decompression system according to one embodiment of the present invention.
As shown in FIG. 2, the first two consecutive picture frames F1 and F2 among the input picture frames are coded as intra-coded pictures: entropy coding is performed after completion of the wavelet transformation and wavelet quantization processes.
The frame memory 301 separates the inputted pictures into the first and second frames F1 and F2, the frame difference calculation section 302 calculates the difference between them, and the difference is transformed into binary values in the binarization section 303.
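A minimal sketch of this frame-difference and binarization step, assuming 8-bit grayscale frames; the threshold value is an assumption, since the patent does not specify one.

```python
import numpy as np

def binarize_difference(f1, f2, threshold=16):
    """Compute the frame difference between consecutive frames F1 and F2
    and binarize it: 1 where the pixels changed, 0 elsewhere (the
    threshold is an assumed, illustrative value)."""
    diff = np.abs(f2.astype(np.int16) - f1.astype(np.int16))
    return (diff > threshold).astype(np.uint8)
```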
The binarized frame DIF21 is inputted to the bounding box extraction section 304 and the periphery blocks are searched by the moving object separation algorithm. A rectangular bounding box, which bounds the corresponding periphery blocks, is extracted. A detailed description of the process of forming a bounding box is given in the following with reference to FIG. 3.
First of all, a non-zero block whose object has not yet been found is searched for by scanning the rows from left to right. Moving to the right starting from the first non-zero block, the block which is adjacent to at least one face of a zero block, among the four blocks surrounding the current periphery block, is assigned as the next periphery block.
If two or more blocks are adjacent to such a face, then the next periphery block is assigned in the order of preference shown in FIG. 6, according to the previously passed blocks. Also, the current periphery block cannot be the next periphery block; however, a past periphery block can be the next periphery block.
Likewise, the search for periphery blocks either returns to the starting block, with the number of periphery blocks being recorded, or is completed when a block adjacent to the frame boundary is reached.
Among the periphery blocks, the top left corner coordinate of a bounding box is assigned by selecting the minimum X coordinate and the minimum Y coordinate and the bottom right corner of the bounding box is assigned by selecting the maximum X coordinate and the maximum Y coordinate.
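The corner selection described above can be sketched as follows. The patent's own algorithm traces periphery blocks in the preferred scan order of FIG. 6; in this simplified sketch, connected-component labelling on a block grid stands in for that tracing, and only the min/max corner selection follows the text. The block size and the use of scipy.ndimage are assumptions.

```python
import numpy as np
from scipy import ndimage

def bounding_boxes(binary_mask, block=8):
    """Return one (x_min, y_min, x_max, y_max) rectangle per moving object
    found in the binarized difference frame.  Connected-component labelling
    replaces the periphery-block tracing of FIG. 6 for brevity."""
    h, w = binary_mask.shape
    # Collapse the pixel mask onto a grid of blocks: a block is non-zero
    # if any pixel inside it changed.
    grid = binary_mask[:h - h % block, :w - w % block] \
        .reshape(h // block, block, w // block, block).max(axis=(1, 3))
    labels, n_objects = ndimage.label(grid)
    boxes = []
    for obj in range(1, n_objects + 1):
        ys, xs = np.nonzero(labels == obj)
        boxes.append((xs.min() * block, ys.min() * block,
                      (xs.max() + 1) * block, (ys.max() + 1) * block))
    return boxes
```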
In the first and second common parts calculation sections 311, 321, the common parts are calculated from the bounding box extracted by the bounding box extraction section 304 and each of the extracted first and second frames F1, F2. More specifically, after calculating a bounding box for each of the moving objects that exist in the binarized frame DIF21, wavelet transformation and quantization are performed on the parts of the picture frames F1 and F2 that exist only within each of the bounding boxes, in the first and second wavelet transformation sections 312, 322 and the first and second quantization sections 313, 323. Each of the wavelet transformed and quantized pictures is stored in the first and second wavelet memories 314, 324 and outputted as FW1box and FW2box.
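A sketch of the per-bounding-box wavelet transformation and quantization, using the PyWavelets library for illustration; the wavelet family ('haar') and the uniform quantization step are assumptions, and the three-level depth follows the three-step transform presupposed later in the text.

```python
import numpy as np
import pywt

def wavelet_quantize(frame, box, levels=3, q_step=8.0, wavelet='haar'):
    """Wavelet-transform and uniformly quantize the picture inside one
    bounding box, as done for F1 and F2 to obtain FW1box and FW2box."""
    x0, y0, x1, y1 = box
    region = frame[y0:y1, x0:x1].astype(np.float64)
    coeffs = pywt.wavedec2(region, wavelet, level=levels)
    # Quantize the approximation band and every detail band with one step.
    quantized = [np.round(coeffs[0] / q_step)]
    for (ch, cv, cd) in coeffs[1:]:
        quantized.append(tuple(np.round(c / q_step) for c in (ch, cv, cd)))
    return quantized
```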
If the wavelet transformed frame for the first frame F1 in the bounding box is denoted BWF1 and the wavelet transformed frame for the second frame F2 is denoted BWF2, then BWF1 and BWF2 are synthesized and the result is binarized in the binarization section 341. Afterwards, as a result of the binarization, a mask BWFMASK12 is created through the mask memory 342 which represents the shape in which the objects of the first and second frames F1 and F2 overlap due to movement.
Here, if the third frame F3 passes through the same process, a mask BWFMASK23 corresponding to the second and third frames F2, F3 is created through the mask memory 342.
FIG. 4 is an extraction diagram of moving object separation mask for each band.
As shown in FIG. 2, if BWFMASK12 and BWFMASK23, which are obtained from the above mask extraction process, are AND operated in a first AND calculation section 343, then a mask BWFMASK2 which represents only the moving objects in the second frame F2 can be produced.
The mask BWFMASK2 calculated above is AND operated in the moving object separation section 361 with the bounded, wavelet transformed second frame BWF2, and then only the moving object BWF2OBJ in the second frame F2 is separated.
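The mask arithmetic of this and the preceding paragraph reduces to element-wise logical operations; a minimal sketch per band, assuming the masks and BWF2 are arrays of the same shape:

```python
import numpy as np

def separate_moving_object(bwfmask12, bwfmask23, bwf2):
    """AND the overlap masks of consecutive frame pairs to obtain BWFMASK2,
    the mask of the object that moved in frame F2, then AND it with the
    wavelet-transformed bounded frame BWF2 to cut out the object BWF2OBJ."""
    bwfmask2 = np.logical_and(bwfmask12, bwfmask23)
    bwf2_obj = np.where(bwfmask2, bwf2, 0)   # masked wavelet coefficients
    return bwfmask2, bwf2_obj
```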
The moving faults and moving vectors are produced by inputting the separated moving object to the moving estimation section 362 as well as using BWF1 as a reference frame.
Also, the background parts of the BWF2 objects which were covered by the moving objects of BWF1 are still picture data and are subjected to intracoding.
In order to calculate this, the covered background picture is extracted by the background picture extraction section 350.
More specifically, the moving object frame BWF1, which is separately outputted through the first AND calculation section 343, is NOT operated in a reverse section 351. The result of this operation and the moving object mask BWFMASK12 outputted from the mask memory 342 are AND operated in a second AND calculation section 352 in order to separately extract a background picture BWFMASKBACK. From the separately extracted background picture BWFMASKBACK, only the background part BWF2BACK, which is covered by the moving objects of the past image of BWF2, can be separated using a third AND calculation section 353.
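One reading of this NOT/AND sequence, again as element-wise mask arithmetic (a sketch, not the patent's exact section wiring; the arrays are assumed to be aligned per band):

```python
import numpy as np

def extract_covered_background(moving_object_mask, bwfmask12, bwf2):
    """Invert the moving-object mask (reverse section 351), AND it with the
    overlap mask BWFMASK12 (second AND section 352) to get BWFMASKBACK, and
    AND that with BWF2 (third AND section 353) to obtain the covered
    background part BWF2BACK."""
    bwfmask_back = np.logical_and(np.logical_not(moving_object_mask), bwfmask12)
    bwf2_back = np.where(bwfmask_back, bwf2, 0)
    return bwf2_back
```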
The background part BWF2 BACK produced by the third AND calculation section 353 is accumulated in the background memory 382 and in the comparing section 382, the differences are calculated by comparing with the previously accumulated data.
The background data difference produced through the comparing section 382 and the moving fault, which is the moving estimation result, are discrete cosine transformed by the first and second DCT sections 371, 391 and quantized by the first and second quantization sections 372, 392. Afterwards, they are run length coded (RLC), and entropy coding is executed through each of the entropy coding sections 421-425. In the meantime, the bounding box information BINF which is outputted by the bounding box extraction section 304 is entropy coded through the fourth entropy coding section 424.
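The run length coding step can be illustrated with a simple zero-run scheme; the exact symbol format is not specified in the patent, so the (run_of_zeros, value) pairs below are an assumption.

```python
def run_length_encode(coefficients):
    """Zero-run-length coding of a 1-D sequence of quantized coefficients,
    as applied to the DCT-quantized error and background differences
    before entropy coding."""
    runs, zeros = [], 0
    for c in coefficients:
        if c == 0:
            zeros += 1
        else:
            runs.append((zeros, c))
            zeros = 0
    runs.append((zeros, 0))   # end marker carrying the trailing zero count
    return runs
```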
Also, for the frame data processed by passing the input picture frame F2 through the wavelet transformation/quantization section 410, entropy coding is executed through the fifth entropy coding section 425.
The data, which are outputted from each of the entropy coding sections 421-425, are outputted in a codified bit stream format through the first multiplexing section 430 as shown in FIG. 7.
FIG. 5a and FIG. 5b are diagrams which illustrate the moving estimation process by the moving estimation section 362.
As shown in FIG. 5a and FIG. 5b, the moving estimation basically utilizes the wavelet transformed picture data. Accordingly, the estimation is carried out according to each wavelet band and the search window illustrated in FIG. 5a shows one band picture.
More specifically, the moving vector corresponding to the minimum error is produced by searching for the moving object BWF2OBJ in the reference frame.
Here, the picture of the moving object BWF2OBJ is divided into macro-blocks of a specific size as shown in FIG. 5b. A search is executed in the reference frame for each of the macro-blocks to produce moving vectors, using the bounding box as a search window. If a three-step wavelet transformation is presupposed, then the picture size for each step, i.e., each band, increases four times; therefore the sizes of the search window and macro-blocks are readjusted by a factor of four accordingly.
Since the size of the initial bounding box before the wavelet transformation is different for each moving object, the sizes of the search window and macro-blocks also become different. As a result, the size of a macro-block should be limited to at most 16 x 16 pixels.
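A sketch of the per-band block matching just described: each macro-block of the separated moving object is compared against the reference frame inside the bounding box, and the displacement with the minimum sum of absolute differences (SAD) is kept. Full search and the SAD criterion are assumptions; the patent does not name a particular matching measure.

```python
import numpy as np

def estimate_motion(object_band, reference_band, box, mb=16):
    """Full-search block matching for one wavelet band, using the bounding
    box (x0, y0, x1, y1) as the search window."""
    x0, y0, x1, y1 = box
    vectors = []
    for by in range(y0, y1 - mb + 1, mb):
        for bx in range(x0, x1 - mb + 1, mb):
            block = object_band[by:by + mb, bx:bx + mb].astype(np.int32)
            best_sad, best_v = None, (0, 0)
            for sy in range(y0, y1 - mb + 1):
                for sx in range(x0, x1 - mb + 1):
                    cand = reference_band[sy:sy + mb, sx:sx + mb].astype(np.int32)
                    sad = np.abs(block - cand).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_v = sad, (sx - bx, sy - by)
            vectors.append(((bx, by), best_v))   # block position, moving vector
    return vectors
```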
After the moving vectors for each macro-block are extracted, their overall average is calculated. The moving vector of each macro-block is then coded as its difference from the average over all macro-blocks of the moving object.
Also, the lower-band moving vectors are twice as large as the uppermost-band moving vectors, since the size of the vectors that move in each band is determined by the band. Applying this principle, the number of moving vectors can be minimized.
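A sketch of the two vector-reduction ideas above: coding each macro-block vector as its difference from the object average, and deriving lower-band vectors from the uppermost band by doubling per level. The function names and the integer-shift scaling are illustrative assumptions.

```python
import numpy as np

def code_vectors(vectors):
    """Code each macro-block moving vector as its difference from the
    average vector of the whole moving object; only the average and the
    (usually small) differences then need to be transmitted."""
    v = np.array([mv for _, mv in vectors], dtype=np.int32)
    mean = np.round(v.mean(axis=0)).astype(np.int32)
    return mean, v - mean

def scale_vector_to_lower_band(mv, level_diff=1):
    """Derive a lower-band vector from an uppermost-band one: each step down
    doubles the band resolution, so both components double as well."""
    return (mv[0] << level_diff, mv[1] << level_diff)
```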
FIG. 8 is a block diagram which shows the configuration of the moving object based motion estimation wavelet picture decompression system according to another embodiment of the present invention. According to the present invention, the decompression system comprises: a reverse multiplexing section 501 for reverse multiplexing the inputted bit stream for each category; a first frame memory 522 for storing and outputting the bounding box information which is outputted from said reverse multiplexing section 501; a first frame processing section 510 for Variable Length Decoding (VLD), Inverse Wavelet Quantizing (IWQ) and Inverse Discrete Wavelet Transforming (IDWT) the I-pictures of the initial two frames outputted from said reverse multiplexing section 501; a second frame processing section 530 for Variable Length Decoding and movement compensating the motion vector components of the pictures inputted from the third frame which is outputted from said reverse multiplexing section 501; an error information decompression section 560 for decompressing the codified error components outputted from said reverse multiplexing section through VLD, Run Length Decoding (RLD), Inverse Quantization (IQ) and Inverse Discrete Cosine Transform (IDCT) and transmitting them with said motion vectors to the second frame processing section 530; a background image decompression section 570 for decompressing the background image information which is covered by the moving objects outputted from said reverse multiplexing section 501 through VLD, RLD, IQ and IDCT; a third multiplexing section 541 for multiplex outputting the pictures outputted from said second frame processing section and said background image decompression section; a second frame memory 542 for storing the picture frames outputted from said third multiplexing section 541; a picture decompression section 550 for decompressing the stored picture in said second frame memory 542 to a real picture through IWQ and IDWT; a second multiplexing section 521 for storing the outputs from said picture decompression section 550 and said first frame processing section 510 in said first frame memory 522 through multiplexing; and a wavelet processing section 580 for outputting the previously stored frames in said first frame memory 522 to said third multiplexing section 541 through DWT and WQ.
Hereinafter, the above embodiment of the present invention will be described in detail with reference to the accompanying drawings.
As shown in FIG. 8, the input bit stream contains five categories: 'BD F' represents the bounding box information, 'I' represents an I-picture, which is the initial frame of the bit stream, 'MV' represents a motion vector of a P-picture following the I-picture, 'error' represents an error component for the motion compensation, and 'back' is the background picture information which was covered by the moving objects of the current frame.
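For illustration, the five-way split of the bit stream may be viewed as a simple routing table; the category names mirror the labels of FIG. 8 and the routing targets are the sections of FIG. 8, but the Python representation itself is purely a sketch:

from enum import Enum

class Category(Enum):
    BD_F  = "bounding box information"
    I     = "I-picture (initial frames)"
    MV    = "motion vector of a P-picture"
    ERROR = "error component for motion compensation"
    BACK  = "background covered by the moving objects"

ROUTES = {
    Category.BD_F:  "first frame memory 522",
    Category.I:     "first frame processing section 510",
    Category.MV:    "second frame processing section 530",
    Category.ERROR: "error information decompression section 560",
    Category.BACK:  "background image decompression section 570",
}

def demultiplex(packets):
    # Route each (category, payload) pair of the reverse-multiplexed
    # bit stream to the section that decodes it.
    return [(ROUTES[cat], payload) for cat, payload in packets]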
If two I-pictures are transmitted initially, they are processed with VLD, IWQ, IDWT in the first frame processing section 510 and stored in the first frame memory 522 via the second multiplexing section 521.
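The I-picture path may be sketched as the inverse chain below, using PyWavelets' waverec2 in place of the IDWT; the Haar filter, the three-level decomposition and the uniform quantization step q_step = 8 are assumptions made only for the round-trip example, and the VLD stage is omitted because it depends on the entropy code actually used:

import numpy as np
import pywt

def inverse_wavelet_quantize(quantized, q_step=8.0):
    # IWQ: undo a uniform quantizer by rescaling every sub-band (assumed scheme).
    approx, details = quantized[0], quantized[1:]
    return [approx * q_step] + [tuple(d * q_step for d in lvl) for lvl in details]

def decode_i_picture(quantized, q_step=8.0):
    # IWQ followed by IDWT for an already variable-length-decoded I-picture.
    coeffs = inverse_wavelet_quantize(quantized, q_step)
    return pywt.waverec2(coeffs, wavelet="haar")

# Round-trip example: forward DWT/WQ of a random frame, then decode it again.
frame  = np.random.rand(64, 64)
coeffs = pywt.wavedec2(frame, "haar", level=3)
quant  = [np.rint(coeffs[0] / 8.0)] + [tuple(np.rint(d / 8.0) for d in lvl)
                                       for lvl in coeffs[1:]]
recon  = decode_i_picture(quant, q_step=8.0)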
Also, the past frames stored in the first frame memory 522 are wavelet transformed (DWT) and wavelet quantized (WQ) by the wavelet processing section 580 and stored, for each band category, in the second frame memory 542.
From the third frame, the motion vector component of a P-picture is variable length decoded (VLD) by the second frame processing section 530, and the error component is processed through VLD, RLD, IQ, and IDCT in the error information decompression section 560 and then motion compensated (MC) with the motion vectors. At this instance, the picture stored in the second frame memory 542 is used as the reference frame.
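A compact sketch of this motion-compensation step, with the reference picture taken from the second frame memory; the dictionary layout of the vectors and decoded errors and the 16 x 16 block size are assumptions carried over from the earlier sketches:

import numpy as np

def motion_compensate(ref_frame, vectors, errors, blk=16):
    # Rebuild a P-picture: for every macro-block, fetch the block the motion
    # vector points to in the reference frame and add the decoded error
    # (the IDCT output) for that block.
    out = np.zeros_like(ref_frame)
    for (by, bx), (dy, dx) in vectors.items():
        pred = ref_frame[by + dy:by + dy + blk, bx + dx:bx + dx + blk]
        out[by:by + blk, bx:bx + blk] = pred + errors[(by, bx)]
    return out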
The data stored in the second frame memory 542 is decompressed to the original picture by the picture decompression section 550 through the IWQ and IDWT processes and stored in the corresponding bounding box location of the first frame memory 522.
Also, the background picture that was covered by the moving objects and transmitted separately is decompressed through the VLD, RLD, IQ and IDCT processes in the background image decompression section 570 and stored in the second frame memory 542 after being added to the moving object part.
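Merging the separately decoded covered background with the moving-object part can be expressed with a binary object mask; the mask-based blend below is one possible reading of this step, not a verbatim description of the background image decompression section 570:

import numpy as np

def merge_background(object_part, covered_background, object_mask):
    # Keep the moving-object pixels where the mask is set and fill the
    # remaining pixels with the background the object had covered.
    return np.where(object_mask.astype(bool), object_part, covered_background)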
The picture frame decompressed by the above method is transmitted from the first frame memory 522 to a display unit (not shown).
If the changes in the background of the moving objects or in the camera are severe, wavelet decoding is performed on the received first I-picture frame of the video sequence, and this first frame is stored in the first frame memory 522. Afterwards, the fixed object bounded in the second frame is wavelet decoded first and then added to the stored frame.
If the camera is fixed or the change in the background is not severe, only the background image is extracted using the background extraction algorithm; after one frame is received, this frame is wavelet transformed and then stored in the background frame memory. Wavelet decoding is performed on the received first I-picture frame of the video sequence, and the frame is stored in the first frame memory 522.
From the second frame onward, only the bounded moving object fault is received, and this is wavelet decoded and then binarized.
Industrial Applicability
As explained so far, the present invention provides a moving object based motion estimation wavelet picture compression and decompression system which significantly increases the picture compression rate by extracting the background information separately, comparing it with the previously transmitted background information, and transmitting only the differences; significantly reduces the number and size of motion vectors through motion estimation of the same object in the previous picture for each wavelet band; and enables flexible changes of the image size by wavelet transforming only the separated moving objects.

Claims

What is claimed is:
1. A moving object based motion estimation wavelet picture compression and decompression system, comprising: a frame difference calculation means for detecting two consecutive picture frames among the inputted picture frames and for calculating the differences between the detected pictures; a binarization means for binarizing the frame differences outputted from said frame difference calculation means; a bounding box extracting means for extracting a rectangular bounding box by searching the periphery blocks of objects through the frame differences outputted from said binarization means; first and second wavelet processing means for wavelet transforming and quantizing each frame picture which exists in the rectangular bounding box extracted by said bounding box extracting means; a synthesizing means for synthesizing the wavelet quantization data outputted from each of said first and second wavelet processing means; a moving object mask creation means for creating moving object masks by calculating the overlapping parts of the moving objects of the frames according to the synthesis results outputted from said synthesizing means; a moving object separation means for separating the moving objects among the frames outputted by said second wavelet processing means through the moving object masks outputted from said moving object mask creation means; a moving estimation means for calculating moving faults or moving vectors with reference to the frame data outputted from said first wavelet processing means and the moving object frames outputted from said moving object separation means; a first DCT/Quantization means for discrete cosine transforming and quantizing the moving faults, which are the results of moving estimation, outputted from said moving estimation means; a background picture extracting means for separately extracting the background parts of a picture which have been covered by the moving objects outputted from said moving object mask creation means; a comparing means for calculating the differences by comparing the picture outputted from said background picture extracting means with the background picture which is already stored; a second DCT/Quantization means for discrete cosine transforming and quantizing the moving faults outputted through said comparing means; a wavelet transformation/quantization means for wavelet transforming and quantizing the second frame of said input picture; first to fifth entropy coding means for entropy coding each data outputted from said moving estimation means, said first and second DCT/Quantization means, said bounding box extracting means, and said wavelet transformation/quantization means; and a second multiplexing means for outputting encoded bit streams by multiplexing the signals outputted from said first to fifth entropy coding means.
2. The system as claimed in Claim 1, wherein said bounding box extracting means forms the bounding box as a rectangular box that includes the moving objects, by searching the periphery blocks of the moving objects.
3. The system as claimed in Claim 1, wherein said first wavelet processing means further includes: a first common parts calculation section for calculating the common parts between the bounding box extracted from the bounding box extracting means and the first frame picture outputted from said frame memory; a first wavelet transformation section for wavelet transforming the picture outputted from said first common parts calculation section; a first wavelet quantization section for quantizing the picture outputted from said first wavelet transformation section; and a first wavelet memory for storing and outputting the data outputted from said first wavelet quantization section.
4. The system as claimed in Claim 1, wherein said second wavelet processing means further includes: a second common parts calculation section for merging the bounding box extracted from the bounding box extracting means and the second frame picture outputted from said frame memory; a second wavelet transformation section for wavelet transforming the picture outputted from said second common parts calculation section; a second wavelet quantization section for quantizing the picture outputted from said second wavelet transformation section; and a second wavelet memory for storing and outputting the data outputted from said second wavelet quantization section.
5. A moving object based motion estimation wavelet picture decompression system, comprising: a reverse multiplexing means for reverse multiplexing the inputted bit stream for each category; a first frame memory for storing and outputting the bounding box information which is outputted from said reverse multiplexing means; a first frame processing means for Variable Length Decoding (VLD), Inverse Wavelet Quantizing (IWQ) and Inverse Discrete Wavelet Transforming (IDWT) the I-pictures of the initial two frames outputted from said reverse multiplexing means; a second frame processing means for variable length decoding and motion compensating the motion vector components of the pictures from the third frame onward, which are outputted from said reverse multiplexing means; an error information decompression means for decompressing the codified error components outputted from said reverse multiplexing means through VLD, Run Length Decoding (RLD), Inverse Quantization (IQ) and Inverse Discrete Cosine Transform (IDCT), and transmitting them with said motion vectors to the second frame processing means; a background image decompression means for decompressing, through the VLD, RLD, IQ and IDCT processes, the background image information which is covered by the moving objects and outputted from said reverse multiplexing means; a third multiplexing means for multiplexing and outputting the pictures outputted from said second frame processing means and said background image decompression means; a second frame memory for storing the picture frames outputted from said third multiplexing means; a picture decompression means for decompressing the picture stored in said second frame memory into a real picture through IWQ and IDWT; a second multiplexing means for storing the outputs from said picture decompression means and said first frame processing means in said first frame memory through multiplexing; and a wavelet processing means for outputting the frames previously stored in said first frame memory to said third multiplexing means through DWT and WQ.
6. The system as claimed in Claim 5, wherein the separately outputted information from said reverse multiplexing means further includes: bounding box information; an initial frame of the input bit stream; a motion vector for the next frame which comes after said initial frame; error information for the motion compensation; and background picture information which was covered by the moving objects of the current frame.
PCT/KR2001/001612 2001-07-28 2001-09-26 Moving object based motion estimation wavelet picture compression and decompression system WO2003013144A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2001-0045705A KR100468384B1 (en) 2001-07-28 2001-07-28 Moving object based Motion estimation Wavelet picture Compression and Decompression system
KR2001/45705 2001-07-28

Publications (1)

Publication Number Publication Date
WO2003013144A1 true WO2003013144A1 (en) 2003-02-13

Family

ID=19712673

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2001/001612 WO2003013144A1 (en) 2001-07-28 2001-09-26 Moving object based motion estimation wavelet picture compression and decompression system

Country Status (2)

Country Link
KR (1) KR100468384B1 (en)
WO (1) WO2003013144A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030032355A (en) * 2001-10-17 2003-04-26 이호석 Method for extracting video object plane by using moving object edge
KR100451584B1 (en) * 2001-12-20 2004-10-08 엘지전자 주식회사 Device for encoding and decoding a moving picture using of a wavelet transformation and a motion estimation
KR100497353B1 (en) * 2002-03-26 2005-06-23 삼성전자주식회사 Apparatus for processing image, apparatus and method for receiving processed image
CN107742115A (en) * 2017-11-10 2018-02-27 广东工业大学 A kind of method and system of the moving target analyte detection tracking based on video monitoring
KR102140873B1 (en) * 2018-11-27 2020-08-03 연세대학교 산학협력단 Apparatus and method for detecting dynamic object

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR970057959A (en) * 1995-12-23 1997-07-31 김광호 Effective Motion Estimation / Compensation Method and System in Video Compression / Restoration System
US6141452A (en) * 1996-05-13 2000-10-31 Fujitsu Limited Apparatus for compressing and restoring image data using wavelet transform
JP2001197499A (en) * 1999-10-29 2001-07-19 Sony Corp Method and device for encoding animation, method and device for decoding the same, and method and device for transmitting animation

Also Published As

Publication number Publication date
KR100468384B1 (en) 2005-01-27
KR20030010983A (en) 2003-02-06

Similar Documents

Publication Publication Date Title
US6272253B1 (en) Content-based video compression
EP1528813B1 (en) Improved video coding using adaptive coding of block parameters for coded/uncoded blocks
US5896176A (en) Content-based video compression
US6026183A (en) Content-based video compression
EP1379000B1 (en) Signal encoding method and apparatus and decoding method and apparatus
JP2744871B2 (en) Image signal encoding method and image signal encoding device
US6614847B1 (en) Content-based video compression
US6771826B2 (en) Digital image encoding and decoding method and digital image encoding and decoding device using the same
US5491523A (en) Image motion vector detecting method and motion vector coding method
US6078694A (en) Image signal padding method, image signal coding apparatus, image signal decoding apparatus
US8064516B2 (en) Text recognition during video compression
EP1359770B1 (en) Signaling for fading compensation in video encoding
JPH07288474A (en) Vector quantization coding/decoding device
JPH07203438A (en) Image information compressing and expanding device
US20070160298A1 (en) Image encoder, image decoder, image encoding method, and image decoding method
US5541659A (en) Picture signal coding/decoding method and device using thereof
WO2000014685A1 (en) Subband coding/decoding
CA2188840C (en) Content-based video compression
WO2003013144A1 (en) Moving object based motion estimation wavelet picture compression and decompression system
KR100281322B1 (en) Binary shape signal encoding and decoding device and method thereof
KR19980033415A (en) Apparatus and method for coding / encoding moving images and storage media for storing moving images
JPH09326024A (en) Picture coding and decoding method and its device
US20040013200A1 (en) Advanced method of coding and decoding motion vector and apparatus therefor
JPH06113291A (en) Picture coder and decoder
KR20020015231A (en) System and Method for Compressing Image Based on Moving Object

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS JP KE KG KP KZ LK LR LS LT LU LV MA MD MG MK MW MX MZ NO NZ PH PL PT RO RU SE SG SI SK SL TJ TM TR TT TZ UA US UZ VN YU ZA

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZW AM AZ BY KG KZ MD TJ TM AT BE CH CY DE DK ES FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW MR NE SN TD TG

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP