WO2005115007A1 - Systems and methods of encoding moving pictures for mobile communication terminals - Google Patents

Systems and methods of encoding moving pictures for mobile communication terminals

Info

Publication number
WO2005115007A1
Authority
WO
WIPO (PCT)
Prior art keywords
block
unit
motion
happened
coding
Prior art date
Application number
PCT/KR2004/001204
Other languages
French (fr)
Inventor
Changho Lee
Original Assignee
Multivia Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Multivia Co., Ltd. filed Critical Multivia Co., Ltd.
Priority to CNB2004800007255A priority Critical patent/CN100405847C/en
Priority to PCT/KR2004/001204 priority patent/WO2005115007A1/en
Publication of WO2005115007A1 publication Critical patent/WO2005115007A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L1/00Arrangements for detecting or preventing errors in the information received
    • H04L1/004Arrangements for detecting or preventing errors in the information received by using forward error control
    • H04L1/0075Transmission of coding parameters to receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103Selection of coding mode or of prediction mode
    • H04N19/109Selection of coding mode or of prediction mode among a plurality of temporal predictive coding modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103Selection of coding mode or of prediction mode
    • H04N19/11Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/12Selection from among a plurality of transforms or standards, e.g. selection between discrete cosine transform [DCT] and sub-band transform or selection between H.263 and H.264
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/13Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • H04N19/137Motion inside a coding unit, e.g. average field, frame or block difference
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/176Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process
    • H04N19/467Embedding additional information in the video signal during the compression process characterised by the embedded information being invisible, e.g. watermarking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding

Definitions

  • the present invention relates to moving picture encoding. More particularly, the present invention relates to systems and methods of encoding moving pictures for mobile communication terminals, which are capable of minimizing the processes and the number of operations when encoding moving pictures and are adaptable to mobile communication terminals by being embodied in a software manner.
  • a Video On Demand (VOD) service such as a movie service on the Internet is a representative example of the technologies.
  • VOD Video On Demand
  • International standards in such diverse fields for processing moving pictures are defined.
  • MPEG Moving Picture Experts Group
  • CD video Compact Disk
  • MPEG-2 applied and used as a compression method for a high definition digital TV broadcasting or a Digital Video Disk (DVD) service
  • MPEG-4 which is a method of making a compression coder suitable for various contents and used in a moving picture compression solution in wireless environments such as an internet broadcasting or International Mobile Telecommunication-2000 (IMT-2000).
  • H.261 which is developed for a video conference, has a performance similar to that of MPEG-1 and is mainly used in ISDN network
  • H.263 which is developed for a videophone and provides a base of MPEG-4
  • H.26L H.264
  • H.264 is the most recently standardized compression method for a picture phone, a moving picture-supporting cellular phone and TV, etc. and can achieve roughly twice the compression performance of MPEG-4.
  • the above-mentioned moving picture-processing technologies are applied to the various fields, and a personal computer (PC) is a field commonly contacted by the public.
  • PC personal computer
  • the PC has a problem of a mobility restraint.
  • a moving picture encoding apparatus comprises a video signal transforming unit 10 which transforms an inputted RGB image into a moving picture compression object signal (YUV420 or YUV422), an intra-coding section 20 which performs a compression only with a frame itself using a spatial correlation in a frame, and an inter-coding section 30 which performs a compression using a time relation between a current frame and a previous frame.
  • the system encodes a moving picture using a discrete cosine transform (DCT) method.
  • the intra-coding section 20 comprises a discrete cosine transform (DCT) unit 21, a quantization unit 22 and a Huffman coding unit 23.
  • For the Y component of the moving picture compression object signal, the DCT unit 21 divides a macro block (16*16 pixels) into four block units (8*8 pixels) and performs a DCT for the block units (8*8 pixels). Since the U and V components of the moving picture compression object signal have a data size corresponding to 1/4 of the Y component, the DCT unit 21 performs the DCT for the U and V components in a macro block. After the DCT, the DCT unit performs a quantization and then a Huffman coding as an entropy coding method.
  • the inter-coding section 30 applies the Huffman coding method as an entropy coding as with the intra-coding section 20, and performs an encoding using a time relation between a current frame and a previous frame for a predictive frame coding (i.e., P frame coding). At this time, in the case of an image for the previous frame, since comparison is made for an image restored from compressed image, a decoding is also performed when performing an encoding.
  • a motion estimation unit 31 in the inter-coding section 30 divides a moving picture compression object signal (YUV420, YUV422), which is a signal obtained by transforming the RGB (Red Green Blue) image outputted from a video signal transforming unit 10 into a MPEG format, into macro blocks of 16*16 pixels and estimates whether motions happened in each macro block, i.e., search area. Then, it obtains a motion vector (MV) through the estimation of motions and estimates information about conditions where the motion is out of the search area or where the motion cannot be expressed by a motion vector. At this time, the motion vector is the location information having the greatest similarity between a current frame image and a previous frame image.
  • a moving picture compression object signal YUV420, YUV422
  • each macro block is classified into a block where no motion happened and a block where motions happened, and the block where motions happened is classified into an intra mode and an inter mode depending on its encoding method.
  • the intra mode is a mode of processing the macro block in the same manner as the intra coding
  • the inter mode is a mode of encoding the macro block using a difference value between a current frame and a previous frame.
  • the intra mode and the inter mode are distinguished through a difference value between the current frame's macro block and the lowest mean value found in the search area while estimating the motion vector, the lowest mean value being obtained by calculating difference values between a macro block of the current frame and macro blocks of the previous frame.
  • Since inter-coding obtains motion vectors by performing a motion estimation for each macro block and, in addition, performs DCT and quantization processes to obtain information about whether motions happened or not in each macro block, inter-coding requires complex operations. Accordingly, it is difficult to embody a moving picture encoding in a mobile communication terminal in a software manner. Further, since Huffman coding is used as the entropy coding, it is not suitable for a real-time service.
  • the object of the present invention is to embody a moving picture encoding function suitable for a mobile communication terminal which is an application field requiring a real-time service.
  • the present invention applies a Golomb-Rice coding having a low complexity and a fast processing speed as an entropy coding manner when encoding a moving picture, and separately encodes each of Y component and U and V components of a moving picture compression object signal according to their degrees of importance.
  • the other object of the present invention is to minimize the number of operations, as a decoding process is not required when performing an encoding, by executing a predictive-frame coding using an original image of a previous intra frame when performing an inter-coding for moving pictures.
  • Another object of the present invention is to reduce operations a lot through removing unnecessary processes such as a process for obtaining motion vector when inter- coding moving pictures, by performing a motion estimation in a block unit when performing an inter-coding of moving pictures, estimating whether motions happened using pixel values of a block boundary, and performing an estimation of an intra mode and an inter mode at the same time.
  • Still another object of the present invention is to save a development cost for addition of a hardware chip allowing encoding of moving pictures suitable for a mobile communication terminal, and to more efficiently provide a moving picture service for mobile communication terminals, which are rapidly being introduced, by embodying the functions of encoding moving pictures in a software manner capable of quickly coping with changing conditions, contrary to a hardware chip.
  • a system of encoding moving pictures for mobile communication terminals having a video signal transforming unit for transforming an inputted RGB image into a moving picture compression object signal having Y, U and V components
  • a video signal transforming unit for transforming an inputted RGB image into a moving picture compression object signal having Y, U and V components
  • an intra-coding section for encoding the U and V components of the moving picture compression object signal by calculating an average value in a 4*4 pixel unit and encoding the average value, and for encoding the Y component of the moving picture compression object signal by discrete cosine transforming the Y component in a 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding; and an inter-coding section for predictive frame coding the moving picture compression object signal using an original image of a previous intra frame, for estimating whether motions happened in a 8*8 pixel block unit using pixel values of a block boundary, and for performing an estimation of an intra mode and an inter mode at the same time
  • the inter-coding section comprises: a motion estimation unit for estimating whether motions happened in each 8*8 pixel block unit, using an original image of a previous intra frame, for the moving picture compression object signal outputted by the video signal transforming unit; a texture map information storing unit for storing texture map information indicating whether motions happened or not in each block; a motion block storing unit for storing a motion block where motions happened, the motion block being outputted by the motion estimation unit; a non-motion block storing unit for storing a non-motion block where no motion happened, the non-motion block being outputted by the motion estimation unit; a discrete cosine transform (DCT) unit for obtaining a DCT coefficient by discrete cosine transforming the Y component of the motion block in a 8*8 pixel block unit, the motion block being stored in the motion block storing unit; a quantization unit for producing a quantization coefficient by performing a quantization process corresponding to a quantization width and each frequency component for the obtained DCT coefficient; and a Golomb-Rice coding unit for encoding and then outputting the produced quantization coefficient through a Golomb-Rice coding.
  • the motion estimation unit estimates whether motions happened through the difference between the boundary values of a current frame's block and the boundary values of a previous frame's block, outputs respectively the block where motions happened and the block where no motion happened, and outputs information about whether motions happened or not in each block as texture map information.
  • the motion estimation unit performs an estimation of an intra mode and an inter mode, based on a value obtained by squaring a difference value between a block of a current frame and a block of a previous frame, the difference value being a resultant value obtained when estimating whether motions happened.
  • a method of encoding moving pictures for mobile communication terminals performing an intra-coding and an inter-coding by transforming an inputted RGB image into a moving picture compression object signal, comprising: the intra-coding process that encodes the U and V components of the moving picture compression object signal by calculating an average value in a 4*4 pixel unit and encoding the average value, and encodes the Y component of the moving picture compression object signal by discrete cosine transforming the Y component in a 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding; and the inter-coding process that estimates whether motions happened in a 8*8 pixel block unit for the moving picture compression object signal, classifies the blocks into a motion block and a non-motion block, stores the motion block and the non-motion block separately, encodes and outputs information about whether motions happened or not as texture map information, and performs an estimation of an intra mode and an inter mode for the motion block at the same time as estimating whether motions happened.
  • the inter-coding process comprises: estimating whether motions happened through the difference between the boundary values of a current frame's block and the boundary values of a previous frame's block; outputting and storing non-motion blocks where no motion happened and motion blocks where motions happened in each storing unit according to a result of the estimation of whether motions happened, and then encoding and outputting texture map information indicating whether motions happened or not in a corresponding block; and estimating whether to encode the motion block where motions happened in an intra mode or inter mode at the same time when performing the estimation of whether motions happened or not.
  • the inter-coding process encodes motion blocks, estimated as the intra mode, in the same manner as the intra-coding and encodes motion blocks, estimated as the inter mode, by encoding different components between an original image of a current frame and a previous image.
  • the inter-coding process encodes the U and V components of the motion block by calculating an average value in a 4*4 pixel unit and encoding the average value, and encodes the Y components of the motion block by discrete cosine transforming the Y component in a 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding.
  • Figure 1 is a block diagram illustrating an apparatus of coding moving pictures according to the related art MPEG; and Figure 2 is a block diagram functionally illustrating structure of a moving picture encoding system for a mobile communication terminal according to an embodiment of the present invention.
  • the system according to an embodiment of the present invention comprises a video signal transforming unit 10 that transforms an inputted RGB image into a moving picture compression object signal (YUV420) having a MPEG-4 format, an intra-coding section 50 and an inter-coding section 60.
  • the intra-coding section 50 divides the moving picture compression object signal
  • the intra-coding section 50 comprises a DCT unit 51, a quantization unit 52 and a Golomb-Rice coding unit 53, for encoding the Y component.
  • the DCT unit 51 obtains a DCT coefficient by discrete cosine transforming the Y component in a block unit (8*8).
  • the quantization unit 52 produces a quantization coefficient by performing a quantization process of making the obtained DCT coefficient into a 8*8 quantization matrix corresponding to a quantization width and each frequency component.
  • the Golomb-Rice coding unit 53 encodes the produced quantization coefficient through the Golomb-Rice coding, and then outputs it.
  • the inter-coding section 60 performs a predictive frame coding using an original image of a previous intra frame, estimates whether motions happened in a 8*8 block unit using pixel values of a block boundary, and simultaneously performs an estimation of an intra mode and an inter mode.
  • the inter-coding section 60 calculates an average value in a 4*4 pixel unit and then encodes the average value.
  • the inter-coding section 60 discrete cosine transforms and quantizes the Y component in a 8*8 pixel block unit, and entropy codes the component with the Golomb-Rice coding.
  • the inter-coding section 60 comprises a motion estimation unit 61, a motion block storing unit 62, a non-motion block storing unit 63, a texture map information storing unit 64, a DCT unit 65, a quantization unit 66 and a Golomb-Rice coding unit 67.
  • the motion estimation unit 61 estimates whether motions happened in each 8*8 block unit, using an original image of a previous intra frame, for the moving picture compression object signal outputted by the video signal transforming unit 10.
  • the motion block storing unit 62 stores blocks where motions happened.
  • the non-motion block storing unit 63 stores blocks where no motion happened.
  • the texture map information storing unit 64 stores the texture map information outputted by the motion estimation unit 61, the texture map information indicating whether motions happened or not in each block.
  • the DCT unit 65 produces a DCT coefficient by discrete cosine transforming the Y component of the motion block stored in the motion block storing unit 62 in the 8*8 pixel block unit.
  • the quantization unit 66 produces a quantization coefficient by performing a quantization process of making the obtained DCT coefficient into a 8*8 quantization matrix corresponding to a quantization width and each frequency component.
  • the Golomb-Rice coding unit 67 encodes and outputs the produced quantization coefficient through the Golomb-Rice coding.
  • the inter-coding section calculates an average value in a 4*4 pixel unit, and encodes and outputs the average value, for the U and V components of the motion block stored in the motion block storing unit 62.
  • a RGB image inputted through an inputting device such as a camera is transformed into a moving picture compression object signal (YUV420) having a MPEG-4 format and then inputted into the intra-coding section 50 by the video signal transforming unit 10.
  • the Y component constituting the moving picture compression object signal indicates the lightness and darkness (black and white) of an image
  • the U and V components carry the color information of an image as color-difference components.
  • the Y component plays a more important role in constituting an image than the U and V components.
  • Each of the Y, U and V components is held in its own storing space. Since the Y component, and the U and V components, have different degrees of importance in constituting an image during encoding, they are processed respectively according to their degrees of importance.
  • the intra-coding section 50 calculates an average value in a 4*4 pixel unit and then encodes the average value, for the U and V components of the moving picture compression object signal (YUV420) inputted from the video signal transforming unit 10.
  • the intra-coding section 50 obtains a DCT coefficient by discrete cosine transforming the Y component and then produces a quantization coefficient by performing a quantization process of making the obtained DCT coefficient into a 8*8 quantization matrix corresponding to a quantization width and each frequency component. Then, the intra-coding section 50 encodes the produced quantization coefficient through the Golomb-Rice coding and then outputs the quantization coefficient. Accordingly, the complexity of encoding becomes lower than with Huffman coding, the related art entropy coding, and real-time encoding becomes possible.
  • the inter-coding section 60 encodes the moving picture compression object signal inputted from the video signal transforming unit 10 using a time relation between a current frame and a previous frame.
  • the motion estimation unit 61 of the inter-coding section 60 divides the moving picture compression object signal into block units (8*8) and classifies the block units into blocks where motions happened (motion block) and blocks where no motion happened (non-motion block). At this time, it stores information about whether motions happened or not in each block as texture map information. In other words, the motion estimation unit 61 estimates whether motions happened in a search area, i.e., a block unit, using an original image of a previous intra frame. At this time, it estimates whether motions happened through the difference between the boundary values of a current frame's block and the boundary values of a previous frame's block.
  • When it is estimated that a corresponding block is a non-motion block where no motion happened, the motion estimation unit 61 outputs the non-motion block, stores the non-motion block in the non-motion block storing unit 63, and stores information that no motion happened in the block in the texture map information storing unit 64. However, when it is estimated that a corresponding block is a motion block where motions happened, the motion estimation unit 61 outputs the motion block, stores the motion block in the motion block storing unit 62, and stores information that motions happened in the block in the texture map information storing unit 64.
  • a decoding section (not illustrated) is able to know the information about whether motions happened in each block. Accordingly, no encoding or subsequent processing is required for the non-motion block.
  • the decoding section processes a non-motion block with the texture map information by copying blocks in the previous image.
  • the motion estimation unit 61 estimates whether to encode the motion block where motions happened in the intra mode or in the inter mode at the same time when it performs the estimation of whether motions happened or not.
  • blocks can be estimated as a motion block where motions happened, in case that there is movement of objects, change of light or a noise from a moving picture inputting device (for example, a camera).
  • motion blocks where there is a big movement such as the movement of objects are estimated to be encoded in the intra mode.
  • Motion blocks where there is a small change in images such as minute movements due to lights or noise are estimated to be encoded in the inter mode.
  • the corresponding intra mode encoding or inter mode encoding information is encoded and stored for the decoding section to be aware of it.
  • in the case of the motion block estimated in the intra mode, block values of an original image are encoded in the same manner as the intra frame coding.
  • in the case of the motion block estimated in the inter mode, the different components between an original image and a previous image are encoded.
  • a corresponding encoding process is performed for the motion block.
  • an average value is calculated in a 4*4 pixel unit and then encoded for the U and V components of the motion block.
  • the Y component is discrete cosine transformed and quantized in a 8*8 pixel unit. Then, the Y component is entropy coded with the Golomb-Rice coding.
  • a Golomb-Rice coding having a low complexity and a fast processing speed is applied as an entropy coding method, instead of a Huffman coding.
  • Y component, and U and V components of a moving picture compression object signal are individually encoded according to their degrees of importance. Accordingly, since it is possible to considerably reduce an encoding time of an entire system without a decrease of a compression performance or definition, compared to the related art, a moving picture encoding function suitable for a mobile communication terminal which is an application requiring a real-time service can be embodied.
  • a predictive-frame coding is performed using an original image of a previous intra frame. Accordingly, since it is not required to use an image restored from compressed previous image as the related art, the number of operations can be minimized. For example, a decoding process is not required when performing an encoding.
  • an estimation of whether motions happened is performed in a block unit when performing an inter-coding for moving pictures. At this time, only whether motions happened is estimated, using pixel values of a block boundary, without obtaining a motion vector as the related art does.
  • Since the process for obtaining a motion vector, and the discrete cosine transformation and quantization processes required for obtaining information about whether motions happened, are omitted, many operations can be saved.
  • a motion estimation and an estimation of an intra mode and an inter mode are performed at the same time, rather than performing an estimation of the intra mode and the inter mode after obtaining a motion vector, as with the related art. Accordingly, it is possible to reduce the operations of the previously described processes, compared to the related art apparatus of encoding moving pictures.
  • the above-mentioned moving picture encoding functions are embodied in a software manner. Accordingly, it is possible to reduce a burden on additional expenses for developing a mobile communication terminal, which expenses occur due to an addition of a hardware chip for encoding moving pictures.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Geometry (AREA)
  • Discrete Mathematics (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The present invention relates to systems and methods of encoding moving pictures for mobile communication terminals. The present invention includes: an intra-coding process that encodes the U and V components of the moving picture compression object signal by calculating an average value in a 4*4 pixel unit, and encodes the Y component of the moving picture compression object signal by discrete cosine transforming, quantizing and Golomb-Rice coding the Y component in a 8*8 pixel block unit; and an inter-coding process that estimates whether motions happened in a 8*8 pixel block unit, classifies the blocks into a motion block and a non-motion block, stores the motion block and the non-motion block separately, encodes and outputs information about whether motions happened as texture map information, and performs an estimation of an intra mode and an inter mode for the motion block at the same time.

Description

SYSTEMS AND METHODS OF ENCODING MOVING PICTURES FOR MOBILE COMMUNICATION TERMINALS
TECHNICAL FIELD The present invention relates to moving picture encoding. More particularly, the present invention relates to systems and methods of encoding moving pictures for mobile communication terminals, which are capable of minimizing the processes and the number of operations when encoding moving pictures and are adaptable to mobile communication terminals by being embodied in a software manner.
BACKGROUND ART Recently, technologies for processing moving pictures have been used in diverse fields. A Video On Demand (VOD) service, such as a movie service on the Internet, is a representative example of these technologies. International standards for processing moving pictures in such diverse fields have been defined. Typical international standards include Moving Picture Experts Group (MPEG)-1, used as a compression method for video Compact Disk (CD) storage; MPEG-2, applied and used as a compression method for high definition digital TV broadcasting and the Digital Video Disk (DVD) service; and MPEG-4, which makes the compression coder suitable for various contents and is used in moving picture compression solutions for wireless environments such as Internet broadcasting and International Mobile Telecommunication-2000 (IMT-2000). In addition, there are H.261, which was developed for video conferencing, has a performance similar to that of MPEG-1 and is mainly used in ISDN networks; H.263, which was developed for videophones and provides a base for MPEG-4; and H.26L (H.264), the most recently standardized compression method for picture phones, moving picture-supporting cellular phones, TV, etc., which can achieve roughly twice the compression performance of MPEG-4. Meanwhile, the above-mentioned moving picture-processing technologies are applied in various fields, and the personal computer (PC) is the field most commonly encountered by the public. However, the PC suffers from a mobility restraint. Recently, technologies for processing moving pictures in mobile communication terminals such as Personal Digital Assistants (PDAs) and cellular phones, which provide various multimedia services through an Internet connection and a mounted camera and thereby solve the mobility restraint of the PC, have gradually spread. However, since a mobile communication terminal has lower CPU performance, smaller memory space and a limited battery capacity for supplying driving power compared to the PC, it faces many restraints when moving picture-processing technologies requiring many operations are applied to it. Accordingly, the moving picture-processing technologies have not been embodied in a software manner as in the PC; instead, the moving picture-compressing technologies have been provided by a hardware chip. Recently, as the performance of the CPUs used in mobile communication terminals has improved (for example, the commercialization of the MSM6000 from Qualcomm), there have been attempts to apply the moving picture-processing technology to the mobile communication terminal in a software manner. However, since the related art moving picture-processing technology requires many operations to encode moving pictures, compared to decoding them, it cannot actually be applied to the mobile communication terminal in a software manner. Accordingly, in order to apply the moving picture-processing technology to the mobile communication terminal in a software manner, it is required to develop a moving picture compression-encoding technology capable of reducing the number of operations to match the CPU performance of a mobile communication terminal.
For example, as illustrated in Figure 1, a moving picture encoding apparatus according to the related art using MPEG comprises a video signal transforming unit 10 which transforms an inputted RGB image into a moving picture compression object signal (YUV420 or YUV422), an intra-coding section 20 which performs a compression using only the frame itself, exploiting the spatial correlation within a frame, and an inter-coding section 30 which performs a compression using the time relation between a current frame and a previous frame. The system encodes a moving picture using a discrete cosine transform (DCT) method. The intra-coding section 20 comprises a discrete cosine transform (DCT) unit 21, a quantization unit 22 and a Huffman coding unit 23. For the Y component of the moving picture compression object signal, the DCT unit 21 divides a macro block (16*16 pixels) into four block units (8*8 pixels) and performs a DCT for the block units (8*8 pixels). Since the U and V components of the moving picture compression object signal have a data size corresponding to 1/4 of the Y component, the DCT unit 21 performs the DCT for the U and V components in a macro block. After the DCT, the DCT unit performs a quantization and then a Huffman coding as the entropy coding method. The inter-coding section 30 applies the Huffman coding method as its entropy coding, as with the intra-coding section 20, and performs an encoding using the time relation between a current frame and a previous frame for predictive frame coding (i.e., P frame coding). At this time, in the case of the image for the previous frame, since the comparison is made against an image restored from the compressed image, a decoding is also performed when performing the encoding. In addition, a motion estimation unit 31 in the inter-coding section 30 divides the moving picture compression object signal (YUV420, YUV422), which is a signal obtained by transforming the RGB (Red Green Blue) image outputted from the video signal transforming unit 10 into a MPEG format, into macro blocks of 16*16 pixels and estimates whether motions happened in each macro block, i.e., search area. Then, it obtains a motion vector (MV) through the estimation of motions and estimates information about conditions where the motion is out of the search area or where the motion cannot be expressed by a motion vector. At this time, the motion vector is the location information having the greatest similarity between a current frame image and a previous frame image. By providing such a motion vector, it is possible to perform a compression with a higher compression performance and quality. However, the motion estimation process performed by the motion estimation unit 31 occupies most of the operations when encoding moving pictures. Accordingly, in the case of a mobile communication terminal, which requires a low complexity due to the restraint of its CPU performance, it is difficult to embody a function of encoding moving pictures in a software manner because of the motion estimation process. In addition, according to the related art moving picture encoding apparatus, in the inter-coding method, each macro block is classified into a block where no motion happened and a block where motions happened, and the block where motions happened is classified into an intra mode and an inter mode depending on its encoding method. The intra mode is a mode of processing the macro block in the same manner as the intra coding, and the inter mode is a mode of encoding the macro block using a difference value between a current frame and a previous frame.
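To make the related-art intra path above concrete, the following is a minimal NumPy sketch of the 16*16-macroblock split, 8*8 DCT and quantization steps it describes. The helper names and the flat quantization step `qstep` are illustrative assumptions; real MPEG encoders use per-frequency quantization matrices and follow this with Huffman coding.

```python
import numpy as np
from scipy.fftpack import dct  # type-II DCT, the transform used in MPEG-style intra coding

def dct_8x8(block):
    """2-D type-II DCT of one 8*8 pixel block (orthonormal variant)."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def quantize(coeffs, qstep=16):
    """Uniform quantization; real MPEG encoders apply an 8*8 matrix per frequency."""
    return np.round(coeffs / qstep).astype(np.int32)

def intra_code_macroblock(y_macroblock, qstep=16):
    """Split a 16*16 luma macroblock into four 8*8 blocks, then DCT and quantize each.
    The related art entropy-codes the result with Huffman coding."""
    blocks = []
    for r in (0, 8):
        for c in (0, 8):
            block = y_macroblock[r:r + 8, c:c + 8].astype(np.float64)
            blocks.append(quantize(dct_8x8(block), qstep))
    return blocks

# usage: one random 16*16 macroblock -> four 8*8 quantized coefficient blocks
coded = intra_code_macroblock(np.random.randint(0, 256, (16, 16)))
```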
The intra mode and the inter mode are distinguished through a difference value between the current frame's macro block and the lowest mean value found in the search area while estimating the motion vector, the lowest mean value being obtained by calculating difference values between a macro block of the current frame and macro blocks of the previous frame. As described above, according to the related art moving picture encoding apparatus using MPEG, since a previous image restored from the compressed image is used when performing the inter-coding, a decoding process is also required when encoding moving pictures. Since inter-coding obtains motion vectors by performing a motion estimation for each macro block and, in addition, performs DCT and quantization processes to obtain information about whether motions happened or not in each macro block, inter-coding requires complex operations. Accordingly, it is difficult to embody a moving picture encoding in a mobile communication terminal in a software manner. Further, since Huffman coding is used as the entropy coding, it is not suitable for a real-time service.
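The complexity argument above hinges on the per-macroblock motion search. The brute-force sum-of-absolute-differences (SAD) search below is a generic illustration of that cost, not the patent's algorithm; the ±8 search range is an assumed parameter.

```python
import numpy as np

def full_search_motion_vector(cur, ref, mb_row, mb_col, search=8):
    """Brute-force search for the 16*16 block of `ref` most similar to the
    current macroblock, scored by sum of absolute differences (SAD)."""
    mb = cur[mb_row:mb_row + 16, mb_col:mb_col + 16].astype(np.int32)
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            r, c = mb_row + dy, mb_col + dx
            if r < 0 or c < 0 or r + 16 > ref.shape[0] or c + 16 > ref.shape[1]:
                continue  # candidate falls outside the reference frame
            sad = np.abs(mb - ref[r:r + 16, c:c + 16].astype(np.int32)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad  # (motion vector, lowest difference used for the mode decision)
```

Even this modest ±8 range evaluates 289 candidate positions of 256 pixels each for every macroblock, which is the kind of operation count the invention seeks to avoid.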
DISCLOSURE OF INVENTION Accordingly, the present invention has been made to solve the above-mentioned problems occurring in the related art. The object of the present invention is to embody a moving picture encoding function suitable for a mobile communication terminal, an application field requiring a real-time service. To make it possible to considerably reduce the encoding time of the entire system without a decrease of compression performance or definition compared to the related art, the present invention applies Golomb-Rice coding, which has a low complexity and a fast processing speed, as the entropy coding method when encoding a moving picture, and separately encodes the Y component and the U and V components of a moving picture compression object signal according to their degrees of importance. Another object of the present invention is to minimize the number of operations, as a decoding process is not required when performing an encoding, by executing a predictive-frame coding using an original image of a previous intra frame when performing an inter-coding for moving pictures. Another object of the present invention is to greatly reduce the number of operations by removing unnecessary processes, such as the process of obtaining a motion vector, when inter-coding moving pictures: a motion estimation is performed in a block unit when performing the inter-coding, whether motions happened is estimated using the pixel values of a block boundary, and the estimation of an intra mode and an inter mode is performed at the same time. Still another object of the present invention is to save the development cost of adding a hardware chip for encoding moving pictures suitable for a mobile communication terminal, and to more efficiently provide a moving picture service for mobile communication terminals, which are rapidly being introduced, by embodying the functions of encoding moving pictures in a software manner capable of quickly coping with changing conditions, contrary to a hardware chip.
In order to accomplish the objects, there is provided a system of encoding moving pictures for mobile communication terminals, having a video signal transforming unit for transforming an inputted RGB image into a moving picture compression object signal having Y, U and V components, comprising: an intra-coding section for encoding the U and V components of the moving picture compression object signal by calculating an average value in a 4*4 pixel unit and encoding the average value, and for encoding the Y component of the moving picture compression object signal by discrete cosine transforming the Y component in a 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding; and an inter-coding section for predictive frame coding the moving picture compression object signal using an original image of a previous intra frame, for estimating whether motions happened in a 8*8 pixel block unit using pixel values of a block boundary, for performing an estimation of an intra mode and an inter mode at the same time, for encoding the U and V components of a motion block where motions happened by calculating an average value in a 4*4 pixel unit, and for encoding the Y component of the motion block by discrete cosine transforming the Y component in a 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding. Preferably, the inter-coding section comprises: a motion estimation unit for estimating whether motions happened in each 8*8 pixel block unit, using an original image of a previous intra frame, for the moving picture compression object signal outputted by the video signal transforming unit; a texture map information storing unit for storing texture map information indicating whether motions happened or not in each block; a motion block storing unit for storing a motion block where motions happened, the motion block being outputted by the motion estimation unit; a non-motion block storing unit for storing a non-motion block where no motion happened, the non-motion block being outputted by the motion estimation unit; a discrete cosine transform (DCT) unit for obtaining a DCT coefficient by discrete cosine transforming the Y component of the motion block in a 8*8 pixel block unit, the motion block being stored in the motion block storing unit; a quantization unit for producing a quantization coefficient by performing a quantization process corresponding to a quantization width and each frequency component for the obtained DCT coefficient; and a Golomb-Rice coding unit for encoding and then outputting the produced quantization coefficient through a Golomb-Rice coding. Preferably, the motion estimation unit estimates whether motions happened through the difference between the boundary values of a current frame's block and the boundary values of a previous frame's block, outputs respectively the block where motions happened and the block where no motion happened, and outputs information about whether motions happened or not in each block as texture map information.
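The claims above name Golomb-Rice coding as the entropy coder but do not spell out the exact variant. The sketch below is a standard Rice code with a zig-zag sign mapping; the parameter `k` and the mapping are assumptions made for illustration only.

```python
def golomb_rice_encode(value, k=2):
    """Standard Rice code: unary-coded quotient + k-bit binary remainder.
    Signed quantization coefficients are first mapped to non-negative integers
    with the usual zig-zag mapping (0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...)."""
    n = (value << 1) if value >= 0 else ((-value << 1) - 1)  # zig-zag sign mapping
    quotient, remainder = n >> k, n & ((1 << k) - 1)
    return '1' * quotient + '0' + format(remainder, '0{}b'.format(k))

def encode_block(quant_coeffs, k=2):
    """Entropy-code a sequence of quantization coefficients as one bit string."""
    return ''.join(golomb_rice_encode(int(v), k) for v in quant_coeffs)

# e.g. golomb_rice_encode(3, k=2) maps 3 -> 6, giving '10' (quotient 1) + '10' = '1010'
```

Compared with Huffman coding, no code table has to be built or stored, which is the low-complexity property the text attributes to Golomb-Rice coding.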
Preferably, the motion estimation unit performs the estimation of the intra mode and the inter mode based on a value obtained by squaring the difference value between a block of the current frame and a block of the previous frame, the difference value being a resultant value obtained when estimating whether motions happened. Alternatively, there is provided a method of encoding moving pictures for mobile communication terminals, performing an intra-coding and an inter-coding by transforming an inputted RGB image into a moving picture compression object signal, comprising: an intra-coding process that encodes the U and V components of the moving picture compression object signal by calculating an average value in a 4*4 pixel unit and encoding the average value, and encodes the Y component of the moving picture compression object signal by discrete cosine transforming the Y component in a 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding; and an inter-coding process that estimates whether motions happened in a 8*8 pixel block unit for the moving picture compression object signal, classifies the blocks into a motion block and a non-motion block, stores the motion block and the non-motion block separately, encodes and outputs information about whether motions happened or not as texture map information, and performs an estimation of an intra mode and an inter mode for the motion block at the same time as estimating whether motions happened. Preferably, the inter-coding process comprises: estimating whether motions happened through the difference between the boundary values of a current frame's block and the boundary values of a previous frame's block; outputting and storing the non-motion blocks where no motion happened and the motion blocks where motions happened in the respective storing units according to the result of the estimation of whether motions happened, and then encoding and outputting texture map information indicating whether motions happened or not in the corresponding block; and estimating whether to encode the motion block where motions happened in the intra mode or the inter mode at the same time as performing the estimation of whether motions happened or not. Preferably, the inter-coding process encodes motion blocks estimated as the intra mode in the same manner as the intra-coding, and encodes motion blocks estimated as the inter mode by encoding the different components between an original image of the current frame and a previous image. Preferably, the inter-coding process encodes the U and V components of the motion block by calculating an average value in a 4*4 pixel unit and encoding the average value, and encodes the Y component of the motion block by discrete cosine transforming the Y component in a 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding. BRIEF DESCRIPTION OF THE DRAWINGS Figure 1 is a block diagram illustrating an apparatus of coding moving pictures according to the related art MPEG; and Figure 2 is a block diagram functionally illustrating the structure of a moving picture encoding system for a mobile communication terminal according to an embodiment of the present invention.
**Description of the codes at important parts of the diagrams** 10: Video signal transforming unit 50: Intra-coding section 51, 65: DCT unit 52, 66: Quantization unit 53, 67: Golomb-Rice coding unit 60: Inter-coding section 61: Motion estimation unit 62: Motion block storing unit 63: Non-motion block storing unit 64: Texture map information storing unit
BEST MODE FOR CARRYING OUT THE INVENTION Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings. A system of encoding moving pictures according to the present invention is embodied in a software manner, rather than a hardware chip. As illustrated in Figure 2, the system according to an embodiment of the present invention comprises a video signal transforming unit 10 that transforms an inputted RGB image into a moving picture compression object signal (YUV420) having a MPEG-4 format, an intra-coding section 50 and an inter-coding section 60. The intra-coding section 50 divides the moving picture compression object signal
(YUV420) outputted by the video signal transforming unit 10 into U and V components, and the Y component. Then, it calculates an average value in a 4*4 pixel unit and then encodes the average value, for the U and V components. In addition, it discrete cosine transforms and quantizes the Y component in a 8*8 pixel block unit, and entropy codes the transformed and quantized Y component with a Golomb-Rice coding. According to the present invention, the intra-coding section 50 comprises a DCT unit 51, a quantization unit 52 and a Golomb-Rice coding unit 53, for encoding the Y component. The DCT unit 51 obtains a DCT coefficient by discrete cosine transforming the Y component in a block unit (8*8). The quantization unit 52 produces a quantization coefficient by performing a quantization process of making the obtained DCT coefficient into a 8*8 quantization matrix corresponding to a quantization width and each frequency component. The Golomb-Rice coding unit 53 encodes the produced quantization coefficient through the Golomb-Rice coding, and then outputs it. The inter-coding section 60 performs a predictive frame coding using an original image of a previous intra frame, estimates whether motions happened in a 8*8 block unit using pixel values of a block boundary, and simultaneously performs an estimation of an intra mode and an inter mode. After that, for the U and V components of a motion block where motions happened, the inter-coding section 60 calculates an average value in a 4*4 pixel unit and then encodes the average value. For the Y component of a motion block where motions happened, the inter-coding section 60 discrete cosine transforms and quantizes the Y component in a 8*8 pixel block unit, and entropy codes the component with the Golomb-Rice coding. Specifically, as illustrated in Figure 2, the inter-coding section 60 comprises a motion estimation unit 61, a motion block storing unit 62, a non-motion block storing unit 63, a texture map information storing unit 64, a DCT unit 65, a quantization unit 66 and a Golomb-Rice coding unit 67. The motion estimation unit 61 estimates whether motions happened in each 8*8 block unit, using an original image of a previous intra frame, for the moving picture compression object signal outputted by the video signal transforming unit 10. At this time, it estimates whether motions happened through the difference between the boundary values of a current frame's block and the boundary values of a previous frame's block, outputs the block where motions happened and the block where no motion happened respectively depending on the results of the estimation, and outputs information about whether motions happened in each block as texture map information. In addition, it performs an estimation of an intra mode and an inter mode, based on a value obtained by squaring the difference value between a block of the current frame and a block of the previous frame, the difference value being a resultant value obtained when estimating whether motions happened. The motion block storing unit 62 stores blocks where motions happened. The non-motion block storing unit 63 stores blocks where no motion happened. The texture map information storing unit 64 stores the texture map information outputted by the motion estimation unit 61, the texture map information indicating whether motions happened or not in each block.
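As a small illustration of the chroma path of the intra-coding section 50, the sketch below computes the 4*4 block averages of a U or V plane. The plane dimensions and the rounding are assumptions; the Y plane would instead pass through the 8*8 DCT, quantization and Golomb-Rice steps sketched earlier.

```python
import numpy as np

def average_4x4(plane):
    """Intra path for a U or V plane: each 4*4 block is represented by its
    rounded average value (the Y plane instead goes through the 8*8 DCT,
    quantization and Golomb-Rice coding steps sketched earlier)."""
    h, w = plane.shape
    assert h % 4 == 0 and w % 4 == 0, "YUV420 chroma planes assumed to be 4-aligned"
    return plane.reshape(h // 4, 4, w // 4, 4).mean(axis=(1, 3)).round().astype(np.uint8)

# usage: a QCIF (176*144) frame has 88*72 chroma planes -> 22*18 averages each
u_plane = np.random.randint(0, 256, (72, 88))
u_coded = average_4x4(u_plane)   # shape (18, 22)
```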
The DCT unit 65 produces a DCT coefficient by discrete cosine transforming the Y component of the motion block stored in the motion block storing unit 62 in the 8*8 pixel block unit. The quantization unit 66 produces a quantization coefficient by quantizing the obtained DCT coefficient with an 8*8 quantization matrix corresponding to a quantization width and each frequency component. The Golomb-Rice coding unit 67 encodes and outputs the produced quantization coefficient through Golomb-Rice coding. For the U and V components of the motion block stored in the motion block storing unit 62, the inter-coding section calculates an average value in a 4*4 pixel unit and encodes and outputs the average value.

Hereinafter, operations of the moving picture encoding system having the above-mentioned structure will be described.

First, an RGB image inputted through an inputting device (not illustrated), such as a camera, is transformed by the video signal transforming unit 10 into a moving picture compression object signal (YUV420) having an MPEG-4 format and then inputted into the intra-coding section 50. The Y component of the moving picture compression object signal indicates the luminance (black and white) of an image, and the U and V components carry the color information of the image as color difference components. The Y component plays a more important role in constituting an image than the U and V components. Each of the Y, U and V components is held in its own storing space. Since the Y component and the U and V components have different degrees of importance in constituting an image, they are processed separately during encoding according to those degrees of importance.

In other words, the intra-coding section 50 calculates an average value in a 4*4 pixel unit and encodes the average value for the U and V components of the moving picture compression object signal (YUV420) inputted from the video signal transforming unit 10. For the Y component, the intra-coding section 50 obtains a DCT coefficient by discrete cosine transforming the Y component, produces a quantization coefficient by quantizing the obtained DCT coefficient with an 8*8 quantization matrix corresponding to a quantization width and each frequency component, encodes the produced quantization coefficient through Golomb-Rice coding and outputs it. Accordingly, the encoding complexity becomes lower than that of Huffman coding, the related art entropy coding, and real-time encoding becomes possible.

In addition, since the U and V components of the moving picture compression object signal are less important than the Y component when expressing an image, a compression performance and a definition similar to those obtained through a DCT in an 8*8 pixel unit can be achieved simply by encoding the average value of each 4*4 pixel unit. Accordingly, the time required for encoding the U and V components can be considerably reduced compared to the related art intra-coding, without a decrease of compression performance or definition.

Meanwhile, the inter-coding section 60 encodes the moving picture compression object signal inputted from the video signal transforming unit 10 using the temporal relation between a current frame and a previous frame.
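A minimal sketch of the two intra-coding paths described above: 4*4 averaging for the U and V components, and an 8*8 DCT with matrix quantization for the Y component. The orthonormal DCT, the level shift of 128 and the flat quantization matrix derived from a single quantization width are illustrative assumptions; the excerpt does not give the matrix values.

```python
import numpy as np

def average_chroma_4x4(plane: np.ndarray) -> np.ndarray:
    """Encode a U or V plane as one average value per 4*4 pixel block.
    Plane dimensions are assumed to be multiples of 4."""
    h, w = plane.shape
    blocks = plane.reshape(h // 4, 4, w // 4, 4)
    return np.rint(blocks.mean(axis=(1, 3))).astype(np.uint8)

def dct_basis(n: int = 8) -> np.ndarray:
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    k = np.arange(n)
    basis = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    basis[0, :] /= np.sqrt(2.0)
    return basis * np.sqrt(2.0 / n)

def encode_luma_block(block: np.ndarray, quant: np.ndarray) -> np.ndarray:
    """8*8 DCT of one luma block followed by quantization against an
    8*8 matrix of step sizes (one step per frequency component)."""
    c = dct_basis(8)
    # 2-D DCT; the level shift of 128 is an assumption, not from the patent.
    coeffs = c @ (block.astype(np.float64) - 128.0) @ c.T
    return np.rint(coeffs / quant).astype(np.int32)

# Hypothetical flat quantization matrix built from a quantization width of 16.
quant_matrix = np.full((8, 8), 16.0)
y_block = np.random.randint(0, 256, size=(8, 8))
q_coeffs = encode_luma_block(y_block, quant_matrix)   # fed to Golomb-Rice coding
u_plane = np.random.randint(0, 256, size=(72, 88), dtype=np.uint8)
u_coded = average_chroma_4x4(u_plane)                 # shape (18, 22)
```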
First, the motion estimation unit 61 of the inter-coding section 60 divides the moving picture compression object signal into block units (8*8) and classifies the blocks into blocks where motions happened (motion blocks) and blocks where no motion happened (non-motion blocks). At this time, it stores the information about whether or not motions happened in each block as texture map information.

In other words, the motion estimation unit 61 estimates, in a block unit, whether motions happened, using an original image of a previous intra frame. It estimates whether motions happened from the difference between the boundary values of the current frame's block and the boundary values of the previous frame's block. When a corresponding block is estimated to be a non-motion block where no motion happened, the motion estimation unit 61 outputs the non-motion block, stores it in the non-motion block storing unit 63, and stores the information that no motion happened in the block in the texture map information storing unit 64. When a corresponding block is estimated to be a motion block where motions happened, the motion estimation unit 61 outputs the motion block, stores it in the motion block storing unit 62, and stores the information that motions happened in the block in the texture map information storing unit 64.

Since the information about whether or not motions happened, which is stored in the texture map information storing unit 64, is encoded and outputted, a decoding section (not illustrated) is able to know whether motions happened in each block. Accordingly, no encoding process or subsequent processing is required for a non-motion block. In other words, the decoding section processes a non-motion block by using the texture map information to copy the corresponding block of the previous image.

In addition, the motion estimation unit 61 estimates whether to encode a motion block in the intra mode or in the inter mode at the same time as it estimates whether motions happened. That is, a block can be estimated as a motion block when there is movement of objects, a change of lighting, or noise from a moving picture inputting device (for example, a camera). Among these cases, motion blocks with a large change, such as the movement of objects, are estimated to be encoded in the intra mode, whereas motion blocks with a small change in the image, such as minute variations due to lighting or noise, are estimated to be encoded in the inter mode. The corresponding intra mode or inter mode information is encoded and stored so that the decoding section is aware of it. For a motion block estimated in the intra mode, the block values of the original image are encoded in the same manner as the intra frame coding. For a motion block estimated in the inter mode, the difference components between the original image and the previous image are encoded.

After the estimation of whether or not motions happened and the estimation of the intra/inter mode for the motion block, the corresponding encoding process is performed for the motion block. At this time, an average value is calculated in a 4*4 pixel unit and encoded for the U and V components of the motion block, while the Y component is discrete cosine transformed and quantized in an 8*8 pixel block unit and then entropy coded with Golomb-Rice coding.
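A minimal sketch of the block classification described above: motion is detected only from the boundary pixels of each 8*8 block, so no motion vector is computed, and the intra/inter decision then uses the squared block difference. The exact boundary metric and both thresholds are hypothetical; the excerpt gives no numerical values.

```python
import numpy as np

def boundary_pixels(block: np.ndarray) -> np.ndarray:
    """Return the 28 pixels on the boundary of an 8*8 block."""
    return np.concatenate([block[0, :], block[-1, :],
                           block[1:-1, 0], block[1:-1, -1]])

def classify_block(cur: np.ndarray, prev: np.ndarray,
                   motion_thr: float = 64.0, intra_thr: float = 4096.0) -> str:
    """Classify one 8*8 block against the co-located block of the previous
    intra frame. Returns 'non_motion', 'inter' or 'intra'. Both thresholds
    are hypothetical; the patent does not give values."""
    # Motion test: difference of boundary pixel values only (no motion vector).
    boundary_diff = np.abs(boundary_pixels(cur).astype(np.int32)
                           - boundary_pixels(prev).astype(np.int32)).sum()
    if boundary_diff < motion_thr:
        return "non_motion"       # decoder copies the previous block
    # Mode test: squared block difference decides intra versus inter.
    sq_diff = ((cur.astype(np.int32) - prev.astype(np.int32)) ** 2).sum()
    return "intra" if sq_diff > intra_thr else "inter"

# Texture map: one flag per 8*8 block telling the decoder whether to copy.
cur_blk = np.random.randint(0, 256, size=(8, 8))
prev_blk = np.random.randint(0, 256, size=(8, 8))
mode = classify_block(cur_blk, prev_blk)
```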
While the invention has been illustrated and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
INDUSTRIAL APPLICABILITY

As described above, according to the present invention, when encoding a moving picture, Golomb-Rice coding, which has low complexity and a fast processing speed, is applied as the entropy coding method instead of Huffman coding. In addition, the Y component and the U and V components of a moving picture compression object signal are encoded individually according to their degrees of importance. Accordingly, since the encoding time of the entire system can be considerably reduced compared to the related art without a decrease of compression performance or definition, a moving picture encoding function suitable for a mobile communication terminal, an application requiring a real-time service, can be embodied.

According to the present invention, when performing inter-coding for moving pictures, predictive frame coding is performed using an original image of a previous intra frame. Accordingly, since it is not required to use an image restored from the compressed previous image, as in the related art, the number of operations can be minimized; for example, no decoding process is required during encoding.

In addition, according to the present invention, the estimation of whether motions happened is performed in a block unit when performing inter-coding for moving pictures. At this time, only whether motions happened is estimated, using pixel values of a block boundary, without obtaining a motion vector as in the related art. Accordingly, since the process for obtaining a motion vector, and the discrete cosine transformation and quantization otherwise required for obtaining the information about whether motions happened, are omitted, many operations can be saved.

Further, according to the present invention, when performing inter-coding for moving pictures, the motion estimation and the estimation of the intra mode and the inter mode are performed at the same time, rather than estimating the intra mode and the inter mode after obtaining a motion vector as in the related art. Accordingly, the operations of the previously described processes can be reduced compared to the related art apparatus of encoding moving pictures.

Additionally, according to the present invention, the above-mentioned moving picture encoding functions are implemented in software. Accordingly, the additional expense of developing a mobile communication terminal, which would otherwise arise from adding a hardware chip for encoding moving pictures, can be reduced.
In addition, unlike a hardware chip, the software implementation can be adapted simply and quickly to changing conditions, so that a moving picture service on a mobile communication terminal can be provided more effectively.

Claims

WHAT IS CLAIMED IS: 1. A system of encoding moving pictures for mobile communication terminals, having a video signal transforming unit for transforming an inputted RGB image into a moving picture compression object signal having Y, U and V components, the system comprising: an intra-coding section for encoding the U and V components of the moving picture compression object signal by calculating an average value in a 4*4 pixel unit and encoding the average value, and for encoding the Y component of the moving picture compression object signal by discrete cosine transforming the Y component in an 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding; and an inter-coding section for predictive frame coding the moving picture compression object signal using an original image of a previous intra frame, for estimating whether motions happened in an 8*8 pixel block unit using pixel values of a block boundary, for performing an estimation of an intra mode and an inter mode at the same time, for encoding the U and V components of a motion block where motions happened by calculating an average value in a 4*4 pixel unit, and for encoding the Y component of the motion block by discrete cosine transforming the Y component in an 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding.
2. The system according to claim 1, wherein the inter-coding section comprises: a motion estimation unit for estimating whether motions happened in each 8*8 pixel block unit, using an original image of a previous intra frame, for the moving picture compression object signal outputted by the video signal transforming unit; a texture map information storing unit for storing texture map information indicating whether motions happened or not in each block; a motion block storing unit for storing a motion block where motions happened, the motion block being outputted by the motion estimation unit; a non-motion block storing unit for storing a non-motion block where no motion happened, the non-motion block being outputted by the motion estimation unit; a discrete cosine transform (DCT) unit for obtaining a DCT coefficient by discrete cosine transforming the Y component of the motion block in an 8*8 pixel block unit, the motion block being stored in the motion block storing unit; a quantization unit for producing a quantization coefficient by performing a quantization process corresponding to a quantization width and each frequency component for the obtained DCT coefficient; and a Golomb-Rice coding unit for encoding and then outputting the produced quantization coefficient through Golomb-Rice coding.
3. The system according to claim 2, wherein the motion estimation unit estimates whether motions happened through a difference between a boundary value of a current frame's block and a boundary value of a previous frame's block, outputs respectively the block where motions happened and the block where no motion happened, and outputs information about whether motions happened or not in each block as texture map information.
4. The system according to claim 2, wherein the motion estimation unit performs an estimation of an intra mode and an inter mode, based on a value obtained by squaring a difference value between a block of a current frame and a block of a previous frame, the difference value being a resultant value obtained when estimating whether motions happened.
5. A method of encoding moving pictures for mobile communication terminals, performing an intra-coding and an inter-coding by transforming an inputted RGB image into a moving picture compression object signal, comprising: an intra-coding process that encodes U and V components of the moving picture compression object signal by calculating an average value in a 4*4 pixel unit and encoding the average value, and encodes a Y component of the moving picture compression object signal by discrete cosine transforming the Y component in an 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding; and an inter-coding process that estimates whether motions happened in an 8*8 pixel block unit for the moving picture compression object signal, classifies the blocks into a motion block and a non-motion block, stores the motion block and the non-motion block separately, encodes and outputs information about whether motions happened or not as texture map information, and performs an estimation of an intra mode and an inter mode for the motion block at the same time as estimating whether motions happened.
6. The method according to claim 5, wherein the inter-coding process comprises: estimating whether motions happened through a difference between a boundary value of a current frame's block and a boundary value of a previous frame's block; outputting and storing non-motion blocks where no motion happened and motion blocks where motions happened in each storing unit according to a result of the estimation of whether motions happened, and then encoding and outputting texture map information indicating whether motions happened or not in a corresponding block; and estimating whether to encode the motion block where motions happened in an intra mode or an inter mode at the same time as performing the estimation of whether motions happened or not.
7. The method according to claim 5, wherein the inter-coding process encodes motion blocks, estimated as the intra mode, in the same manner as the intra-coding and encodes motion blocks, estimated as the inter mode, by encoding the difference components between an original image of a current frame and a previous image.
8. The method according to claim 5, wherein the inter-coding process encodes the U and V components of the motion block by calculating an average value in a 4*4 pixel unit and encoding the average value, and encodes the Y component of the motion block by discrete cosine transforming the Y component in an 8*8 pixel block unit, by obtaining a quantization coefficient through quantizing the discrete cosine transformed Y component, and by entropy coding the quantization coefficient with Golomb-Rice coding.
PCT/KR2004/001204 2004-05-20 2004-05-20 Systems and methods of encoding moving pictures for mobile communication terminals WO2005115007A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CNB2004800007255A CN100405847C (en) 2004-05-20 2004-05-20 Moving image system and method for coded mobile communication terminal
PCT/KR2004/001204 WO2005115007A1 (en) 2004-05-20 2004-05-20 Systems and methods of encoding moving pictures for mobile communication terminals

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2004/001204 WO2005115007A1 (en) 2004-05-20 2004-05-20 Systems and methods of encoding moving pictures for mobile communication terminals

Publications (1)

Publication Number Publication Date
WO2005115007A1 true WO2005115007A1 (en) 2005-12-01

Family

ID=35428689

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2004/001204 WO2005115007A1 (en) 2004-05-20 2004-05-20 Systems and methods of encoding moving pictures for mobile communication terminals

Country Status (2)

Country Link
CN (1) CN100405847C (en)
WO (1) WO2005115007A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101500159B (en) * 2008-01-31 2012-01-11 华为技术有限公司 Method and apparatus for image entropy encoding, entropy decoding
KR102101824B1 (en) * 2019-05-03 2020-04-20 한국콘베어공업주식회사 Method for estimating length of roller chain using friction sound
CN111402380B (en) * 2020-03-12 2023-06-30 杭州小影创新科技股份有限公司 GPU compressed texture processing method


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9800900D0 (en) * 1998-01-17 1998-03-11 Philips Electronics Nv Graphic image generation and coding

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010017405A (en) * 1999-08-11 2001-03-05 정병철 Animation Moving Image Coding Method
US6650784B2 (en) * 2001-07-02 2003-11-18 Qualcomm, Incorporated Lossless intraframe encoding using Golomb-Rice
KR20040074725A (en) * 2003-02-18 2004-08-26 (주) 멀티비아 Moving-Picture Coding System And Method For Mobile Communication Terminal

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DORFLER P. AND MITRA S.K.: "Compression of color images without visible blocking artifacts using a modified DCT computation scheme", IMAGE PROCESSING AND ITS APPLICATIONS, vol. 1, 13 July 1999 (1999-07-13) - 15 July 1999 (1999-07-15), pages 169 - 173, XP006501260 *
KELLER Y. AND AVERBUCH A.: "Efficient global motion estimation for MPEG4 video compression", ELECTRICAL AND ELECTRONICS ENGINEERS IN ISRAEL, 1 December 2002 (2002-12-01), pages 69 - 71, XP010631050 *
MIN B. ET AL: "A novel compression algorithm for cell animation images", 2001 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING PROCEEDINGS, vol. 2, 7 October 2001 (2001-10-07) - 10 October 2001 (2001-10-10), pages 459 - 462, XP010563797 *
RAMASWAMY A. ET AL: "A multiple transform based scheme for still color image compression", CIRCUITS AND SYSTEMS, 1996. ISCAS '96, vol. 2, 12 May 1996 (1996-05-12) - 15 May 1996 (1996-05-15), pages 433 - 436, XP000948169 *

Also Published As

Publication number Publication date
CN100405847C (en) 2008-07-23
CN1717938A (en) 2006-01-04

Similar Documents

Publication Publication Date Title
JP4641892B2 (en) Moving picture encoding apparatus, method, and program
US20090141808A1 (en) System and methods for improved video decoding
JP4501631B2 (en) Image coding apparatus and method, computer program for image coding apparatus, and portable terminal
JP2004248285A (en) Video encoder capable of differentially coding speaker's video image at pictorial call and video signal compression method using the same
JP2002532029A (en) Efficient macroblock header coding for video compression
KR20080043390A (en) Image decoding method and apparatus
JP5195032B2 (en) Encoding device / decoding device, encoding method / decoding method, and program
KR20050089838A (en) Video encoding with skipping motion estimation for selected macroblocks
US20070133689A1 (en) Low-cost motion estimation apparatus and method thereof
JP2004241957A (en) Image processor and encoding device, and methods therefor
CN1302666C (en) Appts. and method of coding moving picture
KR100845623B1 (en) Method and Apparatus for Transform-domain Video Editing
US8326060B2 (en) Video decoding method and video decoder based on motion-vector data and transform coefficients data
WO2005115007A1 (en) Systems and methods of encoding moving pictures for mobile communication terminals
KR100497760B1 (en) Moving-Picture Coding System And Method For Mobile Communication Terminal
KR100497753B1 (en) Moving-Picture Coding System And Method For Mobile Communication Terminal
WO2005115006A1 (en) System and methods of encoding moving pictures for mobile communication terminals
JP2008289105A (en) Image processing device and imaging apparatus equipped therewith
JP4100067B2 (en) Image information conversion method and image information conversion apparatus
JP2003087797A (en) Apparatus and method for picture information conversion, picture information conversion program, and recording medium
JPH05344491A (en) Inter-frame prediction coding system
WO2005104560A1 (en) Method of processing decoded pictures.
CN113132734A (en) Encoding and decoding method, device and equipment
JPH05347758A (en) Inter-frame estimated coding system
WO2005053317A1 (en) A method of compressing moving pictures for mobile devices

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 20048007255

Country of ref document: CN

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase