WO2000049525A1 - Method and device for storing at least one image with the relational information relating to it - Google Patents

Method and device for storing at least one image with the relational information relating to it

Info

Publication number
WO2000049525A1
WO2000049525A1 (PCT/DE2000/000386)
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
relational information
objects
relational
Prior art date
Application number
PCT/DE2000/000386
Other languages
German (de)
English (en)
Inventor
André KAUP
Jörg Heuer
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft
Publication of WO2000049525A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H04N19/27Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding involving both synthetic and natural picture components, e.g. synthetic natural hybrid coding [SNHC]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/783Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/7837Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content

Definitions

  • The invention relates to a method and an arrangement for storing at least one image by a computer.
  • A method for image compression, together with the associated arrangement, is known from [1].
  • The known method serves as a coding method in the MPEG standard and is essentially based on the hybrid DCT (Discrete Cosine Transformation) with motion compensation.
  • A similar procedure is used for video telephony at n x 64 kbit/s (CCITT Recommendation H.261), for TV contribution (CCIR Recommendation 723) at 34 or 45 Mbit/s, and for multimedia applications at 1.2 Mbit/s (ISO MPEG-1).
  • The hybrid DCT consists of a temporal processing stage, which exploits the relationship between successive images, and a local processing stage, which exploits the correlation within an image.
  • The local processing essentially corresponds to classic DCT coding.
  • The image is broken down into blocks of 8x8 pixels, each of which is transformed into the frequency domain using the DCT.
  • The result is a matrix of 8x8 coefficients, which approximately reflect the two-dimensional spatial frequencies in the transformed image block.
  • The coefficient with frequency 0 (DC component) represents the average gray value of the image block.
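  • For illustration only (not part of the original disclosure; function and variable names are chosen for this sketch), a minimal Python example of the 8x8 block DCT described above; the coefficient at position (0,0) is the DC component, i.e. a scaled average gray value of the block:

```python
import numpy as np

def dct_matrix(n: int = 8) -> np.ndarray:
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n).reshape(-1, 1)   # frequency index
    i = np.arange(n).reshape(1, -1)   # sample index
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)        # DC row uses the 1/sqrt(n) scale
    return c

def dct2(block: np.ndarray) -> np.ndarray:
    """Separable 2-D DCT of a square block: C @ block @ C.T."""
    c = dct_matrix(block.shape[0])
    return c @ block @ c.T

# A flat gray block (value 128) has only a DC coefficient and no AC energy.
block = np.full((8, 8), 128.0)
coeffs = dct2(block)
print(round(float(coeffs[0, 0]), 2))                                    # 1024.0 = 8 * 128
print(np.allclose(coeffs[1:, :], 0) and np.allclose(coeffs[0, 1:], 0))  # True
```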
  • A second step of data reduction takes the form of an adaptive quantization, by means of which the amplitude accuracy of the coefficients is further reduced and small amplitudes are set to zero.
  • The quantization depends on the fill level of the output buffer: if the buffer is empty, fine quantization takes place, so that more data is generated; if the buffer is full, quantization becomes coarser, which reduces the amount of data.
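  • A hedged sketch of the buffer-controlled quantization described above (the step-size range and the linear control law are assumptions of this example, not taken from any standard or from this application):

```python
import numpy as np

def adaptive_quantize(coeffs: np.ndarray, buffer_fill: float,
                      q_min: float = 2.0, q_max: float = 62.0) -> np.ndarray:
    """Quantize DCT coefficients with a step size that grows with buffer fill.

    buffer_fill is the output-buffer occupancy in [0, 1]: an empty buffer
    gives fine quantization (more data), a full buffer gives coarse
    quantization (less data, small amplitudes become zero).
    """
    q = q_min + buffer_fill * (q_max - q_min)   # assumed linear control law
    return np.round(coeffs / q).astype(int)

coeffs = np.array([[1024.0, 12.0], [-7.0, 3.0]])
print(adaptive_quantize(coeffs, buffer_fill=0.1))  # fine:   [[128   2] [ -1   0]]
print(adaptive_quantize(coeffs, buffer_fill=0.9))  # coarse: [[ 18   0] [  0   0]]
```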
  • The quantized coefficients are then subjected to variable-length coding (VLC).
  • A further compression gain is obtained through the temporal processing (interframe coding).
  • A lower data rate is required for coding difference images than for the original images, because the amplitude values are much lower.
  • The motion information is transmitted together with the image information; usually only one motion vector per macroblock (e.g. four 8x8 image blocks) is used.
  • The coder also contains a temporal recursion loop, because the predictor must calculate the prediction value from the values of the (coded) images already transmitted.
  • An identical temporal recursion loop is present in the decoder, so that encoder and decoder remain completely synchronized.
  • There are three main picture types with which images can be processed in the MPEG-2 coding method. I-pictures: No temporal prediction is used for the I-pictures, i.e. the picture values are transformed and encoded directly. I-pictures are used in order to be able to start the decoding process without knowledge of the past, or to achieve resynchronization in the event of transmission errors.
  • P-pictures: A temporal prediction is made, and the DCT is applied to the temporal prediction error.
  • B-pictures: For the B-pictures, the temporal bidirectional prediction error is calculated and then transformed.
  • The bidirectional prediction works adaptively, i.e. forward prediction, backward prediction or interpolation are permitted.
  • The distance between the P-pictures is denoted by m; there are m-1 B-pictures between successive P-pictures. The distance between the I-pictures is denoted by n.
  • The MPEG syntax leaves the choice of m and n to the user.
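  • To make the roles of m and n concrete, a small sketch under the assumption that m is the distance between anchor pictures (I or P) and n the distance between I-pictures; it prints a typical display-order picture-type pattern:

```python
def gop_pattern(m: int, n: int) -> str:
    """Display-order picture types for one group of pictures.

    m: anchor-picture distance; m-1 B-pictures lie between anchors.
    n: I-picture distance (length of the group of pictures).
    """
    types = []
    for k in range(n):
        if k == 0:
            types.append("I")          # resynchronization point, no prediction
        elif k % m == 0:
            types.append("P")          # forward-predicted anchor
        else:
            types.append("B")          # bidirectionally predicted
    return "".join(types)

print(gop_pattern(m=3, n=12))  # IBBPBBPBBPBB
```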
  • [2] discloses a method for motion estimation in the context of a method for block-based image coding. It is assumed that a digitized image has pixels which are combined into image blocks of 8x8 pixels or 16x16 pixels. If necessary, an image block can also comprise several image blocks. An example of this is a macroblock with 6 image blocks, of which 4 image blocks are provided for brightness information and 2 image blocks for color information.
  • For an image block of the image to be coded and an image block of the same size at the same starting position in the temporally preceding image, a value for an error measure is determined.
  • Preferably, a sum of the absolute values of the differences between the coding information associated with the pixels of the image block and that of the previous image block is determined.
  • Coding information here means brightness information (luminance value) and/or color information (chrominance value), each of which is assigned to a pixel. In a search space of predeterminable size and shape around the starting position in the previous image, a value of the error measure is determined for each area of the same size as the image block, shifted by one or half a pixel.
  • For the previous image block that yields the smallest value of the error measure, it is assumed that this previous image block best matches the image block of the image to be encoded for which the motion estimation is to be performed.
  • The result of the motion estimation is a motion vector, which describes the displacement of the image block relative to the best-matching image block of the preceding image.
  • Compression of the image data is achieved in that the motion vector and the error signal are encoded.
  • The motion estimation is carried out for each image block of an image.
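  • The block-matching motion estimation summarized above can be sketched as follows (illustrative full-pel, full-search variant using a sum of absolute differences as the error measure; the half-pixel refinement mentioned in [2] is omitted for brevity):

```python
import numpy as np

def motion_estimate(current: np.ndarray, previous: np.ndarray,
                    top: int, left: int, block: int = 8, search: int = 7):
    """Return the motion vector (dy, dx) and the minimal SAD for one image block."""
    cur = current[top:top + block, left:left + block].astype(np.int32)
    h, w = previous.shape
    best_vec, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):            # search space around the
        for dx in range(-search, search + 1):        # starting position
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > h or x + block > w:
                continue                             # candidate outside the image
            cand = previous[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(cur - cand).sum())      # error measure
            if sad < best_sad:
                best_sad, best_vec = sad, (dy, dx)
    return best_vec, best_sad

# Toy example: a previous image shifted by (1, 2) should be recovered as (-1, -2).
rng = np.random.default_rng(0)
prev = rng.integers(0, 256, size=(32, 32))
curr = np.roll(prev, shift=(1, 2), axis=(0, 1))
print(motion_estimate(curr, prev, top=8, left=8))    # ((-1, -2), 0)
```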
  • The object of the invention is to make an image data stream searchable with regard to the information contained in the image data.
  • To achieve this object, a method for storing at least one image by a computer is specified, in which relational information associated with the at least one image is stored.
  • This relational information can in particular be stored together with the at least one image; alternatively, a reference (pointer) to the relational information can be stored.
  • A further development consists in that the relational information is determined before the storage.
  • A further development consists in that the relational information includes feature information and reference information between objects and/or images.
  • The feature information provides, for example, information about a motion feature; the reference information creates the link to the object or image for which the feature information is relevant.
  • Relational information thus identifies information relating to a predefined relationship between two objects, whereby the information on the type of relationship (feature information) and on the objects involved in the relationship (reference information) can be combined in the relational information.
  • The association of the relational information with the image can also be realized in such a way that a reference to the relational information is stored. It is not necessary to use the same memory for the relational information and the image data. A division over arbitrary storage locations is possible; preferably, link information (a pointer) is stored, on the basis of which the actual information can be found.
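  • As a non-authoritative illustration of the storage scheme just described (class names and fields are assumptions of this sketch, not taken from the application), relational information can be held apart from the image/object data and tied to the stored objects via references (pointers):

```python
from dataclasses import dataclass, field

@dataclass
class ImageObject:
    """Object-related (intrinsic) data of a stored image object."""
    object_id: str
    shape: str      # e.g. outline description
    color: str      # e.g. dominant color

@dataclass
class RelationalInformation:
    """Feature information plus references to the objects it relates."""
    feature: str                                  # type of relationship
    parameters: dict = field(default_factory=dict)
    object_refs: tuple = ()                       # references instead of copies

objects = {
    "obj1": ImageObject("obj1", shape="square", color="red"),
    "obj2": ImageObject("obj2", shape="rectangle", color="blue"),
}
relations = [
    RelationalInformation("included-in", {}, ("obj2", "obj1")),
    RelationalInformation("relative-motion", {"dx": 3, "dy": 0}, ("obj1", "obj2")),
]

# The relational information lives in its own storage; only the references
# ("pointers") obj1/obj2 tie it back to the stored objects.
print(objects[relations[1].object_refs[0]].color)   # -> red
```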
  • A further development consists in that the at least one image is a sequence of several images.
  • When storing images, additional information, referred to here as relational information, is accordingly determined and stored with the images.
  • The type of relational information explained below enables a later search for certain image data.
  • The search preferably takes place in the data of the relational information; the image data, which is preferably stored in compressed form, does not need to be specially decompressed for this purpose.
  • With such a search at a high level of abstraction, e.g. "a red car drives from left to right through the picture", specific pictures, here those showing the red car, can be found.
  • Image compression can in particular be based on an image compression standard, e.g. an MPEG or an H.26x standard.
  • A further development consists in the relational information comprising at least one of the following options:
  • a) Motion information (in particular between objects), which can be determined automatically from the image data.
  • Objects can be identified in a picture; the picture itself is hierarchically structured (comparable to a tree structure).
  • The hierarchical relationships of the objects to one another can be supplemented by movement information between the respective objects. This movement information identifies the relative movement of the connected objects (a small sketch illustrating this composition follows the list of options below).
  • The total movement (relative and absolute) of all objects present and relevant in the picture results from the complete hierarchical structure.
  • The hierarchical structure of the image (or of the scene) can be given according to different specifications: one example is an "included-in" relationship, that is, the hierarchical structure indicates which objects are (at least partially) contained in other objects. Other hierarchical divisions of the scene are also possible.
  • b) Distance information is also possible as relational information.
  • The distance information between objects can be determined and stored.
  • The distance can be determined, for example, on the basis of an edge boundary or a center of gravity of an object. In this way, the distances between the multiple objects in the scene are fully described.
  • The type or degree of overlap between objects can also be recorded as relational information.
  • The sum of the overlaps results in the arrangement of the objects within the scene.
  • In general, any relationship between objects and/or images can be used as relational information.
  • The hierarchical arrangement of the objects of a scene described above can then take place according to the chosen relationship.
  • Transformation information over time can also serve as relational information.
  • Objects/images are preferably transformed over a predetermined period of time, the transformation yielding values that provide averages of the movement over time.
  • Such an average is obtained, for example, by means of a discrete cosine transformation (DCT).
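  • A minimal sketch of option a) above, under the assumption that the scene hierarchy is a tree whose edges carry a relative translation per time step (rotation and zoom are omitted for brevity); the total movement of an object follows from composing the relative movements along the hierarchy, in the spirit of the square/rectangle example of Fig. 1 and Fig. 2:

```python
# "Included-in" hierarchy: each node stores its parent and its movement
# relative to that parent, here as a translation (dx, dy) per time step.
hierarchy = {
    "scene":     {"parent": None,     "rel_move": (0, 0)},
    "square":    {"parent": "scene",  "rel_move": (-2, 3)},  # drifts down-left
    "rectangle": {"parent": "square", "rel_move": (2, -3)},  # compensates -> static
    "triangle":  {"parent": "square", "rel_move": (1, 0)},   # drifts inside square
}

def total_motion(obj: str) -> tuple:
    """Absolute movement per time step: sum of relative movements up to the root."""
    dx, dy = 0, 0
    node = obj
    while node is not None:
        rx, ry = hierarchy[node]["rel_move"]
        dx, dy = dx + rx, dy + ry
        node = hierarchy[node]["parent"]
    return dx, dy

for name in ("square", "rectangle", "triangle"):
    print(name, total_motion(name))
# square (-2, 3)   rectangle (0, 0)   triangle (-1, 3)
```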
  • The relational information can be determined in particular between two images or between two objects of an image, taking into account the change in the relational information over time (for example movement information).
  • An image or a scene can comprise a large number of objects which are connected to one another and whose positions change differently over time.
  • The relational information can be determined between two objects according to their hierarchical arrangement. Alternatively, the relational information can also be determined on the basis of absolute information (e.g. absolute coordinates within the image). The information of the objects relative to one another follows from the absolute information, and vice versa.
  • In a further development, the relational information is added to a feature set in conjunction with an image compression method.
  • The image compression method is in particular standardized; examples are an MPEG standard or an H.26x standard.
  • The method described can preferably be used in the context of encoding with an image compression method.
  • A further development consists in that image data which have been stored according to the described method can be accessed selectively by means of a suitable search that is implemented using the relational information. For example, the movement information between objects that is stored in the feature set can be searched in a targeted manner. This makes it possible to search for the red car, mentioned at the beginning, that moves from left to right through an image. It should be pointed out that the search itself can evaluate the relational information provided by the method with different functionality. This enables an "intelligent" evaluation of the different pieces of information within an application-specific search. The relational information alone enables searching in image data that otherwise would not have any searchable features.
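  • Purely illustrative (class names, fields and thresholds are assumptions of this sketch): a targeted search over stored intrinsic and relational feature information that answers the abstract query "red object moves from left to right" without decompressing any image data:

```python
from dataclasses import dataclass

@dataclass
class StoredObject:
    object_id: str
    color: str                  # intrinsic feature information
    motion_dx_per_frame: float  # relational feature: horizontal motion vs. the scene

catalog = [
    StoredObject("car_17",  color="red",   motion_dx_per_frame=+4.2),
    StoredObject("bus_03",  color="green", motion_dx_per_frame=+1.1),
    StoredObject("sign_08", color="red",   motion_dx_per_frame=0.0),
]

def find_red_left_to_right(objects, min_dx: float = 1.0):
    """Return the ids of objects whose stored features match the abstract query."""
    return [o.object_id for o in objects
            if o.color == "red" and o.motion_dx_per_frame >= min_dx]

print(find_red_left_to_right(catalog))   # ['car_17'], found without image decoding
```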
  • Fig. 1 shows a scene that is hierarchically divided into three objects;
  • Fig. 4 shows a sketch illustrating a transmitter and a receiver for image compression;
  • Fig. 5 shows a sketch with an image encoder and an image decoder in greater detail;
  • Fig. 7 shows an alternative embodiment for storing object-related image data.
  • FIG. 1 shows a hierarchical structure consisting of a square 101, a rectangle 102 and a triangle 103 in the form of a tree diagram.
  • The relations 104 and 105 between the square 101 and the rectangle 102 or the triangle 103 correspond to an "included-in" relation, i.e. the square 101 contains both the rectangle 102 and the triangle 103.
  • FIG. 2 comprises a scene which is shown in different forms 201, 202, 203 and 204 over time.
  • The objects known from Fig. 1 are present in every temporal version of the scene.
  • The square 101 moves from its starting position 205 downward to the left 206, further downward 207 and then to the right 208.
  • The rectangle 102 remains in its unchanged position at the top left during the changes over time (indicated by arrows 217, 218 and 219) (see positions 209, 210, 211 and 212).
  • The triangle 103 is also contained in the square 101 and moves gradually from an initial position 213 in the different time steps 217 to 219 (see positions 214, 215 and 216).
  • The relations 104 and 105 from FIG. 1 can thus be expanded by the relative change in position per time step.
  • The relative change in position per time step is preferably specified using the parameters translation (along the coordinate axes), rotation and zoom.
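  • One possible (assumed) parameterisation of the relative change in position per time step mentioned above, combining zoom, rotation and translation into a single homogeneous 2-D transform:

```python
import numpy as np

def step_transform(zoom: float, angle_rad: float, tx: float, ty: float) -> np.ndarray:
    """Homogeneous 3x3 matrix: scale and rotate about the origin, then translate."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[zoom * c, -zoom * s, tx],
                     [zoom * s,  zoom * c, ty],
                     [0.0,       0.0,      1.0]])

# Relative change of the square per time step: no zoom, no rotation,
# translation down-left (cf. positions 205 -> 206 in Fig. 2).
t_step = step_transform(zoom=1.0, angle_rad=0.0, tx=-2.0, ty=3.0)
corner = np.array([10.0, 10.0, 1.0])   # a corner point of the object
print(t_step @ corner)                 # [ 8. 13.  1.]
```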
  • FIG. 3 shows one possibility for storing image data, in particular for dividing an image into objects, e.g. according to the MPEG-4 standard.
  • A sequence 301 of image data for an object 1 and a sequence 302 of image data for an object 2 are shown. Each object has a feature set which contains both intrinsic data 303 and 305 (e.g. shape and color of the object) and relational information 304 and 306.
  • The relation is preferably also supplemented by a reference 315 or 316 (pointer).
  • This reference represents the linking of the hierarchically structured objects.
  • Here, object 1 corresponds to the square 101 and object 2 corresponds to the rectangle 102.
  • Arrow 316 indicates the relation "contains" and arrow 315 indicates the inverse relation ("is contained in").
  • The change in position between object 1 and object 2 for the sequences 301 and 302 is also stored in the fields for relational information 304 and 306, respectively.
  • The object-related data 307 to 310 (for object 1) and 311 to 314 (for object 2) determine the respective sequences 301 and 302. Relational information is determined and stored for these sequences; in particular, each sequence can be interpreted as a "global" movement, i.e. as movement information relating to the sequence as a whole.
  • FIG. 4 shows an arrangement which comprises two computers and a camera, with image coding, transmission of the image data and image decoding being illustrated.
  • A camera 1101 is connected to a first computer 1102 via a line 1119.
  • The camera 1101 transmits captured images 1104 to the first computer 1102.
  • The first computer 1102 has a first processor unit 1103, which is connected via a bus 1118 to an image memory 1105.
  • The image coding methods are carried out by the processor unit 1103 of the first computer 1102.
  • Image data 1106 encoded in this way are transmitted from the first computer 1102 to a second computer 1108 via a communication link 1107, preferably a line or a radio link.
  • The second computer 1108 contains a second processor unit 1109, which is connected to an image memory 1111 via a bus 1110. Methods for image decoding are carried out on the second processor unit 1109.
  • Both the first computer 1102 and the second computer 1108 have a screen 1112 or 1113, respectively, on which the image data 1104 are visualized.
  • Input units are provided for operating both the first computer 1102 and the second computer 1108, preferably a keyboard 1114 or 1115 and a computer mouse 1116 or 1117, respectively.
  • The image data 1104, which are transmitted from the camera 1101 to the first computer 1102 via the line 1119, are preferably data in the time domain, while the data 1106, which are transmitted from the first computer 1102 to the second computer 1108 via the communication link 1107, are image data in the spectral domain.
  • The decoded image data are displayed on a screen 1120.
  • FIG. 5 shows a sketch of an arrangement for carrying out a block-based image coding method.
  • A video data stream to be encoded, comprising chronologically successive digitized images, is fed to an image coding unit 1201.
  • The digitized images are divided into macroblocks 1202, each macroblock having 16x16 pixels.
  • The macroblock 1202 comprises 4 image blocks 1203, 1204, 1205 and 1206, each image block containing 8x8 pixels to which luminance values (brightness values) are assigned.
  • Furthermore, each macroblock 1202 comprises two chrominance blocks 1207 and 1208 with chrominance values (color information, color saturation) assigned to the pixels.
  • Luminance value, first chrominance value and second chrominance value are referred to here as color values.
  • The image blocks are fed to a transformation coding unit 1209.
  • In the case of differential image coding, values to be coded from image blocks of temporally preceding images are subtracted from the image blocks currently to be coded; only the difference information 1210 is fed to the transformation coding unit 1209 (discrete cosine transformation, DCT).
  • The current macroblock 1202 is communicated to a motion estimation unit 1229 via a connection 1234.
  • In the transformation coding unit 1209, spectral coefficients 1211 are formed for the image blocks or difference image blocks to be coded and are fed to a quantization unit 1212.
  • This quantization unit 1212 corresponds to the quantization device according to the invention.
  • The quantized spectral coefficients 1213 are fed both to a scanning unit 1214 and to an inverse quantization unit 1215 in a reverse path.
  • Entropy coding is carried out on the scanned spectral coefficients 1232 in an entropy coding unit 1216 provided for this purpose.
  • The entropy-coded spectral coefficients are transmitted as coded image data 1217 via a channel, preferably a line or a radio link, to a decoder.
  • The spectral coefficients 1218 obtained in this way from the inverse quantization unit 1215 are fed to an inverse transformation coding unit 1219 (inverse discrete cosine transformation, IDCT).
  • Reconstructed coding values (also differential coding values) 1220 are supplied to an adder 1221 in the differential image mode.
  • The adder 1221 also receives coding values of an image block which result from a temporally preceding image after motion compensation has already been carried out. With the adder 1221, reconstructed image blocks 1222 are formed and stored in an image memory 1223.
  • Chrominance values 1224 of the reconstructed image blocks 1222 are supplied from the image memory 1223 to a motion compensation unit 1225.
  • An interpolation of the brightness values takes place in an interpolation unit 1227 provided for this purpose; based on the interpolation, the number of brightness values contained in the respective image block is preferably doubled.
  • All brightness values 1228 are supplied both to the motion compensation unit 1225 and to the motion estimation unit 1229.
  • The motion estimation unit 1229 also receives the image blocks of the macroblock to be coded in each case (16x16 pixels) via the connection 1234. In the motion estimation unit 1229, the motion estimation takes place taking into account the interpolated brightness values ("motion estimation on a half-pixel basis").
  • In the motion estimation, absolute differences between the individual brightness values in the macroblock 1202 currently to be coded and in the reconstructed macroblock from the temporally preceding image are preferably determined.
  • The result of the motion estimation is a motion vector 1230, by means of which a local displacement of the selected macroblock from the temporally preceding image relative to the macroblock 1202 to be coded is expressed.
  • Both the brightness information and the chrominance information relating to the macroblock determined by the motion estimation unit 1229 are shifted by the motion vector 1230 and subtracted from the coding values of the macroblock 1202 (see data path 1231).
  • The processor unit PRZE comprises a processor CPU, a memory SPE and an input/output interface IOS, which are used in different ways via an interface IFC: output is displayed on a monitor MON via a graphics interface and/or output on a printer PRT. Input is made using a mouse MAS or a keyboard TAST.
  • The processor unit PRZE also has a data bus BUS, which connects a memory MEM, the processor CPU and the input/output interface IOS. Furthermore, additional components can be connected to the data bus BUS, for example additional memory, data storage (hard disk) or a scanner.
  • FIG. 7 shows an alternative embodiment to FIG. 3 for storing object-related image data. A sequence 701 of image data for object 1 and a sequence 702 of image data for object 2 are shown. Intrinsic information 703 or 704 belonging to the respective object (shape, color of the object) is stored with the object 701 or 702, respectively.
  • The relational information 713 is preferably stored separately from the respective objects 701 and 702.
  • The relational information 713 includes feature information 714 relating to the linking of the objects 701 and 702, e.g. the movement of object 1 relative to object 2.
  • The link itself is established by means of the reference information 715, 716, which preferably contains references to the objects 701 and 702 associated with the feature information 714.
  • The object-related data 705 to 708 or 709 to 712 each determine a sequence belonging to the respective object.
  • The sequence can comprise any number of images (for the respective object).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Library & Information Science (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention relates to a method for storing at least one image by means of a computer, according to which relational information associated with said at least one image is stored.
PCT/DE2000/000386 1999-02-18 2000-02-09 Procede et dispositif pour memoriser au moins une image avec les donnees relationnelles qui s'y rapportent WO2000049525A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE19906830.5 1999-02-18
DE19906830A DE19906830A1 (de) 1999-02-18 1999-02-18 Verfahren und Anordnung zum Abspeichern von mindestens einem Bild durch einen Rechner

Publications (1)

Publication Number Publication Date
WO2000049525A1 (fr) 2000-08-24

Family

ID=7897925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2000/000386 WO2000049525A1 (fr) 1999-02-18 2000-02-09 Procede et dispositif pour memoriser au moins une image avec les donnees relationnelles qui s'y rapportent

Country Status (2)

Country Link
DE (1) DE19906830A1 (fr)
WO (1) WO2000049525A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05108730A (ja) * 1991-08-29 1993-04-30 Internatl Business Mach Corp <Ibm> 動画像データベースの検索
US5892520A (en) * 1991-08-29 1999-04-06 International Business Machines Corporation Picture query system using abstract exemplary motions of a pointing device
JPH1084525A (ja) * 1996-02-05 1998-03-31 Texas Instr Inc <Ti> ビデオに索引をつける方法
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIANG ET AL.: "Video Indexing by Spatial Representation", PROCEEDINGS OF THE THIRD AUSTRALIAN AND NEW ZEALAND CONFERENCE ON INTELLIGENT INFORMATION SYSTEMS, 1995. ANZIIS-95, 27 November 1995 (1995-11-27), pages 99 - 104, XP002140419 *
PISSINOU ET AL.: "A Topological - Directional Model for the Spatio-Temporal Composition of Video Objects", EIGHTH INTERNATIONAL WORKSHOP ON RESEARCH ISSUES IN DATA ENGINEERING 1998. 'CONTINUOUS-MEDIA DATABASES AND APPLICATIONS', 23 February 1998 (1998-02-23), pages 17 - 24, XP002140418 *

Also Published As

Publication number Publication date
DE19906830A1 (de) 2000-08-31

Similar Documents

Publication Publication Date Title
DE69623330T2 (de) Merkmalbasiertes videokompressionsverfahren
DE69831961T2 (de) Bildobjekterzeugungsverfahren für objektbasierte kodierungssysteme unter verwendung von masken und gerundeten mittelwerten
DE69723550T2 (de) Kodierung und dekodierung von grafischen symbolen
DE19704439C2 (de) Verfahren und Vorrichtung zur Bewegungsschätzung in einem digitalen Videocodierer unter Verwendung von Trajektorien
EP1025708B1 Procede et dispositif pour le traitement d'une image numerique
DE69521255T2 (de) Verfahren zum betrieb eines interaktiven bildanzeigesystems und bildanzeigesystem zur durchführung des verfahrens
DE69915843T2 (de) Teilbandkodierung/-dekodierung
EP0773690A2 Procédé pour le codage d'un flux de données vidéo
EP1101196B1 Procede et dispositif d'evaluation du mouvement dans une image numerisee possedant des pixels
EP0985317B1 Procede de codage et de decodage d'une image numerisee
EP1110407B1 Procede et dispositif pour le codage et le decodage d'une image numerisee faisant appel a un vecteur de deplacement total
DE10022520A1 (de) Verfahren zur örtlichen skalierbaren Bewegtbildcodierung
EP1116184B1 Procede et systeme de traitement d'une image numerisee dotee de points d'images
EP1285537B1 Procede et agencement pour le codage et le decodage d'une suite d'images
DE19951341B4 (de) Verfahren zur bewegungskompensierenden Prädiktion von Bewegtbildern sowie Einrichtung hierzu
DE69909880T2 (de) Dekodierung eines komprimierten digitalen Bildsignals
EP0981910B1 Procede et dispositif de codage d'une image numerisee
WO2001049038A1 Procede, dispositif et produit programme informatique servant a effectuer une prediction lors du codage d'une image divisee en blocs d'image
WO2001062009A1 Procede et dispositif de codage, ou de codage et decodage d'une suite de nombres
WO2000049525A1 Procede et dispositif pour memoriser au moins une image avec les donnees relationnelles qui s'y rapportent
EP0981909B1 Procede et dispositif de codage et de decodage d'une image numerisee
EP1121809B1 Procede et dispositif pour le codage d'une image numerisee, procede et dispositif pour le decodage d'une image numerisee
DE19944300C2 (de) Verfahren, Anordnung und Computerprogrammerzeugnis zur Bewegungsschätzung bei der Codierung von einem Bildobjekt in einem Bild
DE19903859A1 (de) Verfahren und Anordnung zur Transformation eines Bildbereichs
WO2001028252A1 Codage et decodage progressifs tolerants aux erreurs d'une suite d'images

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CN JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase