WO2003060548A2 - Method for efficiently storing the trajectory of tracked objects in video - Google Patents
- Publication number
- WO2003060548A2 (PCT/IB2002/005377)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- coordinates
- video
- particular object
- current
- box
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
- G01S3/7865—T.V. type tracking systems using correlation of the live video image with a stored image
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
Definitions
- the present invention relates to the tracking of objects in video sequences. More particularly, the present invention relates to storage of coordinates used to track object trajectories.
- trajectory coordinates are typically generated for each frame of video.
- NTSC standard
- a new location or coordinate for each object in a video sequence must be generated and stored for each frame.
- This process is extremely inefficient and requires tremendous amounts of storage. For example, if five objects in a video sequence were tracked, over two megabytes of storage would be needed just to store the trajectory data for a single hour. Thus, storage of all of the trajectories is expensive, if not impractical.
- the coordinates are stored only when objects move more than a predetermined amount, rather than storing their movement after every frame. This feature permits a tremendous savings in memory or disk usage over conventional methods. In addition, the need to generate coordinates can be greatly reduced, to a fraction of the per-frame generation that is conventionally processed.
- a video content analysis module automatically identifies objects in a video frame, and determines the (x_i, y_i) coordinates of each object i.
- the reference coordinates for each object i, (xref_i, yref_i), are set to (x_i, y_i) when the object is first identified. For subsequent frames, if the new coordinates (xnew_i, ynew_i) are less than a given distance from the reference coordinates, that is if (xnew_i - xref_i)^2 + (ynew_i - yref_i)^2 < epsilon^2, the new coordinates are not stored.
- otherwise, the current coordinates (xnew_i, ynew_i) are stored in the object's trajectory list, and the reference coordinates (xref_i, yref_i) are set to the object's current position. This process is repeated for all subsequent video frames.
- the resulting compact trajectory lists can then be written to memory or disk while they are being generated, or when they are complete.
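The storage scheme described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the frame format, function name, and `epsilon` parameter are assumptions made for the example.

```python
import math

def compact_trajectories(frames, epsilon):
    """Store an object's coordinates only when it has moved more than
    epsilon from its last stored (reference) position.

    `frames` is a list of dicts mapping object id -> (x, y), one dict
    per video frame (an illustrative format, not from the patent).
    """
    reference = {}     # object id -> current reference coordinates
    trajectories = {}  # object id -> compact list of stored coordinates

    for coords_by_object in frames:
        for obj_id, (x, y) in coords_by_object.items():
            if obj_id not in reference:
                # First sighting: store the position and use it as the reference.
                reference[obj_id] = (x, y)
                trajectories[obj_id] = [(x, y)]
                continue
            xref, yref = reference[obj_id]
            # Euclidean distance test against the reference coordinates.
            if math.hypot(x - xref, y - yref) > epsilon:
                trajectories[obj_id].append((x, y))
                reference[obj_id] = (x, y)  # update the reference point
    return trajectories
```

Small movements below `epsilon` are skipped entirely, so the resulting lists contain only the stored waypoints rather than one coordinate per frame.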
- the present invention can be used in many areas, including a video surveillance security system that tracks movement in a particular area, such as a shopping mall.
- the compact storage of the present invention makes the permanent storage of secure areas much more practical, and provides investigators with a record showing whether a particular place was "cased" (i.e., observed by a wrongdoer) prior to an unlawful act being committed.
- a method for storing a trajectory of tracked objects in a video comprising the steps of:
- (b) determining first reference coordinates (xref_i, yref_i) for each of said objects identified in step (a) in the first video frame; (c) storing the first reference coordinates (xref_i, yref_i);
- the method may further comprise (g) repeating steps (e) and (f) for all video frames subsequent to said second video frame in a video sequence, so as to update the storage area with additional coordinates and to update the current reference coordinates with new values each time said condition in step (f) is satisfied.
- the method may include a step of storing the last coordinates of the object (i.e., the coordinates just before the object disappears and the trajectory ends), even if the last coordinate does not satisfy condition (f).
- the object trajectory list for the particular object stored in step (f) may comprise a temporary memory of a processor, and the method may optionally include the following step:
- the permanent storage referred to in step (h) may comprise at least one of a magnetic disk, optical disk, and magneto-optical disk, or even tape.
- the permanent storage can be arranged in a network server.
- the determination of the current coordinates (xnew_i, ynew_i) in step (e) can include size tracking of objects moving either (i) substantially directly toward, or (ii) substantially directly away from, a camera by using a bounding box technique.
- the box bounding technique may comprise:
- Figs. 1A-1C illustrate a first aspect of the present invention wherein the motion in Fig. 1B relative to Fig. 1A fails to satisfy the expression in Fig. 1C.
- Figs. 2A-2C illustrate a second aspect of the present invention wherein the motion in Fig. 2B relative to Fig. 2A satisfies the expression in Fig. 2C.
- Figs. 3A-3C illustrate another aspect of the present invention pertaining to a box bounding technique.
- Fig. 4 illustrates a schematic of a system used according to the present invention.
- Figs. 5A and 5B are a flow chart illustrating an aspect of the present invention.
- Figs. 1A-1C illustrate a first aspect of the present invention.
- a frame 105 contains an object 100 (in this case a stick figure representing a person).
- numerical scales in both the X direction and Y direction have been added to the frame.
- the x,y coordinates can be obtained, for example, by using the center of mass of the object pixels, or, in the case of a bounding box technique (disclosed infra), by using the center of the object bounding box.
- the object 100 is identified at a position (xref_i, yref_i), which is now used as the x and y reference point for this particular object.
- the objects identified do not have to be, for example, persons, and could include inanimate objects in the room, such as tables, chairs, and desks. As known in the art, these objects could be identified by, for example, their color, shape, size, etc.
- a background subtraction technique is used to separate moving objects from the background.
- this technique works by learning the appearance of the background scene and then identifying image pixels that differ from the learned background. Such pixels typically correspond to foreground objects.
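The idea of learning a background and flagging pixels that deviate from it can be sketched with a simple running-average model. This is a deliberately simplified stand-in for the non-parametric and mixture-of-Gaussians methods cited below; the function names, `alpha`, and `threshold` are illustrative assumptions.

```python
def update_background(background, frame, alpha=0.05):
    """Learn the background as an exponential running average of pixel
    intensities (far simpler than the cited published methods)."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

def foreground_mask(background, frame, threshold=30):
    """Mark pixels differing from the learned background by more than
    `threshold`; such pixels typically correspond to foreground objects."""
    return [[abs(f - b) > threshold for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]
```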
- Applicants hereby incorporate by reference as background material the articles by A. Elgammal, D. Harwood, and L. Davis, "Non-parametric Model for Background Subtraction", Proc. European Conf. on Computer Vision, pp. II: 751-767, 2000, and C. Stauffer and W.E.L. Grimson, "Adaptive Background Mixture Models for Real-time Tracking", Proc. Computer Vision and Pattern Recognition, pp. 246-252, 1999, as providing reference material for some of the methods by which an artisan can perform object identification.
- simple tracking links objects in successive frames based on distance, by marking each object in the new frame by the same number as the closest object in the previous frame.
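The distance-based linking just described can be sketched as follows; the data shapes and function name are assumptions made for this example, not the patent's.

```python
import math

def link_objects(prev_objects, new_positions):
    """Give each object in the new frame the id of the closest object
    in the previous frame, as in the simple tracking described above.

    prev_objects: id -> (x, y) from the previous frame.
    new_positions: list of (x, y) detected in the new frame.
    """
    linked = {}
    for pos in new_positions:
        # Find the previous object nearest to this new detection.
        closest_id = min(
            prev_objects,
            key=lambda oid: math.dist(prev_objects[oid], pos),
        )
        linked[closest_id] = pos
    return linked
```

A production tracker would also handle appearing/disappearing objects and contested assignments; this sketch only shows the nearest-neighbour rule.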
- the objects can be identified by grouping the foreground pixels, for example, by a connected-components algorithm, as described by T. Cormen, C. Leiserson, R. Rivest, "Introduction to Algorithms", MIT Press, 1990, chapter 22.1, which is hereby incorporated by reference as background material.
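A connected-components pass over a foreground mask can be sketched with a breadth-first flood fill (one of several standard formulations; the 4-connectivity choice and function name are assumptions of this example):

```python
from collections import deque

def connected_components(mask):
    """Group foreground pixels (True cells) into 4-connected components,
    returning a label grid and the number of components found."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and labels[r][c] == 0:
                current += 1  # start a new component
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:  # BFS over 4-connected neighbours
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current
```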
- the objects can be tracked such as disclosed in U.S. patent application serial
- object 100 has moved to a new position, captured in the second frame 110, having coordinates (xnew_i, ynew_i), which is a distance away from the (xref_i, yref_i) of the first frame 105.
- an algorithm determines whether or not the movement by object 100 in the second frame is greater than a certain predetermined amount. In the case where the movement is less than the predetermined amount, the coordinates for Fig. 1B are not stored.
- the reference coordinates identified in the first frame 105 continue to be used against a subsequent frame.
- Fig. 2A again illustrates (for the reader's convenience) frame 105, whose coordinates will be used to track motion in a third frame 210.
- the amount of movement by the object 100 in the third frame, relative to its position in the first frame 105, is greater than the predetermined threshold. Accordingly, the coordinates of the object 100 in Fig. 2B now become the new reference coordinates (identified in the drawing as the new (xref_i, yref_i), versus the old (xref_i, yref_i)). The trajectory of the object 100 thus includes the coordinates in frames 1 and 3, without the need to save the coordinates in frame 2.
- the predetermined amount of movement could be set so that a significant number of coordinates never require storage, permitting substantial compression.
- the amount of movement used as a predetermined threshold could be tailored for specific applications; the threshold can also be dynamically computed, or modified during the analysis process. The dynamic computation can be based on factors such as average object velocity, general size of the object, importance of the object, or other statistics of the video. For example, in a security film, very small thresholds could be used when the items being tracked are extremely valuable, whereas larger thresholds permit more efficient storage, which can be an important consideration based on storage capacity and/or cost.
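One way a dynamic threshold could be derived from average object velocity is sketched below. The scaling rule, `scale`, and `floor` values are illustrative assumptions, not values taken from the patent.

```python
def dynamic_threshold(recent_displacements, scale=0.5, floor=1.0):
    """Derive the storage threshold from an object's recent average
    per-frame displacement: fast objects get a coarser threshold
    (more compression), slow objects a finer one, never below `floor`."""
    if not recent_displacements:
        return floor  # no history yet: use the most conservative value
    avg = sum(recent_displacements) / len(recent_displacements)
    return max(floor, scale * avg)
```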
- the threshold amount can be application specific, so that the stored trajectory is as close to the actual movement as desired. In other words, if the threshold is too large, movement in different directions may go unstored. The recorded trajectory is then only the path between the saved coordinates, which may not exactly match the path that would be determined by conventional tracking and storage of each individual frame. It should be noted that, as with many forms of compression, there is normally some degree of paring of the representation of the objects.
- Figs. 3A to 3C illustrate another aspect of the present invention pertaining to a box bounding technique. It is understood by persons of ordinary skill in the art that while a camera is depicted, the video image could be from a video server, DVD, videotape, etc. When objects move directly toward or away from a camera, their coordinates may not change enough to generate new trajectory coordinates for storage.
- a box bounding technique is one way that the problem can be overcome. For example, in the case of an object moving directly toward or away from the camera, the size of the object will appear to become larger or smaller depending on the relative direction.
- Figs. 3A to 3C illustrate a box bounding technique using size tracking. As shown in Fig. 3A, a bounding box 305 represents the width and height of an object 307 in the first frame 310.
- the box bounding technique would store the coordinate of the object in the second frame 312 if the width of a bounding box in a subsequent frame is different from the width of the reference box of the previous frame, or the height of the bounding box in a particular frame is different from the height of the bounding box of a reference frame; in each case the difference is more than a predetermined threshold value.
- the area of the bounding box (width x height) could be used as well, so if the area of the bounding box in the second frame 312 differs from the area of the reference bounding box 305 by a predetermined amount, the coordinates of the second frame would be stored.
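The size-change test described above can be sketched as follows. Using one threshold per dimension and a separate one for area is an assumption of this example; the patent leaves the exact thresholds to the implementer.

```python
def box_size_changed(ref_box, new_box, size_threshold, area_threshold):
    """Return True when the bounding box (width, height) has grown or
    shrunk enough relative to the reference box to warrant storing the
    object's coordinates, e.g. for motion directly toward or away from
    the camera. Checks width, height, and area differences."""
    ref_w, ref_h = ref_box
    new_w, new_h = new_box
    return (abs(new_w - ref_w) > size_threshold
            or abs(new_h - ref_h) > size_threshold
            or abs(new_w * new_h - ref_w * ref_h) > area_threshold)
```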
- Fig. 4 illustrates one embodiment of a system according to the present invention.
- connections between all of the elements could be any combination of wired, wireless, fiber optic, etc.
- some of the items could be connected via a network, including but not limited to the Internet.
- a camera 405 captures images of a particular area and relays the information to a processor 410.
- the processor 410 includes a video content analysis module 415 which identifies objects in a video frame and determines the coordinates for each object.
- the current reference coordinates for each object could be stored, for example, in a RAM 420, but it should be understood that other types of memory could be used.
- the initial reference coordinates of the identified objects would also be stored in a permanent storage area 425.
- This permanent storage area could be a magnetic disc, optical disc, magneto optical disc, diskette, tape, etc. or any other type of storage.
- This storage could be located in the same unit as the processor 410 or it could be stored remotely. The storage could in fact be part of or accessed by a server 430.
- when the video content module determines that an object's motion in a frame exceeds the reference coordinates by a predetermined threshold, the current reference coordinates in the RAM 420 are updated and also permanently stored 425.
- the storage could be video tape.
- Figs. 5A and 5B illustrate a flow chart that provides an overview of the process of the present invention.
- the reference coordinates for each of the objects identified in the first video frame are determined.
- the determination of these reference coordinates may be made by any known method, e.g., using the center of the object bounding box, or the center of mass of the object pixels.
- the first reference coordinates determined in step 10 are stored.
- these coordinates could be stored in a permanent type of memory that would record the trajectory of the object.
- the coordinates need not be stored after each step. In other words, the coordinates could be tracked by the processor in the table, and after all the frames have been processed, the trajectory could be stored at that time.
- the objects in the second video frame are identified.
- the current coordinates of a particular object are stored in an object trajectory list and used to replace the first reference coordinates of that particular object if the following condition for the particular object is satisfied:
- (xnew_i - xref_i)^2 + (ynew_i - yref_i)^2 > epsilon^2
- if the condition is not satisfied, the first reference coordinates are retained for comparison with subsequent video frames. The process continues until all of the video frames have been exhausted.
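The store-or-retain decision in the step above can be written as a single comparison on the squared displacement, which avoids the square root; the function name and `epsilon` symbol are illustrative.

```python
def should_store(xnew, ynew, xref, yref, epsilon):
    """Store (and re-reference) the coordinates when the squared
    displacement from the reference exceeds epsilon squared."""
    return (xnew - xref) ** 2 + (ynew - yref) ** 2 > epsilon ** 2
```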
- the object trajectory list could be a table and/or a temporary storage area in the processor, which is later stored, for example, on a hard drive, writable CD-ROM, tape, non-volatile electronic storage, etc.
- Various modifications may be made to the present invention by a person of ordinary skill in the art without departing from the spirit of the invention or the scope of the appended claims.
- for example, the method used to identify objects in the video frames, and the threshold values by which storage of additional coordinates in subsequent frames is determined, may all be modified by the artisan within the spirit of the claimed invention.
- a time interval could be introduced into the process, where for example, after a predetermined amount of time, the coordinates of a particular frame are stored even if a predetermined threshold of motion is not reached.
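The time-interval variant described above can be sketched as a small extension of the storage decision; the parameter names and frame-count formulation (rather than wall-clock time) are assumptions of this example.

```python
def should_store_with_timeout(moved_enough, frames_since_store, max_gap):
    """Force a store after `max_gap` frames even when the motion
    threshold has not been reached, so that long static periods still
    leave periodic trajectory samples."""
    return moved_enough or frames_since_store >= max_gap
```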
- coordinates other than x and y could be used (for example, z), or the x,y coordinates could be transformed into another space, plane, or coordinate system, and the measure taken in the new space, for example, if the images were put through a perspective transformation prior to measuring.
- the distance measured could be other than Euclidean distance; a less compute-intensive measure could be used instead.
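The text does not name the cheaper measure, so the alternatives below are assumptions: two measures commonly substituted for Euclidean distance when avoiding multiplications or square roots matters.

```python
def euclidean_sq(dx, dy):
    # Squared Euclidean distance: compare against epsilon**2,
    # avoiding the square root entirely.
    return dx * dx + dy * dy

def city_block(dx, dy):
    # Manhattan / city-block distance: additions only.
    return abs(dx) + abs(dy)

def chebyshev(dx, dy):
    # Chessboard distance: the larger of the absolute displacements.
    return max(abs(dx), abs(dy))
```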
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2004-7010114A KR20040068987A (en) | 2001-12-27 | 2002-12-10 | Method for efficiently storing the trajectory of tracked objects in video |
EP02788352A EP1461636A2 (en) | 2001-12-27 | 2002-12-10 | Method for efficiently storing the trajectory of tracked objects in video |
JP2003560590A JP2005515529A (en) | 2001-12-27 | 2002-12-10 | A method for effectively storing the track of a tracked object in a video |
AU2002353331A AU2002353331A1 (en) | 2001-12-27 | 2002-12-10 | Method for efficiently storing the trajectory of tracked objects in video |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/029,730 | 2001-12-27 | ||
US10/029,730 US20030126622A1 (en) | 2001-12-27 | 2001-12-27 | Method for efficiently storing the trajectory of tracked objects in video |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2003060548A2 true WO2003060548A2 (en) | 2003-07-24 |
WO2003060548A3 WO2003060548A3 (en) | 2004-06-10 |
Family
ID=21850560
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2002/005377 WO2003060548A2 (en) | 2001-12-27 | 2002-12-10 | Method for efficiently storing the trajectory of tracked objects in video |
Country Status (7)
Country | Link |
---|---|
US (1) | US20030126622A1 (en) |
EP (1) | EP1461636A2 (en) |
JP (1) | JP2005515529A (en) |
KR (1) | KR20040068987A (en) |
CN (1) | CN1613017A (en) |
AU (1) | AU2002353331A1 (en) |
WO (1) | WO2003060548A2 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8711217B2 (en) | 2000-10-24 | 2014-04-29 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US8564661B2 (en) | 2000-10-24 | 2013-10-22 | Objectvideo, Inc. | Video analytic rule detection system and method |
US9892606B2 (en) | 2001-11-15 | 2018-02-13 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US8457401B2 (en) | 2001-03-23 | 2013-06-04 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
US7424175B2 (en) | 2001-03-23 | 2008-09-09 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
SE527467C2 (en) * | 2003-12-22 | 2006-03-14 | Abb Research Ltd | Method of positioning and a positioning system |
CN101421727A (en) | 2005-09-30 | 2009-04-29 | 罗伯特·博世有限公司 | Method and software program for searching image information |
US8588583B2 (en) * | 2007-08-22 | 2013-11-19 | Adobe Systems Incorporated | Systems and methods for interactive video frame selection |
US20130021488A1 (en) * | 2011-07-20 | 2013-01-24 | Broadcom Corporation | Adjusting Image Capture Device Settings |
US8929588B2 (en) | 2011-07-22 | 2015-01-06 | Honeywell International Inc. | Object tracking |
US9438947B2 (en) | 2013-05-01 | 2016-09-06 | Google Inc. | Content annotation tool |
US10115032B2 (en) * | 2015-11-04 | 2018-10-30 | Nec Corporation | Universal correspondence network |
KR101803275B1 (en) * | 2016-06-20 | 2017-12-01 | (주)핑거플러스 | Preprocessing method of video contents for tracking location of merchandise available to match with object included in the video contetns, server and coordinate keyborder device implementing the same |
US10970855B1 (en) | 2020-03-05 | 2021-04-06 | International Business Machines Corporation | Memory-efficient video tracking in real-time using direction vectors |
CN113011331B (en) * | 2021-03-19 | 2021-11-09 | 吉林大学 | Method and device for detecting whether motor vehicle gives way to pedestrians, electronic equipment and medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0530049A1 (en) * | 1991-08-30 | 1993-03-03 | Texas Instruments Incorporated | Method and apparatus for tracking an aimpoint with arbitrary subimages |
EP0579319A2 (en) * | 1992-07-16 | 1994-01-19 | Philips Electronics Uk Limited | Tracking moving objects |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3487436B2 (en) * | 1992-09-28 | 2004-01-19 | ソニー株式会社 | Video camera system |
JP3268953B2 (en) * | 1995-02-27 | 2002-03-25 | 三洋電機株式会社 | Tracking area setting device, motion vector detection circuit, and subject tracking device using the same |
US6185314B1 (en) * | 1997-06-19 | 2001-02-06 | Ncr Corporation | System and method for matching image information to object model information |
US6301370B1 (en) * | 1998-04-13 | 2001-10-09 | Eyematic Interfaces, Inc. | Face recognition from video images |
US6741725B2 (en) * | 1999-05-26 | 2004-05-25 | Princeton Video Image, Inc. | Motion tracking using image-texture templates |
US6707486B1 (en) * | 1999-12-15 | 2004-03-16 | Advanced Technology Video, Inc. | Directional motion estimator |
US6731805B2 (en) * | 2001-03-28 | 2004-05-04 | Koninklijke Philips Electronics N.V. | Method and apparatus to distinguish deposit and removal in surveillance video |
US6985603B2 (en) * | 2001-08-13 | 2006-01-10 | Koninklijke Philips Electronics N.V. | Method and apparatus for extending video content analysis to multiple channels |
US8316407B2 (en) * | 2005-04-04 | 2012-11-20 | Honeywell International Inc. | Video system interface kernel |
US7529646B2 (en) * | 2005-04-05 | 2009-05-05 | Honeywell International Inc. | Intelligent video for building management and automation |
US9077882B2 (en) * | 2005-04-05 | 2015-07-07 | Honeywell International Inc. | Relevant image detection in a camera, recorder, or video streaming device |
US7876361B2 (en) * | 2005-07-26 | 2011-01-25 | Honeywell International Inc. | Size calibration and mapping in overhead camera view |
-
2001
- 2001-12-27 US US10/029,730 patent/US20030126622A1/en not_active Abandoned
-
2002
- 2002-12-10 CN CNA028261070A patent/CN1613017A/en active Pending
- 2002-12-10 EP EP02788352A patent/EP1461636A2/en not_active Withdrawn
- 2002-12-10 JP JP2003560590A patent/JP2005515529A/en not_active Withdrawn
- 2002-12-10 AU AU2002353331A patent/AU2002353331A1/en not_active Abandoned
- 2002-12-10 KR KR10-2004-7010114A patent/KR20040068987A/en not_active Application Discontinuation
- 2002-12-10 WO PCT/IB2002/005377 patent/WO2003060548A2/en not_active Application Discontinuation
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0530049A1 (en) * | 1991-08-30 | 1993-03-03 | Texas Instruments Incorporated | Method and apparatus for tracking an aimpoint with arbitrary subimages |
EP0579319A2 (en) * | 1992-07-16 | 1994-01-19 | Philips Electronics Uk Limited | Tracking moving objects |
Also Published As
Publication number | Publication date |
---|---|
WO2003060548A3 (en) | 2004-06-10 |
AU2002353331A1 (en) | 2003-07-30 |
CN1613017A (en) | 2005-05-04 |
JP2005515529A (en) | 2005-05-26 |
US20030126622A1 (en) | 2003-07-03 |
KR20040068987A (en) | 2004-08-02 |
EP1461636A2 (en) | 2004-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20030126622A1 (en) | Method for efficiently storing the trajectory of tracked objects in video | |
JP4369233B2 (en) | Surveillance television equipment using video primitives | |
JP4320141B2 (en) | Method and system for summary video generation | |
EP1225518B1 (en) | Apparatus and method for generating object-labelled images in a video sequence | |
US6100941A (en) | Apparatus and method for locating a commercial disposed within a video data stream | |
CA2135938C (en) | Method for detecting camera-motion induced scene changes | |
US7046731B2 (en) | Extracting key frames from a video sequence | |
US7469010B2 (en) | Extracting key frames from a video sequence | |
KR100601933B1 (en) | Method and apparatus of human detection and privacy protection method and system employing the same | |
Nicolas | New methods for dynamic mosaicking | |
US20070058717A1 (en) | Enhanced processing for scanning video | |
US6181345B1 (en) | Method and apparatus for replacing target zones in a video sequence | |
US20070030396A1 (en) | Method and apparatus for generating a panorama from a sequence of video frames | |
EP0509208A2 (en) | Camera work detecting method | |
JPH04111181A (en) | Change point detection method for moving image | |
JPH05501184A (en) | Method and apparatus for changing the content of continuous images | |
US5177794A (en) | Moving object detection apparatus and method | |
JP2008518331A (en) | Understanding video content through real-time video motion analysis | |
Liu et al. | Shot boundary detection using temporal statistics modeling | |
CN114549582A (en) | Track map generation method and device and computer readable storage medium | |
Wei et al. | TV program classification based on face and text processing | |
EP1576539A2 (en) | Method and apparatus for reduction of visual content | |
WO2003067868A2 (en) | Unit for and method of segmentation | |
JP3513011B2 (en) | Video telop area determination method and apparatus, and recording medium storing video telop area determination program | |
Dahyot et al. | Unsupervised statistical detection of changing objects in camera-in-motion video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2002788352 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003560590 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20028261070 Country of ref document: CN Ref document number: 1020047010114 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2002788352 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2002788352 Country of ref document: EP |