WO2002027667A1 - Method for automated two-dimensional and three-dimensional conversion - Google Patents
- Publication number
- WO2002027667A1 WO2002027667A1 PCT/US2001/028563 US0128563W WO0227667A1 WO 2002027667 A1 WO2002027667 A1 WO 2002027667A1 US 0128563 W US0128563 W US 0128563W WO 0227667 A1 WO0227667 A1 WO 0227667A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- image
- dimensional
- image file
- border
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 15
- 238000006243 chemical reaction Methods 0.000 title abstract description 6
- 230000011218 segmentation Effects 0.000 claims description 9
- 238000010422 painting Methods 0.000 claims description 6
- 238000013500 data storage Methods 0.000 claims description 2
- 230000003287 optical effect Effects 0.000 abstract description 4
- 230000000007 visual effect Effects 0.000 abstract description 3
- 238000004422 calculation algorithm Methods 0.000 description 4
- 230000000694 effects Effects 0.000 description 2
- 230000016507 interphase Effects 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 230000000873 masking effect Effects 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 239000003973 paint Substances 0.000 description 1
- 238000002360 preparation method Methods 0.000 description 1
- 239000010409 thin film Substances 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/77—Retouching; Inpainting; Scratch removal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/261—Image signal generators with monoscopic-to-stereoscopic image conversion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/286—Image signal generators having separate monoscopic and stereoscopic modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/305—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0077—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0092—Image segmentation from stereoscopic image signals
Definitions
- the present invention relates to image processing and fixation and, more particularly, to a method for converting a two-dimensional image to multiple images and interphasing them back to a single image for fixation and display as a three-dimensional image.
- An example embodiment of the present invention includes a step of receiving a two-dimensional pixel image data file into a data storage, the two-dimensional image file having pixel data reflecting a plurality of objects, a step of retrieving the two-dimensional pixel image data file, and a step of displaying an image corresponding to the image file on a video display.
- the next step receives a segmentation command indicating an initial border for at least one of the objects.
- the next step receives a border refinement command having parameters for generating a final border for at least one of the objects, the parameters being from a group consisting of one or more of color, noise frequency, and edge softness.
- a following step of the first example embodiment segments the image file into a plurality of images based on the segmentation command and the border refinement command.
- in-painting of at least one of the plurality of images is performed, which fills a portion of the at least one image with pixels generated based on pixels within the object represented by the at least one image file.
- a subsequent step receives first depth data and second depth data for at least a first and a second of the plurality of images.
- a parallax shift step generates at least two phase-shifted images for each of the at least first and second of the plurality of images, the two phase-shifted images having a parallax shift with respect to one another and with respect to at least one other of the plurality of images, the parallax shift being based on the depth data for that image.
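The parallax-shift step above can be sketched as follows. This is an illustrative Python/NumPy sketch, not the patent's exact procedure: the function name, the linear depth-to-shift mapping, and the use of `np.roll` (which wraps pixels at the border; a production version would pad instead) are all simplifying assumptions.

```python
import numpy as np

def parallax_shift(layer, depth, n_views=4, max_shift=8):
    """Generate phase-shifted copies of one segmented image layer.

    layer: (H, W, C) array for a single object layer.
    depth: value in [0, 1]; larger means nearer to the viewer.
    Each view is shifted horizontally by an amount proportional to
    depth, so nearer layers move more between views than farther
    ones, producing the parallax between views.
    """
    views = []
    for v in range(n_views):
        # centre the shifts around zero across the set of views
        shift = int(round((v - (n_views - 1) / 2) * depth * max_shift))
        views.append(np.roll(layer, shift, axis=1))
    return views
```

A layer with depth 0 is identical in every view (zero shift), so far-background layers stay put while foreground layers slide between views.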
- the plurality of phase-shifted images are interphased into a single interphased image file for output and printing.
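Interphasing for a lenticular overlay is commonly done by column interleaving; the sketch below assumes that layout (the patent does not spell out the exact interleave), with one column per view under each lenslet.

```python
import numpy as np

def interphase(views):
    """Interleave N equally sized views column-by-column.

    Column x of the output is taken from view (x mod N), which is the
    layout a lenticular lens sheet expects: each lenslet covers N
    adjacent columns and directs a different one to each viewing angle.
    """
    n = len(views)
    out = np.empty_like(views[0])
    for i, view in enumerate(views):
        out[:, i::n] = view[:, i::n]
    return out
```

This matches the observation later in the document that all four input images remain individually visible in the master image: each survives intact in every Nth column.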
- a further aspect of a method according to this invention includes a further step of receiving a plurality of two-dimensional image files, and overlaying these into a single two-dimensional image file for the segmentation step.
- the present invention provides a software package controlled by optical principles for the conversion of a single digital input image to multi images and then interphasing back to a single master image.
- the new image created by the program has intensified information.
- This image, when viewed through optical material, has a large variety of visual uses and applications including but not limited to three-dimensional, flip, morph, zoom and action.
- the same interphasing segment can be used when multiple images are input, to produce three dimensions from multiple-image cameras; in the case of different or sequential images, flip, morph, zoom or action images are produced.
- FIG. 1 is a perspective view of a first example two-object image corresponding to a pixel image file;
- FIGS. 1.1, 1.2, 1.3 and 1.4 show an example first, second, third and fourth parallax-shifted image, corresponding to a parallax shifting step according to the present invention of the FIG. 1 image;
- FIG. 1.5 shows an interphased image based on interphasing according to the present invention of the FIGS. 1.1 through 1.4 images;
- FIG. 1.6 shows a close up of an example portion of the interphased image shown in FIG. 1.5;
- FIG. 2.0 shows the FIG. 1.5 example image fixed upon a media;
- FIG. 2.1 shows an example foreground object from within the example image shown in FIG. 1 as fixed within the FIG. 2.0 image;
- FIG. 2.2 shows an example background object from within the example image shown in FIG. 1 as fixed within the FIG. 2.0 image;
- FIG. 2.3 depicts an example thin-film multiple lens sheet for overlay on the FIG. 2.0 fixed image;
- FIG. 2.4 shows an example image as would be seen by a human observer of the FIG. 2.0 image seen through the FIG. 2.3 multiple lens overlay;
- FIG. 2.5 is a graphical representation of the eyes of the observer in FIG. 2.4;
- FIGS. 3.1 - 3.5 depict an example segmentation and border refinement step within the method of this invention, for segmenting the two-dimensional image into a plurality of objects;
- FIGS. 4.1 - 4.4 show a gap in the background object image corresponding to a plurality of parallax shifts in accordance with the method of this invention;
- FIGS. 5.1 - 5.4 depict an in-painting step performed by the method of this invention for filling the gaps depicted in FIGS. 4.1 - 4.4.
- One software package of the present invention effects conversion of a single digital input, two-dimensional image FIG. 1 to multi-images FIG. 1.1-1.4, and then interphases the images 1.1-1.4 back to a single master three-dimensional image FIG. 1.5 having intensified information.
- Image 1.5 can be output as a photograph through the use of special photographic printers, or printed directly on standard PC printers; in the master image, each of the four input images FIG. 1.1-1.4 can be clearly seen.
- when FIG. 2.0, displaying objects 2.1 and 2.2, is viewed through MOM FIG. 2.3, the resulting image FIG. 2.4 appears to the observer FIG. 2.5 as having three dimensions, action, flip or zoom.
- to create the image of FIG. 1.5, the following procedure is preferably followed.
- the initial two-dimensional image FIG. 1 is first subjected to a preparation phase, wherein the image undergoes segmenting involving the masking of each object FIG. 2.1-2.2 in the image by an edge detection method, e.g. "intelligent scissors" or "magnetic lasso", as illustrated in FIG. 3.1-3.4.
- the resulting object FIG. 3.5 is segmented onto a separate layer.
- Segmenting is the process by which the single 2D image is separated into layers based on depth in the scene. This allows the layers to be shifted by different amounts to simulate parallax.
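As a minimal sketch of this layer separation, assuming the per-object masks have already been obtained (the patent obtains them interactively through border refinement), in Python with NumPy; the function name and the RGBA-layer representation are illustrative choices, not the patent's:

```python
import numpy as np

def split_into_layers(image, masks):
    """Separate a 2D image into depth layers using per-object masks.

    image: (H, W, 3) uint8 RGB image.
    masks: list of (H, W) boolean arrays, one per object.
    Returns a list of (H, W, 4) RGBA layers; pixels outside each mask
    are fully transparent, so each layer can later be shifted
    independently to simulate parallax.
    """
    layers = []
    for mask in masks:
        layer = np.zeros(image.shape[:2] + (4,), dtype=np.uint8)
        layer[..., :3][mask] = image[mask]   # copy object pixels
        layer[..., 3][mask] = 255            # opaque only on the object
        layers.append(layer)
    return layers
```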
- the user first draws a rough border around the object to be separated FIG. 3.1-3.4.
- by adjusting parameters, the user can dynamically refine the area without having to paint by hand. These adjustments can be applied to the whole image or just a particular region to allow more accurate segmentation.
- the parameters are color (hue, saturation, and brightness), noise frequency and edge softness.
- once the parameters are adjusted, the user can "bake" or apply them to the rough border, which results in a new, more accurate border; this border can be further refined using the adjustable parameters or through hand painting, then shifted in perspective and moved, resulting in images FIG. 1.1-1.4.
- the images undergo "in-painting" FIG. 4.1-4.4 (involving the painting and removal of holes that were created during the segmentation step).
- the algorithm calculates the pixels that will require filling. This is done through intersection and subtraction calculations on the masks or alpha channels of the layers above the current layer.
- the software then uses a scanline algorithm, only filling one pixel per intersection per scanline until the area is filled.
- the color of each pixel is based on a weighted average of any surrounding pixels that already have color either from the original image or from earlier passes of this algorithm.
- the weight of each pixel is adjusted after each pass so that it relies evenly on top/bottom and right/left pixels for the first passes, and relies more heavily on the right/left pixels as the distance to original pixels increases.
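The weighted-average fill described in the steps above might be approximated as below. This is a simplified pass-based sketch, not the patented algorithm: it works on one channel, uses only 4-neighbours, and stands in for the patent's scanline bookkeeping with a plain per-pass sweep in which the vertical weight decays each pass, mimicking the growing reliance on left/right pixels with distance.

```python
import numpy as np

def inpaint(image, hole):
    """Fill masked pixels from the surrounding colour, pass by pass.

    image: (H, W) float array (a single channel, for brevity).
    hole:  (H, W) boolean mask of pixels needing fill.
    Each pass fills every hole pixel that touches an already-known
    pixel with a weighted average of its known 4-neighbours.
    """
    img = image.astype(float).copy()
    known = ~hole
    h, w = img.shape
    for npass in range(1, h * w + 1):     # bounded; normally exits early
        if known.all():
            break
        v_weight = 1.0 / npass            # vertical influence decays
        new_known = known.copy()
        progressed = False
        for y in range(h):
            for x in range(w):
                if known[y, x]:
                    continue
                acc = wsum = 0.0
                for dy, dx, wt in ((0, -1, 1.0), (0, 1, 1.0),
                                   (-1, 0, v_weight), (1, 0, v_weight)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and known[ny, nx]:
                        acc += wt * img[ny, nx]
                        wsum += wt
                if wsum > 0:
                    img[y, x] = acc / wsum
                    new_known[y, x] = True
                    progressed = True
        known = new_known
        if not progressed:                # nothing fillable; give up
            break
    return img
```

Because later passes weight horizontal neighbours more heavily, long thin gaps (the typical shape of a parallax-shift hole) are bridged predominantly from their left and right edges.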
- the user selects the depth by entering the value into the GUI edit field provided.
- the software assigns layer order by taking the layers in the order in which they were selected.
- the software then accurately generates the frames necessary for the interphasing step. See FIG. 1.1-1.4.
- the same interphasing can be used when multiple images are input, to produce three dimensions from multiple-image cameras; in the case of different or sequential images, flip, morph, zoom or action images are produced.
- the software package of this invention is controlled by the same optical principles used in three-dimensional photography. Such principles are described, for example, in US Patent Nos. 3,852,787; 3,895,867; and 3,960,563; each of these references being hereby incorporated by reference herein in its entirety.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Image Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP01970887A EP1323135A1 (en) | 2000-09-14 | 2001-09-14 | Method for automated two-dimensional and three-dimensional conversion |
JP2002531371A JP2004510272A (en) | 2000-09-14 | 2001-09-14 | Automatic 2D and 3D conversion method |
AU2001290838A AU2001290838A1 (en) | 2000-09-14 | 2001-09-14 | Method for automated two-dimensional and three-dimensional conversion |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US23241000P | 2000-09-14 | 2000-09-14 | |
US60/232,410 | 2000-09-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2002027667A1 true WO2002027667A1 (en) | 2002-04-04 |
Family
ID=22872982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2001/028563 WO2002027667A1 (en) | 2000-09-14 | 2001-09-14 | Method for automated two-dimensional and three-dimensional conversion |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP1323135A1 (en) |
JP (1) | JP2004510272A (en) |
CN (1) | CN1524249A (en) |
AU (1) | AU2001290838A1 (en) |
RU (1) | RU2003110175A (en) |
WO (1) | WO2002027667A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012014009A1 (en) * | 2010-07-26 | 2012-02-02 | City University Of Hong Kong | Method for generating multi-view images from single image |
CN102469318A (en) * | 2010-11-04 | 2012-05-23 | 深圳Tcl新技术有限公司 | Method for converting two-dimensional image into three-dimensional image |
TWI463434B (en) * | 2011-01-28 | 2014-12-01 | Chunghwa Picture Tubes Ltd | Image processing method for forming three-dimensional image from two-dimensional image |
US9305398B2 (en) | 2010-10-08 | 2016-04-05 | City University Of Hong Kong | Methods for creating and displaying two and three dimensional images on a digital canvas |
US9432651B2 (en) | 2008-07-24 | 2016-08-30 | Koninklijke Philips N.V. | Versatile 3-D picture format |
US10133283B2 (en) | 2012-07-26 | 2018-11-20 | Honeywell International Inc. | HVAC controller with wireless network based occupancy detection and control |
US10928087B2 (en) | 2012-07-26 | 2021-02-23 | Ademco Inc. | Method of associating an HVAC controller with an external web service |
EP2306744B1 (en) * | 2009-09-30 | 2021-05-19 | Disney Enterprises, Inc. | Method and system for utilizing pre-existing image layers of a two-dimensional image to create a stereoscopic image |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7840070B2 (en) * | 2004-11-16 | 2010-11-23 | Koninklijke Philips Electronics N.V. | Rendering images based on image segmentation |
JP4463215B2 (en) * | 2006-01-30 | 2010-05-19 | 日本電気株式会社 | Three-dimensional processing apparatus and three-dimensional information terminal |
JP5011316B2 (en) * | 2006-02-27 | 2012-08-29 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Rendering the output image |
KR100753536B1 (en) * | 2006-05-04 | 2007-08-30 | 주식회사 아이너스기술 | Method for detecting 2 dimension sketch data of source model data for 3 dimension reverse modeling |
US7719531B2 (en) * | 2006-05-05 | 2010-05-18 | Microsoft Corporation | Editing text within a three-dimensional graphic |
CN101206767B (en) * | 2006-12-22 | 2010-12-15 | 财团法人资讯工业策进会 | Image conversion device and method for transforming flat picture to three-dimensional effect image |
US7932904B2 (en) * | 2007-06-01 | 2011-04-26 | Branets Larisa V | Generation of constrained voronoi grid in a plane |
CN101350016B (en) * | 2007-07-20 | 2010-11-24 | 富士通株式会社 | Device and method for searching three-dimensional model |
CN101271578B (en) * | 2008-04-10 | 2010-06-02 | 清华大学 | Depth sequence generation method of technology for converting plane video into stereo video |
CN101593349B (en) * | 2009-06-26 | 2012-06-13 | 福州华映视讯有限公司 | Method for converting two-dimensional image into three-dimensional image |
CN101860767B (en) * | 2010-05-18 | 2011-10-19 | 南京大学 | Lattice-based three-dimensional moving image display method and realization device thereof |
CN102469323B (en) * | 2010-11-18 | 2014-02-19 | 深圳Tcl新技术有限公司 | Method for converting 2D (Two Dimensional) image to 3D (Three Dimensional) image |
CN105117021A (en) * | 2015-09-24 | 2015-12-02 | 深圳东方酷音信息技术有限公司 | Virtual reality content generation method and playing device |
CN105631938B (en) * | 2015-12-29 | 2019-12-24 | 联想(北京)有限公司 | Image processing method and electronic equipment |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4925294A (en) * | 1986-12-17 | 1990-05-15 | Geshwind David M | Method to convert two dimensional motion pictures for three-dimensional systems |
-
2001
- 2001-09-14 WO PCT/US2001/028563 patent/WO2002027667A1/en not_active Application Discontinuation
- 2001-09-14 JP JP2002531371A patent/JP2004510272A/en active Pending
- 2001-09-14 CN CNA018188818A patent/CN1524249A/en active Pending
- 2001-09-14 EP EP01970887A patent/EP1323135A1/en not_active Withdrawn
- 2001-09-14 AU AU2001290838A patent/AU2001290838A1/en not_active Abandoned
- 2001-09-14 RU RU2003110175/09A patent/RU2003110175A/en not_active Application Discontinuation
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4925294A (en) * | 1986-12-17 | 1990-05-15 | Geshwind David M | Method to convert two dimensional motion pictures for three-dimensional systems |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9432651B2 (en) | 2008-07-24 | 2016-08-30 | Koninklijke Philips N.V. | Versatile 3-D picture format |
US10567728B2 (en) | 2008-07-24 | 2020-02-18 | Koninklijke Philips N.V. | Versatile 3-D picture format |
EP2306744B1 (en) * | 2009-09-30 | 2021-05-19 | Disney Enterprises, Inc. | Method and system for utilizing pre-existing image layers of a two-dimensional image to create a stereoscopic image |
WO2012014009A1 (en) * | 2010-07-26 | 2012-02-02 | City University Of Hong Kong | Method for generating multi-view images from single image |
US9305398B2 (en) | 2010-10-08 | 2016-04-05 | City University Of Hong Kong | Methods for creating and displaying two and three dimensional images on a digital canvas |
CN102469318A (en) * | 2010-11-04 | 2012-05-23 | 深圳Tcl新技术有限公司 | Method for converting two-dimensional image into three-dimensional image |
TWI463434B (en) * | 2011-01-28 | 2014-12-01 | Chunghwa Picture Tubes Ltd | Image processing method for forming three-dimensional image from two-dimensional image |
US10133283B2 (en) | 2012-07-26 | 2018-11-20 | Honeywell International Inc. | HVAC controller with wireless network based occupancy detection and control |
US10613555B2 (en) | 2012-07-26 | 2020-04-07 | Ademco Inc. | HVAC controller with wireless network based occupancy detection and control |
US10928087B2 (en) | 2012-07-26 | 2021-02-23 | Ademco Inc. | Method of associating an HVAC controller with an external web service |
US11493224B2 (en) | 2012-07-26 | 2022-11-08 | Ademco Inc. | Method of associating an HVAC controller with an external web service |
Also Published As
Publication number | Publication date |
---|---|
EP1323135A1 (en) | 2003-07-02 |
JP2004510272A (en) | 2004-04-02 |
CN1524249A (en) | 2004-08-25 |
AU2001290838A1 (en) | 2002-04-08 |
RU2003110175A (en) | 2004-08-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1323135A1 (en) | Method for automated two-dimensional and three-dimensional conversion | |
US4925294A (en) | Method to convert two dimensional motion pictures for three-dimensional systems | |
JP7283513B2 (en) | VIDEO DISPLAY DEVICE, VIDEO PROJECTION DEVICE, THEIR METHOD AND PROGRAM | |
JP4644669B2 (en) | Multi-view image generation | |
US5311329A (en) | Digital filtering for lenticular printing | |
KR100445619B1 (en) | Device and method for converting two-dimensional video into three-dimensional video | |
CN101635859B (en) | Method and device for converting plane video to three-dimensional video | |
JP2010154422A (en) | Image processor | |
EP2323416A2 (en) | Stereoscopic editing for video production, post-production and display adaptation | |
US20040032488A1 (en) | Image conversion and encoding techniques | |
CN104112275B (en) | A kind of method and device for generating viewpoint | |
EP2225725A2 (en) | Segmentation of image data | |
JP2009282979A (en) | Image processor and image processing method | |
EP1425707A2 (en) | Image segmentation by means of temporal parallax difference induction | |
JP2000500598A (en) | Three-dimensional drawing system and method | |
US11398007B2 (en) | Video generation device, video generation method, program, and data structure | |
KR20110134142A (en) | Method and apparatus for transforming stereoscopic image by using depth map information | |
JP2002123842A (en) | Device for generating stereoscopic image, and medium for recording information | |
JPH11509998A (en) | Method for converting an image into a stereoscopic image, and an image and a series of images obtained by the method | |
US6828973B2 (en) | Method and system for 3-D object modeling | |
JP3091644B2 (en) | 3D image conversion method for 2D images | |
JP2006211383A (en) | Stereoscopic image processing apparatus, stereoscopic image display apparatus, and stereoscopic image generating method | |
WO2012096065A1 (en) | Parallax image display device and parallax image display method | |
KR101626679B1 (en) | Method for generating stereoscopic image from 2D image and for medium recording the same | |
Tran et al. | On consistent inter-view synthesis for autostereoscopic displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2002531371 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 305/MUMNP/2003 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2001970887 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2003110175 Country of ref document: RU Kind code of ref document: A Ref country code: RU Ref document number: RU A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 018188818 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2001970887 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2001970887 Country of ref document: EP |