WO1990014634A1 - Digital data processing - Google Patents

Digital data processing

Info

Publication number
WO1990014634A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
edge
digital data
corner
positions
Prior art date
Application number
PCT/GB1989/000523
Other languages
French (fr)
Inventor
Christopher George Harris
Original Assignee
Plessey Overseas Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to GB8811223A priority Critical patent/GB2218507B/en
Priority claimed from GB8811223A external-priority patent/GB2218507B/en
Application filed by Plessey Overseas Limited filed Critical Plessey Overseas Limited
Priority to JP1506161A priority patent/JPH03502261A/en
Priority to EP89906420A priority patent/EP0449827A1/en
Priority to PCT/GB1989/000523 priority patent/WO1990014634A1/en
Priority to CA000607884A priority patent/CA1333424C/en
Priority claimed from CA000607884A external-priority patent/CA1333424C/en
Publication of WO1990014634A1 publication Critical patent/WO1990014634A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/1961 Movement detection not involving frame subtraction, e.g. motion detection on the basis of luminance changes in the image
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding

Abstract

A method of processing digital data in a digitised image matrix in order to detect the positions of any sharp intensity variations present in adjacent ones of the pixels representing said image, the method comprising the steps of comparing the intensity of each primary pixel with those of the secondary pixels which surround it, calculating whether said primary pixel can be classified as having a constant, an edge-like or a corner-like value, and collating this information to give positions of both edge and corner features which are present in said image matrix. The processing method can be used, for example, in detecting images, for instance in a burglar alarm system or for the cytological scanning of cell material.

Description

DIGITAL DATA PROCESSING
This invention relates to digital data processing. It relates particularly to digital data information which is present in a stored digitised image matrix and to a method of and means for detecting significant features in the stored information.
When it is required to view a particular digital image automatically, it is usually necessary to identify corner and edge features since these correspond to the outlines of objects and to prominent surface markings which are present in the stored image. This operation gives a quantity of low-level information which can be used for further work, but it is most important that the data extracted at this early stage should be of a reliable quality.
A description of the edge features present in an image is only helpful for certain parts of the image. For other parts of the image, the so-called image corners, corresponding to edge junctions and sharp bends in edges, and to isolated point-like features, are important. Where the kind of feature to be expected in the image is not restricted in any way, for example in a depiction of a natural scene, describing the image in terms of edges alone or of corners alone cannot capture the whole of the information present.
Some data processing methods have been proposed which will extract either edge or corner information from an image matrix. One edge extractor is the Canny edge filter. However, this device can suffer from an inability to form edge junctions and it may be necessary to perform an additional heuristic processing operation in order to overcome this problem. In addition, in textured regions of the image, edges can sometimes form a poor and unreliable descriptive feature and this therefore may lead one to prefer a corner description.
Some of the best available corner detectors have been proposed by Moravec, Nagel, Beaudet and Kitchen & Rosenfeld. These corner detectors are, however, not ideal since they can be temporally inconsistent in their responses (that is, they are unreliable), and they can respond too readily to the presence of small imperfections in strong image edges.
The present invention was devised in an attempt to overcome some of the disadvantages of both edge and corner feature detectors.
According to the invention, there is provided a method of processing digital data in a digitised image matrix in order to detect the positions of any sharp intensity variations present in adjacent ones of the pixels representing said image, the method comprising the steps of comparing the intensity of each primary pixel with those of the secondary pixels which surround it, calculating whether said primary pixel can be classified as having a constant, an edge-like or a corner-like value, and collating this information to give positions of both edge and corner features which are present in said image matrix.
Conveniently, before said comparison stage, the original image matrix may be processed to form an intermediate simplified image which excludes unnecessary detail. This intermediate image may be a quaternary image. In one embodiment, the method may include the further step of taking a second digitised image matrix after a given time interval has elapsed, and comparing the processed digital area of the said second image with that of a first image, such that any change in the positions of the selected features of the images will be detected.
By way of example, a particular embodiment of the invention will now be described with reference to the accompanying drawing, the single Figure of which shows a block diagram of the main components of an image motion surveillance detector.
The detector device of the invention will first be described with reference to the main components shown in the Figure and the operation of the feature extraction stage will be explained later.
The surveillance detector comprises a video camera 1 which provides information for a digital data processor 2 and this acts to supply output signals on an output line 3. The processor 2 receives input signals from the video camera and these are delivered to an image digitisation stage 4. After this stage, the input information is passed into a frame store 6 and the pixels in this store are then processed in turn by a corner/edge feature extraction circuit 7.
The data obtained from the feature extraction circuit 7 is then applied to a feature matching block 8. In parallel with this, the data is also fed into a delay block 9 which will hold this data for a predetermined time interval before delivering it to the block 8.
At the feature matching block 8, the extracted features of the data stream are compared with the set of features which is compiled after the predetermined time interval. The information obtained is thus able to show any movement of the selected extracted feature which has taken place during the specified time interval. From this information it is possible to calculate the location and image velocity of a moving body which is viewed by the video camera 1.
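By way of illustration only, the pairing performed at block 8 might be sketched as the following nearest-neighbour match between the delayed and the current feature sets. The function name, the max_distance threshold, the delay parameter dt and the use of corner positions alone are assumptions made for this example, not details taken from the patent.

import numpy as np

def match_features(prev_pts, curr_pts, max_distance=10.0, dt=1.0):
    # prev_pts, curr_pts: (N, 2) and (M, 2) arrays of (x, y) feature positions
    # taken from the delayed and the current frame respectively.
    # max_distance (pixels) and dt (the delay interval) are assumed values.
    matches = []
    if len(prev_pts) == 0 or len(curr_pts) == 0:
        return matches
    curr = np.asarray(curr_pts, dtype=float)
    for p in np.asarray(prev_pts, dtype=float):
        d = np.linalg.norm(curr - p, axis=1)      # distance to every current feature
        j = int(np.argmin(d))
        if d[j] <= max_distance:                  # accept only a nearby candidate
            velocity = (curr[j] - p) / dt         # image velocity over the delay
            matches.append((tuple(p), tuple(curr[j]), tuple(velocity)))
    return matches

Each returned triple gives a feature's old position, new position and image velocity, which is the information the text above says the detector derives for a moving body.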
In order to detect the presence of a sharp intensity variation in adjacent ones of the pixels representing the image matrix, it is first necessary to compare the intensity of light at one primary pixel with the intensities of each of the secondary pixels which surround it. There may be at least twenty-four of the secondary pixels, or possibly several hundred.
A local auto-correlation function for each of the primary pixels is thus required and this will describe whether the local patch of image intensities, represented by the possible total of twenty-five pixels, is approximately constant in value, is edge-like, or is corner-like. Corners are indicated by the auto-correlation function being sharply peaked, and edges by the auto-correlation function being ridge-shaped. The explicit auto-correlation function may not need to be calculated; only a determination of its second-order expansion about the origin may be necessary. A mathematical specification of these requirements will now be given.
Let the (possibly pre-smoothed) image intensities be represented by the array of values I(x, y), where x and y are the Cartesian image coordinates. To start with, the two first gradients are calculated, thus
[Equation figure: the definitions of the two first gradients X and Y, i.e. the intensity gradients of I in the x and y directions respectively.]
Next, the smoothed quadratic gradients are calculated:

A = X² * W
B = Y² * W
C = (X.Y) * W

where * represents convolution, and W is a smoothing filter, an explicit example being

    ( 1  2  1 )
W = ( 2  4  2 )
    ( 1  2  1 )
Finally, the corner/edge response, R, is calculated:

R = (A.B - C²) - k(A+B)²

a value of 0.1 being typical for the parameter k.
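Although the patent text does not state it this way, A, B and C are the elements of the 2x2 matrix of smoothed quadratic gradients, and R is its determinant penalised by the squared trace. This restatement follows directly from the definitions above and is added here only as an aid for readers who know the measure in matrix notation:

    ( A  C )
M = ( C  B )

R = det(M) - k(trace(M))² = (A.B - C²) - k(A+B)²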
The presence of a corner region is indicated by the value of R being large and positive. An individual pixel is deemed to be a corner if its response, R, is larger than the responses of each of its eight neighbouring pixels. Similarly, an edge region is indicated by R being large and negative. A pixel is deemed to be an edge pixel if its response, R, is smaller (more negative) than those of its two neighbours in either the x or the y direction, depending on which of the first gradients, X and Y, is larger in magnitude.
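As a concrete illustration of the calculation just described, the following sketch computes X, Y, A, B, C and R and applies the corner/edge selection rules. It assumes NumPy/SciPy, a simple central-difference mask for the first gradients (the patent defines them only in an equation figure), an illustrative threshold on the size of R, and one plausible reading of the x/y neighbour comparison; it should be read as a sketch of the method under those assumptions, not as the patented implementation.

import numpy as np
from scipy.ndimage import convolve

def corner_edge_response(image, k=0.1):
    # First gradients X and Y; the (-1, 0, 1) mask is an assumed choice.
    I = np.asarray(image, dtype=float)
    gx = np.array([[-1.0, 0.0, 1.0]])
    X = convolve(I, gx)
    Y = convolve(I, gx.T)
    # Smoothing filter W as given in the description.
    W = np.array([[1.0, 2.0, 1.0],
                  [2.0, 4.0, 2.0],
                  [1.0, 2.0, 1.0]])
    # Smoothed quadratic gradients.
    A = convolve(X * X, W)
    B = convolve(Y * Y, W)
    C = convolve(X * Y, W)
    # Corner/edge response.
    R = (A * B - C * C) - k * (A + B) ** 2
    return X, Y, A, B, C, R

def classify(X, Y, R, threshold=0.0):
    # The threshold on |R| is illustrative; the patent only asks for "large" values.
    corners = np.zeros(R.shape, dtype=bool)
    edges = np.zeros(R.shape, dtype=bool)
    h, w = R.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            r = R[y, x]
            if r > threshold:
                nb = R[y - 1:y + 2, x - 1:x + 2].copy()
                nb[1, 1] = -np.inf
                if r > nb.max():                  # larger than all eight neighbours
                    corners[y, x] = True
            elif r < -threshold:
                # Compare along the direction of the larger first gradient;
                # the exact x/y mapping is left implicit in the patent text.
                if abs(X[y, x]) >= abs(Y[y, x]):
                    pair = (R[y, x - 1], R[y, x + 1])
                else:
                    pair = (R[y - 1, x], R[y + 1, x])
                if r < min(pair):                 # more negative than both neighbours
                    edges[y, x] = True
    return corners, edges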
Standard edge clean-up algorithms (similar to those used in the Canny edge filter) are applied to the edge image, to remove short lines and spurs, and to complete breaks in edges. The result is a quaternary (that is, a four-state) image, with each pixel classified as a corner, a corner neighbourhood, an edge or background. The edges are thin (that is, one pixel in width) and run between the corner regions. The problem of junction formation is overcome by the presence of corner regions surrounding the corners, at which regions the edges terminate. The problem of edge inconsistency in textured image regions is overcome by representing such regions with corners but few edges.
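To make the four-state representation concrete, here is one possible way to assemble it from the corner and edge masks produced above. The numeric labels and the use of a one-pixel dilation to stand in for the corner neighbourhoods are assumptions made for the example; the patent does not fix a particular encoding.

import numpy as np
from scipy.ndimage import binary_dilation

BACKGROUND, EDGE, CORNER_NEIGHBOURHOOD, CORNER = 0, 1, 2, 3   # assumed encoding

def quaternary_image(corners, edges, radius=1):
    # Combine the boolean corner and edge masks into one four-state image.
    labels = np.full(corners.shape, BACKGROUND, dtype=np.uint8)
    labels[edges] = EDGE
    size = 2 * radius + 1
    near = binary_dilation(corners, structure=np.ones((size, size), dtype=bool))
    labels[near & ~corners] = CORNER_NEIGHBOURHOOD   # ring around each corner
    labels[corners] = CORNER
    return labels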
One way of effecting the necessary calculations would be by processing the data by means of an off-the-shelf microcomputer, with the algorithm being implemented in either high or low-level software. The input to the microcomputer would be digitised images, and the output would be the locations and image velocities of the required feature combinations. However, due to the number of calculations involved, use of a microcomputer would be relatively slow, and this slowness would be undesirable for time-critical applications.
For such time-critical applications, special-purpose hardware could be constructed to calculate the intermediate images X, Y, A, B, C and R, and to select appropriate local maxima or minima values of the corner/edge response R as corner or edge pixels respectively. This proposed special-purpose hardware would be pipelined, and make use of convolution chips and other dedicated VLSI circuits. This step would be followed by stages for edge clean-up and feature tracking on one or more microcomputers.
"Where the scene viewed by the video camera can be expected to change with the passage of time, the invention can be used to process a second digitised image matrix after a given time interval has elapsed. The processed data of the said second image can then be compared with that of a first image so that any change in the positions of the selected features of the images will be detected.
Use of the digital data processing method has been proposed in the construction of passive equipment for surveying a scene, such as for a burglar alarm system. With imagery acquired from a static surveying camera, any consistent motion of edge and corner features will indicate a moving target whilst inconsistent motion may be due to image noise, wind-blown vegetation etc. In a traffic control application, the method can be used to detect the presence of moving vehicles. Since the digitised image could be produced from infra-red radiation rather than visible light, the system could still work effectively at night-time or in bad weather conditions.
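A minimal sketch of the consistency test suggested here might look as follows; the median-based agreement test and the tolerance, minimum feature count and minimum speed parameters are all invented for the illustration rather than taken from the patent.

import numpy as np

def moving_target(velocities, tolerance=1.0, min_features=5, min_speed=0.5):
    # velocities: (N, 2) image velocities of matched edge/corner features.
    # Consistent motion of enough features suggests a moving target;
    # scattered disagreement (image noise, wind-blown vegetation) does not.
    if len(velocities) < min_features:
        return False
    v = np.asarray(velocities, dtype=float)
    centre = np.median(v, axis=0)                       # robust common motion estimate
    agree = np.linalg.norm(v - centre, axis=1) < tolerance
    return int(agree.sum()) >= min_features and np.linalg.norm(centre) > min_speed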
Use of the data processing method for the automatic scanning of cytological cell material has also been proposed.
The foregoing description of an embodiment of the invention has been given by way of example only and a number of modifications may be made without departing from the scope of the invention as defined in the appended claims. For instance, although the invention has been described as a method of processing the data in a single image matrix, this would be equally applicable to processing material from two or more image matrices, so that a three-dimensional effect could be obtained.

Claims

1. A method of processing digital data in a digitised image matrix in order to detect the positions of any sharp intensity variations present in adjacent ones of the pixels representing said image, the method comprising the steps of comparing the intensity of each primary pixel with those of the secondary pixels which surround it, calculating whether said primary pixel can be classified as having a constant, an edge-like or a corner-like value, and collating this information to give positions of both edge and corner features which are present in said image matrix.
2. A method as claimed in Claim 1, in which before said comparison stage, the original image matrix is processed to form an intermediate simplified image from which unwanted detail has been excluded.
3. A method as claimed in Claim 2, in which the said intermediate image is a quaternary image.
4. A method as claimed in any one of Claims 1 to 3, including the further step of taking a second digitised image matrix after a given time interval has elapsed, and comparing the processed digital data of the said second image with that of a first image, such that any change in the positions of the selected features of the images will be detected.
5. A method of processing digital data in a digitised image matrix, substantially as hereinbefore described.
PCT/GB1989/000523 1988-05-12 1989-05-15 Digital data processing WO1990014634A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
GB8811223A GB2218507B (en) 1989-05-15 1988-05-12 Digital data processing
JP1506161A JPH03502261A (en) 1988-05-12 1989-05-15 digital data processing
EP89906420A EP0449827A1 (en) 1988-05-12 1989-05-15 Digital data processing
PCT/GB1989/000523 WO1990014634A1 (en) 1988-05-12 1989-05-15 Digital data processing
CA000607884A CA1333424C (en) 1988-05-12 1989-08-09 Digital data processing

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GB8811223A GB2218507B (en) 1989-05-15 1988-05-12 Digital data processing
PCT/GB1989/000523 WO1990014634A1 (en) 1988-05-12 1989-05-15 Digital data processing
CA000607884A CA1333424C (en) 1988-05-12 1989-08-09 Digital data processing

Publications (1)

Publication Number Publication Date
WO1990014634A1 true WO1990014634A1 (en) 1990-11-29

Family

ID=27168408

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1989/000523 WO1990014634A1 (en) 1988-05-12 1989-05-15 Digital data processing

Country Status (2)

Country Link
EP (1) EP0449827A1 (en)
WO (1) WO1990014634A1 (en)

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
IBM Technical Disclosure Bulletin, Vol. 15, No. 10, March 1973 (New York, US) T. KANEKO et al.: "Detecting the Boundary of a Moving Object on a Motion Picture", pages 3247-3251 *
Proceedings of the International Conference on Pattern Recognition, November 1982, Munich, IEEE (New York, US) L.S. DRESCHLER et al.: "On the Selection of Critical Points and Local Curvature Extrema of Region Boundaries for Interframe Matching", pages 542-544 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0819244A1 (en) * 1995-04-04 1998-01-21 Bacharach, Inc. Apparatus for imaging gas
EP0819244A4 (en) * 1995-04-04 1999-04-14 Bacharach Inc Apparatus for imaging gas
US8144946B2 (en) 2007-01-30 2012-03-27 Continental Automotive France Method of identifying symbolic points on an image of a person's face

Also Published As

Publication number Publication date
EP0449827A1 (en) 1991-10-09

Similar Documents

Publication Publication Date Title
EP0506327B1 (en) A system and method for ranking and extracting salient contours for target recognition
US9196043B2 (en) Image processing apparatus and method
US7430303B2 (en) Target detection method and system
US5890808A (en) Image processing method and apparatus for correlating a test image with a template
JP2863818B2 (en) Moving image change point detection method
GB2218507A (en) Digital data processing
US6687419B1 (en) Automatic image montage system
US6226388B1 (en) Method and apparatus for object tracking for automatic controls in video devices
US7397970B2 (en) Automatic scene correlation and identification
US5081689A (en) Apparatus and method for extracting edges and lines
US7181047B2 (en) Methods and apparatus for identifying and localizing an area of relative movement in a scene
EP1105842B1 (en) Image processing apparatus
EP3726421A2 (en) Recognition method and apparatus for false detection of an abandoned object and image processing device
US20040208364A1 (en) System and method for image segmentation
US20200394802A1 (en) Real-time object detection method for multiple camera images using frame segmentation and intelligent detection pool
WO1990014634A1 (en) Digital data processing
KR940003654B1 (en) Method of processing digital data
JPH08161474A (en) Method for correcting registration between images by different kinds of sensors
Köhn et al. Automatic Building Extraction and Roof Reconstruction in 3k Imagery Based on Line Segments
CA1333424C (en) Digital data processing
Gates et al. A real-time line extraction algorithm
JPH09102040A (en) Picture recognition device by edge
JPH0546734A (en) Method for recognizing pattern
CN115830304A (en) Object detection system
Mostafavi et al. High speed implementation of linear feature extraction algorithms

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase; Ref document number: 1989906420; Country of ref document: EP
AK Designated states; Kind code of ref document: A1; Designated state(s): JP KP US
AL Designated countries for regional patents; Kind code of ref document: A1; Designated state(s): AT BE CH DE FR GB IT LU NL SE
WWP Wipo information: published in national office; Ref document number: 1989906420; Country of ref document: EP
WWW Wipo information: withdrawn in national office; Ref document number: 1989906420; Country of ref document: EP