EP0293397A1 - Gestaltermittlung - Google Patents

Gestaltermittlung (Shape Detection)

Info

Publication number
EP0293397A1
Authority
EP
European Patent Office
Prior art keywords
data
sinogram
shape
value
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP87906194A
Other languages
English (en)
French (fr)
Inventor
Violet Frances Leavers
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB08622497A external-priority patent/GB2203877A/en
Application filed by Individual filed Critical Individual
Publication of EP0293397A1 publication Critical patent/EP0293397A1/de
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/48Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation

Definitions

  • This invention relates to a method of and apparatus for shape detection of digital data.
  • US patent specification No 3069654 describes a method of finding the parameters of straight lines in an image.
  • Each point is mapped into a gradient-intercept (m,c) parametric transform space to produce lines representing all possible gradients and intercepts of lines passing through that point.
  • A maximum, or intersection, in the transform space is detected and determined to represent a line in the image space, since an intersection at (mI,cI) of a plurality of lines in the transform space denotes a corresponding plurality of colinear points in the image space, all lying on the line having the equation y = mI x + cI.
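  • For readers unfamiliar with this prior-art step, the following Python sketch (not part of the patent; the grid ranges and resolutions are arbitrary illustrative choices) accumulates votes for each point over a quantised (m,c) space and reads the dominant maximum as a detected line.

    # Prior-art gradient-intercept Hough transform: every point (x, y) votes for all
    # parameter pairs (m, c) with c = y - m*x; colinear points reinforce one cell.
    import numpy as np

    def mc_hough(points, m_values, c_bins, c_range):
        acc = np.zeros((len(m_values), c_bins), dtype=int)
        c_min, c_max = c_range
        for x, y in points:
            for i, m in enumerate(m_values):
                c = y - m * x                                   # all lines through (x, y)
                j = int(round((c - c_min) / (c_max - c_min) * (c_bins - 1)))
                if 0 <= j < c_bins:
                    acc[i, j] += 1
        return acc

    # Three colinear points on y = 2x + 1 produce a dominant maximum near (m, c) = (2, 1).
    pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]
    acc = mc_hough(pts, m_values=np.linspace(-4.0, 4.0, 81), c_bins=81, c_range=(-10.0, 10.0))
    i, j = np.unravel_index(np.argmax(acc), acc.shape)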
  • Each point in the image space is mapped into a sine curve in the sinogram representing r and θ for all possible lines in the image space passing through the point, where r is the algebraic distance from the origin to the line along a normal to the line and θ is the angle of the normal to the x-axis.
  • The limits of θ are -π ≤ θ ≤ π.
  • The transform plane has the topology of a Möbius strip when the range covers an interval of π. In the case of a bounded square image space of size L x L, the limits of r are correspondingly bounded.
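  • As a supporting note (not from the patent text), the Möbius topology follows from the fact that shifting θ by π negates r for every mapped sine curve; using the normal parametrisation quoted later for the sine curve Ci:

    % Each image point (x_i, y_i) maps to the sinusoid
    %   r(\theta) = (x_i^2 + y_i^2)^{1/2}\,\cos\!\bigl(\theta + \tan^{-1}(y_i/x_i)\bigr),
    % and shifting \theta by \pi negates r:
    \[
      r(\theta + \pi) \;=\; -\,r(\theta)
      \quad\Longrightarrow\quad
      (\theta,\, r) \;\sim\; (\theta + \pi,\, -r).
    \]
    % Identifying these pairs glues the two \theta-edges of a strip of width \pi with a
    % flip in r, which is exactly the topology of a Möbius strip.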
  • The present method is concerned with shape in two dimensions.
  • Shape is taken to be the relative spatial arrangement of a closed set of connex points in the (x,y) plane, the principal boundary of the set dividing the plane into two domains, one open, one closed. Both domains share, as common boundary, the same simple, closed curve 32. See Fig.1.
  • Secondary boundaries 34 may exist which exclude closed connex sets of points from the domain of the shape, see Fig.2. Where a shape is composed of a multiplicity of boundaries, the sets of points of such boundaries are disjoint one from another.
  • the boundary points of the domain of the shape may be partitioned into closed sets called shape-primitives.
  • One aspect of the present invention is concerned with the automatic partitioning of the boundary points of the edge image of an object into its constituent shape-primitives, thus enabling a symbolic description of the object to be deduced and stored in a computer's memory for the purposes of automatic recognition and location of objects.
  • Figs 3 to 10 show the transforms of binary images of curved shape-primitives. Any arc of a conic section may also be used as a shape-primitive. It is convenient to use conic sections or arcs of conic sections as shape-primitives, but other formulations are possible.
  • a and b are parameters describing the dilations which are required to produce a particular shape-primitive from a unit circle
  • xo,yo are the parameters describing the translation of an object-centered co-ordinate system with respect to the viewer-centered co-ordinate system
  • the remaining parameter is the angle through which the former co-ordinate system has been rotated.
  • Figure 11 shows how the different shape primitives are related; a transformation matrix is used to dilate the image space in which a unit circle is defined.
  • the hyperbola, parabola and straight line may also be obtained as limiting cases of particular families of ellipses.
  • the first partial derivatives, with respect to x and y, of f(x,y) must exist and be finite. This ensures that the tangents to the curve are continuously turning;
  • the shape-primitive must be of finite length, not intersect itself, and have no branch points;
  • An endpoint of a shape-primitive may be common to only one other shape-primitive. Where a point is common to two shape-primitives that point is an endpoint of both shape-primitives. Thus the only intersection of the curves is the plural membership of the endpoints of the curves. This ensures the closure of the curve which forms the boundary of the domain of an object and allows composite shape-boundaries to be formed by the conjunction of a finite number of shape-primitives.
  • The prior art may be classified as providing a method of shape detection in digital image data comprising the steps of: transforming the image data into a parametric transform space; and extracting shape-characterising parameters from the transform space indicative of a particular shape in the image space; by either: using a transformation particular to a given shape-primitive and thereafter searching that transform space for the position of maxima indicative of the particular shape-primitive in image space (in the instances of curved shape-primitives it is necessary to have a parameter space whose dimensionality corresponds to the number of parameters under detection); or locating the maxima in a sinogram and thereafter performing shape-primitive-specific transformations on the distributions of maxima in the transform space and applying an "inverse" transform on the resulting image, as shown for example in Fig.14 to Fig.17.
  • An accumulation point 50 will be seen at (xo,yo), the position of the center of the circle.
  • the transformation applied to the first transform space, i.e. the value subtracted, will be known and hence this method yields the radius and the center co-ordinates of the circle.
  • the method may be similarly used to deduce the parameters associated with other conic sections. Where the image is composed of shape-primitives of different scales, each shape-primitive must be separately searched. If the image contains a multiplicity of shape-primitives, each must be detected separately using a similar process.
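  • As an illustration only (the grid sizes, the example circle and the use of the standard normal form r = x·cos θ + y·sin θ, which may differ in sign convention from the equation quoted later, are assumptions of this sketch), the prior-art re-transform of Figs 14 to 17 can be expressed as: take the locus of maxima, subtract the radius, and re-plot each resulting (θ, r) point as a line in image space; the lines accumulate at the centre of the circle.

    # Sketch of the prior-art re-transform of Figs 14-17 (illustration only).
    import numpy as np

    x0, y0, rho = 40.0, 25.0, 15.0                   # hypothetical circle in a 128x128 image
    thetas = np.linspace(0.0, np.pi, 180, endpoint=False)
    r_max = x0 * np.cos(thetas) + y0 * np.sin(thetas) + rho   # locus of maxima (cf. Fig.15)

    r_shifted = r_max - rho                          # subtract the radius (cf. Fig.16)

    # "Inverse" transform (cf. Fig.17): each (theta, r) point is re-plotted as the line
    # x*cos(theta) + y*sin(theta) = r in image space; all of these lines pass through the centre.
    acc = np.zeros((128, 128), dtype=int)
    ys, xs = np.mgrid[0:128, 0:128]
    for t, r in zip(thetas, r_shifted):
        acc[np.abs(xs * np.cos(t) + ys * np.sin(t) - r) < 0.5] += 1

    cy, cx = np.unravel_index(np.argmax(acc), acc.shape)   # accumulation point 50, ~ (y0, x0)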
  • The present invention seeks to overcome the problems of: the detection of spurious maxima and disconnected colinearities associated with the prior art when the shape-primitive under detection is a straight line segment; the optimal detection of the maxima associated with curved shape-primitives; and the automatic decomposition of the boundary of the shape under detection into its constituent shape-primitives, allowing symbolic descriptions of shape to be deduced automatically.
  • the method provided by one aspect of the present invention is characterised in that the extraction step includes the step of detecting at least one particular shape-primitive indicative distribution of data in the transform space. For example, in the present invention, detection may be made of the spatial distributions of data immediately surrounding a maximum, or to either side of a continuous locus of maxima.
  • a second transformation of the maxima into a second transform space is performed.
  • the second transform space is a sinogram.
  • Maxima will occur in the second transform space, the positions of which are indicative of the straight line segments of maxima in the first transform space. From the positions of these latter maxima, a search can be made by tracking for all of the maxima in the first transform space indicative of the particular shape-primitive, and thus the parameters of the shape-primitive in the image space can be determined.
  • the positions of the maxima in the second transform space and the criteria associated with "proper" shape-primitives may be used either: to locate, track and numerically fit the data points comprising a continuous curve in the first, filtered transform plane; or to locate and track the points on a continuous sinusoid in the first, filtered transform plane and to use these points to locate and fit numerically the constituent points of a shape-primitive in the image space.
  • any curve which may be approximated by the conjunction of arcs of conic sections may be similarly treated.
  • Although arcs of conic sections are used as the shape-primitives in the present formulation of the method, other types of shape-primitives may also be deduced; for example a polynomial of order n, where n is chosen to give the required degree of accuracy in the modelling process.
  • Fig.1 is an illustration of the principal boundary of a shape.
  • Fig.2 is an illustration of the principal boundary of a shape with examples of secondary boundaries.
  • Fig.3 is a digital, binary image of a circle.
  • Fig.4 is an intensity map of the transformed image of the circle shown in Fig.3.
  • Fig.5 is a digital, binary image of an ellipse.
  • Fig.6 is a digital image of an intensity map of the transformed image of the ellipse shown in Fig.5.
  • Fig.7 is a digital, binary image of an hyperbola.
  • Fig.8 is a digital image of an intensity map of the transformed image of the hyperbola shown in Fig.7.
  • Fig.9 is a digital, binary image of a parabola.
  • Fig.10 is a digital image of an intensity map of the transformed image of the parabola shown in Fig.9.
  • Fig.11 is a schematic representation of the method by which various shape-primitives may be created by the application of linear transformations to the space of a unit circle.
  • Fig.12 is a diagram showing the position of the tangent to a curve at the point where the curve and the tangent have a common normal.
  • Fig.13 is a diagram showing the position of the maximum value in the transform space whose position in that space may be used to deduce the equation of the tangent shown in Fig.12.
  • Fig.14 is a diagram of a circle in image space.
  • Fig.15 is a diagram of the transformed image of Fig.14 showing the curve which is the locus of the maximum values in that transform space.
  • Fig.16 is a diagram of the result of subtracting a value equal to the radius of the circle in Fig.14 from each of the points along the curve shown in Fig.15.
  • Fig.17 is a diagram of the transform plane which results from plotting the straight lines whose equations may be deduced using the positions of the points along the curve shown in Fig.16.
  • Fig.18 is a schematic diagram of the apparatus of one embodiment of the invention.
  • Fig.19 is a perspective view of an object, the processing of an image of which is described below; Figures 20 to 22 are representative of the object after various processing operations;
  • Fig.23 and Fig.24 illustrate the mapping of a single point and three colinear points, respectively, from an image space to a sinogram
  • Fig.25 and Fig.26 are graphical representations of distributions of curves in the sinogram
  • Fig.27 is a matrix of mask values used in detecting distributions of data in the sinogram indicative of a straight line segment in the image space;
  • Figs 28 and 29 are digital representations corresponding to Figures 25 and 26, respectively;
  • Figures 30 and 31 illustrate the mapping of a curved shape-primitive (in this case a circle) from an image space to a sinogram;
  • Fig.32 is a graphical representation of data intensity across a belt produced in the sinogram at the location indicated by the lines XV - 1V in Fig.31;
  • Fig.33 is a digital representation of the data shown in Fig.32;
  • Figures 34 and 36 are mask values for use in detecting data in the sinogram representative of a curved shape-primitive
  • Figures 35 and 37 show the data of Fig.33 after convolution using the masks of Figures 34 and 36, respectively;
  • Figures 38 and 39 show the data of Figures 35 and 37, respectively, after further processing;
  • Fig.40 is the digital, binary image of a hand-drawn curve
  • Fig.41 is the digital image of the first transform of the binary image of the curve
  • Fig.42 is the digital, binary image of the transform plane after the application of the convolution filter detailed in Fig.27;
  • Fig.43 is the digital image of the first transform plane after the application of the convolution filters detailed in Figures 34 and 36, where both convolved images have been added and the result binarized.
  • Fig.44 is a digital image of the second transformation, i.e. the transformation of the binary image of Fig.43;
  • Fig 45 is a digital image of the reconstruction of the curve of Fig.40 as the envelope of its tangents.
  • a camera 10 outputs a digitised video signal representing the object illustrated in Fig.19, which is stored in a frame store 12.
  • the frame size is 256 pixels by 256 pixels, and each pixel has eight bits and so can store values in the range 0 to 255.
  • A parallel processor 16, such as a linear array processor as described in UK patent specification no. 2129545B, then performs an edge detection operation on the stored frame using a Sobel-type operator in a known manner to produce an image as represented in Fig.20, which is stored as a frame in the frame store 12.
  • The image is then subjected to a thresholding operation by the parallel processor 16 to produce a binarized image in which each pixel either has a predetermined low value, for example 0, or a predetermined high value, for example 255.
  • the binarized image which is represented in Fig.21, is stored in the frame store 12.
  • The edges of the image are then thinned and isolated points are removed by the processor 16; the resulting binarized image, as represented in Fig.22, is stored in the framestore 12.
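  • The following Python sketch (illustration only, not the patent's parallel-processor implementation; the threshold value and the suggestion of skimage.morphology.skeletonize for thinning are assumptions) shows the same chain of operations on a 256 x 256 frame.

    # Illustrative pre-processing chain: Sobel edge magnitude, thresholding to a
    # binarized edge image, with thinning left as an optional final step.
    import numpy as np
    from scipy import ndimage

    def preprocess(frame, threshold=64.0):
        """frame: 256x256 uint8 image, as held in the frame store 12."""
        gx = ndimage.sobel(frame.astype(float), axis=1)   # horizontal gradient
        gy = ndimage.sobel(frame.astype(float), axis=0)   # vertical gradient
        magnitude = np.hypot(gx, gy)                      # edge image (cf. Fig.20)
        edges = magnitude > threshold                     # binarized image (cf. Fig.21)
        # Thinning (cf. Fig.22) could be added here, e.g. skimage.morphology.skeletonize(edges).
        return edges.astype(np.uint8) * 255               # pixels are 0 or 255, as in the text

    frame = np.zeros((256, 256), dtype=np.uint8)          # placeholder for the camera frame
    frame[100:160, 80:180] = 200                          # hypothetical bright object
    binary_edges = preprocess(frame)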
  • The edge points are mapped from the image space into a sinogram, or (angle, radius) normal parametrisation space (θ,r), by a host computer 20, and the sinogram is stored as a frame in the framestore 12.
  • Each edge point at co-ordinates (xi,yi) in the image space is transformed to a sine curve Ci representing the angle and radii (θ,r) of the normals to all possible lines passing through the point (xi,yi) in the image space.
  • The sine curve Ci satisfies the equation: r = (xi² + yi²)^(1/2) cos(θ + tan⁻¹(yi/xi))
  • Lines l1, l2 are shown in the image space of Fig.23 which produce points at (θ1,r1) and (θ2,r2) in the sinogram.
  • Fig.24 shows how three points P1, P2, P3 in the image space are transformed into three sine curves C1, C2, C3 in the sinogram. Since the three points P1, P2, P3 are colinear, the sine curves C1, C2, C3 intersect at a single point I in the sinogram.
  • The co-ordinates (θL,rL) of the intersection I in the sinogram give the angle and length of the normal in the image space which defines the line L on which the three points P1, P2, P3 lie.
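  • To make the mapping concrete, the following minimal sinogram accumulator (illustration only; the bin counts, the r range and the use of atan2 in place of tan⁻¹ to retain quadrant information are assumptions of this sketch) evaluates the equation above for each edge point over a discretised θ axis.

    # Minimal sinogram accumulator using r = sqrt(x^2 + y^2) * cos(theta + atan2(y, x)).
    import numpy as np

    def sinogram(edge_points, n_theta=180, n_r=256, r_limit=256.0):
        acc = np.zeros((n_theta, n_r), dtype=int)
        thetas = np.linspace(-np.pi, np.pi, n_theta, endpoint=False)
        for x, y in edge_points:
            r = np.sqrt(x * x + y * y) * np.cos(thetas + np.arctan2(y, x))
            bins = np.round((r + r_limit) / (2.0 * r_limit) * (n_r - 1)).astype(int)
            valid = (bins >= 0) & (bins < n_r)
            acc[np.arange(n_theta)[valid], bins[valid]] += 1   # draw the sine curve C_i
        return acc, thetas

    # Three colinear points, as in Fig.24: their sine curves reinforce where they intersect,
    # so the accumulator's maximum count is 3 at a cell giving the line's (theta, r).
    acc, thetas = sinogram([(10.0, 20.0), (20.0, 30.0), (30.0, 40.0)])
    i, j = np.unravel_index(np.argmax(acc), acc.shape)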
  • Fig.27 shows a 3 pixel by 3 pixel mask for detecting butterfly shaped distributions in the sinogram.
  • The sinogram in the frame store 12 is convolved with the mask using the parallel processor 16 and the results of the convolution operation are stored as a frame in the framestore 12. More specifically, considering the upper three rows of pixels in a frame, the mask is applied to each 3 x 3 group of pixels, that is I0,i-1, I0,i, I0,i+1; I1,i-1, I1,i, I1,i+1; I2,i-1, I2,i and I2,i+1.
  • Each pixel value in the group is multiplied by the corresponding mask value and the products are summed to give the convolution result for the centre pixel; the Figure 28 group gives a result of 3 and the Figure 29 group gives a result of 1.
  • The two dimensional convolution operation described above may be more efficiently computed using two one dimensional convolution operations and summing the results.
  • A column vector of the form (-2, 2, -2) is applied to each column of the image data, beginning at the second column and terminating at the penultimate column.
  • the results are summed and stored as the centre pixel value of a new frame of data i.e. in each column the centre pixel J n is multiplied by the value 2 and the pixels on either side of the centre pixel, pixels J n- 1 and J n+ 1 , are multiplied by the value -2.
  • the results of these operations are summed and stored in a separate frame at the location of the pixel J n .
  • the host computer selects those of the convolution results above a predetermined threshold value, say 2, and representative of a continuous line and, from the location of the pixel in the frame section 18, determines the parameters of the line represented by that pixel.
  • The convolution result of 3 for the group of pixel data shown in Fig.28 produces an indication of a continuous line, as compared with the result of 1 for the group of Fig.29, which is not treated as indicating a continuous line, despite the centre pixel of that group being a maximum.
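  • The separable filtering just described can be sketched as follows in Python (illustration only: the column vector (-2, 2, -2) and the threshold of 2 are as stated in the text, while the horizontal component (1, 0, 1) and the toy data are assumptions, since the actual Fig.27 mask values appear only in the drawing).

    # Separable 'butterfly' filtering of the sinogram and thresholding of the response.
    import numpy as np
    from scipy import ndimage

    def butterfly_filter(sinogram_frame):
        s = sinogram_frame.astype(int)
        vert = ndimage.convolve1d(s, [-2, 2, -2], axis=0, mode="constant")   # column vector
        horiz = ndimage.convolve1d(s, [1, 0, 1], axis=1, mode="constant")    # assumed row part
        return vert + horiz          # sum of the two one-dimensional convolutions

    def detect_lines(filtered, threshold=2):
        """Cells whose response exceeds the threshold (say 2) are taken to mark
        continuous lines; each cell's (theta, r) location parametrises one line."""
        return np.argwhere(filtered > threshold)

    # Toy 5x5 sinogram patch with a butterfly-like intersection at its centre.
    patch = np.array([[0, 0, 1, 0, 0],
                      [0, 1, 3, 1, 0],
                      [1, 3, 9, 3, 1],
                      [0, 1, 3, 1, 0],
                      [0, 0, 1, 0, 0]])
    peaks = detect_lines(butterfly_filter(patch))    # includes the centre cell (2, 2)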
  • the computer 20 compares the detected line with the original image stored in framestore 12 and determines the location of the endpoints of the detected line.
  • In Fig.30 the points on a circle C in the image space are mapped in the same way as described above and produce in the sinogram shown in Fig.31 a belt 26 of sine curves.
  • the distribution of sine curves across the belt in the sinogram is not even, but rather the curves have maximum intensities at the edges 28, 30 of the belt and the intensity exhibits an inverse square root dependence near the edges.
  • Fig.32 is a plot, by way of example, of the intensity I across the belt 26, and Fig.33 is a digital representation of the intensity.
  • the parameters of the circle in the image space can be determined from the size and phase of the belt in the sinogram.
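  • A short supporting derivation of the stated belt structure and edge behaviour (not taken from the patent text; it assumes the standard normal form r = x·cos θ + y·sin θ, which may differ in sign convention from the equation given earlier):

    % Parametrise the circle of radius \rho centred at (x_0, y_0) by
    %   x(\psi) = x_0 + \rho\cos\psi, \qquad y(\psi) = y_0 + \rho\sin\psi .
    % Each boundary point maps to the sinusoid
    %   r(\theta) = x(\psi)\cos\theta + y(\psi)\sin\theta = r_c(\theta) + \rho\cos(\psi - \theta),
    % where r_c(\theta) = x_0\cos\theta + y_0\sin\theta is the sinusoid of the centre, so the
    % curves fill a belt of full width 2\rho about r_c(\theta).  In a fixed column \theta of the
    % sinogram, the offset d = \rho\cos(\psi - \theta) of a uniformly distributed boundary
    % point has the arcsine density
    \[
      p(d) \;=\; \frac{1}{\pi\sqrt{\rho^{2} - d^{2}}}, \qquad -\rho < d < \rho ,
    \]
    % which is largest at the belt edges d = \pm\rho and diverges there like an inverse
    % square root, in agreement with the intensity profile of Fig.32.  The size (2\rho) and
    % phase (r_c(\theta)) of the belt therefore carry the radius and centre of the circle.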
  • the scale of the r axis is one half of that for the x or y axes, and so 2ρ in Figure 31 appears to be the same distance as ρ in Figure 30.
  • Two 1 x 3 masks are used to detect the edges of belts, the mask of Fig.34 having values (2, -1, -1) being used to detect lower edges, and the mask of Fig.36 having values (-1, -1, 2) being used to detect upper edges.
  • The upper and lower edge masks are traversed in the -r direction of the sinogram along each column of pixels with a step of one pixel, and at each stage the group of three pixel values in the respective column of the sinogram are multiplied by the corresponding mask values and summed and placed in a location corresponding to the middle pixel of the group in a further frame.
  • the process is repeated for the second to fourth pixel values of Fig.33 and the result of zero is stored as the value of the third pixel, as shown in Fig.37.
  • the process is repeated all the way down the pixel column, and similar processes are carried out by the parallel processsor 16 simultaneously for all the pixel columns.
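  • The column-wise application of the two 1 x 3 masks can be sketched as follows (illustration only; the traversal direction, the boundary handling, the toy intensity profile and the threshold are assumptions of this sketch).

    # Belt-edge detection: apply the masks of Figs 34 and 36 down each pixel column
    # of the sinogram and add the two responses (cf. Fig.43).
    import numpy as np
    from scipy import ndimage

    LOWER_EDGE = np.array([2, -1, -1])   # Fig.34
    UPPER_EDGE = np.array([-1, -1, 2])   # Fig.36

    def belt_edges(sinogram_frame):
        s = sinogram_frame.astype(int)
        lower = ndimage.correlate1d(s, LOWER_EDGE, axis=0, mode="constant")
        upper = ndimage.correlate1d(s, UPPER_EDGE, axis=0, mode="constant")
        return lower + upper             # summed response, largest near the belt edges

    # Toy single-column belt profile: high counts at the two edges, lower in between.
    column = np.array([[0], [9], [2], [1], [1], [2], [9], [0]])
    response = belt_edges(column)
    binarised = (response > 3).astype(np.uint8)   # threshold value is an assumption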
  • Detection of curves in the sinogram that are the locus of the maxima indicative of a curved shape-primitive in the image space may be carried out by simply adding the convolved images after the application of the two masks of Figures 34 and 36.
  • the image so obtained may then be binarized and the result transformed into a second transform space similarly to the transformation from the image space (Fig.30) to the first sinogram (Fig.31).
  • The second transform space may then be filtered using the butterfly-detecting mask of Fig.27 and the positions of the maxima corresponding to quasi-straight line segments in the first, filtered transform plane determined. See Fig.45.
  • Fig.40 is the binary image of a hand-drawn "arbitrary" curve and Fig.41 is the first transform of the binary image of that curve. Fig.42 is the binary image of the transform plane of Fig.41 after the application of the convolution filter detailed in Fig.27. Fig.43 is of the first transform plane of Fig.41 after the convolution filters detailed in Figures 34 and 36 are applied separately to images of that transform plane and both convolved images added and the result binarized.
  • Fig.44 is the result of the transformation of the data shown in Fig.43.
  • Fig 45 is of the reconstruction of the curve of Fig.40 as the envelope of its tangents.
  • The positions of the maxima 52 in the second transform space (Fig.46 v and vi) are used to seed a tracking process which uses the known properties of the sinusoids 54 detected in the first transform space (Fig.46 iii and iv). For example, it is known that a continuous distribution 56 of points (such as the circle) in the image space (Fig.46 i and ii) will produce a continuous distribution of maxima in the first transform space.
  • This, coupled with a knowledge of the general equation of the distribution of maxima associated with curved shape-primitives, may be used to guide the partitioning of the image 56 into sets of points comprising the constituent shape-primitives of the boundary of the object, which in the case illustrated is a single circular shape-primitive.
  • The search may be aided by the fact that the first transform space has the topology of a Möbius strip. The plane may therefore be twisted and the edges joined to form a continuous strip.
  • a variety of methods may be used to extract the parameters associated with the curved shape-primitives of the edges in image space from the information contained in the corresponding sets of maxima generated in the first transform space, the latter having been located and partitioned into sets of points with the aid of a second transformation.
  • the most general methods would be either to fit the constituent points on the sinusoids numerically using the general equation of the curve (equation (1)) or alternatively to use the points on the sinusoids as pointers to the actual shape-primitives in image space.
  • A pair of adjacent points on the sinusoid will be indicative of two tangents to the curved shape-primitive in image space. The intersection of these tangents will form a part of the envelope to the curve (see Fig.45).
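  • For concreteness, a minimal sketch of this envelope-of-tangents reconstruction (the normal-form convention x·cos θ + y·sin θ = r and the example circle are assumptions of this sketch): each (θ, r) point is read as a tangent line and adjacent tangents are intersected to give points on the envelope, as in Fig.45.

    # Reconstruct a curve as the envelope of its tangents, each tangent being the line
    # x*cos(theta_k) + y*sin(theta_k) = r_k read from a point on the sinusoid.
    import numpy as np

    def envelope_points(theta, r):
        pts = []
        for k in range(len(theta) - 1):
            A = np.array([[np.cos(theta[k]),     np.sin(theta[k])],
                          [np.cos(theta[k + 1]), np.sin(theta[k + 1])]])
            b = np.array([r[k], r[k + 1]])
            pts.append(np.linalg.solve(A, b))   # intersection of two adjacent tangents
        return np.array(pts)

    # Tangents of a circle of radius 15 centred at (40, 25) reconstruct that circle.
    theta = np.linspace(0.0, 2.0 * np.pi, 90, endpoint=False)
    r = 40.0 * np.cos(theta) + 25.0 * np.sin(theta) + 15.0
    curve = envelope_points(theta, r)           # points lying approximately on the circle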
  • The most general methods may be more computationally intensive than is necessary where prior knowledge of the image exists. For example, if it is known that only circles or arcs of circles are present in the edge image then the partitioned sets of the constituent points of the sinusoids may each be separately Fourier transformed and the Fourier coefficients used to deduce the radius and the location of the centre co-ordinates of the circles or circular arcs.
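  • As one concrete reading of this step (a sketch only; it assumes uniformly sampled θ over a full period and the standard normal form, under which the least-squares fit below coincides with the first Fourier coefficients and the mean of r):

    # Recover centre (x0, y0) and radius rho of a circle from sampled points (theta_k, r_k)
    # on one edge of its sinogram belt, modelled as r = x0*cos(theta) + y0*sin(theta) + rho.
    import numpy as np

    def circle_from_belt_edge(theta, r):
        design = np.column_stack([np.cos(theta), np.sin(theta), np.ones_like(theta)])
        (x0, y0, rho), *_ = np.linalg.lstsq(design, r, rcond=None)
        return x0, y0, rho

    theta = np.linspace(0.0, 2.0 * np.pi, 180, endpoint=False)
    r = 40.0 * np.cos(theta) + 25.0 * np.sin(theta) + 15.0   # synthetic belt edge
    x0, y0, rho = circle_from_belt_edge(theta, r)            # ~ (40.0, 25.0, 15.0)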
  • the method does not suffer from the problems of consistent-labelling techniques associated with the prior art because the shape-primitives which form the principal boundary (32 in Figures 1 and 2) of the object will always form a closed loop by virtue of the property of closure of the principal boundary curves.
  • Secondary, disjoint boundaries (34 in Fig.2) may be labelled according to their relative spatial arrangements with respect to the origin of an object-centered co-ordinate system; such a co-ordinate system may be defined by the centroid of the principal boundary.
  • the method allows an hierarchical search strategy to be employed when attempting to recognise a particular instance of a shape. This is computationally very efficient.
  • the parameters in the image space can be stored away in a library using much less memory than would be required to store a whole frame of image data.
  • the method described above can be used to read, encode and store automatically and efficiently two dimensional images such as technical drawings or architectural drawings or electrical circuit diagrams.
  • the method can also be used for example in interferometry to detect and provide the parameters of interference fringes.
  • the two dimensional spatial relationships between lines and curves can be determined by the computer and then compared with stored two dimensional data.
  • the method here described refers to a two dimensional representation of shape.
  • Two cameras are situated to provide a pair of stereoscopic images which are both processed as described above in parallel with each other.
  • The two sets of image data are then matched by the computer to obtain three dimensional information.
  • The same technique may then be applied using a three dimensional transformation and shape-primitives composed of surfaces rather than curves.
  • a three dimensional representation is determined and may be used to identify and locate an object, thus providing automatic robotic vision.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
EP87906194A 1986-09-18 1987-09-17 Gestaltermittlung Withdrawn EP0293397A1 (de)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB8622497 1986-09-18
GB08622497A GB2203877A (en) 1986-09-18 1986-09-18 Shape parametrisation
GB8719450 1987-08-18
GB8719450 1987-08-18

Publications (1)

Publication Number Publication Date
EP0293397A1 true EP0293397A1 (de) 1988-12-07

Family

ID=26291306

Family Applications (1)

Application Number Title Priority Date Filing Date
EP87906194A Withdrawn EP0293397A1 (de) 1986-09-18 1987-09-17 Gestaltermittlung

Country Status (2)

Country Link
EP (1) EP0293397A1 (de)
WO (1) WO1988002158A1 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990027482A (ko) * 1997-09-30 1999-04-15 전주범 패턴 인식 시스템에서 영상내 물체 구성 직선들의 관계 설정방법
AU3830100A (en) * 1999-04-10 2000-11-14 Victoria University Of Manchester, The Form analysis of particles using a parametric transform
GB2391099B (en) 1999-07-05 2004-06-16 Mitsubishi Electric Inf Tech Method and apparatus for representing and searching for an object in an image
GB2391676B (en) * 1999-07-05 2004-05-05 Mitsubishi Electric Inf Tech Method and apparatus for representing and searching for an object in an image
EP1516264B1 (de) * 1999-07-30 2017-05-24 Intellectual Ventures Holding 81 LLC Auffinden von bildern durch generierung eines deskriptors für jeden bereich eines bildes dessen zellen visuelle merkmale innerhalb einer gewissen toleranz aufweisen
CN111435544B (zh) * 2019-01-14 2021-11-05 珠海格力电器股份有限公司 图片处理方法和装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO8802158A1 *

Also Published As

Publication number Publication date
WO1988002158A1 (en) 1988-03-24

Similar Documents

Publication Publication Date Title
Siddiqi et al. Geometric shock-capturing ENO schemes for subpixel interpolation, computation and curve evolution
US10198858B2 (en) Method for 3D modelling based on structure from motion processing of sparse 2D images
Petitjean A survey of methods for recovering quadrics in triangle meshes
JP6216508B2 (ja) 3dシーンにおける3d物体の認識および姿勢決定のための方法
Flusser et al. A moment-based approach to registration of images with affine geometric distortion
Fan Describing and recognizing 3-D objects using surface properties
EP1658579B1 (de) Verfahren zur klassifikation und räumlichen lokalisierung von begrenzten 3d-objekten
Ziou Line detection using an optimal IIR filter
EP0514688A2 (de) Generalisierte Gestaltautokorrelation zur Ermittlung und Erkennung von Gestalt
Lee et al. Region matching and depth finding for 3D objects in stereo aerial photographs
Parvin et al. Adaptive multiscale feature extraction from range data
EP0293397A1 (de) Gestaltermittlung
Leavers Use of the Radon transform as a method of extracting information about shape in two dimensions
Fidrich Following feature lines across scale
Guerrini et al. Innerspec: Technical report
GB2203877A (en) Shape parametrisation
Brookshire et al. Automated stereophotogrammetry
Latecki Well-composed sets
Besl et al. Range image segmentation
Walker Combining geometric invariants with fuzzy clustering for object recognition
Lindeberg et al. Linear scale-space II: Early visual operations
Lee et al. 3-D shape from contour and selective confirmation
McAndrew et al. Rapid invocation and matching of 3D models to 2D images using curvilinear data
Greenspan et al. Projection-based approach to image analysis: Pattern recognition and representation in the position-orientation space
Tan et al. Three-dimensional object recognition and pose determination based on combined-edge and surface-shape data

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19880930

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LI LU NL SE

17Q First examination report despatched

Effective date: 19910213

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 19910626