GB2210488A - Pattern recognition method - Google Patents


Publication number
GB2210488A
Authority
GB
United Kingdom
Prior art keywords
silhouette
normalized
unknown
image
silhouettes
Prior art date
Legal status
Pending
Application number
GB8822100A
Other versions
GB8822100D0 (en)
Inventor
John Terzian
Current Assignee
Raytheon Co
Original Assignee
Raytheon Co
Priority date
Filing date
Publication date
Application filed by Raytheon Co filed Critical Raytheon Co
Publication of GB8822100D0
Publication of GB2210488A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/42: Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
    • G06V10/421: Global feature extraction by analysis of the whole pattern, by analysing segments intersecting the pattern

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The digitized image of the silhouette of an unknown ship is represented as a set of vertical vectors (32) derived from the outline extracted (30) from the image. The vertical vectors are normalized to a standard number of vectors with normalized heights (36, 38) and the set of normalized vectors is compared (40, 41, 42, 44, 46) with a reference set from a library (18) containing normalized digitized images of known ships. The unknown ship is then identified (54) as corresponding to the library image most closely matching the image of the unknown ship. <IMAGE>

Description

PATTERN RECOGNITION METHOD

Background of the Invention

This invention pertains generally to pattern recognition, and in particular to a method of classifying objects by matching the silhouette of an unknown object to the silhouette of a known object.
Object recognition may be performed using any distinctive characteristic of the object. One such characteristic, useful in many cases, is the silhouette of the object as recorded by a device sensitive to visible light or infrared radiation. If the silhouette of an unknown object can be matched to the silhouette of a previously identified object, it may be concluded that the unidentified object is of the same class as the known object. Since silhouettes are equivalent to two-dimensional patterns, any known pattern-matching technique may be used.
In one known technique, the silhouettes are divided into small elements, or "pixels." The known and unknown silhouettes are deemed to be matched if the contents of a predetermined percentage of corresponding pixels are the same. This technique, though computationally simple, performs poorly if the silhouettes being compared are not in registration with one another.
The technique also performs poorly if the image of the unknown object is a different size than the image of the known object (as might be the case if the images were made by devices at different distances from the object). Another technique, which involves comparing the two-dimensional Fourier transform of a known image with the transform of an unknown image, operates well even if the known and unknown silhouettes are oriented differently. However, such a technique requires substantially more computation.
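The pixel-overlap technique described above can be sketched as follows. This is an illustrative sketch, not text from the patent; the function name, the row-major binary-image representation, and the 90% default threshold are all assumptions.

```python
def pixel_match(known, unknown, threshold=0.9):
    """Deem two equal-sized binary silhouettes matched if the fraction
    of identical corresponding pixels meets the threshold."""
    total = sum(len(row) for row in known)
    same = sum(1 for row_k, row_u in zip(known, unknown)
                 for k, u in zip(row_k, row_u) if k == u)
    return same / total >= threshold
```

Because corresponding pixels are compared positionally, even a small translation of one silhouette relative to the other lowers the match fraction, which is exactly the registration weakness the text describes.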
Summary of the Invention

It is a purpose of this invention to provide a means for classifying objects by matching an image containing the silhouette of an unknown object to an image containing the silhouette of a known object.
It is a further purpose of this invention to provide an object classification technique which is computationally simple, yet robust enough to operate effectively when the silhouettes are not precisely registered or differ in size.
The foregoing and other purposes of the invention may be achieved by a process comprising the steps of: creating a silhouette of an unknown object from a digitized picture; normalizing the silhouette of the unknown object to a form suitable for comparison with similarly normalized reference silhouettes of known objects; and comparing the normalized image of the unknown object to the normalized references.
Brief Description of the Drawings

The invention will be more fully understood from the following more detailed description when taken in conjunction with the accompanying drawings, in which:

FIG. 1 is a sketch useful in understanding a system in which the invention might be employed;

FIG. 2 is a flow chart of the contemplated object identification process; and

FIG. 3 is a sketch useful in understanding how an image is represented by vectors.
Description of the Preferred Embodiment

Pattern recognition techniques have numerous applications. For purposes of describing the present invention, the patterns are herein depicted as representing images containing silhouettes of ships. One skilled in the art will recognize that this source of patterns is illustrative only and that the present invention operates regardless of the particular type of pattern being recognized.
Referring now to FIG. 1, an imaging device 10 is shown focussed on an unknown ship 14. The imaging device 10 may be any known device for forming an image using visible light or infrared radiation. The image of the unknown ship 14 is digitized by a digitizer 20. The digitized image is transferred to digital computer 12, wherein processing is accomplished according to the method shown in FIG. 2 and described below. As part of that processing, the digitized image is compared to silhouettes of known ships stored in memory 18. The results of the comparisons are sent to a utilization device 11, such as a cathode ray tube (CRT), allowing a human operator to read the results. The imaging device 10, the image digitizer 20, the digital computer 12, and the memory 18 could be any devices known in the art for obtaining and processing digitized images.
Referring now to FIG. 2, it should be noted that the processing in FIG. 2 here is implemented by programming a general purpose digital computer, such as a VAX 11/780 manufactured by Digital Equipment Corporation, Maynard, Massachusetts. The rectangular elements in the flow diagram of FIG. 2, typified by element 30 and hereinafter denoted as "processing blocks," represent a single instruction or group of computer software instructions for the general purpose digital computer 12 (FIG. 1) to execute. The diamond-shaped elements, typified by element 44 and hereinafter denoted "decision blocks," represent groups of computer software instructions which evaluate some condition and affect the order of execution of the processing blocks according to that condition. The elements with curved bottoms, typified by memory 18, represent information stored in hardware memory accessible by the general purpose digital computer 12 (FIG. 1). Memory is shown explicitly only where large amounts of information are stored. The usages of memory common to most computer software programs, such as the storage of program variables, are not explicitly shown. It will be noted by one skilled in the art that the initialization of variables and loops (and other standard elements of computer software programs) is likewise not explicitly shown.
At processing block 30 the silhouette of the unknown ship 14 (FIG. 1) is extracted according to any known algorithm, or combination of algorithms. For example, the algorithm for edge enhancement described at pages 322 to 326 of Digital Image Processing by William K. Pratt, published by John Wiley & Sons, Inc., 1978, and the algorithms for edge segmentation described at pages 542 to 545 of the same reference may be used. However, one skilled in the art will recognize that any other algorithms could be used to extract the silhouette of the unknown ship 14 (FIG. 1).
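The patent defers to Pratt's edge-enhancement and edge-segmentation algorithms for this step. As a much simpler stand-in, and purely for illustration, a fixed-threshold binarization suffices when the ship is dark against a bright sea and sky; the function name and threshold are assumptions, not the cited algorithms.

```python
def extract_silhouette(gray, threshold=128):
    """Crude stand-in for the cited edge-based extraction: binarize a
    grayscale image, treating pixels darker than the threshold as part
    of the silhouette (a dark ship against a bright background)."""
    return [[1 if pixel < threshold else 0 for pixel in row]
            for row in gray]
```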
At processing block 32 the silhouette is vectorized to form a set of vertical vectors 74, as shown in FIG. 3.
FIG. 3 shows exemplary ones of a set of N vertical vectors 74 which describe the silhouette 72 of the unknown ship 14 (FIG. 1) in a field of view. Each vertical vector 74 is described by a pair of numbers representing an X and a Y value. As can be seen in FIG. 3, the vertical vectors 74 extend from the bottom to the top of the silhouette 72. It should be noted in FIG. 3 that the vertical vectors 74 begin at equidistant intervals in the X direction.
Because the vertical vectors 74 are evenly spaced in the X direction, the total number N depends on the extent of the silhouette in the X direction. The height of the vectors, i.e., the lengths of the vertical vectors, depends on the distance between the top and the bottom of the silhouette 72 at each X location.
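The vectorization at processing block 32 can be sketched as follows. This is an illustrative sketch under assumed conventions (a row-major binary image with row 0 at the top); the function name is not the patent's notation.

```python
def vectorize(silhouette):
    """Scan a binary image column by column; for each column containing
    silhouette pixels, record the distance between the top and bottom
    of the silhouette there -- the height of that vertical vector."""
    heights = []
    for x in range(len(silhouette[0])):
        rows = [y for y in range(len(silhouette)) if silhouette[y][x]]
        if rows:
            heights.append(max(rows) - min(rows) + 1)
    return heights
```

The columns themselves provide the equidistant X spacing the text describes, so only the heights need to be stored.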
At processing block 34 in FIG. 2, the vertical vectors 74 (FIG. 3) describing the silhouette of the unknown ship 14 (FIG. 1) are adjusted to represent only variations along the top edge of the silhouette, here the superstructure of the unknown ship. At processing block 34, the minimum midship vector 76 (FIG. 3), hereinafter also denoted VN, is identified as the shortest of all vertical vectors (excluding the few vertical vectors, say four percent, on the left side of the image and the few vertical vectors, say four percent, on the right side of the image).
All of the vertical vectors are then adjusted by subtracting the magnitude of VN. Each member of the set of adjusted vectors is hereafter denoted U(X).
Alternatively, the processing done at processing block 34 may be omitted and the unadjusted vertical vectors 74 used in subsequent processing.
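The adjustment at processing block 34 can be sketched as follows, a minimal sketch assuming the four-percent edge exclusion suggested in the text; the function and parameter names are illustrative.

```python
def adjust(vectors, edge_fraction=0.04):
    """Find the minimum midship vector V_N, ignoring roughly four
    percent of the vectors at each end of the silhouette, and subtract
    its magnitude from every vector, leaving only the variations along
    the top edge (the superstructure)."""
    n = len(vectors)
    margin = max(1, round(n * edge_fraction))
    v_n = min(vectors[margin:n - margin])
    return [v - v_n for v in vectors]
```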
At processing block 36 of FIG. 2, the width of the silhouette is normalized by using only every CINT(N/100)th vector, where CINT is a function rounding the value in parentheses to the nearest integer. The resulting image will therefore always contain 100 vertical vectors. At processing block 38 the height of the vectors is normalized such that the silhouette occupies a predetermined area. The normalization is achieved according to the formula H(X) = U(X)/(MSV/8), where H(X) is the normalized height of vector X, U(X) is the unnormalized, adjusted height of vector X, and MSV is the average of the set of adjusted vectors.
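The two normalizations at processing blocks 36 and 38 can be sketched together. Note that simple decimation by CINT(N/100) yields exactly 100 vectors only when N is close to a multiple of 100; this sketch truncates to the target width as a simplifying assumption the patent does not spell out.

```python
def normalize(adjusted, width=100):
    """Width: keep only every round(N/width)-th vector, giving a fixed
    number of vectors. Height: divide each by (MSV/8), where MSV is the
    mean of the adjusted vectors, per the formula H(X) = U(X)/(MSV/8)."""
    n = len(adjusted)
    step = max(1, round(n / width))
    sampled = adjusted[::step][:width]
    msv = sum(adjusted) / n
    return [u / (msv / 8) for u in sampled]
```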
At processing blocks 40, 41, and 42, the normalized image of the unknown ship's silhouette is compared to stored silhouettes of known ships. The silhouettes of the known ships are stored before operation in memory 18 in a vectorized, normalized form corresponding to the form of the silhouette to be recognized after passing through processing blocks 30, 32, 34, 36 and 38. Memory 18 contains images for every class of ship the processing can identify. In addition, the memory 18 preferably contains several images for each class, representing the ship viewed from several, here four, different aspects. At processing block 40 one vector of the reference image, i.e., the known silhouette of a particular class of ships, is subtracted from the corresponding vector of the received, normalized and vectorized silhouette. Because the bow of the unknown ship might be at either side of the image, a second difference is computed at processing block 41. At processing block 41, the difference is computed by selecting a reference vector as if the reference image had been formed by a ship headed in the opposite direction, i.e., when the first reference vector is used at processing block 40, the last is used at processing block 41; when the second is used at processing block 40, the second to last is used at processing block 41; and so on. At processing block 42 the absolute values of the differences computed at processing blocks 40 and 41 are accumulated.
Decision block 44 causes processing blocks 40, 41, and 42 to be repeated for the next vector in the silhouette of the unknown ship, as determined at processing block 46, until each vector in the received, vectorized, normalized silhouette has been subtracted from the corresponding vector in the reference image then being taken out of the memory 18. The resulting summations of the absolute values of the differences are denoted as the "scores" of the reference image, one for the unknown compared to the reference and the other for the unknown compared to the reference in reverse. At processing block 48 the scores for the reference ship computed at processing blocks 40, 41 and 42 are stored in memory 50. One skilled in the art will recognize that processing block 48 might alternatively process the scores in some way, such as only storing the lowest scores, to reduce the amount of information stored in memory 50. The comparison process is repeated until the silhouette of the unknown ship is compared to each reference image stored in memory 18, as controlled by the loopback path containing processing block 58.
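The vector-by-vector loop through processing blocks 40, 41, 42, 44, and 46 reduces to computing a pair of scores per reference image, which can be sketched as follows; the function name is illustrative.

```python
def silhouette_scores(unknown, reference):
    """Accumulate the absolute vector differences against the reference
    as stored (block 40) and against the reference reversed (block 41,
    allowing the bow to be at either side of the image)."""
    forward = sum(abs(u - r) for u, r in zip(unknown, reference))
    reverse = sum(abs(u - r) for u, r in zip(unknown, reference[::-1]))
    return forward, reverse
```

A mirror image of the reference scores zero in the reverse direction even when the forward score is large, which is precisely why the second difference is computed.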
After the silhouette of the unknown ship is compared to all the reference images stored in memory 18, the reference image with the lowest score is selected at processing block 54. The reference ship corresponding to the lowest score "matches" the unknown ship. The processing thus recognizes the unknown ship as belonging to the same class as the selected reference ship.
Alternatively, a thresholding function might be employed such that no classification would be assigned to the unknown ship unless the lowest score obtained over all reference silhouettes is lower than some predetermined value.
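The selection at processing block 54, with the optional thresholding variant, can be sketched as follows. The dictionary-keyed library and the parameter names are assumptions for illustration; the patent stores several aspect views per class, which would simply add more entries.

```python
def classify(unknown, library, threshold=None):
    """Compare the unknown silhouette against every reference and pick
    the class with the lowest score; when a threshold is given, return
    None if even the best score is not low enough."""
    best_class, best_score = None, float("inf")
    for ship_class, reference in library.items():
        forward = sum(abs(u - r) for u, r in zip(unknown, reference))
        reverse = sum(abs(u - r) for u, r in zip(unknown, reference[::-1]))
        score = min(forward, reverse)   # better of bow-left / bow-right
        if score < best_score:
            best_class, best_score = ship_class, score
    if threshold is not None and best_score > threshold:
        return None
    return best_class
```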
It should be understood by those skilled in the art that various modifications may be made in the present invention without departing from the spirit and scope thereof as described in the specification and defined in the appended claims.

Claims (4)

CLAIMS:
1. A method of identifying an unknown object from a digitized image of the silhouette of the object by comparing the digitized image to each one of a plurality of images of the silhouettes of known objects, comprising the steps of: (a) representing the digitized image of the silhouette of the unknown object as a set of vertical vectors; and (b) forming a normalized silhouette by normalizing the width and the area of the silhouette as represented by the set of vertical vectors.
2. A method of identifying an unknown object as in claim 1 further comprising the step of: (a) comparing the normalized silhouette to each one of a plurality of known normalized silhouettes formed from sets of vertical vectors representing digitized images of silhouettes of known objects to compute a score for each one of the comparisons, such score equaling the summation of the absolute values of the differences between each vertical vector in the unknown normalized silhouette and the corresponding vector in each one of the known normalized silhouettes.
3. A method of identifying an unknown object as in claim 2 further comprising the step of identifying the unknown object as being the same class as the known object from which the known normalized silhouette corresponding to the lowest score computed was formed.
4. A method of identifying an unknown ship as in claim 3 wherein the plurality of known normalized silhouettes includes silhouettes formed by ships at various aspect angles.
GB8822100A 1987-09-30 1988-09-20 Pattern recognition method Pending GB2210488A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10298087A 1987-09-30 1987-09-30

Publications (2)

Publication Number Publication Date
GB8822100D0 (en) 1988-10-19
GB2210488A (en) 1989-06-07

Family

ID=22292724

Family Applications (1)

Application Number Title Priority Date Filing Date
GB8822100A Pending GB2210488A (en) 1987-09-30 1988-09-20 Pattern recognition method

Country Status (2)

Country Link
FR (1) FR2621148B1 (en)
GB (1) GB2210488A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1454813A (en) * 1973-07-25 1976-11-03 Optical Business Machines Method and apparatus for recognising handwritten characters in an optical character recognition machine
GB1479134A (en) * 1973-07-09 1977-07-06 Ricoh Kk Character pattern normalization method and apparatus for optical character recognition system
EP0114305A2 (en) * 1982-12-27 1984-08-01 International Business Machines Corporation Normalisation of printed character representations

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58129684A (en) * 1982-01-29 1983-08-02 Toshiba Corp Pattern recognizing device
JPS62204381A (en) * 1986-03-04 1987-09-09 Mitsubishi Heavy Ind Ltd Ship image recognition device


Also Published As

Publication number Publication date
FR2621148B1 (en) 1992-09-11
FR2621148A1 (en) 1989-03-31
GB8822100D0 (en) 1988-10-19
