CA1096504A - Method and apparatus for obtaining multi-spectral signatures - Google Patents

Method and apparatus for obtaining multi-spectral signatures

Info

Publication number
CA1096504A
Authority
CA
Canada
Prior art keywords
data
polygon
data vectors
vertices
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired
Application number
CA277,836A
Other languages
French (fr)
Inventor
Michael F. Gordon
Dempster S. Christenson
Vernon H. Smith
Rowland H. Mclaughlin
Robert E. Marshall
Frank J. Kriegler
Seymour R. Lampert
Roland D. Kistler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Environmental Research Institute of Michigan
Original Assignee
Environmental Research Institute of Michigan
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Environmental Research Institute of Michigan filed Critical Environmental Research Institute of Michigan
Application granted
Publication of CA1096504A
Legal status: Expired

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/40: Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor

Abstract

ABSTRACT OF THE DISCLOSURE
A multispectral data classification system has an improved method and apparatus for the identification of training samples of multispectral image data useful in the computation of spectral signatures.
The method involves displaying a set of multispectral image data on a user-interactive device in a two-dimensional ordered array. The training sample is defined by the sequential definition of the vertices of a polygon representing the training sample. A rectangle is closely superscribed about the polygon. The data elements within the rectangle are examined to determine if they are within the polygon.
The apparatus includes a multispectral data classification system incorporating a color CRT display for displaying the set of data; a manually-operable trackball cursor interactive with the color CRT display for defining the polygon and rectangle; an alphanumeric keyboard and monochrome CRT for providing communication between the operator and the system; and a random access memory device.

Description

INTRODUCTION
This invention relates to the multispectral scanning art and more particularly to an improved method and apparatus for identifying distinct subsets of multispectral image data within a base set of data.
BACKGROUND OF THE INVENTION
Multispectral scanning has become an important practical application of radiometry theory. The technique applies the principle that, since various types of matter can be characterized by their spectral signature, i.e., their energy emission curve over a broad range of frequencies, classification of a group of various subjects contained within a scene can be accomplished by remotely scanning the scene with a detector sensitive to emitted energy over a broad range of frequencies and comparing the detector output with the known spectral signatures of various subjects believed to be in the area.
This technique implies that known spectral signatures are available prior to the classification analysis. Spectral signatures are computed from, and tested in comparison with, training samples of multispectral image data selected on the basis of a priori information about a scanned scene, generally a photograph.
The identification of these training samples has in the past represented a bottleneck in the overall multispectral processing scheme. The specific limitation has been the requirement that the operator first display a set of data containing the training sample, then make individual identifications of the coordinates of each data element, and finally compile all of the coordinate identifications to define the training sample.

An objective of the present invention is to provide a method and apparatus for minimizing operator involvement in the training sample identification process, thereby expediting the overall processing scheme.

SUMMARY OF THE INVENTION
In accordance with the present invention, in a system for the classification of a body of multispectral data vectors which represents a multivariate scene having a plurality of subject classes, the system having means for determining the spectral signature of each subject class and means for classifying the body of multispectral data vectors in accordance with the spectral signatures, the improvement comprising the inclusion in the means for determining the spectral signature of each subject class of: storage means for storing the body of multispectral data vectors; display means for displaying a data set from the body of data vectors which contains a subject class in a format forming a two-dimensional, ordered array;
operator control means responsive to an operator identification of the vertices of a polygon representing a subject class contained within a displayed set of data vectors; means for identifying data vectors contained within the polygon defined by the operator; and means for operating on data vectors identified as being contained within the polygon to compute a spectral signature.
In accordance with the present invention, training samples of multispectral image data are extracted from a base of such data by displaying a set of the base data on a user-interactive device and having the operator define a polygonal representation of the training sample. The displayed data set is then systematically analyzed to determine which data elements are contained within the polygonal representation (training sample elements) and which are outside it (non-training sample elements).
The interactive display device is preferably a color CRT display. The set of base data has at least one channel displayed in a two-dimensional ordered array that corresponds to the coordinate system of the scanned scene.
The operator is generally assisted in his identification of the training sample within the displayed data set by some ground truth information about the scanned scene, typically a photograph. On the basis of this information, he delineates the training sample by selecting the vertices of a polygon which is closely inscribed within the contours of the training sample. This may be accomplished by use of a track ball cursor interactive with the CRT display, where the position of the cursor point on the display is manually controlled by rotating the track ball.
Once having defined the polygon, the method of the present invention employs a two-part search for each data element to determine if it is contained within the polygon.
First, the vertical coordinate of a data element is checked to determine if it falls within the vertical range of any two adjoining vertices. If so, then for those pairs of adjoining vertices which satisfy the first test, the horizontal coordinate of the data element is compared with the polygon sides joining each of these pairs of vertices to see if it falls to the relative left of the polygon sides an odd number of times.
If so, then the data element is, according to mathematical formulation, within the polygon.
The apparatus further includes means for two-way communication between the operator and the system. In the illustrative embodiment of the invention these include an alphanumeric keyboard and a monochrome CRT display. In addition, memory capability is provided by means of a random access storage device.
The invention has general applicability to the extraction of samples from any class of multivariate data that may be spatially represented in a two-dimensional ordered array. For a fuller appreciation of the invention, reference should be made to the following detailed description of a specific embodiment.
BRIEF DESCRIPTION OF THE DRAWING
FIGURE 1 is a schematic diagram of a multispectral image data processing system adapted for use with the present invention;
FIGURE 2 is a pictorial representation of the operator control hardware of the system of FIGURE 1;
FIGURE 3 is a flowcharted representation of the steps involved in computing a spectral signature for a subject class in accordance with the method of the present invention;

FIGURES 4-6 are flowcharted representations of special routines used to identify a subject class within a display of multispectral image data;
FIGURE 7 is a pictorial representation of the screen of a cathode ray tube display of multispectral image data having a subject class defined in accordance with the present invention; and
FIGURE 8 is a flowcharted representation of the steps involved in extracting a test area of multispectral image data for use in testing the accuracy of spectral signatures already computed.

DETAILED DESCRIPTION OF THE SPECIFIC EMBODIMENT

A multispectral image data processing system which incorporates the present invention, and is representative of a class of data processing systems to which the present invention is adaptable, is shown generally at 10 in FIGURE 1.
The system 10, exclusive of the present invention, is taught in full detail in National Aeronautics and Space Administration Report No. NASA CR-132463, dated August, 1974, Volumes 1, 2 and 3. This report is unclassified and is accessible to the general public.
In overview, the multispectral image data processing system 10 is an assembly of several subsystems which, in effect, cooperate to process raw, unclassified multispectral image data from a scanned scene into a classified representation of the scene on a hard-copy color output or other suitable display medium.
A central processing (CP) subsystem, shown generally at 12, is responsible for monitoring and executing system tasks.
CP subsystem 12 includes a general-purpose digital computer and necessary peripherals 22, including core and disk memory units. Also included in CP subsystem 12 are the interfaces 24 necessary to bring peripheral equipment into communication with the digital computer.
Multispectral image data is input in raw form to the system through a data input subsystem, shown generally at 14. The raw input data, represented by symbol 26, is typically stored on magnetic tape: either analog tape, high density digital tape, or computer compatible tape. The data appears on the tape in a sequence of discrete elements, each representative of an elemental portion of a scene. Each data element takes the form of an n x 1 vector, where n represents the number of spectral channels used. Accordingly, the ith position in the vector represents the signal strength in the ith channel. An input unit 28 reads in the data 26; its form is dependent upon the particular type of input medium selected. From input unit 28 the data flows to a data path selector 30, a special-purpose hardware unit designed to gate the flow of data among alternative paths depending on the processing function to be performed. For repetitive use the data is put on a random access storage device (DISK) 33 via computer interface 24, but can be routed directly to the preprocessor 38.
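As a concrete illustration of this data layout (using NumPy and hypothetical dimensions; none of these values come from the patent), a scene of data elements can be held as a two-dimensional array of n-channel vectors:

    import numpy as np

    # Hypothetical scene: 512 scan lines by 512 pixels, each element an
    # n x 1 vector of signal strengths, one entry per spectral channel.
    n_channels = 4
    scene = np.zeros((512, 512, n_channels), dtype=np.uint8)

    # The ith position of the vector at (line, pixel) holds the signal
    # strength recorded in the ith spectral channel.
    line, pixel, channel = 100, 200, 2
    value = scene[line, pixel, channel]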
A user-interactive subsystem, shown generally at 16, provides for operator participation in the processing sequence.
Such participation is important primarily in the identification of training samples and test areas for the computation and testing of spectral signatures. The subsystem 16 includes a three-color cathode ray tube (CRT) display 32 for displaying multispectral image data, a track ball cursor 34 for delineating subsets of data displayed on the CRT 32, and an alphanumeric keyboard 36 and monochrome CRT display 37 which together provide for communication between the operator and the central processing subsystem 12. Details of the apparatus and operation of the user-interactive subsystem 16 which relate to the present invention will hereinafter more fully be described.
Multispectral image data is classified through a data processing pipeline subsystem, shown generally at 18, which includes a preprocessor 38 and a classifier 40, both of which are special-purpose hardware units whose functions are briefly described as follows.
The function of the preprocessor 38 is to condition the raw data in any of a number of ways preparatory to classification. Various methods of conditioning include: signal conditioning to scale the raw data into an acceptable dynamic range; correcting for scan-angle error with additive and/or multiplicative operations; linearly transforming data vectors for dimension reduction; and ratioing of channels. These operations will, in many instances, enhance classification accuracy without materially impairing information content.
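To make these conditioning operations concrete, the following is a minimal sketch in Python, assuming a NumPy data cube and entirely hypothetical calibration constants (the patent gives no numerical values):

    import numpy as np

    def precondition(scene: np.ndarray) -> np.ndarray:
        """Illustrative conditioning of a (lines, pixels, channels) data cube."""
        data = scene.astype(np.float64)

        # Signal conditioning: scale each channel into a common dynamic range.
        lo = data.min(axis=(0, 1), keepdims=True)
        hi = data.max(axis=(0, 1), keepdims=True)
        data = (data - lo) / np.maximum(hi - lo, 1e-9)

        # Scan-angle correction: hypothetical additive and multiplicative terms
        # that vary with pixel position across the scan line.
        pixels = np.arange(data.shape[1], dtype=np.float64)
        gain = 1.0 + 0.0002 * (pixels - pixels.mean())
        offset = 0.001 * (pixels - pixels.mean())
        data = data * gain[None, :, None] + offset[None, :, None]

        # A linear transformation for dimension reduction would simply be a
        # projection matrix applied along the channel axis (omitted here).

        # Channel ratioing: append the ratio of two channels as a derived band.
        ratio = data[..., 0:1] / np.maximum(data[..., 1:2], 1e-9)
        return np.concatenate([data, ratio], axis=-1)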
The output of the preprocessor 38 flows directly to the input of the classifier 40, which performs the actual classification of the data into categories. The classifier 40 preferably performs a maximum-likelihood decision computation that assumes a multimodal Gaussian multivariate distribution.
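A minimal sketch of such a maximum-likelihood decision rule, assuming a single Gaussian component per class (the multimodal case described here would use several components per class, chosen analogously) and placeholder class statistics supplied by the caller:

    import numpy as np

    def classify(pixel: np.ndarray, means: list, covs: list) -> int:
        """Return the index of the class with the highest Gaussian log-likelihood."""
        best_class, best_score = -1, -np.inf
        for k, (mu, cov) in enumerate(zip(means, covs)):
            diff = pixel - mu
            sign, logdet = np.linalg.slogdet(cov)
            # Log of the multivariate normal density; the constant term is
            # dropped because it is identical for every class.
            score = -0.5 * (logdet + diff @ np.linalg.solve(cov, diff))
            if score > best_score:
                best_class, best_score = k, score
        return best_class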
Classified data is output from the system 10 through a data output subsystem, shown generally at 20, which includes a printer 42 for the production of a hard copy record of the classified data, as represented by symbol 44.
With reference to FIGURE 2, the function of the user-interactive subsystem 16 is to determine the spectral signatures of subject classes that are believed to exist within the body of multispectral image data representative of a scanned scene. Signature computation is necessarily a man/machine interactive process.

In broad outline, the operator must be able to call up from storage and display sets of data which, based on a priori information available to him, are believed to contain training samples of subject classes. He must discriminate between training and non-training sample areas in the data display.
The data elements included in the training sample areas must then be assembled for subsequent spectral analysis.
To facilitate these tasks, the user-interactive subsystem 16 of the present invention includes the three-color CRT display 32, the track ball cursor 34, and the alphanumeric keyboard 36 and monochrome CRT display 37. Each of these units is commercially available under the trademark RAMTEK.
The three-color CRT 32 provides the basic means of displaying imagery. The display uses MOS storage for screen refresh and allows display of 512 by 512 elements on the screen at 5 bits (plus an overlay bit) per scene element.
Each five-bit element may be translated into three four-bit signals by table-look-up memories. The overlay channel is employed to designate points as a cursor or to indicate boundary locations in designating fields for analysis.
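A minimal sketch of that table-look-up translation, assuming a hypothetical 32-entry table for each color gun (the actual table contents are not given in the patent):

    import numpy as np

    # Hypothetical look-up tables: each 5-bit screen element (0-31) indexes
    # three 4-bit intensities (0-15), one for each color gun of the CRT.
    lut_red = np.random.randint(0, 16, size=32, dtype=np.uint8)
    lut_green = np.random.randint(0, 16, size=32, dtype=np.uint8)
    lut_blue = np.random.randint(0, 16, size=32, dtype=np.uint8)

    def to_rgb(screen: np.ndarray) -> np.ndarray:
        """Translate an array of 5-bit elements (values 0-31) into 4-bit R, G, B planes."""
        return np.stack([lut_red[screen], lut_green[screen], lut_blue[screen]], axis=-1)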
Operator control is through the track ball cursor 34 and alphanumeric keyboard 36. The track ball cursor 34 controls the position of the cursor bit on the screen of the CRT display 32. Once the cursor bit is moved to the desired position, the operator can signal the same to the system by pressing the "ENTER" button 35 on the track ball cursor 34. The alphanumeric keyboard 36 and monochrome CRT 37 allow two-way communication between the operator and central processing subsystem 12.


A typical training sequence in the computation of a spectral signature is represented by FIGURE 3. The sequence starts out with the assumption that a base of digitized multispectral image data is on storage in the random access storage device 33. The base data is representative of a scene scanned from a remote, moving object, generally an aircraft or Earth satellite. The data on file is stored in a two-dimensional format, that is, scan line by scan line and picture element by picture element (pixel), which correlate to reference positions in the scanned scene.
In Step 100, the operator calls up from the random access storage device 33 and displays on the CRT 32 a set of the base of multispectral image data which he believes to contain a subject class. The data set is displayed in a two-dimensional ordered array. The data set selection is typically based on a priori information about the scanned scene, e.g., a ground truth photograph, as is represented by offline file 102.
In Step 104 the operator identifies the training sample on the screen of the CRT display 32. More specifically, in accordance with the present invention, the operator delineates the training sample data elements from the balance of data elements by defining a polygon representing the training sample.
For example, the dashed-line polygon 50 shown on the screen of the CRT display 32 in FIGURE 7 represents an airplane.
The polygon is defined by the sequential definition of its vertices. In the present invention this is effected as follows. The operator moves the track ball cursor 34 so that the cursor point 48 lies over a vertex on the screen of the CRT display 32. The identification of this vertex is communicated to the system by pressing the enter button 35 on the track ball cursor 34. The cursor point 48 is then moved to the next adjoining vertex and its identification is entered in a like fashion. This sequence is repeated for every vertex. When the last vertex has been entered, pressing the enter button 35 again without moving the cursor closes the polygon.
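As an illustration of this entry sequence, here is a minimal sketch assuming a hypothetical read_cursor() function that returns the cursor position each time the ENTER button 35 is pressed; the patent does not specify such a software interface:

    def capture_polygon(read_cursor):
        """Collect polygon vertices until ENTER is pressed twice at the same point."""
        vertices = []
        while True:
            x, y = read_cursor()          # blocks until the ENTER button is pressed
            if vertices and (x, y) == vertices[-1]:
                break                     # unmoved cursor: the polygon is closed
            vertices.append((x, y))
        return vertices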
Next, as indicated by Step 106, it is the task of the system software to identify the coordinates of the data elements contained within the polygon. For this purpose, the system employs the scanning routine represented in FIGURES 4 through 6.
The main routine for identifying data elements within the polygon is entitled IDENTIFY, and is shown in flowchart form in FIGURE 4. Two subroutines which are called by IDENTIFY are entitled RECTANGLE and POLYGON CHECK, and are shown in flowchart form in FIGURES 5 and 6.
Upon entering IDENTIFY at 200 in FIGURE 4, the subroutine RECTANGLE is immediately called in Step 202. The function of the subroutine RECTANGLE is to superscribe a minimum-area rectangle around the polygon, as illustrated by the long-short dashed line rectangle 52 around polygon 50 in FIGURE 7. Only data elements contained within this rectangle will be analyzed, a step taken to enhance program efficiency.
Referring to FIGURE 5, subroutine RECTANGLE is entered through Step 250. The routine uses several variables which are defined as follows: N = the number of polygon vertices; I = a dummy variable used for counting iterations;
VTXY is an N-dimensional vector whose ith element is the Y-coordinate value for the ith polygon vertex; VTXX is an N-dimensional vector whose ith element is the X-coordinate value for the ith polygon vertex; MAXY is equal to the largest element in the VTXY vector; MINY is equal to the smallest element in the VTXY vector; MAXX is equal to the largest element in the VTXX vector; MINX is equal to the smallest element in the VTXX vector.
Basically, RECTANGLE is an iterative sorting routine which examines each polygon vertex to find the extreme X and Y coordinates. In Step 252 several variables are initialized; I is set to 1, and MAXY, MINY, MAXX and MINX are all set to values corresponding to the coordinates of the first polygon vertex.
The program then enters an iterative sorting loop, shown generally at 274. In Decision 254, I is compared to N to test if the loop has been completed.
If I is not greater than N, the "No" branch of Decision 254 is followed. Step 256 increments I by one unit.
The sort then begins.
Decision 258 compares VTXY (I) with the current value of MAXY. If greater, the "Yes" branch of Decision 258 is followed to Step 260 where MAXY is set equal to VTXY (I) .
If not greater, the "No" branch of Decision 258 is followed and MAXY is left unchangedO
Decision 262 compares VTXY (I) with the current value for MINY. If smaller, the "Yes" branch of Decision 262 is followed to Step 264 where MINY is set equal to VTXY (I).
If not smaller, the "No" branch of Decision 262 is followed and MINY is left unchanged.
Decision 266 compares VTXX (I) with the current value for MAXX. If greater, the "Yes" branch of Decision 266 is followed to Step 268 where MAXX is set equal to VTXX (I).
If not greater, the "No" branch of Decision 266 is followed and MAXX is left unchanged.

Decision 270 compares VTXX (I) with the current value for MINX. If smaller, the "Yes" branch of Decision 270 is followed to Step 272 where MINX is set equal to VTXX (I). If not smaller, the "No" branch of Decision 270 is followed to the reentry of Decision 254, and MINX is left unchanged.
When I becomes greater than N, i.e., when all polygon vertices have been examined, the "Yes" branch of Decision 254 is followed. At this point the four corners of a rectangle superscribing the polygon have been defined. In Step 276, four variables characterizing the rectangle in terms of the (X,Y) coordinates of two diagonally opposite corners are given values: LOWER RIGHT Y is set equal to MAXY, LOWER RIGHT X is set equal to MAXX, UPPER LEFT Y is set equal to MINY, and UPPER LEFT X is set equal to MINX. In Step 278 the program reenters the calling routine IDENTIFY.
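The RECTANGLE step can be sketched compactly in Python; the variable names follow the flowchart, while the function signature and return convention are assumptions:

    def rectangle(vtxx, vtxy):
        """Superscribe a minimum-area rectangle about the polygon vertices.

        vtxx, vtxy -- lists of the X and Y coordinates of the N polygon vertices.
        Returns (upper_left_x, upper_left_y, lower_right_x, lower_right_y).
        """
        # Initialize the extremes to the first vertex, then sort through the rest.
        maxx = minx = vtxx[0]
        maxy = miny = vtxy[0]
        for x, y in zip(vtxx[1:], vtxy[1:]):
            maxx, minx = max(maxx, x), min(minx, x)
            maxy, miny = max(maxy, y), min(miny, y)
        # UPPER LEFT holds the numerically smallest coordinates, LOWER RIGHT the largest.
        return minx, miny, maxx, maxy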
Referring again to FIGURE 4, having once identified the superscribing rectangle, each data element in the rectangle is examined to determine if it is contained within the polygon. Each data element is identified by its X and Y coordinates; X relating to the lateral position, and Y relating to the vertical position in the two-dimensional ordered array of data elements.
The examination begins with the numerically lowest X and Y coordinates in the rectangle, which is the upper left hand corner of the rectangle. Accordingly, in Step 204, Y is set equal to UPPER LEFT Y, and in Step 206, X is set equal to UPPER LEFT X. In Step 208 the subroutine POLYGON CHECK is called to determine if this data element is contained within the polygon.
Referring to FIGURE 6, POLYGON CHECK is entered at block 284. The only new variables introduced in this routine are dummy counting variables, COUNT and J, which are initialized in Step 286 to 0 and 1, respectively.
The program then enters an iterative examining routine, shown generally at 280, which requires N passes, where N equals the number of vertices. In overview, each data element is first simply tested at 288 to determine if it is vertically located between the Y coordinates of the Jth and (J + 1)st vertices, i.e., is it "beside" the polygon edge between these vertices. A point horizontally beside a vertex would be considered beside the lower edge only, by the choice of strict and non-strict inequalities. If so, a more elaborate evaluation at 290 determines whether the point is to the left of or on the polygon edge between the Jth and (J + 1)st vertices. If so, the variable COUNT is incremented by one unit at 292. If, at the end of the iterative loop, COUNT is odd at 298, then the data element is known to be to the left of an odd number of edges and hence contained within the polygon, and vice versa if COUNT is even.
In greater detail, iterative examining routine 280 is entered with Decision 288, which asks if the Y-coordinate value of the data element is intermediate in value between the Y-coordinates of the Jth and (J + 1)st polygon vertices, or equal in value to the greater Y-coordinate vertex. If not, it is clear that the data element is not horizontally "beside" the polygon edge defined by having as its end points the Jth and (J + 1)st vertices, and the "No" branch of Decision 288 is followed. It should be noted that this same inequality test has simultaneously ensured that the polygon edge is not horizontal.
If Decision 288 is satisfied, the "Yes" branch is followed to Decision 290, where the X and Y-coordinate values of the data element are examined in conjunction with the same two polygon vertices to see if the data element is on or to the left of the polygon edge. The difference between the X-coordinate value of the data element and the X-coordinate value of the Jth vertex is compared to the difference between the Y-coordinate value of the data element and the Y-coordinate value of the Jth vertex, scaled by the inverse of the slope between the Jth and (J + 1)st vertices, yielding the signed horizontal displacement of the data element from the edge. If the comparison shows the data element is strictly to the right of the edge, the "No" branch of Decision 290 is followed. If Decision 290 is satisfied and the "Yes" branch is followed, then in Step 292 COUNT is incremented by one unit.
In Step 294, J, the iterative loop counter, is incremented by one unit. Note that the "No" branches of Decisions 288 and 290 both flow into Step 294 when the conditions posed by those decisions are not satisfied.
In Decision 296, the current value of J is checked against N, the number of vertices. If J does not exceed N, the "No" branch is followed and the program re-enters Decision 288 for another iteration. If J exceeds N, iterative examining routine 280 has been completed for this data element and the "Yes" branch is followed.
In Decision 298 COUNT is tested to determine whether it is even or odd. If COUNT is odd, i.e., the data element is to the left of an odd number of non-horizontal polygon edges, it represents the condition that the data element is within the polygon. In this event, the "Yes" branch of Decision 298 is followed to Step 300 where this fact is registered and stored.
If COUNT is even, it represents the condition that the data element is not within the polygon. In this event, the "No" branch of Decision 298 is followed to Step 302 where this fact is registered and stored.
Both Steps 300 and 302 flow to Step 304, where program control is returned to the calling routine IDENTIFY.
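POLYGON CHECK as described above can be sketched as follows; the vertex lists are assumed to be ordered so that vertex N's edge runs back to vertex 1, and the exact treatment of points lying on the boundary may differ in detail from the flowchart:

    def polygon_check(x, y, vtxx, vtxy):
        """Return True if data element (x, y) lies within the polygon.

        Counts the non-horizontal edges lying on or to the right of the point;
        an odd count means the point is inside.
        """
        n = len(vtxx)
        count = 0
        for j in range(n):
            x1, y1 = vtxx[j], vtxy[j]
            x2, y2 = vtxx[(j + 1) % n], vtxy[(j + 1) % n]   # edge from vertex j to j+1
            # First test: is the element vertically "beside" this edge?  The mix of
            # strict and non-strict inequalities assigns a point level with a vertex
            # to the lower edge only, and skips horizontal edges entirely.
            if min(y1, y2) < y <= max(y1, y2):
                # Second test: horizontal position of the edge at this Y value,
                # obtained from the inverse slope of the edge.
                edge_x = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x <= edge_x:          # on or to the left of the edge
                    count += 1
        return count % 2 == 1            # odd count: inside the polygon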
Referring again to FIGURE 4, the IDENTIFY routine is re-entered at Step 208. In Step 212, X is incremented by one unit, which graphically corresponds to moving to the next columnar position in this row of data elements. In Decision 214, X is tested against LOWER RIGHT X to see if this row of data elements has been completed. If not, the "No" branch of Decision 214 is followed and the sequence of Steps 208 through 212 is repeated as heretofore described. If the row is completed, the "Yes" branch of Decision 214 is followed.
In Step 216, Y is incremented by one unit, which corresponds graphically to advancing to the next succeeding row. In Decision 218, Y is tested against LOWER RIGHT Y to determine if all the rows of data elements in the superscribing rectangle have been completed.
If not, the "No" branch of Decision 218 is followed to the re-entry of Step 206 where X is reset to UPPER LEFT ~, which corresponds graphically to starting back to the first columnar position in the row. Thereafter, Steps and Decisions 206 through 218 are repeated in the manner heretofore described.
If the condition posed by Decision 218 is satisfied, the "Yes" branch is followed. In Step 220 the routine IDENTIFY is terminated.
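Putting the pieces together, the IDENTIFY scan over the superscribing rectangle can be sketched using the rectangle() and polygon_check() sketches above; the list return value is an assumption, since the patent simply registers and stores each result:

    def identify(vtxx, vtxy):
        """Scan the superscribing rectangle and flag data elements inside the polygon."""
        upper_left_x, upper_left_y, lower_right_x, lower_right_y = rectangle(vtxx, vtxy)
        inside = []
        y = upper_left_y
        while y <= lower_right_y:                 # advance row by row
            x = upper_left_x
            while x <= lower_right_x:             # advance column by column
                if polygon_check(x, y, vtxx, vtxy):
                    inside.append((x, y))         # register and store this element
                x += 1
            y += 1
        return inside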
Referring again to FIGURE 3, in Step 108 the data elements identified to the polygonal training sample are translated from the display coordinate system to the random access file coordinate system. This may be accomplished through a file header which functions as a look-up table for accessing data element coordinates on the random access device.
In Step 110, the spectral signature for this sub-ject class, e.g., in this illustration an airplane, is computed.
In computing a signature the operator first decides if he wants to include any additional training sets of a like type that may be available. Next he selects the channels to be used in the computations. Also he specifies an allowable number of standard deviations from the mean data value for each channel to serve as a criterion for excluding spurious data elements from the computations. With this information, a signature is computed. In Step 112, the signature is stored on file.
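A minimal sketch of such a signature computation, on the assumption that the signature consists of the per-channel mean vector and covariance matrix of the retained training elements (the patent does not spell out the statistics), with the operator-specified standard-deviation limit used to screen out spurious elements:

    import numpy as np

    def compute_signature(samples: np.ndarray, channels: list, n_std: float = 3.0):
        """Compute a spectral signature from training-sample data vectors.

        samples  -- array of shape (num_elements, num_channels)
        channels -- operator-selected channel indices to use
        n_std    -- allowable deviations from the mean before an element is excluded
        """
        data = samples[:, channels].astype(np.float64)

        # Exclude spurious elements lying more than n_std standard deviations
        # from the per-channel mean in any selected channel.
        mean, std = data.mean(axis=0), data.std(axis=0)
        keep = np.all(np.abs(data - mean) <= n_std * np.maximum(std, 1e-9), axis=1)
        data = data[keep]

        # The signature: mean vector and covariance matrix over the retained elements.
        return data.mean(axis=0), np.cov(data, rowvar=False)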
Decision 114 asks if there are additional classes for which signatures are to be obtained. If yes, the process of signature determination is repeated. If no, this phase of the overall processing sequence is completed, as indicated by terminal 116.
Once the spectral signatures of the subject classes have been obtained, the base of multispectral image data may be classified and displayed by the multispectral image data processor 10 which was described in connection with FIGURE 1.
The invention has further utility in the extraction of test areas from a base of multispectral image data. The test areas differ from the training samples only to the extent that they are extracted after the derivation of spectral signatures and are used to test the classification accuracy of those signatures.
FIGURE 8 sets forth the steps for the overall testing procedure. The first four steps, denominated 320, 324, 326 and 328 in FIGURE 8, are in direct parallel to Steps 100, 104, 106 and 108 in the training sample extraction process as was hereinbefore discussed in connection with FIGURE 3.
Therefore, they will only be discussed briefly as follows:
In Step 320 the operator displays a set from the base of multispectral image data which he believes, based on ground truth information as represented by offline file 322, to contain the test area of interest. In Step 324 the operator uses the polygon identification method of the present invention to delineate the test area from the balance of the display data.
In Step 326 the display data is examined to determine which elements are contained within the polygon and which are outside the polygon. In Step 328 those data elements identified to the polygon are translated back into the data base coordinates.
Once the test area has been extracted, it is subjected to a classification procedure in Step 330. The known spectral signatures, as represented by offline file 332, are used to classify the test area. In Step 334 the results of the classification step are output onto a visual display medium for a subsequent evaluation of the accuracy of the signatures.
The sequence terminates with the "STOP" instruction of Step 336.
As will be apparent to those skilled in the art, the invention has further utility beyond the classification of multispectral image data. The concept of the present invention may be generalized to apply to the extraction of data samples from any class of multivariate data that may be spatially represented in a two-dimensional ordered array.
Widely varying embodiments of the invention will suggest themselves to those having skill in the art without departing from the scope and essence of the following claims.

Claims (12)

THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE
PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. In a system for the classification of a body of multi-spectral data vectors which represents a multivariate scene having a plurality of subject classes, the system having means for determining the spectral signature of each subject class and means for classifying the body of multispectral data vectors in accordance with the spectral signatures, the improvement comprising the inclusion in the means for determining the spectral signature of each subject class of: storage means for storing the body of multispectral data vectors; display means for displaying a data set from the body of data vectors which contains a subject class in a format forming a two-dimensional, ordered array; operator control means responsive to an operator identification of the vertices of a polygon representing a subject class contained within a displayed set of data vectors;
means for identifying data vectors contained within the polygon defined by the operator; and means for operating on data vectors identified as being contained within the polygon to compute a spectral signature.
2. The system as defined in claim 1, wherein the storage means includes a random access memory unit.
3. The system as defined in claim 1, wherein the display means includes a cathode ray tube display.
4. The system as defined in claim 3, wherein the operator control means includes a cursor in communication with the cathode ray tube display.
5. The system as defined in claim 4, wherein the cursor position on the cathode ray tube display is controlled by a manually-operable, interactive track ball.
6. In a system for classifying a body of multispectral data vectors which represents a multivariate scene, the system having means for determining the spectral signatures of training samples extracted from selected portions of said body and means for classifying each data vector in the body into one of a plurality of classes, with each class being defined by the spectral signature of one of the training samples, wherein the improvement comprises means for automatically extracting the training samples from the body of data vectors, said extraction means comprising: display means for displaying said body of data vectors in a two dimensional ordered array; operator control means interactive with said display means for defining a polygon surrounding portions of said body on the display means thereby delineating a desired training sample; storage means for storing the location of the vertices of said polygon; means for automatically sequentially comparing the position of each data vector on the display means with the training sample area defined by the stored location of the vertices of the polygon;
and means for providing data associated with only those data vectors within the polygon to said signature determining means wherein the spectral signature of the training sample is determined from those selected data vectors, said training sample defining one of a plurality of classes against which the multispectral signatures of all of the base data vectors can be compared for classification.
7. The improvement of claim 6 wherein said operator control means includes means for manually positioning a cursor on the display means for locating the vertices of said polygon.
8. The improvement of claim 7 wherein said cursor is positioned on the display means by manually rotating a track ball in the desired direction, and wherein an electrical signal coupled to said storage means provides an indication of the location of the cursor on the display means.
9. The improvement of claim 8 wherein said control means connects adjoining vertices with a line on the display means thereby providing the edges of the polygon.
10. The improvement of claim 9 wherein said comparison means firstly compares the data vector with the vertical location of adjoining vertices to determine if it is located therebetween and if so, secondly, determines whether the data vector is to the left of the edge defined by said vertices whereby said data vector is within said polygon if it is to the left of an odd number of such polygon edges.
11. The improvement of claim 10 which further comprises:
means cooperating with said control means for further defining a rectangle closely superscribing the polygon; and means for limiting the comparison of the displayed data vectors to those data vectors contained within the rectangle.
12. The improvement of claim 11 wherein said display means is a color cathode ray tube, with said body of data vectors being displayed on said display means in one color before training sample extraction, and wherein the finally classified data vectors are displayed on the display means as a multicolored representation, with each color corresponding to a particular multispectral class.
CA277,836A 1976-06-14 1977-05-06 Method and apparatus for obtaining multi-spectral signatures Expired CA1096504A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US69602876A 1976-06-14 1976-06-14
US696,028 1976-06-14

Publications (1)

Publication Number Publication Date
CA1096504A true CA1096504A (en) 1981-02-24

Family

ID=24795409

Family Applications (1)

Application Number Title Priority Date Filing Date
CA277,836A Expired CA1096504A (en) 1976-06-14 1977-05-06 Method and apparatus for obtaining multi-spectral signatures

Country Status (2)

Country Link
CA (1) CA1096504A (en)
DE (1) DE2725927A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4566126A (en) * 1982-04-30 1986-01-21 Fuji Electric Company, Ltd. Pattern discriminator
DE19802781A1 (en) * 1998-01-26 1999-07-29 Peter L Prof Dr Andresen Quick identification of valuable objects by digital image analysis

Also Published As

Publication number Publication date
DE2725927A1 (en) 1977-12-22
DE2725927C2 (en) 1988-04-28

Similar Documents

Publication Publication Date Title
US4167729A (en) Apparatus for obtaining multi-spectral signatures
US6804394B1 (en) System for capturing and using expert's knowledge for image processing
EP0183347B1 (en) Video image recognition system
Umbaugh et al. Feature extraction in image analysis. A program for facilitating data reduction in medical image classification
EP0796474B1 (en) Method for avoiding redundant identification (caused by artifacts in images of bacterial colonies)
US20040228526A9 (en) System and method for color characterization using fuzzy pixel classification with application in color matching and color match location
US20070154088A1 (en) Robust Perceptual Color Identification
US8189915B2 (en) Method for segmentation in an n-dimensional feature space and method for classifying objects in an n-dimensional data space which are segmented on the basis of geometric characteristics
US6728407B1 (en) Method for automatically determining trackers along contour and storage medium storing program for implementing the same
Laws The Phoenix image segmentation system: Description and evaluation
Yadav et al. An improved deep learning-based optimal object detection system from images
CA1096504A (en) Method and apparatus for obtaining multi-spectral signatures
Amorim et al. Analysing rotation-invariance of a log-polar transformation in convolutional neural networks
Preston Jr Image processing software A survey
CN114792300B (en) X-ray broken needle detection method based on multi-scale attention
Hao et al. Active cues collection and integration for building extraction with high-resolution color remote sensing imagery
CN115019396A (en) Learning state monitoring method, device, equipment and medium
US10248697B2 (en) Method and system for facilitating interactive review of data
Ledley Analysis of cells
CN114550179A (en) Method, system and equipment for guiding handwriting Chinese character blackboard writing
Jacot-Descombes et al. Labolmage: a workstation environment for research in image processing and analysis
Pohle-Fröhlich et al. Roof Segmentation based on Deep Neural Networks.
CN113096080A (en) Image analysis method and system
CN113096079A (en) Image analysis system and construction method thereof
Kowalski et al. Convolutional neural networks in the ovarian cancer detection

Legal Events

Date Code Title Description
MKEX Expiry