GB2108306A - Pattern recognition apparatus and method - Google Patents


Info

Publication number
GB2108306A
GB2108306A GB08227791A GB8227791A
Authority
GB
United Kingdom
Prior art keywords
contour line
pattern
input
segment
line segments
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB08227791A
Other versions
GB2108306B (en)
Inventor
Yoshiaki Kurosawa
Haruo Asada
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Tokyo Shibaura Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP56165020A external-priority patent/JPS5866176A/en
Priority claimed from JP56165019A external-priority patent/JPS5866175A/en
Priority claimed from JP56165021A external-priority patent/JPS5866177A/en
Application filed by Tokyo Shibaura Electric Co Ltd filed Critical Tokyo Shibaura Electric Co Ltd
Publication of GB2108306A publication Critical patent/GB2108306A/en
Application granted granted Critical
Publication of GB2108306B publication Critical patent/GB2108306B/en
Expired legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/18Extraction of features or characteristics of the image
    • G06V30/182Extraction of features or characteristics of the image by coding the contour of the pattern
    • G06V30/1823Extraction of features or characteristics of the image by coding the contour of the pattern using vector-coding

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Character Discrimination (AREA)
  • Image Analysis (AREA)

Abstract

A pattern recognition apparatus comprises: a reference memory (26) for storing data derived from reference patterns (Figure 3B), each having a plurality of contour line segments D1-D9; a microprocessor (20) which detects a contour line of an input pattern (Figure 3A), divides it into a plurality of contour line segments S1-S9, determines a plurality of numeric attributes for each segment, and stores them in a segment memory (24); and a matching section (30) for comparing the input pattern data with data from one of the reference patterns for verification purposes. The verifying process in the matching section (30) includes a step of comparing the data representing the segments of the input pattern with the data representing the segments of one of the reference patterns.

Description

SPECIFICATION

Pattern recognition apparatus and method

The present invention relates to a pattern recognition apparatus and method and, more particularly, to a pattern recognition apparatus which may recognize many types of input patterns, including handwritten characters, and a method for recognizing such patterns.
Development of techniques to recognize an unknown pattern and place it into one of several classes of patterns has recently been emphasized. It is said that the technique to recognize handwritten character patterns is more difficult than the technique to recognize printed character patterns, because the handwritten characters vary greatly among individuals, unlike the printed characters.
There have been many proposals to recognize input character patterns including handwritten character patterns. One proposal resolves an input character pattern into fine lines and recognizes the input character pattern on the basis of features of the structural data obtained through the resolving process. Another proposal detects a contour of the input character pattern and uses the contour data obtained for the pattern recognition.
According to the former method, which recognizes a character pattern using the fine lines, noise contained in the input tends to cause undesirable noise patterns to appear in the character pattern. This can lead to erroneous recognition of characters, resulting in unsatisfactory reliability of the pattern recognition.
According to the other prior art pattern recognition method, which detects the contour of the pattern, only the directionality of the contour sections is used as attribute data for the pattern recognition. To this end, the analysis must go into the details of the contour structure of the input pattern. The reference structure providing a reference pattern for the recognition processing is therefore unnecessarily complicated and large in size. The result is that the pattern recognition apparatus is complicated, and great difficulty is unavoidably encountered in setting and/or correcting the reference structure. This problem is particularly pronounced in pattern recognition of handwritten characters, where stable and satisfactory extraction and recognition of attribute data cannot be achieved.
Accordingly, an object of the present invention is to provide a new and improved pattern recognition apparatus and method which may stably recognize input patterns, including character patterns, with high accuracy through a relatively easy processing.
According to the pattern recognition apparatus and method of the present invention, upon receipt of an input pattern to be compared with a reference, a contour line of the pattern is detected and divided into a plurality of contour line segments. Predetermined kinds of attribute information are extracted from these contour line segments and are converted into digitized numeric data. The digitized attribute information extracted from one contour line segment is stored as one data unit in a first memory. Accordingly, a plurality of data units are stored in the first memory, in a one-to-one correspondence with the plurality of contour line segments. A second memory previously stores a number of reference patterns. These reference patterns have corresponding contour line segments which are combined together to form a contour line of a specific pattern. These contour line segments are stored in the second memory in the form of data units each including a plurality of digitized attribute data. The input pattern and one of the reference patterns are supplied to a matching section. The matching section compares the data unit corresponding to one contour line segment of the input pattern with the data unit corresponding to one contour line segment of the reference pattern.
This invention can be more fully understood from the following detailed description when taken in conjunction with the accompanying drawings, in which:
Figure 1A shows a vector diagram of reference direction codes applied in the prior pattern recognition method;
Figure 1B shows a model of a prior pattern recognition method in which a contour line of an input pattern is divided into a number of segments and direction attribute data of contour line picture elements are determined using the direction codes shown in Figure 1A;
Figure 2 schematically shows a block diagram of an overall arrangement of a pattern recognition apparatus as an embodiment of the present invention;
Figure 3A illustrates an example of an input pattern, as an object under recognition, applied to the pattern recognition apparatus shown in Figure 2;
Figure 3B illustrates one of the reference patterns previously stored in a reference memory of the pattern recognition apparatus of Figure 2;
Figure 4 shows a flow chart explaining a general pattern recognizing operation by the pattern recognition apparatus of Figure 2;
Figure 5 shows a flow chart explaining detailed processing steps for comparing a reference segment having P-type data of a reference pattern with an input segment in the flow chart of Figure 4;
Figure 6 shows a flow chart of detailed steps for comparing a reference segment having *-type data of the reference pattern with the input segment in the flow chart of Figure 4;
Figure 7 is a block diagram of an overall arrangement of a pattern recognition apparatus which is a modification of the embodiment shown in Figure 2;
Figure 8A shows an input pattern, as an object under recognition, applied to the pattern recognition apparatus of Figure 7;
Figure 8B shows one of the reference patterns previously stored in the reference memory of the pattern recognition apparatus in Figure 7; and
Figure 9 shows a flow chart illustrating a method to determine a verifying start position of an input pattern, which is applied in a step for verifying an input pattern with a reference pattern in a pattern recognizing operation of the pattern recognition apparatus shown in Figure 7.
Before describing the embodiments of the present invention, a prior contour line detecting type pattern recognition method will be described referring to Figures 1A and 1B. In the prior art, reference direction codes as shown in Figure 1A are used for dividing a contour line of an input pattern into a number of segments to obtain direction component data of the segments. The direction codes are expressed by eight types of direction vectors equiangularly arranged about a center point. These direction vectors are used as direction attribute data and designated by numerals "0" to "7". When an unknown, digitally processed pattern as shown in Figure 1B is applied as an input pattern to a prior recognition apparatus, the direction attribute data of picture elements 10a, 10b, 10c, ... in the contour line segments are determined on the basis of the direction component data shown in Figure 1A. According to the direction vector diagram shown in Figure 1B, a train of the direction attribute data (4, 4, 5, 5, 6, 6, 6, ...) in the character pattern shown in Figure 1B is set as a train of segments under recognition.
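The eight-code direction scheme of Figure 1A is essentially a chain code over steps between adjacent contour picture elements. A minimal sketch of how such codes might be assigned is given below; the mapping of code numbers to particular angles is an assumption, since the patent only states that eight equiangular codes "0" to "7" are used.

```python
import math

def direction_code(dx, dy):
    """Quantize a step (dx, dy) between adjacent contour picture
    elements into one of eight direction codes 0..7, each covering
    a 45-degree sector (code-to-angle mapping is illustrative)."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(round(angle / (math.pi / 4))) % 8

# A contour is then described by the train of codes between
# successive picture elements.
contour = [(0, 0), (1, 0), (2, 1), (2, 2)]
codes = [direction_code(x2 - x1, y2 - y1)
         for (x1, y1), (x2, y2) in zip(contour, contour[1:])]
```

The resulting train of direction attribute data plays the role of the train (4, 4, 5, 5, 6, 6, 6, ...) in the prior-art example, except under this sketch's own angle convention.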
The train of segments under recognition is made to automatically correspond, for every category, to a train of segment data of a reference pattern previously stored in a reference memory (not shown). The recognition of the input pattern is not completed until the train of segments under recognition of the unknown input pattern is made to match those of the reference pattern in an automaton manner. At the completion of the recognition process, it is recognized that the input graphic pattern is a numeral "2".
The prior graphic pattern recognition technique described referring to Figures 1A and 1B is considered an excellent and effective technique among those thus far proposed. The many problems noted below impair its effectiveness, however. In the above recognition technique, the contour structure of the input pattern is minutely recognized for every contour picture element, and the train of the direction attribute data on a number of contour segments is used for the pattern recognition. As a result, the reference structure is undesirably complicated and large in size. The reference pattern structure to be stored in the reference memory contained in the pattern recognition apparatus is also complicated, so that initial setting and/or correction of the reference structure is difficult. The difficulties are more pronounced in recognizing character patterns with great variations, such as Japanese letters. The complicated reference structure further brings about derivative problems, such as unstable extraction and recognition of the features of the pattern.
The reference direction codes used in determining the direction attribute data of a contour line of the input pattern are relatively coarse, as shown in Figure 1A, compared to the segmentation of the contour of the input pattern. The segmented direction attribute data obtained is therefore not that expected from the segmentation of the contour line of the input graphic pattern. As a result, the prior recognition technique is unsatisfactory in the accuracy of the extraction of the features of the pattern.
These problems have successfully been solved by the inventors of the present patent application, and a method for recognizing graphic patterns and an apparatus for recognizing the same will now be described using their preferred embodiments with reference to Figures 2 to 3B.
Referring to Figure 2, a graphic pattern, for example, a numeral "2", depicted on a plane 11 is optically scanned by a photovoltaic converter 12 containing an optical scanner (not shown) and is converted into electrical signals. The output signals of the photovoltaic converter 12 are sampled, quantized and converted into digital signals in an analog-to-digital (A/D) converter 14. Binary patterns thus obtained are temporarily stored in a pattern memory 16 through a data bus 18 under control of a microprocessor 20. The binary pattern is divided into connected region components. Persons skilled in the art are familiar with many dividing methods. One known dividing method is disclosed in Azriel Rosenfeld and Avinash C. Kak, "Digital Picture Processing", Academic Press, Section 9.1.3 (1976). Another is disclosed in J.W. Butler et al., "Automatic Analysis of Chromosomes", in Data Acquisition and Processing in Biology and Medicine, Vol. 3, pp. 261-275, Pergamon, Oxford (1963). The coordinates of the leftmost, rightmost, top and bottom points, and the area, of each of the connected regions resulting from the division according to the above-mentioned methods are calculated.
Subsequently, a boundary following processing starts at a point on a contour of the divided binary pattern, thereby to form a train of direction codes (direction attribute data). A method disclosed in Section 9.1.2 of the above-mentioned literature by Azriel Rosenfeld and Avinash C. Kak, for example, is available for the boundary following method. The direction code train is stored in a working memory (not shown) provided in the microprocessor 20. Then, curvatures at the respective points on the contour are calculated to form a train of curvatures (curvature attribute data). The curvature train is then stored in the working memory. The curvature train is subjected to a threshold processing on the basis of predetermined threshold values and the contour line of the binary pattern is resolved into a plurality of contour segments (referred to as segments).
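The curvature train can be approximated directly from the direction code train; the following sketch estimates a curvature value at each contour point as the signed change between successive direction codes. This is an illustrative approximation, not the exact calculation cited from Rosenfeld and Kak.

```python
def curvature_train(codes):
    """Form a train of curvature values from a train of direction
    codes: the signed change between successive codes, wrapped into
    -4..+3 steps of 45 degrees. Positive values indicate a turn in
    one sense (convex), negative values the other (concave)."""
    train = []
    for prev, cur in zip(codes, codes[1:]):
        delta = (cur - prev + 4) % 8 - 4   # wrap to the nearest turn
        train.append(delta)
    return train
```

Applied to the direction train (4, 4, 5, 5, 6, 6, 6) of the earlier example, this yields small positive curvature values at the two points where the direction rotates.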
In this case, for example, two values θ1 and θ2 are used as the threshold values (θ1 > 0 and θ2 < 0). When the curvature data αi of the segment currently being processed stays larger than θ1 (αi > θ1), it is judged that the contour line is convex. When αi < θ2, the contour line is concave. When the curvature data αi satisfies neither of the two conditions, it is judged that the contour line is straight. In this way, the segment resolving process is performed. The judging method is disclosed in Section 9.3.2 of the literature by Azriel Rosenfeld and Avinash C. Kak.
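The thresholding step above might be sketched as follows; the threshold values and the merging of runs of equally labelled points into segments are illustrative assumptions.

```python
def resolve_segments(curvatures, theta1, theta2):
    """Resolve a curvature train into contour segments by
    thresholding (theta1 > 0, theta2 < 0): convex where a > theta1,
    concave where a < theta2, straight otherwise. Consecutive points
    with the same label are merged into one segment."""
    labels = []
    for a in curvatures:
        if a > theta1:
            labels.append("convex")
        elif a < theta2:
            labels.append("concave")
        else:
            labels.append("straight")
    segments = []
    for i, lab in enumerate(labels):
        if segments and segments[-1][0] == lab:
            segments[-1][1].append(i)      # extend the current segment
        else:
            segments.append([lab, [i]])    # start a new segment
    return [(lab, idx) for lab, idx in segments]
```

Each returned pair names the judged shape of a segment and lists the contour point indices it covers.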
The calculating process for obtaining an average curvature of the segments thus obtained, the final direction, the coordinates of a final point, etc. is performed, and the result of the calculation is transferred to a segment memory 24 through the data bus 18. In this way, the features of the input pattern are extracted.
In an embodiment of the present invention, as schematically illustrated in Figure 3A, the input pattern is treated as a combination of the segments S1 to S9 resulting from the division of its contour line. Each segment provides one data unit as the attribute data, containing the direction code train, the curvature train, position data representing positions on the coordinates, length, the features of the segment, etc., and is expressed in the form of digitized data. Each input segment is thus made to correspond to a data unit made up of a plurality of digitized attribute data. The input pattern, digitally expressed by the segment data, may be modeled in the form of a data table in a matrix fashion as shown in Table 1, for example.
TABLE 1

Segment no.   Curvature   Direction   Position   Characteristic
1             a1          b1          c1         d1
2             a2          b2          c2         d2
3             a3          b3          c3         d3
...
N             aN          bN          cN         dN

The segment data, arranged in such a table, is stored in the segment memory 24.
A dictionary memory or reference memory 26 of Figure 2 previously stores segment train data made up of the attribute data of the segments of reference (dictionary) patterns belonging to a plurality of categories. The attribute data of a reference pattern is made up of a group of digitized numeric data arranged in a matrix fashion, as mentioned above. The reference segment train data includes the curvature, the direction, the position, and the characteristic data. One item of reference data is expressed by one data unit including a plurality of attribute data. For the curvature, direction and position data, upper limit level data and lower limit level data are arranged, setting a tolerable matching range to be used for pattern matching. These data may be arranged in a matrix fashion in a data table as shown in Table 2, for example.
TABLE 2

Segment   Curvature    Direction    Position     Charac-    Segment
no.       U      L     U      L     U      L     teristic   type
1         A1a    A1b   B1a    B1b   C1a    C1b   D1         P
2         A2a    A2b   B2a    B2b   C2a    C2b   D2
3         A3a    A3b   B3a    B3b   C3a    C3b   D3         *
...
N         ANa    ANb   BNa    BNb   CNa    CNb   DN         P

U: Upper Limit   L: Lower Limit

In the reference data units corresponding to one reference pattern, the curvature train data indicates curvatures of a plurality of contour line segments in the reference pattern, including the curvature upper limit value Aia (i = 1, 2, 3, ..., N) and the curvature lower limit value Aib. The direction train data represents a direction at a given position on each of the contour segments of the reference pattern, such as the final position, and includes the direction upper limit value Bia and the direction lower limit value Bib. The position data indicates a position of each of the contour segments of the reference pattern relative to the whole reference pattern, and includes the upper and lower limit values Cia and Cib. The digitized numeric data serving as the characteristic data indicates the attributes of the contour segments of the reference pattern, such as a convex part, a concave part, and a linear part, and is stored in the reference memory 26 of Figure 2 after it is converted into numerical data Di according to a data converting system. The reference data unit further includes the segment type data P, #, *, ... corresponding to the contour segments. The type data specifies the characteristics of each segment, such as merging, omission or division among the segment data, for giving flexibility to the verifying processing that compares the input pattern with each of the reference patterns stored in the reference memory 26.
The reference data unit of one reference pattern, containing the attribute data groups, additionally contains verification start segment position data and the category name of the reference pattern, such as "2". A number of reference data units with this format are stored in the reference memory 26 of Figure 2.
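A reference data unit of the Table 2 form could be represented in memory along the following lines; all field names, limit values, and the category shown are illustrative assumptions, not taken from the patent.

```python
# Hypothetical in-memory form of one reference pattern: a category
# name, a verification start segment, and one data unit per contour
# line segment. Each bounded attribute is an (upper, lower) limit
# pair defining its tolerable matching range.
reference_pattern = {
    "category": "2",
    "start_segment": 0,
    "segments": [
        {"curvature": (0.9, 0.3), "direction": (5, 3),
         "position": (20, 10), "characteristic": 1, "type": "P"},
        {"curvature": (0.2, -0.2), "direction": (7, 5),
         "position": (25, 15), "characteristic": 2, "type": "#"},
        # ... one data unit per reference contour line segment
    ],
}
```

The segment type field carries the P, # or * flag that steers the verification flow described below.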
The reference memory 26 is connected through a data bus 28 to a pattern verifying or matching section 30.
The matching section 30 performs a verification or matching of the input pattern with respect to one of the reference patterns for recognition of the input pattern, through a comparison of the divided input segments with the reference segments in a one-to-one correspondence. In the matching step by the matching section 30, the segment train data of the reference pattern contains the type data P, #, *, etc., as shown in Table 2. The type data P specifies a verification of the segment attribute data such that the input pattern segments, as objects under recognition, and the reference segments are individually compared in a one-to-one correspondence. Accordingly, for verifying a reference segment having the type data P attached thereto against the corresponding segment of the input pattern in the matching section 30, one of the input pattern segments is merely compared with the one segment having the type data P. In this case, if any one of the attribute data ai, bi, ci and di (i = 1, 2, 3, ..., N) of the input pattern segment falls outside the tolerance of the reference segment, the identity or similarity of the input pattern with the reference pattern is rejected.
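The P-type one-to-one tolerance check described above can be sketched as follows; the attribute names are illustrative, each reference attribute is taken as an (upper, lower) limit pair in the manner of Table 2, and scalar attributes are used for brevity.

```python
def matches_p(input_seg, ref_seg):
    """P-type verification: every attribute of the input segment must
    lie within the tolerable range peculiar to the corresponding
    reference segment; if any one attribute falls outside, the
    identity of the patterns is rejected."""
    for key in ("curvature", "direction", "position"):
        upper, lower = ref_seg[key]
        if not (lower <= input_seg[key] <= upper):
            return False
    return True

# A hypothetical reference segment with its tolerance ranges:
ref = {"curvature": (0.9, 0.3), "direction": (5, 3), "position": (20, 10)}
```

One input segment passes only if its curvature, direction and position all fall inside the corresponding ranges of `ref`.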
The type data # shown in Table 2 indicates that the reference segment may be omitted at the time of the verification. Accordingly, when the verification of a reference segment having the type data # is rejected, the identity or similarity of the input pattern with the reference pattern is not immediately rejected; its negation is deferred temporarily, while the input segment is allowed to be compared with the reference segment following the reference segment already compared.
The type data * shown in Table 2 allows the reference segment used in the verification to be merged with at least one other reference segment adjacent to it. Accordingly, under this data, it is possible to apply a verification of one segment in the input pattern against a reference segment section containing a plurality of reference segments in succession. In this way, when the similarity of the input segment with the reference segment section is affirmed, it is judged that the verification between both segments, at least at the reference segment section, has succeeded.
The verifying process in the matching section 30, performed under control of the microprocessor 20, will be described referring to Figures 4 to 6. The input pattern, of which the characteristics are thus extracted, is read out from the segment memory 24 under control of the microprocessor 20 and is supplied through the data bus 18 to the matching section 30.
As for the input pattern as the object under recognition, its contour line is divided into nine segments S1 to S9, for example, through the characteristic extracting process. Accordingly, in this case, the segment data I of the input pattern is given in a matrix form corresponding to the contents of Table 1.
For dividing the contour line of the input pattern into nine segments in the characteristic extracting step, the present embodiment employs a method in which inflection points on the contour line, where the curvature changes, are detected and the contour line is divided at these inflection points. In this case, average curvature data of the segments S1 to S9 is used for the curvature train data of the attribute data forming the segment data of the input pattern. The direction code train data is constituted of the direction data resulting from differentiating the final end part of each of the segments S1 to S9. The coordinate data at the final point of each segment, or relative position data of the specific segment with respect to the whole pattern, is applied for the position data.
When the input pattern is transmitted to the matching section 30, the matching section 30 starts to compare the reference pattern stored in the reference memory 26 with the input pattern. For example, when the reference pattern with contour line segments D1 to D9 shown in Figure 3B is transferred from the reference memory 26 through the data bus 28 to the matching section, both patterns are compared with each other on the basis of the segment train data. In the reference pattern, whose category is the numeral "2" as shown in Figure 3B, the type data # is assigned to the segment D4 and the type data * to the segment D8.
When the verification operation starts, the verifying start position on the contour line in the pattern is determined as shown in the flow chart of Figure 4. The attributes of the segments of both the patterns shown in Figures 3A and 3B are searched according to the verification start segment data stored in the reference memory 26 corresponding to the reference pattern of Figure 3B, and the segment at the verification start position is detected. For this segment detection, the segments of the input pattern are checked for the characteristics or features given by their attributes, successively and counterclockwise, for example, from the segment S1 of the input pattern located at the uppermost position of the graphic or character pattern. Through this checking operation, if the end point segment S9 is detected, the segment S9 is used as the verification start segment. In detecting this end point segment, whether the segment S9 is the end point segment or not is judged on the basis of data such as the angular difference Δθ between the final direction θ1 of the preceding segment S1 and the final direction θ9 of the segment S9, the segment length, the average curvature, etc. In particular, a segment having an angular difference Δθ of more than 180° and a relatively short segment length is easily judged to be the end segment.
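The search for the verification start segment might be sketched as below; the field names, the wrap-around treatment of the preceding segment, and the limit values are assumptions made for illustration.

```python
def find_verification_start(segments, angle_limit=180.0, length_limit=5.0):
    """Check the input segments successively (counterclockwise order
    assumed in the list) for an end point segment to use as the
    verification start segment: the angular difference between the
    final direction of the preceding segment and that of this segment
    must exceed angle_limit, and the segment must be relatively
    short. Returns the index of the start segment, or None."""
    for i in range(len(segments)):
        prev = segments[i - 1]   # preceding segment (wraps around)
        seg = segments[i]
        delta = abs(seg["final_dir"] - prev["final_dir"])
        if delta > angle_limit and seg["length"] < length_limit:
            return i
    return None
```

For a pattern like Figure 3A, the short, sharply turned end segment (S9 in the text) would be the one selected.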
When the segment S9 of the input pattern is selected as the end point segment, the attribute data of the segments of both the patterns of Figures 3A and 3B are successively compared clockwise from the segment S9 position. First, the matching section 30 compares the digitized numeric data of the end point segment S9 of the input pattern of Figure 3A with that of the segment D1 of the reference pattern. When, as the result of this comparing operation on the digitized numeric data constituting the attribute data, the segment numeric data of the input segment S9 is coincident with that of the reference segment D1 within the tolerable range set peculiar to the reference segment D1, the reference segment is advanced clockwise by one, and it is checked whether or not the reference segment D2 has the type data P, # or *, according to the flow chart of Figure 4. In the reference pattern of Figure 3B, since the next reference segment D2 has the type data P attached thereto, the verifying operation advances as indicated by the arrow designated by reference numeral 40 in the flow chart of Figure 4. Then, as shown in Figure 5, the comparing operation of the numeric data of the input segment S1 with that of the reference segment D2 is performed on the curvature train data, the direction data and the position data, in this order. In this comparing step, if any of the data components fails to coincide with those of the reference segment, it is treated as a matching error in the process flow indicated by arrow 42 in the flow chart of Figure 4. If the coincidence of the input segment S1 with the reference segment D2 is detected through the comparing operation, the reference segment is advanced along the process flow indicated by arrow 44 of the flow chart shown in Figure 4. Then, the comparison between the numeric attribute data of the succeeding input segment S2 and the reference segment D3 is performed, as in the previous case.
When the input segment S3 of Figure 3A is compared with the reference segment D4 of Figure 3B, the reference segment D4 has the type data #, and hence the verifying operation is advanced along the process flows indicated by arrows 46 and 48 in the Figure 4 flow chart. As a result, the same input segment S3 is compared with the reference segment D5 succeeding the reference segment D4, according to the process flow indicated by arrow 40 in the flow chart of Figure 4, and the coincidence between both segments is confirmed.
As in the above case, the matching section 30 completes the comparison of the input segment S5 and the reference segment D7, and the matching holds between them. Then, the verification process is applied to the input segment S6 and the reference segment D8. In this case, since the type data *, which permits merger with another segment, is attached to the reference segment D8, the verifying process in the matching section 30 progresses along arrow 52 of the process flow in the Figure 4 flow chart. The segment D8 is successively compared with the single segment S6 of the input pattern, the merged segments S6 and S7, and the merged segments S6, S7 and S8, as shown in the flow chart of Figure 6.
In Figure 6, the reference segment D8 is first compared with the single segment S6 of the input pattern. In this case, a back enable number (BEN), that is, a number which, when the verification fails, enables a return to the state of the merged segments at the time point when the verification last succeeded, is zero. The reason is that the segments of the input pattern are not yet merged and the single segment S6 of the input pattern is being compared with the reference segment D8. The comparison of the segment S6 of the input pattern of Figure 3A with the reference segment D8 of Figure 3B is performed on the curvature, the direction and the position data, in this order, with the result that the matching succeeds as indicated by arrow 54 of the process flow in Figure 6. Since the input segment S6 is then permitted to merge with the input segment S7 succeeding it, the segments S6 and S7 are merged into a first merged segment.
The BEN of the first merged segment is 1. The verification on the reference segment D8 is returned along the process flow indicated by arrow 56, and the reference segment is compared with the first merged segment on the curvature, the direction and the position data. Also in this case, the matching between both segments succeeds, and therefore the input segment S8 is added to the first merged segment to form a second merged segment made up of the input segments S6, S7 and S8.
The comparison of the second merged segment with the reference segment is rejected in the numeric data comparing process, on the curvature data for example, and a matching error is detected along the process flow indicated by arrow 58 in Figure 6. In this case, since BEN ≠ 0, the operation in the matching section 30 (Figure 2) progresses in the direction of arrow 60 in Figure 6, and the finally added segment S8 is removed from the second merged segment which failed in matching the reference segment D8. Specifically, the finally added input segment is subtracted from the input merged segment which failed in the matching, returning to the input merged segment which last succeeded in the matching, that is, the first merged input segment made up of the input segments S6 and S7. As a result, the two segments S6 and S7 contained in the merged input segment are compared with the reference segment D8, so that the matching successfully holds. Then, the verifying process returns to the above-mentioned step where the input segment S8 is compared with the reference segment D9 according to the flow chart shown in Figure 5, resulting in the success of the matching. In this way, the matching section 30 of Figure 2 completes the comparison of the input pattern of Figure 3A with the reference pattern of Figure 3B. The matching section 30 provides a category signal 72 representing the result of the pattern recognition, that is, the recognition that the input pattern is the numeral "2", to the succeeding output section 70.
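The *-type merging with BEN backtracking described for Figures 4 and 6 can be sketched as follows. This is a simplified model: `matches` stands in for the curvature/direction/position comparison against the reference segment's tolerances, and `max_merge` is an assumed cap on the number of merged input segments.

```python
def match_star(ref_seg, input_segs, start, matches, max_merge=3):
    """*-type verification with a back enable number (BEN): the
    reference segment is compared first with a single input segment,
    then with successively merged input segments. On a failure with
    BEN > 0, the finally added segment is removed, falling back to
    the merged form that last succeeded. Returns the list of merged
    input segment indices that matched, or None on a matching error."""
    merged = [start]            # indices of the merged input segments
    ben = 0                     # back enable number
    last_good = None
    while True:
        if matches(ref_seg, merged):
            last_good = list(merged)
            if len(merged) == max_merge or merged[-1] + 1 >= len(input_segs):
                return last_good
            merged.append(merged[-1] + 1)   # try merging the next segment
            ben += 1
        else:
            if ben == 0:
                return None     # single segment already fails
            merged.pop()        # remove the finally added segment
            return list(merged) # fall back to the last successful merge
```

In the text's example, S6 alone matches D8, S6+S7 matches, S6+S7+S8 fails, and the procedure falls back to S6+S7.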
Incidentally, when the input pattern fails to match all of the reference patterns previously stored in the reference memory 26, such an input pattern is rejected as unrecognizable by the matching section 30. In this case, if noise patterns generated by dust or the like are added to the rejected input pattern so that it is formed of a plurality of separated blocks, the smallest of these blocks is judged to be a noise pattern and is removed. Then, the verifying operation in the matching section 30 returns to the operation start step along the path indicated by arrow 62 in Figure 4, and the matching section 30 starts the verifying process again.
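The noise-removal retry can be sketched as follows. The representation of a "block" as a list of pixel coordinates and the use of pixel count as the size measure are assumptions made for illustration; the patent only specifies that the smallest separated block of a rejected pattern is judged to be noise and removed before verification restarts.

```python
def remove_noise_block(blocks):
    """Given the separated connected blocks of a rejected input pattern,
    drop the smallest block (judged to be a noise pattern such as dust)
    so that verification can be attempted again on the remainder."""
    if len(blocks) < 2:
        return blocks  # a single block has no separable noise to strip
    smallest = min(blocks, key=len)
    return [b for b in blocks if b is not smallest]
```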
According to the pattern recognition method and the apparatus as embodied in the present invention, a contour line of the input pattern is detected and divided into a plurality of segments by a predetermined process. These input segments are converted into digitized numeric attribute data containing the direction train data, the position data and the characteristic data. The numeric data are successively compared with the numeric attribute data of the reference patterns for every segment. Therefore, the data used for verification is plentiful, which improves both the accuracy and the likelihood of success of the pattern recognition. Further, the present invention can stably recognize a variety of graphics containing handwritten characters.
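A minimal sketch of the per-segment attribute data and the ordered comparison (curvature, then direction, then position) described in the embodiments might look like this. The field names, quantization, and tolerance test are illustrative assumptions, not taken from the patent text.

```python
from dataclasses import dataclass

@dataclass
class SegmentData:
    """Digitized numeric attribute data for one contour line segment
    (field names are illustrative)."""
    directions: list   # quantized direction train along the segment
    position: tuple    # representative (x, y) position
    curvature: int     # quantized average curvature
    kind: str          # characteristic: 'convex', 'concave' or 'linear'

def compare(a, b, tol):
    """Compare two segments on curvature, direction and position, in
    that order, succeeding only if every comparison is within tolerance."""
    if abs(a.curvature - b.curvature) > tol:
        return False
    if a.directions != b.directions:
        return False
    return all(abs(p - q) <= tol for p, q in zip(a.position, b.position))
```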
In segmenting a contour line of the input pattern into a plurality of segments, the segmentation is not fine but is rather rough as compared with the segmentation performed for every picture element as in the prior art. Therefore, the construction of the pattern recognition apparatus is simpler than that of the prior art, allowing high speed data processing.
Referring now to Figure 7, there is shown another embodiment of a pattern recognition apparatus according to the present invention. In the Figure, like reference numerals are used for designating like or equivalent portions in Figure 2. A microprocessor 82 is connected to a matching section 84 including a verification start position determining section (start segment pursuer) 86 and a reference matching section 88. An input character pattern "4" depicted on a plane 90 and partially modified as shown in Figure 8A is applied as an object of recognition to the pattern recognition apparatus shown in Figure 7. A reference pattern shown in Figure 8B is stored in the reference memory 26. It is now assumed that when the input pattern shown in Figure 8A is supplied to a start segment pursuer 86 of the matching section 84, the reference pattern of Figure 8B is transferred from the reference memory 26 to the start segment pursuer 86.
As in the previous case, a contour line of the input pattern of Figure 8A is divided into a plurality of segments T1 to T13. A contour line of the reference pattern shown in Figure 8B is also divided into a plurality of segments E1 to E13. Both sets of segments are expressed by the digitized numeric attribute data containing curvatures, directions, positions and characteristics, as in the previous embodiment.
In comparing the input pattern of Figure 8A with the reference pattern of Figure 8B, the start segment pursuer 86 first verifies a reference verification start position which is predetermined and made to correspond to the reference pattern of Figure 8B, and searches for a segment of the input pattern which satisfies the verification start position data in a predetermined direction. In the present embodiment, first data are prepared describing that, in the reference pattern of Figure 8B, the reference verification start position is located at an uppermost segment E1 in the pattern and the search is performed counterclockwise along a pattern contour line. Second data are also prepared describing that the verification start position resides in a segment having predetermined attribute data containing characteristic data of an end point and a large curvature. With this, a segment T1 of the input pattern of Figure 8A corresponding to the segment E1 in the reference pattern of Figure 8B is extracted by the start segment pursuer 86. The start segment pursuer 86 then determines whether or not the segment T1 satisfies the second data on the basis of the digitized numeric attribute data of the segment T1. In this case, since the segment T1 of Figure 8A has direction attribute data different from that of the reference segment E1, the segment T1 does not satisfy the second data and is rejected. Hence, the start segment pursuer 86 checks the search direction given by a search flag and searches the segments of the input pattern in that direction according to a process flow as given by an arrow 100 in the flow chart of Figure 9. The input pattern of Figure 8A is successively searched counterclockwise from the segment T1 through T13, T12, ..., and the comparison of these segments with the reference segment E1 is repeated in succession according to the direction of the verification process flow of Figure 9.
When the segment T10 of Figure 8A is compared with the reference segment E1 of Figure 8B, the segment T10 satisfies the second data. Accordingly, the start segment pursuer 86 detects the segment T10 as the verification start segment. In this way, the verification start segment of the input pattern, whose basic pattern is the same as that of the reference pattern of Figure 8B but whose partial deformation from the reference pattern is relatively large, is determined for the reference pattern of Figure 8B. When the verification start segment is not detected although all the input segments of the input pattern have been searched for the reference segment E1, the microprocessor 82 responds to the start segment pursuer 86 and judges that the verification of the input pattern against the reference pattern fails. Then, it obtains another reference pattern from the reference memory 26, supplies it to the start segment pursuer 86, and performs the search operation for the start segment as mentioned above. The operation following the detection of the verification start segment in the input pattern is similar to that of the above-mentioned embodiment, and hence no further explanation will be given.
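The start segment pursuer's search can be sketched as a bounded scan around the input contour. This is an assumed model: counterclockwise traversal is represented as decreasing index with wraparound, and `satisfies_start_data` stands in for the test against the second data (end-point characteristic and large curvature).

```python
def find_start_segment(segments, satisfies_start_data, start=0):
    """Search the input contour segments counterclockwise (modelled as
    decreasing index, wrapping around the closed contour) for the first
    segment satisfying the reference verification-start attribute data.
    Returns its index, or None when every segment has been tried, in
    which case matching against this reference pattern fails."""
    n = len(segments)
    for step in range(n):
        i = (start - step) % n
        if satisfies_start_data(segments[i]):
            return i
    return None
```

In the example above, the scan would start at T1, reject it on its direction data, continue through T13, T12, ..., and stop at T10, the first segment satisfying the start data.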
According to the present embodiment, even when the input pattern is partially modified from the reference pattern, although the basic structures of the two patterns are equal to each other, the verification start segment may be detected relatively easily. Consequently, the present embodiment may reduce the probability of occurrence of erroneous recognition of the input pattern without making the verification algorithm complicated, thus enhancing the pattern recognition.
Although the present invention has been shown and described with respect to particular embodiments, nevertheless, various changes and modifications which are obvious to a person skilled in the art to which the invention pertains are deemed to lie within the spirit, scope, and contemplation of the invention. For example, in the above-mentioned embodiments, for merging the segments of the input pattern, a specific segment is successively merged with the adjacent input segments according to the type data; however, the segment merging method may be changed variously within the scope of the invention.

Claims (12)

1. A pattern recognition method for identifying an input pattern as one of reference patterns to which the input pattern is most similar, said method comprising: (a) a first step for detecting a contour line of the input pattern and for dividing the contour line into a plurality of input contour line segments; (b) a second step for extracting predetermined kinds of attribute information for each of said input contour line segments to convert the attribute information into digitized numeric data, and for storing said numeric data as first data units respectively corresponding to said input contour line segments in a first memory device, each of said first data units corresponding to each of said input contour line segments and including a plurality of numeric attribute data; (c) a third step for supplying said first data units to a pattern matching section and for supplying one of reference patterns previously stored in a second memory device to said pattern matching section, said reference patterns each having a plurality of reference contour line segments resulting from division of a contour line of each of the reference patterns, each of said reference contour line segments being converted into second data units each of which corresponds to each of said reference contour line segments and includes a plurality of numeric attribute information stored in the second memory device; and (d) a fourth step for numerically comparing at least one of said first data units with at least one of said second data units so as to verify said input pattern with respect to said one reference pattern for identity of their contour line segments.
2. The method according to claim 1, wherein said first step includes a fifth step for calculating curvatures at a plurality of points on a detected contour line of said input pattern and for detecting deflection points where the curvatures change, and a sixth step for dividing said detected contour line into a plurality of contour line segments at said deflection points.
3. The method according to claim 2, wherein said attribute information includes direction information, average curvature information and position information for each of said input contour line segments and reference contour line segments, said direction, average curvature and position information being converted into digitized numeric data.
4. The method according to claim 2, wherein said first step further includes a seventh step for detecting line segment configurative features of each of said contour line segments such as one of convex, concave and approximate linear configuration on the basis of the curvatures obtained in said fifth step.
5. The method according to claim 4, wherein said attribute information includes direction information, average curvature information and position information for each of said input contour line segments and reference contour line segments, said direction, average curvature and position information being converted into digitized numeric data.
6. The method according to claim 4, wherein said fourth step includes an eighth step for determining a contour line segment as a verifying start part on the basis of the plurality of contour line segments of said input pattern and said one reference pattern, and a ninth step for comparing said first data unit of one of said input contour line segments of said input pattern with said second data unit of one of said reference contour line segments of said one reference pattern and, when a similarity holds between said first and second data units, for repetitively performing a similar comparing operation between the succeeding contour line segments of both said patterns in a predetermined tracing direction along the contour line.
7. The method according to claim 6, wherein said ninth step includes a step for advancing the processing to a comparison processing between the further succeeding contour line segments of both said patterns regardless of the result of the comparison between said succeeding contour line segments of said input pattern and said one reference pattern when said succeeding contour line segment of said reference pattern is a predetermined type of the contour line segment.
8. The method according to claim 6, wherein said ninth step includes a step for merging a plurality of input contour line segments containing said succeeding contour line segment of said input pattern to obtain a merged segment structure and for comparing said merged segment structure with said succeeding contour line segment of said one reference pattern when said succeeding contour line segment of said one reference pattern is a predetermined type of the contour line segment.
9. The method according to claim 6, wherein said eighth step includes a step for selecting a contour line segment, which has data unit corresponding to the data unit of a predetermined verification start contour line segment of said reference contour line segments, from said plurality of contour line segments of said input pattern so as to define the verification start segment in said input contour line segments.
10. A pattern recognition method substantially as hereinbefore described with reference to any Figure of the drawings.
11. A pattern recognition apparatus for identifying an input pattern as one of reference patterns to which the input pattern is most similar, said apparatus comprising: (a) first memory means for storing a plurality of reference patterns each having a plurality of contour line segments resulting from division of a contour line of said reference pattern, each of said contour line segments being expressed by a data unit containing a plurality of digitized numeric attribute data; (b) input means for optically detecting said input pattern and for converting said detected input pattern into an electrical signal; (c) arithmetic means connected to said input means, for detecting a contour line of said input pattern, for dividing the contour line of said input pattern into a plurality of input contour line segments, for extracting predetermined kinds of attribute information for each of said input contour line segments, and for converting the attribute information of said input pattern into digitized numeric data so as to form a single data unit for each of said input contour line segments; (d) second memory means, connected to said input means and said arithmetic means, for storing the data units of said input pattern so as to make the data units of said input pattern correspond to said contour line segments respectively; and (e) verifying means connected to said first memory means and second memory means, for receiving said detected input pattern and said one reference pattern and for comparing at least one of the data units representing one of said input contour line segments of said input pattern with at least one of said data units representing one of said reference contour line segments of said one reference pattern so as to verify said input pattern with respect to said one reference pattern for identity of their contour line segments.
12. A pattern recognition apparatus substantially as hereinbefore described with reference to any Figure of the drawings.
GB08227791A 1981-10-16 1982-09-29 Pattern recognition apparatus and method Expired GB2108306B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP56165020A JPS5866176A (en) 1981-10-16 1981-10-16 Pattern recognizing device
JP56165019A JPS5866175A (en) 1981-10-16 1981-10-16 Pattern recognizing device
JP56165021A JPS5866177A (en) 1981-10-16 1981-10-16 Pattern recognizing device

Publications (2)

Publication Number Publication Date
GB2108306A true GB2108306A (en) 1983-05-11
GB2108306B GB2108306B (en) 1985-05-15

Family

ID=27322423

Family Applications (1)

Application Number Title Priority Date Filing Date
GB08227791A Expired GB2108306B (en) 1981-10-16 1982-09-29 Pattern recognition apparatus and method

Country Status (2)

Country Link
DE (1) DE3238300A1 (en)
GB (1) GB2108306B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4443728C1 (en) * 1994-12-08 1996-04-04 Kronseder Maschf Krones Object shape identification system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0156343A2 (en) * 1984-03-26 1985-10-02 Hitachi, Ltd. Partial pattern matching method and apparatus
EP0156343A3 (en) * 1984-03-26 1988-09-14 Hitachi, Ltd. Partial pattern matching method and apparatus
EP0163885A1 (en) * 1984-05-11 1985-12-11 Siemens Aktiengesellschaft Segmentation device
US4731858A (en) * 1984-05-11 1988-03-15 Siemens Aktiengesellschaft Arrangement for the segmentation of lines
DE3633743A1 (en) * 1985-10-03 1987-04-09 Ricoh Kk CHARACTER RECOGNITION SYSTEM
EP0253397A2 (en) * 1986-07-17 1988-01-20 Matsushita Electric Industrial Co., Ltd. Shape recognition method
EP0253397A3 (en) * 1986-07-17 1990-05-30 Matsushita Electric Industrial Co., Ltd. Shape recognition apparatus
FR2854690A1 (en) * 2003-05-09 2004-11-12 France Etat Ponts Chaussees Broken particle e.g. ballast, morphology determining method for use in e.g. railway domain, involves processing relative positions of two consecutive segments that close in contour for determining criterion of morphology of material
WO2004102164A1 (en) * 2003-05-09 2004-11-25 Laboratoire Central Des Ponts Et Chaussees Method and device for determining the morphology of a divided material
WO2008074477A1 (en) * 2006-12-18 2008-06-26 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device, method and computer program for identifying characters in an image
US8660318B2 (en) 2009-10-30 2014-02-25 Fujitsu Frontech Limited Living body information registration method, biometrics authentication method, and biometrics authentication apparatus

Also Published As

Publication number Publication date
GB2108306B (en) 1985-05-15
DE3238300A1 (en) 1983-05-05

Similar Documents

Publication Publication Date Title
US5930380A (en) Method and apparatus for verifying static signatures using dynamic information
Amo et al. Road extraction from aerial images using a region competition algorithm
Mahmoud Arabic character recognition using Fourier descriptors and character contour encoding
US5058182A (en) Method and apparatus for handwritten character recognition
US4972499A (en) Pattern recognition apparatus
EP0114248B1 (en) Complex pattern recognition method and system
EP0538038B1 (en) Character recognition method &amp; apparatus
EP0355748A2 (en) A pattern recognition apparatus and method for doing the same
GB2108306A (en) Pattern recognition apparatus and method
Lee et al. Unconstrained seal imprint verification using attributed stroke graph matching
Chi et al. Separation of single-and double-touching handwritten numeral strings
Tou et al. Automatic recognition of handwritten characters via feature extraction and multi-level decision
De Vena Number plate recognition by hierarchical neural networks
Hu et al. Structural boundary feature extraction for printed character recognition
Yamagata et al. A handwritten character recognition system by efficient combination of multiple classifiers
CN112926590B (en) Segmentation recognition method and system for characters on cable
JP2658154B2 (en) Character identification method
JPS62271190A (en) Segment numeral recognizing system
JP3006823B2 (en) Character and word recognition methods
Espelid A Raster-to-Vector Conversion Concept Based on Industrial Requirements.
JP2851865B2 (en) Character recognition device
JP2658153B2 (en) Character identification method
CN117058697A (en) Extraction sequence prediction method, device, equipment and medium for case information
JP3138665B2 (en) Handwritten character recognition method and recording medium
Varga et al. The Application of Dynamic Programming to Object Identification

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 19970929