EP0149457B1 - Method of identifying contour lines

Method of identifying contour lines

Info

Publication number
EP0149457B1
Authority
EP
European Patent Office
Prior art keywords
candidate point
points
center position
contour
picture elements
Prior art date
Legal status
Expired - Lifetime
Application number
EP85100073A
Other languages
German (de)
French (fr)
Other versions
EP0149457A2 (en)
EP0149457A3 (en)
Inventor
Yuji Watanabe
Kozo Kato
Current Assignee
Komatsu Ltd
Original Assignee
Komatsu Ltd
Priority date
Filing date
Publication date
Priority claimed from JP59004451A external-priority patent/JPS60147886A/en
Priority claimed from JP59026580A external-priority patent/JPS60179881A/en
Application filed by Komatsu Ltd filed Critical Komatsu Ltd
Publication of EP0149457A2
Publication of EP0149457A3
Application granted
Publication of EP0149457B1
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components


Description

    BACKGROUND OF THE INVENTION
  • 1. Field of Invention
  • This invention relates to a method of identifying a contour line of an object to be detected by using an image pick-up device, for example, a television camera.
  • 2. Description of the Prior Art
  • Among prior art methods of calculating the values of characteristic parameters of an object from a television picture image (a multivalue picture image), a region method and a contour line extraction method may be mentioned.
  • According to the region method, the picture image is divided into a plurality of partial image regions each having substantially the same brightness and then the object is identified by judging the continuity of the regions on the assumption that respective surfaces of the object have similar brightness.
  • Although this method is effective for an object constituted by planes, when curved surfaces are present the processing becomes difficult. Moreover, since all picture image data are processed, the amount of data becomes excessive, making high-speed processing difficult.
  • On the other hand, the contour line extraction method utilizes the edges of the respective surfaces of the object: the points of the picture image at which the brightness changes rapidly are extracted as edges, and the edges are connected together to convert them into a line picture. This method contemplates detection of lines in the picture image, so that, compared with the region method which detects surfaces, the number of detected points and the quantity of information for investigating their continuity are small, and high-speed processing is possible.
  • The steps of identifying a circular body with the contour line extraction method will be described with reference to Figs. 1a through 1d. At first, an original picture image (shown in Fig. 1a) photographed with a television camera is differentiated along the respective scanning lines to extract a contour candidate point at which the brightness changes rapidly. Then, picture elements near this point are similarly differentiated, and the picture element having the maximum differentiated value is taken as a point continuous to the contour candidate point. This processing is repeated many times to obtain continuous contour points (a contour line candidate) (see Fig. 1c), and when these contour points form a closed loop (see Fig. 1d) they are deemed to represent an object.
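An illustrative fragment of the prior-art tracing step described above (not the patented method itself): the unvisited neighbour of the current contour point with the largest differentiated value is taken as the next contour point. The simple left-neighbour brightness difference standing in for the differentiated value is an assumption made for this sketch.

```python
# One tracing step of the prior-art contour extraction: examine the eight
# neighbours of the current point and pick the unvisited one whose
# differentiated value (here: brightness change with respect to its left
# neighbour, an illustrative choice) is maximal.

def next_contour_point(image, x, y, visited):
    best, best_d = None, -1
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if (dx, dy) == (0, 0):
                continue
            nx, ny = x + dx, y + dy
            if (nx, ny) in visited:
                continue
            d = abs(image[ny][nx] - image[ny][nx - 1])  # differentiated value
            if d > best_d:
                best, best_d = (nx, ny), d
    return best
```

On a vertical brightness edge, repeated calls walk along the edge, which is the behaviour the prior-art method relies on.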
  • According to this prior art contour extraction method, tracing of the contour candidate points is rendered difficult by the following factors.
    • (1) blooming caused by metal luster (see Fig. 2a),
    • (2) overlapping of object (see Fig. 2b),
    • (3) vague or unclear images caused by rust, stains, etc. on the object surfaces,
    • (4) distortion of the picture image caused by electrical noise.
  • As a consequence, there is the defect that an object actually present may not be detected. Furthermore, an identifying algorithm that solves these problems becomes so complicated that real-time processing is almost impossible.
  • The periodical "Systems Computer Controls", Vol. 4, No. 2, 1973, 61-70, Washington, U.S.; T. Sakai et al.: "Computer Analysis of Photographs of Human Faces" discloses a method for analyzing the contour of a human face, comprising subroutines divided into two blocks, each for detecting a part of the face. This method comprises a technique of searching for the right and left positions of the cheeks, the upper lip position, the nose position and the chin position from the binarized image of a human face to determine a search area where the contour line of the chin is assumed to exist on the basis of the positions of the other elements of the face. 19 radial lines are set from the upper lip position as the reference position, and a point closest to the upper lip position is obtained as a candidate point when a slit for detecting the degree of brightness is moved along the radial lines within the search area. It is judged that the chin contour detection has failed if no candidate point has been determined on three consecutive radial lines among the 19 radial lines.
  • The periodical "Computer Vision, Graphics and Image Processing", Vol. 25, No. 1, January 1984, 89-112, Academic Press, New York, U.S.; Y. Okawa: "Automatic Inspection of the Surface Defects of Cast Metals" describes a technique of setting a binary image of 192 × 192 picture elements to extract a contour line of a circular pulley by joining the consecutive picture elements whose binary values vary with respect to the eight adjacent picture elements. Thereafter, the diameter and the center point of the contour are determined by scanning along straight lines starting from an arbitrarily selected point of the contour line.
  • Finally, the periodical "IEEE Transactions on Computers", Vol. C-26, No. 9, September 1977, 882-894, New York, U.S.; M. Yachida et al.: "A Versatile Machine Vision System for Complex Industrial Parts" describes a method for recognizing a variety of complex industrial parts wherein a gray-level histogram of all the picture points is first computed to determine a threshold. Thereafter, the picture data are scanned. There is provided a line finder that first determines a search region around a specified location and applies a local gradient operator to find edge points in the region. The line finder then searches for an edge line in the region by seeking an optimum sequence of edge points.
  • None of these prior art methods gives a solution for accurately identifying a contour line having the above-mentioned defects (1) to (4).
  • It is the principal object of this invention to provide a novel method of accurately identifying the contour line of a circle or an object having a configuration similar to a circle or a portion thereof by using a simple electric circuit and simple processing steps.
  • The method of the present invention is defined by the features of claim 1.
  • Further objects and advantages can be more fully understood from the following detailed description taken in conjunction with the accompanying drawings, in which
    • Figs. 1a-1d are diagrammatic representations showing the steps of identifying an object according to a prior art method of contour extraction;
    • Figs. 2a and 2b are diagrammatic representations showing one example of prior art factors that render difficulty to trace a contour line;
    • Fig. 3 is a block diagram showing one example of the apparatus utilized to carry out the method of identifying a substantially circular contour line according to this invention;
    • Fig. 4 shows the degree of brightness of a picture image data stored in the RAM array shown in Fig. 3;
    • Figs. 5a and 5b are flow charts showing one example of the steps of processing executed by the central processing unit shown in Fig. 3;
    • Figs. 6a - 6e are diagrams for explaining the steps of the flow charts shown in Figs. 5a and 5b;
    • Fig. 7 is a flow chart showing successive steps of the processings executed by the central processing unit for identifying the continuity of the contour line;
    • Figs. 8a - 8e are diagrams useful to explain the chart shown in Fig. 7;
    • Figs. 9a - 9c respectively show examples of several picture elements including a contour candidate point, several picture elements on the outside of the contour candidate point, and several picture elements on the inner side of the contour candidate point;
    • Fig. 10 is a graph showing one example of a mean value of the brightness along the entire periphery of the contour line, and
    • Fig. 11 shows another example of the contour candidate point.
    DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A preferred embodiment of this invention will now be described with reference to the accompanying drawings.
  • It is assumed that the object to be detected is a circular body 1 shown in Fig. 3. An industrial television (ITV) camera 2 photographs the circular body or object in a predetermined field of view and sends a video composite signal containing the brightness signal of the input picture image to a synchronous isolating circuit 3 and an A/D converter 4. The synchronous isolating circuit 3 operates to separate a synchronizing signal from the video composite signal. The synchronizing signal thus separated is used to designate an address of a random access memory array (RAM array) 5, while the A/D converter 4 converts the input video composite signal into picture image data having 16 tones of brightness and writes the picture image data at the designated address. In this manner, picture image data corresponding to one picture and representing the brightness of the original picture shown in Fig. 4 is stored. Any desired picture image data can be read out by designating the X and Y addresses of the RAM array 5.
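The quantization to 16 tones of brightness performed by the A/D converter 4 can be illustrated as below. This is a minimal sketch: the 8-bit (0-255) raw sample range is an assumption, since the patent specifies only the 16 output tones.

```python
# Map a raw brightness sample to one of the 16 tones stored in the RAM
# array (as depicted in Fig. 4). The 0-255 input range is an assumption.

def to_16_tones(v):
    """Return the tone (0..15) for a raw brightness sample v (assumed 0-255)."""
    return max(0, min(v, 255)) * 16 // 256
```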
  • Memory means 6 stores the main program or the like for carrying out the method of this invention, and a central processing unit (CPU) 7 executes the processing of the picture image data stored in the RAM array in accordance with the content of the main program.
  • The steps of processing executed by the CPU 7 will be described as follows with reference to the flow charts shown in Figs. 5a and 5b and the diagrams shown in Figs. 6a - 6e.
  • At step 100 shown in Fig. 5a, the threshold value D of the differentiated value, the diameter L of the circular body, the number of scannings Ns in the radial direction, and the preset number No of the contour candidate points are set. The threshold value D is used to judge the contour candidate point at which the brightness of the picture image data varies rapidly. In this embodiment, the number of scannings Ns is selected to be 8 and the preset number No is selected to be 6.
  • After the initial setting has been completed, the present picture image data (see Fig. 6a) in the RAM array 5 is searched to find the center position candidate point of the circular body. The search for the center position candidate point is made by differentiating, in the X direction, the picture image data stored in the RAM array 5, and is based on the positions of two contour candidate points at which the brightness changes rapidly when the spacing between them approaches the set diameter L.
  • More particularly, at step 101, the number n of the contour candidate points is set to zero; at step 102, the present picture image is scanned in the X direction; and at step 103, the differentiated value D' of the picture image data is calculated. At step 104, a judgment is made as to whether the differentiated value D' has exceeded the threshold value D or not. When the differentiated value D' has exceeded the threshold value D, the coordinate position of D' is stored at step 105, and at step 106 n is incremented by one. At step 107, a judgment is made as to whether n is equal to or larger than 2. When the result of this judgment is YES, at step 108 the distance L' (see Fig. 6b) between any two stored points is calculated. Then at step 109, a judgment is made as to whether the distance L' is close to the initially set diameter L or not. When the result of this judgment is YES, at step 110, the center position candidate point C (X,Y) (see Fig. 6c) is calculated by utilizing the coordinate positions of the two points. It should be understood that the method of searching for the center position candidate point is not limited to the illustrated embodiment. For example, a method can be used wherein three or more contour candidate points are determined and the center position candidate point is calculated from a circle passing through three of these points.
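Steps 101 through 110 can be sketched as follows (illustrative Python, not part of the patent; the sketch covers a single scan line, and the spacing tolerance `tol` is an assumed parameter that the patent does not give):

```python
# Sketch of steps 101-110: one scan line of the stored picture image data
# is differentiated in the X direction, the coordinate of every point
# whose differentiated value D' exceeds the threshold D is stored, and
# when two stored points are spaced by roughly the preset diameter L,
# their midpoint is taken as the center position candidate point C(X, Y).

def find_center_candidate(row, y, D, L, tol=2):
    """row: brightness values along one scan line at height y."""
    stored = []                                  # coordinate positions (steps 105-106)
    for x in range(1, len(row)):
        d_prime = abs(row[x] - row[x - 1])       # differentiated value D' (step 103)
        if d_prime > D:                          # step 104
            stored.append(x)
    for i in range(len(stored)):                 # steps 107-109
        for j in range(i + 1, len(stored)):
            l_prime = stored[j] - stored[i]      # distance L' (step 108)
            if abs(l_prime - L) <= tol:          # step 109: L' close to L?
                return ((stored[i] + stored[j]) / 2, y)   # step 110: C(X, Y)
    return None
```

For a scan line crossing a bright disc of diameter 20, the two brightness jumps are 20 elements apart and their midpoint is returned as the candidate.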
  • Thereafter, the contour candidate point of the circular body is searched based on the center position candidate point of the circular body so as to check presence or absence of the contour line, that is, the circular body.
  • Referring now to Fig. 5b, at step 111, the number n of the contour candidate points is set to zero for scanning in a preset radial direction from the center position candidate point. Since the radius R (= L/2) of the circular body has been given, the region to be scanned is limited to a doughnut shaped region bounded by a circle having a permissible minimum radius (R - ΔR), and a circle having a permissible maximum radius (R + ΔR). The preset scanning directions are 8, that is 0 (+ X direction), π/4, π/2, 3π/4, π, 5π/4, 3π/2 and 7π/4 by taking the center position candidate point as the reference point (see Fig. 6d).
  • At the time of scanning in the respective radial directions, the maximum differentiated value Dmax is first set to zero at step 112. After that, at step 113, scanning is made in one of the eight radial directions and, at step 114, the differentiated value D' of the picture image data is calculated. At step 115, a judgment is made as to whether the differentiated value D' is larger than the maximum differentiated value Dmax or not. When the result is YES, at step 116, Dmax is replaced by D', so that at the end of the scan Dmax holds the largest differentiated value found in the scanning range. At step 117, a judgment is made as to whether the scanning in the range has been completed or not. When the scanning has been completed, at step 118, a judgment is made as to whether the maximum differentiated value Dmax has exceeded the threshold value D or not. When the result of this judgment is YES, at step 119, the number n of the contour candidate points is incremented by one. After that, at step 120, a judgment is made as to whether a contour candidate point is present or not in each of the eight scanning directions. When the total number n of the contour candidate points is larger than the preset number No (in this example 6) of the contour candidate points, it is judged that the contour line of the circular body is present in the doughnut-shaped region. On the other hand, when the total number n of the contour candidate points is less than the preset number No, the search for the center position candidate point of the circular body is performed again at step 121.
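Steps 111 through 121 may be illustrated as below (a sketch, not the claimed implementation; the simple one-dimensional differentiated value used along each ray is an assumption, while Ns = 8 and the annulus bounds follow the text):

```python
import math

# Sketch of steps 111-121: from the center position candidate point, scan
# the eight preset radial directions inside the doughnut-shaped region
# bounded by the radii R - dR and R + dR, keep the maximum differentiated
# value Dmax along each ray (steps 112-117), and count a contour candidate
# point whenever Dmax exceeds the threshold D (steps 118-119).

def count_contour_candidates(image, cx, cy, R, dR, D, Ns=8):
    n = 0                                     # number of contour candidate points (step 111)
    for k in range(Ns):                       # directions 0, pi/4, ..., 7*pi/4
        theta = 2 * math.pi * k / Ns
        d_max = 0                             # step 112
        prev = None
        for r in range(int(R - dR), int(R + dR) + 1):
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            v = image[y][x]
            if prev is not None:
                d_max = max(d_max, abs(v - prev))   # steps 114-116
            prev = v
        if d_max > D:                         # step 118
            n += 1                            # step 119
    return n                                  # compared with the preset number No at step 120
```

Every ray that crosses the brightness jump at the contour contributes one candidate, so for an intact circular body all eight directions are counted.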
  • Finally, when the presence of the contour line of the circular body is recognized, at step 122, an approximate circle is determined from n contour candidate points, and at step 123, the coordinates of the center of the circle and, if desired, its diameter are calculated (see Fig. 6e), thus finishing the processing of the picture image.
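The patent does not specify how the approximate circle of step 122 is determined from the n contour candidate points; one conventional possibility, sketched here purely for illustration, is the algebraic least-squares fit, which recovers the center coordinates and radius required at step 123.

```python
import math

# Algebraic least-squares circle fit (an assumed method, not taken from
# the patent): find a, b, c minimizing the residuals of
# x^2 + y^2 + a*x + b*y + c = 0, then recover center and radius.

def _det3(m):
    # determinant of a 3x3 matrix
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def _solve3(M, v):
    # Cramer's rule for the 3x3 linear system M x = v
    d = _det3(M)
    out = []
    for col in range(3):
        Mi = [row[:] for row in M]
        for r in range(3):
            Mi[r][col] = v[r]
        out.append(_det3(Mi) / d)
    return out

def fit_circle(points):
    """Return (cx, cy, radius) of the least-squares circle through points."""
    n = len(points)
    Sx = sum(x for x, _ in points); Sy = sum(y for _, y in points)
    Sxx = sum(x * x for x, _ in points); Syy = sum(y * y for _, y in points)
    Sxy = sum(x * y for x, y in points)
    Sz = sum(x * x + y * y for x, y in points)
    Sxz = sum(x * (x * x + y * y) for x, y in points)
    Syz = sum(y * (x * x + y * y) for x, y in points)
    a, b, c = _solve3([[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]],
                      [-Sxz, -Syz, -Sz])
    cx, cy = -a / 2, -b / 2
    return cx, cy, math.sqrt(cx * cx + cy * cy - c)
```

With exact points on a circle the fit reproduces the circle; with noisy contour candidate points it gives the approximate circle in the least-squares sense.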
  • When photographing a circular body with the ITV camera 2, where the center of the circular body is displaced relative to the ITV camera, the resulting contour line is not a true circle but an ellipse. The method of this invention is also applicable to such a case. Furthermore, the invention is also applicable to a substantially circular body (an ellipse close to a circle, or a polygon).
  • The number Ns of scannings in the radial direction, the direction of scanning, and the preset number No establishing the threshold value are not limited to those described above.
  • After the presence of the contour line of a circular body has been recognized, the continuity of the contour line is identified by the following method.
  • Fig. 7 shows a flow chart of the successive steps executed by the CPU 7 for identifying the continuity of the contour line. At step 200, characteristic points of the object to be detected are extracted from the present picture image data (see Fig. 8a) stored in the RAM array 5. In this example, since the object to be detected is assumed to be a circular body, the center position Po (Xo, Yo) of the circle is detected as its characteristic point (see Fig. 8b). The method of detecting the characteristic point is not limited to that described above: the characteristic point can also be detected by differentiating, in the X direction, the picture image data stored in the RAM array 5 so as to find the two contour candidate positions whose spacing becomes the maximum and at which the brightness changes abruptly, or by determining three or more contour candidate points and then calculating the center position of a circle passing through three of them.
  • Then at step 201, n contour candidate points are presumed from the characteristic point Po and the radius of the circle. For the sake of convenience, the respective contour candidate points are represented by Pi (i = 1...n) (see Fig. 8c). Then at step 202, i is set to 1, and at step 203 several picture elements Ci including the contour candidate point Pi, several picture elements Oi on the outside of the contour candidate point Pi, and several picture elements Ii on the inner side of the contour candidate point Pi are extracted (see Figs. 8c, 8d and 8e).
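The presumption of step 201 can be sketched as follows. The patent does not state how the n points are distributed around the circle; equal angular spacing is assumed here purely for illustration.

```python
import math

# Sketch of step 201: place n presumed contour candidate points Pi on the
# circle of the given radius centred on the characteristic point Po.
# Equal angular spacing is an illustrative assumption.

def presume_candidates(po, radius, n):
    xo, yo = po
    return [(xo + radius * math.cos(2 * math.pi * k / n),
             yo + radius * math.sin(2 * math.pi * k / n))
            for k in range(n)]
```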
  • The extraction should be made such that the three types of picture image data are arranged in a substantially normal direction with respect to the locus of the contour candidate points. The direction ψ of the normal is calculated by the following equation in accordance with the relative position between the characteristic point Po (Xo, Yo) and the contour candidate point Pi (Xi, Yi):
    ψ = tan⁻¹((Yi - Yo)/(Xi - Xo))

    The picture elements are extracted based on this direction ψ. Figs. 9a, 9b and 9c respectively show three picture elements Ci containing the contour candidate point Pi, three picture elements Oi on the outside of the contour candidate point Pi, and three picture elements Ii on the inside of the contour candidate point Pi.
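The selection of the three groups of picture elements along the normal direction ψ can be sketched as follows. The group size of three elements and the unit step length along the normal are illustrative assumptions.

```python
import math

# Sketch of step 203: the normal at a contour candidate point Pi lies
# along the line joining the characteristic point Po and Pi, so its
# direction is taken as psi = atan2(Yi - Yo, Xi - Xo). Three groups of
# picture elements are read along that direction: Ci centred on Pi, Ii
# shifted toward Po (inner side) and Oi shifted away from Po (outside).

def extract_groups(image, po, pi, size=3):
    xo, yo = po
    xi, yi = pi
    psi = math.atan2(yi - yo, xi - xo)        # direction of the normal
    dx, dy = math.cos(psi), math.sin(psi)     # unit step along the normal

    def group(offset):
        vals = []
        for k in range(-(size // 2), size // 2 + 1):
            x = int(round(xi + (offset + k) * dx))
            y = int(round(yi + (offset + k) * dy))
            vals.append(image[y][x])
        return vals

    return group(0), group(-size), group(size)  # Ci, Ii, Oi
```

For a point on the edge of a bright disc, Ii samples the bright interior, Oi the dark background, and Ci straddles the transition.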
  • At step 204, the mean values Ci, Oi and Ii of the brightness of the three types of the picture image data Ci, Oi and Ii are determined. After that at step 205, a judgment is made as to whether i is equal to one or not and when the result of judgment is YES, at step 206, i is made to be 2 to execute again the foregoing steps. When the result of judgment of step 205 is NO, the program sequence is advanced to step 207 where the differences Sc, So and SI between the mean values Ci, Oi and Ii and Ci-1, Oi-1 and Ii-1 which are obtained at adjacent contour candidate points are calculated according to the following equations.

    Sc = Ci - Ci-1    (1)
    So = Oi - Oi-1    (2)
    SI = Ii - Ii-1    (3)


       Then at step 208, a check is made as to whether the differences Sc, So and SI thus calculated fall within the permissible range (from the lower limit TL to the upper limit TH) preset as shown in Fig. 10. When these differences are all outside the permissible range, it is judged at step 211 that the adjacent contour candidate points are discontinuous. However, when any one of the three differences lies within the permissible range, it is judged at step 208 that the adjacent contour candidate points are continuous. More particularly, as shown in Fig. 10, between C and D there is a point at which one difference goes out of the permissible range, but since one of the other differences lies within the permissible range, it is judged that the adjacent contour candidate points are continuous. Then, at step 209, i is incremented by one for the purpose of checking whether the next pair of adjacent contour candidate points is continuous or discontinuous, and the above described steps are executed again.
  • When the continuity of all contour candidate points is confirmed at step 210, the processing is terminated. In other word, it is now judged that the contour line is continuous to identify the object.
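The continuity check of steps 204 through 211 can be summarized in a minimal Python sketch. This is not code from the patent; all names and the list-based representation of the pixel groups are hypothetical illustrations of the rule that adjacent contour candidate points are continuous when at least one of the three mean-brightness differences lies within the permissible range.

```python
def mean_brightness(pixels):
    """Average brightness of one small group of picture elements (step 204)."""
    return sum(pixels) / len(pixels)

def contour_is_continuous(groups, t_low, t_high):
    """groups: one (Ci, Oi, Ii) triple of pixel groups per contour candidate
    point Pi.  Adjacent candidate points are judged continuous when at least
    one of the differences Sc, So, SI lies in [t_low, t_high] (step 208)."""
    means = [tuple(mean_brightness(g) for g in triple) for triple in groups]
    for i in range(1, len(means)):
        # Sc, So, SI between candidate point Pi and its neighbour Pi-1 (step 207)
        diffs = [means[i][k] - means[i - 1][k] for k in range(3)]
        if not any(t_low <= d <= t_high for d in diffs):
            return False  # step 211: adjacent candidate points discontinuous
    return True  # step 210: the whole contour line is continuous
```

With a small tolerance range, gradual brightness changes between neighbouring candidate points pass the check, while a simultaneous jump in all three groups is reported as a break in the contour.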

Claims (3)

  1. A method of identifying a contour line, comprising the steps of:
    based on positions of more than two points, at which brightness differs by more than a predetermined value when scanning a screen on which a contour line of an object to be detected having a circular or substantially circular shape occurs, and on the radius (R) of the object, presuming a center position candidate point (C) of the object and presuming, as contour candidate points of the object, points (Pi(i=1...n)) apart from the center position candidate point (C) by a distance corresponding to the radius;
    selecting, from among picture elements (Ci) situated along a direction of a line connecting a specific contour candidate point of the presumed contour candidate points and the center position candidate point, a first group of picture elements (Ci) including a picture element (Pi) corresponding to the specific contour candidate point and several picture elements consecutively present along the direction of the line, the picture element (Pi) corresponding to the specific contour candidate point constituting the center of the first group of picture elements, a second group of picture elements (Ii) including several picture elements sequentially adjacent to the picture element (Pi) corresponding to the specific contour candidate point and situated in a first direction toward the center position candidate point, and a third group of picture elements (Oi) including several picture elements sequentially adjacent to the picture element (Pi) corresponding to the specific contour candidate point and situated in a second direction away from the center position candidate point;
    calculating average values (Ci,Ii,Oi) of brightness for the first, second and third groups of picture elements, respectively; and
    determining differences (Sc,Si,So) between the average values (Ci,Ii,Oi) obtained with respect to the specific contour candidate point and average values obtained by carrying out the selecting step and the calculating step for a picture element (Pi-1) corresponding to another contour candidate point adjacent to the specific contour candidate point corresponding to picture element Pi for the first, second and third picture element groups, respectively,
    wherein the adjacent contour candidate points corresponding to picture elements Pi and Pi-1 are recognized to be continuous when the difference with respect to at least one picture element group among the differences determined in the determining step is within a predetermined range, and wherein if said difference lies outside said range, said difference determining step is performed with respect to a picture element corresponding to another adjacent presumed contour candidate point.
  2. The method according to claim 1, wherein the presuming step includes the steps of:
    when the center position candidate point (C) is obtained, defining a region bounded by a circle constituted with points apart from the center position candidate point by a predetermined distance smaller than the radius (R) and a circle constituted with points apart from the center position candidate point by a predetermined distance larger than the radius;
    scanning the defined region in a plurality of different radial directions extending from the center position candidate point;
    judging that the contour line of the object is present within the defined region, when more than a predetermined number of scannings in which change of brightness has exceeded the predetermined value have been confirmed so as to utilize the contour candidate points determined by the center position candidate point (C) for recognizing the continuity of the contour line.
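The annular-region test of claim 2 can be sketched as follows. This is a hedged illustration, not the patent's implementation: the brightness function, sampling step, and all parameter names are assumptions. It scans several radial directions through a ring bounded by circles slightly smaller and larger than the radius R, and judges the contour present when enough scans show a brightness change exceeding the predetermined value.

```python
import math

def contour_in_annulus(brightness_at, cx, cy, r, dr, n_dirs, delta_t, min_hits):
    """Scan n_dirs radial directions through the annulus [r - dr, r + dr]
    around the center candidate (cx, cy).  The contour line is judged to be
    present when at least min_hits scans contain a brightness change
    exceeding delta_t between consecutive samples."""
    hits = 0
    for k in range(n_dirs):
        ang = 2 * math.pi * k / n_dirs
        # Sample brightness outward along one radial direction.
        samples = [brightness_at(cx + rr * math.cos(ang), cy + rr * math.sin(ang))
                   for rr in range(int(r - dr), int(r + dr) + 1)]
        if any(abs(b - a) > delta_t for a, b in zip(samples, samples[1:])):
            hits += 1
    return hits >= min_hits
```

A correct center candidate places the object's edge inside the ring, so nearly every radial scan crosses it; a badly placed candidate yields few or no qualifying scans and is rejected.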
  3. The method according to claim 1, wherein the presuming step includes the steps of:
    sequentially detecting two points at which the brightness varies more than the predetermined value, or detecting three points at which the brightness varies more than the predetermined value; and
    presuming, as the center position candidate point (C) of the object, a center position of the two points when the distance between the detected two points becomes equal to a diameter which is twice the radius (R) of the object, or a center position of the two points when the distance between the detected two points becomes maximum, or a center position of a circle passing through the detected three points.
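The last alternative of claim 3, the center of a circle through three detected points, is the classical circumcenter construction. A minimal sketch, with hypothetical names and not taken from the patent:

```python
def circumcenter(p1, p2, p3):
    """Center of the circle passing through three detected contour points
    (claim 3, third alternative).  Returns None when the points are
    collinear and no finite circle exists."""
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if d == 0:
        return None  # degenerate: the three points lie on one line
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy
```

The result serves as the center position candidate point (C), from which the contour candidate points Pi at distance R can then be presumed.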
EP85100073A 1984-01-13 1985-01-04 Method of identifying contour lines Expired - Lifetime EP0149457B1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP4451/84 1984-01-13
JP59004451A JPS60147886A (en) 1984-01-13 1984-01-13 Recognition method for continuity of profile line
JP59026580A JPS60179881A (en) 1984-02-15 1984-02-15 Recognizing method of approximately circular outline
JP26580/84 1984-02-15

Publications (3)

Publication Number Publication Date
EP0149457A2 EP0149457A2 (en) 1985-07-24
EP0149457A3 EP0149457A3 (en) 1989-02-22
EP0149457B1 true EP0149457B1 (en) 1993-03-31

Family

ID=26338218

Family Applications (1)

Application Number Title Priority Date Filing Date
EP85100073A Expired - Lifetime EP0149457B1 (en) 1984-01-13 1985-01-04 Method of identifying contour lines

Country Status (3)

Country Link
US (1) US4644583A (en)
EP (1) EP0149457B1 (en)
DE (1) DE3587220T2 (en)

Families Citing this family (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT1179997B (en) * 1984-02-24 1987-09-23 Consiglio Nazionale Ricerche PROCEDURE AND EQUIPMENT FOR THE DETECTION OF THE FOOTPRINT LEFT IN A SAMPLE TO THE MEASURE OF THE PENETRATION HARDNESS
GB2175396B (en) * 1985-05-22 1989-06-28 Filler Protection Developments Apparatus for examining objects
SE448126B (en) * 1985-05-23 1987-01-19 Context Vision Ab DEVICE FOR THE DETECTION OF LANGUAGE CHANGES OF A PROPERTY WITHIN A AREA OF A DISTRIBUTED IMAGE DIVISION
SE448124B (en) * 1985-05-23 1987-01-19 Context Vision Ab DEVICE FOR DETECTING THE VARIATION RATE OF A PROPERTY IN A AREA OF A DISTRIBUTED IMAGE DIVISION
SE448125B (en) * 1985-05-23 1987-01-19 Context Vision Ab DEVICE FOR DETERMINING THE DEGREE OF CONSTANCE WITH A PROPERTY OF A AREA IN A DISCRETE IMAGE DIVISION DIVIDED
KR900007548B1 (en) * 1985-10-04 1990-10-15 다이닛뽕스쿠링세이소오 가부시키가이샤 Pattern masking method and an apparatus therefor
JPS62103508A (en) * 1985-10-31 1987-05-14 Hajime Sangyo Kk Inspecting external configuration of object and instrument therefor
DE3542484A1 (en) * 1985-11-30 1987-07-02 Ant Nachrichtentech METHOD FOR DETECTING EDGE STRUCTURES IN AN IMAGE SIGNAL
EP0289500A1 (en) * 1985-12-16 1988-11-09 National Research Development Corporation Inspection apparatus
JPS62209305A (en) * 1986-03-10 1987-09-14 Fujitsu Ltd Method for judging accuracy of dimension
JPS62209304A (en) * 1986-03-10 1987-09-14 Fujitsu Ltd Method for measuring dimension
US5214718A (en) * 1986-10-06 1993-05-25 Ampex Systems Corporation Scan-in polygonal extraction of video images
US4759074A (en) * 1986-10-28 1988-07-19 General Motors Corporation Method for automatically inspecting parts utilizing machine vision and system utilizing same
US4933865A (en) * 1986-12-20 1990-06-12 Fujitsu Limited Apparatus for recognition of drawn shapes or view types for automatic drawing input in CAD system
JP2596744B2 (en) * 1987-04-16 1997-04-02 富士写真フイルム株式会社 Radiation field recognition method
US4825263A (en) * 1987-06-02 1989-04-25 University Of Medicine & Dentistry Of New Jersey Optical method and apparatus for determining three-dimensional changes in facial contours
US4961425A (en) * 1987-08-14 1990-10-09 Massachusetts Institute Of Technology Morphometric analysis of anatomical tomographic data
EP0307948B1 (en) * 1987-09-18 1993-03-03 Toppan Printing Co., Ltd. Silhouette cutting apparatus
JP2735197B2 (en) * 1987-11-12 1998-04-02 株式会社東芝 Graphic input device
EP0335204B1 (en) * 1988-03-19 1992-07-29 Fuji Photo Film Co., Ltd. Method for determining the contour of an irradiation field
US4983835A (en) * 1988-03-19 1991-01-08 Fuji Photo Film Co., Ltd. Method for detecting prospective contour points of an irradiation field
US4901361A (en) * 1988-05-27 1990-02-13 The United States Of America As Represented By The Secretary Of The Air Force Automated spall panel analyzer
US5018211A (en) * 1988-10-31 1991-05-21 International Business Machines Corp. System for detecting and analyzing rounded objects
GB8906587D0 (en) * 1989-03-22 1989-05-04 Philips Electronic Associated Region/texture coding systems
JPH083405B2 (en) * 1989-06-30 1996-01-17 松下電器産業株式会社 Lead position recognition device
US5054094A (en) * 1990-05-07 1991-10-01 Eastman Kodak Company Rotationally impervious feature extraction for optical character recognition
JP2528376B2 (en) * 1990-06-28 1996-08-28 大日本スクリーン製造株式会社 Image contour correction method
JP2982150B2 (en) * 1990-08-28 1999-11-22 キヤノン株式会社 Character pattern processing method and apparatus
US5345242A (en) * 1990-09-27 1994-09-06 Loral Aerospace Corp. Clutter rejection using connectivity
US5287293A (en) * 1990-12-31 1994-02-15 Industrial Technology Research Institute Method and apparatus for inspecting the contours of a gear
JPH04237383A (en) * 1991-01-22 1992-08-25 Matsushita Electric Ind Co Ltd Method for approximating to circular arc in two-dimensional image processing
US5134661A (en) * 1991-03-04 1992-07-28 Reinsch Roger A Method of capture and analysis of digitized image data
JP2639518B2 (en) * 1991-10-30 1997-08-13 大日本スクリーン製造株式会社 Image processing method
US5590220A (en) * 1992-08-12 1996-12-31 International Business Machines Corporation Bending point extraction method for optical character recognition system
US5339367A (en) * 1992-12-02 1994-08-16 National Research Council Of Canada Identifying curves within a scanned image
JP2919284B2 (en) * 1994-02-23 1999-07-12 松下電工株式会社 Object recognition method
US6178262B1 (en) * 1994-03-11 2001-01-23 Cognex Corporation Circle location
US6021222A (en) * 1994-08-16 2000-02-01 Ricoh Co., Ltd. System and method for the detection of a circle image for pattern recognition
US6084986A (en) * 1995-02-13 2000-07-04 Eastman Kodak Company System and method for finding the center of approximately circular patterns in images
JPH09138471A (en) * 1995-09-13 1997-05-27 Fuji Photo Film Co Ltd Specified shape area extracting method, specified area extracting method and copy condition deciding method
FR2743415B1 (en) * 1996-01-09 1998-02-13 Service Central Des Laboratoir PROJECTILE COMPARISON METHOD AND DEVICE
FR2743416B1 (en) * 1996-01-09 1998-02-13 Service Central Des Laboratoir METHOD FOR COMPARING PROJECTILE SOCKETS AND DEVICE
US6714679B1 (en) 1998-02-05 2004-03-30 Cognex Corporation Boundary analyzer
US6697535B1 (en) 1999-04-30 2004-02-24 Cognex Technology And Investment Corporation Method for refining a parameter of a contour in an image
US6901171B1 (en) 1999-04-30 2005-05-31 Cognex Technology And Investment Corporation Methods and apparatuses for refining groupings of edge points that represent a contour in an image
JP2001119610A (en) * 1999-08-10 2001-04-27 Alps Electric Co Ltd Contour detection circuit and image display device
US7474787B2 (en) * 1999-12-28 2009-01-06 Minolta Co., Ltd. Apparatus and method of detecting specified pattern
TWI254234B (en) * 2004-12-24 2006-05-01 Hon Hai Prec Ind Co Ltd System and method for auto-judging geometric shape trend of a set of dots on an image
DE102005023376A1 (en) * 2005-05-17 2006-11-23 Carl Zeiss Industrielle Messtechnik Gmbh Method and device for determining material boundaries of a test object
TWI320914B (en) * 2006-07-28 2010-02-21 Via Tech Inc Weight-adjusted apparatus and method thereof
US10896327B1 (en) * 2013-03-15 2021-01-19 Spatial Cam Llc Device with a camera for locating hidden object
DE102009048066A1 (en) 2009-10-01 2011-04-07 Conti Temic Microelectronic Gmbh Procedure for traffic sign recognition
JP5476938B2 (en) * 2009-11-16 2014-04-23 ウシオ電機株式会社 Alignment mark detection method
DE102010020330A1 (en) 2010-05-14 2011-11-17 Conti Temic Microelectronic Gmbh Method for detecting traffic signs
DE102011109387A1 (en) 2011-08-04 2013-02-07 Conti Temic Microelectronic Gmbh Method for detecting traffic signs
DE102013219909A1 (en) 2013-10-01 2015-04-02 Conti Temic Microelectronic Gmbh Method and device for detecting traffic signs
JP2016123407A (en) 2014-12-26 2016-07-11 富士通株式会社 Image processing apparatus, image processing method, and image processing program
US20170337689A1 (en) * 2016-05-20 2017-11-23 Yung-Hui Li Method for validating segmentation of objects with arbitrary shapes

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4017721A (en) * 1974-05-16 1977-04-12 The Bendix Corporation Method and apparatus for determining the position of a body
JPS5218136A (en) * 1975-08-01 1977-02-10 Hitachi Ltd Signal processing unit
US4115761A (en) * 1976-02-13 1978-09-19 Hitachi, Ltd. Method and device for recognizing a specific pattern
US4228432A (en) * 1979-08-28 1980-10-14 The United States Of America As Represented By The Secretary Of The Navy Raster scan generator for plan view display
JPS5926064B2 (en) * 1979-09-10 1984-06-23 工業技術院長 Feature extraction device for contour images

Also Published As

Publication number Publication date
DE3587220D1 (en) 1993-05-06
EP0149457A2 (en) 1985-07-24
US4644583A (en) 1987-02-17
EP0149457A3 (en) 1989-02-22
DE3587220T2 (en) 1993-07-08

Similar Documents

Publication Publication Date Title
EP0149457B1 (en) Method of identifying contour lines
Reid et al. A semi-automated methodology for discontinuity trace detection in digital images of rock mass exposures
US5081689A (en) Apparatus and method for extracting edges and lines
EP0128820B2 (en) Pattern matching method and apparatus
EP0124789A2 (en) Method of identifying objects
US5537490A (en) Line image processing method
US5923776A (en) Object extraction in images
CN111539927B (en) Detection method of automobile plastic assembly fastening buckle missing detection device
JP2000011089A (en) Binarizing method for optical character recognition system
US4876729A (en) Method of identifying objects
JP2000003436A (en) Device and method for recognizing isar picture
CN111325789B (en) Curvature discontinuous point detection method based on discrete direction change sequence
JPH02242382A (en) Defect checking method
JPH10208066A (en) Method for extracting edge line of check object and appearance check method using this method
JPH065545B2 (en) Figure recognition device
JPH0217832B2 (en)
JP3651037B2 (en) Line segment detection method and apparatus
JP2738252B2 (en) Edge detection device of strip material by image
JP2959017B2 (en) Circular image discrimination method
EP0297627B1 (en) Method of identifying objects
JPH065544B2 (en) Figure recognition device
JPH08304302A (en) Method for detecting surface flaws of object to be inspected
JP3087788B2 (en) Component position detection method and device
JPH0797410B2 (en) Image processing method
JPH05113315A (en) Detecting method for center position of circular image data

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Designated state(s): DE FR GB SE

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): DE FR GB SE

17P Request for examination filed

Effective date: 19890403

17Q First examination report despatched

Effective date: 19910516

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB SE

REF Corresponds to:

Ref document number: 3587220

Country of ref document: DE

Date of ref document: 19930506

ET Fr: translation filed
PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 19940110

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 19940111

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 19940117

Year of fee payment: 10

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 19941229

Year of fee payment: 11

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Effective date: 19950105

EAL Se: european patent in force in sweden

Ref document number: 85100073.7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Effective date: 19950929

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Effective date: 19951003

EUG Se: european patent has lapsed

Ref document number: 85100073.7

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Effective date: 19960104

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 19960104