GB2247312A - Surface Inspection

Surface Inspection

Info

Publication number
GB2247312A
GB2247312A
Authority
GB
United Kingdom
Prior art keywords
neural network
data
different
sample
photosensitive detector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9015576A
Other versions
GB9015576D0 (en)
GB2247312B (en)
Inventor
Brian John Griffiths
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Brunel University
Original Assignee
Brunel University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Brunel University filed Critical Brunel University
Priority to GB9015576A priority Critical patent/GB2247312B/en
Publication of GB9015576D0 publication Critical patent/GB9015576D0/en
Publication of GB2247312A publication Critical patent/GB2247312A/en
Application granted granted Critical
Publication of GB2247312B publication Critical patent/GB2247312B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/30Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G01B11/303Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces using photoelectric detection means

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

A surface inspection system using pattern recognition includes a light source (1) arranged to illuminate a sample and a photosensitive detector (4) arranged to detect light scattered from the sample. The pattern recognition system is arranged to analyse output of the photosensitive detector (4) and includes a front end which is arranged to select data including an intensity contour of the scattered light by feature extraction. A neural network is arranged to analyse the selected data and to discriminate data characteristic of different surface finishes. In a preferred example, the neural network is a single layer neural network. The front end selects data corresponding to different contours at a plurality of different grey-levels and the neural network discriminates on the basis of a three-dimensional data space having first and second dimensions corresponding to the spatial dimensions of the photosensitive detector and a third dimension corresponding to the different grey levels.

Description

PATTERN RECOGNITION

The present invention relates to a pattern recognition technique and in particular to a technique suitable for use in surface finish inspection.
It is common practice after, for example, the machining of a metal component to inspect the surface finish to ensure that it meets required standards. This may involve manual intervention, taking a product off the production line for inspection, but increasingly it is desirable to incorporate such surface inspection as part of an automated computer-controlled manufacturing process.
The most widely used techniques for measuring surface finish are based on stylus profilometry.
Profilometry allows measurement of surface finish with high accuracy and repeatability and accordingly the agreed standards for surface finish are based on such techniques. However, profilometry suffers a number of problems. Contact between the stylus and the surface being measured can damage soft materials and the response of the profilometer is limited by the mechanical filtering caused by the stylus radius. Stylus methods are relatively slow and not readily incorporated in automated control systems.
An alternative technique for surface inspection is based on the measurement of light scattered from the surface. Although this avoids some of the disadvantages discussed above, such techniques have not found widespread acceptance. This is due in part to the inability of light inspection techniques to produce a coherent output parameter which correlates with the standard measurements of surface roughness based on stylus profilometry.
A paper by R. Brodman published at pp 403-405 of Annals of the CIRP vol. 23/1/1984 describes a technique which is at least partially successful in overcoming this problem. Coherent light is directed at the surface to be measured and the light scattered from the surface is detected by a photodiode array. A parameter dependent on the scatter of the light distribution is calculated and is found under some conditions to correlate well with conventional measurements of surface roughness.
Nonetheless it is still found that with certain types of finish the system is unable to discriminate different degrees of roughness with sufficient accuracy.
According to a first aspect of the present invention, a surface inspection system comprises a light source arranged to illuminate a sample, a photosensitive detector arranged to detect light scattered from the sample, and a pattern recognition system arranged to analyse the output of the photosensitive detector, the pattern recognition system comprising a front end arranged to select data including an intensity contour of the scattered light by feature extraction, and a neural network arranged to analyse the selected data and to discriminate data characteristic of different surface finishes.
The present inventor has found that a significant increase in the accuracy of discrimination can be realised using pattern recognition techniques to analyse the shape of light scattered from the sample under inspection. In particular it is found that using a hybrid system with a feature-extraction front end which may, for example, trace the outline of a given intensity contour, followed by a single layer neural network (SLN) provides effective real-time discrimination with a high degree of correlation to conventional roughness measures.
One example of a hybrid recognition system is described in detail in the present applicant's earlier unpublished British application number 8925722.4 entitled "Pattern Recognition".

Preferably the front end of the pattern recognition system is arranged to select data lying within a window surrounding an intensity contour.
Preferably the front end is arranged to select data corresponding to different contours at a plurality of different grey-levels and the neural network is arranged to discriminate on the basis of a three-dimensional data space having first and second dimensions corresponding to the spatial dimensions of the photosensitive detector and a third dimension corresponding to the different grey levels.
Preferably the photosensitive detector comprises a video camera, the output from the video camera being digitised and a video frame store storing the digitised output as a pixel array.
Although the use of a hybrid pattern recognition system to analyse a three-dimensional sampling space has been found to be particularly valuable in connection with surface inspection, its use is not limited to this particular field.
According to a further aspect of the present invention a pattern recognition system comprises a front end arranged to receive a two-dimensional input data field and to recognise and select a plurality of different sub-fields corresponding to different respective values of a parameter characteristic of the data, and a neural network arranged to analyse the selected sub-fields and to discriminate data in a three-dimensional sampling space having two dimensions corresponding to the input data field and a third dimension corresponding to the characteristic parameter.
According to another aspect of the present invention in a pattern recognition system including a neural network arranged to analyse an input data field, the outputs of individual tuples are stored bit-by-bit in a memory map and second-order discrimination carried out on the resulting map. Preferably the input data field is divided into different regions and the tuples selected so that corresponding bit positions within each tuple come from corresponding respective regions of the data field.
A system in accordance with the present invention will now be described in detail with reference to the figures of the accompanying drawings in which: Figure 1 is a perspective view of the system; Figures 2a-2h are discriminator profiles for ten different finishes; Figure 3 is a graph of discriminator profiles at different intensity thresholds; Figure 4 is a block diagram showing an alternative arrangement for the system; and Figure 5 shows a memory map.
A surface inspection system includes a laser 1 which illuminates a region of a sample 2. A screen 3 is placed in the path of light reflected from the sample 2 and the light distribution on the screen 3 is viewed by a vidicon camera 4. The output of the camera 4 is held in a frame store 5 arranged to output data to a computer 6. The frame store may also output video data directly to a monitor 7. As described in further detail below, the computer 6 analyses the data to discriminate between patterns characteristic of different surface finishes.
In an alternative and preferred configuration, the laser 1' is coincident with the camera 4' and uses a common optical system 8.
In the present example the light source is a laser diode operating at a wavelength of substantially 623 nm.
A suitable optical system is described in the above cited paper by R. Brodman.
Different surface finishes on the sample are found to produce different characteristic intensity distributions on the screen. These different distributions are recorded using the vidicon camera 4 and frame store 5 and analysed in the computer 6 using a hybrid pattern recognition system generally similar to that described in the applicant's earlier British patent application, cited above. The data may be treated initially to remove unwanted features from the visual field. In particular, a Hough transform may be used to eliminate the first-order reflection from the sample.
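By way of illustration only, a minimal sketch of such a pre-treatment step is given below. The patent does not specify which Hough variant is used; here the first-order reflection is assumed to appear as a roughly circular bright spot which is located with OpenCV's circle Hough transform and blanked out before feature extraction. All parameter values are assumptions.
```python
import cv2
import numpy as np

def remove_first_order_reflection(frame: np.ndarray) -> np.ndarray:
    """frame: 8-bit grey-level image from the frame store."""
    blurred = cv2.medianBlur(frame, 5)                      # suppress speckle noise
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT,
                               dp=1, minDist=frame.shape[0],
                               param1=120, param2=30,       # assumed edge/accumulator thresholds
                               minRadius=2, maxRadius=40)   # assumed size of the specular spot
    cleaned = frame.copy()
    if circles is not None:
        x, y, r = circles[0, 0]                             # strongest circle only
        cv2.circle(cleaned, (int(round(x)), int(round(y))),
                   int(round(r)) + 2, color=0, thickness=-1)  # blank out the reflection
    return cleaned
```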
The front end of the pattern recognition system uses feature extraction techniques to locate a target object, which, in the present example, is an intensity contour defined by the points of transition in the data with respect to a predetermined intensity threshold. The input data is coded as "BLACK" or "WHITE" by comparison with the chosen intensity threshold. The contour is then located using one of a number of well-known edge-tracing techniques such as chain coding. In this technique a black pixel is found and stored. The system then scans in a clockwise direction in accordance with the chain code operator, until it finds the neighbouring black pixel. The chain code operator is then centred on this new pixel, its position stored, and the process repeated until the initial position is again reached. The entire image is then stored. Alternative edge tracing techniques may be used, such as polar vector edge description.
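A minimal sketch of the thresholding and chain-code tracing described above follows. Only the general procedure comes from the text; the direction numbering, the convention that pixels at or above the threshold are coded BLACK, and the function names are assumptions.
```python
import numpy as np

# 8-connected neighbour offsets as (row, col), ordered clockwise starting from "east"
DIRECTIONS = [(0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1)]

def trace_contour(grey: np.ndarray, threshold: int) -> list[tuple[int, int]]:
    """Return the chain of (row, col) positions along the contour at `threshold`."""
    black = grey >= threshold                      # code the image BLACK/WHITE
    rows, cols = np.nonzero(black)
    if rows.size == 0:
        return []
    start = (int(rows[0]), int(cols[0]))           # first BLACK pixel in raster order
    contour = [start]
    current, heading = start, 0
    while True:
        for step in range(8):                      # scan the neighbours clockwise
            d = (heading + step) % 8
            r = current[0] + DIRECTIONS[d][0]
            c = current[1] + DIRECTIONS[d][1]
            if 0 <= r < black.shape[0] and 0 <= c < black.shape[1] and black[r, c]:
                current, heading = (r, c), (d + 6) % 8   # back up two directions for the next scan
                break
        else:
            return contour                         # isolated pixel: nothing to trace
        if current == start:                       # initial position reached again
            return contour
        contour.append(current)
```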
Once the edge of the intensity contour has been traced the front end of the recognition system positions a window so that it surrounds the contour. The window may be a simple rectangular window as described in the above cited application, or alternatively may take a different shape appropriate to the patterns under analysis - the window may, for example, be annular. The data within the window is then processed using a single layer neural network. Such a network may be implemented in dedicated hardware, but in the present example is implemented in software using a discriminator having 1024 pixels. As described in our earlier application, and in GB-A-2112194 relating to a hardware implemented network, successive samples from the window defined by the front end are applied to the discriminator. In the present example the window is rectangular. For each successive sample, pixels which were applied to the discriminator in a previous sample are rejected. The windows used in the present example comprise a total of 11,250 pixels and so in practice more than ten repetitions are needed to apply all the data to the discriminator.
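One possible software form of such a single layer network is sketched below. The grouping of the 1024 discriminator pixels into 128 tuples of 8 bits is an inference from the score range of 0 to 128 quoted in the next paragraph, the random but fixed pixel-to-tuple mapping is an assumption, and the repeated sampling needed to cover the full 11,250-pixel window is omitted for brevity.
```python
import numpy as np

N_TUPLES, TUPLE_BITS = 128, 8            # 128 x 8 = 1024 discriminator pixels (assumed split)

class TupleDiscriminator:
    """Software single layer network: one 256-entry RAM per 8-bit tuple."""

    def __init__(self, window_size: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # fixed, pseudo-random choice of which window pixels feed each tuple bit
        self.mapping = rng.choice(window_size, size=(N_TUPLES, TUPLE_BITS), replace=False)
        self.rams = np.zeros((N_TUPLES, 2 ** TUPLE_BITS), dtype=np.uint8)

    def _addresses(self, window_bits: np.ndarray) -> np.ndarray:
        """Form an 8-bit RAM address per tuple from the binary window data."""
        bits = window_bits[self.mapping].astype(np.int64)    # shape (128, 8)
        return bits @ (1 << np.arange(TUPLE_BITS))           # shape (128,)

    def train(self, window_bits: np.ndarray) -> None:
        """Mark the RAM location addressed by each tuple as seen."""
        self.rams[np.arange(N_TUPLES), self._addresses(window_bits)] = 1

    def score(self, window_bits: np.ndarray) -> int:
        """Comparison score in the range 0 to 128: one point per matching tuple."""
        return int(self.rams[np.arange(N_TUPLES), self._addresses(window_bits)].sum())
```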
In an initial training phase, ten ground workpieces having different finishes corresponding to a range of grinding wheel life were inspected using the system (workpiece 1 being from a freshly dressed wheel and workpiece 10 being machined with a wheel badly needing redressing). The different resulting patterns on the screen were viewed and analysed using the pattern recognition system, thereby producing ten corresponding discriminators in the neural network. Then, when subsequent samples are viewed, the resulting bit patterns are compared with the discriminators to produce a comparison score on a range from 0 to 128, 128 corresponding to a perfect match with the discriminator.
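Continuing the sketch above, the training and recognition phases might look as follows; `training_windows` and `unseen_window` stand for binary window data produced by the front end and are purely illustrative names.
```python
# One discriminator per training workpiece; all share the same pixel-to-tuple mapping.
discriminators = [TupleDiscriminator(window_size=11250) for _ in range(10)]

for disc, training_window in zip(discriminators, training_windows):   # one window per workpiece
    disc.train(training_window)

scores = [disc.score(unseen_window) for disc in discriminators]       # each in 0..128
best_match = scores.index(max(scores)) + 1                            # workpiece number 1..10
```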
Experimental results obtained with the system described above are shown in Figure 2. Figure 2a shows the comparison scores for samples of different degrees of wear when compared with discriminator 1 and, as expected, shows a clear peak for the sample of roughness 1.
Similarly Figure 2c shows the results of comparison with the discriminator trained on a sample of roughness 4 and gives a maximum score to a subsequently viewed sample of roughness 4. Similarly each of the other discriminators gives a clearly defined maximum at the appropriate roughness value and so the set of discriminators as a whole allow effective recognition of different degrees of wear.
Although discrimination may be carried out simply on the basis of intensity contours at a single predetermined threshold, it is found that the shape of the contour varies at different intensity levels. Accordingly for a given degree of roughness, different discriminators are obtained according to the intensity threshold selected.
Figure 3, for example, shows the outputs from a discriminator for surface 1 at four different threshold levels. Each individual threshold level constitutes a slice through a three dimensional distribution formed from two spatial dimensions and a third dimension corresponding to intensity. A significant further increase in accuracy of discrimination is obtained by training the discriminators not on two dimensional data at a single threshold level but over a three dimensional sampling space including data at different intensity thresholds.
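A brief sketch of how such a three-dimensional sampling space might be assembled is given below; the four threshold values are illustrative and not taken from the patent.
```python
import numpy as np

def three_d_window(grey_window: np.ndarray, thresholds=(64, 96, 128, 160)) -> np.ndarray:
    """grey_window: 2-D grey-level data inside the window located by the front end."""
    slices = [(grey_window >= t).astype(np.uint8) for t in thresholds]  # one contour slice per level
    volume = np.stack(slices, axis=-1)        # (rows, cols, n_thresholds) sampling space
    return volume.ravel()                     # flat binary input for the single layer network
```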
As a further refinement to the discrimination process, rather than identifying the roughness of a surface simply by locating the maximum of a discriminator response, the different discriminator response curves may themselves be fed back to the input to the network and analysed by discriminators trained on the discriminator curves of the different surfaces. This second-order neural network discrimination technique is not limited in applicability to surface inspection and may be used to enhance hybrid recognition systems in other fields of use.
Other forms of second-order discrimination techniques may also be used. As clearly seen in Figure 2, a given sample in general produces a different score from each of the range of discriminators. The score for each discriminator may be plotted as a histogram and the second-order discrimination carried out using networks trained on such histograms. In another alternative approach to second-order discrimination, the system is modified as described below with reference to Figure 5.
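As an illustration of the histogram-based alternative, the vector of first-order scores could be recoded into a binary pattern and applied to a further discriminator of the kind sketched earlier. The thermometer coding used here is an assumption; the patent says only that networks are trained on the score histograms.
```python
import numpy as np

def scores_to_pattern(scores, levels: int = 16) -> np.ndarray:
    """Thermometer-code each comparison score (0..128) into `levels` bits and concatenate."""
    edges = np.linspace(0, 128, levels, endpoint=False)       # 0, 8, 16, ..., 120
    return np.concatenate([(s > edges).astype(np.uint8) for s in scores])

# The concatenated pattern (e.g. 10 surfaces x 16 bits) can itself be applied to a
# second-order discriminator trained on the score profiles of known surfaces.
```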
As described above, in a conventional single layer network, in the discrimination phase, tuples of data are selected from the input image and applied to the discriminator, producing a score of "1" for a perfect match to the discriminator and a score of 0 otherwise.
The outputs of the different tuples are then summed to give a final score R = Σᵢ Tᵢ, where Tᵢ is the output (1 or 0) from the ith tuple.
In the presently described modification, instead of simply recording 1 or 0 indicating a match or mismatch, the result of applying each tuple to the discriminator is recorded as a binary word written into one row of a memory map. If, for example, the first and last bits only of a first tuple T1 match the corresponding elements of the discriminator then the binary word 10000001 is written into the first row of the memory map. Similarly the result for each of the other tuples is recorded in the map, thereby producing a two-dimensional distribution which can in turn be used for second-order discrimination.
This modified technique produces an output which is sensitive to the position within the tuple of any mis-match. Accordingly the input field is divided geometrically into, e.g., eight regions and the tuples are generated in such a fashion that the first bit always comes from a first one of the eight regions, the second bit from a second one of the eight regions and so on.
The response of the discriminator can be used to focus the attention of the system on a particular region of the input field where a mis-match is occurring. When such a region has been identified the system may be arranged to analyse that region in further detail. Alternatively the system might simply flag the region in question to bring it to the attention of an operator using the system.
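The memory-map modification and the region-aligned tuple selection might be sketched as follows. A single stored reference pattern per tuple is assumed, which is a simplification of the RAM form shown earlier; the eight-way geometric split of the input field is taken from the text, while the array shapes and names follow the earlier sketches.
```python
import numpy as np

N_TUPLES, TUPLE_BITS = 128, 8          # as in the earlier sketch
N_REGIONS = TUPLE_BITS                 # bit k of every tuple is drawn from region k

def region_aligned_mapping(regions, n_tuples: int = N_TUPLES, seed: int = 0) -> np.ndarray:
    """regions: eight arrays of pixel indices, one per geometric region of the input field."""
    rng = np.random.default_rng(seed)
    columns = [rng.choice(region, size=n_tuples, replace=False) for region in regions]
    return np.stack(columns, axis=1)                          # (128, 8); column k from region k

def match_map(stored_patterns: np.ndarray, window_bits: np.ndarray,
              mapping: np.ndarray) -> np.ndarray:
    """Write the per-bit match result of every tuple into one row of a (128, 8) memory map."""
    sample_bits = window_bits[mapping]                        # (128, 8)
    return (sample_bits == stored_patterns).astype(np.uint8)  # 1 = match, 0 = mis-match

def mismatching_region(memory_map: np.ndarray) -> int:
    """The column with the most zeros identifies the input-field region to examine further."""
    mismatches_per_column = (1 - memory_map).sum(axis=0)
    return int(np.argmax(mismatches_per_column))
```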

Claims (10)

1. A surface inspection system comprising a light source arranged to illuminate a sample, a photosensitive detector arranged to detect light scattered from the sample, and a pattern recognition system arranged to analyse the output of the photosensitive detector, the pattern recognition system comprising a front end arranged to select data including an intensity contour of the scattered light by feature extraction, and a neural network arranged to analyse the selected data and to discriminate data characteristic of different surface finishes.
2. A system according to claim 1, in which the neural network comprises a single layer neural network (SLN).
3. A system according to claim 1 or 2, in which the front end of the pattern recognition system is arranged to select data lying within a window surrounding an intensity contour.
4. A system according to any one of the preceding claims, in which the front end is arranged to select data corresponding to different contours at a plurality of different grey-levels and the neural network is arranged to discriminate on the basis of a three-dimensional data space having first and second dimensions corresponding to the spatial dimensions of the photosensitive detector and a third dimension corresponding to the different grey-levels.
5. A system according to any one of the preceding claims, in which the photosensitive detector comprises a video camera, the output from the video camera being digitised and a video frame store storing the digitised output as a pixel array.
6. A system according to any one of the preceding claims, in which the neural network is arranged to store the outputs of individual tuples bit-by-bit in a memory map and to carry out second-order discrimination on the memory map.
7. A system according to claim 6, in which the input data field for the neural network is divided into different regions and the tuples selected so that corresponding bit positions within each tuple come from corresponding respective regions of the data field.
8. A system according to any one of the preceding claims, in which the light source is a laser.
9. A system according to any one of the preceding claims, further comprising means to transform data to eliminate the first-order reflection from the sample.
10. A system substantially as described with respect to the accompanying drawings.
GB9015576A 1990-07-16 1990-07-16 Surface inspection Expired - Fee Related GB2247312B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9015576A GB2247312B (en) 1990-07-16 1990-07-16 Surface inspection

Publications (3)

Publication Number Publication Date
GB9015576D0 GB9015576D0 (en) 1990-09-05
GB2247312A true GB2247312A (en) 1992-02-26
GB2247312B GB2247312B (en) 1994-01-26

Family

ID=10679151

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9015576A Expired - Fee Related GB2247312B (en) 1990-07-16 1990-07-16 Surface inspection

Country Status (1)

Country Link
GB (1) GB2247312B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8925722D0 (en) * 1989-11-14 1990-01-04 Univ Brunel Pattern recognition

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4176376A (en) * 1975-11-10 1979-11-27 Olympus Optical Company Limited Image processing system
GB1579290A (en) * 1976-06-30 1980-11-19 Ibm Defect inspection of objects
GB2064762A (en) * 1979-11-30 1981-06-17 Nippon Nuclear Fuels Surface defect inspection system
US4460969A (en) * 1980-12-04 1984-07-17 The United States Of America As Represented By The Secretary Of The Army Image spectrum analyzer for cartographic feature extraction
EP0066321A2 (en) * 1981-05-18 1982-12-08 KULICKE and SOFFA INDUSTRIES INC. Pattern recognition system
GB2112194A (en) * 1981-11-27 1983-07-13 Nat Res Dev Recognition apparatus
EP0124789A2 (en) * 1983-04-11 1984-11-14 Kabushiki Kaisha Komatsu Seisakusho Method of identifying objects
EP0159879A2 (en) * 1984-04-13 1985-10-30 Fujitsu Limited Image processing system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2256708A (en) * 1991-06-11 1992-12-16 Sumitomo Heavy Industries Object sorter using neural network
WO1993003348A1 (en) * 1991-08-01 1993-02-18 British Textile Technology Group Sample evaluation
EP0562506A2 (en) * 1992-03-27 1993-09-29 BODENSEEWERK GERÄTETECHNIK GmbH Method and device for sorting bulk material
EP0562506A3 (en) * 1992-03-27 1995-01-25 Bodenseewerk Geraetetech
FR2691246A1 (en) * 1992-05-13 1993-11-19 Lorraine Laminage Metallic surface finish classification process e.g. for vehicle bodywork - digitising grey-scale matrix image and extracting roughness descriptors relating to average size, size variation and density of valleys for classification.
EP0602464A2 (en) * 1992-12-12 1994-06-22 RWE Entsorgung Aktiengesellschaft Method and apparatus for recognizing objects
EP0602464A3 (en) * 1992-12-12 1994-12-21 Rwe Entsorgung Ag Method and apparatus for recognizing objects.
GB2284293A (en) * 1993-11-30 1995-05-31 Mars Inc Currency validator
GB2284293B (en) * 1993-11-30 1998-06-03 Mars Inc Article classifying method and apparatus
US5992600A (en) * 1993-11-30 1999-11-30 Mars, Incorporated Money validator
DE19827183A1 (en) * 1998-06-18 1999-12-23 Univ Ilmenau Tech Method of optically preprocessing scattered light data

Also Published As

Publication number Publication date
GB9015576D0 (en) 1990-09-05
GB2247312B (en) 1994-01-26

Similar Documents

Publication Publication Date Title
CN109490316B (en) Surface defect detection algorithm based on machine vision
CN108332689B (en) Optical measurement system and method for detecting surface roughness and surface damage
CA2146911C (en) Lumber defect scanning including multi-dimensional pattern recognition
US7952085B2 (en) Surface inspection apparatus and method thereof
US5179419A (en) Methods of detecting, classifying and quantifying defects in optical fiber end faces
US7505619B2 (en) System and method for conducting adaptive fourier filtering to detect defects in dense logic areas of an inspection surface
US4339745A (en) Optical character recognition
CN115330783A (en) Steel wire rope defect detection method
Ali et al. Surface roughness evaluation of electrical discharge machined surfaces using wavelet transform of speckle line images
Jeon et al. Optical flank wear monitoring of cutting tools by image processing
Shahabi et al. In-cycle monitoring of tool nose wear and surface roughness of turned parts using machine vision
You et al. Machine vision based adaptive online condition monitoring for milling cutter under spindle rotation
EP0183565A1 (en) Method and apparatus for checking articles against a standard
CN115311629B (en) Abnormal bending precision monitoring system of bending machine
CN115526889B (en) Nondestructive testing method of boiler pressure pipeline based on image processing
GB2247312A (en) Surface Inspection
EP0563897A1 (en) Defect inspection system
US4481534A (en) Configuration detecting device
Reid et al. Computer vision sensing of stress cracks in corn kernels
Batchelor et al. Defect detection on the internal surfaces of hydraulics cylinders for motor vehicles
CN114383522A (en) Method for measuring surface gap and surface difference of reflective difference workpiece
Brambilla et al. Automated Vision Inspection of Critical Steel Components based on Signal Analysis Extracted form Images
CN112414973A (en) Speckle fingerprint-based target material identification method and system
JP3523945B2 (en) Surface inspection method
CN114972227B (en) Grinding wheel porosity identification method

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 19940716