GB2248931A - High resolution parts handling system - Google Patents


Info

Publication number
GB2248931A
Authority
GB
United Kingdom
Prior art keywords
article
articles
inspection
scan
geometric inspection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9119774A
Other versions
GB9119774D0 (en)
GB2248931B (en)
Inventor
Arthur L Dean
Randy K Baird
Stanley P Turcheck
James P Martin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FMC Corp
Original Assignee
FMC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US07/583,117 (US5233328A)
Priority claimed from US07/583,256 (US5103304A)
Priority claimed from US07/586,167 (US5157486A)
Priority claimed from US07/586,189 (US5142591A)
Application filed by FMC Corp filed Critical FMC Corp
Publication of GB9119774D0
Publication of GB2248931A
Application granted
Publication of GB2248931B
Anticipated expiration
Expired - Fee Related

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/024Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of diode-array scanning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/028Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring lateral position of a boundary of the object
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41875Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by quality surveillance of production
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F7/00Methods or arrangements for processing data by operating upon the order or content of the data handled
    • G06F7/02Comparing digital values
    • G06F7/026Magnitude comparison, i.e. determining the relative order of operands based on their numerical value, e.g. window comparator
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding
    • G06T9/20Contour coding, e.g. using detection of edges
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/50Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V10/507Summing image-intensity values; Histogram projection analysis
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Quality & Reliability (AREA)
  • Manufacturing & Machinery (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A high resolution article handling system serves as an article discriminator or identifier by creating an object silhouette. The objects are singulated on a conveyor and scanned by a linear array of CCD units 34 (2048 pixels per inch) at a scan rate of 10 MHz. Pixel transitions corresponding to object edge points are converted to a single count value from a counter 35 which is synchronized with the scanner. A microprocessor 54 with a first in, first out buffer memory 52 needs only a capacity to handle the count values rather than all data from the pixels. Article orientation is corrected in response to a signal generated by determining the count value difference between a reference value and a work article value at only the scan slice windows at positions along the article length where differences have been predetermined to be a maximum for different article orientations.

Description

HIGH RESOLUTION PARTS HANDLING SYSTEM

This invention relates to the determination of article geometry in an article handling system, and more particularly to a high resolution article handling system where article movement along a conveyor may be several inches per second.
BACKGROUND
In Turcheck et al U.S. Patent 4,784,493 an apparatus and method are disclosed for recognition of an article and its orientation on a conveyor. To determine orientation of a work article, a number of possible orientations are recorded in a memory. The data stored in the memory for each orientation is compared with scanned data from a work article. Orientation of the work article is determined by matching of the compared data.
To enhance resolution, more data points are required, which has traditionally meant more expensive processing in terms of both memory size and processing time. The time required for making such an article orientation determination restricts the number of articles that can be processed in a unit of time.
The present invention provides a system for geometric inspection of articles as defined in the Claims. Further features of the present invention are set out in the following description, made by way of example and with reference to the drawings, in which:
Fig. 1 is a diagrammatic view of a first conveyor system for separating and orienting parts, together with a novel inspection camera and information processor;
Fig. 2 is a block diagram of a camera sensor and related functional circuitry for acquiring and storing object silhouette information;
Fig. 3 is an elevation of a conveyor moving surface that is supporting a round of ammunition;
Fig. 4 is a group of waveforms taken at scan position 120 as depicted by line 4-4 of Fig. 3;
Fig. 5 is a group of waveforms taken at scan position 800 as depicted by line 5-5 of Fig. 3;
Fig. 6 is a diagram of a suitable circuit arrangement for hardware that can compact the object image intelligence data;
Fig. 7 is a pictorial view of a second conveyor system having an article diverter, together with an inspection camera, an information processor and a system for determining scan slice windows;
Fig. 8 is a flow diagram of a procedure for automatically generating scan slice windows at only a few locations along the object or article length that are sufficient to enable identification of orientation;
Fig. 9 shows elevation views of four possible orientations of the object whose orientation is to be identified;
Fig. 10 is a pictorial view of a vibrating bowl conveyor system adapted for use with the high resolution article handling system; and
Fig. 11 is a pictorial view of a gravity chute conveyor article handling system.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
The present invention is adapted for use with conveyors that move a series of like objects on a repetitive basis for automated inspection or assembly. The invention serves as a substitute for human inspection of the object orientation on the conveyor surface and is adapted to provide data representing part size with a resolution as fine as 0.0005 inch (0.013 mm).
In the illustrated conveyor 10 of Fig. 1, objects 12, 14, 16 rest on a surface 18 that moves in a counter-clockwise direction while a tilted central disk rotates at a slower speed to load objects in spaced positions along conveyor surface 18 in a known manner. The objects 12, 14, 16 that have been singulated pass between a camera sensor 22 and a light source 24, after which they move downstream to a conventional detector 26 and diverter 28 which enables reorientation and/or rejection of improperly oriented or sized articles. The diverter may be of the general type disclosed in Dean et al U.S. Patent No. 4,619,356.
As illustrated, a camera sensor 22 is not a raster scan type, but instead consists of a linear array of charge coupled device (CCD) units. The CCD units are aligned to be transverse to the direction of object movement. The linear array of CCD units thus may be essentially vertical in the case of a horizontal conveyor. The CCD units are aligned in a single column to provide a field of view that is one pixel wide and at least about 1000 pixels high. The height of the CCD unit column must be sufficient to span the feature of interest of the object 12, 14, 16 on the conveyor 18. For many small objects such as bolts, screwdriver handles, small caliber ammunition and the like, a maximum variation of the feature of interest may be within a one-inch (2.54 cm) span.
Silhouette image data obtained for certain applications must have a 0.0025 inch (0.064 mm) resolution. The number of CCD units in the one-inch column may conveniently be about 2000 and advantageously may be 2048. An even finer resolution of about 0.0005 inch may be obtained with the use of about 3000 or 4000 pixels in a one-inch column. The linear array of CCD units may be obtained commercially from Texas Instruments as TC103-1. The drive circuitry necessary for proper CCD operation and timing diagrams to provide a sequential scan of the analog voltage signal are commercially available. The scan rate must provide sufficient time to transfer each pixel charge fully and not allow any charge to accumulate in a pixel between reset and the next scan, at which time a momentary voltage is applied to each of the CCD sensing units. For a 2048 pixel CCD unit array, we have found that a scan can be effected in about 330 microseconds and that time periods between successive scans can be variable, with object speeds of up to about 7 inches per second (0.18 m/s) possible while maintaining a resolution of 0.0025 inches.
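These figures are mutually consistent; the following back-of-the-envelope check (an illustration of ours, not part of the patent) shows how the 330 microsecond scan time bounds the conveyor speed at the stated resolution:

```python
# Sanity check: the object may travel at most one resolution step per scan,
# so scan time bounds conveyor speed. Input values are from the text above;
# the calculation itself is our illustration.

SCAN_TIME_S = 330e-6      # one full scan of the 2048-pixel array
RESOLUTION_IN = 0.0025    # required resolution along the direction of travel

max_speed = RESOLUTION_IN / SCAN_TIME_S
print(f"max conveyor speed ~ {max_speed:.2f} in/s")  # ~ 7.58 in/s
```

The result, about 7.6 inches per second, agrees with the "up to about 7 inches per second" quoted above.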
In the system of the present invention, the light source 24 is located across the conveyor surface 18 to face the CCD units. As an object 12, 14, 16 passes between the light source 24 and the camera sensor 22, a shadow is formed on certain of the pixel areas whereas unblocked pixels are fully illuminated by the light. By use of a collimated light source which operates through a lens having a shape and size corresponding to that of the linear array of CCD units forming a camera sensor, a precise point on the upper edge surface of the object can be optically determined with great accuracy. Variations in ambient light conditions are less likely to interfere with operation of the camera sensor when a collimated light source is used.
If the object has a point on the lower edge surface that is positioned above the conveyor surface, a light beam will be detected at appropriately positioned pixels in the same linear array at a point on the lower surface which is opposite the detected point on the upper object surface. Similarly, an aperture in the object which is aligned between the collimated light source 24 and the camera sensor 22 will produce transitions in the adjacent pixels to provide a manifestation of the marginal edge points of the aperture at successive positions as the object advances past the camera sensor.
Successive exposures of the camera sensor 22 to each object 12, 14 or 16 as it moves along the conveyor path 18 give successive data inputs which may be sequentially processed and collectively used to provide, as a display, a silhouette of the object before the object reaches the diverter station 28. Object speed on the conveyor may be monitored and signals generated corresponding to object or article displacement along the conveyor path to allow article shape "learning" procedures to be carried out at a speed that is different from the operating speed or to provide compensation for speed variations generally. Successive scans may be provided at intervals as short as 300 microseconds with a 2048 pixel linear array driven by a 10 MHz clock. Conveyor speeds up to seven inches per second may be acceptable without exceeding the resolution accuracy specified and may be monitored by a shaft position resolver which produces a digital input signal that can be used by the microprocessor as an object displacement detector 29 as shown in Fig. 2.
The installation as illustrated in Fig. 1 may also include a system control 30 and control box 32 which are usually physically located near the conveyor and may be in a single housing.
With reference to Fig. 2, a functional block diagram of the camera sensor 22 is illustrated. The vertical column of CCD units 34, consisting of a 2048 pixel linear array in the illustrated embodiment, is connected to receive clocking or timing signals from the clock and sync circuit 35. Clock circuit 35 includes an oscillator running at a frequency of at least about one MHz, and 10 MHz in the illustrated example, in order to provide pixel scanning in about 200 microseconds and 7 microseconds or more for reset operation. The CCD units that are commercially available are capable of running at clock frequencies as high as 40 MHz. Thus, pixel scan during about a 300 microsecond sampling scan after conditioning is used to produce an analog information signal which contains a transition relating to the precise position of an edge point on an object or part which is being conveyed. To allow for variations in conveyor speeds, the actual start of each vertical slice scan follows receipt of a master reset pulse (Figs. 4 and 5) from microprocessor unit 54 on lead 55 shown in Fig. 2.
From the column of CCD units 34, each of which functions as a pixel sensor, an output signal on lead 36 is in the form of an analog signal voltage (see Figs. 4 and 5) containing sequentially obtained voltages of a first amplitude for shadowed pixels and a second low amplitude for those pixels receiving light from light source 24. The analog information is a serial bit stream of uniform length and is transferred serially at the clock rate to a voltage follower that serves as an isolation circuit 38 and to a black sample and hold circuit 40 which produces a voltage level reference signal from pixels that are blocked from receiving light. This provides a reference signal which holds the analog signal at a controlled DC level and may be used as one input to circuitry associated with an analog to digital conversion circuit 42.
The output signal on lead 44 is applied to terminal 80 of the transition detector and data compaction circuitry 48, which will be described in connection with Fig. 6. On lead 46, a clock signal from the clocking and sync circuit 35 is applied to maintain synchronization between the data compaction unit 48 and the scanning means that is part of the charge coupled device array 34.
The output signal from the data compaction device 48 on leads 50 is in the form of a single binary number for each transition from the analog to digital conversion circuit and is applied to the memory 52 which serves as a buffer to collect all of the data for a particular object 12, 14 or 16 on the conveyor surface on a first in, first out basis. The microprocessor unit 54, which may be any suitable type that is commercially available, may start to process the output signals as soon as the memory 52 begins to receive valid object data.
The camera sensor 22 is thus synchronized with a counter in the data compactor 48 by means of the clocking and sync circuit 35 to provide scan slice information. The memory 52 for data buffering may have a 64K or even smaller capacity for objects of the type mentioned above. As pointed out above, low cost, commercially available off-the-shelf components have the capability to operate at up to a 10 MHz data rate in a reliable fashion, thereby providing a low cost hardware product.
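The modest buffer size follows from simple arithmetic. A rough comparison (our illustration; the typical transition count per slice is an assumed value) of raw pixel storage against count-value storage for a 1000-slice article:

```python
# Rough memory arithmetic behind the small buffer: storing only edge count
# values instead of raw pixel bits shrinks the per-article data by a factor
# of about 32 for a typical part. EDGES_PER_SCAN is an assumed typical value.

SCANS = 1000              # slices per three-inch article
PIXELS = 2048             # pixels per slice
EDGES_PER_SCAN = 4        # typical transitions per slice (see Figs. 4 and 5)
BYTES_PER_COUNT = 2       # an 11-bit count value fits in two bytes

raw_bytes = SCANS * PIXELS // 8                    # one bit per pixel
compact_bytes = SCANS * EDGES_PER_SCAN * BYTES_PER_COUNT

print(raw_bytes, compact_bytes)                    # 256000 8000
```

Even this pessimistic estimate leaves the compacted data well inside a 64K buffer.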
With reference to Fig. 3, there is illustrated a round of ammunition which has a cylindrical cartridge or casing 56 that is supported on a conveyor surface 18 and a projectile 58.
Fig. 4 contains a group of waveforms taken along line 4-4 of Fig. 3 and Fig. 5 contains a group of similar waveforms taken along line 5-5 of Fig. 3. The Fig. 4 waveforms are taken at a position corresponding to scan slice window 120, whereas the Fig. 5 waveforms are taken at scan slice window 800.
In Fig. 4, the waveform of the amplified analog signal starts at time 0 in a black condition because of the conveyor 18. At pixel 30, which corresponds to count 30 in a counter, light is detected, thereby starting a negative going digital pulse and a positive going edge detector pulse 60. At pixel 100, the lower edge point on the silhouette of the projectile 58 is effective to block light and create a further edge detector pulse 62. At pixel 500, the light is again detected, thereby causing a third edge detector signal 64 to be generated. Finally, at the top of the sensor array, at pixel 2048 of the linear array, the scanner no longer produces a signal and an end of scan transition detector pulse 66 is generated.
A conventional binary counter capable of counting up to at least 2048 at the clock frequency is synchronized with the scan of the 2048 pixels in the camera sensor as indicated at the bottom waveform of Fig. 4. The counter is reset to start at zero as the scan starts so that count values of 30, 100, 500 and 2048 are stored in the memory 52 of Fig. 2 as determined by the time of occurrence of edge detector pulses 60, 62, 64 and 66.
Fig. 5 shows the corresponding waveforms that occur at scan 800. Since the lowest point on the cylindrical casing 56 rests on the conveyor surface 18, the lowest 1499 pixels in the linear array are dark and the first transition occurs with pixel 1500, which is aligned with the upper edge point of the cartridge casing 56 at scan slice position 800.
The edge detector pulse 68 is generated in response to the transition at pixel 1500 and causes the count value of 1500 to fall through the memory 52 to its output terminals. A similar edge detector pulse 70 occurs at count 2048. Thereafter, a master reset pulse is generated either periodically, in which case a constant conveyor speed is required, or by an object displacement monitor, so that the same number of scan slices is produced for each identical work article. The counters are reset to a zero count by a counter reset signal which is synchronized with the beginning of the next scan of the pixels.
Fig. 6 shows one preferred embodiment for converting the digital signals of Figs. 4 and 5 into count values that are supplied to the microprocessor unit (MPU) 54. The digital signal from Fig. 4, in the form of incoming serial binary bits, is applied to terminal 80 of a negative and positive edge detecting network that detects changes in the binary state and issues a 50 ns pulse on lead 82 for each positive or negative edge. At a 10 MHz clock frequency, the scanned information data and clock counts are separated by 100 ns. The 50 ns pulse is used to gate on the memory unit 52 (Fig. 2) which includes FIFO registers 84 as illustrated in Fig. 6.
The three binary counter registers 86 that operate with clock signals on lead 46 are reset by a counter reset signal on lead 88. The count value on leads 50 is constantly presented to the FIFO registers 84. However, the count values are allowed to drop through the FIFO registers 84 only when an edge detector pulse on lead 82 is present. In this example, the count values of 30, 100, 500 and 2048 are stored.
When a count value falls through the FIFO registers 84, the FIFO issues an output ready signal to MPU 54 on lead 92. When the MPU sees an output ready signal, it issues a shift out signal on lead 94 to FIFO registers 84, which releases the count value immediately to the MPU 54. The data at this point constitute coded object image intelligence. This handshaking continues throughout the entire scan cycle and sequentially throughout all scans of an object.
As is evident from the foregoing, for scan 120 only four count values are processed and stored rather than 2048 bits of scan information. Other scans, such as scan 800, may have only two count values that are processed. The number of scans for a three-inch (7.6 cm) object or article may be about 1000. This number may be decreased where less resolution in the horizontal direction is acceptable, thereby further reducing the processing time. This compaction of data increases processing speed and reduces memory size requirements without sacrificing resolution of the silhouette image.
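The transition-to-count scheme can be sketched in software. The following is a hypothetical Python model of ours for the hardware of Fig. 6, not the patent's implementation; the function name and the boolean pixel representation are illustrative assumptions:

```python
# A software model of the Fig. 6 data compaction: a 2048-pixel scan slice is
# reduced to the counter values at which the light/dark state changes, plus
# the end-of-scan count.

def compact_scan(pixels):
    """Return the count values for each light/dark transition in one scan.

    `pixels` is a sequence of booleans, True where the pixel is lit.
    Counts are 1-based, matching the counter that runs with the pixel clock.
    """
    counts = []
    previous = False                      # the scan starts in the dark state
    for count, lit in enumerate(pixels, start=1):
        if lit != previous:               # edge detector fires on any change
            counts.append(count)
            previous = lit
    counts.append(len(pixels))            # end-of-scan transition (count 2048)
    return counts

# Scan slice 120 of Fig. 3: dark belt, light, projectile shadow, light again.
slice_120 = [False] * 29 + [True] * 70 + [False] * 400 + [True] * 1549
print(compact_scan(slice_120))            # [30, 100, 500, 2048]

# Scan slice 800: dark up to the casing's upper edge at pixel 1500.
slice_800 = [False] * 1499 + [True] * 549
print(compact_scan(slice_800))            # [1500, 2048]
```

The two example slices reproduce the count values described for Figs. 4 and 5.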
A horizontal belt type reorientor system is shown in Fig. 7 which is similar from a mechanical standpoint to the article or object recognition and orientation system disclosed in Turcheck et al U.S. Patent No. 4,784,493, but which has been modified to incorporate the high resolution imaging system described in connection with the embodiment of Figs. 1-6. The general environment of the reorientor is diagrammatically illustrated in Fig. 7. The reorientor system may generally comprise a frame supported continuous belt 112 entrained around a driver roll 114 and an idler roll 116. Work pieces such as 118, 120, and 122 are similar parts having three different orientations. The simplest form of reorientor is shown in this figure, that being a stepping motor driven single axis (Y-axis) orientor having a lower chamber 126 that can be rotated 180 degrees. Other orientors, including multiple position reorientors, are known in the art and may be advantageously used with the present invention.
Adjacent the continuous belt 112 at one edge thereof is a fence 128 running the length of the belt but having several breaks therein. On the inbound side of the reorientor means 124 there is a first break in the fence to accommodate a recognition sensor 130 which may be a 16 x 1 array of vertically stacked fiber optic elements connected to 16 individual phototransistors each having a hard wire connection to a vision controller or microprocessor input port 132. An infrared light source 134 composed of dual infrared LEDs adjusted to different angles is directly across the belt from the recognition sensor 130 and provides the necessary illumination to switch the phototransistors related to each of the 16 fiber optic filaments depending upon whether the individual filament is illuminated or shadowed by the work article.
Alternatively, the linear array of sensors may comprise a column of CCD units which provide a pixel density of between about 1000 and 4000 pixels per inch (39-157 pixels/mm) and preferably about 2000 pixels per inch (79 pixels/mm) thereby to provide a high resolution sensor. The CCD units are scanned at a frequency between about 1 MHz and 40 MHz and preferably about 10 MHz to produce an analog signal that is digitized and converted to a count value as described in connection with Figs. 2-6. Hardware compaction of data applied to the microprocessor allows for improved image resolution to be obtained while reducing the processing time and memory size requirements.
The second break in fence 128 is provided to accommodate a first infrared thru beam optical switch composed of a receiver 136 and a light source 138.
Immediately prior to the entry port of the orientor means 124 there may be optionally positioned at a third break in the fence 128, a second infrared thru beam optical switch means having a receiver 140 and a light source 142.
The recognition sensor communicates via a conduit line 148 with a vision controller 144 which in turn is in communication with an orientation controller 146.
Vision controller 144 is hard wired to the work article sensors 130 while the orientation controller 146 is wired to the article recognition sensors 136 and 140 and reorientor 124. A signal related to the movement of the conveyor belt is supplied by lead 158 to orientation controller 146. Control and monitoring of the belt speed may be by shaft encoder 162 which is connected by lead 160 to vision controller 144, since monitoring the belt speed is important for allowing variations in operating speeds without needing to re-learn the article profile just because of changes in conveyor speed.
The identical sample work pieces 118, 120, and 122 chosen for explanatory purposes are shown in Figs. 7 and 9 and comprise a plastic article having a length of about 3 inches provided with a blunt end surface which may be either at the trailing end, as shown at 118 in Fig. 7, to provide orientation A or at the leading end, as is the case for work article 120, to provide orientation B. The work article 122 is shown with a third orientation C. Up to seven orientations may be determined by the program described below.
In operation, work articles 118, 120, and 122 moving along the path of the conveyor belt 112 may be inspected for conformity with a desired and acceptable work piece. In conjunction with such inspection, it is necessary to identify article orientation and make such position changes as are necessary so that all work articles leave the discharge side of the reorientor 124 with the same orientation.
Memory resident in the programmable vision controller 144 is "taught" a plurality of up to seven possible orientations of a work article in a setting procedure prior to the production run. The present embodiment is especially adapted for reducing the time required for determining the actual orientation of work pieces, or for article identification as the case may be.
As explained in the '493 patent, the capacity for data storage in the vision controller 144 is sufficient to store information concerning the edge points of an article as it passes scanner 130. The recognition device operates in a silhouette mode so that only profile data is needed. Each scan represents a slice of the article and produces at least one edge point on the profile. The number of slices per article, for example, may be 1000 depending upon conveyor speed, article length and microprocessor programming.
An article having acceptable dimensions is fed by the conveyor past the array 130 in a first orientation A. This information is stored in a "learn" mode. Typically this procedure is repeated at least once and optionally up to about ten (10) times to obtain an envelope of values or average value for the first orientation.
Next the system is taught to recognize a second orientation B of the same article by the same procedure. Additional orientations C, D, ... of the same article up to a total of seven (7) different orientations can be processed by the system of the prior '493 patent. When all of the required orientations are taught, i.e. stored in vision controller memory 144, the system is advanced from the "learn" mode to a "windows generation" mode before moving on to an "operation" mode allowing the repetitive feeding of work articles. Since the conveyor belt speed is carefully controlled, once the article leading edge has been detected, edge point data for corresponding points that are acquired by successive slice scanning can be identified by slices numbered between one and 1000 in the illustrated example. The edge point data are compared to determine which of the orientation data matches the work article data.
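Comparing the work article against the learned data only at the window slices can be sketched as follows. This is a hypothetical model of ours, not the patent's program: the function and variable names, the dictionary representation of edge data, and the tolerance value are all illustrative assumptions:

```python
# Sketch of the matching step: rather than comparing all ~1000 scan slices,
# the work article is compared with each learned orientation only at the
# precomputed window slices where the orientations differ the most.

def classify(work_edges, learned, windows, tolerance=5):
    """Return the name of the learned orientation matching `work_edges`.

    `work_edges` and each entry of `learned` map slice number -> edge count
    value; `windows` is the small set of slice numbers chosen in the
    "windows generation" mode.
    """
    for name, reference in learned.items():
        if all(abs(work_edges[s] - reference[s]) <= tolerance for s in windows):
            return name
    return None                      # no match: reject or divert the article

learned = {"A": {2: 100, 500: 300, 999: 100},
           "B": {2: 300, 500: 300, 999: 300}}
windows = [2, 999]                   # slices where A and B differ the most
print(classify({2: 101, 500: 298, 999: 98}, learned, windows))  # A
```

Note that slice 500, where the two orientations agree, is never consulted; only the discriminating windows are read.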
Since the time required for processing the edge point data has been a factor limiting the speed at which the conveyor 112 may operate, various efforts have been made in the past to reduce the processing time to allow faster classification of objects by the computer. One previous approach has been to have the operator manually set areas of interest with a keyboard, a mouse, or the like. By this feature of the present invention, the computer automatically locates the areas of maximum variability between the stored object data and the collected work article data without the need for operator participation.
Reference is made to Fig. 8 which shows a flow chart for generating the windows that correspond to numbered slice scans for a specific article whose orientation is to be determined.
The procedure illustrated in Fig. 8 will be described in connection with an article that may have four orientations that must be separately ascertained. The program is capable of detecting up to seven orientations as described above. The four orientations A, B, C, and D are shown in Fig. 9. Before starting the program, the orientations are stored just as described in the prior '493 patent.
With the use of the program of Fig. 8, the scan slices 2-999 where the deviation between the marginal edges presented in the several orientations is at a maximum are identified. The article has an arbitrary length of 1000 scan slices that are oriented along the X axis. The article height is arbitrarily designated to be 400 along the Y axis. The thickness of the parts of the article is assumed to be 100 units as measured along the Y axis.
With continued reference to Fig. 8, the process is initialized by setting a first loop counter A to zero at step 202. At step 204, the counter is incremented. At steps 206 and 208, the leading edge and the trailing edge scan slices for orientation A corresponding to X axis positions of 1 and 1000 in Fig. 9 are stored. This corresponds to scan slices 1 and 1000 assuming that a three-inch article will be sequentially scanned a thousand times as it passes the sensor 130 of Fig. 7. In this embodiment, scan slices 1 and 1000 are always stored for each orientation.
At step 212, a second loop counter B is set to the value of counter A and incremented at step 214 to a position for an iteration with respect to orientation A data. Iteration compares learned data of orientation A with learned data of orientation B by starting with scan slice 2 of both orientation A and orientation B data. The difference in this pixel data at scan slice 2 between orientation A and orientation B is determined and is called a score. The same procedure is followed for scan slices 3 through 999. In all, 998 scores are determined at step 216.
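The per-slice scoring of step 216 can be illustrated by a minimal sketch. The function below is a hypothetical rendering, assuming each learned orientation is held as a list of per-slice edge-point values (1-based slice numbering as in the description) and that a score is the absolute difference between the two orientations' values at a slice; the patent does not prescribe a particular difference measure.

```python
def slice_scores(orient_a, orient_b):
    """Compare two learned orientations slice by slice.

    orient_a, orient_b: per-slice edge-point data, one value per
    scan slice 1..N (hypothetical representation).  Slices 1 and N
    (leading and trailing edges) are skipped, so for N = 1000 the
    998 scores of step 216 are produced, keyed by slice number.
    """
    assert len(orient_a) == len(orient_b)
    # score for slice i: absolute difference of the edge data
    # (lists are 0-based, slice numbers are 1-based)
    return {i: abs(orient_a[i - 1] - orient_b[i - 1])
            for i in range(2, len(orient_a))}
```

With 1000-slice data the result holds exactly 998 scores, and a slice where the two profiles differ by 300 units scores 300.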
At step 218, a scan slice having a large difference, for example the maximum score, is determined. While a reading from only one window is theoretically sufficient to determine that a part orientation does not match a stored known orientation, in practice several slice numbers, for example up to about 20, may be stored where the scores are the largest to reduce the likelihood of ambiguity in the results.
Thus, at step 226, a determination is made as to whether a sufficient number of windows has been generated. If not, the same procedure is repeated. If "yes", the procedure advances to step 222.
At step 222 it is determined whether count B equals the number of orientations. If "no", loop counter B is incremented at step 214 to compare the next orientation data with those of orientation A after which the slice numbers for the next windows are generated. The loop counter B at step 214 continues incrementing until B is equal to the number of orientations stored. When this condition is satisfied, the procedure passes to step 230. If A is not yet equal to one less than the total number of orientations, A is incremented at step 204. When A equals one less than the total number of orientations at step 230, window determination is complete and the procedure halts at step 234. Thus it will be seen that counter A iterates from 1 to one less than the total number of learned orientations, whilst for each value of counter A, counter B iterates from A + 1 to the total number of learned orientations. In this way, the comparison of like or mirror image data sets is avoided, so avoiding duplicate or null results.
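The nested counters A and B described above can be sketched as two loops over each unordered pair of learned orientations. This is a hypothetical rendering, not the patented implementation: the data layout, the absolute-difference score, and the `windows_per_pair` parameter are assumptions for illustration, and 0-based indices stand in for the 1-based slice numbers of the description.

```python
def generate_windows(orientations, windows_per_pair=3):
    """Sketch of the Fig. 8 window-generation loops.

    orientations: list of learned per-slice edge-data lists.
    Counter A runs to one less than the number of orientations;
    for each A, counter B runs from A + 1 upward, so each pair of
    orientations is compared exactly once (steps 204-230) and
    duplicate or mirror-image comparisons are avoided.
    """
    n = len(orientations)
    windows = set()
    for a in range(n - 1):          # counter A (0-based here)
        for b in range(a + 1, n):   # counter B
            # score every interior slice for this pair of orientations
            scores = {i: abs(orientations[a][i] - orientations[b][i])
                      for i in range(1, len(orientations[a]) - 1)}
            # keep the slices with the largest scores for this pair
            best = sorted(scores, key=scores.get, reverse=True)
            windows.update(best[:windows_per_pair])
    return sorted(windows)
```

For two synthetic orientations differing only at three slices, exactly those three slice positions are returned as windows.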
At the end of the setting-up procedure, at least 12 scan slice numbers will be identified as windows since each of the four article orientations will have at least three maximum differences which each produce three windows. Some of the windows appear at the same scan slice.
Where non-standard articles, for example those having unacceptable tolerances or shapes, are to be identified, appropriate windows can be determined automatically by passing one or more acceptable articles through the scanner in the acceptable orientation to generate learned data, e.g. as a part of the foregoing procedure. Then, as a further part of the setting-up procedure, one or more sample non-standard articles can be passed through the scanner to locate scan slice positions where the sensed non-standard article data differs most from the learned data. These positions will define windows appropriate for sensing non-standard but correctly oriented articles, based on the sample non-standard articles used during the setting-up procedure.
Turning to Fig. 9, windows are established by the program of Fig. 8 without operator selection at counts 99, 199, 799 and 899. These are the windows of importance for article orientation determination in the specific example here being described.
Once the windows have been identified, it has been found useful to expand each window to have a width of three or five scan slices centered about the scan slice. Thus, widening of a window compensates for possible data misalignments which can occur due to circuit delays and in some systems due to mechanical wear and other changes which occur during a continuous operation over several weeks.
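The widening step can be sketched as follows. This is an illustrative assumption of how the three- or five-slice bands might be formed; the clipping bounds `first` and `last` are hypothetical parameters keeping the band inside the interior slice range.

```python
def widen_windows(windows, width=3, first=2, last=999):
    """Expand each window slice number into a band of `width`
    slices centred on it (width 3 or 5 per the description),
    clipped to the valid slice range, so that small data
    misalignments from circuit delays or mechanical wear still
    fall inside a window."""
    assert width % 2 == 1, "width must be odd so the band is centred"
    half = width // 2
    widened = set()
    for w in windows:
        for s in range(w - half, w + half + 1):
            if first <= s <= last:
                widened.add(s)
    return sorted(widened)
```

A window at slice 99 widened to width 3 thus covers slices 98 through 100, and a window near the edge of the range is clipped rather than extended past it.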
After the windows are generated as a setting-up operation, work articles are fed past scanner 130 to identify edge points on the article profile. A comparison operates in real time to determine article orientation during an interval that corresponds to the interval between successive work articles on the conveyor.
When a work article is moved past sensor 130 of Fig. 7 which has an orientation A as shown in Fig. 9, a comparison of the work article profile data with each of the learned orientations A, B, C and D is made at the windows previously selected by the program of Fig. 8. Comparing the work article orientation A profile data with stored orientation A data gives a total score of zero. A similar comparison of the same work article data with the stored orientation B data gives a score of 300 at each of the four windows 99, 199, 799 and 899 to thereby produce a total score of 1200. The same comparison with the stored orientation C data gives a total score of 400 and with the orientation D data gives a total score of 1000.
From Fig. 9 it can be seen that regardless of which orientation the work article assumes, one stored orientation match with a score at or near zero will be obtained and the orientation of the work article thereby recognized. Where each window is three or five scan slices wide, the score for orientation mismatches increases while remaining essentially at zero for the actual orientation. The results obtained with a comparison of only four or up to about twenty windows along the length of a three-inch article can be accomplished with less memory and less time than where all scan slice information is processed while at the same time the performance is fully as reliable.
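The real-time classification described above can be sketched with a short function. It is a hypothetical rendering under the same assumptions as the earlier sketches (per-slice edge data, absolute-difference scoring, 0-based indices); it simply totals the differences at the selected windows and reports the learned orientation whose total is nearest zero.

```python
def classify(work, learned, windows):
    """Score a work article's edge data against each learned
    orientation at the selected window slices only, and return
    the best-matching orientation (lowest total score) together
    with all the totals."""
    totals = {}
    for name, data in learned.items():
        totals[name] = sum(abs(work[w] - data[w]) for w in windows)
    return min(totals, key=totals.get), totals
```

Reproducing the worked example: a work article in orientation A scores zero against stored A, while stored B, differing by 300 units at each of the four windows 99, 199, 799 and 899, totals 1200, so orientation A is recognized.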
The present invention may be used with any suitable type of object or article feeder. To identify orientation and part size, the articles must appear before the scanner in single file because overlapping parts will be returned to the supply. Fig. 10 is a diagrammatic view of a vibratory bowl 301 that is a feeder type known per se, and which can be used in a high resolution parts or article handling system embodying the present invention. The vibratory bowl feeder 301 is shown to have a light source 302 and a high resolution camera sensor 303. A positional switch 305 and reorienting or diverting device 304 are located downstream of the camera sensor 303. A vision system controller 306 and overall system controller 307 are included.
Articles to be examined are placed in the center of bowl 301 and the vibratory motion of the feeder causes the articles to move along a path next to the bowl run rail in a singulated fashion to pass between the light source 302 and the sensor 303 in a known manner. An electronic image of the article is formed at the vision controller 306 where decisions are made relating to the condition of the article or its orientation. An appropriate control signal is sent to the re-orientor/diverter 304 at a time determined by switch 305.
Fig. 11 illustrates a further article conveyor which can be used with the high resolution article handling system according to the invention. In this embodiment, a slide 310 having an article inlet 309 and an article outlet 335 is utilized. Articles enter in a singulated condition at 309. The slope of the slide is selected so that the articles proceed down the chute under the influence of gravity to pass between the light source 311 and the camera sensor 315. An electronic image is created as the article breaks the light path from the light source, in timed relation with the movement of the article down the chute 310. Article orientation is sensed, the information processed within the controller 340, and a signal is generated that will produce a proper response by reorientor/diverter 325. An improper article may be rejected at 330 or passed in a known orientation at 335.
While several embodiments have been illustrated, it is expected that other changes and modifications will be apparent to those skilled in this art. All such modifications and changes which fall within the scope of the claims and equivalents thereof are intended to be covered thereby.

Claims (15)

CLAIMS:
1. A system for geometric inspection of articles, comprising:- means for passing articles to be inspected along a path adjacent to an article sensor responsive to article geometry, the sensor having a number of sensing elements arranged to view a linear array of pixels; and means for compacting pixel data obtained from the sensing elements, by producing a count value for each article edge point detected in a scan of the pixel array.
2. A system for geometric inspection as claimed in claim 1, comprising means responsive to article position as each successive article is advanced along the path, and which is arranged to initiate a scan of the array in timed relation to the article advancement, such that the articles are scanned by the sensing elements in a series of substantially evenly spaced slices, independently of the article advancement speed.
3. A system for geometric inspection as claimed in Claim 1 or Claim 2 employing automatic windowing, wherein sets of respective count values related to the profile of reference articles in at least first and second different configurations are generated as a setting-up procedure, the system comprising means for comparing the corresponding count values in each set to determine one or more selected scans defining window positions, by identifying where the difference in compared values is large.
4. A system for geometric inspection as claimed in any preceding Claim, comprising means for selectively diverting articles from, or reorienting articles on, said path, in dependence upon the count values.
5. A system for geometric inspection as claimed in Claim 4 wherein the article diverting or reorienting means comprises a wiper or gate that is actuated by a pneumatic cylinder or electrical solenoid.
6. A system for geometric inspection as claimed in Claim 5 wherein the article diverting or reorienting means is located downstream of the article sensor.
7. A system for geometric inspection as claimed in any of Claims 4 - 6 wherein the article diverting or reorienting means recirculates articles for re-inspection.
8. A system for geometric inspection as claimed in any of Claims 4- 6 wherein the article diverting means removes articles having unrecognizable geometry from the system.
9. A system for geometric inspection as claimed in any preceding Claim wherein the means for passing articles comprises a conveyor belt.
10. A system for geometric inspection as claimed in Claim 9 wherein articles in a known orientation are removed to an external system by an index or rocker mechanism.
11. A system for geometric inspection as claimed in any of Claims 1 - 8 wherein the means for passing articles comprises a vibratory bowl type article feeder.
12. A system for geometric inspection as claimed in any of Claims 1 - 8 wherein the means for passing articles comprises a gravity chute.
13. A system for geometric inspection as claimed in any preceding claim, wherein the sensor includes at least 1000 light-sensitive sensing elements, oriented to be selectively illuminated or shadowed by articles passing thereby.
14. A system for geometric inspection as claimed in any preceding Claim, wherein the sensing elements are scanned to produce an analog signal, and an A/D converter is used to produce digital data transitions corresponding to edge points of the article appearing in the pixel array during each scan, said count values corresponding to said transitions.
15. A system for geometric inspection as claimed in any preceding Claim, including a memory, comparator means and an article diverter or reorienter, wherein reference articles are scanned to create learned article count values stored in the memory as a setting-up procedure, the stored article count values are compared in real time with the count values of work articles inspected by the system and the diverter or reorienter actuated in dependence upon the comparison result.
GB9119774A 1990-09-17 1991-09-16 High resolution parts handling system Expired - Fee Related GB2248931B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US07/583,117 US5233328A (en) 1990-09-17 1990-09-17 Method for processing compacted data
US07/583,256 US5103304A (en) 1990-09-17 1990-09-17 High-resolution vision system for part inspection
US07/586,167 US5157486A (en) 1990-09-21 1990-09-21 High resolution camera sensor having a linear pixel array
US07/586,189 US5142591A (en) 1990-09-21 1990-09-21 High resolution camera with hardware data compaction
US58693990A 1990-09-24 1990-09-24
US58744890A 1990-09-25 1990-09-25

Publications (3)

Publication Number Publication Date
GB9119774D0 GB9119774D0 (en) 1991-10-30
GB2248931A true GB2248931A (en) 1992-04-22
GB2248931B GB2248931B (en) 1995-01-04

Family

ID=27560152

Family Applications (6)

Application Number Title Priority Date Filing Date
GB9119774A Expired - Fee Related GB2248931B (en) 1990-09-17 1991-09-16 High resolution parts handling system
GB9119775A Expired - Fee Related GB2248932B (en) 1990-09-17 1991-09-16 Method for processing compacted data
GB9119780A Expired - Fee Related GB2248934B (en) 1990-09-17 1991-09-16 Automatic windowing for article recognition
GB9119777A Expired - Fee Related GB2248685B (en) 1990-09-17 1991-09-16 High-resolution vision system for part inspection
GB9119778A Expired - Fee Related GB2248686B (en) 1990-09-17 1991-09-16 High resolution camera sensor having a linear pixel array
GB9119776A Expired - Fee Related GB2248933B (en) 1990-09-17 1991-09-16 High resolution camera with hardware data compaction

Family Applications After (5)

Application Number Title Priority Date Filing Date
GB9119775A Expired - Fee Related GB2248932B (en) 1990-09-17 1991-09-16 Method for processing compacted data
GB9119780A Expired - Fee Related GB2248934B (en) 1990-09-17 1991-09-16 Automatic windowing for article recognition
GB9119777A Expired - Fee Related GB2248685B (en) 1990-09-17 1991-09-16 High-resolution vision system for part inspection
GB9119778A Expired - Fee Related GB2248686B (en) 1990-09-17 1991-09-16 High resolution camera sensor having a linear pixel array
GB9119776A Expired - Fee Related GB2248933B (en) 1990-09-17 1991-09-16 High resolution camera with hardware data compaction

Country Status (1)

Country Link
GB (6) GB2248931B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE9801170L (en) * 1998-04-02 1999-10-03 Photonic Systems Ab Method and system for monitoring or scanning an object, material or the like
CN105136045B (en) * 2015-09-22 2018-01-05 北京佰能盈天科技有限公司 One kind collection volume station, which is coiled, surveys long method
CN108445808B (en) * 2018-03-30 2024-08-27 深圳一清创新科技有限公司 Sensing device and method for data synchronization

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2140603A (en) * 1983-05-27 1984-11-28 Pa Consulting Services Adaptive pattern recognition
US4678920A (en) * 1985-06-17 1987-07-07 General Motors Corporation Machine vision method and apparatus
US4711579A (en) * 1986-08-12 1987-12-08 H. Fred Johnston System for automatically inspecting a flat workpiece for holes
US4858156A (en) * 1985-05-22 1989-08-15 Soudronic Ag Apparatus for examining objects

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4509075A (en) * 1981-06-15 1985-04-02 Oxbridge, Inc. Automatic optical inspection apparatus
US4608709A (en) * 1983-03-08 1986-08-26 Owens-Illinois, Inc. Method and apparatus for gauging containers

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2140603A (en) * 1983-05-27 1984-11-28 Pa Consulting Services Adaptive pattern recognition
US4858156A (en) * 1985-05-22 1989-08-15 Soudronic Ag Apparatus for examining objects
US4678920A (en) * 1985-06-17 1987-07-07 General Motors Corporation Machine vision method and apparatus
US4711579A (en) * 1986-08-12 1987-12-08 H. Fred Johnston System for automatically inspecting a flat workpiece for holes

Also Published As

Publication number Publication date
GB2248933A (en) 1992-04-22
GB2248686A (en) 1992-04-15
GB9119775D0 (en) 1991-10-30
GB2248934A (en) 1992-04-22
GB2248932A (en) 1992-04-22
GB9119780D0 (en) 1991-10-30
GB9119774D0 (en) 1991-10-30
GB2248934B (en) 1994-11-30
GB2248685A (en) 1992-04-15
GB2248686B (en) 1994-12-14
GB9119777D0 (en) 1991-10-30
GB2248931B (en) 1995-01-04
GB2248932B (en) 1994-10-12
GB2248685B (en) 1994-10-19
GB2248933B (en) 1994-08-31
GB9119778D0 (en) 1991-10-30
GB9119776D0 (en) 1991-10-30

Similar Documents

Publication Publication Date Title
US5311977A (en) High resolution parts handling system
CA1252849A (en) Glassware inspection using optical streak detection
US6701001B1 (en) Automated part sorting system
US5065237A (en) Edge detection using patterned background
US4896211A (en) Asynchronously triggered single field transfer video camera
US4446481A (en) Automatic product inspection system
EP0366235B1 (en) Monitoring systems and methods
US4949172A (en) Dual-mode TDI/raster-scan television camera system
US4678920A (en) Machine vision method and apparatus
EP0227404B1 (en) Sorting
US5223917A (en) Product discrimination system
US5111411A (en) Object sorting system
US5157486A (en) High resolution camera sensor having a linear pixel array
JPH0781955B2 (en) Method for removing opaque foreign matter in transparent body
NO802398L (en) PROCEDURE AND DEVICE FOR AA CLASSIFYING PIECE GOODS IN MOVEMENT
EP0764846B1 (en) Container inspection with field programmable gate array logic
AU645123B2 (en) Automatic windowing for article recognition
GB2248931A (en) High resolution parts handling system
US4742555A (en) Pattern processor controlled illuminator
US5018864A (en) Product discrimination system and method therefor
EP4194108A1 (en) Small-grain agricultural product color selection method combining area scan and line scan photoelectric features
AU646489B2 (en) High resolution camera with hardware data compaction
JP3694590B2 (en) Agricultural product image reading apparatus and sorting apparatus using the same
RU2824518C1 (en) Optical separator
JP2765181B2 (en) Visual device for moving work and work posture determination device

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20040916