INSPECTION SYSTEM FOR EXTERIOR ARTICLE SURFACES
Background of the Invention
This application pertains to the art of video inspection and more particularly to automated high speed inspection of a stream of similar articles.
The invention is particularly applicable to automated, real-time inspection of beverage containers and will be described with particular reference thereto. However, it will also be appreciated that the invention has broader application, such as in the inspection of any article wherein both exterior layout and coloration are definable within set ranges or standards.
Automated, real-time inspection is evolving into an essential element in the production of mass-manufactured goods. Such inspection is particularly essential in the food packaging industry. Flaws in or on containers can result in loss of product, poor consumer impression, as well as possible contamination or spoilage.
A first application of automated video inspection to the container industry was for analysis of container subassemblies. For example, can ends for beverage containers could be inspected for imperfections in the material or formation. Another generation of inspection devices functioned to analyze interiors of these containers.
While current systems perform quite well for analyzing sub-assemblies, stock, and interiors, no system adequately addresses the particular detailed inspection requirements for container exteriors. Of particular importance is the integrity of the exterior indicia, such as labeling, on a conventional beverage can. Current systems are inadequate for the detailed inspection of both the integrity of a container exterior and the coloration thereof. The present invention contemplates a new and improved exterior inspection system which overcomes the above-referenced problems, and others, and provides an exterior video inspection system which allows for detailed real-time analysis of article exteriors, inclusive of label integrity and coloration.
Summary of the Invention
An inspection system for article exteriors includes a plurality of cameras. At least one of the cameras is sensitive to colors. The remaining camera or cameras are suitably black and white. A series of specimens is communicated to an inspection area upon which lenses of all of the cameras are focused. Once a specimen is disposed in the inspection area, a strobe light is pulsed to illuminate it. From this illumination, each camera captures a still image of a portion of the specimen. An
analysis is completed against the black and white image or images to determine integrity of the exterior surface, such as structural and label integrity. An analysis of a color image is completed to determine whether the resultant colors are acceptable relative to preselected characteristics. The results of this analysis allow for generating a signal representative of the acceptability of the specimen. Specimens that are deemed unacceptable are removed from the stream of acceptable specimens. In accordance with a more limited aspect of the subject invention, a system is provided for acquiring a training set representative of the characteristics or range of characteristics for specimens deemed acceptable. This training set is, in turn, used to determine the acceptability of future, inspected specimens.
In accordance with yet a further aspect of the present invention, the images acquired from a plurality of black and white cameras are conjoined prior to analysis of the data disposed therein. An advantage of the present invention is the provision of a real-time inspection system which allows for inspection of an exterior surface, surface labeling, and surface coloration of a stream of similar articles.
Yet another advantage of the present invention is the provision of an inspection system which may be readily adapted to various or varying specimens.
Yet another advantage of the present invention is the provision of an inspection system which allows for archiving of data on unacceptable specimens to allow for correction in the manufacturing process to address the concern.
Yet a further advantage of the present invention is the provision of a real-time container exterior inspection system which may be used in a closed-loop feedback manner so as to allow for continuous updating and refinement of data representing an acceptable specimen.
Further advantages will become apparent to one of ordinary skill in the art upon a reading and understanding of the subject specification.
Brief Description of the Drawings
The invention may take physical form in certain parts, and arrangements of parts, a preferred embodiment of which will be described in detail in this specification and illustrated in the accompanying drawings which form a part hereof and wherein:
FIGURE 1 illustrates an overall setup of the inspection system of the present invention;
FIGURE 2 provides a block diagram of the subject inspection system;
FIGURE 3 illustrates suitable placement for multiple cameras around a specimen disposed in an inspection area;
FIGURE 4 illustrates a suitable field of view accomplished via a single camera on a specimen;
FIGURE 5 illustrates a transition from an RGB signal acquired from a color camera in the subject invention into an HSI model to facilitate detailed analysis of the resultant image data;
FIGURE 6 illustrates a suitable histogram output for an analysis of color data acquired from a color camera;
FIGURE 7 is a flow chart of an image combining and analysis undertaken in connection with the subject inspection system;
FIGURE 8 provides a flow chart of overall system timing for real-time analysis and specimen rejection in a series of specimens;
FIGURE 9 is a flow chart for a training sequence for creating template data representative of acceptable specimens;
FIGURE 10 provides a flow chart for automated video camera gain control settings for use in the subject system; and
FIGURE 11 is a flow chart for a suitable automated light calibration routine for use in connection with the subject invention.
Detailed Description of the Preferred Embodiment
Turning now to the drawings wherein the showings are for the purpose of illustrating the preferred embodiment of the invention only, and not for the purpose of limiting the same, FIGURE 1 shows a diagram of the inspection system A working on a series or stream of articles or specimens B. The inspection system A will be detailed in particular below. As illustrated, the system completes an inspection of the specimen stream B. The inspection system A includes a main processor section 10 which works in concert with a storage medium 12, suitably comprised of a hard disk, tape unit, floppy disk, magneto-optical disk, or the like. A suitable CPU is a SPARC 3PU-3CC. Outputs of the main processor section 10 are also communicated to a monitor unit 14. The actual inspection is accomplished in an inspection area disposed within an inspection unit or tunnel 16. The tunnel 16 functions to isolate specimens from ambient light during illumination and imaging thereof in a manner detailed below.
Turning now to FIGURE 2, illustrated is a block diagram of the overall inspection system of the subject invention. A conveyor 20 communicates a series of similar specimens such as can 22 as illustrated. A specimen arriving at inspection area 24 is detected by a first part detector suitably comprised of photosensor 26. At this
point a strobe light (not shown) is initiated. During strobe illumination of the specimen 22 disposed in the inspection area 24, each of a series of radially disposed cameras acquires a still image of a unique portion thereof. In the embodiment of FIGURE 2, a lens of color camera 30 is focused on the specimen 22 in the inspection area 24. Six monochrome or black and white cameras 32(a)-32(f) are also similarly trained on each specimen disposed in the inspection area 24. The cameras 32 are adapted to be sensitive to gray levels of the images captured thereby. Thus, when the specimen is illuminated by operation of the strobe, still images are acquired by six black and white cameras and a single color camera in the disclosed embodiment. In this embodiment, the monochrome or black and white image capture is accomplished using the six cameras 32 spaced evenly at 60° intervals around the specimen, such as can 22. In order to minimize distortion, each can is advantageously placed approximately centrally in the inspection area 24, with each camera at a focal distance of approximately 18 inches from the can.
The cameras 32 used in the preferred embodiment implement a single frame output of 640 horizontal pixels by 480 vertical pixels. To best utilize resolution of the cameras, they are oriented such that the 640 horizontal pixels are aligned with a top-to-bottom axis (also referred to as a vertical wall) of a specimen. To
reconstruct a can's image, a useful area from each camera is advantageously selected as 640 horizontal pixels by 144 vertical pixels. This allows for reassembly of the images to form an effective image of 864 pixels by 640 pixels. By way of example, this results in an image resolution of approximately 9.5 to 9.8 mils per pixel for a typical beverage container.
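By way of illustration, the resolution arithmetic above may be checked with a short sketch. The can body diameter of 2.6 inches assumed below is illustrative only and is not part of the disclosure:

```python
import math

# Per-camera useful area, as selected in the specification
H_PIX, V_PIX = 640, 144          # horizontal x vertical pixels kept per camera
N_CAMERAS = 6                    # black and white cameras at 60 degree spacing

# Stitched image: six 144-pixel bands reassembled around the circumference
stitched = (N_CAMERAS * V_PIX, H_PIX)    # (864, 640)

# Illustrative figure only: a common beverage can body diameter of about
# 2.6 inches is ASSUMED here; the specification does not state one.
DIAMETER_IN = 2.6
circumference_in = math.pi * DIAMETER_IN
mils_per_pixel = circumference_in / (N_CAMERAS * V_PIX) * 1000.0

print(stitched)            # (864, 640)
print(mils_per_pixel)      # roughly 9.5 mils per pixel around the circumference
```

With the assumed diameter, the circumferential resolution works out to roughly 9.5 mils per pixel, consistent with the 9.5 to 9.8 mil range stated above.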
A color camera 30 is used to induct a color image. Camera-to-can orientation is advantageously chosen to be the same as that for the black and white cameras 32. Thus, the resulting resolution of the color camera is 640 horizontal pixels for a top-to-bottom (height) dimension of a can and 240 pixels for a section of a circumference of a can. As will be detailed below, the primary color RGB (red, green, blue) outputs of the color image are digitized and converted to HSI (hue, saturation, intensity) while an image is being inducted.
It will be appreciated that actual pixel-to-size resolution of cans varies in accordance with physical dimensions of a can or other specimen to be analyzed. Actual images acquired from the cameras are communicated to a series of image processor modules 40(a)-40(d). These modules, in turn, include a digital acquisition subassembly, an analog scanner, an analog scanner acquisition module, an analog generator display module, an arithmetic unit, and an advanced processor module for convolution, statistical analysis, look-up-table ("LUT") processing, and morphology.
The color camera 30 outputs its digitized image first to a color module 42 before communicating it to its own image processor module 40(d). Data from each of the image processor modules 40 is communicated to a real-time host disposed in the main processor section 10. In the preferred embodiment, the processor module 10 includes a RISC-based processor, such as a host 50 computer comprised of the SPARC workstation noted above. It includes as well a real-time controller 52 in data communication therewith. The processor module 10 allows for comparison of a retrieved video image with data representative of acceptable specimens. Details of this will be provided further below. The processor module 10 also operates in connection with a second photosensitive part detector 54 and a blow-off mechanism 56. The real-time host 46 tracks progress of specimens along the conveyor 20. Those specimens which are determined to be unacceptable relative to preselected standards are removed from the conveyor, by operation of the blow-off mechanism 56, when the part detector 54 senses the presence of the specimen.
Also disclosed in FIGURE 2 is a provision for a suitable power supply 60, the particulars of which will be appreciated by one of ordinary skill in the art. The system also advantageously includes a host 64 which is
placed in data communication with the real-time host 46. The host 64 is advantageously provided with a graphical user interface ("GUI") as illustrated at 68.
Turning now to FIGURE 3, a diagram evidencing the stream of specimens B, inspection area 24, as well as the cameras 32(a)-32(f) is illustrated. It will be seen that, in the preferred embodiment, all of the black and white cameras are spaced equidistantly around a perimeter of the inspection area 24. The subject system anticipates a minimum inter-specimen spacing d. This distance is highly application specific and will be appreciated to be a function of the image complexity and the amount of available computing power provided by the processor module 10 and its associated support hardware. For example, a typical beverage container might require a spacing of 4.7 inches to
5.6 inches with the disclosed hardware platform.
Turning to FIGURE 4, illustrated in detail is a particular image segment captured by a single camera. It will be appreciated that each camera 32 will acquire a similar image from a different portion of the specimen.
Detailed video inspection of specimen exteriors requires an ability to detect defect categories such as spots, smears/smudges, off-registration, and color defects.
A spot is suitably defined as a smaller, singular defect with some measurable area. A spot might be caused by any problem which results in a different pattern appearing in a given area. For example, spots can be caused by defects such as scratches, extra or missing ink, certain dents, or any type of dirt.
Smears or smudges typically lack the contrast associated with spots. Accordingly, detecting such defects requires analysis over a larger surface area.
Off-registration occurs when two or more colors are not aligned correctly. For example, wherever a color boundary occurs, a discrepancy similar to a spot or smear may result.
Color defects are those which are related to fade or darkening. Color defects include hue-related defects.
The subject system accomplishes both monochrome inspection and color inspection. The monochrome inspection consists of a line-by-line, pixel-by-pixel comparison between a known image, referred to as a template, and a final conjoined or warped image obtained from the plurality of black and white cameras. The actual analysis of the color aspects of the present invention will be detailed next.
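The pixel-by-pixel monochrome comparison described above may be sketched as follows. The function name, the per-pixel tolerance template, and the defect-pixel budget are hypothetical illustrations; the specification does not fix these parameters:

```python
def inspect_monochrome(image, template, tolerance, max_defect_pixels=50):
    """Pixel-by-pixel comparison of a conjoined image against a template.

    All arguments are equal-sized 2-D lists of gray levels (0-255).
    A pixel is flagged when it deviates from the template by more than
    its per-pixel tolerance; the specimen passes when the number of
    flagged pixels stays within an assumed defect-pixel budget.
    """
    defects = 0
    for row_img, row_tpl, row_tol in zip(image, template, tolerance):
        for pix, ref, tol in zip(row_img, row_tpl, row_tol):
            if abs(pix - ref) > tol:
                defects += 1
    return defects <= max_defect_pixels
```

A per-pixel tolerance template (rather than a single global threshold) lets high-detail regions of a label carry looser limits than flat color fields.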
Turning to FIGURE 5, an RGB color cube 90 evidences color perception as provided by human cognizance.
While such primary color combinations are acceptable for humans, they do not adapt well to machine analysis.
Accurate technical measurement is provided with analysis of
HSI (hue, saturation, intensity) information. Thus, the subject system accomplishes a conversion from an RGB (red, green, and blue primaries) format, such as is output from a typical color camera, to an HSI format. The system accomplishes this translation via the disclosed hardware. In the illustration of FIGURE 5, the RGB output of the color camera or cameras is illustrated at 94. The actual hardware translation is provided by inputting the RGB signals to a translator 92 which works in concert with a first look-up table ("LUT") 96 and a second LUT 98. The details of the translation hardware are well within the understanding of one of ordinary skill in the art and will not be repeated herein. Once translated, the data from the acquired color image is provided in an HSI model illustrated generally at 100. In the HSI model, hue is represented circumferentially, saturation is represented radially, and intensity is represented along a vertical (top-to-bottom) axis.
Hue describes a gradation in color or color shift. Saturation is defined as the amount of color present. Intensity describes the level of brightness. In the subject system, a series of acceptable specimens is provided to the system and resultant color data is obtained. This data forms a histogram which sets forth a range of acceptable color parameters. Such a histogram suitable for analysis is illustrated in FIGURE 6.
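The RGB-to-HSI translation performed in hardware by translator 92 and LUTs 96, 98 corresponds to the standard textbook conversion, sketched here in software purely for illustration:

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert RGB (0-255 each) to HSI.

    Returns (hue in degrees, saturation 0-1, intensity 0-1).  This is
    the conventional transform; the disclosed system performs the
    equivalent mapping in hardware via look-up tables.
    """
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    i = (r + g + b) / 3.0
    if i == 0.0:
        return 0.0, 0.0, 0.0           # black: hue and saturation undefined
    s = 1.0 - min(r, g, b) / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0.0:
        h = 0.0                        # achromatic gray: hue undefined
    else:
        h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
        if b > g:
            h = 360.0 - h              # lower half of the hue circle
    return h, s, i
```

For example, pure red maps to a hue of 0°, full saturation, and one-third intensity; pure blue maps to a hue of 240°.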
Turning to FIGURE 7, a flow chart showing the actual inspection or "run mode" operation of the subject inspection system is described. At block 104, digitized image data from each of the seven cameras of the preferred embodiment is acquired. The system directs image acquisition from the image processor modules 40 (FIGURE 2). Each such acquisition unit interfaces between two black and white cameras 32 and analog-to-digital image processor modules. Three such image processors are used for acquiring images in the preferred embodiment. Three more image processors are suitably used to hold three memory stores in a gain/offset correction portion from which color analysis is performed. An acquisition pipe is used to acquire data from the digitizer. From an A/D module, data is piped into two pairs of memory stores.
Next, at block 106, an actual edge detection is accomplished. This is done by: (1) selectively passing codes through a look-up table to acquire a binary representation; (2) selectively writing codes to columns within a can image; (3) writing code for a host to find edges, top, middle, bottom, and center lines using information acquired from the image processors; and (4) writing code for a host to generate coefficients for unwarping.
In step 108, the system accomplishes an unwarping of the images. A warping pipe uses processing rectangles selected on a second pair of stores. These memories are
used to unwarp the image. Coefficients for the warp generation are provided by the host computer 10 (FIGURE 1). In block 110, an auto filter is provided to adjust acquired images for gain and offset correction. An unwarped image advantageously has high and low light intensity spots corrected on its image. Such intensity spots may be due to such things as a filament from a light source. Such light correction information is obtained from a template constructed while the system is provided in its learn mode. A correction type is created and an image is sent through an arithmetic unit module of the processor wherein corrections are made.
Next, at block 112, the system finds a fiducial portion of the images. A fiducial is some repeatable pattern or characteristic in an acquired image which, when located, is used to define a single, unique location on the specimen's exterior or decoration.
Once an image has completed an unwarping at block 108, a fiducial or reference point on a can is located. Once fiducials have been found or isolated in block 112, an actual analysis of the image is completed. The image analysis includes pattern matching and tolerance template checking at block 114 and color analysis at block 116. A summation of inputs is accomplished at block 118. It
will be appreciated that inter-processor communications are also utilized. This is provided for at block 120.
Turning next to FIGURE 8, the actual tracking of specimens along the conveyor 20 is detailed in flow chart form. A tracking operation is commenced at block 140. From this point, actual progress of the conveyor 20 is monitored from an associated gear clock at block 142. The illumination of the specimen disposed in the inspection area 24 is accomplished via the snap control 144 which is synchronized to the conveyor and the particular specimen.
It will be appreciated that actual acquisition of the image and analysis thereof is completed between snap control block 144 and the second part detector 54 (FIGURE
2) as indicated at block 146. Parts that have been determined defective relative to preselected standards are eliminated or removed from the stream by blow-off mechanism 56 (FIGURE 2) as evidenced by block 148. At block 150, an actual graphical user interface image is provided to the user. Turning now to FIGURE 9, a learn mode for acquiring the template or test data from which a determination as to specimen acceptability is made is described. At block 200, the system snaps seven pictures, six monochrome or black and white pictures and one color image in the preferred embodiment. From the resultant images, an
analysis is completed by edge detection at block 202, unwarping at block 204, filtering at block 206, fiducial analysis at block 208, and an image conjoining or "stitching" at block 210. It will be appreciated that the data analysis and acquisition is similar to that provided in the actual run mode. The learn mode differs from the run mode in that results from several test iterations (100 in the disclosed embodiment) are averaged at subsequent block 212. This averaged information provides the template data which will be used to actually determine specimen acceptability during run mode.
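The learn-mode averaging of block 212 may be sketched as follows, a minimal illustration in which the training set would be the 100 aligned snapshots of known-good specimens noted above:

```python
def build_template(images):
    """Average a training set of aligned gray-level images into a template.

    `images` is a non-empty list of equal-sized 2-D lists of gray levels
    (e.g. 100 snapshots of acceptable specimens, as in the disclosed
    embodiment).  Each template entry is the mean of that pixel across
    the whole training set.
    """
    n = len(images)
    rows, cols = len(images[0]), len(images[0][0])
    template = [[0.0] * cols for _ in range(rows)]
    for img in images:
        for r in range(rows):
            for c in range(cols):
                template[r][c] += img[r][c] / n
    return template
```

Averaging over many specimens smooths out frame-to-frame noise so the template reflects the acceptable range rather than any single specimen.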
In the preferred embodiment, another template is provided for light balancing. It will be appreciated that light intensity may vary in certain areas of an image due to such things as light filaments or reflections. Such a template is advantageously created by using a solid white specimen. Such a specimen is chosen to be uniformly colored on the surface thereof. A picture taken of this specimen, in an ideal instance, would have the same gray level for every pixel. Any variation from a chosen reference point is seized upon and provided into an intensity template as an offset from the ideal.
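The light-balancing template derived from the solid white specimen may be sketched as follows. The reference gray level of 128 is an assumed value; the specification leaves the reference point to the implementer:

```python
def light_balance_template(white_image, reference_level=128):
    """Build an intensity-correction template from a solid white specimen.

    Each entry is the offset of the observed gray level from a chosen
    reference level (128 is ASSUMED here).  Subtracting these offsets
    from later images flattens filament hot spots and reflections.
    """
    return [[pix - reference_level for pix in row] for row in white_image]

def apply_light_balance(image, offsets):
    """Apply the offset template to a newly acquired image."""
    return [[pix - off for pix, off in zip(row, orow)]
            for row, orow in zip(image, offsets)]
```

A hot pixel reading 130 against a reference of 128 yields an offset of +2, so later images have 2 subtracted at that location before inspection.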
Turning now to FIGURE 10, a flow chart detailing the camera gain routine which functions to properly set up a gain and offset of each camera analog-to-digital ("A/D") converter is described. Video gain is related directly to
a lens aperture setting, camera gain, and A/D circuit gain. Since a light source directly affects apparent video gain, the subject system provides for isolation of a light source from a camera's related gain components. This is accomplished by calibration. A suitable source for calibration is a fixture approximately the size of a specimen with one or more horizontal light bars, each consisting of an evenly diffused light source.
In FIGURE 10, the camera gain routine is initialized at block 300. At this point, a calibration fixture is positioned in the inspection area on the conveyor. Once the routine is entered from block 300, it is "free running." From block 300, progress is made to a time delay, suitably one second, at block 302. A first of a series of images, seven in an embodiment employing seven cameras, is then procured at block 304. A loop is next undertaken for each of the seven cameras at block 306. In the preferred embodiment, starting at line 240, a measurement of a delta between every eighth pixel is then accomplished at block 308. The largest value of a delta achieved in block 308 is calculated at block 310. Next, at block 312, pixels are highlighted which have been measured on a selected image. A location of the largest delta found is noted. The system then progresses to block 314. At this point, a display of the delta is generated. A darkest and
brightest picture is also measured. Next, at block 316, a determination is made as to whether all seven cameras have been measured. A "no" determination causes progress to block 318. At this point, the camera number is incremented and progress is returned to block 308. A "yes" determination at block 316 causes progress to block 320. At block 320, a determination is made as to whether new gain and offset parameters have been entered by a user at the graphical user interface. A "yes" determination allows an operator to save the parameters selectively at block 322. Such a save causes progress to block 324, at which point an updating of new gain and offset parameters is made to the image processor modules. Progress is then returned to block 302. A negative determination made at block 322 causes either an exit of the program or a return to block 304, depending on an operator selection.
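The delta measurement of blocks 308 through 312 may be sketched as follows. This is an illustrative reading of the routine; the specification starts at line 240 of each image but does not dictate a return format:

```python
def largest_eighth_pixel_delta(line):
    """Scan one horizontal video line and return the largest gray-level
    delta found between every eighth pixel, plus the pixel location at
    which it occurs (as in blocks 308-312 of FIGURE 10).

    `line` is a list of gray levels (0-255) for a single line; the
    routine in the specification applies this starting at line 240.
    """
    best_delta, best_at = 0, 0
    for i in range(0, len(line) - 8, 8):
        delta = abs(line[i + 8] - line[i])
        if delta > best_delta:
            best_delta, best_at = delta, i
    return best_delta, best_at
```

A large delta across the evenly diffused calibration bar indicates a gain or offset mismatch worth highlighting to the operator.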
Turning next to FIGURE 11, a light calibration routine advantageously employed in the above-described system setup is described. The light calibration routine serves to provide maintenance personnel of the system with a method to quickly and accurately adjust a light assembly for the most even top-to-bottom lighting of specimens. The operator is also provided with an ability to set the D/A converter values associated with each strobe supply. In the preferred embodiment, the light calibration routine of
FIGURE 11 uses a specimen which is as evenly colored as possible, as noted above. The routine is commenced at block 400. Thereafter, progress is made to block 402, at which point the A/D circuit values are updated. A one second delay is then initiated at block 404.
Next, a snap induction of images is undertaken at block 406, one image at a time. One camera image is selected to be first. Thereafter, a measurement loop is entered at block 408. Again, in the preferred embodiment, starting at line 240, pixel locations 20-24 are isolated at block 410. At block 412, measurement of four pixels of each image is undertaken. An averaging of these pixels is completed at block 414. These values are then stored in a buffer at block 416. Next, at block 418, a determination is made as to whether eight measurements have been taken. If the answer is no, progress is made to block 420. At this point, the horizontal line counter is incremented by 75 and the system is returned to block 412. A positive indication from block 418 causes progress to block 422. Here, an output of a gray level measurement is made to a graphical user interface screen. Next, a determination at block 424 is made as to whether all cameras have been accommodated. If the answer is no, progress is returned to block 412. If the answer is yes, progress is made to block 426.
At block 426, a determination is made as to whether an operator has asked for an update of the earlier values to be made. If the answer is yes, these values are written into the A/D circuits by progressing to block 402. If the answer is no, a one second delay is reinitiated as progress is passed again to block 404.
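The measurement loop of blocks 408 through 420 may be sketched as follows. Frame dimensions are an assumption here: the disclosed arithmetic (starting line 240 plus seven steps of 75 lines) reaches line 765, so a sufficiently tall image is assumed for illustration:

```python
def vertical_light_profile(image, start_line=240, line_step=75,
                           pixel_start=20, pixel_count=4, samples=8):
    """Average a small pixel window at several lines down the image to
    gauge top-to-bottom lighting evenness (blocks 408-420 of FIGURE 11).

    Defaults mirror the preferred embodiment: start at line 240, average
    four pixels beginning at location 20, advance 75 lines per step, and
    take eight measurements.  `image` is a 2-D list of gray levels tall
    enough to cover every sampled line.
    """
    profile = []
    line = start_line
    for _ in range(samples):
        window = image[line][pixel_start:pixel_start + pixel_count]
        profile.append(sum(window) / pixel_count)
        line += line_step
    return profile
```

A flat profile indicates even top-to-bottom lighting; a sloped or bowed profile tells maintenance personnel which direction to adjust the light assembly.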
With this structure, both a real-world calibration of the inspection system's characteristics and the acquisition of empirical data representative of acceptable specimens are facilitated. This allows for accurate inspections to be completed at a rapid rate. It also allows for easy modification to accommodate various specimens and specimen characteristics.
This invention has been described with reference to the preferred embodiment. Obviously, modifications and alterations will occur to others upon a reading and understanding of the specification. It is intended that all such modifications and alterations be included insofar as they come within the scope of the appended claims or the equivalents thereof.