US4906099A - Methods and apparatus for optical product inspection - Google Patents


Info

Publication number
US4906099A
US4906099A US07/115,428
Authority
US
United States
Prior art keywords
object plane
image
product
predetermined
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US07/115,428
Other languages
English (en)
Inventor
David P. Casasent
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Philip Morris USA Inc
Original Assignee
Philip Morris USA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Philip Morris USA Inc filed Critical Philip Morris USA Inc
Priority to US07/115,428 priority Critical patent/US4906099A/en
Assigned to PHILIP MORRIS INCORPORATED, A CORP. OF VA reassignment PHILIP MORRIS INCORPORATED, A CORP. OF VA ASSIGNMENT OF ASSIGNORS INTEREST. Assignors: CASASENT, DAVID P.
Priority to BR8805591A priority patent/BR8805591A/pt
Priority to JP63271922A priority patent/JPH01143946A/ja
Priority to AU24436/88A priority patent/AU622874B2/en
Priority to EP19880310213 priority patent/EP0314521A3/de
Priority to CA000581756A priority patent/CA1302542C/en
Application granted granted Critical
Publication of US4906099A publication Critical patent/US4906099A/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/88Image or video recognition using optical means, e.g. reference filters, holographic masks, frequency domain filters or spatial domain filters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N2021/845Objects on a conveyor
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Definitions

  • This invention relates to methods and apparatus for optically inspecting products, and more particularly to methods and apparatus for determining whether or not a product has predetermined optically detectable characteristics.
  • a one-dimensional image of the object plane is formed by integrating a two-dimensional image of the object plane parallel to the predetermined orientation so that each straight line segment having the predetermined orientation in the object plane is focused to a point in the one-dimensional image.
  • the image intensity of the point in the one-dimensional image is proportional to such parameters as the length and image intensity of the line segment.
  • the one-dimensional image is monitored to determine whether or not it includes a point having the image intensity that would result if the object plane included the desired straight line segment at the predetermined orientation.
  • the product is then indicated to have the predetermined optical characteristics depending on whether or not the one-dimensional image is found to include a point having the foregoing image intensity. If desired, additional optical characteristic criteria (e.g., line location) can be included by requiring the resulting point in the one-dimensional image to be at a particular location in that image. If the product has two or more optically detectable, parallel, straight line segments, the spacing between the resulting points in the one-dimensional image can also be used to determine whether or not the product has the appropriately spaced, parallel, straight line segments.
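The integration that forms the one-dimensional image can be sketched numerically. In this hypothetical numpy example (array sizes and line positions are illustrative, not taken from the patent), summing a two-dimensional object plane along one direction collapses each straight line segment perpendicular to the image axis into a single high-intensity point, while a line parallel to the image axis spreads into a low-amplitude background:

```python
import numpy as np

# Hypothetical 12x12 binary "object plane": a vertical line segment in
# column 4 (rows 2-9) and a horizontal line segment in row 6 (cols 1-10).
plane = np.zeros((12, 12))
plane[2:10, 4] = 1.0   # vertical line, length 8
plane[6, 1:11] = 1.0   # horizontal line, length 10

# A cylindrical lens with a horizontal longitudinal axis integrates the
# object plane vertically: each column collapses to one point of the
# one-dimensional image.
image_1d = plane.sum(axis=0)

# The vertical line focuses to a single high-intensity spike whose
# height is proportional to its length; the horizontal line contributes
# only a low amplitude spread across the image.
print(image_1d)   # spike of 8 at index 4, background of 1 elsewhere
```

The spike height depends on both the length and the intensity of the line segment, exactly as described for point 48 in image 46.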
  • FIG. 1 is a simplified elevational view of a typical product to be inspected in accordance with the principles of this invention.
  • FIG. 2 is a perspective view of apparatus which can be used to form a one-dimensional image of a two-dimensional object.
  • FIG. 3 is a plot of image intensity for a particular one-dimensional image formed by the apparatus of FIG. 2.
  • FIG. 4A is a perspective view of alternative apparatus for forming one-dimensional images of a two-dimensional object.
  • FIG. 4B is a side view of another alternative apparatus for forming one-dimensional images of a two-dimensional object.
  • FIG. 4C is a side view showing a possible modification of the apparatus of FIG. 4A.
  • FIG. 5 is a plot of image intensity for a particular one-dimensional image of the product of FIG. 1.
  • FIGS. 6 and 7 are similar to FIG. 5 but for other one-dimensional images of the product of FIG. 1.
  • FIG. 8 is a perspective view of still other alternative apparatus for forming one-dimensional images of two-dimensional objects.
  • FIG. 9 is a simplified schematic block diagram of illustrative apparatus constructed in accordance with this invention for inspecting products, and for separating products found to be unacceptable from those found to be acceptable.
  • FIG. 10 is a more detailed schematic block diagram showing how a portion of the apparatus of FIG. 9 may be constructed.
  • FIG. 11 is a simplified schematic block diagram showing an alternative embodiment of a portion of the apparatus of FIG. 9.
  • FIG. 12 is a block diagram showing how a portion of the apparatus of FIGS. 9-11 (and other subsequent FIGS.) may operate to analyze one-dimensional image data.
  • FIG. 13 is a schematic block diagram further showing how a portion of the apparatus of FIGS. 9-11 (and other subsequent FIGS.) may operate to analyze one-dimensional image data.
  • FIGS. 14A-C and 15-19 are simplified schematic block diagrams showing other alternative embodiments of portions of the apparatus of FIG. 9.
  • FIGS. 20 and 21 are diagrams which are useful in explaining the mathematical principles employed in the practice of this invention.
  • FIG. 22 is a depiction of the three-dimensional transformation (i.e., the Hough transform) of the graphic information contained in FIGS. 20 and 21 in accordance with the mathematical principles employed in the practice of this invention.
  • FIG. 23 is a block diagram of apparatus which can be used to produce data of the type shown in FIG. 22.
  • FIG. 24 is a block diagram of apparatus which can be used to process data of the type shown in FIG. 22.
  • FIG. 25 is a simplified side view of apparatus which can be used to form a Fourier transform of Hough transform data.
  • because determining product acceptability is representative of many of the possible applications of this invention, many of the embodiments of the invention will be described herein in that context. It is to be understood, however, that determining product acceptability is merely illustrative of the possible uses of the invention, and that the invention is not limited to making product acceptability determinations. Similarly, although the invention is applicable to inspecting many other types of products, the invention will be fully understood from an explanation of its application to inspecting cigarette packages.
  • the front of an illustrative cigarette package 10 (with all text--except for a few arbitrary product name letters--deleted) is shown in FIG. 1.
  • Package 10 includes some printed ornamentation (e.g., lines 12a and 12b), a tax stamp 14 partly visible in FIG. 1 and additionally extending over the top of the package, and a tear strip 16 for assisting the consumer in removing the transparent outer cellophane wrapper.
  • although FIG. 1 shows a perfect (and therefore acceptable) cigarette package, many types of defects can occur to render the package unacceptable.
  • the package could be folded improperly so that the printing (including the letters NWQVI and ornamentation lines 12) might not have the proper position or orientation.
  • Tax stamp 14 could be missing entirely, or if present, could be skewed or off-center (e.g., to the left or right, up or down as viewed in FIG. 1). Tear strip 16 could also be missing or mislocated.
  • FIG. 2 shows one illustrative embodiment of one component part of this invention which addresses this problem.
  • element 30 is a two-dimensional object plane (such as the front of cigarette package 10 or a two-dimensional image of the front of cigarette package 10).
  • object plane 30 is assumed to include (in addition to its peripheral edges) two optically detectable straight line segments 32 and 34.
  • Straight line segment 32 is vertical, while straight line segment 34 is horizontal.
  • Collimated light from object plane 30 (indicated by representative light rays 40a-d) is applied to conventional cylindrical lens 42 (e.g., a lens which is planar on the side facing object plane 30 and semi-cylindrical on the other side).
  • Cylindrical lens 42 has a longitudinal axis 44 which is horizontal. Accordingly, cylindrical lens 42 focuses all of the light from object plane 30 into a single horizontal line 46 which is therefore a particular one-dimensional image of the object plane.
  • the longitudinal axis 44 of cylindrical lens 42 is parallel to horizontal line 34, and therefore perpendicular to vertical line 32.
  • the longitudinal axis of one-dimensional image 46 is also parallel to horizontal line 34 and perpendicular to vertical line 32.
  • the optical energy or information associated with or attributable to horizontal line 34 is distributed along image 46, whereas the optical energy or information associated with or attributable to vertical line 32 is all concentrated at a single point 48 in image 46.
  • the word point is used to mean a small area within which further dimensional analysis is not needed and is therefore irrelevant.
  • the word point is used herein in somewhat the way that it is used in mathematics, except that (because physical as opposed to purely mathematical phenomena are involved) a point herein has a small but finite area.
  • a plot of the intensity of image 46 would be as shown in FIG. 3.
  • the relatively low amplitude portion 50 of the curve in FIG. 3 is due to horizontal line segment 34.
  • the highly concentrated, high-amplitude spike 52 is due to vertical line segment 32.
  • the location of spike 52 is determined by the left-right location of vertical line segment 32. (Note that the vertical location of line segment 32 is irrelevant.)
  • the height of spike 52 is a function of (1) the length of line segment 32 and (2) the optical intensity of line segment 32.
  • in FIG. 4A a conventional spherical lens 54 has been added to the right of one-dimensional image 46.
  • the effect of spherical lens 54 is to cause formation of a second one-dimensional image 56 of two-dimensional object plane 30.
  • the longitudinal axis of one-dimensional image 56 is perpendicular to the longitudinal axis of one-dimensional image 46.
  • in image 56, the effect of vertical line segment 32 is dispersed, while the effect of horizontal line segment 34 is concentrated at a single point 58.
  • a plot of the intensity of image 56 would be similar to FIG. 3, except that the low-amplitude portion 50 would be due to vertical line segment 32, and the spike 52 would be due to horizontal line segment 34.
  • another way to form two orthogonal one-dimensional images of object plane 30 is to reverse the order of lenses 42 and 54, and place them closer together, as shown in FIG. 4B.
  • This causes both one-dimensional images to form to the right of the right-most lens.
  • the alternative of FIG. 4B tends to produce one-dimensional images of different sizes (i.e., it reduces the size of horizontal one-dimensional image 46 which forms closest to cylindrical lens 42, and increases the size of vertical one-dimensional image 56 which forms farther from the cylindrical lens).
  • This may be undesirable if both one-dimensional images are to be used, because it means that different size detectors must be used for each one-dimensional image.
  • the alternative of FIG. 4B has been found to produce better one-dimensional images with low-cost lenses than the arrangement shown in FIG. 4A.
  • the alternative of FIG. 4B can also be made shorter than FIG. 4A, which may make it easier and cheaper to manufacture.
  • although FIGS. 2, 4A, and 4B show some of the possible lens systems which can be used to form one-dimensional images, those skilled in the art will appreciate that there are many other lens systems (e.g., combinations of two or three cylindrical lenses) which can be used for this purpose.
  • FIG. 4C shows how a beam splitter 60 can be included between lenses 42 and 54 in the embodiment of FIG. 4A so that a detector can be located to detect horizontal one-dimensional image 46 without interfering with the formation and detection of vertical one-dimensional image 56.
  • Part of the light from lens 42 is reflected off the inclined surface of beam splitter 60 to cause formation of horizontal one-dimensional image 46 at a location off the optical axis 62 of the lens system.
  • the remainder of the light from lens 42 continues on to spherical lens 54 which forms vertical one-dimensional image 56 as described above.
  • a similar beam splitting arrangement could be used in FIG. 4B (e.g., by placing the beam splitter 60 between cylindrical lens 42 and the depicted location of horizontal one-dimensional image 46).
  • apparatus of the type shown in FIGS. 2 and 4C is useful in many product inspection applications because in many such applications, a product can be adequately inspected from just one angle using apparatus of the type shown in FIG. 2, or from just two orthogonal angles (e.g., parallel to the horizontal and vertical edges of the product) using apparatus of the type shown in FIG. 4C.
  • if line segment 32 were not exactly vertical, spike 52 in FIG. 3 would be shorter and wider.
  • the height and width of spike 52 can therefore be used to determine whether or not line segment 32 is properly aligned perpendicular to the longitudinal axis of one-dimensional image 46. If spike 52 is found not to be sufficiently high and/or sufficiently narrow, the product associated with object plane 30 can be rejected as unsatisfactory.
  • the height of spike 52 is also directly proportional to the length of line segment 32 and to the optical intensity of that line segment.
  • spike 52 has the desired narrowness, if spike 52 is too high or too low, the product associated with object plane 30 can be rejected as unsatisfactory. For example, spike 52 might be too high because line segment 32 was smeared in the vertical direction. Or spike 52 might be too low because line segment 32 was broken or not printed with sufficient intensity.
  • the left-right location of line segment 32 determines the left-right location of spike 52.
  • yet another test for product acceptability can be based on the left-right location of spike 52. If spike 52 is not found at the desired location, then either line segment 32 is not present at all, or it has been shifted to the left or right. In either case, the product associated with object plane 30 can be rejected on that basis.
  • the above-described shifting of spike 52 might be due to improper positioning of the product on the conveyor, rather than to a defect in the product.
  • if an acceptable product has two or more parallel vertical lines (such as a left or right edge and a vertical line at a predetermined desired horizontal distance from that edge), the spacing between the peaks in one-dimensional image 46 can be used to determine product acceptability independent of possible translations of the product parallel to object plane 30.
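The acceptance tests just described (spike height, width, left-right location, and peak spacing) can be sketched as follows. All tolerances, positions, and array values below are hypothetical, chosen only to illustrate the tests; they are not values from the patent:

```python
import numpy as np

def check_spike(image_1d, expected_pos, expected_height,
                pos_tol=1, height_tol=0.2, max_width=2):
    # Locate the tallest point of the one-dimensional image.
    pos = int(np.argmax(image_1d))
    height = float(image_1d[pos])
    # Width: samples above half the peak height (a tilted or
    # horizontally smeared line produces a shorter, broader spike).
    width = int(np.sum(image_1d > height / 2))
    if abs(pos - expected_pos) > pos_tol:
        return False   # line missing or shifted left/right
    if abs(height - expected_height) > height_tol * expected_height:
        return False   # line too faint, broken, or vertically smeared
    if width > max_width:
        return False   # line tilted or horizontally smeared
    return True

def peak_spacing(image_1d, threshold):
    # Distance between the outermost samples above threshold; this
    # spacing is unchanged when the whole product is translated.
    peaks = np.flatnonzero(np.asarray(image_1d) > threshold)
    return int(peaks[-1] - peaks[0])

# Hypothetical image with spikes from two parallel vertical lines.
img = np.array([0., 0., 8., 1., 1., 1., 1., 1., 1., 8., 0., 0.])
ok = check_spike(img, expected_pos=2, expected_height=8.0)
spacing = peak_spacing(img, threshold=5.0)
shifted_spacing = peak_spacing(np.roll(img, 1), threshold=5.0)
```

Here `spacing` and `shifted_spacing` are equal even though the product was translated by one sample, which is the translation-invariance property noted above.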
  • FIG. 5 shows the one-dimensional image 46 that results from substituting the pattern of FIG. 1 in the apparatus of FIG. 2.
  • the vertical side edges of the package of FIG. 1 are vertical in FIG. 2 (i.e., the one-dimensional image of FIG. 5 is parallel to axis 5--5 in FIG. 1).
  • Spikes 72 and 74 are respectively produced by the left- and right-hand vertical edges of package 10. These spikes are very high because the vertical edges of the package are relatively long.
  • Spikes 76 and 78 are respectively produced by the left- and right-hand vertical portions of the letter N printed on the front of package 10.
  • Spikes 80 and 82 are respectively produced by the left- and right-hand edges of tax stamp 14.
  • Spike 84 is produced by the vertical portion of the printed letter I.
  • the remainder of the image intensity plotted in FIG. 5 has relatively low amplitude because there are no other vertical line segments to cause prominent peaks to form in it.
  • peaks 76, 78, and/or 84 can be used to determine such factors as: (1) whether or not the lettering is printed with sufficient intensity (these peaks will tend to be too short if the printing is not sufficiently intense); (2) whether or not the lettering is smeared in the vertical direction (these peaks will tend to be too high if the printing is smeared vertically); (3) whether or not the printing is shifted left or right (these peaks will be correspondingly shifted left or right); (4) whether or not the printing is improperly tilted (these peaks will tend to be shorter and broader if the printing is tilted either way); (5) whether or not the printing is smeared horizontally (these peaks will tend to be broader if the printing is smeared horizontally); and (6) whether or not the printing is present (these peaks will be absent if the printing is absent).
  • peaks 80 and/or 82 can be used to determine such factors as: (1) whether or not tax stamp 14 is present (these peaks will be missing if tax stamp 14 is absent); (2) whether or not tax stamp 14 is shifted left or right (these peaks will be correspondingly shifted left or right); (3) whether or not the appropriate amount of tax stamp 14 is showing on the front of package 10 (these peaks will be too high if too much of tax stamp 14 is visible, and too low if too little of tax stamp 14 is visible); (4) whether or not a corner of tax stamp 14 has been folded up or under (these peaks will not be of equal height if one corner of tax stamp 14 has been folded up or under); and (5) whether or not tax stamp 14 is crooked on the package (these peaks will be shorter and broader if the tax stamp is crooked).
  • the presence of other unintended peaks or amplitude values may indicate other defects in package 10.
  • for example, another peak may indicate a smudge or smear.
  • one-dimensional image 56 (FIG. 4A) is perpendicular to image 46 and is therefore parallel to axis 6--6 in FIG. 1.
  • Another way to form a one-dimensional image having this orientation is to rotate object plane 30 or cylindrical lens 42 by 90° about axis 62 in FIG. 2.
  • a plot of the intensity of the resulting one-dimensional image is shown in FIG. 6.
  • peaks 92 and 94 are respectively produced by the bottom and top edges of package 10. Region 96 of slightly elevated amplitude is produced by printed letters NWQVI. Peak 98 is produced by the lower edge of tax stamp 14. Peaks 100 and 102 are respectively produced by the lower and upper edges of tear strip 16.
  • peak 98 can be used to determine such factors as: (1) whether or not tax stamp 14 is present (peak 98 will be absent if tax stamp 14 is absent); (2) whether or not tax stamp 14 is crooked (peak 98 will be shorter and broader if tax stamp 14 is rotated either way); and (3) whether the proper amount of tax stamp 14 is showing on the front of package 10 (peak 98 will be shifted left or right if tax stamp 14 is shifted down or up).
  • peaks 100 and 102 can be used to determine such factors as: (1) whether or not tear strip 16 is present (peaks 100 and 102 will be absent if tear strip 16 is absent); (2) whether or not tear strip 16 is crooked (peaks 100 and 102 will be shorter and broader if tear strip 16 is inclined in either direction rather than truly horizontal); and (3) whether or not tear strip 16 is properly located (peaks 100 and 102 will be shifted left or right if tear strip 16 is shifted down or up).
  • the left-right location of elevated amplitude portion 96 can be used to determine whether or not letters NWQVI are at the proper vertical location on package 10.
  • another useful orientation for a one-dimensional image of package 10 is parallel to axis 7--7 in FIG. 1.
  • This axis is perpendicular to ornamentation line 12b.
  • Such an image can be formed by rotating object plane 30 or cylindrical lens 42 about axis 62 until axis 44 is parallel to axis 7--7.
  • a plot of the image intensity of the resulting one-dimensional image is shown in FIG. 7.
  • Spike 112 is due to ornamentation line 12b. No other prominent peak appears in FIG. 7 because ornamentation line 12b is the only significant straight line segment perpendicular to axis 7--7. Accordingly, the presence or absence of spike 112 indicates the presence or absence of line 12b.
  • the height and width of spike 112 indicate whether or not line 12b is truly perpendicular to axis 7--7, and its height further indicates the intensity of line 12b, whether or not line 12b is broken or smeared longitudinally, etc.
  • peaks 72 and 74 are due to the vertical edges of the product, and peaks 92 and 94 are due to the horizontal edges of the product.
  • edges can often be made the most prominent features in the one-dimensional image.
  • one way to change the orientation of one-dimensional images such as image 46 in FIG. 2 is to rotate object plane 30 or cylindrical lens 42 about axis 62.
  • Another way to accomplish this is to include a rotatable, conventional Dove prism 120 between object plane 30 and cylindrical lens 42 as shown in FIG. 8. (If desired, a spherical lens like spherical lens 54 in FIG. 4B can be included between Dove prism 120 and cylindrical lens 42 in FIG. 8.)
  • the image applied to the left-hand inclined face of Dove prism 120 is refracted down to the bottom surface of the prism. From there it is reflected up to the right-hand inclined face and refracted out again to continue along axis 62.
  • the effect of Dove prism 120 in the position shown in FIG. 8 is to invert the image from top to bottom but not from left to right.
  • if Dove prism 120 is now rotated by an angle θ about axis 62, the output image of the prism rotates by an angle 2θ.
  • One-dimensional image 46 stays in the same physical location because lens 42 has not moved, but the effect on the information contained in image 46 is the same as the effect of rotating cylindrical lens 42 about axis 62 by an angle 2θ.
  • rotation of Dove prism 120 allows formation of a one-dimensional image of object plane 30 parallel to any axis of object plane 30. All of these images are focused along the same line 46 because lens 42 does not move. This greatly simplifies detecting the one-dimensional image intensity information at various orientations because the detectors can be stationary (e.g., at the location of line 46).
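The stationary-detector property can be simulated numerically: rotating the input image before a fixed projection stands in for rotating the Dove prism, and every one-dimensional image lands on the same detector line. This sketch uses hypothetical array data and, to avoid interpolation, only prism angles whose doubled image rotation is a multiple of 90° (arbitrary angles would need an interpolating rotation such as `scipy.ndimage.rotate`):

```python
import numpy as np

def one_dim_image(plane, prism_angle_deg):
    # The "lens" (sum along axis 0) and the "detector" (the returned
    # 1-D array) stay fixed; only the input image rotates, as with a
    # rotating Dove prism. The image rotates by twice the prism angle.
    image_rotation = 2 * prism_angle_deg
    k = (image_rotation // 90) % 4
    return np.rot90(plane, k).sum(axis=0)

# Hypothetical 8x8 object plane with one vertical line segment.
plane = np.zeros((8, 8))
plane[1:7, 3] = 1.0

vertical_proj = one_dim_image(plane, 0)     # spike from the vertical line
horizontal_proj = one_dim_image(plane, 45)  # 90-degree image rotation:
                                            # the line is now parallel to
                                            # the image and spreads out
```

In `vertical_proj` the line focuses to a spike of height 6 on the fixed detector line; after a 45° prism rotation the same line disperses into a flat background, so the same stationary detector sees both orientations.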
  • FIG. 9 shows illustrative product inspection apparatus constructed in accordance with the principles of this invention.
  • Packer 150 (which may be conventional or largely conventional) produces products 10 (e.g., the cigarette packages shown in FIG. 1), which are conveyed one after another from packer 150 by conveyor 152.
  • packer 150 is either controlled by processor 154, or packer 150 produces a periodic signal monitored by processor 154.
  • a light source 130 and photodetector 132 can be used to produce signals applied to processor 154 to keep the optical system synchronized with the movement of products 10 along conveyor 152.
  • Camera 156 may be a conventional television or charge coupled device (“CCD”) camera.
  • Conventional stroboscopic product illumination techniques may be used to help camera 156 effectively "freeze" the image of each product 10 even though the products are moving continuously along conveyor 152.
  • a CCD camera is preferable if stroboscopic illumination of product 10 is employed.
  • the image seen by camera 156 is displayed by conventional display 30 (e.g., a liquid crystal display (“LCD”) type video screen).
  • camera 156 and/or display 30 include conventional means for allowing display 30 to produce a sustained, substantially continuous, fixed image of each product 10 until elements 156 and/or 30 are reset by processor 154 (e.g., when another product is to be imaged by camera 156).
  • processor 154 causes prism control 122 to rotate Dove prism 120 about axis 62 to the position required to produce at the location of detector 160 a one-dimensional image of the two-dimensional image on display 30 having a first desired angular orientation.
  • Prism control 122 may be a conventional servo-type motor arranged to rotate Dove prism 120 by any suitable mechanism such as spur gears, timing chains or belts, etc.
  • processor 154 causes laser 158 (or any other preferably collimated light source) to briefly illuminate that image.
  • This projects the two-dimensional image on display 30 through Dove prism 120 and cylindrical lens 42 to produce a one-dimensional image having a longitudinal axis perpendicular to the plane of the paper in FIG. 9 at the location of detector 160.
  • laser 158 can be on continuously and detector 160 switched on at the proper time.
  • detector 160 comprises (as shown in FIG. 10) a plurality of photo-electric detectors 162a-n arranged side-by-side along the longitudinal axis 46 of the one-dimensional image.
  • Each photodetector 162 receives a small longitudinal segment of the one-dimensional image and produces an electrical output signal proportional to the amount of light contained in that segment.
  • the output signals of all of photodetectors 162 are applied to processor 154 for processing (as described in more detail below) to determine whether or not the one-dimensional image has the characteristics that would result if the product 10 being inspected were an acceptable product. On that basis, processor 154 records whether or not the product 10 being inspected is acceptable. (Summation device 190 and integrators 192 are not used in this embodiment.)
  • although FIG. 10 shows parallel read-out of photodetectors 162, the read-out could be serial if desired.
  • processor 154 causes prism control 122 to rotate the Dove prism to the angular position required to produce the second desired one-dimensional image of product 10.
  • the product image is still on display 30.
  • the desired second one-dimensional image can be produced at detector 160 by an appropriately directed 30° rotation of Dove prism 120 about axis 62.
  • processor 154 causes laser 158 to again briefly illuminate the image on display 30. This again projects the two-dimensional product image through elements 120 and 42 to produce the second one-dimensional image at detector 160. Once again, detector 160 produces electrical output signals proportional to the image intensity at each individual photodetector 162. These signals are processed by processor 154 to determine whether or not the second one-dimensional image has the characteristics of the corresponding image for an acceptable product. Again, processor 154 records whether or not the product being inspected is acceptable on that basis.
  • if yet another one-dimensional image of product 10 is required (e.g., a third image having an optical orientation parallel to axis 6--6 in FIG. 1), the prism is rotated a further 15°.
  • laser 158 again momentarily illuminates the image on screen 30 to produce the third one-dimensional image at detector 160.
  • the resulting detector 160 output signals are again processed by processor 154 to determine whether or not the third one-dimensional image acceptably agrees with the corresponding image for an acceptable product.
  • Processor 154 again records the results of this test of the acceptability of product 10.
  • processor 154 readies the remainder of the apparatus to inspect the next product 10 when it is opposite camera 156 on conveyor 152.
  • as each product 10 reaches conveyor portion 152S, processor 154 controls the position of that portion of the conveyor so that product 10 is directed either to conveyor portion 152A (for accepted) if processor 154 found that all three one-dimensional images of product 10 were sufficiently as expected, or to conveyor portion 152R (for rejected) if processor 154 found that any one or more of those one-dimensional images was deficient in any respect.
  • products which have been found to be acceptable are identified in FIG. 9 as 10A; products which have been found unacceptable, and which have therefore been rejected, are identified in FIG. 9 as 10R.
  • Conveyor portion 152S, which directs accepted products 10A in one direction and rejected products 10R in another direction, may be any type of controllable product diverter such as a switch in the conveyor mechanism, a mechanical finger selectively interposable in the conveyor path, a controllable air blast for blowing rejected products 10R off the main conveyor path, etc.
  • element 152S may not physically separate accepted products 10A from rejected products 10R, but may instead merely mark or otherwise identify or indicate which products are acceptable and which are not.
  • for example, device 152S may spray a spot 10S of ink or paint on unacceptable products 10R in order to make possible subsequent identification and/or separation of accepted and rejected products (e.g., by human or robotic product handlers).
  • processor 154 can feed back signals for controlling packer 150 when unacceptable products are detected. For example, processor 154 could stop packer 150 when unacceptable products are detected. Alternatively, processor 154 could produce signals for causing appropriate adjustment of packer 150 (e.g., to cause a new roll of tax stamps or tear tape to be started if the tax stamp or tear tape is found to be missing from several successive products, to change the intensity of printing if printed features are found to be too light or too dark, etc.).
  • There are many ways in which processor 154 can process the output of detector 160 to determine product acceptability. For example, the value of each photodetector 162 output signal for each one-dimensional image of an acceptable product could be stored in processor 154 as reference image data. Then as the data for each one-dimensional image is received from detector 160, processor 154 subtracts all or selected portions of the appropriate reference image data from the corresponding received data. If the absolute value of any of the resulting differences exceeds a predetermined threshold value, the product can be rejected as unacceptable. The foregoing is illustrated in FIG. 12.
  • If the product image may be shifted relative to the reference (e.g., because of variation in product position), the resulting data can be effectively shifted to correct for such product shifts before the above-described subtraction is performed. For example, if an edge of the product always produces a large peak, the data can be shifted until that peak is correlated or aligned with the corresponding large peak in the reference image data, and then the reference data can be subtracted from the received data and further processed as described above. Alternatively, the entire detector output data stream can be correlated with the reference image data to determine the best overall match between the detector output and the reference data. For example, in FIG. 13 the reference data is stored in register 154d, and the detector data is shifted from left to right through shift register or tapped delay line 154e.
  • Each output of register 154d is multiplied by a corresponding output of shift register or tapped delay line 154e, and the resulting products are summed by summation device 154f.
  • the output of summation device 154f will be at a maximum when the data in shift register or tapped delay line 154e is best correlated with the reference data. Once the amount of shift required to produce this maximum is known, the appropriately shifted detector output data can be processed as shown in FIG. 12 and described above to determine whether or not any of the detector output data deviates unacceptably from the corresponding reference data.
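In software terms, the FIG. 12/13 processing can be sketched as follows; this is a minimal sketch assuming cyclic shifts and an arbitrary threshold, and the function name is illustrative, not from the patent:

```python
import numpy as np

def accept_product(detected, reference, threshold):
    """Illustrative sketch of the FIG. 12/13 processing: align the detected
    one-dimensional image with the reference at the best-correlating shift
    (the FIG. 13 correlator), then subtract and threshold (FIG. 12)."""
    n = len(detected)
    # Correlation search: the shift maximizing the inner product (summation 154f).
    best = max(range(n), key=lambda s: np.dot(np.roll(detected, s), reference))
    aligned = np.roll(detected, best)
    # Reject if any absolute difference exceeds the predetermined threshold.
    return bool(np.all(np.abs(aligned - reference) <= threshold))
```

A shifted but otherwise identical image is accepted; one with an extra or missing peak is rejected.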
  • FIG. 14A shows a way in which portions of the optical system of FIG. 9 can be modified to produce several one-dimensional images in parallel.
  • computer-generated hologram (“CGH") 170 takes the place of rotating Dove prism 120 and cylindrical lens 42.
  • CGH 170 is a well-known type of device which employs a pattern of data on an otherwise transparent, usually planar medium to locally modulate and spatially deflect different portions of the input light waves passing through the medium.
  • the pattern of data is determined by well-known computerized algorithms, and can be selected to cause the CGH to act as any of a wide variety of optical devices or systems.
  • the image of product 10 on display 30 is momentarily illuminated by laser 158 as described above in connection with FIG. 9.
  • CGH 170 causes the light from display 30 to simultaneously form three different one-dimensional images of the display image.
  • this CGH acts like three differently rotated cylindrical lenses.
  • Each of these one-dimensional images has the optical orientation of a respective one of the three one-dimensional images described above in connection with FIG. 9, and each is formed at the location of a respective one of detectors 160a-c. Because of the design flexibility of CGHs, these one-dimensional images can have any of a wide range of locations and orientations.
  • Each of detectors 160a-c may be similar to detector 160 in FIG. 9 (e.g., made up of a plurality of small detector components 162 as shown in FIG. 10).
  • the output signals of all of detectors 160a-c are applied to processor 154 for processing as described above in connection with FIG. 9, etc.
  • the embodiment of FIG. 14A may be similar to the embodiments of FIGS. 9 or 11.
  • the CGH embodiment of FIG. 14A may have certain advantages over the lens embodiment of FIG. 9. It does not require any moving mechanical parts (e.g., a rotating Dove prism) and may therefore be cheaper to build, operate, and maintain. It is also faster because all three one-dimensional images are formed simultaneously, and as a result, the data for all three images are available simultaneously. If processor 154 has three parallel data processing channels, all three one-dimensional images can be analyzed simultaneously.
  • the lens embodiment of FIG. 9 has the advantage of flexibility in that any number of one-dimensional images having any desired optical orientations can be formed merely by appropriate rotation of Dove prism 120.
  • CGH 170 must be changed for different products that require different one-dimensional images at different angles.
  • FIG. 9 also has a cost advantage in that only one detector 160 is required.
  • the embodiment shown in FIG. 14B may in some cases be more economical than the embodiment shown in FIG. 14A.
  • the two-dimensional input image from display 30 is imaged onto deflector 174 by conventional imaging optics 172.
  • Deflector 174 (e.g., a mirror pivoted by a galvanometer controlled by processor 154) deflects the light from imaging system 172 to a selected one or a selected sequence of CGHs 170a, b, c, each of which forms a predetermined one-dimensional image applied to detector 160 by conventional imaging optics 176.
  • the output signal or signals of detector 160 are applied to processor 154 as described above in connection with FIGS. 9 and 10.
  • any one of several one-dimensional images can be selected by appropriate positioning of deflector 174.
  • any desired sequence of one-dimensional images can be selected by appropriately sequencing the position of deflector 174.
  • the set of CGHs 170 could be large enough to satisfy several different product inspection applications, so that the apparatus could be adapted for each different application merely by causing processor 154 to control deflector 174 as required for the current application.
  • FIG. 14C shows another possible embodiment.
  • each CGH 170 produces several one-dimensional images, each of which is focused on a respective row (perpendicular to the plane of the paper in FIG. 14C) of two-dimensional detector array 178 (e.g., a conventional video camera).
  • the product inspection problem being addressed determines which horizontal lines (i.e., one-dimensional images) from detector array 178 are used by processor 154.
  • FIG. 14B uses a deflector 174 and one linear detector 160, while FIG. 14C uses a two-dimensional detector array 178 and can therefore provide up to N1 × N2 one-dimensional images, where N1 is the number of CGHs and N2 is the number of one-dimensional images produced by each CGH.
  • Cost and the requirements of the product inspection application will determine which of these systems is best in a particular application.
  • Another way to form a two-dimensional image of product 10 is shown in FIG. 15.
  • processor 154 causes light sources 180 to momentarily illuminate the product.
  • Light sources 180 can be conventional incoherent light sources such as strobe lights.
  • Lens system 182 focuses the light reflected from product 10 (i.e., a two-dimensional image of product 10) on the input surface of spatial light modulator ("SLM") 184.
  • SLM 184 is a conventional device which reflects light from its output surface only at points opposite those points on its input surface which are illuminated by light from lens system 182.
  • Alternatively, SLM 184 can be made to reflect light from its output surface only opposite points on its input surface which are not illuminated by light from lens system 182.
  • processor 154 also activates light source 188 (e.g., a laser, light emitting diode, etc.).
  • Coherent light from light source 188 passes through conventional beam splitter 186 and impinges on the output surface of SLM 184. This light is reflected back to beam splitter 186 only where the output surface of SLM 184 is reflective.
  • the light returned to beam splitter 186 is reflected off to Dove prism 120 (if the remainder of the apparatus is as shown in FIG. 9) or to CGH 170 (if the remainder of the apparatus is as shown in FIG. 14A).
  • the apparatus of FIG. 15 eliminates the need for camera 156 (which typically requires time for image scanning) and allows the entire image of the product to be formed in parallel.
  • SLM 184 and light source 188 can be omitted, but in that event imaging system 182 must ensure that nearly parallel (i.e., collimated) light enters Dove prism 120.
  • the Dove prism and one-dimensional integrating optics produce the necessary one-dimensional image even with polychromatic (i.e., multiple wavelength) non-coherent light.
  • Light sources 180 are typically (although not necessarily) polychromatic to ensure that different lines (on product 10) in different colors are all input with no attenuation.
  • One reason for including SLM 184 and light source 188 would be to provide an amplification of the light level entering the Dove prism system.
  • a CGH system (as in FIG. 14A) requires coherent light, so that in that case, light source 188 must be a laser.
  • FIG. 16 shows yet another way in which several one-dimensional images of a product can be formed.
  • products 10 emerge from packer 150 with an upright orientation and are conveyed to the right by conveyor 152. While still upright on conveyor portion 152-1, each product 10 passes detector 160a which may be generally similar to detector 160 in FIG. 10 with the longitudinal axis 46 of the detector perpendicular to the axis of motion of conveyor 152.
  • Each photodetector 162 in detector 160a receives light from a narrow horizontal slice of the product 10 passing the detector and produces an output signal proportional to the integral of the light received from that slice.
  • Processor 154 controls the integration time of detector 160a based on system synchronization and/or product location detection techniques similar to those described above in connection with FIG. 9.
  • the output signal (serial) or signals (parallel) of detector 160a after a product 10 has passed the detector are similar to the output signal or signals of detector 160 in FIG. 9 with Dove prism 120 at a particular angular orientation.
  • optically detectable horizontal lines on product 10 will cause peaks in the output signal or signals of detector 160a which can be used by processor 154 as described above in connection with FIG. 9, etc., to determine the acceptability of product 10.
  • After passing detector 160a, each product 10 enters conveyor portion 152-2, which reorients the product so that other optically detectable straight line segments on product 10 are parallel to the axis of motion of conveyor 152. While on conveyor portion 152-2, the reoriented product moves past detector 160b, which is similar to detector 160a in construction, orientation, and operation. Accordingly, the output signal or signals of detector 160b are representative of another one-dimensional image of product 10 and can be applied to processor 154 for processing as described above.
  • After passing detector 160b, each product 10 enters conveyor portion 152-3, which further reorients the product so that still other optically detectable straight line segments (in particular, initially vertical segments) are now horizontal and therefore parallel to the axis of motion of conveyor 152. While on conveyor portion 152-3, each product 10 moves past detector 160c, which is again similar in construction, orientation, and operation to detector 160a. Accordingly, detector 160c provides an output signal or signals (applied to processor 154) representative of a third one-dimensional image of each product 10.
  • If processor 154 finds that any of the one-dimensional images of a product 10 is unacceptable, processor 154 controls conveyor portion 152S to divert the unacceptable product 10R to conveyor portion 152R. Otherwise, processor 154 controls conveyor portion 152S to direct acceptable products 10A to conveyor portion 152A.
  • Although detectors 160 are stationary while products 10 move, those skilled in the art will appreciate that all that is required is appropriate relative motion between the detectors and the products, and that this can be accomplished with moving detectors and stationary products or with both the products and the detectors moving.
  • FIG. 17 shows yet another way in which one-dimensional images having any angular orientation θi can be produced without the need for moving parts (as in FIG. 9), replacement of CGHs (as in FIG. 14), or possibly complicated relative motions of the products and detectors (as in FIG. 16).
  • the apparatus of FIG. 17 allows a different angle of orientation for the one-dimensional image simply by selecting θi by means of θi selector 200, which can be a simple analog or digital selector (e.g., a rotary dial or a numeric keypad).
  • Underlying FIG. 17 is the principle that a one-dimensional image at angle θi is a mapping of the two-dimensional input image f(x,y) into a one-dimensional image fθi(p), where p is derived as follows:
  • p = x Cos θi + y Sin θi (1)
  • Equation (1) gives the coordinate p in the one-dimensional image to which the input image intensity value f(x,y) at input image coordinates x and y maps.
  • the x and y origin is assumed to be at the center of the image, and the range of θi is assumed to be from 0 to π.
  • p can be positive or negative.
  • The upper portion of the apparatus shown in FIG. 17 receives input signals x and y (which are typically ramp, sawtooth, or step-function signals of the type used to perform a raster scan in a video camera or other similar two-dimensional imaging device) and produces an output signal (applied to voltage controlled oscillator 212) proportional to the value of p given by Equation (1).
  • the θi output signal of selector 200 is applied to Cos θi generator 202 and to Sin θi generator 204.
  • These elements (which may be table look-ups in processor 154) produce constant output signals respectively proportional to the cosine and sine of the selected value of θi.
  • Multiplier 206 multiplies the output signal of Cos θi generator 202 by the x raster scan signal, while multiplier 208 multiplies the output signal of Sin θi generator 204 by the y raster scan signal.
  • the output signals of multipliers 206 and 208 are added together in adder 210 to produce the above-described signal proportional to p.
  • the lower portion of the apparatus shown in FIG. 17 uses the p output signal to actually map the input image intensity information f(x,y) into a one-dimensional output image detected by detector 160 (similar to the above-described detectors 160).
  • the input image intensity signal f(x,y), which may be produced by the above-mentioned raster scan of a video camera (e.g., camera 156 in FIG. 9) or another similar two-dimensional imaging device, modulates the output of light source 214 (e.g., a laser or light emitting diode).
  • deflector 216 (e.g., a conventional acousto-optical device) deflects the applied light by an amount proportional to the frequency of the signal applied to it by voltage controlled oscillator 212. That frequency is in turn proportional to the magnitude of the p output signal of adder 210.
  • the p and f(x,y) signals are synchronized with one another by virtue of the fact that the same x and y raster scan signals are used to generate both p and f(x,y). Accordingly, at the end of one complete raster scan of the two-dimensional input image, photodetectors 162 will have received and accumulated optical information representative of a one-dimensional image of the input image at angle ⁇ i. Detector 160 can then be read out and the resulting signal or signals processed as described above in connection with FIG. 9, etc.
  • the angle θi can be changed simply by using selector 200 to change θi. If several one-dimensional images are required simultaneously (as in FIG. 14A), the apparatus of FIG. 17 can be duplicated as many times as necessary to produce the required multiple one-dimensional images.
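A minimal software sketch of this mapping, assuming (as stated above) the origin at the image center; the bin count and rounding are illustrative choices, not from the patent:

```python
import numpy as np

def one_dim_image(f, theta_i):
    """Accumulate the two-dimensional image f into a one-dimensional image
    f_theta_i(p): each pixel is mapped to p = x*Cos(theta_i) + y*Sin(theta_i)
    (Equation (1)); p may be positive or negative."""
    rows, cols = f.shape
    n_bins = int(np.hypot(rows, cols)) + 3          # enough bins for bipolar p
    out = np.zeros(n_bins)
    for iy in range(rows):                          # y raster scan
        for ix in range(cols):                      # x raster scan
            x, y = ix - cols // 2, iy - rows // 2   # origin at image center
            p = x * np.cos(theta_i) + y * np.sin(theta_i)
            out[int(round(p)) + n_bins // 2] += f[iy, ix]  # detector accumulates
    return out

# A vertical line (x = 0) collapses to a single peak when theta_i = 0.
img = np.zeros((8, 8))
img[:, 4] = 1.0
proj = one_dim_image(img, 0.0)
```

All eight pixels of the line map to p = 0, so the entire line energy lands in one detector element.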
  • The upper portion of FIG. 17 can be implemented optically as shown in FIG. 18.
  • the apparatus of FIG. 18 has the capability of simultaneously generating a large number of values of p respectively associated with a corresponding number of values of ⁇ i.
  • FIG. 18 is equivalent to several duplications of the upper portion of FIG. 17.
  • each of elements 220 and 222 is a light source such as a laser or light emitting diode.
  • the brightness of each of these devices is controlled by the applied signal (e.g., Cos θ1 in the case of light source 220-1).
  • the various Cos and Sin signals applied to these devices are produced by signal generators similar to elements 202 and 204 in FIG. 17.
  • For example, θ1 may be selected to be 0°, in which case the signal applied to light source 220-1 (Cos θ1) has a relative value of 1, and the signal applied to light source 222-1 (Sin θ1) has a relative value of 0.
  • Similarly, θ2 may be selected to be 30°, in which case the signal applied to light source 220-2 (Cos θ2) has a relative value of 0.866, and the signal applied to light source 222-2 (Sin θ2) has a relative value of 0.5.
  • the input signals to light sources 220 and 222 can be biased to always be positive, and the bias can then be subtracted from the outputs of the system (i.e., from the outputs of detectors 230). Because the bias is known, correcting for it at the output is straightforward. Bipolar x and y data applied to modulator devices 226 can be similarly biased.
  • Optical system 224 (which can be made up of conventional optical elements such as lenses) focuses all of the light produced by light sources 220 on modulator 226x, and similarly focuses all of the light produced by light sources 222 on modulator 226y.
  • Each of devices 226 is similar to device 216 in FIG. 17, except that the frequency of the x and y raster scan signals respectively applied to devices 226x and 226y is constant while the amplitude of those signals varies. This multiplies the light input (Cos θi and Sin θi) by x and y respectively.
  • Optical system 228 (which can again be made up of conventional optical elements such as lenses) focuses the light from both of elements 226 vertically, and images light sources 220 and 222 horizontally onto a single row of detectors 230.
  • the light from light sources 220-1 and 222-1 is ultimately focused on detector 230-1.
  • the vertical integration performed by optical system 228 achieves the addition required by Equation (1).
  • the output signal of each detector 230 is therefore the p value (according to Equation (1)) for the associated value of θ.
  • the apparatus of FIG. 18 is capable of simultaneously computing n signals p, each of which can be applied to apparatus of the type shown in the lower portion of FIG. 17 to produce output data fθi(p) for a one-dimensional image having the orientation of the associated value of θ.
  • the signal p for any given value of θ can be computed or otherwise determined in advance, assuming that the x and y signals are also known in advance (as they generally will be).
  • the p signals for any values of θ can be computed in advance, stored in a memory, and read out as needed.
  • When several p sequences are available in parallel, the lower part of FIG. 17 can be modified as shown in FIG. 19.
  • the p signals (p1, p2, and p3) for three different values of θ (θ1, θ2, and θ3) are stored in memory 240.
  • these three p signals could be produced in parallel by apparatus of the type shown in FIG. 18.
  • Each of these three signals is read out from memory 240 in synchronization with application of the input image intensity signal f(x,y).
  • Each p signal is applied to a respective one of voltage controlled oscillators 212a, 212b, and 212c.
  • The output of each voltage controlled oscillator is applied to a respective one of deflectors 216a, 216b, and 216c, each of which appropriately deflects light from the associated light source 214a, 214b, and 214c to the associated detector 160a, 160b, and 160c.
  • (If desired, one light source 214 could be used and its output spread across all of deflectors 216, which could be combined in one multi-channel acousto-optic deflector cell.)
  • the light source or sources can be pulsed on with f(x,y) data for each sample of the image.
  • the output signal or signals of each detector 160 represent a one-dimensional image of the input image at the associated angle θ1, θ2, or θ3. These output signals can be processed as described above in connection with FIG. 9, etc., to determine the acceptability of the product associated with the input image.
  • Underlying the phenomenon of producing one-dimensional images from two-dimensional images is the so-called Hough transform.
  • the Hough transform of a two-dimensional image f(x,y) is described by the following procedure: the transformation in Equation (1) is performed for all image points (with the outputs weighted by the intensity of the input image point), and the results are accumulated in what is referred to as a Hough array h(θ,p).
  • each straight line L in the input image can be described by its normal distance p from the origin and the angle θ of this normal with respect to the x axis (see FIG. 20).
  • FIG. 21 shows another example of how p and θ are determined. In this case p is negative.
  • each line L appears as a peak at coordinates (θ,p) in the Hough space (see FIG. 22), with the height of the peak being proportional to the length of line L or the number of pixels on the line and their intensities.
  • peak h1 in FIG. 22 is due to line L in FIG. 20.
  • Hough space is typically bounded: p is limited by the size of the input image; and for values of θ greater than 2π, h(θ,p) merely repeats. With optically generated Hough space procedures, it is convenient to limit θ to values from 0° to 180°, and to allow p to be bipolar as illustrated by FIGS. 20-22.
  • the one-dimensional images described above are slices of the Hough space at various values of θ.
  • the horizontal and vertical axes in each of these FIGS. are respectively parallel to the p and h(θ,p) axes in FIG. 22.
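The accumulation just described can be sketched in software as follows; the θ sampling and array sizes are illustrative choices, not from the patent:

```python
import numpy as np

def hough_array(f, n_theta=180):
    """Accumulate the Hough array h(theta, p): every input point, weighted by
    its intensity, contributes at p = x*Cos(theta) + y*Sin(theta)
    (Equation (1)) for each theta in [0, pi); p is bipolar and limited by
    the image size."""
    rows, cols = f.shape
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    n_p = int(np.hypot(rows, cols)) + 3
    h = np.zeros((n_theta, n_p))
    for iy, ix in zip(*np.nonzero(f)):
        x, y = ix - cols // 2, iy - rows // 2       # origin at image center
        p = x * np.cos(thetas) + y * np.sin(thetas)
        h[np.arange(n_theta), np.round(p).astype(int) + n_p // 2] += f[iy, ix]
    return h, thetas

# A horizontal line (y = 0) produces a peak at theta = pi/2, p = 0.
img = np.zeros((9, 9))
img[4, :] = 1.0
h, thetas = hough_array(img)
```

The single straight line appears as one dominant peak in the (θ,p) array, with height equal to the number of pixels on the line.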
  • the rotating Dove prism apparatus of FIG. 9 is capable of forming the entire Hough transform of the input image.
  • Each angular position of Dove prism 120 produces one constant-θ slice of the Hough space, so that when the Dove prism has rotated through an angle of 180°, all of the slices needed to form the entire Hough space will have been formed in sequence.
  • processor 154 can collect data representative of the entire Hough space (or any number of constant-θ slices of that space) by appropriate sampling and storage of the output signals of detector 160.
  • Another (faster) way to produce the entire Hough space is shown in FIG. 23.
  • the polar coordinate form of Equation (1) is:
  • p = r Cos(θ − θ') (2)
  • f(x,y) can be rendered in polar form f(r,θ') by the simple expedient of using a camera 156 with a polar coordinate raster scan or by a computer generated hologram. From Equations (3) and (4) it can be seen that the mapping required from a polar f(r,θ') space to the Hough space is shift-invariant in θ. This allows for a correlation acousto-optic realization of the Hough transform that will now be explained.
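The polar rewriting of Equation (1) behind this shift-invariance can be checked numerically; the variable names (`theta_p` for θ') are illustrative:

```python
import numpy as np

# With x = r*Cos(theta_p) and y = r*Sin(theta_p), Equation (1) becomes
# p = r*Cos(theta - theta_p): the mapping depends only on the difference
# theta - theta_p, which is the shift-invariance exploited in FIG. 23.
r, theta_p, theta = 5.0, 0.7, 1.3
x, y = r * np.cos(theta_p), r * np.sin(theta_p)
p_cartesian = x * np.cos(theta) + y * np.sin(theta)   # Equation (1)
p_polar = r * np.cos(theta - theta_p)                 # polar form
assert abs(p_cartesian - p_polar) < 1e-12
```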
  • each point light source 242 receives a signal proportional to a respective one of a plurality of constant-r slices of f(r,θ'), and produces light having intensity proportional to the level of the applied signal.
  • the signals applied to light sources 242 can be derived, for example, from a polar coordinate scanning camera 156 as described above.
  • Optical system 243 distributes the light from each light source 242 uniformly over the vertical dimension of a respective one of deflector channels 244.
  • Optical system 245 focuses the light from all of deflector channels 244 onto a single vertical array of m photodetectors 246, each of which corresponds to one point in the Hough array h(θ,p).
  • mappings are accumulated for all input image points f(r,θ'), and this time integration on detectors 246 achieves the products and sums in Equation (4).
  • the deflections produced by devices 244 are vertical.
  • Optical system 245 sums these for all values of r onto detectors 246, and time integration on detectors 246 forms the sum for all values of θ'.
  • In this way, Equation (4) is realized, and the one-dimensional output on detectors 246 is the desired two-dimensional Hough transform lexicographically ordered on the one-dimensional detectors 246.
  • the inputs to light sources 242 are θ' outputs for fixed r values, i.e., a polar scan read-out of different r lines in parallel.
  • each point (a different θ' polar transform value), i.e., a circular scan of the original image at a fixed r value, maps to a cosine in the two-dimensional (θ,p) space.
  • the mapping function is the same for each input point, with a shifted version of the same mapping function used for subsequent θ' values at the same r value.
  • This shift-invariant property makes it possible to use an acousto-optic cell (like 244 in FIG. 23) to implement the required operation.
  • FIG. 23 uses this concept involving viewing the transformation required as a mapping function.
  • each input image point maps to a sinusoid in a two-dimensional (θ,p) space. If the two-dimensional sinusoid mapping is scanned horizontally, the two-dimensional sinusoid in (θ,p) space is lexicographically ordered into a one-dimensional vector with the p sequence for line θ1 followed by the p sequence for line θ2, etc.
  • the output of detector 246 is the sequence of p values for θ1, the sequence for θ2, etc.
  • This scan format for the mapping function allows utilization of the shift-invariant property of the mapping function as shown in Equations (3) and (4). Because the same signal is applied to each acousto-optic cell channel, the apparatus of FIG. 23 can be modified to use only one acousto-optic cell 244 and thus use a very viable processor architecture. The full power and generality of the system of FIG. 23 arises when generalized mappings or Hough transforms are employed. Accordingly, after one complete scan of f(r,θ'), the output signal (serial) or signals (parallel) of detectors 246 are the complete Hough space data suitable for processing by processor 154 as described above or below.
  • If the input image is rotated, the Hough transform of the rotated image and the Hough transform of the reference image are related by a shift along θ:
  • h'(θ,p) = h(θ − Δθ, p)
  • the new Hough transform is a translated version of the original Hough transform, with the translation being parallel to the θ axis and proportional to the input image rotation Δθ.
  • a projection h(p) of the Hough space parallel to the θ axis is invariant to rotation of the input object.
  • If the input image is translated, the new Hough space has the constant-θ slices shifted in the p direction, with a different (but easily calculated) amount of shift for each value of θ.
  • an input image translated by x1,y1 has a Hough transform with the original points (θ,p) mapped to new points (θ1,p1), where θ1 = θ and:
  • p1 = p + x1 Cos θ + y1 Sin θ (7)
  • p1 = p + t Cos(θ − β), with t = (x1² + y1²)^1/2 and β = arctan(y1/x1) (8)
  • the apparatus of FIG. 9 can be used to form h(θ) as follows.
  • processor 154 causes laser 158 to illuminate that image.
  • Processor 154 then causes prism control 122 to rotate Dove prism 120 through an angle of 180° (equivalent to rotating the image 360°).
  • processor 154 periodically samples the output signals from photodetectors 162a-n and inputs those signal samples to summation device 190 (FIG. 10).
  • Summation device 190 sums the applied signal samples and applies the result to processor 154.
  • this integration of detector 162 outputs could be produced optically by imaging the detector line down into one detector, i.e., by performing the integration optically in space.
  • Each summation signal sample is the sum over a particular constant-θ slice of the Hough space for the image.
  • the time history output for these successive summation signal samples comprises the h(θ) projection of the Hough space. If the product is acceptable and merely translated (not rotated), the resulting h(θ) curve will match the h(θ) curve for an acceptable product, and processor 154 will record that the product is acceptable.
  • If no match is initially found, processor 154 can either record that the product is unacceptable, or if product rotation is possible and permissible, processor 154 (using apparatus of the type shown in FIG. 13, and then possibly apparatus of the type shown in FIG. 12) can first attempt to match the detected and acceptable (reference) h(θ) curves by shifting (translating) either curve until a match is found or until all possible combinations have been unsuccessfully tried. If a match is found, processor 154 records that the product is acceptable. Otherwise, processor 154 records that the product is unacceptable.
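The h(θ) formation and shift search can be sketched as follows; cyclic shifting and the inner-product match criterion are assumptions consistent with the FIG. 13 correlator, and the function names are illustrative:

```python
import numpy as np

def h_theta(h):
    """h(theta): sum each constant-theta slice of the Hough array over p."""
    return h.sum(axis=1)

def rotation_match(h_detected, h_reference):
    """Try every cyclic shift of the detected h(theta) curve and return the
    shift giving the largest correlation with the reference curve, together
    with the correspondingly aligned curve."""
    n = len(h_reference)
    best = max(range(n), key=lambda s: np.dot(np.roll(h_detected, s), h_reference))
    return best, np.roll(h_detected, best)
```

A rotation of the product appears as a cyclic shift of h(θ), so a good match at some nonzero shift indicates an acceptable but rotated product.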
  • the two-dimensional Hough transform array can be corrected for rotation by shifting the data parallel to the θ axis by an amount equal to the detected rotation.
  • To apply Equations (7) and (8), two columns hθ1(p) and hθ2(p) at different θ values are selected.
  • Each of these two hθi(p) signals is correlated with the associated reference pattern (e.g., using apparatus of the type shown in FIG. 13) to determine the shift required to align the hθi(p) data and the associated reference patterns.
  • the θ lines at which the hθi(p) patterns are obtained should be lines including significant data peaks. If desired, better estimates of x1 and y1 (or β and t) can be developed if more than two hθi(p) correlations are performed.
  • The measured shifts are then used to solve Equations (7) and (8) for β and t, and then for x1 and y1.
  • Once x1 and y1 have been determined in this manner, all or any portion of the Hough transform array data can be corrected for shift using Equations (7) and (8).
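The slice-by-slice correction can be sketched as follows, assuming the standard Hough translation property that each constant-θ slice shifts along p by x1 Cos θ + y1 Sin θ; cyclic shifting is an implementation convenience:

```python
import numpy as np

def correct_translation(h, thetas, x1, y1):
    """Shift each constant-theta slice of h(theta, p) back by the p-shift
    that a translation (x1, y1) of the input image introduces."""
    corrected = np.empty_like(h)
    for i, theta in enumerate(thetas):
        dp = int(round(x1 * np.cos(theta) + y1 * np.sin(theta)))
        corrected[i] = np.roll(h[i], -dp)   # undo the per-slice p shift
    return corrected
```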
  • the above-described one-dimensional Hough transform projection processing provides estimates of product rotation and translation x1, y1, as well as Hough transform array data h(θ,p) corrected for these distortions. Analysis of the projections h(p) and h(θ) can also determine product acceptability as discussed above in the case of h(θ) and as will now be discussed in the case of h(p).
  • the one-dimensional projection h(p) is useful in certain product inspection applications. For example, if the input image has a relatively small number of straight lines and edges, and it is desired to determine parallelism between two such features, it may not be necessary to search for the particular constant-θ slice in which those two features produce two appropriately spaced peaks. Instead, it may be sufficient to simply form h(p) (which is a superposition of all constant-θ slices and which is therefore insensitive to image rotation) and to look for the required two spaced peaks in that curve.
  • Processor 154 operates Dove prism 120 and laser 158 as described above in connection with determining h( ⁇ ). Instead of using summation device 190 (FIG. 10), however, electronic integrators 192a-n are used to separately integrate the output signals of each of detector components 162a-n throughout the 180° rotation of the Dove prism. (Alternatively, the temporal integration can be done directly in each detector element 162a-n.) At the completion of the Dove prism rotation, the output signal of each integrator 192 represents one constant-p slice of the Hough space. Collectively, these integrator output signals are h(p). Processor 154 compares this detected h(p) curve to the reference h(p) curve for an acceptable product (or otherwise analyzes the detected h(p) curve for acceptability), and determines whether or not product 10 is acceptable on that basis.
  • If the input image is scaled, the h(p) pattern is also scaled, but the relative displacement between peaks in h(p) does not change. Thus this is a fast and efficient inspection technique that will be suitable for many applications. If the input image is translated, each constant-θ slice of the Hough space is shifted differently, so the Hough array data must be corrected for translation (as described above) before h(p) is formed for comparison with reference h(p) data. If the input image is merely rotated, h(p) is unaffected, as has been mentioned.
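The rotation insensitivity of h(p) follows because rotating the input image only shifts the Hough array along its θ axis. A minimal numerical check, using a random stand-in Hough array and ignoring the sign flip of p at the 180° wraparound (a simplification of the true Hough geometry):

```python
import numpy as np

rng = np.random.default_rng(0)
hough = rng.random((180, 64))      # stand-in Hough array h(theta, p)

h_p = hough.sum(axis=0)            # projection onto the p axis

# A rotation of the input image circularly shifts the array along theta;
# the sum over theta, and hence h(p), is unchanged.
rotated = np.roll(hough, 30, axis=0)
h_p_rot = rotated.sum(axis=0)
```

Here `h_p` and `h_p_rot` are identical, which is why no rotation correction is needed before comparing h(p) with its reference; translation, by contrast, shifts each constant-θ slice by a different amount and must be corrected first.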
  • The comparisons of various one-dimensional images referred to above can be performed electronically as described, for example, in connection with FIGS. 12 and 13. This type of comparison allows determination of whether or not any peak in the one-dimensional image deviates from a predetermined value by more than a predetermined amount.
  • Product identification and acceptability can also be obtained by comparing the entire one-dimensional image to one or several one-dimensional references.
  • This comparison involves a vector image product (i.e., the sum of the point-by-point products of the elements of the two one-dimensional signals). These vector image products can be formed for all relative shifts of the two one-dimensional signals. Such a shift search is needed when the object rotation θ, translation x1, y1, etc., must be searched for (i.e., when a one-dimensional correlation is required).
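The vector image product and the shift search it implies can be sketched as a one-dimensional circular correlation. The signals below are hypothetical; a cyclic shift stands in for the distortion parameter being searched:

```python
import numpy as np

def vector_image_product(a, b):
    """Sum of the point-by-point products of two one-dimensional signals."""
    return float(np.dot(a, b))

def correlate_all_shifts(hd, hr):
    """Vector image product of hd against every cyclic shift of hr,
    i.e. a one-dimensional circular correlation."""
    return np.array([vector_image_product(hd, np.roll(hr, s))
                     for s in range(len(hr))])

hr = np.array([0., 1., 3., 1., 0., 0., 0., 0.])   # reference pattern
hd = np.roll(hr, 3)                               # detected pattern, shifted by 3
scores = correlate_all_shifts(hd, hr)
best = int(np.argmax(scores))                     # shift at which the peak occurs
```

The location of the correlation peak (`best`) recovers the unknown shift, which is exactly the search that the optical system of FIG. 24 performs in parallel.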
  • The system of FIG. 24 achieves this.
  • A large number of samples of the possibly distorted Hough transform projection or slice hd are fed in parallel to a corresponding number of light sources 250 (e.g., lasers or light-emitting diodes), each of which produces output light with intensity proportional to the level of the associated input signal.
  • The light from each light source 250 is imaged onto a corresponding portion of acousto-optic cell 252.
  • The reference pattern hr with which hd is to be correlated is serially applied, in a cyclically repeating fashion, to acousto-optic cell 252.
  • The outputs of all of the segments of cell 252 are focused onto a single detector 256 by optical system 254 (made up of conventional optical elements such as lenses).
  • The time-history output of detector 256 (which integrates the product of the light leaving light sources 250 and the contents of cell 252) is the correlation of hd and hr for all possible shifts or distortion parameters. Should several reference patterns be required, they can be frequency-multiplexed onto the hr input to cell 252, and the correlation of hd with each reference will appear on a respective one of several detectors 256 spaced vertically in FIG. 24.
  • A multichannel acousto-optic cell can be employed for device 252.
  • A long one-dimensional pattern can be partitioned (along any single axis needed) among the several channels of the above-mentioned multichannel acousto-optic cell.
  • The apparatus of FIG. 24 thus forms a vector image product (product and sum) of two one-dimensional signals, comparing the two signals for all shifts between them. This is a correlation (i.e., a vector image product for all shifts).
  • The more detailed and specific product tests, however, should not be done with a correlator. Rather, these tests require checking the presence and height of each peak of interest. It is possible to correlate with an inverted reference signal; this subtracts the two signals and yields both positive and negative values, but in the sum of such values significant errors can cancel out.
  • The system of FIGS. 12 and 13 is preferable for such cases.
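The cancellation hazard just described can be made concrete with a small numerical sketch (the reference and detected peak heights below are hypothetical):

```python
import numpy as np

ref = np.array([2.0, 2.0, 2.0, 2.0])   # reference peak heights
bad = np.array([3.0, 1.0, 2.0, 2.0])   # one peak too high, one too low

# Correlating with an inverted reference amounts to summing the difference.
# The positive and negative errors cancel, so the aggregate looks perfect:
total = float(np.sum(bad - ref))        # 0.0 despite two defective peaks

# Checking each peak individually (as in FIGS. 12 and 13) exposes the faults:
per_peak = np.abs(bad - ref)            # nonzero at the defective peaks
```

A single correlation figure of merit can therefore pass a defective product, whereas the peak-by-peak comparison cannot.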
  • FIG. 25 shows an acousto-optic one-dimensional Fourier transform system that produces the spectrum of the one-dimensional signal fed to the acousto-optic cell at plane P1.
  • The input signal to be Fourier transformed is fed to acousto-optic device A, which is illuminated by laser beam B; the Fourier transform of the input signal is formed on linear detector array C at plane P2.
  • When fed with the Hough transform of an input image, this architecture yields the one-dimensional Fourier transform of the input and thus a one-dimensional slice of the two-dimensional Fourier transform.
  • Such one-dimensional Fourier transform slices have many uses in product inspection. For example, to confirm that certain lines of text or a pattern are present on the product, that region of the object can be scanned and the presence and quality of the pattern or text determined from a simple analysis of the one-dimensional Fourier transform. The angle of the one-dimensional Fourier transform slice can be selected according to the orientation of this information in the image.
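The equivalence stated above is the projection-slice theorem: the one-dimensional Fourier transform of an image's projection equals a central slice of the image's two-dimensional Fourier transform. A minimal numerical check for the θ = 0 slice, using a random stand-in image:

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((64, 64))             # stand-in input image

# Projection of the image along one axis (a single Hough/Radon slice).
proj = img.sum(axis=0)
slice_1d = np.fft.fft(proj)            # 1-D Fourier transform of the projection

# Corresponding central slice of the 2-D Fourier transform.
f2d = np.fft.fft2(img)
central_row = f2d[0, :]                # zero-frequency row of the 2-D spectrum
```

The two arrays agree to numerical precision, which is why transforming a Hough slice (FIG. 25) gives direct access to one line through the product image's two-dimensional spectrum.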

US07/115,428 1987-10-30 1987-10-30 Methods and apparatus for optical product inspection Expired - Lifetime US4906099A (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US07/115,428 US4906099A (en) 1987-10-30 1987-10-30 Methods and apparatus for optical product inspection
BR8805591A BR8805591A (pt) 1987-10-30 1988-10-27 Processo e aparelho para determinar se um produto possui ou nao um segmento de linha reta,detectavel oticamente,com uma orientacao predeterminada,em um plano de objeto predeterminado
JP63271922A JPH01143946A (ja) 1987-10-30 1988-10-27 光学的製品検査方法及び装置
AU24436/88A AU622874B2 (en) 1987-10-30 1988-10-28 Methods and apparatus for optical product inspection
EP19880310213 EP0314521A3 (de) 1987-10-30 1988-10-31 Verfahren und Gerät zur optischen Produktprüfung
CA000581756A CA1302542C (en) 1987-10-30 1988-10-31 Methods and apparatus for optical product inspection


Publications (1)

Publication Number Publication Date
US4906099A true US4906099A (en) 1990-03-06

Family

ID=22361343




Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4972494A (en) * 1988-02-26 1990-11-20 R. J. Reynolds Tobacco Company Package inspection system
EP0716287B1 (de) * 1994-12-10 2001-04-25 Koenig & Bauer Aktiengesellschaft Vorrichtungen zum Messen einer Lage von Bahnen oder Bogen
DE4444079C2 (de) * 1994-12-10 1998-03-19 Koenig & Bauer Albert Ag Verfahren und Vorrichtung zur Durchführung dieses Verfahrens zum Messen der Lage einer Kante von einer Bahn oder einem Bogen
US5644895A (en) * 1995-05-01 1997-07-08 Johnson & Johnson Vision Products, Inc. Packaging arrangement
US6169600B1 (en) 1998-11-20 2001-01-02 Acuity Imaging, Llc Cylindrical object surface inspection system
JP2001097322A (ja) * 1999-09-30 2001-04-10 Nippon Seiki Co Ltd 充填包装袋のシール不良判別方法および充填包装機における充填包装袋のシール不良判別装置
JP4677628B2 (ja) * 2004-09-28 2011-04-27 レーザーテック株式会社 欠陥検出装置及び欠陥検出方法並びにパターン基板の製造方法
US7860277B2 (en) 2007-04-10 2010-12-28 Bizerba Gmbh & Co. Kg Food product checking system and method for identifying and grading food products
US10467474B1 (en) * 2016-07-11 2019-11-05 National Technology & Engineering Solutions Of Sandia, Llc Vehicle track detection in synthetic aperture radar imagery

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US988720A (en) * 1908-07-09 1911-04-04 Zeiss Carl Lens system for projecting stripes.
US3069654A (en) * 1960-03-25 1962-12-18 Paul V C Hough Method and means for recognizing complex patterns
DE2704983A1 (de) * 1977-02-07 1978-08-10 Siemens Ag Verfahren zum selbsttaetigen erkennen von fehlern in der oberflaeche oder in den abmessungen eines objektes sowie vorrichtung zur ausuebung des verfahrens
US4242702A (en) * 1976-12-01 1980-12-30 Hitachi, Ltd. Apparatus for automatically checking external appearance of object
US4493554A (en) * 1979-02-27 1985-01-15 Diffracto Method and apparatus for determining physical characteristics of objects and object surfaces
US4515480A (en) * 1980-05-19 1985-05-07 Micro Automation, Inc. Pattern alignment system and method
US4618989A (en) * 1983-01-21 1986-10-21 Michio Kawata, Director-General of Agency of Industrial Science and Technology Method and system for detecting elliptical objects
JPS61290583A (ja) * 1985-06-19 1986-12-20 Yokogawa Electric Corp 画像処理装置
EP0205628A1 (de) * 1985-06-19 1986-12-30 International Business Machines Corporation Verfahren zum Identifizieren dreidimensionaler Objekte mittels zweidimensionaler Bilder

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3571796A (en) * 1968-05-28 1971-03-23 Bendix Corp Rotation translation independent feature extraction means
DE2238766C3 (de) * 1972-08-07 1975-10-09 Vierling, Oskar, Prof. Dr.Phil. Habil., 8553 Ebermannstadt Vorrichtung zum Erkennen von aut diffus streuenden Trägern angebrachten Zeichen
JPS5977577A (ja) * 1982-10-25 1984-05-04 Ricoh Co Ltd 枠抽出法


Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
C. Barney, "Hologram Filter Spots Images from Any Angle", Electronics, Sep. 4, 1986, pp. 37-38.
D. Casasent, "Coherent Optical Pattern Recognition", Proceedings of the IEEE, vol. 67, No. 5, May 1979, pp. 813-825.
D. Casasent, "Computer Generated Holograms in Pattern Recognition: A Review", Optical Engineering, vol. 24, No. 5, Sep./Oct. 1985, pp. 724-730.
G. R. Gindi et al., "Optical Feature Extraction Via the Radon Transform", Optical Engineering, vol. 23, No. 5, Sep./Oct. 1984, pp. 499-506.
R. Krishnapuram, "Hough Space Transformations for Discrimination and Distortion Estimation", Computer Vision, Graphics, and Image Processing, vol. 38, No. 3, Jun. 1987, pp. 299-316.
Van Daele et al., "Automatic Visual Inspection of Reed Switches", Optical Engineering, vol. 19, No. 2, Mar./Apr. 1980, pp. 240-244.
Van Daele et al., "The Leuven Automatic Visual Inspection Machine", Proceedings of the Society of Photo-Optical Instrumentation Engineers, vol. 182, 1979, pp. 58-64.
W. H. Steier et al., "Optical Hough Transform", Applied Optics, vol. 25, No. 16, Aug. 1986, pp. 2734-2738.
Wai-Hon Lee, "Computer-Generated Holograms; Techniques and Applications", in Progress in Optics, vol. XVI, pp. 121-232, North-Holland Publishing Company, Amsterdam, 1978.

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE33774E (en) * 1988-03-02 1991-12-24 Wegu-Messtechnik Gmbh Coordinate measuring and testing machine
WO1990010275A1 (en) * 1989-03-03 1990-09-07 Greyhawk Systems, Inc. Projected image linewidth correction apparatus and method
US5061063A (en) * 1989-10-30 1991-10-29 Philip Morris Incorporated Methods and apparatus for optical product inspection
US5110213A (en) * 1989-12-21 1992-05-05 R. J. Reynolds Tobacco Company Method and apparatus for measuring concentration of a material in a sample
EP0451865A2 (de) * 1990-04-12 1991-10-16 Olympus Optical Co., Ltd. Automatisches Fokussiergerät mit Verfahren zur optimalen Berechnung der Fokussierposition
EP0451865A3 (en) * 1990-04-12 1992-10-21 Olympus Optical Co., Ltd. Focusing position detecting and automatic focusing apparatus with optimal focusing position calculation method
US5235375A (en) * 1990-04-12 1993-08-10 Olympus Optical Co., Ltd. Focusing position detecting and automatic focusing apparatus with optimal focusing position calculation method
US5249239A (en) * 1990-07-17 1993-09-28 Nec Corporation Means for measuring coplanarity of leads on an IC package
US5430810A (en) * 1990-11-20 1995-07-04 Imra America, Inc. Real time implementation of the hough transform
US5293428A (en) * 1991-05-08 1994-03-08 Rohm Co., Ltd. Optical apparatus for use in image recognition
US5329359A (en) * 1991-05-17 1994-07-12 Canon Kabushiki Kaisha Parts mounting inspection method
US5537670A (en) * 1991-08-08 1996-07-16 Philip Morris Incorporated Product appearance inspection methods and apparatus employing low variance filter
US8892495B2 (en) 1991-12-23 2014-11-18 Blanding Hovenweep, Llc Adaptive pattern recognition based controller apparatus and method and human-interface therefore
US5832138A (en) * 1994-03-07 1998-11-03 Nippon Telegraph And Telephone Corporation Image processing method and apparatus for extracting lines from an image by using the Hough transform
US5581409A (en) * 1994-09-30 1996-12-03 Republic Lens Co., Inc. Imaging system to combine disparate fields of view
US5659624A (en) * 1995-09-01 1997-08-19 Fazzari; Rodney J. High speed mass flow food sorting appartus for optically inspecting and sorting bulk food products
US5887073A (en) * 1995-09-01 1999-03-23 Key Technology, Inc. High speed mass flow food sorting apparatus for optically inspecting and sorting bulk food products
US5946101A (en) * 1996-02-09 1999-08-31 Sony Corporation Apparatus and method for detecting a posture
US6181372B1 (en) * 1996-06-10 2001-01-30 G.D S.P.A. Method and a device for monitoring the external integrity of cigarettes
US5677763A (en) * 1996-08-08 1997-10-14 Technology Resources, Inc. Optical device for measuring physical and optical characteristics of an object
US6075882A (en) * 1997-06-18 2000-06-13 Philip Morris Incorporated System and method for optically inspecting cigarettes by detecting the lengths of cigarette sections
US6020969A (en) * 1997-07-11 2000-02-01 Philip Morris Incorporated Cigarette making machine including band inspection
US6198537B1 (en) 1997-07-11 2001-03-06 Philip Morris Incorporated Optical inspection system for the manufacture of banded cigarette paper
US5966218A (en) * 1997-07-11 1999-10-12 Philip Morris Incorporated Bobbin optical inspection system
US10361802B1 (en) 1999-02-01 2019-07-23 Blanding Hovenweep, Llc Adaptive pattern recognition based control system and method
US9535563B2 (en) 1999-02-01 2017-01-03 Blanding Hovenweep, Llc Internet appliance system and method
US8369967B2 (en) 1999-02-01 2013-02-05 Hoffberg Steven M Alarm system controller and a method for controlling an alarm system
US8583263B2 (en) 1999-02-01 2013-11-12 Steven M. Hoffberg Internet appliance system and method
US6564527B1 (en) * 1999-02-04 2003-05-20 Focke & Co. (Gmbh) Process and apparatus for checking cigarette packs for the correct positioning of material strips
US6100984A (en) * 1999-06-11 2000-08-08 Chen; Fang Surface measurement system with a laser light generator
US6519356B1 (en) 1999-08-03 2003-02-11 Intelligent Machine Concepts, L.L.C. System and method for inspecting cans
US6629611B2 (en) 2000-06-16 2003-10-07 Satake Corporation Granular object sorting apparatus
US6525333B1 (en) 2000-07-18 2003-02-25 Intelligent Machine Concepts, L.L.C. System and method for inspecting containers with openings with pipeline image processing
US20020071277A1 (en) * 2000-08-12 2002-06-13 Starner Thad E. System and method for capturing an image
US6760113B2 (en) 2001-03-20 2004-07-06 Ford Global Technologies, Llc Crystal based fringe generator system
US6784996B2 (en) 2001-11-09 2004-08-31 Satake Corporation Color sorting apparatus for granular object with optical detection device consisting of CCD linear sensor
US20030098978A1 (en) * 2001-11-09 2003-05-29 Norimasa Ikeda Color sorting apparatus for granular object with optical detection device consisting of CCD linear sensor
US6817474B2 (en) 2001-12-06 2004-11-16 Satake Corporation Color sorting apparatus for granular objects with function to sorting out foreign magnetic metal matters
US20030127366A1 (en) * 2001-12-06 2003-07-10 Norimasa Ikeda Color sorting apparatus for granular objects with function to sorting out foreign magnetic metal matters
US6677591B1 (en) 2002-01-30 2004-01-13 Ciena Corporation Method and system for inspecting optical devices
US7360750B2 (en) 2003-04-18 2008-04-22 Satake Corporation Piezoelectric air valve and multiple-type piezoelectric air valve
US7298870B2 (en) 2003-09-04 2007-11-20 Satake Corporation Granule color sorting apparatus with display control device
US20050067332A1 (en) * 2003-09-04 2005-03-31 Norimasa Ikeda Granule color sorting apparatus with display control device
US20050058350A1 (en) * 2003-09-15 2005-03-17 Lockheed Martin Corporation System and method for object identification
US7630079B2 (en) * 2006-12-01 2009-12-08 Samsung Electronics Co., Ltd. Equipment and method for measuring transmittance of photomask under off axis illumination
US20080130002A1 (en) * 2006-12-01 2008-06-05 Samsung Electronics Co., Ltd. Equipment and method for measuring transmittance of photomask under off axis illumination
US20090055116A1 (en) * 2007-08-24 2009-02-26 Chou-Pi Chen Method For Inspecting Appearance Of Pellet Type Medicines And System Employing Thereof
CN104326107A (zh) * 2014-11-03 2015-02-04 南京文易特电子科技有限公司 小包散包视觉检测装置
CN104326107B (zh) * 2014-11-03 2016-10-26 南京文易特电子科技有限公司 小包散包视觉检测装置
CN112927226A (zh) * 2021-04-08 2021-06-08 广州绿简智能科技有限公司 一种划痕损伤的图像检测方法
CN112927226B (zh) * 2021-04-08 2022-08-26 广州绿简智能科技有限公司 一种划痕损伤的图像检测方法

Also Published As

Publication number Publication date
CA1302542C (en) 1992-06-02
AU622874B2 (en) 1992-04-30
BR8805591A (pt) 1989-07-11
JPH01143946A (ja) 1989-06-06
EP0314521A3 (de) 1991-01-02
EP0314521A2 (de) 1989-05-03
AU2443688A (en) 1989-05-04


Legal Events

Date Code Title Description
AS Assignment

Owner name: PHILIP MORRIS INCORPORATED, A CORP. OF VA, 120 PARK AVENUE, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CASASENT, DAVID P.;REEL/FRAME:004818/0439

Effective date: 19871229

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

SULP Surcharge for late payment

Year of fee payment: 11