WO2011070914A1 - Appearance inspection apparatus - Google Patents

Appearance inspection apparatus

Info

Publication number
WO2011070914A1
WO2011070914A1 (PCT/JP2010/070986; JP2010070986W)
Authority
WO
WIPO (PCT)
Prior art keywords
shape
image
inspection object
inspection
unit
Prior art date
Application number
PCT/JP2010/070986
Other languages
English (en)
Japanese (ja)
Inventor
晋也 松田
広志 青木
Original Assignee
第一実業ビスウィル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 第一実業ビスウィル株式会社 filed Critical 第一実業ビスウィル株式会社
Priority to JP2011545165A priority Critical patent/JP5654486B2/ja
Priority to CN201080056200.9A priority patent/CN102713580B/zh
Priority to KR1020127017899A priority patent/KR101762158B1/ko
Publication of WO2011070914A1 publication Critical patent/WO2011070914A1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/95 - Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/9508 - Capsules; Tablets
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/30 - Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G01B11/303 - Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces using photoelectric detection means
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/30 - Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G01B11/306 - Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces for measuring evenness
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/89 - Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/892 - Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles characterised by the flaw, defect or object feature examined

Definitions

  • the present invention relates to an apparatus for inspecting the appearance of medicines (tablets, capsules, etc.), food, machine parts, electronic parts, etc. (hereinafter referred to as “inspection object”).
  • the inspection apparatus detects stains and printed portions existing on the surface of the inspection object by irradiating the surface with diffused light, imaging the surface with an imaging apparatus, and analyzing the obtained grayscale image, and determines their suitability.
  • by irradiating the surface of the object to be inspected with diffused light so that it is uniformly illuminated from all directions, the generation of shadows due to irregularities existing on the surface is suppressed, and a grayscale image in which the surface pattern (stains or printed portions) is emphasized can be obtained.
  • the inspection apparatus irradiates the surface of the inspection object with laser slit light, appropriately captures an image of the irradiated laser slit light, and analyzes the obtained image according to a light cutting method.
  • the inspection apparatus according to the conventional example 2 has a problem as described below.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide an appearance inspection apparatus capable of accurately inspecting the surface shape of an object having a dark pattern on its surface.
  • the present invention provides an appearance inspection apparatus comprising: transport means for transporting the inspection object along a predetermined transport path; surface pattern inspection means for inspecting the surface pattern of the inspection object conveyed by the transport means; and surface shape inspection means for inspecting the surface shape of the inspection object conveyed by the transport means. The surface pattern inspection means includes a grayscale image capturing unit that is arranged in the vicinity of the transport path, irradiates the surface of the inspection object with diffused light, and captures a grayscale image of the surface illuminated by the diffused light, and a pattern determination unit that determines the suitability of the surface pattern based on the captured grayscale image.
  • the surface shape inspection means includes a slit light image capturing unit that is disposed in the vicinity of the transport path upstream or downstream of the grayscale image capturing unit, irradiates the surface of the inspection object with band-shaped slit light whose irradiation line is orthogonal to the transport direction of the inspection object, and captures an image of the irradiated slit light with an imaging optical axis that lies along the transport direction and intersects the optical axis of the slit light, and a shape determination unit that recognizes the shape features of the surface of the inspection object based on the image captured by the slit light image capturing unit and determines the suitability of the shape.
  • the shape determination unit receives, from the pattern determination unit, information on the region where at least a dark-colored portion exists on the surface of the inspection object, sets the received region as a non-inspection region, and determines whether the shape is suitable.
  • the present invention thus relates to an appearance inspection apparatus configured as described above.
  • the surface pattern of the inspection object conveyed by the transport means is inspected by the surface pattern inspection means, and the suitability of the surface pattern is determined based on the grayscale image captured by the grayscale image capturing unit. For example, when the surface has a stain, the stain is detected as a pattern feature and the object is determined to be defective; when characters or the like are printed on the surface, the printed portion is detected as a pattern feature and the suitability of the printing state is determined.
  • the surface shape of the inspection object is also inspected by the surface shape inspection means: in the slit light image capturing unit, the surface of the inspection object is irradiated with band-shaped slit light and the reflected light is imaged.
  • in the shape determination unit, data relating to the three-dimensional shape of the object surface is calculated from the captured image by, for example, a light section (light cutting) method, the features of the surface shape are recognized from the calculated data, and the suitability of the shape is determined.
  • the shape determination unit receives, from the pattern determination unit, information on the region where at least a dark-colored portion exists on the surface of the inspection object, sets the received region as a non-inspection region, and determines the suitability of the surface shape of the inspection object.
  • when a dark-colored portion exists on the surface of the inspection object and is irradiated with slit light, the slit light is absorbed rather than reflected there, so the reflected light from the dark-colored portion is missing.
  • consequently, in the image captured by the slit light image capturing unit, the data of the dark-colored portion is missing, which can lead to an erroneous shape determination even though the shape itself is proper.
  • the shape determination unit therefore receives information on the region where the dark-colored portion exists from the pattern determination unit of the surface pattern inspection means, which can accurately detect the dark-colored portion, sets the received region as a non-inspection region, and determines the suitability of the three-dimensional shape of the surface of the inspection object.
  • by treating the region where the dark-colored portion is present as a non-inspection region, the erroneous determination of the three-dimensional shape described above is prevented, and the shape of the surface of the inspection object can be inspected accurately.
  • either the grayscale image capturing unit or the slit light image capturing unit may be provided on the upstream side; however, considering the processing speed of the shape determination unit, the grayscale image capturing unit is preferably provided upstream, so that the processing of the pattern determination unit precedes that of the shape determination unit and no waiting time occurs in the processing of the shape determination unit.
  • the slit light image capturing unit preferably irradiates the slit light vertically and captures images from two directions, the upstream side and the downstream side in the transport direction of the inspection object, and the shape determination unit is preferably configured to combine the two images captured by the slit light image capturing unit and determine the suitability of the surface shape of the inspection object based on the combined image.
  • if the imaging direction of the slit light image capturing unit is a single direction, no image is obtained for surfaces lying in a blind spot with respect to that direction, and the suitability of the three-dimensional shape cannot be determined for those surfaces.
  • when imaging is performed from two opposite directions, such blind spots can be minimized, and the suitability of the three-dimensional shape can be determined for substantially the entire surface.
  • even when the inspection object has a dark pattern on its surface, the three-dimensional shape of the surface can therefore be inspected accurately.
  • FIG. 2 is a partial cross-sectional view taken in the direction of arrows A-A in FIG. 1; the other drawings include explanatory views showing the schematic structure of the A-surface and B-surface grayscale image capturing units, explanatory views showing the schematic structure of the A-surface and B-surface slit light image capturing units, a block diagram showing the structure of the inspection/sorting processing unit, and an explanatory view showing how the slit light is irradiated in the A-surface and B-surface slit light image capturing units.
  • the appearance inspection apparatus 1 of this example includes a supply unit 3 that supplies the inspection objects K in an aligned manner, a first linear transport unit 10 that linearly transports the supplied inspection objects K, a second linear transport unit 15, an A-surface grayscale image capturing unit 21 and an A-surface slit light image capturing unit 31 disposed in the vicinity of the transport path of the first linear transport unit 10, a B-surface grayscale image capturing unit 51 and a B-surface slit light image capturing unit 61 disposed in the vicinity of the transport path of the second linear transport unit 15, an inspection/sorting processing unit 20, and a sorting unit 80.
  • examples of the inspection object K in this example include pharmaceuticals (tablets, capsules, etc.), foods, machine parts, and electronic parts, but the inspection object is not limited to these.
  • the supply unit 3 includes a hopper 4 into which a large number of inspection objects K are loaded, a vibration feeder 5 that vibrates the inspection objects K discharged from the lower end of the hopper 4, a chute 6 down which the inspection objects K discharged from the transport end of the vibration feeder 5 slide, a horizontally rotating alignment table 7 that discharges the inspection objects K supplied from the chute 6 in a line, and a rotary transport unit 8 that holds the inspection objects K discharged from the alignment table 7 by suction on the outer peripheral surface of a disk-shaped member rotating in a vertical plane and conveys them; the supply unit 3 thus aligns a large number of inspection objects K in a line and sequentially transfers them to the first linear transport unit 10.
  • the first linear conveyance unit 10 and the second linear conveyance unit 15 have the same structure, and the second linear conveyance unit 15 is disposed in an upside down state with respect to the first linear conveyance unit 10, and the first linear conveyance unit 10 has a transport path in the upper part thereof, and the second linear transport unit 15 has a transport path in the lower part thereof.
  • FIG. 2 is a partial cross-sectional view in the direction of arrows AA in FIG. 1 and shows the structure of the first linear transport unit 10.
  • the reference numerals in parentheses indicate the corresponding members of the second linear transport unit 15.
  • the first linear transport unit 10 includes side plates 11 and 12 arranged to face each other at a predetermined interval, and endless round belts 13 and 14 that run along guide grooves formed on the upper surfaces of the side plates 11 and 12.
  • the space sandwiched between the side plates 11 and 12 is closed by the side plates 11 and 12 and other members (not shown) except for its open upper part, and is maintained at a negative pressure by a vacuum pump (not shown).
  • the second linear transport unit 15 is configured in the same way and includes side plates 16 and 17 and endless round belts 18 and 19; the space between the side plates 16 and 17 is likewise maintained at a negative pressure, so that a suction force due to the negative pressure acts between the round belts 18 and 19, and the inspection object K is held against the round belts 18 and 19 by suction and conveyed in the traveling direction as they run.
  • the transport start end of the first linear transport unit 10 is connected to the transport end of the rotary transport unit 8, and the transport end of the first linear transport unit 10 is connected to the transport start end of the second linear transport unit 15.
  • the first linear transport unit 10 sequentially receives the inspection objects K from the rotary transport unit 8, holds them by suction on their lower surfaces (B surfaces), transports them to its transport end, and delivers them to the second linear transport unit 15.
  • the second linear transport unit 15 sequentially receives the inspection objects K from the first linear transport unit 10, holds them by suction on their upper surfaces (A surfaces), and transports them to its transport end.
  • the sorting unit 80 is provided at the transport end of the second linear transport unit 15 and includes a sorting and collecting mechanism, a non-defective product collection chamber, and a defective product collection chamber (not shown); the sorting and collecting mechanism is driven according to commands from the inspection/sorting processing unit 20, so that, among the inspection objects K transported to the transport end of the second linear transport unit 15, non-defective products are collected in the non-defective product collection chamber and defective products are collected in the defective product collection chamber.
  • the A-surface grayscale image capturing unit 21 includes a hemispherical diffusing member 24 disposed above the transport path of the first linear transport unit 10 so as to cover the transport path while allowing the inspection objects K to pass through, a plurality of lamps 23, and a camera 22 that images the inside of the diffusing member 24 through an opening 24a provided at its top.
  • the light emitted from the lamp 23 is diffused when passing through the diffusing member 24, and becomes scattered light (diffused light) having no directivity, and illuminates the space covered by the diffusing member 24.
  • the upper surface (A surface) of the inspection object K carried into the diffusing member 24 by the first linear transport unit 10 is uniformly illuminated by the diffused light; because the entire surface is illuminated uniformly from all directions, shadows due to any unevenness of the upper surface (A surface) are suppressed and the shading (pattern) of the upper surface is emphasized.
  • the camera 22 is composed of a line sensor or an area sensor, and takes a grayscale image of the upper surface (A surface) of the inspection object K carried into the diffusion member 24 by the first linear transport unit 10 at a predetermined shutter speed.
  • the camera 22 transmits at least the images covering the entire upper surface (A surface), as frame images captured at each shutter, to the inspection/sorting processing unit 20.
  • in this way, the upper surface (A surface) of the inspection object K, uniformly illuminated by the diffused light and with its shading thereby emphasized, is imaged by the camera 22, and the captured grayscale image is transmitted to the inspection/sorting processing unit 20.
  • the B-side gray image capturing unit 51 includes a diffusing member 54, a plurality of lamps 53, and a camera 52 having the same configuration as the A-side gray image capturing unit 21, and is vertically inverted from the A-side gray image capturing unit 21. Thus, it is disposed in the vicinity of the second linear conveyance section 15.
  • the reference numerals in parentheses indicate the corresponding members of the B surface grayscale image capturing unit 51.
  • the lower surface (B surface) of the inspection object K conveyed by the second linear transport unit 15 is uniformly illuminated by diffused light through the action of the lamps 53 and the diffusing member 54; with its shading thereby emphasized, it is imaged by the camera 52 through the opening 54a of the diffusing member 54, and the captured images covering at least the entire lower surface (B surface) are transmitted to the inspection/sorting processing unit 20.
  • the A-surface slit light image capturing unit 31 is disposed downstream of the A-surface grayscale image capturing unit 21 in the transport direction, above the transport path of the first linear transport unit 10, and, as shown in FIG. 4, includes a camera 32, a slit light irradiator 33 that emits band-shaped slit light L1, mirrors 34 and 35 that guide the slit light L1 emitted from the slit light irradiator 33 to directly below the camera 32 and onto the transport path of the first linear transport unit 10, and further mirrors, including a mirror 36, that direct the reflected light L2 of the slit light L1 irradiated onto the transport path toward the camera 32 from the upstream side in the transport direction (arrow direction) of the first linear transport unit 10.
  • the slit light irradiator 33 and the mirrors 34 and 35 are configured to irradiate the slit light L1 vertically downward so that its irradiation line is orthogonal to the transport direction (arrow direction) of the inspection object K conveyed by the first linear transport unit 10.
  • when the slit light L1 is irradiated onto the inspection object K being conveyed by the first linear transport unit 10, the camera 32 receives the reflected light L2 of the slit light L1 from the upstream side in the transport direction (arrow direction) of the inspection object K and the reflected light L3 from the downstream side, and captures an image for each.
  • in other words, the camera 32 captures images of the irradiation line of the slit light L1 as seen from the two directions.
  • FIG. 7 shows the imaging form of the camera 32 in FIG. 4 as a simple and equivalent form that is easy to understand.
  • the camera 32 is an area sensor composed of elements arranged in multiple rows and columns, and receives the reflected light L2 and L3 to generate image data consisting of pixels in multiple rows and columns, each pixel having luminance data.
  • FIG. 8 shows an example of an image obtained by imaging one of the reflected lights (for example, the reflected light L2).
  • with X denoting the direction orthogonal to the transport direction and Y the transport direction, the image part Ls corresponding to the surface of the inspection object K is shifted in the Y direction relative to the image part Lb corresponding to the base surface (see FIG. 8).
  • because the imaging direction intersects the irradiation direction of the slit light, this arrangement corresponds to the so-called light section (light cutting) method.
  • looking at a pixel (Xi, Yi) of the image Ls corresponding to the surface of the inspection object K, the height of the surface of the inspection object K at the position Xi above the base surface can be calculated by geometric techniques from the difference between the pixel position Yj of the image Lb corresponding to the base surface and the pixel position Yi of the image Ls.
  • in this example the height of the surface of the inspection object K is not calculated directly, but the image captured by the camera 32 contains height information based on this light section method.
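
For concreteness, the following is a minimal sketch, not taken from the patent, of how a height value could be recovered from the Y-direction shift of the slit line; the pixel pitch and viewing angle used here are assumed, illustrative parameters.

```python
import math

def height_from_shift(y_base: float, y_surface: float,
                      pixel_pitch_mm: float = 0.05,
                      view_angle_deg: float = 45.0) -> float:
    """Estimate the height of a surface point above the base plane.

    The slit light is projected vertically downward; the camera views the
    irradiation line at view_angle_deg from the vertical, so a point raised
    by h appears shifted along Y by h * tan(view_angle) on the base plane.
    y_base is the Y pixel of the line on the base surface (Lb), y_surface
    the Y pixel on the object surface (Ls).
    """
    shift_px = y_base - y_surface                    # Y-direction shift in pixels
    shift_mm = shift_px * pixel_pitch_mm             # shift on the base plane
    return shift_mm / math.tan(math.radians(view_angle_deg))

# Example: a 12-pixel shift under the assumed geometry
print(round(height_from_shift(200.0, 188.0), 3))     # 0.6 (mm)
```
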
  • the image data captured in this way is transmitted from the camera 32 to the inspection/sorting processing unit 20; specifically, for each pixel column, position data (Xi, Yi) consisting of the pixel position Xi in the X direction and the pixel position Yi having the maximum luminance in that column is transmitted to the inspection/sorting processing unit 20 as the image data.
  • this reduces the amount of data to be transmitted, so the transmission speed and the processing speed in the inspection/sorting processing unit 20 can be increased and rapid processing becomes possible.
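
As a rough illustration of this data reduction, the sketch below (an assumption, not code from the patent) collapses one camera frame to the row index of maximum luminance in each pixel column.

```python
import numpy as np

def frame_to_peak_positions(frame: np.ndarray) -> np.ndarray:
    """Reduce a (rows, cols) luminance frame to per-column peak positions.

    Returns (Xi, Yi) pairs: Xi is the column index, Yi the row index with
    the maximum luminance in that column, i.e. the position data sent to
    the inspection/sorting processing unit instead of the whole frame.
    """
    y_peaks = np.argmax(frame, axis=0)               # brightest row per column
    x_positions = np.arange(frame.shape[1])
    return np.stack([x_positions, y_peaks], axis=1)

# Small synthetic frame: one bright pixel per column
frame = np.zeros((8, 4), dtype=np.uint8)
frame[5, 0] = frame[3, 1] = frame[3, 2] = frame[6, 3] = 255
print(frame_to_peak_positions(frame))                # [[0 5] [1 3] [2 3] [3 6]]
```
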
  • the camera 32 captures images in the two directions at a predetermined shutter speed and transmits to the inspection/sorting processing unit 20, as frame images obtained at each shutter, at least the image data acquired while the slit light is irradiated on the upper surface of the inspection object K.
  • the A-side slit light image capturing unit 31 captures an image including height information of the upper surface (A surface) of the inspection target K, and transmits the image to the inspection / sorting processing unit 20.
  • the B-surface slit light image capturing unit 61 is disposed downstream of the B-surface grayscale image capturing unit 51 in the transport direction, has the same configuration as the A-surface slit light image capturing unit 31, including a camera 62, a slit light irradiator 63, and mirrors 64, 65, 66, 67, 68, and 69, and is disposed in the vicinity of the second linear transport unit 15 with the arrangement of the A-surface slit light image capturing unit 31 turned upside down.
  • reference numerals in parentheses indicate corresponding members of the B-side slit light image capturing unit.
  • the camera 62 receives the reflected light of the slit light irradiated onto the lower surface (B surface) of the inspection object K conveyed by the second linear transport unit 15 from the two directions upstream and downstream in the transport direction of the inspection object K, and transmits to the inspection/sorting processing unit 20, as frame images, the image data (for each pixel column, position data (Xi, Yi) consisting of the pixel position Xi in the X direction and the pixel position Yi having the maximum luminance in that column) acquired at least while the slit light is irradiated on the lower surface of the inspection object K.
  • the inspection / selection processing unit 20 includes an A-side pattern determination unit 25, an A-side shape determination unit 40, a B-side pattern determination unit 55, a B-side shape determination unit 70, and a selection control unit 81.
  • the A-surface pattern determination unit 25 includes an A-surface grayscale image storage unit 26 that stores the grayscale images of the A surface received from the A-surface grayscale image capturing unit 21, an A-surface grayscale image binarization processing unit 27 that binarizes the A-surface grayscale image with a predetermined reference value, an A-surface target part extraction processing unit 28 that extracts the image portion corresponding to the upper surface (A surface) of the inspection object K from the binarized image, an A-surface pattern feature extraction processing unit 29, and an A-surface pattern determination processing unit 30.
  • the grayscale image captured by the A-surface grayscale image capturing unit 21 and stored in the A-surface grayscale image storage unit 26 is a multi-valued image; this multi-valued image is binarized with a predetermined reference value, the image portion corresponding to the upper surface (A surface) of the inspection object K is extracted from the binarized image, the black portion (pattern portion) in the extracted image is further extracted, and the extracted black portion (pattern portion) is compared with a predetermined reference pattern to determine pass or fail.
  • for example, if no pattern such as printed characters should be present on the upper surface (A surface) of a proper inspection object K, the presence of an extracted black portion is judged to be a defect such as a stain; if a pattern such as printed characters is supposed to be on the surface, the extracted black portion (pattern portion) is compared with the proper pattern, and pass/fail is determined from the degree of match.
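
A minimal sketch of this binarization and black-portion comparison might look as follows; the threshold value, mismatch tolerance, and reference mask are illustrative assumptions, not values from the patent.

```python
from typing import Optional

import numpy as np

def judge_surface_pattern(gray: np.ndarray,
                          reference_black: Optional[np.ndarray] = None,
                          threshold: int = 100,
                          max_mismatch_ratio: float = 0.05) -> bool:
    """Binarize a grayscale image and judge the black (pattern) portion.

    gray            : multi-valued grayscale image of the extracted surface
    reference_black : boolean mask of the proper printed pattern, or None
                      when the proper surface should carry no pattern
    Returns True for pass, False for fail.
    """
    black = gray < threshold                         # black portion (stain / print)
    if reference_black is None:
        return not bool(black.any())                 # any black pixel -> defect
    mismatch = np.logical_xor(black, reference_black).mean()
    return bool(mismatch <= max_mismatch_ratio)      # degree of match with reference

# Example: a clean 4x4 surface that should carry no printing
print(judge_surface_pattern(np.full((4, 4), 200, dtype=np.uint8)))   # True
```
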
  • the A-surface shape determination unit 40 includes an A-surface slit light image storage unit 41, an A-surface luminance data conversion processing unit 42, an A-surface two-image composition processing unit 43, an A-surface shape feature extraction processing unit 44, and an A-surface shape determination processing unit 45.
  • the A-side slit light image storage unit 41 stores image data (frame images) in two directions received from the A-side slit light image capturing unit 31.
  • the A-surface luminance data conversion processing unit 42 reads out the frame images for the two directions stored in the A-surface slit light image storage unit 41 and performs the following processing: the position data derived from the height component is converted into luminance data set according to that height component, and new image data in which the height component is expressed by luminance is generated.
  • specifically, the A-surface luminance data conversion processing unit 42 first sequentially reads out the frame image data for one direction and, based on the pixel positions (Xi, Yi), converts the pixel position Yi corresponding to the height component into 256-gradation luminance data (as shown in FIG.), generating image data composed of the pixel position Xi and the luminance data; by converting all frame images in sequence, new image data (image data composed of two-dimensional in-plane position data and luminance data representing the height information at each position, hereinafter referred to as "luminance image data") is generated.
  • luminance image data is generated in the same manner for the image data of the other direction.
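
The conversion from peak positions to a luminance image could be sketched as below; the linear mapping of the Y shift onto 256 gradations and the y_range parameter are assumptions, since the document does not specify the exact mapping.

```python
import numpy as np

def positions_to_luminance_row(peaks: np.ndarray,
                               y_base: int,
                               y_range: int = 64) -> np.ndarray:
    """Convert one frame's (Xi, Yi) peak data into one luminance-image row.

    The Y displacement of the slit line from the base position y_base is
    mapped linearly onto 256 gradations, so a brighter pixel means a larger
    height component at that X position.
    """
    shifts = y_base - peaks[:, 1]                    # height component per column
    lum = np.clip(shifts, 0, y_range) * (255.0 / y_range)
    return lum.astype(np.uint8)

# Example with two columns; stacking one such row per frame (one frame per
# transport step) yields the two-dimensional "luminance image data".
print(positions_to_luminance_row(np.array([[0, 195], [1, 200]]), y_base=200))
```
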
  • the A-surface two-image composition processing unit 43 combines the two sets of luminance image data for the two directions, newly generated by the data conversion in the A-surface luminance data conversion processing unit 42, into one set of luminance image data.
  • when the inspection object K is imaged from obliquely above on the upstream side in the transport direction, the reflected light from the front part of the inspection object K is weak, and when it is imaged from obliquely above on the downstream side, the reflected light from the rear part is weak, so the image data for these parts becomes inaccurate.
  • FIG. 11A shows the image obtained when the image of the inspection object K of FIG. 7 captured from the upstream side in the transport direction is converted by the A-surface luminance data conversion processing unit 42, and FIG. 11B shows the converted image of the image captured from the downstream side in the transport direction.
  • in FIG. 11A the upper part of the image (the part surrounded by the white line) is inaccurate, and in FIG. 11B the lower part of the image (the part surrounded by the white line) is inaccurate.
  • therefore, by combining these two images, for example applying the data that is present where one of them is missing data and applying the average value where both have data, an image in which the entire upper surface (A surface) of the inspection object K is accurately represented can be obtained, as shown in FIG. 11(c).
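
The combination rule described above could look roughly like this; treating luminance 0 as "missing data" is an assumption made for illustration.

```python
import numpy as np

def combine_luminance_images(up: np.ndarray, down: np.ndarray) -> np.ndarray:
    """Merge the upstream-view and downstream-view luminance images.

    Where only one image has data, that data is used; where both have data,
    their average is used; where both are missing, zero remains.
    """
    up_f, down_f = up.astype(np.float32), down.astype(np.float32)
    has_up, has_down = up_f > 0, down_f > 0
    out = np.zeros_like(up_f)
    both = has_up & has_down
    out[both] = (up_f[both] + down_f[both]) / 2.0    # average where both exist
    out[has_up & ~has_down] = up_f[has_up & ~has_down]
    out[has_down & ~has_up] = down_f[has_down & ~has_up]
    return out.astype(np.uint8)

# Example: each view is missing a different region
up = np.array([[0, 120], [100, 110]], dtype=np.uint8)
down = np.array([[90, 130], [0, 110]], dtype=np.uint8)
print(combine_luminance_images(up, down))            # [[ 90 125] [100 110]]
```
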
  • when there is a missing portion 100 on the surface of the inspection object K, a blind spot portion 100a is produced if the camera 32 images from the direction indicated by the solid line, but the blind spot portion 100a can be imaged if the image is captured from the opposite direction (the direction indicated by the two-dot chain line).
  • the A-surface shape feature extraction processing unit 44 performs a process of extracting shape features based on the composite image generated by the A-surface two-image composition processing unit 43. Specifically, the composite image is smoothed by a so-called smoothing filter, and feature image data is generated by taking the difference between the obtained smoothed image data and the composite image data.
  • the synthesized image is obtained by converting the height component into luminance data, and the luminance represents the height of the upper surface (A surface) of the inspection object K.
  • by subtracting the smoothed (averaged) image from the composite image, an image in which large changes in the height direction of the upper surface (A surface) are emphasized can be obtained; for example, as shown in FIG. 12, by subtracting the smoothed image (FIG. 12B) from the composite image (FIG. 12A), features such as the number "678" stamped on the upper surface (A surface) are emphasized as dark portions, as shown in FIG. 12(c).
  • the spots under the numeral “7” are dark spots, which will be described later.
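
A minimal sketch of the smoothing-and-difference step follows; the kernel size is assumed, and scipy's uniform_filter stands in for the "so-called smoothing filter" mentioned above.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def extract_shape_features(combined: np.ndarray, kernel: int = 15) -> np.ndarray:
    """Emphasize abrupt height changes in the combined luminance image.

    Smoothing the image and subtracting the result from the original keeps
    only the places where the height (encoded as luminance) changes sharply,
    such as stamped characters or chipped edges.
    """
    img = combined.astype(np.float32)
    smoothed = uniform_filter(img, size=kernel)      # smoothing filter
    return img - smoothed                            # feature image data
```
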
  • based on the feature image related to the surface shape generated by the A-surface shape feature extraction processing unit 44, the A-surface shape determination processing unit 45 compares it with data on the proper surface shape and judges whether the shape is good or bad and whether or not any portion is missing.
  • at this time, the A-surface shape determination processing unit 45 receives the feature image related to the surface pattern generated by the A-surface pattern feature extraction processing unit 29, recognizes the region where the black portion exists, sets the region corresponding to that black-portion region in the feature image generated by the A-surface shape feature extraction processing unit 44 as a non-inspection region, and then makes the shape determination.
  • the black circular portion is a dark stain portion and is therefore a portion where the data is missing; if the image data generated by the A-surface shape feature extraction processing unit 44 were used as it is to judge the quality of the surface shape, even objects whose shape is actually proper would be determined to be defective.
  • therefore, the feature image related to the surface pattern generated by the A-surface pattern feature extraction processing unit 29 is received, the region where the black portion exists in that image is recognized, the corresponding region in the feature image related to the surface shape generated by the A-surface shape feature extraction processing unit 44 is set as a non-inspection region, and the quality of the shape is then determined.
  • specifically, the feature image generated by the A-surface shape feature extraction processing unit 44 is as shown in FIG. 12(c), and the feature image generated by the A-surface pattern feature extraction processing unit 29 is as shown in FIG. 12(d); the A-surface shape determination processing unit 45 recognizes the region indicated by the two-dot chain line in FIG. 12(d), sets the corresponding region (the region indicated by the two-dot chain line) in FIG. 12(c) as a non-inspection region, and determines the quality of the stamped shape on the image shown in FIG. 12(e).
  • the shape of the surface of the inspection object K can be inspected accurately.
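
To illustrate the non-inspection-region handling, here is a rough sketch under assumptions the patent does not fix: the black-region information is taken to be a boolean mask, and pass/fail is decided by counting strong feature responses outside that mask.

```python
import numpy as np

def judge_shape(shape_feature: np.ndarray,
                black_region_mask: np.ndarray,
                feature_threshold: float = 30.0,
                max_defect_pixels: int = 0) -> bool:
    """Judge the surface shape while ignoring dark-colored regions.

    shape_feature     : feature image from the shape feature extraction unit
    black_region_mask : True where the pattern side found a black portion;
                        this area is treated as a non-inspection region
    Returns True when the shape is judged good.
    """
    inspect = ~black_region_mask                     # drop the non-inspection area
    defects = (np.abs(shape_feature) > feature_threshold) & inspect
    return bool(defects.sum() <= max_defect_pixels)

# Example: a strong feature response inside the black region is ignored
feat = np.array([[0.0, 80.0], [0.0, 0.0]])
mask = np.array([[False, True], [False, False]])
print(judge_shape(feat, mask))                       # True
```
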
  • the B-surface pattern determination unit 55 includes a B-surface grayscale image storage unit 56, a B-surface grayscale image binarization processing unit 57, a B-surface target part extraction processing unit 58, a B-surface pattern feature extraction processing unit 59, and a B-surface pattern determination processing unit 60.
  • the B-surface grayscale image storage unit 56 corresponds to the A-surface grayscale image storage unit 26, the B-surface grayscale image binarization processing unit 57 to the A-surface grayscale image binarization processing unit 27, the B-surface target part extraction processing unit 58 to the A-surface target part extraction processing unit 28, the B-surface pattern feature extraction processing unit 59 to the A-surface pattern feature extraction processing unit 29, and the B-surface pattern determination processing unit 60 to the A-surface pattern determination processing unit 30; each has the same configuration and performs the same processing as its counterpart.
  • the B-side pattern determining unit 55 detects the feature related to the pattern on the lower surface (B-side) of the inspection object K, and determines its quality.
  • the B-surface shape determination unit 70 includes a B-surface slit light image storage unit 71, a B-surface luminance data conversion processing unit 72, a B-surface two-image composition processing unit 73, a B-surface shape feature extraction processing unit 74, and a B-surface shape determination processing unit 75.
  • the B-surface slit light image storage unit 71 corresponds to the A-surface slit light image storage unit 41, the B-surface luminance data conversion processing unit 72 to the A-surface luminance data conversion processing unit 42, the B-surface two-image composition processing unit 73 to the A-surface two-image composition processing unit 43, the B-surface shape feature extraction processing unit 74 to the A-surface shape feature extraction processing unit 44, and the B-surface shape determination processing unit 75 to the A-surface shape determination processing unit 45; each has the same configuration and performs the same processing as its counterpart.
  • the B-side shape determining unit 70 detects the feature related to the shape of the lower surface (B-side) of the inspection object K, and determines its quality.
  • the sorting control unit 81 receives the determination results from the A-surface pattern determination processing unit 30, the A-surface shape determination processing unit 45, the B-surface pattern determination processing unit 60, and the B-surface shape determination processing unit 75, and transmits a sorting signal to the sorting unit 80 at the timing when an inspection object K determined to be defective by any of these determinations reaches the sorting unit 80.
  • when the sorting unit 80 receives the sorting signal, it collects the inspection object K in the defective product collection chamber; when it does not receive the sorting signal, it collects the conveyed inspection object K in the non-defective product collection chamber.
  • according to the appearance inspection apparatus 1 of this example, while the inspection object K is transported by the first linear transport unit 10, the A-surface pattern determination unit 25 inspects the suitability of the pattern on the upper surface (A surface) of the inspection object K based on the image captured by the A-surface grayscale image capturing unit 21, and the A-surface shape determination unit 40 inspects the suitability of the shape of the upper surface (A surface) based on the image captured by the A-surface slit light image capturing unit 31; then, while the inspection object K is transported by the second linear transport unit 15, the B-surface pattern determination unit 55 inspects the suitability of the pattern on the lower surface (B surface) based on the image captured by the B-surface grayscale image capturing unit 51, and the B-surface shape determination unit 70 inspects the suitability of the shape of the lower surface (B surface) based on the image captured by the B-surface slit light image capturing unit 61; the pattern and shape of both the upper and lower surfaces of the inspection object K are thus automatically inspected.
  • when the A-surface shape determination unit 40 and the B-surface shape determination unit 70 extract the features related to the shape and determine the suitability of the shape, they receive the feature images related to the patterns extracted by the A-surface pattern determination unit 25 and the B-surface pattern determination unit 55, recognize the regions where black portions exist, and set the corresponding regions in the feature images related to the shape as non-inspection regions; therefore, the shape of the upper and lower surfaces can be inspected accurately even when dark-colored printed portions or stains are present on the upper and lower surfaces of the inspection object K.
  • the A-surface slit light image capturing unit 31 and the B-surface slit light image capturing unit 61 capture images from the two directions, upstream and downstream in the transport direction of the inspection object K, and the A-surface shape determination unit 40 and the B-surface shape determination unit 70 each combine the two obtained images into one image and determine the suitability of the shape of the upper or lower surface of the inspection object K based on the combined image; an image with as few blind spots as possible is thereby obtained, and the shape of the entire upper and lower surfaces can be inspected accurately.
  • data for the same inspection object K1 is stored in the A-surface grayscale image storage unit 26 before it is stored in the A-surface slit light image storage unit 41; accordingly, the processing from the A-surface grayscale image binarization processing unit 27 to the A-surface pattern determination processing unit 30 is executed prior to the processing from the A-surface luminance data conversion processing unit 42 to the A-surface shape determination processing unit 45, so the A-surface shape determination processing unit 45 can refer to the data from the A-surface pattern feature extraction processing unit 29 without waiting, and rapid processing is possible.
  • similarly, data for the same inspection object K1 is stored in the B-surface grayscale image storage unit 56 prior to the B-surface slit light image storage unit 71, so the B-surface shape determination processing unit 75 can refer to the data from the B-surface pattern feature extraction processing unit 59 without waiting, and rapid processing is possible.
  • once the data for the same inspection object K1 has been stored in the A-surface grayscale image storage unit 26 and the A-surface slit light image storage unit 41, the processing from the A-surface grayscale image binarization processing unit 27 to the A-surface pattern determination processing unit 30 and the processing from the A-surface luminance data conversion processing unit 42 to the A-surface shape determination processing unit 45 may be executed simultaneously in parallel.
  • similarly, the processing from the B-surface grayscale image binarization processing unit 57 to the B-surface pattern determination processing unit 60 and the processing from the B-surface luminance data conversion processing unit 72 to the B-surface shape determination processing unit 75 may be executed simultaneously in parallel.
  • alternatively, the A-surface slit light image capturing unit 31 may be disposed upstream of the A-surface grayscale image capturing unit 21, and the B-surface slit light image capturing unit 61 may be disposed upstream of the B-surface grayscale image capturing unit 51.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Textile Engineering (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to an appearance inspection apparatus capable of accurately inspecting the shape of an object even when the object has a dark-colored pattern on its surface. The appearance inspection apparatus is provided with surface pattern inspection means and surface shape inspection means arranged in the vicinity of a transport path for transporting an object to be inspected (K). The surface pattern inspection means comprises a grayscale image capturing unit (21, 51) for capturing a grayscale image by irradiating the object to be inspected (K) with diffused light, and a pattern determination unit for determining whether the surface pattern is acceptable based on the captured grayscale image. The surface shape inspection means comprises a slit light image capturing unit (31, 61) for capturing an image formed by irradiating the object to be inspected (K) with band-shaped slit light, and a shape determination unit for determining whether the surface shape is acceptable based on the captured image. The shape determination unit receives, from the pattern determination unit, information on a region in which at least a dark-colored portion of the surface of the object to be inspected (K) is present, sets the received region as a non-inspection region, and determines whether the shape is acceptable.
PCT/JP2010/070986 2009-12-11 2010-11-25 Appearance inspection apparatus WO2011070914A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2011545165A JP5654486B2 (ja) 2009-12-11 2010-11-25 Appearance inspection apparatus
CN201080056200.9A CN102713580B (zh) 2009-12-11 2010-11-25 Appearance inspection apparatus
KR1020127017899A KR101762158B1 (ko) 2009-12-11 2010-11-25 Appearance inspection apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-281084 2009-12-11
JP2009281084 2009-12-11

Publications (1)

Publication Number Publication Date
WO2011070914A1 true WO2011070914A1 (fr) 2011-06-16

Family

ID=44145461

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/070986 WO2011070914A1 (fr) 2009-12-11 2010-11-25 Appearance inspection apparatus

Country Status (4)

Country Link
JP (1) JP5654486B2 (fr)
KR (1) KR101762158B1 (fr)
CN (1) CN102713580B (fr)
WO (1) WO2011070914A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5457599B1 (ja) * 2013-09-12 2014-04-02 株式会社Windy Drug packaging system
JP2015135265A (ja) * 2014-01-17 2015-07-27 Nok株式会社 Surface shape inspection apparatus
JP2017133930A (ja) * 2016-01-27 2017-08-03 倉敷紡績株式会社 Distance image generation apparatus and method
JP2021177186A (ja) * 2013-07-16 2021-11-11 株式会社キーエンス Three-dimensional image processing apparatus and three-dimensional image processing method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6336735B2 (ja) * 2013-11-11 2018-06-06 第一実業ビスウィル株式会社 Appearance inspection apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3344995B2 (ja) * 2000-09-22 2002-11-18 東芝アイティー・ソリューション株式会社 Tablet surface inspection apparatus
JP2004317126A (ja) * 2003-04-10 2004-11-11 Renesas Technology Corp Solder printing apparatus
JP3640247B2 (ja) * 2002-06-21 2005-04-20 シーケーディ株式会社 Tablet appearance inspection apparatus and PTP packaging machine

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60200103A (ja) * 1984-03-26 1985-10-09 Hitachi Ltd Light section line extraction circuit
JPS61290311A (ja) * 1985-06-19 1986-12-20 Hitachi Ltd Apparatus and method for inspecting soldered portions
RU2196977C2 (ru) * 1997-06-17 2003-01-20 Yuki Engineering System Co., Ltd. Sheet packaging inspection device
JP2004061196A (ja) * 2002-07-26 2004-02-26 Toei Denki Kogyo Kk Undulating shape inspection apparatus using a two-dimensional laser displacement sensor
JP4166587B2 (ja) * 2003-01-24 2008-10-15 株式会社サキコーポレーション Appearance inspection apparatus and volume inspection method
JP4278536B2 (ja) * 2004-02-27 2009-06-17 サンクス株式会社 Surface shape detector
CN101082562B (zh) * 2007-06-28 2010-12-29 中国科学院安徽光学精密机械研究所 Apparatus for monitoring microparticle shape and scattering based on images
JP5025442B2 (ja) * 2007-12-10 2012-09-12 株式会社ブリヂストン Tire shape inspection method and apparatus
EP2599556B1 (fr) 2011-11-29 2021-06-30 General Electric Technology GmbH Method for cleaning an electrostatic precipitator

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021177186A (ja) * 2013-07-16 2021-11-11 株式会社キーエンス Three-dimensional image processing apparatus and three-dimensional image processing method
JP7150105B2 (ja) 2013-07-16 2022-10-07 株式会社キーエンス Three-dimensional image processing apparatus and three-dimensional image processing method
JP5457599B1 (ja) * 2013-09-12 2014-04-02 株式会社Windy Drug packaging system
JP2015054115A (ja) * 2013-09-12 2015-03-23 株式会社Windy Drug packaging system
JP2015135265A (ja) * 2014-01-17 2015-07-27 Nok株式会社 Surface shape inspection apparatus
JP2017133930A (ja) * 2016-01-27 2017-08-03 倉敷紡績株式会社 Distance image generation apparatus and method

Also Published As

Publication number Publication date
CN102713580A (zh) 2012-10-03
KR20120109547A (ko) 2012-10-08
CN102713580B (zh) 2014-11-12
JP5654486B2 (ja) 2015-01-14
KR101762158B1 (ko) 2017-07-27
JPWO2011070914A1 (ja) 2013-04-22


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080056200.9

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10835836

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011545165

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 5854/CHENP/2012

Country of ref document: IN

ENP Entry into the national phase

Ref document number: 20127017899

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 10835836

Country of ref document: EP

Kind code of ref document: A1