US20210150695A1 - Inspection device

Info

Publication number
US20210150695A1
Authority
US
United States
Prior art keywords
unit
fault
workpiece
surface shape
inspection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/622,674
Other languages
English (en)
Inventor
Teruaki Yogo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Opton Co Ltd
Original Assignee
Opton Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Opton Co Ltd filed Critical Opton Co Ltd
Assigned to OPTON CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: YOGO, TERUAKI
Publication of US20210150695A1 publication Critical patent/US20210150695A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/956Inspecting patterns on the surface of objects
    • G01N21/95607Inspecting patterns on the surface of objects using a comparative method
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8806Specially adapted optical and illumination features
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/174Segmentation; Edge detection involving the use of two or more images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N2021/845Objects on a conveyor
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8806Specially adapted optical and illumination features
    • G01N2021/8829Shadow projection or structured background, e.g. for deflectometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30116Casting

Definitions

  • the present disclosure relates to an inspection device to inspect a molding fault in a molded article.
  • molded articles produced by molding such as press-forming and injection molding have been inspected by a visual inspection and a manual operation of a measuring instrument to inspect the presence or absence of a molding fault.
  • the molding fault as used herein is a portion corresponding to a molding defect, including a flaw, dent, missing part, burr, crack, dross, and peeling of plating generated during a molding process.
  • the visual inspection and the manual operation of the measuring instrument may require a relatively long time to inspect one molded article. Thus, the time and cost required for a total inspection of mass-produced molded articles may become unacceptable. When the time and cost required for the total inspection are unacceptable, there has been no choice but to conduct a sampling inspection in which a sample is taken from a whole lot and inspected.
  • Patent Document 1 Japanese Unexamined Patent Application Publication No. 2015-114309
  • As a technique to measure a surface shape of a measurement target with high speed and high accuracy, a technique of three-dimensional image measurement as described in Patent Document 1, for example, has been known.
  • the inventors applied this type of technique of the three-dimensional image measurement, and earnestly developed a system in which the presence or absence of the molding faults in the molded articles can be inspected at a high speed with high accuracy, and in which the total inspection of the mass-produced molded articles can be realized.
  • One aspect of the present disclosure is, preferably, to inspect the presence or absence of the molding fault in the molded articles at a high speed with high accuracy by using the three-dimensional image measurement.
  • the inspection device in one embodiment of the present disclosure includes a projection unit, an imaging unit, a measuring unit, a comparing and detecting unit, and a fault specifying unit.
  • the projection unit is configured to project a specific optical pattern onto a given imaging range.
  • the imaging unit is configured to take an image including the optical pattern projected onto an inspection target placed within the imaging range.
  • the measuring unit is configured to measure a three-dimensional surface shape of the inspection target based on the optical pattern included in the image taken by the imaging unit.
  • the comparing and detecting unit collates the three-dimensional surface shape measured by the measuring unit and a given nominal contour data representing a three-dimensional surface shape of a non-defective product corresponding to the inspection target.
  • the comparing and detecting unit is configured to detect a portion recognized to show a shape different from the three-dimensional surface shape of the non-defective product in the three-dimensional surface shape of the inspection target, as a possible molding fault, together with a dimension of the shape.
  • the fault specifying unit is configured to specify, as a molding fault, only a possible molding fault in which a dimension of a shape of the possible molding fault is equal to or more than a given reference representing a criterion for the molding fault, among a plurality of the possible molding faults detected by the comparing and detecting unit.
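  • As a rough illustration of how these units could interact, the following minimal sketch (in Python, with hypothetical function names and data layouts; the disclosure does not prescribe any particular implementation) treats the measured surface and the nominal contour data as height maps on a common grid, flags deviating points as possible molding faults, and keeps only those whose dimension reaches the criterion value:

```python
import numpy as np
from scipy import ndimage  # assumed available for connected-region labeling


def detect_possible_faults(measured, nominal, allowable_mm=0.05):
    """Comparing and detecting unit (sketch): flag points whose height deviates
    from the nominal contour by more than an allowable value."""
    deviation = np.abs(measured - nominal)      # per-point deviation map (mm)
    return deviation > allowable_mm             # boolean mask of possible faults


def specify_faults(fault_mask, pixel_pitch_mm, criterion_mm=0.1):
    """Fault specifying unit (sketch): keep only connected deviating regions
    whose dimension is equal to or more than the criterion value."""
    labels, count = ndimage.label(fault_mask)
    faults = []
    for i in range(1, count + 1):
        ys, xs = np.nonzero(labels == i)
        length_mm = max(ys.max() - ys.min(), xs.max() - xs.min()) * pixel_pitch_mm
        if length_mm >= criterion_mm:
            faults.append({"length_mm": length_mm,
                           "position_px": (int(ys.mean()), int(xs.mean()))})
    return faults


# usage: two 200x200 height maps sampled at 0.05 mm/pixel, with a simulated dent
nominal = np.zeros((200, 200))
measured = nominal.copy()
measured[50:53, 80:90] += 0.3
print(specify_faults(detect_possible_faults(measured, nominal), pixel_pitch_mm=0.05))
```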
  • According to the inspection device of the present disclosure, it is possible to inspect the presence or absence of the molding fault in the molded article at a high speed with high accuracy by using the three-dimensional image measurement.
  • the inspection device of the present disclosure collates the three-dimensional surface shape of the inspection target measured by the method of the three-dimensional image measurement, and the nominal contour data representing the three-dimensional surface shape of the non-defective product, thereby detecting the possible molding fault in the inspection target.
  • the inspection device of the present disclosure is configured to specify, as a molding fault, only a possible molding fault in which a dimension of a shape of the possible molding fault is equal to or more than the given criterion value for the molding fault, among the detected possible molding faults.
  • In the three-dimensional image measurement, for example, when the resolution of the image taken by the imaging unit is sufficiently high, even a minute flaw that cannot be a defect as a product may be detected. However, it would be an excessive quality for the inspection to determine that all the minute flaws that cannot be defects as a product are molding faults.
  • By excluding the minute flaw that does not meet the criterion value from the target molding faults, it is possible to accurately detect only the molding fault having a dimension that can be a defect as a product.
  • the inspection device is configured such that the criterion value can be arbitrarily set and changed by a user.
  • the fault specifying unit is configured to specify the molding fault by using the criterion value that is arbitrarily set and changed by the user.
  • the inspection device is configured such that the criterion value can be separately set to each region formed by dividing the three-dimensional surface shape represented by the nominal contour data into two or more regions.
  • the fault specifying unit is configured to specify the molding fault by using a criterion value of a region corresponding to a position of the possible molding fault detected by the comparing and detecting unit.
  • For example, a portion where the material is formed into a steep shape by the molding process and a portion having a gentle surface may differ in the properties of the molding defects generated on their surfaces.
  • the inspection device may further include an output unit. A grid number is assigned to each grid formed by dividing the three-dimensional surface shape represented by the nominal contour data at given intervals.
  • the output unit is configured to output information indicating the molding fault specified by the fault specifying unit in association with the grid number of the grid corresponding to the position of the molding fault.
  • the output unit may be configured to put a mark showing a type of the molding fault specified by the fault specifying unit and a grid number corresponding to the position of the molding fault onto the inspection target.
  • On the surface of the molded article, a scratch may be generated by friction with the die.
  • the scratch caused by the die is formed as a portion exhibiting a stronger gloss than the periphery thereof, for example.
  • Such a scratch is not treated as a molding fault if the shape of the product has no problem.
  • However, by detecting such a scratch, a user can confirm that the time to correct the die is approaching.
  • the inspection device may further include a gloss measuring unit, a scratch specifying device, and a warning output unit.
  • the gloss measuring unit is configured to measure a degree of a gloss on the surface of the inspection target from the image taken by the imaging device.
  • the scratch specifying device is configured to specify a portion satisfying a given criterion for a scratch as a scratch based on a three-dimensional surface shape measured by the measuring unit and a distribution of the degree of the gloss measured by the gloss measuring unit.
  • the warning output unit is configured to output a warning related to the scratch specified by the scratch specifying device.
  • the inspection device further includes at least one standard arranged in the imaging range.
  • the measuring unit is configured to also measure a three-dimensional surface shape of the standard imaged together with the inspection target by the imaging unit. Then, the determiner collates the three-dimensional surface shape of the standard measured by the measuring unit and given calibration data representing a normal three-dimensional surface shape corresponding to the standard, and determines an appropriateness of an inspection result related to the inspection target depending on whether the three-dimensional surface shape of the standard conforms to the calibration data.
  • the inspection device includes two or more imaging units configured to each image a different portion of the inspection target.
  • the measuring unit is configured to measure a three-dimensional surface shape of each portion with respect to an image taken by each of the two or more imaging units.
  • the comparing and detecting unit is configured to collate the three-dimensional surface shape of each portion measured by the measuring unit and the nominal contour data prepared for each portion, and the comparing and detecting unit is configured to detect the possible molding fault.
  • the inspection device includes a robot arm, a first projection unit, a first imaging unit, a second projection unit, and a second imaging unit.
  • the robot arm is configured to grasp and carry an inspection target that comes from a previous manufacturing process, and the robot arm is configured to place the inspection target on a conveyance device for loading and conveying the inspection target to a next manufacturing process.
  • the first projection unit is configured to project the optical pattern onto a first surface of two surfaces composing a front and a back of the inspection target.
  • the first imaging unit is configured to take an image including the optical pattern projected onto the first surface.
  • the second projection unit is configured to project the optical pattern onto a second surface of the two surfaces composing the front and the back of the inspection target.
  • the second imaging unit is configured to take an image including the optical pattern projected onto the second surface.
  • the first projection unit and the first imaging unit are arranged at a position where projection and imaging of the optical pattern can be performed with respect to the inspection target that is grasped and carried by the robot arm.
  • the robot arm is configured to perform an operation of grasping the inspection target and an operation of directing the first surface towards the first projection unit and the first imaging unit, and then to place the inspection target on the conveyance device with the second surface facing upward.
  • the second projection unit and the second imaging unit are arranged at a position where the projection and the imaging of the optical pattern can be performed on the second surface of the inspection target placed on the conveyance device.
  • FIG. 1 is a perspective view showing an external appearance of an inspection system.
  • FIG. 2 is a block diagram schematically showing a configuration of the inspection system.
  • FIG. 3 A is a block diagram schematically showing a configuration of a small-sized monocular camera unit.
  • FIG. 3 B is a block diagram schematically showing a configuration of a large-sized binocular camera unit.
  • FIG. 4 is a view showing one example of grid lines set to a three-dimensional surface shape of a nominal contour data.
  • FIG. 5 is a flowchart showing a procedure of an inspection process executed by a controller.
  • A configuration of an inspection system 1 of an embodiment will be described with reference to FIG. 1 and FIG. 2.
  • arrows indicating directions of top, bottom, front, rear, left, and right in FIG. 1 are described to promote understanding of relations between components.
  • the present disclosure should not be limited to the directions of the arrows in FIG. 1 .
  • the inspection system 1 is attached in association with a manufacturing line that conveys a workpiece 100 , as an inspection target.
  • the manufacturing line in which the inspection system 1 is applied includes a press machine 2 , a conveying conveyor 3 , and a defective product discharging conveyor 4 .
  • the press machine 2 produces the workpiece 100 by processing a metal material by press molding.
  • the conveying conveyor 3 conveys the workpiece 100 that comes out of the press machine 2 to a next manufacturing process.
  • the workpiece 100 is a molded article produced by the press machine 2 .
  • the workpiece 100 has unevenness on the surface thereof.
  • the inspection system 1 is configured to inspect the presence or absence of a molding fault for all the workpieces 100 while the workpiece 100 that comes out of the press machine 2 is conveyed to a next manufacturing process.
  • the molding fault that is an inspection target for the inspection system 1 is a part corresponding to a molding defect including a flaw, dent, missing part, burr, crack, dross, and peeling of plating.
  • the inspection system 1 includes a workpiece pick-up robot 10 , a first imaging unit 20 , a second imaging unit 30 , standard gauges 40 a , 40 b , a defective workpiece discharger 50 , a marking robot 60 , and a controller 70 .
  • the conveying conveyor 3 is arranged in front of the press machine 2 .
  • the conveying conveyor 3 is a carrying device to convey the workpiece 100 that comes out of the press machine 2 to a next manufacturing process.
  • the defective product discharging conveyor 4 is arranged beside the conveying conveyor 3 .
  • the defective product discharging conveyor 4 is a carrying device to discharge the workpiece 100 , in which the molding fault is detected, from the manufacturing line.
  • the conveying conveyor 3 and the defective product discharging conveyor 4 may be embodied as a belt conveyer and a roller conveyer, for example.
  • the conveying conveyor 3 and the defective product discharging conveyor 4 convey the workpiece 100 placed thereon forward.
  • the workpiece pick-up robot 10 is arranged near an outlet through which the workpiece 100 comes out from the press machine 2 .
  • the workpiece pick-up robot 10 is an articulated robot arm with a vacuum gripper at the tip thereof to suck the workpiece 100 .
  • the workpiece pick-up robot 10 is operated based on control by the controller 70 .
  • the workpiece pick-up robot 10 sucks and grasps the workpiece 100 that comes out of the press machine 2 with the vacuum gripper and lifts it upward. Then, the workpiece pick-up robot 10 performs an operation of carrying the grasped workpiece 100 and placing it on the conveying conveyor 3 or the defective product discharging conveyor 4 .
  • the workpiece 100 has two surfaces composing front and back.
  • One surface of the workpiece 100 (for example, a back side surface) is referred to as a first surface
  • the other surface (for example, a front side surface) is referred to as a second surface.
  • the workpiece pick-up robot 10 first grasps the second surface side of the workpiece 100 that comes out of the press machine 2 and lifts the workpiece 100 . Then, while holding the workpiece 100 , the workpiece pick-up robot 10 performs an operation of carrying the workpiece 100 within a given imaging range of the first imaging unit 20 provided in the vicinity thereof, and directing the first surface of the workpiece 100 towards the first imaging unit 20 in a predetermined posture. In a state where the first surface of the workpiece 100 is directed towards the first imaging unit 20 , an image of the first surface of the workpiece 100 is taken by the first imaging unit 20 .
  • the first imaging unit 20 is a device to image the first surface side of the workpiece 100 lifted and held by the workpiece pick-up robot 10 .
  • a description will be given of a case where the first imaging unit 20 is configured by two camera units 21 a , 21 b .
  • the configuration is not limited to this, and the first imaging unit 20 may be configured by one camera unit, or the first imaging unit 20 may be configured by three or more camera units.
  • the first imaging unit 20 is attached to a pole 5 standing beside the conveying conveyor 3 .
  • the camera units 21 a , 21 b may be attached to the pole 5 preferably at a position where each of the camera units 21 a , 21 b can image a whole first surface of the workpiece 100 , which is held by the workpiece pick-up robot 10 , from different directions.
  • the camera units 21 a , 21 b may be attached so that the camera unit 21 a captures one region, which is formed by dividing the whole first surface of the workpiece 100 into two regions, within the imaging range, and the camera unit 21 b captures the other region within the imaging range.
  • the camera units 21 a , 21 b output the data of the taken image to the controller 70 .
  • In the controller 70 , a three-dimensional surface shape of the first surface of the workpiece 100 is measured from the taken image of the first surface, and then the presence or absence of the molding fault on the first surface of the workpiece 100 is inspected.
  • the workpiece pick-up robot 10 puts the workpiece 100 , which passed the inspection, on the conveying conveyor 3 with the second surface side up.
  • the workpiece pick-up robot 10 places the workpiece 100 , which is determined to be defective, on the defective product discharging conveyor 4 based on the control of the controller 70 .
  • the workpiece 100 placed on the conveying conveyor 3 by the workpiece pick-up robot 10 is conveyed forward by the conveying conveyor 3 . Then, at the timing when the conveyed workpiece 100 reaches a given imaging range of the second imaging unit 30 , the second imaging unit 30 takes an image of the second surface of the workpiece 100 based on the control of the controller 70 .
  • the second imaging unit 30 is a device that images the second surface side of the workpiece 100 , which is conveyed by the conveying conveyor 3 , from above.
  • a description will be given of a case where the second imaging unit 30 is configured by two camera units 31 a , 31 b .
  • the configuration is not limited to this, and the second imaging unit 30 may be configured by one camera unit, or the second imaging unit 30 may be configured by three or more camera units.
  • the second imaging unit 30 is attached to a frame structure 6 arranged further to the front than the first imaging unit 20 .
  • the frame structure 6 is a frame-shaped structure formed of poles and beams arranged to stride over the conveying conveyor 3 and the defective product discharging conveyor 4 arranged side by side.
  • the camera units 31 a , 31 b may be attached to the frame structure 6 preferably at positions where each of them can image a whole second surface of the workpiece 100 from different directions when the workpiece 100 conveyed on the conveying conveyor 3 reaches a given imaging position.
  • the camera units 31 a , 31 b may be attached so that the camera unit 31 a captures a first region, which is formed by dividing the whole second surface of the workpiece 100 into two regions, within an imaging range, and the camera unit 31 b captures a second region within the imaging range.
  • Each camera unit 31 a , 31 b of the second imaging unit 30 outputs the data of the taken image to the controller 70 .
  • When the second imaging unit 30 takes the image of the second surface of the workpiece 100 , a three-dimensional surface shape of the second surface of the workpiece 100 is measured from the taken image in the controller 70 , and the presence or absence of the molding fault in the second surface of the workpiece 100 is inspected.
  • Each camera unit 21 a , 21 b , 31 a , 31 b of the first imaging unit 20 and the second imaging unit 30 is configured to project light having a given wavelength and frequency to form an image of a given stripe fringe pattern.
  • camera units 21 a , 21 b , 31 a , 31 b are configured to image a region on which the light is projected (this region is the imaging range).
  • As the projected light, a light made by combining different colors selected from the three primary colors of red, blue and green is used.
  • a configuration may be adopted where several types of stripe fringe patterns, in which a light quantity distribution of each stripe in the fringe pattern is modulated to a sinusoidal wave, can be projected while shifting the phase.
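  • As an illustration only, phase-shifted sinusoidal stripe patterns of the kind described above could be generated for the projector as sketched below (the image size, fringe period, and number of shifts are assumed values, not values given in the disclosure):

```python
import numpy as np


def fringe_patterns(width=1280, height=800, period_px=32, steps=4):
    """Generate `steps` sinusoidal stripe images, each shifted in phase by
    2*pi/steps, scaled to 8-bit values for projection."""
    x = np.arange(width)
    patterns = []
    for k in range(steps):
        phase = 2 * np.pi * k / steps
        stripe = 0.5 + 0.5 * np.cos(2 * np.pi * x / period_px + phase)
        patterns.append(np.tile((255 * stripe).astype(np.uint8), (height, 1)))
    return patterns  # list of height x width images sent to the projector
```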
  • A configuration of each camera unit 21 a , 21 b , 31 a , 31 b used in the first imaging unit 20 and the second imaging unit 30 will be described with reference to FIG. 3 A and FIG. 3 B.
  • When each camera unit 21 a , 21 b , 31 a , 31 b is not particularly distinguished, it is simply referred to as a camera unit.
  • FIG. 3A shows an example of a configuration of a small-sized monocular camera unit including one projector 22 and one camera 23 .
  • the projector 22 is a light emitting device configured to selectively project an image formed by light having a given stripe fringe pattern and an image formed by light having no pattern. Examples of a light source used in the projector 22 may include a light emitting diode and a laser diode.
  • the projector 22 may preferably adopt a configuration in which several types of stripe fringe patterns, in which the light quantity distribution of each stripe in the fringe pattern is modulated to the sinusoidal wave, can be projected while shifting the phase. With this configuration, by using the known phase-shift method, highly accurate measurement can be achieved.
  • the camera 23 is an imaging device to take an image in an area on which the light emitted from the projector 22 is projected.
  • the camera 23 may include a known image sensor element, such as a CCD image sensor and a CMOS image sensor.
  • the projector 22 and the camera 23 are positioned and arranged in a casing 24 such that the area where a light is projected from one projector 22 overlaps with a visual field of one camera 23 .
  • This small-sized monocular camera unit is suitably used for taking an image of a relatively small inspection target.
  • FIG. 3B shows an example of a configuration of a large-sized binocular camera unit including one projector 22 and two cameras 23 .
  • the projector 22 and the cameras 23 are positioned and arranged in a casing 25 such that two regions, which are formed by dividing the area of the light projected by the one projector 22, respectively overlap with the visual fields of the two cameras 23.
  • This large-sized binocular camera unit is suitably used for taking an image of a relatively large inspection target.
  • a standard gauge 40 a is attached in the vicinity of the tip of the arm of the workpiece pick-up robot 10 .
  • the standard gauge 40 a is a standard used for an accuracy evaluation of the three-dimensional image measurement based on the image taken by the first imaging unit 20 .
  • the standard gauge 40 a is attached to the workpiece pick-up robot 10 at a position where the standard gauge 40 a comes in the imaging range of the first imaging unit 20 when the workpiece pick-up robot 10 is in the posture of directing the first surface of the workpiece 100 towards the first imaging unit 20 .
  • a standard gauge 40 b is attached to the frame structure 6 below the second imaging unit 30 .
  • the standard gauge 40 b is a standard used for the accuracy evaluation of the three-dimensional image measurement based on the image taken by the second imaging unit 30 .
  • the standard gauge 40 b is attached to the frame structure 6 at a position where the standard gauge 40 b comes in the imaging range of the second imaging unit 30 .
  • Examples of these standard gauges 40 a , 40 b may preferably include a known standard, such as a block gauge and a step gauge.
  • The defective workpiece discharger 50 is provided on the frame structure 6 .
  • the defective workpiece discharger 50 is configured to execute an operation of discharging the workpiece 100 , which is determined as a defective product by the inspection based on the image taken by the second imaging unit 30 , from the conveying conveyor 3 .
  • the defective workpiece discharger 50 includes a grasping means and a transporting means.
  • the grasping means grasps and lifts the workpiece 100 placed on the conveying conveyor 3 .
  • the transporting means moves the grasping means in both ways between the upper part of the conveying conveyor 3 and the upper part of the defective product discharging conveyor 4 along the frame structure 6 .
  • Examples of the grasping means may preferably include a vacuum gripper or the like that can suck the workpiece 100 .
  • the defective workpiece discharger 50 lifts the workpiece 100 , which is determined as the defective product, from the conveying conveyor 3 , carries the workpiece 100 above the defective product discharging conveyor 4 , and places the workpiece 100 on the defective product discharging conveyor 4 .
  • the workpiece 100 which is determined as a passed product by the inspection based on the image taken by the second imaging unit 30 , is conveyed forward by the conveying conveyor 3 and supplied to a next manufacturing process.
  • the workpiece 100 which is determined as a defective product and placed on the defective product discharging conveyor 4 , is conveyed forward by the defective product discharging conveyor 4 and discharged from the manufacturing line.
  • the marking robot 60 is attached to the frame structure 6 on a side of the defective product discharging conveyor 4 and on a front side of the frame structure 6 .
  • the marking robot 60 performs an operation of putting a mark indicating the content of the molding fault detected in the workpiece 100 on the defective workpiece 100 conveyed by the defective product discharging conveyor 4 .
  • the marking robot 60 includes an ink jet print head at the tip of an articulated robot arm.
  • the marking robot 60 is preferably configured to print characters and symbols indicating a type of the molding fault and a position of the molding fault and the like on the surface of the workpiece 100 based on control of the controller 70 .
  • the controller 70 is an information processor mainly composed of a CPU, RAM, ROM, an input-output interface (not shown), and a memory 73 .
  • the memory 73 is an auxiliary memory including, for example, HDD and SSD.
  • the controller 70 may be embodied by a computer system and the like having an appropriate information processing ability. Functions of the controller 70 are performed by the CPU executing programs stored in a substantive memory medium such as the ROM and the memory 73 . It is to be noted that the number of the computers configuring the controller 70 may be one or more.
  • the controller 70 includes a measurement processor 71 and a system integrated management unit 72 as components to fulfill the functions. It is to be noted that the way to realize these elements configuring the controller 70 is not limited to software, and a part or all of these elements may be realized by hardware, that is, a combination of a logic circuit, an analog circuit, and the like.
  • the measurement processor 71 measures a three-dimensional surface shape of the workpiece 100 using the images taken by the first imaging unit 20 and the second imaging unit 30 . Then, the measurement processor 71 detects the molding fault from the three-dimensional surface shape. Specifically, the measurement processor 71 compares the three-dimensional surface shape measured from the workpiece 100 with a three-dimensional surface shape represented by given nominal contour data, and then specifies, as a possible molding fault, a portion having a dissimilar shape in the three-dimensional surface shape measured from the workpiece 100 .
  • For measuring the three-dimensional surface shape, a known pattern projection method is used.
  • In the pattern projection method, a given stripe fringe pattern is projected onto the inspection target, and then the three-dimensional surface shape is measured based on a degree of distortion of the stripe fringe pattern projected onto the inspection target.
  • One of the examples of this type of the pattern projection method includes a phase-shift method.
  • In the phase-shift method, the stripe fringe pattern, in which a projection intensity is modulated to the sinusoidal wave, is projected several times while shifting the phase to measure the three-dimensional surface shape.
  • the nominal contour data used for the detection of the molding fault is data representing a three-dimensional surface shape that serves as a criterion for a non-defective workpiece 100 .
  • the nominal contour data is pre-stored in the memory 73 and the like of the controller 70 .
  • several types of the nominal contour data, each of which corresponds to a respective region, may preferably be included.
  • the nominal contour data includes a nominal contour data corresponding to the three-dimensional surface shape of the first surface side of the workpiece 100 and a nominal contour data corresponding to the three-dimensional surface shape of the second surface side of the workpiece 100 .
  • Grid lines are added at given intervals (for example, every 50 mm) to the three-dimensional surface shape represented by the nominal contour data, and a specific number (hereinafter referred to as a grid number) is added to each grid of the three-dimensional surface shape divided by the grid lines.
  • In FIG. 4, a reference numeral 200 denotes the three-dimensional surface shape represented by the nominal contour data.
  • a reference numeral 201 denotes two or more grid lines drawn at equal intervals in an x axis direction and a y axis direction of the three-dimensional surface shape 200 .
  • a reference numeral 202 denotes each grid surrounded by two or more grid lines 201 .
  • a specific grid number is added to each of the two or more grids 202 divided by two or more grid lines 201 .
  • the controller 70 assigns the grid lines and the grid number to a three-dimensional surface shape represented by the nominal contour data, and stores them in the memory 73 and the like together with the nominal contour data.
  • the measurement processor 71 specifies the position of the molding fault detected in the workpiece 100 by associating with the corresponding grid number.
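  • For illustration, mapping a fault position to its grid number could look like the sketch below; the 50 mm pitch follows the example above, while the row-major numbering convention is an assumption:

```python
GRID_PITCH_MM = 50.0  # grid line spacing taken from the example above


def grid_number(x_mm, y_mm, n_cols):
    """Return the grid number of the grid containing position (x, y) on the
    nominal contour; grids are numbered row by row starting at 1, and n_cols
    is the number of grid columns covering the surface (assumed convention)."""
    col = int(x_mm // GRID_PITCH_MM)
    row = int(y_mm // GRID_PITCH_MM)
    return row * n_cols + col + 1


# usage: a fault at x = 132 mm, y = 57 mm on a surface spanning 8 grid columns
print(grid_number(132.0, 57.0, n_cols=8))  # -> 11
```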
  • the measurement processor 71 specifies a possible molding fault as the molding fault only when the dimension of the deformed portion is equal to or more than a given criterion value.
  • the dimension of the deformed portion includes a length, depth or height.
  • This criterion value represents a criterion of the dimension of the molding fault that may be a practical defect as a product when in use, and the criterion value is preregistered in the memory 73 and the like of the controller 70 . Also, it is preferable that this criterion value can be set and changed by a user of the inspection system 1 . In this case, the controller 70 saves the criterion value, which is input via a specified input device by the user, in the memory 73 .
  • That is, the criterion value defines the dimension of the molding fault that can be a practical defect as a product; it is not a minimum dimension of the deformed portion that is theoretically detectable based on the resolution of the taken image.
  • For example, when the criterion value is set to a dimension of 1/10 mm, a fine deformed portion less than the criterion value is not determined as the molding fault.
  • the controller 70 divides the three-dimensional surface shape of the nominal contour data into two or more regions, and saves the criterion value set to each of the two or more divided regions in the memory 73 . For example, by respectively setting different criterion values to a portion with steep bending and a gently processed portion, it is possible to accurately detect the molding fault peculiar to each portion.
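  • A per-region criterion lookup of this kind might be sketched as follows (the region names and values are illustrative assumptions, not values given in the disclosure):

```python
# criterion values in mm, keyed by region of the nominal contour (assumed values)
REGION_CRITERIA_MM = {"steep_bend": 0.05, "gentle_surface": 0.20}
DEFAULT_CRITERION_MM = 0.10


def criterion_for(region):
    """Return the criterion value registered for the region containing a
    possible molding fault, falling back to a default value."""
    return REGION_CRITERIA_MM.get(region, DEFAULT_CRITERION_MM)


def is_molding_fault(fault_dimension_mm, region):
    """Specify the possible fault as a molding fault only when its dimension
    is equal to or more than the criterion value of its region."""
    return fault_dimension_mm >= criterion_for(region)
```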
  • the measurement processor 71 is configured to perform, in addition to the inspection of the presence or absence of a molding fault, a diagnosis of the deterioration of a die in the press machine 2 based on a state of a scratch that may be formed on the surface of the workpiece 100 .
  • the scratch that may be formed on the surface of the molded article by friction between the metal material and the die exhibits a strong gloss.
  • the measurement processor 71 calculates, from the taken image of the workpiece 100 , a brightness value of each point in the image for a specific portion having the steep shape, and determines the presence or absence of the scratch caused by the friction with the die based on the calculated brightness value.
  • the system integrated management unit 72 integrally controls the operations of the manufacturing line including the conveying conveyor 3 and the defective product discharging conveyor 4 , and the inspection system 1 . Specifically, the system integrated management unit 72 interlocks and controls, in accordance with the manufacture of the workpiece 100 by the press machine 2 , the operation of each unit in the inspection system 1 and the operations of the conveying conveyor 3 and the defective product discharging conveyor 4 .
  • each unit in the inspection system 1 includes the workpiece pick-up robot 10 , the first imaging unit 20 , the second imaging unit 30 , the defective workpiece discharger 50 , and the marking robot 60 .
  • the system integrated management unit 72 performs the imaging of the workpiece 100 by controlling the first imaging unit at the timing when the workpiece 100 is placed in the imaging range of the first imaging unit 20 by controlling the workpiece pick-up robot 10 . Also, the system integrated management unit 72 controls the conveying conveyor 3 to convey the workpiece 100 to the imaging range of the second imaging unit 30 , and at that timing, the system integrated management unit 72 performs the imaging of the workpiece 100 by controlling the second imaging unit. In addition, the system integrated management unit 72 controls the defective product discharging conveyor 4 to convey a defective workpiece 100 to a work area of the marking robot 60 , and at that timing, the system integrated management unit 72 performs a marking on the surface of the workpiece 100 by controlling the marking robot 60 .
  • a procedure of an inspection process executed by the controller 70 will be described with reference to a flowchart of FIG. 5 .
  • This inspection process is executed with respect to each workpiece 100 when each workpiece 100 is placed in each imaging range of the first imaging unit 20 and the second imaging unit 30 by the workpiece pick-up robot 10 and the conveying conveyor 3 .
  • In S 100, the controller 70 images the first surface or the second surface of the workpiece 100 by controlling each camera unit of the first imaging unit 20 or the second imaging unit 30 when the workpiece 100 is placed in the imaging range. Specifically, based on the control by the controller 70, the projector 22 of each imaging camera unit alternately emits the image formed by the light having the stripe fringe pattern and the image formed by the light having no pattern. Then, the camera 23 of each imaging camera unit takes the fringe-patterned images and the non-patterned images projected onto the workpiece 100 and the standard gauges 40 a, 40 b. In S 102, after the imaging in S 100, each camera 23 transfers data of the fringe-patterned images and the non-patterned images to the measurement processor 71 of the controller 70.
  • In S 104, the controller 70 creates a height displacement map for each fringe-patterned image transferred in S 102.
  • the height displacement map is a map data representing a distribution of the height displacement of each pixel in the taken images.
  • For creating the height displacement map, a known phase-shift method may preferably be used. That is, the controller 70 calculates the height displacement for each pixel in the taken images by using the taken images in which the stripe fringe pattern modulated to the sinusoidal wave is projected several times while shifting the phase.
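  • A minimal sketch of the N-step phase-shift computation is shown below. It assumes the N frames are already registered grayscale images; converting the wrapped phase into an actual height displacement additionally requires phase unwrapping and a system calibration, which are omitted here:

```python
import numpy as np


def wrapped_phase(images):
    """N-step phase-shift decoding: `images` holds N frames taken while the
    sinusoidal fringe pattern is shifted by 2*pi/N between frames.
    Returns the wrapped phase per pixel in the range (-pi, pi]."""
    n = len(images)
    deltas = 2 * np.pi * np.arange(n) / n
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])
    numerator = -np.tensordot(np.sin(deltas), stack, axes=1)   # sum_k -I_k*sin(d_k)
    denominator = np.tensordot(np.cos(deltas), stack, axes=1)  # sum_k  I_k*cos(d_k)
    return np.arctan2(numerator, denominator)
```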
  • Also in S 104, the controller 70 creates a brightness map for each non-patterned image transferred in S 102.
  • the brightness map is map data representing a distribution of the brightness value of each pixel in the taken images.
  • In S 106, the controller 70 converts the height displacement map created in S 104 to a three-dimensional point group represented by a three-dimensional coordinate system. Then, with respect to each point in the converted three-dimensional point group, the measurement processor 71 assigns the brightness value of the corresponding point in the brightness map that is created in S 104 from the images taken by the same camera, and creates a three-dimensional point group/brightness data.
  • In S 108, the controller 70 integrates the three-dimensional point group/brightness data created from the images taken by the two or more camera units into the same three-dimensional coordinate system. Specifically, when the first imaging unit 20 performed the imaging, the controller 70 integrates the three-dimensional point group/brightness data created from the images taken by the two camera units 21 a, 21 b composing the first imaging unit 20 into one to obtain a three-dimensional point group/brightness data of a whole first surface of the workpiece 100.
  • Similarly, when the second imaging unit 30 performed the imaging, the controller 70 integrates the three-dimensional point group/brightness data created from the images taken by the two camera units 31 a, 31 b composing the second imaging unit 30 to obtain a three-dimensional point group/brightness data of a whole second surface of the workpiece 100.
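  • The conversion and integration steps of S 106 and S 108 might be sketched as follows (the per-camera calibration matrix, pixel pitch, and data layout are assumptions made only for illustration):

```python
import numpy as np


def to_point_group(height_map, brightness_map, pixel_pitch_mm, cam_to_world):
    """Convert a height displacement map and its brightness map into an
    (N, 4) array of [x, y, z, brightness] points in a common coordinate
    system, using an assumed 4x4 camera-to-world transform per camera unit."""
    h, w = height_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    homogeneous = np.column_stack([xs.ravel() * pixel_pitch_mm,
                                   ys.ravel() * pixel_pitch_mm,
                                   height_map.ravel(),
                                   np.ones(h * w)])
    world = homogeneous @ cam_to_world.T
    return np.column_stack([world[:, :3], brightness_map.ravel()])


def integrate(point_groups):
    """Merge the per-camera point group/brightness data (e.g. from camera
    units 21a and 21b) into one data set covering the whole surface."""
    return np.vstack(point_groups)
```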
  • In S 110, the controller 70 specifies a steep portion from the nominal contour data corresponding to the first surface or second surface of the workpiece 100 imaged in S 100.
  • the steep portion is a portion representing a steep shape having a bending radius smaller than a specified threshold in the three-dimensional surface shape represented by the nominal contour data.
  • In S 112, the controller 70 determines whether a distribution of the brightness values assigned to the portion corresponding to the steep portion falls under the scratch. Specifically, the controller 70 determines the presence of the scratch under a condition where a point exceeding a brightness threshold serving as a criterion for the scratch is continuously distributed beyond a specified dimensional range in the portion corresponding to the steep portion.
  • When it is determined that the distribution does not fall under the scratch (S 112: NO), the controller 70 advances the process to S 116.
  • When it is determined that the distribution falls under the scratch (S 112: YES), the controller 70 advances the process to S 114.
  • In S 114, warning information on the deterioration of the die in the press machine 2 is output to a given output destination (for example, a display device, a recording device, and the like).
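  • The check of S 110 and S 112 might be sketched as follows (the brightness threshold, the minimum extent, and the precomputed mask of steep portions are assumptions for illustration):

```python
import numpy as np
from scipy import ndimage  # assumed available for connected-region labeling


def die_scratch_present(brightness_map, steep_mask,
                        brightness_threshold=200, min_extent_px=30):
    """Within the steep portions of the surface, report a scratch when pixels
    brighter than the scratch threshold form a contiguous region that extends
    beyond the given dimensional range."""
    candidate = (brightness_map > brightness_threshold) & steep_mask
    labels, count = ndimage.label(candidate)
    for i in range(1, count + 1):
        ys, xs = np.nonzero(labels == i)
        if max(ys.max() - ys.min(), xs.max() - xs.min()) >= min_extent_px:
            return True  # a warning about die deterioration should be output
    return False
```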
  • In S 116, the controller 70 extracts data of the three-dimensional point group corresponding to the standard gauges 40 a, 40 b from the three-dimensional point group/brightness data integrated in S 108. Then, the controller 70 evaluates the accuracy of the three-dimensional image measurement using the extracted three-dimensional point group data. Specifically, the controller 70 collates the three-dimensional surface shape represented by the three-dimensional point group corresponding to the standard gauges 40 a, 40 b and a three-dimensional surface shape represented by a given calibration data. Then, the controller 70 evaluates the accuracy of the measurement based on a matching degree between the three-dimensional surface shape of the standard gauges 40 a, 40 b and the three-dimensional surface shape of the calibration data.
  • The calibration data used for the accuracy evaluation in S 116 is data representing a reference three-dimensional surface shape of the standard gauges 40 a, 40 b when the standard gauges 40 a, 40 b are appropriately imaged.
  • The calibration data is prestored in the memory 73 and the like of the controller 70.
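  • The accuracy evaluation of S 116 might be reduced to a comparison like the following sketch (here both shapes are assumed to be height samples on the same grid, and the tolerance is an illustrative value):

```python
import numpy as np


def measurement_accuracy_ok(measured_gauge, calibration_data, tolerance_mm=0.02):
    """Compare the measured three-dimensional shape of the standard gauge with
    the stored calibration data and accept the inspection result only when the
    root-mean-square deviation stays within the allowable range."""
    rms = np.sqrt(np.mean((measured_gauge - calibration_data) ** 2))
    return rms <= tolerance_mm
```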
  • In S 118, the controller 70 branches the process depending on whether the accuracy evaluated in S 116 is within a given allowable range for the measurement accuracy. When the evaluated accuracy deviates from the allowable range (S 118: NO), the controller 70 advances the process to S 120.
  • In S 120, the controller 70 outputs information to instruct a reinspection of the workpiece 100 to a given output destination (for example, a display device, a recording device, and the like).
  • When the evaluated accuracy is within the allowable range (S 118: YES), the controller 70 advances the process to S 122.
  • In S 122, the controller 70 collates the three-dimensional surface shape of the inspection target represented by the three-dimensional point group/brightness data integrated in S 108 and the three-dimensional surface shape represented by the nominal contour data corresponding to the inspection target, and then specifies a deformed point that can be the possible molding fault.
  • the controller 70 compares the three-dimensional surface shape of the inspection target and the three-dimensional surface shape of the nominal contour data, and then creates a deviation value distribution diagram showing a similarity/dissimilarity of the shape of the inspection target. Then, the controller 70 detects, as a defective element point, the deformed point that exceeds an allowable value for the molding quality in the created deviation value distribution diagram.
  • In S 124, the controller 70 selects the defective element points detected in S 122. Specifically, from the defective element points detected in S 122, the controller 70 selects a portion having two or more consecutive points (for example, three consecutive points or more) as the possible molding fault. On the other hand, the controller 70 removes other scattered defective element points. Also, the controller 70 calculates a dimension of the selected possible molding fault. Here, the dimension includes, for example, a length, a depth, or a height of a shape of the possible molding fault. In S 126, the controller 70 determines whether the dimension of the shape of the possible molding fault selected in S 124 is equal to or more than the criterion value for the molding fault.
  • the measurement processor 71 may conduct the determination using one type of the criterion value corresponding to the workpiece 100 of the inspection target. Alternatively, the measurement processor 71 may conduct the determination using several types of the criterion value respectively set for several portions of the workpiece 100 . When using the several types of the criterion value, the measurement processor 71 uses the criterion value corresponding to a portion where the possible molding fault exists to conduct the determination.
  • When no possible molding fault has a dimension equal to or more than the criterion value (S 126: NO), the controller 70 advances the process to S 128.
  • In S 128, the workpiece 100 that is determined in S 126 to have no molding fault is conveyed as a passed product based on the control of the controller 70.
  • the controller 70 controls the workpiece pick-up robot 10 as follows. That is, the controller 70 makes the workpiece pick-up robot 10 execute an operation in which the workpiece 100 held by the workpiece pick-up robot 10 is placed on the conveying conveyor 3 .
  • the controller 70 controls the conveying conveyor 3 to convey the workpiece 100 forward.
  • When a possible molding fault has a dimension equal to or more than the criterion value (S 126: YES), the controller 70 advances the process to S 130.
  • In S 130, the controller 70 confirms that the possible molding fault, which is determined in S 126 to have the dimension equal to or more than the criterion value, is a molding fault. Then, the controller 70 records the information on the confirmed molding fault in a specified archive destination (for example, the memory 73).
  • the controller 70 records information including the three-dimensional point group data measured from the workpiece 100 , the dimension of the shape forming the molding fault, a position where the molding fault occurs, a time when the molding fault is detected, and an identification number to identify the workpiece 100 .
  • As the information representing the position where the molding fault occurs, the grid number assigned to each grid set to the nominal contour data (see FIG. 4) is used.
  • the controller 70 records the grid number corresponding to the position where the detected molding fault exists as the occurrence position of the molding fault.
  • the controller 70 creates a stereoscopic image in which a shading process is applied to the three-dimensional surface shape represented by the three-dimensional point group forming the molding fault, and saves the created image.
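  • The archived record of S 130 might be organized along the lines of the following sketch (field names and types are assumptions; the disclosure only lists the kinds of information recorded):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple


@dataclass
class MoldingFaultRecord:
    """One confirmed molding fault archived in S130."""
    workpiece_id: str                 # identification number of the workpiece
    fault_type: str                   # flaw, dent, missing part, burr, ...
    grid_number: int                  # occurrence position on the nominal contour
    dimension_mm: float               # length, depth, or height of the fault shape
    detected_at: datetime = field(default_factory=datetime.now)
    point_group: List[Tuple[float, float, float]] = field(default_factory=list)
```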
  • the workpiece 100 having the detected molding fault is discharged as a defective product by the control of the controller 70 .
  • the controller 70 controls the workpiece pick-up robot 10 as follows. That is, the controller 70 makes the workpiece pick-up robot 10 execute an operation to place the workpiece 100 held by the workpiece pick-up robot 10 on the defective product discharging conveyor 4 .
  • the controller 70 controls the defective workpiece discharger 50 as follows. That is, the controller 70 makes the defective workpiece discharger 50 execute an operation to lift the workpiece 100 from the conveying conveyor 3 , and then carry and place the lifted workpiece 100 on the defective product discharging conveyor 4 .
  • the controller 70 operates the defective product discharging conveyor 4 to convey the workpiece 100 that is the defective product forward.
  • the controller 70 controls the marking robot 60 to print information related to the molding fault on the surface of the workpiece 100 .
  • the information the marking robot 60 prints on the workpiece 100 preferably includes, for example, the grid number indicating the position of the molding fault, and information indicating a type of the molding fault.
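  • As a small illustration, the printed mark could be composed from the fault type and grid number (the exact format is an assumption):

```python
def mark_text(fault_type: str, grid_number: int) -> str:
    """Compose the string printed by the marking robot near the molding fault."""
    return f"{fault_type.upper()} G{grid_number:03d}"


print(mark_text("dent", 11))  # -> "DENT G011"
```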
  • the controller 70 determines, based on the dimensions and the shape of the molding fault, the type of the molding fault, such as a flaw, dent, missing part, burr, crack, dross, or peeling of plating.
  • According to the inspection system 1, it is possible to inspect the presence or absence of the molding fault in the workpiece 100 with high speed and excellent accuracy by using the three-dimensional image measurement. Specifically, the inspection system 1 can detect the possible molding fault in the workpiece 100 by collating the three-dimensional surface shape of the workpiece 100 measured by the method of the three-dimensional image measurement and the nominal contour data representing the three-dimensional surface shape of the non-defective product of the workpiece 100.
  • the inspection system 1 specifies, among the detected possible molding faults, only a possible molding fault in which the dimension thereof is equal to or more than the criterion value as the molding fault.
  • In the three-dimensional image measurement, for example, when the resolution of the image to be taken is sufficiently high, even a fine scratch that cannot be the defect as a product can be detected. Thus, by removing the fine scratch that does not satisfy the criterion value from the target of the molding fault, it is possible to accurately detect the molding fault whose dimension can be the defect as a product.
  • In the inspection system 1, the criterion value can be arbitrarily set and changed by a user, which makes it possible to freely change the criterion value for the molding fault depending on a quality required for the product. Furthermore, by configuring such that the criterion value can be separately set for each region formed by dividing the three-dimensional surface shape represented by the nominal contour data into two or more regions, it is possible to accurately detect the molding fault in accordance with the shape of each portion of the workpiece 100.
  • the position of the molding fault can be indicated by the grid number obtained by dividing the three-dimensional surface shape represented by the nominal contour data at every given distance.
  • With the marking robot 60, a mark indicating the type and the position of the molding fault can be directly printed on the surface of the workpiece 100. This makes it easy to recognize the position of the molding fault at the time of the actual observation of the workpiece 100 in which the molding fault is detected.
  • In the inspection system 1, the scratch can be detected in the portion processed into a steep shape. This makes it possible to output the warning of the deterioration in the die. With this configuration, the time to correct the die can be accurately determined.
  • In the inspection system 1, by arranging the standard gauge 40 a in the imaging range of the first imaging unit 20 and by arranging the standard gauge 40 b in the imaging range of the second imaging unit 30, it is possible to guarantee the validity of each inspection result carried out on each workpiece 100.
  • In the inspection system 1, the number of the camera units configuring the first imaging unit 20 and the second imaging unit 30 can be set to one or more depending on a shape and a scale of the workpiece 100.
  • In the inspection system 1, the workpiece 100 can be inspected without interrupting the flow of conveyance of the workpiece 100 coming out of the press machine 2 toward the next manufacturing process, by cooperating with the conveying conveyor 3 and the defective product discharging conveyor 4 that configure the manufacturing line. This improves both the productivity and the inspection speed of the molded article, and thus even total inspection of mass-produced molded articles can be realized.
  • The camera units 21 a, 21 b of the first imaging unit 20 correspond to one example of the first projection unit and the first imaging unit.
  • The camera units 31 a, 31 b of the second imaging unit 30 correspond to one example of the second projection unit and the second imaging unit.
  • The processes of S 104, S 106, and S 108 executed by the controller 70 correspond to one example of the processes performed by a measuring unit and a gloss measuring unit.
  • The processes of S 122 and S 124 executed by the controller 70 correspond to one example of the process performed by a comparing and detecting unit.
  • The process of S 126 executed by the controller 70 corresponds to one example of the process performed by a fault specifying unit.
  • The processes of S 130 and S 132 executed by the controller 70, together with the marking robot 60, correspond to one example of an output unit.
  • The processes of S 110 and S 112 executed by the controller 70 correspond to one example of the process performed by a scratch specifying device.
  • The standard gauges 40 a, 40 b correspond to one example of a standard.
  • The processes of S 116 and S 118 executed by the controller 70 correspond to one example of the process performed by a determiner.
  • The workpiece pick-up robot 10 corresponds to one example of a robot arm.
  • One function of one element may be achieved by two or more elements, or two or more functions of two or more elements may be achieved by one element.
  • A part of the configuration of each of the aforementioned embodiments may be omitted, and at least a part of the configuration of each of the aforementioned embodiments may be added to or replaced with another part of the aforementioned embodiments.
  • The present disclosure can also be achieved in various forms such as a program enabling a computer to function as the above-described controller 70, a tangible storage medium including a semiconductor memory storing the program, and an inspection method for a molded article.
  • The inspection system 1 may be further configured to have a function for inspecting the dimensions of the molded article.
  • The inspection of dimensions means inspecting the dimensional accuracy of processed portions of the molded article, such as the distance between holes formed in the molded article and the distance from an end portion of the molded article to a hole, by measuring a distance or the like between arbitrary points of a geometric shape (an illustrative dimensional check follows this list).
  • Such dimensional inspection is a standard function of a conventional noncontact three-dimensional measuring instrument.
  • The inspection system 1 is configured such that a user can arbitrarily select and execute any of three inspection modes: (1) inspecting only the molding fault, (2) inspecting only the dimensions, and (3) inspecting the molding fault and the dimensions at the same time.
  • In the inspection system 1, a description has been given of the case where the inspection is performed on a molded article manufactured by press forming.
  • However, the present disclosure is not limited to application to molded articles manufactured by press forming; it is also applicable to molded articles manufactured by various other processing methods such as injection molding, forging, casting, and extrusion molding.
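The geometry-based typing of a detected fault mentioned above (flaw, dent, missing part, burr, crack, dross, peeling of plating) could look roughly like the following Python sketch. The feature set, the threshold values, and the classify_fault helper are hypothetical illustrations, not decision rules disclosed for the controller 70.

```python
from dataclasses import dataclass

@dataclass
class FaultFeatures:
    length_mm: float   # longest extent of the deviating region
    width_mm: float    # extent perpendicular to the length
    depth_mm: float    # signed deviation from the nominal surface (+ raised, - sunken)
    on_edge: bool      # whether the region touches the workpiece contour

def classify_fault(f: FaultFeatures) -> str:
    """Rough, hypothetical decision rules mapping fault geometry to a fault type."""
    aspect = f.length_mm / max(f.width_mm, 1e-6)
    if f.on_edge and f.depth_mm < 0:
        return "missing part"
    if f.on_edge and f.depth_mm > 0:
        return "burr"
    if aspect > 5 and f.depth_mm < 0:                      # long, narrow depression
        return "crack" if abs(f.depth_mm) > 0.3 else "flaw"
    if f.depth_mm < 0:
        return "dent"
    if f.depth_mm > 0 and aspect < 2:
        return "dross"
    return "peeling of plating"  # fallback: shallow, irregular raised region

# Example: a long, shallow depression away from any edge is reported as a flaw.
print(classify_fault(FaultFeatures(length_mm=8.0, width_mm=0.5, depth_mm=-0.1, on_edge=False)))
```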
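The collation of the measured three-dimensional surface shape with the nominal contour data, followed by filtering of the possible molding faults against a per-region criterion value, might be sketched as below. The use of scipy for the nearest-neighbour search and the clustering, the 1 mm grouping radius, the 0.05 mm noise floor, and the 0.5 mm fallback criterion are assumptions made only for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.cluster.hierarchy import linkage, fcluster

def detect_faults(measured, nominal, region_of, criterion_mm, noise_floor_mm=0.05):
    """Return detected faults as dicts with region, dimension, and centroid.

    measured, nominal : (N, 3) / (M, 3) surface point arrays in mm
    region_of         : (M,) non-negative integer region id per nominal point
    criterion_mm      : dict {region id: minimum reportable fault dimension in mm}
    """
    tree = cKDTree(nominal)
    dist, idx = tree.query(measured)          # deviation of each measured point from nominal
    mask = dist > noise_floor_mm              # points belonging to a "possible" fault
    pts, regions = measured[mask], region_of[idx[mask]]
    if len(pts) < 2:
        return []

    # Single-linkage clustering: points closer than 1 mm form one fault candidate.
    labels = fcluster(linkage(pts, method="single"), t=1.0, criterion="distance")

    faults = []
    for lab in np.unique(labels):
        cluster = pts[labels == lab]
        region = int(np.bincount(regions[labels == lab]).argmax())
        dimension = float(np.linalg.norm(cluster.max(0) - cluster.min(0)))  # longest extent
        if dimension >= criterion_mm.get(region, 0.5):   # per-region criterion value
            faults.append({"region": region, "dimension_mm": dimension,
                           "centroid_mm": cluster.mean(0)})
    return faults
```

Candidates smaller than the criterion value, such as fine scratches, simply never appear in the returned list, which mirrors the filtering described above.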
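One hypothetical way to express a fault position as a single grid number, by dividing the bounding box of the nominal contour data at a fixed pitch, is sketched below; the 10 mm pitch, the row-major numbering, and the printed label format are illustrative assumptions.

```python
import numpy as np

def grid_number(point, nominal_min, nominal_max, pitch_mm=10.0):
    """Map a 3-D point to a single cell index of a grid laid over the bounding box
    of the nominal contour data (hypothetical numbering scheme)."""
    counts = np.ceil((nominal_max - nominal_min) / pitch_mm).astype(int)
    ix, iy, iz = np.minimum(((point - nominal_min) // pitch_mm).astype(int), counts - 1)
    return int(ix + counts[0] * (iy + counts[1] * iz))   # row-major cell index

# Example label text that a marking robot could print next to the fault position.
nominal_min, nominal_max = np.array([0.0, 0.0, 0.0]), np.array([300.0, 150.0, 60.0])
label = f"G{grid_number(np.array([122.0, 34.0, 12.0]), nominal_min, nominal_max)}-dent"
print(label)   # e.g. "G552-dent"
```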
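The warning of die deterioration, triggered when scratches keep appearing in steeply formed portions, could for example be driven by a simple sliding window over recently inspected workpieces. The window length and the warning ratio below are illustrative assumptions, not values taken from the disclosure.

```python
from collections import deque

class DieWearMonitor:
    """Warn when scratches in steeply formed regions become frequent enough to
    suggest die deterioration (illustrative thresholds)."""

    def __init__(self, window=50, warn_ratio=0.2):
        self.history = deque(maxlen=window)   # one bool per inspected workpiece
        self.warn_ratio = warn_ratio

    def record(self, scratch_in_steep_region: bool) -> bool:
        self.history.append(scratch_in_steep_region)
        full = len(self.history) == self.history.maxlen
        return full and sum(self.history) / len(self.history) >= self.warn_ratio

monitor = DieWearMonitor()
for result in [False] * 40 + [True] * 10:     # scratches start appearing on recent parts
    if monitor.record(result):
        print("Warning: possible die deterioration - schedule die correction")
        break
```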
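The validity guarantee provided by the standard gauges 40 a, 40 b can be pictured as a tolerance comparison between the gauge dimension measured in the same imaging range as the workpiece and its known value. The nominal gauge size and the tolerance used below are placeholders.

```python
def gauge_ok(measured_gauge_mm: float, nominal_gauge_mm: float = 50.000,
             tolerance_mm: float = 0.02) -> bool:
    """True if the standard gauge measured together with the workpiece agrees with
    its known dimension, i.e. the surrounding measurement can be trusted."""
    return abs(measured_gauge_mm - nominal_gauge_mm) <= tolerance_mm

# An inspection result would only be accepted when the gauge check passes.
for gauge_value in (50.012, 50.045):
    status = "valid" if gauge_ok(gauge_value) else "invalid - recalibrate imaging unit"
    print(f"gauge reads {gauge_value:.3f} mm -> inspection {status}")
```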
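The dimensional inspection and the three selectable inspection modes might be organized as in the following sketch; the mode names, the tolerance, and the example hole positions are illustrative assumptions rather than disclosed values.

```python
from enum import Enum, auto
import numpy as np

class InspectionMode(Enum):
    FAULT_ONLY = auto()            # (1) inspect only the molding fault
    DIMENSIONS_ONLY = auto()       # (2) inspect only the dimensions
    FAULT_AND_DIMENSIONS = auto()  # (3) inspect both at the same time

def check_distance(point_a, point_b, nominal_mm, tol_mm):
    """Dimensional check: distance between two measured feature points (for example,
    two hole centres estimated from the point cloud) against the drawing value."""
    actual = float(np.linalg.norm(np.asarray(point_a) - np.asarray(point_b)))
    return actual, abs(actual - nominal_mm) <= tol_mm

mode = InspectionMode.FAULT_AND_DIMENSIONS
if mode in (InspectionMode.DIMENSIONS_ONLY, InspectionMode.FAULT_AND_DIMENSIONS):
    actual, ok = check_distance([12.0, 40.1, 0.0], [112.2, 40.0, 0.0],
                                nominal_mm=100.0, tol_mm=0.3)
    print(f"hole pitch {actual:.2f} mm -> {'OK' if ok else 'NG'}")
```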

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Quality & Reliability (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
US16/622,674 2017-06-16 2018-04-16 Inspection device Abandoned US20210150695A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017118775A JP6408654B1 (ja) 2017-06-16 2017-06-16 検査装置
JP2017-118775 2017-06-16
PCT/JP2018/015712 WO2018230134A1 (ja) 2017-06-16 2018-04-16 検査装置

Publications (1)

Publication Number Publication Date
US20210150695A1 true US20210150695A1 (en) 2021-05-20

Family

ID=63855311

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/622,674 Abandoned US20210150695A1 (en) 2017-06-16 2018-04-16 Inspection device

Country Status (6)

Country Link
US (1) US20210150695A1 (ko)
EP (1) EP3640584A4 (ko)
JP (1) JP6408654B1 (ko)
KR (1) KR20200028940A (ko)
CN (1) CN111886474A (ko)
WO (1) WO2018230134A1 (ko)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113849000A (zh) * 2021-09-15 2021-12-28 山东泰开智能配电有限公司 高压隔离开关镀银件镀层厚度自动检测系统的控制方法
US20220092765A1 (en) * 2019-01-24 2022-03-24 Sualab Co., Ltd. Defect inspection device

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019194065A1 (ja) * 2018-04-02 2019-10-10 日本電産株式会社 画像処理装置、画像処理方法、外観検査システムおよび外観検査方法
CN112330594B (zh) * 2020-10-13 2024-01-30 浙江华睿科技股份有限公司 一种纸筒缺陷检测方法、装置、电子设备及存储介质
WO2023106013A1 (ja) 2021-12-07 2023-06-15 Jfeスチール株式会社 プレス成形解析方法、プレス成形解析装置及びプレス成形解析プログラム
JP7416106B2 (ja) 2022-01-21 2024-01-17 Jfeスチール株式会社 プレス成形解析の解析精度評価方法
WO2023139900A1 (ja) * 2022-01-21 2023-07-27 Jfeスチール株式会社 プレス成形解析の解析精度評価方法
FR3133924A1 (fr) * 2022-03-24 2023-09-29 Psa Automobiles Sa Procede de detection et de marquage de defauts sur une piece et installation pour la mise en œuvre du procede
JP7274026B1 (ja) 2022-07-05 2023-05-15 株式会社ジーテクト プレス機
JP7343015B1 (ja) * 2022-08-29 2023-09-12 Jfeスチール株式会社 プレス成形品の製造方法
CN116007526B (zh) * 2023-03-27 2023-06-23 西安航天动力研究所 一种膜片刻痕深度自动测量系统及测量方法
KR102640549B1 (ko) * 2023-08-17 2024-02-27 (주)쎄미콤 3d스캐너를 이용한 프로파일 온도센서의 불량검사방법
CN118009889B (zh) * 2024-04-09 2024-06-18 常州铭赛机器人科技股份有限公司 工件点胶槽位置的测量方法

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58180933A (ja) * 1982-04-16 1983-10-22 Hitachi Ltd パタ−ン欠陥検査装置
US6462813B1 (en) * 1996-04-12 2002-10-08 Perceptron, Inc. Surface defect inspection system and method
US6956963B2 (en) * 1998-07-08 2005-10-18 Ismeca Europe Semiconductor Sa Imaging for a machine-vision system
JP2008064595A (ja) * 2006-09-07 2008-03-21 Olympus Corp 基板検査装置
JP2008267851A (ja) * 2007-04-17 2008-11-06 Ushio Inc パターン検査装置およびパターン検査方法
JP5693834B2 (ja) * 2009-09-17 2015-04-01 アルパイン株式会社 音声認識装置及び音声認識方法
EP2508871A4 (en) * 2009-11-30 2017-05-10 Nikon Corporation Inspection apparatus, measurement method for three-dimensional shape, and production method for structure
JP2013024852A (ja) * 2011-07-25 2013-02-04 Muramatsu:Kk 成形品画像処理検査装置
JP5934546B2 (ja) * 2012-03-29 2016-06-15 株式会社Screenホールディングス 描画装置および描画方法
JP6177017B2 (ja) * 2013-06-12 2017-08-09 住友化学株式会社 欠陥検査システム
JP6371044B2 (ja) * 2013-08-31 2018-08-08 国立大学法人豊橋技術科学大学 表面欠陥検査装置および表面欠陥検査方法
JP2015114309A (ja) 2013-12-16 2015-06-22 株式会社オプトン 計測装置
JP6382074B2 (ja) * 2014-11-05 2018-08-29 古河電気工業株式会社 外観検査装置、外観検査システム、及び外観検査方法

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220092765A1 (en) * 2019-01-24 2022-03-24 Sualab Co., Ltd. Defect inspection device
US11790512B2 (en) * 2019-01-24 2023-10-17 Sualab Co., Ltd. Defect inspection device
CN113849000A (zh) * 2021-09-15 2021-12-28 山东泰开智能配电有限公司 高压隔离开关镀银件镀层厚度自动检测系统的控制方法

Also Published As

Publication number Publication date
JP6408654B1 (ja) 2018-10-17
JP2019002834A (ja) 2019-01-10
EP3640584A1 (en) 2020-04-22
CN111886474A (zh) 2020-11-03
EP3640584A4 (en) 2021-03-10
KR20200028940A (ko) 2020-03-17
WO2018230134A1 (ja) 2018-12-20

Similar Documents

Publication Publication Date Title
US20210150695A1 (en) Inspection device
KR102090856B1 (ko) 물질 도포기
CN108573901B (zh) 裸芯片接合装置及半导体器件的制造方法
JP7174074B2 (ja) 画像処理装置、作業ロボット、基板検査装置および検体検査装置
CN110231352B (zh) 图像检查装置、图像检查方法以及图像检查记录介质
JP2019196964A (ja) 分類器の学習支援システム、学習データの収集方法、検査システム
JP7214432B2 (ja) 画像処理方法、画像処理プログラム、記録媒体、画像処理装置、生産システム、物品の製造方法
JP6696323B2 (ja) パターン検査装置およびパターン検査方法
JP2018195735A (ja) 半導体製造装置および半導体装置の製造方法
JPH06147836A (ja) シート寸法測定装置
JP5949214B2 (ja) 品質検査方法
CN114608458B (zh) 装片胶厚度检测装置及方法
JP2022105581A (ja) 成形不良の検出方法
KR102516586B1 (ko) 다이 본딩 장치 및 반도체 장치의 제조 방법
JP7268341B2 (ja) 検査性能診断装置、検査性能診断方法、検査性能診断装置用のプログラム、および、検査性能診断システム
JP5205224B2 (ja) 部品実装状態検査装置
JPH0545127A (ja) 帯状部材の位置、形状測定方法および装置
JPH0995028A (ja) 印字検査装置
JPH07104132B2 (ja) 実装部品外観検査方法
JP4420796B2 (ja) 容器の外観検査方法
JP5055095B2 (ja) 測定装置及び測定方法
JP2006226834A (ja) 表面検査装置、表面検査の方法
TWI703320B (zh) 刻印檢查裝置、刻印檢查方法及物品檢查裝置
JP2021018064A (ja) 外観検査方法及び外観検査装置
KR102005345B1 (ko) 라인 스캔 카메라를 이용한 자동차 정션 박스 터미널 단자 비전 검사 방법

Legal Events

Date Code Title Description
AS Assignment

Owner name: OPTON CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YOGO, TERUAKI;REEL/FRAME:051279/0341

Effective date: 20191212

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION