CN108627515B - Device and method for calculating image area outside inspection object of inspection system


Info

Publication number
CN108627515B
CN108627515B (Application CN201810225234.4A)
Authority
CN
China
Prior art keywords: workpiece, imaging, image, unit, inspection
Prior art date
Legal status
Active
Application number
CN201810225234.4A
Other languages
Chinese (zh)
Other versions
CN108627515A (en)
Inventor
吉田顺一郎
藁科文和
Current Assignee
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date
Filing date
Publication date
Application filed by Fanuc Corp
Publication of CN108627515A
Application granted
Publication of CN108627515B

Classifications

    • G06T7/001 Industrial image inspection using an image reference approach
    • G06T7/0004 Industrial image inspection
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N21/9515 Objects of complex shape, e.g. examined with use of a surface follower device
    • G01N2021/8854 Grading and classifying of flaws
    • G01N2021/8867 Grading and classifying of flaws using sequentially two or more inspection runs, e.g. coarse and fine, or detecting then analysing
    • G01N2021/8887 Scan or image signal processing based on image processing techniques
    • G01N2021/889 Scan or image signal processing providing a bare video image, i.e. without visual measurement aids
    • B25J9/1671 Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • B25J9/1697 Vision controlled systems
    • G05B2219/37206 Inspection of surface
    • G05B2219/45066 Inspection robot
    • G06T2207/10024 Color image
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30164 Workpiece; Machine component

Abstract

The invention provides a device and a method for calculating an image area outside the inspection object of an inspection system, with which the area other than the surface to be inspected of a workpiece can be easily identified. The device includes: a drawing acquisition unit that acquires drawing data of the workpiece; a designation receiving unit that receives designation of a surface to be inspected of the workpiece in the drawing data; and a non-inspection region calculation unit that calculates, as a non-inspection region, the region other than the surface to be inspected in an image within the field of view of the imaging unit when the workpiece and the imaging unit are positioned at an imaging position at which at least a part of the designated surface to be inspected enters the field of view of the imaging unit.

Description

Device and method for calculating image area outside inspection object of inspection system
Technical Field
The present invention relates to an apparatus for calculating a region outside an inspection target of an inspection system for inspecting a surface of a workpiece, and a method for calculating a region outside an inspection target.
Background
An inspection system for inspecting a surface of a workpiece for damage or the like is known (for example, Japanese Patent Laid-Open No. 7-63537).
In the inspection system described above, when the surface of the workpiece is inspected, the surface is imaged by the imaging unit, and damage or the like is detected from the captured image. In such a case, a region other than the surface to be inspected may exist in the captured image. A technique capable of easily identifying such a region other than the surface to be inspected has therefore been desired.
Disclosure of Invention
An aspect of the present invention is an apparatus for calculating an area outside the inspection target of an inspection system that includes an imaging unit for imaging a workpiece and a moving mechanism for moving the workpiece or the imaging unit to position the workpiece and the imaging unit relative to each other, and that inspects a surface of the workpiece, the apparatus including: a drawing acquisition unit that acquires drawing data of the workpiece; a designation receiving unit that receives designation of a surface to be inspected of the workpiece in the drawing data; and a non-inspection region calculation unit that calculates, as a non-inspection region, the region other than the surface to be inspected in an image within the field of view of the imaging unit when the workpiece and the imaging unit are positioned at an imaging position at which at least a part of the designated surface to be inspected enters the field of view of the imaging unit.
Another aspect of the present invention is a method of calculating an area outside the inspection target of an inspection system that includes an imaging unit for imaging a workpiece and a moving mechanism for moving the workpiece or the imaging unit to position them relative to each other, and that inspects a surface of the workpiece, the method including: acquiring drawing data of the workpiece; receiving designation of a surface to be inspected of the workpiece in the drawing data; and calculating, as a non-inspection region, the region other than the surface to be inspected in an image within the field of view of the imaging unit when the workpiece and the imaging unit are positioned at an imaging position at which at least a part of the designated surface to be inspected enters the field of view of the imaging unit.
According to an aspect of the present disclosure, a non-inspection region that does not need to be inspected by an inspection system can be calculated from drawing data of a workpiece. Thus, the non-inspection region can be automatically calculated when the inspection surface is inspected, and the operator can be saved from manually setting the non-inspection region according to the field of view of the imaging unit.
Drawings
The objects, features and advantages of the present invention will be further apparent from the following description of the embodiments with reference to the accompanying drawings.
Fig. 1 is a perspective view of an inspection system according to an embodiment.
FIG. 2 is a block diagram of the inspection system shown in FIG. 1.
Fig. 3 is an enlarged view of the robot shown in fig. 1, as viewed from the positive y-axis direction of the tool coordinate system.
Fig. 4 is an enlarged view of the imaging section and the workpiece in fig. 1.
Fig. 5 is a view showing the field of view of the surface to be inspected when the workpiece and the imaging unit are positioned at the 1 st imaging position.
Fig. 6 shows an example of an image captured by the imaging unit when the workpiece and the imaging unit are arranged at the 1 st imaging position.
Fig. 7 is a view showing a field of view of a surface to be inspected when the workpiece and the imaging unit are arranged at the nth imaging position (n is 1 to 12).
Fig. 8 is a flowchart showing an example of the operation flow of the apparatus shown in fig. 1.
Fig. 9 shows drawing data of a workpiece.
Fig. 10 shows an example of the point group generated in step S4 in fig. 8.
Fig. 11 is a flowchart showing an example of the flow of step S6 in fig. 8.
Fig. 12 shows an example of the estimation image generated by the series of the flow of step S6 shown in fig. 11.
Fig. 13 shows an example of a dot group according to another embodiment.
Fig. 14 shows an example of the estimation image generated in step S6 based on the point group shown in fig. 13.
FIG. 15 is a block diagram of an inspection system according to another embodiment.
Fig. 16 is a flowchart showing an example of the operation flow of the apparatus shown in fig. 15.
Fig. 17 shows an example of mask data generated in step S21 in fig. 16.
Fig. 18 is a perspective view of an inspection system according to another embodiment.
Fig. 19 is a flowchart showing an example of the operation flow of the apparatus shown in fig. 18.
Fig. 20 is a flowchart showing an example of the operation flow of step S6' in fig. 19.
Fig. 21 is a perspective view of an inspection system according to still another embodiment.
Fig. 22 is a flowchart showing an example of the operation flow of the apparatus shown in fig. 21.
Fig. 23 is a flowchart showing an example of the operation flow of step S30 in fig. 22.
Fig. 24 is a view for explaining the line of sight of each imaging element of the imaging unit shown in fig. 21.
Fig. 25 is a perspective view showing an inspection system according to still another embodiment.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In the various embodiments described below, the same elements are denoted by the same reference numerals, and redundant description thereof is omitted. First, an inspection system 10 according to an embodiment will be described with reference to fig. 1 to 3.
The inspection system 10 includes a control unit 12, a moving mechanism 14, an imaging unit 16, an illumination device 18, and a device 50 (fig. 2). The control unit 12 includes a CPU, a storage unit (not shown), and the like, and controls the movement mechanism 14, the imaging unit 16, and the illumination device 18.
In the present embodiment, the movement mechanism 14 is a vertical multi-joint robot, and includes a robot base 20, a rotary cylinder 22, a robot arm 24, a wrist portion 26, and a robot hand 28. The robot base 20 is fixed on the floor of the work unit. The rotary cylinder 22 is provided on the robot base 20 so as to be rotatable about a vertical axis.
The robot arm 24 includes an upper arm 30 pivotably coupled to the rotary cylinder 22 and a forearm 32 pivotably coupled to a tip end of the upper arm 30. The wrist portion 26 is attached to the front end of the forearm 32 and supports the robot hand 28 so as to be rotatable about three axes.
As shown in FIG. 3, the robot hand 28 includes a robot base 34, a plurality of finger portions 36, and a finger driving unit (not shown). The robot base 34 is connected to the wrist portion 26. The plurality of finger portions 36 are provided on the robot base 34 so as to be openable and closable.
The plurality of finger portions 36 extend in one direction from the robot base 34 and have stepped portions 36a on mutually facing surfaces. When the hand 28 grips the workpiece W, the upper surface SU of the workpiece W engages with the stepped portions 36a. The finger driving unit is, for example, an air cylinder, and is built into the robot base 34. The finger driving unit opens and closes the fingers 36 in response to a command from the control unit 12.
The moving mechanism 14 has a plurality of servo motors 38 (fig. 2). The servo motors 38 are respectively incorporated in the rotary cylinder 22, the robot arm 24, and the wrist portion 26 of the movement mechanism 14, and drive these components in accordance with commands (speed commands, torque commands, and the like) from the control unit 12.
A robot coordinate system CR (FIG. 1) is set as one of the coordinate systems for automatically controlling the components of the moving mechanism 14. The control unit 12 operates each component of the moving mechanism 14 with reference to the robot coordinate system CR. For example, the z-axis of the robot coordinate system CR is parallel to the vertical direction of the real space, and the rotary cylinder 22 is rotated about the vertical z-axis of the robot coordinate system CR.
On the other hand, a tool coordinate system CT is set for the hand 28. The tool coordinate system CT is one of the coordinate systems for automatic control, and expressing the position and direction of the tool coordinate system CT in the robot coordinate system CR defines the position and posture of the hand 28 in space.
As shown in FIG. 3, in the present embodiment, the tool coordinate system CT is set such that its origin is located between the stepped portions 36a of the finger portions 36, the finger portions 36 project from the robot base 34 in the positive z-axis direction of the tool coordinate system CT, and the finger portions 36 face each other in the x-axis direction of the tool coordinate system CT.
The control unit 12 operates the rotary cylinder 22, the robot arm 24, and the wrist portion 26 in the robot coordinate system CR so that the position and posture of the hand 28 match the position and posture defined by the tool coordinate system CT. In this way, the hand 28 can be arranged at an arbitrary position and posture in the robot coordinate system CR.
The imaging unit 16 includes an optical system such as a focus lens and an imaging sensor such as a CCD sensor or a CMOS sensor. In the present embodiment, the imaging unit 16 is fixed at a predetermined position in the robot coordinate system CR, separately from the moving mechanism 14. The imaging unit 16 images an object such as the workpiece W in accordance with a command from the control unit 12 and transmits the captured image to the control unit 12.
The fixed position of the imaging unit 16 and the optical axis O of the imaging unit 16 (i.e., the optical path of the subject image incident on the optical system of the imaging unit 16) are expressed as coordinates in the robot coordinate system CR and stored in advance in the storage unit of the control unit 12. The control unit 12 can therefore recognize the positions of the imaging unit 16 and the optical axis O in the robot coordinate system CR.
The lighting device 18 includes an incandescent lamp, a fluorescent lamp, an LED, or the like, and is fixed at a predetermined position. The illumination device 18 is turned on/off in response to a command from the control unit 12, and when turned on, irradiates light to the workpiece W held by the moving mechanism 14.
Next, referring to FIGS. 1 to 7, an outline of an operation in which the inspection system 10 inspects the surface SI to be inspected of the workpiece W will be described. As shown in FIGS. 4 and 5, in the present embodiment, the workpiece W is a rectangular plate member having a total of four holes H.
When inspecting the surface SI to be inspected of the workpiece W, the control unit 12 first operates the moving mechanism 14 so that the hand 28 grips the workpiece W stored in a predetermined storage location. At this time, the hand 28 grips the workpiece W at a predetermined gripping position. The designation of the gripping position will be described later.
Thereafter, the control unit 12 operates the moving mechanism 14 to move the workpiece W to an imaging position at which the surface SI to be inspected enters the field of view A of the imaging unit 16, thereby positioning the workpiece W and the imaging unit 16 relative to each other. The field of view A of the imaging unit 16 will be described with reference to FIG. 4.
The imaging unit 16 has an angle of view indicating a range in which imaging is possible. The angle of view depends on the optical system of the imaging unit 16 and the type of the imaging sensor. Specifically, the longer the focal length of the camera lens or the smaller the light receiving surface of the image sensor, the narrower the angle of view.
An example of the angle of view of the imaging unit 16 is shown as virtual lines B in FIGS. 1 and 4. The angle of view B and the distance D between the imaging unit 16 and the surface SI to be inspected determine the range on the surface SI to be inspected that the imaging unit 16 can image (i.e., the field of view A) when the workpiece W and the imaging unit 16 are positioned as shown in FIG. 1.
In other words, the field of view A indicates the region on the surface SI to be inspected that the imaging unit 16 can image in a focused state when the imaging unit 16 and the surface SI to be inspected are arranged at the distance D from each other. The resolution of the image captured by the imaging unit 16 is inversely related to the size of the field of view A: the smaller the field of view A, the higher the resolution of the resulting image.
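As a rough numerical illustration only (not part of the patent disclosure), the relationship between the angle of view B, the distance D, and the size of the field of view A can be sketched with a simple pinhole-camera model; the function name and the example values below are assumptions.

    import math

    def field_of_view_width(distance_d: float, angle_of_view_deg: float) -> float:
        """Width of the region imaged on a flat surface at distance D,
        assuming a simple pinhole model (illustrative assumption only)."""
        half_angle = math.radians(angle_of_view_deg) / 2.0
        return 2.0 * distance_d * math.tan(half_angle)

    # Example: a 40-degree angle of view at D = 300 mm covers roughly 218 mm,
    # so a smaller D gives a smaller field of view and, with the same sensor,
    # a higher-resolution image.
    print(round(field_of_view_width(300.0, 40.0), 1))  # 218.4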
After the hand 28 grips the workpiece W at the gripping position, the control unit 12 operates the moving mechanism 14 to arrange the hand 28 in the 1st position and posture shown in FIG. 1.
Specifically, the control unit 12 sets the tool coordinate system CT to the 1st position and direction (i.e., the origin position and the direction of each axis) shown in FIG. 1. Then, the control unit 12 operates the moving mechanism 14 so that the hand 28 gripping the workpiece W matches the position and posture defined by the tool coordinate system CT shown in FIG. 1.
As a result, the hand 28 is disposed at the 1st position and posture, and the workpiece W held by the hand 28 is positioned at the 1st imaging position with respect to the imaging unit 16. At this time, the field of view A of the imaging unit 16 is arranged with respect to the surface SI to be inspected at the position shown in FIGS. 1, 4, and 5. The optical axis O of the imaging unit 16 is perpendicular to the surface SI to be inspected, and the imaging unit 16 and the surface SI to be inspected are separated from each other by the distance D.
After that, the control unit 12 sends a command to the lighting device 18 to turn on the lighting device 18. Thereby, the workpiece W held by the moving mechanism 14 is irradiated by the illumination device 18.
After that, the control unit 12 sends an imaging command to the imaging unit 16. Upon receiving the imaging command from the control unit 12, the imaging unit 16 images the surface SI to be inspected of the workpiece W. FIG. 6 shows an example of an image captured by the imaging unit 16 when the workpiece W and the imaging unit 16 are positioned at the 1st imaging position.
The image 40 shown in FIG. 6 is an image that enters the field of view A of the imaging unit 16 when the workpiece W and the imaging unit 16 are positioned at the 1st imaging position (i.e., when the hand 28 gripping the workpiece W is arranged at the 1st position and posture). Each pixel of the image captured by the imaging unit 16 is represented in the imaging unit coordinate system CI in FIG. 6. In other words, the imaging unit coordinate system CI defines the x-y coordinates of each pixel of the image 40 captured by the imaging unit 16.
The position and direction of the imaging unit coordinate system CI in the robot coordinate system CR (i.e., the origin position and the direction of each axis) can be determined from the fixed position of the imaging unit 16 in the robot coordinate system CR and from the line-of-sight direction and angle of view of the imaging unit 16.
Thereafter, the control unit 12 operates the moving mechanism 14 to place the hand 28 gripping the workpiece W in the 2nd position and posture. When the hand 28 is disposed at the 2nd position and posture, the workpiece W held by the hand 28 is disposed at the 2nd imaging position with respect to the imaging unit 16. At this time, the field of view A of the imaging unit 16 is arranged with respect to the surface SI to be inspected at the position of region A2 shown in FIG. 7.
When the workpiece W and the imaging unit 16 are positioned at the 2nd imaging position (that is, when the hand 28 is disposed at the 2nd position and posture), the control unit 12 sends an imaging command to the imaging unit 16, and the imaging unit 16 images the surface SI to be inspected of the workpiece W and transmits the captured image to the control unit 12. In this way, an image corresponding to region A2 in FIG. 7 is captured.
Here, region An (n = 1 to 12) in FIG. 7 indicates the position of the field of view A of the imaging unit 16 with respect to the surface SI to be inspected when the hand 28 gripping the workpiece W is disposed at the nth position and posture, that is, when the workpiece W is disposed at the nth imaging position with respect to the imaging unit 16.
As shown in FIG. 7, region An and region An+1 adjoin each other such that one edge of one coincides with an edge of the other. Alternatively, the nth imaging position of the workpiece W and the imaging unit 16 may be defined such that two mutually adjacent regions An at least partially overlap each other.
When arranging the hand 28 at the nth position and posture, the control unit 12 sets the tool coordinate system CT to the nth position and direction. The control unit 12 then operates the moving mechanism 14 so that the hand 28 matches the position and posture defined by the tool coordinate system CT arranged at the nth position and direction.
In this way, the control unit 12 arranges the hand 28 in the 3rd position and posture, the 4th position and posture, ..., and the nth position and posture in this order, thereby sequentially positioning the workpiece W held by the hand 28 at the 3rd imaging position, the 4th imaging position, ..., and the nth imaging position with respect to the imaging unit 16. Every time the workpiece W and the imaging unit 16 are positioned at an imaging position, the control unit 12 causes the imaging unit 16 to image the surface SI to be inspected of the workpiece W.
In this way, the control unit 12 acquires a total of 12 images corresponding to regions A1 to A12 in FIG. 7. The control unit 12 analyzes each image acquired from the imaging unit 16 and detects defects such as scratches formed on the surface SI to be inspected.
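As an aside, the tiling of regions A1 to A12 over the surface SI to be inspected can be illustrated with the following sketch; the 200 mm x 150 mm surface, the 50 mm field of view, the optional overlap parameter, and the function name imaging_regions are illustrative assumptions chosen only to mirror the 4 x 3 layout of FIG. 7, not values taken from the patent.

    def imaging_regions(surface_w, surface_h, fov_w, fov_h, overlap=0.0):
        """Return the top-left corners of regions A1, A2, ... covering the
        inspected surface, adjoining edge-to-edge (overlap=0.0) or partially
        overlapping (0 < overlap < 1)."""
        step_x = fov_w * (1.0 - overlap)
        step_y = fov_h * (1.0 - overlap)
        regions = []
        y = 0.0
        while y < surface_h:
            x = 0.0
            while x < surface_w:
                regions.append((x, y))
                x += step_x
            y += step_y
        return regions

    # Example: a 200 mm x 150 mm surface with a 50 mm x 50 mm field of view
    # gives 4 x 3 = 12 regions, matching A1 to A12 in FIG. 7.
    print(len(imaging_regions(200.0, 150.0, 50.0, 50.0)))  # 12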
The control unit 12 executes the operation of the above-described example according to a robot program. The robot program can be constructed, for example, by an operator teaching the moving mechanism 14 the operation of arranging the hand 28 at the nth position and posture using a teaching operation panel (not shown).
Alternatively, the robot program may be constructed by automatically generating, based on the drawing data of the workpiece W, the movement of the moving mechanism 14 for arranging the hand 28 at the nth position and posture.
The robot program includes information on the nth position and direction of the tool coordinate system CT when the hand 28 is disposed at the nth position and posture, and information on the rotation angle of each servomotor 38 of the moving mechanism 14.
As shown in FIG. 6, the image 40 captured by the imaging unit 16 includes regions other than the surface SI to be inspected, such as the holes H of the workpiece W and the external space E around the workpiece W. These regions H and E outside the surface SI to be inspected are image regions that do not need to be inspected by the inspection system 10.
The apparatus 50 of the present embodiment automatically calculates such image regions H and E that are outside the inspection target. As shown in FIG. 2, the apparatus 50 includes a drawing acquisition unit 52, a designation reception unit 54, an image generation unit 56, and a non-inspection-area calculation unit 58. The image generation unit 56 includes a point group generation unit 57 and a drawn image generation unit 59.
In the present embodiment, the device 50 is attached to the control unit 12. Therefore, the functions of the drawing acquisition unit 52, the designation reception unit 54, the image generation unit 56 (the point group generation unit 57, the drawn image generation unit 59), and the non-inspection region calculation unit 58 are assumed by the control unit 12.
The operation flow of the apparatus 50 will be described below with reference to fig. 8 to 12. The flow shown in fig. 8 starts when the control unit 12 receives a non-inspection area calculation command from the operator.
In step S1, the control unit 12 acquires drawing data (2D CAD data, 3D CAD data, or the like) of the workpiece W. As an example, the drawing data of the workpiece W is stored in an external server provided outside the control unit 12 so as to be communicable with the control unit 12. In this case, in step S1, the control unit 12 accesses the external server and downloads the drawing data of the workpiece W from the external server.
As another example, drawing data of the workpiece W is stored in an external memory such as an EEPROM (registered trademark). In this case, the control unit 12 has an I/O port (for example, a USB port) into which an external memory is detachably inserted. In step S1, the operator inserts the external memory into the I/O port of the control unit 12, and the control unit 12 downloads the drawing data of the workpiece W from the external memory.
As another example, drawing data of the workpiece W is stored in advance in the storage unit of the control unit 12. In this case, in step S1, the control unit 12 reads the drawing data of the workpiece W from the storage unit.
In this way, in the present embodiment, the control unit 12 functions as the drawing acquisition unit 52 that acquires the drawing data of the workpiece W. In step S1, the control unit 12 acquires the workpiece model WM of 3D CAD data shown in FIG. 9 as the drawing data of the workpiece W.
In step S2, the control unit 12 receives designation of the surface SI to be inspected. As an example, the control unit 12 is provided with a display unit such as an LCD or an organic EL display, and an operation unit such as a keyboard or a touch panel.
The operator operates the operation unit to designate the inspected surface model SIM in the workpiece model WM displayed on the display unit as shown in FIG. 9. The control unit 12 receives the operation of the operation unit by the operator and thereby receives the designation of the inspected surface model SIM.
In this way, in the present embodiment, the control unit 12 functions as the designation receiving unit 54 that receives designation of the surface SI to be inspected (i.e., the inspected surface model SIM) in the drawing data.
In step S3, the control unit 12 receives from the operator designation of the gripping position at which the hand 28 grips the workpiece W. The gripping position is determined by the position and direction of the tool coordinate system CT set by the control unit 12 when the hand 28 grips the workpiece W. For example, the operator operates the operation unit of the control unit 12 to specify the position of the origin of the tool coordinate system CT in the workpiece model WM displayed on the display unit.
Here, assume that the operator specifies the origin of the tool coordinate system CT as the center of the upper surface model SUM of the workpiece model WM, as shown in FIG. 9. In this case, when the hand 28 grips the workpiece W stored in the storage location, the control unit 12 sets the tool coordinate system CT for the workpiece W such that the origin of the tool coordinate system CT is located on the upper surface SU of the workpiece W and the y-z plane of the tool coordinate system CT is parallel to the surface SI to be inspected.
The control unit 12 operates the moving mechanism 14 to arrange the hand 28 at the position and posture defined by the set tool coordinate system CT, and causes the hand 28 to grip the workpiece W.
As a result, the hand 28 grips the workpiece W at the gripping position corresponding to the tool coordinate system CT designated by the operator, as shown in FIGS. 1, 3, and 4. The control unit 12 receives the operation of the operation unit by the operator and sets the origin of the tool coordinate system CT at the position in the workpiece model WM shown in FIG. 9.
In step S4, the control unit 12 generates a point group for the surface SI to be inspected (i.e., the inspected surface model SIM) designated in step S2, based on the drawing data acquired in step S1. FIG. 10 shows an example of an image in which a point group is generated on the inspected surface model SIM.
In the image 60 shown in FIG. 10, a point group 61 is generated on the inspected surface model SIM, the point group 61 having a plurality of points uniformly dispersed over the inspected surface model SIM. Each of these points is assigned coordinates in the tool coordinate system CT set in step S3. The inspected surface model SIM is thus represented by the point group 61.
In this way, in the present embodiment, the control unit 12 functions as the point group generating unit 57 that generates the point group 61 on the surface SI to be inspected (i.e., the inspected surface model SIM) based on the drawing data of the workpiece W.
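Purely as an illustration of step S4, a point group like the point group 61 might be generated as follows for a flat rectangular inspected surface with circular holes; the dimensions, the 2 mm pitch, and the assumption that the surface lies in the z = 0 plane of the tool coordinate system CT are hypothetical.

    import numpy as np

    def generate_point_group(width, height, holes, pitch):
        """Uniformly scatter points over a flat rectangular inspected surface,
        skipping points that fall inside circular holes.
        Coordinates are (x, y, z) in the tool coordinate system CT; the surface
        is assumed to lie in the plane z = 0 (illustrative assumption).
        holes: list of (center_x, center_y, radius)."""
        xs = np.arange(0.0, width + 1e-9, pitch)
        ys = np.arange(0.0, height + 1e-9, pitch)
        points = []
        for x in xs:
            for y in ys:
                if any((x - cx) ** 2 + (y - cy) ** 2 <= r ** 2 for cx, cy, r in holes):
                    continue  # inside a hole H: not part of the inspected surface
                points.append((x, y, 0.0))
        return np.array(points)

    # Example: a 200 mm x 100 mm plate with four 10 mm radius holes, 2 mm pitch.
    point_group_61 = generate_point_group(
        200.0, 100.0,
        holes=[(40, 30, 10), (40, 70, 10), (160, 30, 10), (160, 70, 10)],
        pitch=2.0)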
In step S5, the control unit 12 sets the number "n" of the nth imaging position (i.e., the nth position and posture at which the hand 28 is disposed, and the nth position and direction of the tool coordinate system CT) to "1".
In step S6, based on the drawing data of the workpiece W (the workpiece model WM), the control unit 12 generates an estimated image that would be captured by the imaging unit 16 when the workpiece W is positioned at the nth imaging position with respect to the imaging unit 16 by the moving mechanism 14.
This step S6 will be described with reference to fig. 11. Hereinafter, a case will be described where the nth shooting position number "n" is set to "1" at the start time of step S6.
In step S11, the control unit 12 acquires the 1st position and direction of the tool coordinate system CT set when the workpiece W is positioned at the 1st imaging position with respect to the imaging unit 16.
As mentioned above, the position and direction of the tool coordinate system CT are included in the robot program constructed by teaching or the like. In step S11, the control unit 12 reads the 1st position and direction of the tool coordinate system CT from the storage unit.
In step S12, the control unit 12 calculates the coordinates, in the robot coordinate system CR, of each point of the point group 61 generated in step S4 when the tool coordinate system CT is arranged at the 1st position and direction. As described above with reference to FIG. 10, each point of the point group 61 is expressed as coordinates in the tool coordinate system CT.
Based on the 1st position and direction of the tool coordinate system CT acquired in step S11, the control unit 12 converts the coordinates of each point of the point group 61 from the tool coordinate system CT arranged at the 1st position and direction to the robot coordinate system CR.
Specifically, the control unit 12 multiplies the coordinates of each point of the point group 61 in the tool coordinate system CT by a transformation matrix, thereby converting the coordinates of each point of the point group 61 from the tool coordinate system CT to the robot coordinate system CR. The transformation matrix is a matrix (e.g., a Jacobian matrix) for converting coordinates in the tool coordinate system CT arranged at the 1st position and direction into the robot coordinate system CR.
Thus, the control unit 12 calculates the coordinates, in the robot coordinate system CR, of each point of the point group 61 when the tool coordinate system CT is arranged at the 1st position and direction. The coordinates of each point calculated in this way correspond to the coordinates, in the robot coordinate system CR, of each point of the point group 61 generated on the surface SI to be inspected positioned as shown in FIGS. 1 and 4.
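For illustration only, the conversion of step S12 from the tool coordinate system CT to the robot coordinate system CR can be sketched as multiplication by a 4 x 4 homogeneous transformation matrix built from the 1st position and direction of CT; this generic rigid-transform sketch is an assumption and not the patent's own implementation.

    import numpy as np

    def tool_to_robot(points_ct, rotation_cr, origin_cr):
        """Convert point coordinates from the tool coordinate system CT to the
        robot coordinate system CR.
        points_ct:   (N, 3) array of points expressed in CT
        rotation_cr: 3x3 rotation giving the direction of CT's axes in CR
        origin_cr:   origin of CT expressed in CR (its 1st position)"""
        transform = np.eye(4)
        transform[:3, :3] = rotation_cr
        transform[:3, 3] = origin_cr
        homogeneous = np.hstack([np.asarray(points_ct), np.ones((len(points_ct), 1))])
        return (transform @ homogeneous.T).T[:, :3]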
Next, the control unit 12 assigns a number "m" to each point whose coordinates have been calculated. For example, assuming that the total number of points for which coordinates are calculated is γ (= 10000), each point is numbered as the m-th point (m = 1 to γ). In step S13, the control unit 12 sets the number "m" of the assigned point to "1".
In step S14, the control unit 12 converts (i.e., projects) the coordinates of the m-th point, among the points of the point group 61 whose coordinates were calculated in step S12, into the imaging unit coordinate system CI.
Specifically, the control unit 12 multiplies the coordinates of the m-th point calculated in step S12 by a transformation matrix, thereby converting the coordinates of the m-th point from the robot coordinate system CR into the imaging unit coordinate system CI. This transformation matrix is a matrix (e.g., a Jacobian matrix) for converting coordinates in the robot coordinate system CR into the imaging unit coordinate system CI.
In step S15, the control unit 12 determines whether the coordinates (x, y) in the imaging unit coordinate system CI converted in step S14 are within a predetermined range (e.g., 0 ≤ x ≤ α and 0 ≤ y ≤ β). This predetermined range defines the range of the field of view A of the imaging unit 16, is determined in accordance with the model of the imaging unit 16, and is stored in the storage unit of the control unit 12.
When determining that the coordinates of the point are within the predetermined range (i.e., "yes"), the control unit 12 proceeds to step S16. On the other hand, when determining that the coordinates of the point are outside the predetermined range (that is, "no"), the control unit 12 proceeds to step S17.
In step S16, the control unit 12 generates an image in which a point is drawn at the coordinates in the imaging unit coordinate system CI converted in step S14. As described above, in the present embodiment, the control unit 12 functions as the drawn image generating unit 59.
In step S17, the control unit 12 increments the number "m" assigned to the point by "1" (that is, m = m + 1).
In step S18, the control unit 12 determines whether or not the number "m" assigned to the dot is a value greater than the total number "γ" of dots. When determining that the number "m" is larger than "γ" (that is, "yes"), the control unit 12 ends the flow shown in fig. 11 and proceeds to step S7 of fig. 8. On the other hand, when determining that the number "m" is equal to or less than "γ" (that is, "no"), the control unit 12 returns to step S14.
In this way, the control unit 12 loops through steps S14 to S18 until it determines YES in step S18, thereby generating an image in which those points of the point group 61 that fall within the range of the field of view A when converted (projected) into the imaging unit coordinate system CI are drawn.
An example of the image generated in this way is shown in FIG. 12. The image 62 shown in FIG. 12 is an image estimated to be captured by the imaging unit 16 when the workpiece W is positioned at the 1st imaging position with respect to the imaging unit 16 (i.e., an estimated image), and corresponds to the field of view A shown in FIG. 6.
The estimated image 62 is bounded by the origin of the imaging unit coordinate system CI and the points P1 (α, 0), P2 (α, β), and P3 (0, β) of the imaging unit coordinate system CI, and includes a point group 61' obtained by converting the point group 61 into the imaging unit coordinate system CI.
In this way, the control unit 12 functions as the point group generating unit 57 to generate the point group 61, and functions as the drawn image generating unit 59 to draw each point of the point group 61 in the imaging unit coordinate system CI, thereby generating the estimated image 62. Therefore, the control unit 12 functions as the image generating unit 56 that generates the estimated image 62 based on the drawing data of the workpiece W.
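The projection of steps S13 to S16 can be sketched as follows, mapping each point from the robot coordinate system CR into the imaging unit coordinate system CI and keeping only the points that fall within the field of view (0 ≤ x ≤ α, 0 ≤ y ≤ β); the pinhole-style camera model and all parameter names are illustrative assumptions, not the patent's actual transformation.

    import numpy as np

    def project_points(points_cr, camera_pose_cr, fx, fy, cx, cy, alpha, beta):
        """Project points given in the robot coordinate system CR onto the
        imaging unit coordinate system CI and keep those inside the field of view.
        camera_pose_cr: 4x4 pose of the imaging unit expressed in CR
        fx, fy, cx, cy: assumed pinhole intrinsics (illustrative)
        alpha, beta:    limits defining 0 <= x <= alpha and 0 <= y <= beta"""
        world_to_cam = np.linalg.inv(camera_pose_cr)
        homogeneous = np.hstack([np.asarray(points_cr), np.ones((len(points_cr), 1))])
        cam = (world_to_cam @ homogeneous.T).T[:, :3]
        u = fx * cam[:, 0] / cam[:, 2] + cx
        v = fy * cam[:, 1] / cam[:, 2] + cy
        inside = (u >= 0) & (u <= alpha) & (v >= 0) & (v <= beta) & (cam[:, 2] > 0)
        return np.stack([u[inside], v[inside]], axis=1)  # points drawn in step S16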
Referring again to fig. 8, in step S7, the control unit 12 calculates a non-inspection region. Specifically, the control unit 12 calculates an image region other than the region of the point group 61' included in the estimation image 62 generated in step S6 as a non-inspection region.
As shown in fig. 12, the estimated image 62 includes image regions H ' and E ' as image regions other than the region of the point group 61 '. The image area H 'is an area corresponding to the hole H shown in fig. 6, and the image area E' is an area corresponding to the external space E shown in fig. 6.
For example, the control unit 12 calculates the number of points of the point group 61 'in a unit area (for example, an area of 10 pixels × 10 pixels) in the estimated image 62, and determines that the area is an image area H' or E 'other than the point group 61' when the number is equal to or less than a predetermined threshold value.
The control unit 12 can calculate the image areas H 'and E' by executing this operation over the entire estimated image 62. In this way, in the present embodiment, the control unit 12 functions as the non-inspection region calculation unit 58 that calculates the image regions H 'and E' as the non-inspection regions H 'and E'.
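The unit-area counting described above can be sketched as follows; the 10-pixel block size, the threshold, and the array layout are assumptions used only to illustrate how areas with few projected points are marked as non-inspection regions.

    import numpy as np

    def non_inspection_mask(projected_points, alpha, beta, block=10, threshold=3):
        """Mark each block x block pixel area of the estimated image as a
        non-inspection region when it contains at most `threshold` projected points."""
        height, width = int(beta), int(alpha)
        mask = np.zeros((height, width), dtype=bool)
        pts = np.asarray(projected_points).reshape(-1, 2)
        for top in range(0, height, block):
            for left in range(0, width, block):
                in_block = ((pts[:, 0] >= left) & (pts[:, 0] < left + block) &
                            (pts[:, 1] >= top) & (pts[:, 1] < top + block))
                if in_block.sum() <= threshold:
                    mask[top:top + block, left:left + block] = True  # part of H' or E'
        return mask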
In step S8, the control unit 12 adds "1" to the number "n" of the nth imaging position (that is, n = n + 1). In step S9, the control unit 12 determines whether or not the number "n" is larger than 12.
When determining that the number "n" is larger than 12 (i.e., "yes"), the control unit 12 ends the flow shown in fig. 8. On the other hand, if the control unit 12 determines that the number "n" is not more than 12 (i.e., "no"), the process returns to step S6.
In this way, the control unit 12 loops through steps S6 to S9 until the determination in step S9 is YES. The control unit 12 thus sequentially generates estimated images of the field of view A of the imaging unit 16 corresponding to regions An (n = 1 to 12) in FIG. 7, and calculates the non-inspection region in each estimated image. As a result, the control unit 12 calculates a non-inspection region for each nth imaging position (i.e., each region An).
As described above, in the present embodiment, the control unit 12 calculates the non-inspection regions H' and E' that do not need to be inspected by the inspection system 10, based on the drawing data of the workpiece W. With this configuration, the non-inspection regions H' and E' can be calculated automatically when the surface SI to be inspected is inspected, so the operator can omit the work of manually setting a non-inspection region for each field of view A. The man-hours required to set up the inspection system 10 can therefore be reduced.
Further, according to the present embodiment, even if the imaging position at which the workpiece W and the imaging unit 16 are positioned is changed, the non-inspection region can be automatically calculated for the changed imaging position. Therefore, the task of correcting the setting of the non-inspection region every time the position and posture of the moving mechanism 14 are changed can be omitted.
In addition, according to the present embodiment, the estimated image 62 shown in FIG. 12 is generated based on the parameters (for example, the position and direction of the tool coordinate system CT) that the control unit 12 uses when arranging the hand 28 at the nth position and posture in real space.
With this configuration, the regions H and E in the image 40 captured in real space by the imaging unit 16 when inspecting the surface SI to be inspected can be made to accurately match the non-inspection regions H' and E' in the estimated image 62 generated in step S16.
In step S4, the control unit 12 may be configured to generate the point group only on the edges of the surface SI to be inspected, based on the drawing data acquired in step S1. Such an image is shown in FIG. 13. In the image 64 shown in FIG. 13, point groups 66, 68, 70, and 72 are generated along the outer edges of the surface SI to be inspected, and point groups 74, 76, 78, and 80 are generated along the edges that define the holes H.
In this embodiment, in step S12 of FIG. 11, the control unit 12 calculates the coordinates, in the robot coordinate system CR, of each point of the point groups 66, 68, 70, 72, 74, 76, 78, and 80 when the tool coordinate system CT is arranged at the nth position and direction, and assigns a number "m" to each point whose coordinates have been calculated.
Further, in step S14, the control unit 12 converts (i.e., projects) the coordinates of the m-th point, among the points of the point groups 66, 68, 70, 72, 74, 76, 78, and 80 whose coordinates were calculated in step S12, into the imaging unit coordinate system CI.
In step S16, the control unit 12 generates an estimated image of the field of view A of the imaging unit 16 in which points are drawn at the coordinates in the imaging unit coordinate system CI converted in step S14. FIG. 14 shows an example of the estimated image generated in this manner.
The estimation image 82 shown in fig. 14 corresponds to an image (fig. 6) that enters the field of view a of the imaging section 16 when the workpiece W is positioned at the 1 st imaging position with respect to the imaging section 16, as in fig. 12. The estimation image 82 shown in fig. 14 includes the point groups 70 ' and 72 ' obtained by converting the point groups 70 and 72 shown in fig. 13 and the point group 76 ' obtained by converting the point group 76 shown in fig. 13.
In step S7 of fig. 8, the control unit 12 calculates the non-inspection regions H ' and E ' based on the point groups 70 ', 72 ', and 76 ' included in the estimation image 82. For example, in step S14, the control unit 12 adds information (for example, a flag) for identifying the area inside the point groups 74, 76, 78, and 80 in fig. 13 as a non-inspection area to each of the points constituting the point groups 74, 76, 78, and 80.
The control unit 12 also gives each of the points constituting the point groups 66, 68, 70, and 72 information (e.g., a flag) identifying the region surrounded by the point groups 66, 68, 70, and 72 as the region of the surface SI to be inspected.
Thus, when the point groups 66, 68, 70, 72, 74, 76, 78, and 80 are converted in step S14, it is possible to determine whether or not the region divided by the converted point groups is a non-inspection region by referring to the information of the point groups 66, 68, 70, 72, 74, 76, 78, and 80 assigned to the conversion source.
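As a sketch of this flagging idea (with hypothetical names and values), each edge point could carry a flag indicating whether the region it bounds is a hole interior (non-inspection region) or the surface SI to be inspected:

    from dataclasses import dataclass

    @dataclass
    class EdgePoint:
        """A point on an edge of the inspected surface model SIM, carrying a flag
        that tells whether the region it bounds is a non-inspection region
        (hole interior) or the surface to be inspected."""
        x: float
        y: float
        z: float
        bounds_non_inspection: bool  # True for hole-edge point groups 74, 76, 78, 80

    hole_edge_point = EdgePoint(40.0, 30.0, 0.0, bounds_non_inspection=True)
    outer_edge_point = EdgePoint(0.0, 0.0, 0.0, bounds_non_inspection=False)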
Next, an inspection system 90 according to another embodiment will be described with reference to fig. 1 and 15. The inspection system 90 includes a control unit 12, a moving mechanism 14, an imaging unit 16, an illumination device 18, and an apparatus 100. The present embodiment is different from the above-described apparatus 50 in that the apparatus 100 is mounted on the control unit 12 and further includes a mask generation unit 102.
Hereinafter, the operation of the apparatus 100 will be described with reference to fig. 16. In the flow shown in fig. 16, the same steps as in the flow shown in fig. 8 are denoted by the same step numbers, and detailed description thereof is omitted. After the flow of fig. 16 is started, the controller 12 executes step S1 to step S7 in the same manner as in the above-described embodiment.
After step S7, in step S21, the control unit 12 generates mask data. The mask data is data for treating, as outside the inspection target, the regions other than the surface SI to be inspected (for example, the regions H and E in FIG. 6) that exist in the image of the surface SI to be inspected captured in real space by the imaging unit 16 (for example, the image 40 shown in FIG. 6) when the inspection system 90 inspects the surface SI to be inspected.
In the following, a case will be described in which the imaging unit 16 captures the image 40 shown in FIG. 6 when the inspection system 90 inspects the surface SI to be inspected.
For example, in step S21, the control unit 12 deletes the points of the point group 61' from the estimated image 62 (FIG. 12) generated in step S6 to extract only the non-inspection regions H' and E', and generates mask data in which the non-inspection regions H' and E' are filled with a single color (e.g., black) different from that of the surface SI to be inspected.
An example of such mask data is shown in fig. 17. In the mask data 104 shown in fig. 17, pixels indicating the non-inspection regions H 'and E' are colored in black. The control unit 12 stores the generated mask data 104 in the storage unit.
Then, the control unit 12 superimposes the mask data 104 generated in step S21 on the image 40 captured by the imaging unit 16 when inspecting the surface SI to be inspected. In this way, the pixels of the image 40 that coincide with the non-inspection regions H' and E' (i.e., the pixels representing the holes H and the external space E in FIG. 6) are painted in the single color.
As a result, when the control unit 12 analyzes the image 40 to detect a defect such as a scratch, the regions of the holes H and the external space E in the image 40 are painted in a single color and therefore contain no features, so no defect such as a scratch is detected there. In this way, the regions H and E within the image 40 that do not need to be inspected by the inspection system 90 can be virtually removed from the inspection target.
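A minimal sketch of the single-color masking idea is given below; the use of black (value 0) as the fill color, the grayscale image assumption, the function name apply_mask, and the array names in the usage comment are illustrative assumptions rather than the patent's implementation.

    import numpy as np

    def apply_mask(image, non_inspection):
        """Paint the pixels of a captured grayscale image that coincide with the
        non-inspection regions H' and E' in a single color (black), so that the
        defect-detection step finds no features there."""
        masked = image.copy()
        masked[non_inspection] = 0  # black out the holes H and external space E
        return masked

    # Usage: masked = apply_mask(image_40, mask_104); run defect detection on masked.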
As another example, the control unit 12 may generate mask data for removing the pixels of the image 40 that correspond to the non-inspection regions H' and E' from the target of the image analysis performed when inspecting the surface SI to be inspected.
For example, in step S21, the control unit 12 extracts only the non-inspection regions H' and E' from the estimated image 62 generated in step S6, and generates mask data in which a flag is applied to the non-inspection regions H' and E'.
Then, the control unit 12 superimposes the generated mask data on the image 40 captured by the imaging unit 16 when inspecting the surface SI to be inspected, and gives a flag to the pixels of the image 40 that coincide with the non-inspection regions H' and E' included in the mask data.
In this case, when analyzing the image 40 to detect a defect such as a flaw, the control unit 12 does not perform image analysis on the pixel to which the flag is given. In this way, the region H, E in the image 40 that does not need to be inspected by the inspection system 90 can be removed from the inspection object.
In this way, in the present embodiment, the control unit 12 functions as the mask generating unit 102 (FIG. 15) that gives the non-inspection regions H' and E' a color different from that of the surface SI to be inspected, or gives the pixels of the non-inspection regions H' and E' a flag marking them as outside the inspection target.
According to the present embodiment, since the image regions that do not need to be inspected can be removed from the inspection target by using the non-inspection regions H' and E' calculated in step S7, the inspection of the surface SI to be inspected by the inspection system 90 can be speeded up. Further, since the amount of information processing performed by the control unit 12 during the image analysis of the inspection can be reduced, the load on the resources (e.g., CPU) of the control unit 12 can be reduced.
Next, an inspection system 110 according to still another embodiment will be described with reference to fig. 2 and 18. The inspection system 110 is different from the inspection system 10 described above in the following configuration.
That is, in the inspection system 110, the imaging unit 16 is fixed to the wrist portion 26 of the moving mechanism 14. On the other hand, the workpiece W is fixed to a workpiece holding portion 112 disposed at a predetermined position in the robot coordinate system CR, apart from the moving mechanism 14. The storage unit of the control unit 12 stores in advance the fixed position of the workpiece W in the robot coordinate system CR.
In the present embodiment, the tool coordinate system CT is set for the imaging unit 16. The tool coordinate system CT is one of the coordinate systems for automatic control, and expressing the tool coordinate system CT in the robot coordinate system CR defines the position and posture of the imaging unit 16 in space. In the present embodiment, the tool coordinate system CT is set such that its z-axis coincides with the optical axis O of the imaging unit 16.
The control unit 12 operates the rotary cylinder 22, the robot arm 24, and the wrist portion 26 in the robot coordinate system CR so that the position and posture of the imaging unit 16 match the position and posture defined by the tool coordinate system CT. In this way, the imaging unit 16 can be arranged at an arbitrary position and posture in the robot coordinate system CR.
Next, an outline of an operation in which the inspection system 110 inspects the surface SI to be inspected of the workpiece W will be described. When inspecting the surface SI to be inspected of the workpiece W, the control unit 12 operates the moving mechanism 14 to move the imaging unit 16 to an imaging position at which the surface SI to be inspected enters the field of view A of the imaging unit 16, thereby positioning the workpiece W and the imaging unit 16 relative to each other.
Specifically, the control unit 12 sets the tool coordinate system CT to the 1st position and direction. The control unit 12 then operates the moving mechanism 14 so that the imaging unit 16 matches the 1st position and posture defined by the tool coordinate system CT set to the 1st position and direction.
As a result, the imaging unit 16 is positioned at the 1 st imaging position with respect to the workpiece W by being arranged at the 1 st position and posture. Next, the control unit 12 sends a command to the lighting device 18 to turn on the lighting device 18. Thereby, the workpiece W fixed to the workpiece holding portion 112 is irradiated by the illumination device 18.
Next, the control unit 12 sends an imaging command to the imaging unit 16. Upon receiving the imaging command from the control unit 12, the imaging unit 16 images the surface S of the workpiece W to be inspectedI. Fig. 6 shows an example of an image captured by the imaging unit 16 when the imaging unit 16 is positioned at the 1 st imaging position with respect to the workpiece W.
The image 40 shown in fig. 6 is an image that enters the field of view a of the imaging section 16 when the imaging section 16 and the workpiece W are positioned at the 1 st imaging position. Each pixel of the image 40 captured by the imaging unit 16 is in the imaging unit coordinate system C in fig. 6IAnd (4) showing.
The imaging unit coordinate system C_I in the robot coordinate system C_R can be determined from the position of the imaging unit 16 in the robot coordinate system C_R, and from the line-of-sight direction and angle of view of the imaging unit 16.
Next, the control unit 12 operates the moving mechanism 14 to place the imaging unit 16 at the 2nd position and posture. Thereby, the imaging unit 16 is positioned at the 2nd imaging position with respect to the workpiece W. When the imaging unit 16 is positioned at the 2nd imaging position with respect to the workpiece W, the field of view A of the imaging unit 16 is arranged, with respect to the inspected surface S_I, at the position shown as region A_2 in fig. 7.
When the imaging unit 16 and the workpiece W are arranged at the 2nd imaging position, the control unit 12 sends an imaging command to the imaging unit 16 to image the inspected surface S_I of the workpiece W. Thereby, an image corresponding to the region A_2 in fig. 7 is captured.
In this way, the control unit 12 sequentially arranges the imaging unit 16 at the 3rd position and posture, the 4th position and posture, ..., and the nth position and posture, thereby sequentially positioning the imaging unit 16 at the 3rd imaging position, the 4th imaging position, ..., and the nth imaging position with respect to the workpiece W. Each time the imaging unit 16 and the workpiece W are positioned at one of these imaging positions, the control unit 12 causes the imaging unit 16 to image the inspected surface S_I of the workpiece W.
Thus, the control unit 12 acquires a total of 12 images corresponding to the regions A_1 to A_12 in fig. 7. The control unit 12 analyzes each image acquired from the imaging unit 16, and detects flaws such as scratches formed on the inspected surface S_I.
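The imaging sequence described above amounts to a simple control loop. The following is a minimal sketch in Python; move_to_pose, set_lighting, and capture_image are hypothetical stand-ins for the commands that the control unit 12 sends to the moving mechanism 14, the illumination device 18, and the imaging unit 16, and are not names taken from this document.

```python
def acquire_inspection_images(target_poses, move_to_pose, set_lighting, capture_image):
    """Sketch of the imaging sequence: position the imaging unit at each nth pose
    (the nth position and orientation of the tool coordinate system C_T) and
    capture one image per pose. All callables are assumed, caller-supplied helpers."""
    set_lighting(on=True)                  # irradiate the workpiece W
    images = []
    for n, pose in enumerate(target_poses, start=1):
        move_to_pose(pose)                 # nth imaging position with respect to the workpiece W
        images.append(capture_image())     # image of region A_n of the inspected surface S_I
    set_lighting(on=False)
    return images                          # e.g. 12 images for the regions A_1 to A_12
```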
Next, the functions of the apparatus 50 provided in the inspection system 110 will be described with reference to fig. 19 and 20. In the flow shown in fig. 19, the same steps as those in the flow shown in fig. 8 are denoted by the same step numbers, and detailed description thereof is omitted.
After step S2, in step S3', the control unit 12 receives the fixed position of the workpiece W in the robot coordinate system C_R. For example, the operator operates the operation unit of the control unit 12 to input the predetermined fixed position of the workpiece W in the robot coordinate system C_R.
For example, in the embodiment shown in fig. 18, the workpiece W is fixed such that the inspected surface S_I of the workpiece W is arranged at a position separated from the origin of the robot coordinate system C_R by a predetermined distance in the positive x-axis direction.
The operator operates the operation unit of the control unit 12 to input parameters for specifying the arrangement of the workpiece W in the robot coordinate system C_R (the angle of the inspected surface S_I, the distance from the origin, etc.). The control unit 12 receives the operation of the operation unit by the operator, and acquires the fixed position of the workpiece W in the robot coordinate system C_R.
In step S5', the control unit 12 sets the number "n" of the nth imaging position (i.e., the nth position and posture at which the imaging unit 16 is disposed, and the nth position and orientation of the tool coordinate system C_T) to "1".
In step S6', the control unit 12 generates, based on the drawing data of the workpiece W (the workpiece model W_M), an estimated image that is estimated to be captured by the imaging unit 16 when the imaging unit 16 is positioned at the nth imaging position with respect to the workpiece W by the moving mechanism 14.
This step S6' will be described with reference to fig. 20. In the flow shown in fig. 20, the same steps as those in the flow shown in fig. 11 are denoted by the same step numbers, and detailed description thereof is omitted. Hereinafter, a case will be described where the number "n" of the nth imaging position is set to "1" (step S5') at the start of step S6'.
After the start of step S6', in step S12', the control unit 12 calculates the coordinates in the robot coordinate system C_R of each point of the point group 61 generated in step S4. Here, in the above step S3', the fixed position of the workpiece W in the robot coordinate system C_R has been specified.
Therefore, based on the fixed position of the workpiece W in the robot coordinate system C_R and the point group 61 generated in step S4, the control unit 12 can calculate the coordinates in the robot coordinate system C_R of each point of the point group 61 corresponding to the inspected surface S_I fixed at that position. The control unit 12 assigns a number "m" (m = 1 to γ) to each point whose coordinates have been calculated.
In step S14', the control unit 12 converts (i.e., projects) the coordinates of the m-th point of the point group 61, whose coordinates were calculated in step S12', into the imaging unit coordinate system C_I.
The imaging unit coordinate system C_I at this time is determined from the information on the 1st position and orientation set for the tool coordinate system C_T, and from the line-of-sight direction of the imaging unit 16 (i.e., the z-axis of the tool coordinate system C_T) and its angle of view.
The control unit 12 multiplies the coordinates of the point calculated in step S12' by a transformation matrix, thereby converting the coordinates of the point in the robot coordinate system C_R into the imaging unit coordinate system C_I. The transformation matrix is a matrix for converting coordinates in the robot coordinate system C_R into the imaging unit coordinate system C_I.
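As a rough illustration of this conversion, the sketch below (Python with NumPy) transforms a point expressed in the robot coordinate system C_R into the camera frame with a 4x4 homogeneous transformation matrix and projects it onto the image plane with a pinhole model. The matrix T_cam_from_robot and the intrinsic parameters fx, fy, cx, cy are assumptions made for the example; they are not values given in this document.

```python
import numpy as np

def project_point(p_robot, T_cam_from_robot, fx, fy, cx, cy):
    """Convert one point of the point group 61 from the robot coordinate system C_R
    into pixel coordinates of the imaging unit coordinate system C_I.
    T_cam_from_robot is an assumed 4x4 homogeneous transform; fx, fy, cx, cy are
    assumed pinhole intrinsics derived from the line-of-sight direction and angle of view."""
    p_h = np.append(np.asarray(p_robot, dtype=float), 1.0)   # homogeneous coordinates
    x, y, z, _ = T_cam_from_robot @ p_h                      # point in the camera frame
    if z <= 0.0:
        return None                                          # behind the camera: outside the field of view A
    u = fx * x / z + cx                                       # pixel column
    v = fy * y / z + cy                                       # pixel row
    return u, v
```

Applying this to every point of the point group 61 produces the pixel positions from which the estimated image is drawn.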
The control unit 12 then loops through steps S14' to S18 until it determines yes at step S18 in fig. 20, and generates the estimation image 62 shown in fig. 12.
In this way, the control unit 12 loops through steps S6' to S9 until it determines yes at step S9 in fig. 19. Thus, the control unit 12 sequentially generates estimated images of the field of view A of the imaging unit 16 corresponding to the regions A_n (n = 1 to 12) in fig. 7, and calculates the non-inspection region in each estimated image.
As described above, according to the present embodiment, the control unit 12 can automatically calculate the non-inspection regions H' and E' from the drawing data of the workpiece W, as in the above-described embodiment, and therefore the operator can omit the operation of manually setting the non-inspection regions for each field of view A. Therefore, the man-hours required for starting up the inspection system 110 can be reduced.
Next, an inspection system 120 according to still another embodiment will be described with reference to fig. 21. The inspection system 120 is different from the inspection system 10 described above in that it includes the device 130. The apparatus 130 of the present embodiment includes a drawing acquisition unit 52, a designation reception unit 54, and a non-inspection-area calculation unit 132.
The device 130 is mounted to the control unit 12. Therefore, the control unit 12 functions as the drawing acquisition unit 52, the designation reception unit 54, and the non-inspection region calculation unit 132. In the present embodiment, the control unit 12 generates the point group 61 described above, determines whether or not the line of sight of each imaging element of the imaging sensor incorporated in the imaging unit 16 passes through a point of the point group 61, and calculates a non-inspection region.
The line of sight of the imaging elements will be described below with reference to fig. 24. The region (a) in fig. 24 is an enlarged view of a part of the light receiving surface of the imaging sensor 16a. The imaging unit 16 includes the imaging sensor 16a and an optical system 16b including at least one lens. As shown in the region (a) of fig. 24, the imaging sensor 16a has a plurality of imaging elements I arranged in alignment in the x-axis direction and the y-axis direction of the imaging unit coordinate system C_I.
Each imaging element I has a line of sight K. The line of sight K coincides with a virtual straight line connecting the imaging element I and the focal point J. Here, the coordinates in the robot coordinate system C_R of each imaging element I and the focal point J can be determined from the fixed position of the imaging unit 16 in the robot coordinate system C_R and the drawing data of the imaging unit 16. The line of sight K of each imaging element I in the robot coordinate system C_R can then be obtained from the coordinates of that imaging element I and the focal point J in the robot coordinate system C_R.
The control unit 12 calculates the line of sight K of each imaging element I, determines whether or not the line of sight K passes through the points of the generated point group 61, and calculates the non-inspection region H, E included in the image (fig. 6) that enters the field of view A of the imaging unit 16 when the workpiece W is disposed at the nth imaging position with respect to the imaging unit 16.
The operation flow of the apparatus 130 will be described below with reference to fig. 22. In the flow shown in fig. 22, the same steps as those in the flow shown in fig. 8 are denoted by the same step numbers, and detailed description thereof is omitted. After the flow of fig. 22 is started, the controller 12 executes steps S1 to S5 in the same manner as in the above-described embodiment.
After step S5, in step S30, the control unit 12 calculates a non-inspection region. This step S30 will be described with reference to fig. 23. In the flow shown in fig. 23, the same steps as those in the flow shown in fig. 11 are denoted by the same step numbers, and detailed description thereof is omitted.
After the flow shown in fig. 23 is started, the control unit 12 executes step S11 described above. Specifically, the control unit 12 acquires the nth position and orientation of the tool coordinate system C_T that is set when the workpiece W is positioned at the nth imaging position with respect to the imaging unit 16.
Next, the control unit 12 executes step S12 described above. Specifically, the control unit 12 calculates the coordinates in the robot coordinate system C_R of each point of the point group 61 when the tool coordinate system C_T is arranged at the nth position and orientation.
In step S31, the control unit 12 sets the number "i" assigned to each imaging element I to "1". For example, when the imaging sensor of the imaging unit 16 includes a total of 10^7 imaging elements I, each imaging element I is assigned a number "i" from 1 to 10^7.
In step S32, the control unit 12 calculates the line of sight K_i of the i-th imaging element I. For example, when the number "i" of the imaging element I is set to "100" (i.e., i = 100) at the start of step S32, the control unit 12 calculates the line of sight K_100 of the 100th imaging element I.
As described above, the control unit 12 can obtain the coordinates in the robot coordinate system C_R of the i-th imaging element I and the focal point J from the fixed position of the imaging unit 16 in the robot coordinate system C_R and the drawing data of the imaging unit 16.
From these coordinates, the control unit 12 can calculate the coordinates (or function) of the line of sight K_i in the robot coordinate system C_R. The control unit 12 stores the calculated coordinates (or function) of the line of sight K_i in the storage unit.
In step S33, the control unit 12 determines whether or not the line of sight K_i calculated in step S32 passes through a point of the point group 61 generated in step S4. Specifically, the control unit 12 makes this determination based on the line of sight K_i calculated in step S32 and on the coordinates in the robot coordinate system C_R of each point of the point group 61 calculated in step S12.
When the control unit 12 determines that the line of sight K_i passes through a point of the point group 61 (i.e., "yes"), the process proceeds to step S34. On the other hand, when the control unit 12 determines that the line of sight K_i does not pass through any point of the point group 61 (i.e., "no"), the process proceeds to step S35.
If the determination in step S33 is "yes", then in step S34 the control unit 12 sets the i-th pixel imaged by the i-th imaging element I as an inspected-surface pixel, that is, a pixel displaying the inspected surface S_I when the workpiece W is arranged at the nth imaging position with respect to the imaging unit 16.
On the other hand, if the determination in step S33 is "no", then in step S35 the control unit 12 sets the i-th pixel imaged by the i-th imaging element I as a non-inspection-region pixel, that is, a pixel displaying the non-inspection region H, E when the workpiece W is arranged at the nth imaging position with respect to the imaging unit 16.
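Steps S32 to S35 can be pictured as a point-to-ray distance test: the line of sight K_i runs from the i-th imaging element I through the focal point J, and a point of the point group 61 is taken to be "passed through" when it lies within a small tolerance of that ray. The following Python sketch assumes this tolerance-based test; the helper name and the tolerance eps are illustrative and not taken from this document.

```python
import numpy as np

def element_sees_surface(element_pos, focal_point, points, eps=1e-3):
    """Steps S32-S35 in sketch form: return True if the line of sight K_i (the ray
    from the imaging element I through the focal point J) passes within eps of any
    point of the point group 61 (inspected-surface pixel), otherwise False
    (non-inspection-region pixel). eps is an assumed tolerance."""
    origin = np.asarray(element_pos, dtype=float)
    direction = np.asarray(focal_point, dtype=float) - origin
    direction /= np.linalg.norm(direction)
    for p in np.asarray(points, dtype=float):
        v = p - origin
        t = v @ direction                              # position of the point along the ray
        if t <= 0.0:
            continue                                   # point lies behind the imaging element
        if np.linalg.norm(v - t * direction) <= eps:   # perpendicular distance to the ray
            return True
    return False
```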
In step S36, the control unit 12 increments the number "i" of the i-th imaging element I by "1" (i.e., i = i + 1). In step S37, the control unit 12 determines whether or not the number "i" is greater than the total number δ of imaging elements I (e.g., δ = 10^7).
If it is determined that the number "i" is larger than the total number δ (that is, "yes"), the control unit 12 proceeds to step S38. On the other hand, if the control unit 12 determines that the number "i" is equal to or less than the total number δ (i.e., no), the process returns to step S32.
In this way, the control unit 12 loops through steps S32 to S37 until it determines "yes" in step S37. Thus, the control unit 12 determines for every imaging element I whether its line of sight K_i passes through a point of the point group 61, and thereby sets each pixel constituting the image entering the field of view A of the imaging unit 16 at the nth imaging position as either an inspected-surface pixel or a non-inspection-region pixel.
In step S38, the control unit 12 divides the image into the inspected region and the non-inspection region. Specifically, the non-inspection region (e.g., the region H, E) of the image entering the field of view A of the imaging unit 16 when the workpiece W is arranged at the nth imaging position with respect to the imaging unit 16 (i.e., the image composed of the 1st to δ-th pixels) is demarcated based on the setting of each i-th pixel (i = 1 to δ) as an inspected-surface pixel or a non-inspection-region pixel.
For example, the control unit 12 counts the number of inspected-surface pixels in each unit region (for example, a region of 10 pixels × 10 pixels) of the image in the field of view A of the imaging unit 16, and determines that the unit region belongs to the non-inspection region H, E when this number is equal to or less than a predetermined threshold value.
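A minimal sketch of this unit-region check, assuming the per-pixel classification has already been stored in a boolean image array (True for inspected-surface pixels); the 10 × 10 block size follows the example in the text, while the threshold value and the array layout are assumptions.

```python
import numpy as np

def mark_non_inspection_blocks(is_surface_pixel, block=10, threshold=20):
    """Divide the image of the field of view A into block x block unit regions and
    flag a region as part of the non-inspection region H, E when it contains no
    more than `threshold` inspected-surface pixels. `threshold` is an assumed value."""
    h, w = is_surface_pixel.shape
    non_inspection = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            tile = is_surface_pixel[by * block:(by + 1) * block,
                                    bx * block:(bx + 1) * block]
            non_inspection[by, bx] = int(tile.sum()) <= threshold
    return non_inspection
```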
The control unit 12 can calculate the non-inspection region H, E by executing this operation over the entire image in the field of view A. As described above, in the present embodiment, the control unit 12 functions as the non-inspection region calculation unit 132 that calculates the non-inspection region H, E.
According to the present embodiment, the control unit 12 can calculate the non-inspection region H, E based on the line of sight K of the imaging element I without generating the estimated image 62 described above.
Note that the line of sight K_i of an imaging element I corresponds to the object image incident on that imaging element I, and the coordinates (or function) of the line of sight K_i of each imaging element I in the robot coordinate system C_R may be calculated not as a straight line but as a light beam having a predetermined cross-sectional area (for example, a cylindrical beam). Alternatively, the points of the point group 61 may be treated as closed regions (e.g., circles) having a predetermined area.
Further, those skilled in the art will readily understand that the technical concept of the apparatus 130 shown in fig. 21 can also be applied to the inspection system shown in fig. 18 (i.e., the embodiment in which the imaging unit 16 is fixed to the wrist section 26).
The control unit 12 may simulate the functions of the devices 50, 100, and 130 using simulation software or the like. Specifically, the control unit 12 arranges the three-dimensional models of the components of the inspection system 10 (the robot base 20, the rotary cylinder 22, the robot arm 24, the wrist portion 26, and the robot hand 28) together with the workpiece model W_M in a virtual space.
The control unit 12 then positions, in a simulated manner in the virtual space, the workpiece model W_M and the three-dimensional model of the imaging unit 16 such that the inspected surface model S_IM enters the virtual field of view A of the three-dimensional model of the imaging unit 16.
The control unit 12 then simulatively images the inspected surface model S_IM with the three-dimensional model of the imaging unit 16 to generate an estimated image within the virtual field of view A. Then, the control unit 12 calculates a non-inspection region from the generated estimated image. By such a simulation method, the non-inspection region can be calculated as in the above-described embodiments.
In the inspection systems 10, 90, 110, and 120, the image region outside the inspection target (i.e., the non-inspection region) may be set arbitrarily. Specifically, in the inspection system 10 or 90, when the robot hand 28 grips the workpiece W, the finger parts 36 lying on the inspected surface S_I may appear in the image that enters the field of view A of the imaging unit 16 in real space.
In this case, when the control unit 12 generates the point group 61 in step S4 in fig. 8, the region of the inspected surface S_I on which the finger parts 36 are arranged is excluded from the region to which points are assigned. The region of the inspected surface S_I on which the finger parts 36 are arranged can be specified based on the drawing data of the finger parts 36, the drawing data of the workpiece W, and the gripping position at which the robot hand 28 grips the workpiece W.
Thus, in step S7 described above, the control unit 12 can calculate the region of the inspected surface S_I on which the finger parts 36 are arranged as a non-inspection region. As a result, this image region can be excluded from the inspection performed by the inspection system 10.
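One way to realize this exclusion is sketched below, under the assumption that the footprint of the finger parts 36 on the inspected surface S_I is available as a set of 2-D polygons derived from the drawing data and the gripping position; the polygon representation and the caller-supplied point_in_polygon test are illustrative and not part of this document.

```python
def filter_point_group(points_2d, finger_polygons, point_in_polygon):
    """Drop the points of the point group that fall inside the regions of the
    inspected surface S_I covered by the finger parts 36, so that those regions
    are later calculated as a non-inspection region in step S7. The polygon test
    is an assumed, caller-supplied helper."""
    kept = []
    for p in points_2d:
        if not any(point_in_polygon(p, poly) for poly in finger_polygons):
            kept.append(p)
    return kept
```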
In the above-described embodiments, the image generating unit 56 generates the point groups 61, 66, 68, 70, 72, 74, 76, 78, and 80 (step S4) and converts them into the imaging unit coordinate system C_I (steps S14, S14') to generate the estimated images 62 and 82.
However, the image generating unit 56 is not limited to this. It may simulate, in the robot coordinate system C_R, the three-dimensional model of the inspected surface S_I (i.e., the inspected surface model S_IM) arranged at the nth position and orientation of the tool coordinate system C_T, and convert the inspected surface model S_IM arranged in the robot coordinate system C_R into the imaging unit coordinate system C_I to generate an estimated image within the field of view A of the imaging unit 16. In this case, the control unit 12 can generate an estimated image that substantially matches the image 40 shown in fig. 6.
In the above-described embodiment, the case where the devices 50 and 100 are mounted on the control unit 12 is described. However, the present invention is not limited to this, and the devices 50 and 100 may be configured as elements different from the control unit 12.
Fig. 25 shows such an embodiment. The inspection system 10 'shown in fig. 25 includes a control unit 12, a moving mechanism 14, an imaging unit 16, an illumination device 18, and a device 50'.
In the present embodiment, the device 50' is configured as an element different from the control unit 12, and is connected to the control unit 12 so as to be able to communicate with it. The apparatus 50' executes the flow shown in fig. 8 and 11 to calculate the non-inspection region, similarly to the apparatus 50 described above.
Similarly, the device 130 shown in fig. 21 may be configured as an element different from the control unit 12. The illumination device 18 may be omitted from the inspection system 10, 10', 90, 110, or 120, and the inspected surface S_I may be irradiated with, for example, natural light.
In the above-described embodiment, the case where the moving mechanism 14 is constituted by the vertical articulated robot has been described, but the present invention is not limited to this, and may be constituted by a loader, for example.
The apparatus 50, 50', 100, or 130 may be constituted by one computer having a CPU, a storage unit, and the like. Alternatively, the drawing acquisition unit 52, the designation reception unit 54, the image generation unit 56 (the point group generation unit 57 and the drawn image generation unit 59), the non-inspection region calculation units 58 and 132, and the mask generation unit 102 may each be configured by a single computer having a CPU, a storage unit, and the like.
The present disclosure has been described above by way of embodiments, but the embodiments described above do not limit the invention of the claims.

Claims (17)

1. An apparatus for calculating an image area outside an inspection object to be inspected in an inspection system having an imaging unit for imaging a workpiece and a moving mechanism for moving the workpiece or the imaging unit to position the workpiece and the imaging unit with respect to each other and inspecting a surface of the workpiece,
the apparatus is characterized by comprising:
a drawing acquisition unit that acquires CAD data of a workpiece model that models the workpiece;
a designation receiving unit that receives designation of a surface model to be inspected included in the workpiece model in the CAD data;
a point group generating unit that generates a point group on the surface model to be inspected based on the CAD data; and
and a non-inspection region calculation unit configured to calculate an image region other than the inspection surface in the image within the field of view of the imaging unit as a non-inspection region based on coordinate data of the point group in a moving mechanism coordinate system for controlling the moving mechanism when the workpiece and the imaging unit are positioned at an imaging position where at least a part of the inspection surface of the workpiece corresponding to the designated inspection surface model enters the field of view of the imaging unit.
2. The apparatus of claim 1,
further comprising an image generating unit that generates, based on the CAD data, an estimated image that is estimated to be within the field of view captured by the imaging unit when the workpiece and the imaging unit are positioned at the imaging position,
the image generation unit includes:
the point group generating unit; and
a drawing image generating unit that generates the estimation image by drawing, using the coordinate data, each point of the point group generated on at least a part of the surface model to be inspected corresponding to at least a part of the surface to be inspected estimated to be imaged when the workpiece and the imaging unit are positioned at the imaging position, on an imaging unit coordinate system defining the field of view,
the non-inspection region calculation unit calculates, as the non-inspection region, an image region other than the surface to be inspected included in the estimated image generated by the image generation unit.
3. The apparatus of claim 1,
further comprising a determination unit that determines whether or not the line of sight passes through the points constituting the point group in the moving mechanism coordinate system, based on the coordinate data and on coordinate data in the moving mechanism coordinate system of the lines of sight of the plurality of imaging elements of the imaging unit when the workpiece and the imaging unit are positioned at the imaging position,
the non-inspection region calculation unit calculates the non-inspection region based on an inspected surface pixel corresponding to the image pickup device determined that the line of sight passes through the point and a non-inspection region pixel corresponding to the image pickup device determined that the line of sight does not pass through the point, which constitute an image within a field of view of the image pickup unit when the workpiece and the image pickup unit are positioned at the image pickup position.
4. The device according to any one of claims 1 to 3,
the inspection apparatus further includes a mask generating unit that gives a color different from that of the surface to be inspected to the non-inspection region, or gives a mark for making the non-inspection region out of the inspection target to pixels of the non-inspection region.
5. The device according to any one of claims 1 to 3,
the moving mechanism holds the workpiece at a predetermined holding position and moves the workpiece with respect to the imaging unit.
6. The device according to any one of claims 1 to 3,
the imaging unit is fixed to the moving mechanism and is moved relative to the workpiece by the moving mechanism.
7. The device according to any one of claims 1 to 3,
the non-inspection region calculation unit calculates the non-inspection region in the image within the field of view of the imaging unit for each of a plurality of imaging positions at which the workpiece and the imaging unit are positioned such that a plurality of mutually different regions of the inspection surface enter the field of view of the imaging unit.
8. An apparatus for calculating an image area outside an inspection object to be inspected in an inspection system having an imaging unit for imaging a workpiece and a moving mechanism for moving the workpiece or the imaging unit to position the workpiece and the imaging unit with respect to each other and inspecting a surface of the workpiece,
the apparatus is characterized by comprising:
a drawing acquisition unit that acquires CAD data of a workpiece model that models the workpiece;
a designation receiving unit that receives designation of a surface model to be inspected included in the workpiece model in the CAD data;
an image generating unit that generates, based on the CAD data, an estimated image estimated to be within a field of view of the imaging unit when the workpiece and the imaging unit are positioned at the imaging position where at least a part of an inspected surface of the workpiece corresponding to the designated inspected surface model enters the field of view of the imaging unit; and
a non-inspection region calculation unit that calculates an image region other than the surface to be inspected as a non-inspection region in an image within a field of view of the imaging unit when the workpiece and the imaging unit are positioned at the imaging position,
the image generation unit includes:
a point group generating unit that generates a point group on the surface model to be inspected based on the CAD data; and
a drawing image generating unit that generates the estimation image by drawing each of the point groups generated on at least a part of the surface model to be inspected corresponding to at least a part of the surface to be inspected estimated to be imaged when the workpiece and the imaging unit are positioned at the imaging position, on an imaging unit coordinate system defining the field of view,
the non-inspection region calculation unit calculates, as the non-inspection region, an image region other than the surface to be inspected included in the estimated image generated by the image generation unit.
9. The apparatus of claim 8,
the inspection apparatus further includes a mask generating unit that gives a color different from that of the surface to be inspected to the non-inspection region, or gives a mark for making the non-inspection region out of the inspection target to pixels of the non-inspection region.
10. The apparatus of claim 8,
the moving mechanism holds the workpiece at a predetermined holding position and moves the workpiece with respect to the imaging unit.
11. The apparatus of claim 9,
the moving mechanism holds the workpiece at a predetermined holding position and moves the workpiece with respect to the imaging unit.
12. The apparatus according to any one of claims 8 to 11,
the imaging unit is fixed to the moving mechanism and is moved relative to the workpiece by the moving mechanism.
13. The apparatus according to any one of claims 8 to 11,
the non-inspection region calculation unit calculates the non-inspection region in the image within the field of view of the imaging unit for each of a plurality of imaging positions at which the workpiece and the imaging unit are positioned such that a plurality of mutually different regions of the inspection surface enter the field of view of the imaging unit.
14. The apparatus of claim 12,
the non-inspection region calculation unit calculates the non-inspection region in the image within the field of view of the imaging unit for each of a plurality of imaging positions at which the workpiece and the imaging unit are positioned such that a plurality of mutually different regions of the inspection surface enter the field of view of the imaging unit.
15. An inspection system, characterized in that,
a device according to any one of claims 1 to 14.
16. A method of calculating an image area outside an inspection object to be inspected in an inspection system having an imaging unit for imaging a workpiece and a moving mechanism for moving the workpiece or the imaging unit to position the workpiece and the imaging unit with respect to each other, and inspecting a surface of the workpiece,
the method is characterized by comprising the following steps:
acquiring CAD data of a workpiece model for modeling the workpiece;
receiving designation of a surface model to be inspected included in the workpiece model in the CAD data;
generating a point group on the inspected surface model based on the CAD data; and
calculating, as a non-inspection area, an image area other than the surface to be inspected in an image within the field of view of the imaging unit, based on coordinate data of the point group in a moving mechanism coordinate system for controlling the moving mechanism when the workpiece and the imaging unit are positioned at an imaging position where at least a part of the surface to be inspected of the workpiece corresponding to the designated surface model to be inspected enters the field of view of the imaging unit.
17. A method of calculating an image area outside an inspection object to be inspected in an inspection system having an imaging unit for imaging a workpiece and a moving mechanism for moving the workpiece or the imaging unit to position the workpiece and the imaging unit with respect to each other, and inspecting a surface of the workpiece,
the method is characterized by comprising the following steps:
acquiring CAD data of a workpiece model for modeling the workpiece;
receiving designation of a surface model to be inspected included in the workpiece model in the CAD data;
generating, based on the CAD data, an estimation image estimated to be within a field of view of the imaging unit when the workpiece and the imaging unit are positioned at an imaging position where at least a part of the surface to be inspected of the workpiece corresponding to the designated surface model to be inspected enters the field of view of the imaging unit; and
calculating an image region other than the surface to be inspected as a non-inspection region in an image within a field of view of the imaging unit when the workpiece and the imaging unit are positioned at the imaging position,
generating a point group on the inspected surface model based on the CAD data,
generating the estimated image by plotting each of the point groups generated on at least a part of the surface model to be inspected corresponding to at least a part of the surface to be inspected estimated to be imaged when the workpiece and the imaging section are positioned at the imaging position on an imaging section coordinate system defining the field of view,
and calculating an image region other than the surface to be inspected included in the generated estimated image as the non-inspection region.
CN201810225234.4A 2017-03-21 2018-03-19 Device and method for calculating image area outside inspection object of inspection system Active CN108627515B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017054628A JP6392922B1 (en) 2017-03-21 2017-03-21 Apparatus for calculating region that is not subject to inspection of inspection system, and method for calculating region that is not subject to inspection
JP2017-054628 2017-03-21

Publications (2)

Publication Number Publication Date
CN108627515A CN108627515A (en) 2018-10-09
CN108627515B true CN108627515B (en) 2020-02-18

Family

ID=63449992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810225234.4A Active CN108627515B (en) 2017-03-21 2018-03-19 Device and method for calculating image area outside inspection object of inspection system

Country Status (4)

Country Link
US (1) US10724963B2 (en)
JP (1) JP6392922B1 (en)
CN (1) CN108627515B (en)
DE (1) DE102018105955B4 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017207069A1 (en) * 2017-04-27 2018-10-31 Robert Bosch Gmbh Test device for optical inspection of an object, production plant with the test device and method for optical testing of the object with the test device
WO2019130543A1 (en) * 2017-12-28 2019-07-04 株式会社Fuji Information providing device, information providing method and program
JP7163115B2 (en) * 2018-09-12 2022-10-31 キヤノン株式会社 ROBOT SYSTEM, ROBOT SYSTEM CONTROL METHOD, PRODUCT MANUFACTURING METHOD, CONTROL DEVICE, OPERATION DEVICE, IMAGING DEVICE, CONTROL PROGRAM, AND RECORDING MEDIUM
US10969771B2 (en) * 2019-06-12 2021-04-06 Edison Welding Institute, Inc. Computed tomography for non-destructive evaluation of manufactured parts
CN111007084B (en) * 2020-01-03 2023-10-24 佛亚智能装备(苏州)有限公司 360-degree flaw detection method and shooting mechanism for shaft surface

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101034070A (en) * 2006-03-10 2007-09-12 欧姆龙株式会社 Device for and method of inspecting surface condition
CN102959355A (en) * 2010-07-05 2013-03-06 株式会社拓普康 Point cloud data processing device, point cloud data processing system, point cloud data processing method, and point cloud data processing program
CN103313921A (en) * 2011-02-15 2013-09-18 欧姆龙株式会社 Image processing apparatus and image processing system
JP2016224707A (en) * 2015-05-29 2016-12-28 リコーエレメックス株式会社 Inspection system
JP2017015396A (en) * 2015-06-26 2017-01-19 キヤノン株式会社 Inspection method, inspection apparatus, processing apparatus, program, and recording medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5321767A (en) * 1992-08-28 1994-06-14 Kabushiki Kaisha Komatsu Seisakusho Method of forming a mask in image processing operation
JP3293257B2 (en) 1993-08-30 2002-06-17 株式会社デンソー Surface defect inspection equipment
JP3433333B2 (en) * 1994-05-09 2003-08-04 大日本印刷株式会社 Defect inspection method
JP4206192B2 (en) * 2000-11-09 2009-01-07 株式会社日立製作所 Pattern inspection method and apparatus
SE524818C2 (en) * 2003-02-13 2004-10-05 Abb Ab A method and system for programming an industrial robot to move relatively defined positions on an object
DE102004007829B4 (en) * 2004-02-18 2007-04-05 Isra Vision Systems Ag Method for determining areas to be inspected
JP2009097922A (en) * 2007-10-15 2009-05-07 Daido Steel Co Ltd Visual inspection method and apparatus
WO2010091086A1 (en) * 2009-02-03 2010-08-12 Fanuc Robotics America, Inc. Method of controlling a robotic tool
US9885669B2 (en) * 2010-12-29 2018-02-06 Koh Young Technology Inc. Method of inspecting a substrate
JP5799516B2 (en) * 2011-02-03 2015-10-28 セイコーエプソン株式会社 Robot apparatus, inspection apparatus, inspection program, and inspection method
JP5912627B2 (en) * 2012-02-14 2016-04-27 川崎重工業株式会社 Imaging inspection apparatus, control apparatus and control method thereof
JP6286921B2 (en) * 2012-09-14 2018-03-07 株式会社リコー Image inspection apparatus, image inspection system, and image inspection method
JP6676286B2 (en) * 2015-05-12 2020-04-08 キヤノン株式会社 Information processing method and information processing apparatus

Also Published As

Publication number Publication date
US20180275073A1 (en) 2018-09-27
DE102018105955A1 (en) 2018-09-27
CN108627515A (en) 2018-10-09
US10724963B2 (en) 2020-07-28
JP2018155695A (en) 2018-10-04
DE102018105955B4 (en) 2021-04-01
JP6392922B1 (en) 2018-09-19

Similar Documents

Publication Publication Date Title
CN108627515B (en) Device and method for calculating image area outside inspection object of inspection system
CN106873550B (en) Simulation device and simulation method
JP5897624B2 (en) Robot simulation device for simulating workpiece removal process
US8406923B2 (en) Apparatus for determining pickup pose of robot arm with camera
CN110712194B (en) Object inspection device, object inspection system, and method of adjusting inspection position
US20080013825A1 (en) Simulation device of robot system
CN108961144A (en) Image processing system
US11446822B2 (en) Simulation device that simulates operation of robot
JP4553437B2 (en) Image inspection system and control method
US7711507B2 (en) Method and device for determining the relative position of a first object with respect to a second object, corresponding computer program and a computer-readable storage medium
CN109945780B (en) Object inspection system and object inspection method
Marchand Control camera and light source positions using image gradient information
US10434650B2 (en) Programming device which generates operation program and method for generating program
US20180231474A1 (en) Apparatus and method for generating operation program of inspection system
GB2582931A (en) A method for determining camera placement within a robotic cell environment
CN116472551A (en) Apparatus, robot system, method and computer program for adjusting parameters
JPH09323280A (en) Control method and system of manupulator
JP2017113815A (en) Image display method of robot system that holds object using robot
JP2020134221A (en) Scanning route generation device, scanning route generation program, scanning route generation method, and appearance inspection system
JP2021010994A (en) Sensor position attitude calibration apparatus and sensor position attitude calibration method
US20230154162A1 (en) Method For Generating Training Data Used To Learn Machine Learning Model, System, And Non-Transitory Computer-Readable Storage Medium Storing Computer Program
WO2022249295A1 (en) Robot simulation device
WO2022181500A1 (en) Simulation device using three-dimensional position information obtained from output from vision sensor
JP2019036072A (en) Image processing method, image processing system and manufacturing method
WO2022208963A1 (en) Calibration device for controlling robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant