WO2021010181A1 - Inspection device, inspection method, positioning method, and program - Google Patents

Inspection device, inspection method, positioning method, and program

Info

Publication number
WO2021010181A1
Authority
WO
WIPO (PCT)
Prior art keywords
inspection
measurement
unit
robot arm
inspection object
Prior art date
Application number
PCT/JP2020/026010
Other languages
French (fr)
Japanese (ja)
Inventor
西 雄一
孝博 染次
真志 園田
崇善 西村
敦子 榎本
中須 信昭
Original Assignee
日立金属株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立金属株式会社
Priority to JP2021532783A
Publication of WO2021010181A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination

Definitions

  • This disclosure relates to inspection equipment, inspection methods, positioning methods, and programs.
  • Patent Document 1 discloses an inspection device comprising a robot arm provided with a plurality of cameras; by judging the quality of the inspection object based on the image data captured by the cameras, products can be inspected with consistently high quality.
  • Patent Document 2 discloses a visual inspection system comprising an inspection device composed of a robot arm equipped with a sensor unit, an inspection jig composed of a robot arm equipped with a holder for holding the inspection object, and a control unit, together with a visual inspection method that uses this system to inspect the appearance of the inspection object.
  • In particular, it proposes a visual inspection method that combines automatic extraction of appearance abnormalities by a computer with pass/fail judgment of the abnormal portions by an operator.
  • In Patent Documents 1 and 2, however, the motion of the robot arm must be taught for each shape of the inspection object and each inspection item using a programming pendant or the like, so preparing for an inspection takes many man-hours and is inefficient.
  • The present disclosure has been made in view of the above problems, and its object is to provide an inspection device and the like that do not require robot-arm teaching work.
  • According to one embodiment of the present disclosure, there is provided an inspection device comprising: a gripping robot arm that grips an inspection object; a measurement sensor; a measurement position calculation unit that calculates measurement positions of the measurement sensor relative to the inspection object based on shape data of the inspection object; a trajectory generation unit that generates a trajectory of the gripping robot arm based on the measurement positions; a robot control unit that controls the operation of the gripping robot arm based on the trajectory and positions the inspection object; a measurement unit that measures the inspection object with the measurement sensor after positioning; and an inspection unit that inspects the inspection object based on the measured data.
  • According to another embodiment of the present disclosure, there is provided an inspection method in which a computer executes: a step of calculating measurement positions of a measurement sensor relative to an inspection object based on shape data of the inspection object; a step of generating a trajectory of a gripping robot arm that grips the inspection object based on the measurement positions; a step of controlling the operation of the gripping robot arm based on the trajectory and positioning the inspection object; a step of measuring the inspection object with the measurement sensor after positioning; and a step of inspecting the inspection object based on the measured data.
  • According to another embodiment of the present disclosure, there is provided a program that causes a computer to function as: a measurement position calculation unit that calculates measurement positions of a measurement sensor relative to an inspection object based on shape data of the inspection object; a trajectory generation unit that generates a trajectory of a gripping robot arm that grips the inspection object based on the measurement positions; a robot control unit that controls the operation of the gripping robot arm based on the trajectory and positions the inspection object; a measurement unit that measures the inspection object with the measurement sensor after positioning; and an inspection unit that inspects the inspection object based on the measured data.
  • According to another embodiment of the present disclosure, there is provided a positioning method in which a computer executes: a step of calculating measurement positions relative to an inspection object based on shape data of the inspection object; a step of generating a trajectory of a gripping robot arm that grips the inspection object based on the measurement positions; and a step of controlling the operation of the gripping robot arm based on the trajectory and positioning the inspection object.
  • According to the present disclosure, an inspection device and the like that do not require robot-arm teaching work are provided.
  • Brief description of the drawings:
  • FIG. 1: Diagram showing the overall configuration of the inspection device 1
  • FIGS. 2 and 3: Diagrams showing an example of the 2D optical system 42
  • FIG. 4: Block diagram showing an example of the hardware configuration of the control device 6
  • FIG. 5: Block diagram showing an example of the functional configuration of the control device 6
  • FIG. 6: Diagram showing the outline of the inspection process
  • FIG. 7: Flowchart showing the flow of processing for setting the inspection site 80 and the inspection specification 9
  • FIG. 8: Diagram showing an example of setting the inspection site 80
  • FIGS. 9 and 10: Diagrams showing examples of setting the inspection specification 9
  • FIG. 11: Flowchart showing the flow of processing for calculating the measurement position 10
  • FIG. 12: Diagram showing an example of the geometric shape 81 (polyhedron)
  • FIG. 13: Diagram showing an example of the geometric shape 81 (spherical grid)
  • FIG. 14: Diagram showing an example of setting the measurement candidate positions 101
  • FIG. 15: Diagram showing how the measurable areas P of the measurement candidate positions 101 are integrated
  • FIG. 16: Diagram showing an example of the measurement inspection table 100
  • FIG. 17: Diagram showing the overall flow of the inspection
  • FIG. 18: Flowchart showing the flow of the inspection operation executed by the inspection device 1
  • FIG. 19: Flowchart showing the flow of the measurement/inspection processing
  • FIG. 20: Diagram showing an extraction example of defect candidates 71
  • FIG. 21: Diagram showing a display example of a defective part
  • FIG. 1 is a diagram showing an overall configuration of the inspection device 1 of the present embodiment.
  • The inspection device 1 inspects the appearance of the inspection object 7 and is mainly composed of a supply mechanism 2, a gripping mechanism 3, a measurement mechanism 4, a discharge mechanism 5, and a control device 6.
  • The supply mechanism 2 includes a conveyor 21 for supplying the inspection object 7 and a vision sensor 22 for recognizing the position and orientation of the inspection object 7.
  • The vision sensor 22 is arranged above the conveyor 21 and recognizes the position and orientation of the inspection object 7 on the conveyor 21.
  • The inspection object 7 is then gripped by the gripping robot arm 31a or 31b based on the recognized position and posture.
  • The supply mechanism 2 is not limited to the example shown in the figure.
  • For example, the supply mechanism 2 may be composed of a container (not shown) in which inspection objects 7 are piled up and a vision sensor 22 arranged above the container. In this case, the position and orientation of an inspection object 7 are recognized by the vision sensor 22, and the inspection object 7 is gripped by the gripping robot arm 31a or 31b.
  • Alternatively, the supply mechanism 2 may be composed of a tray (not shown) in which the inspection objects 7 are arranged in a fixed posture. In this case, since the position and posture of each inspection object 7 are known in advance, the vision sensor 22 is unnecessary; the inspection object 7 is gripped by the gripping robot arm 31a or 31b without it.
  • The supply mechanism 2 may also be composed of a parts feeder (not shown). In this case as well, since the inspection objects 7 are aligned in a constant posture, they are gripped by the gripping robot arm 31a or 31b without the vision sensor 22.
  • The gripping mechanism 3 is composed of articulated (for example, 6-axis) robot arms (gripping robot arms 31a and 31b) arranged as a dual-arm pair on both sides of the support column 18.
  • The gripping robot arms 31a and 31b include arm portions 311a and 311b and hand portions 312a and 312b, respectively.
  • The hand portions 312a and 312b are provided at the tips of the arm portions 311a and 311b.
  • The hand portions 312a and 312b may have any form as long as they can grip the inspection object 7, but a structure in which more than half of the inspection object 7 remains exposed while gripped is desirable. With such a structure, the entire surface of the inspection object 7 can be measured with a single hand-over between the gripping robot arms 31a and 31b. A pinching type, a suction type, or the like can be adopted for the hand portions 312a and 312b.
  • Hereinafter, the gripping robot arm 31a or 31b may be referred to simply as the gripping robot arm 31.
  • Similarly, the arm portion of the gripping robot arm 31 is referred to as the arm portion 311, and the hand portion as the hand portion 312.
  • The measurement mechanism 4 is composed of an articulated (for example, 6-axis) robot arm (measuring robot arm 41) arranged above the support column 18, a two-dimensional optical system (hereinafter, "2D optical system 42"), and a three-dimensional sensor (hereinafter, "3D sensor 45").
  • The 2D optical system 42 and the 3D sensor 45 are fixed to, for example, the tip of the measuring robot arm 41, either directly or via another member.
  • FIGS. 2 and 3 are diagrams showing the 2D optical system 42.
  • FIG. 2 is a perspective view of the 2D optical system 42, and FIG. 3 is an XZ cross-sectional view at the center of the 2D optical system 42.
  • The 2D optical system 42 includes a camera 43, composed of a main body 43a and a lens 43b, and a plurality of illuminations 44 arranged in a hemispherical pattern.
  • The camera 43 is arranged above the apex of the hemisphere and images the inspection object 7.
  • The camera 43 (main body 43a) is equipped with an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • Each illumination 44 is, for example, a white LED. By switching which illuminations 44 are lit when imaging the inspection object 7, two-dimensional images with different illumination angles can be acquired.
  • A plurality of cameras 43 may also be arranged.
  • The 3D sensor 45 is a sensor that measures the three-dimensional shape of the inspection object 7.
  • Any sensor capable of measuring three-dimensional data can be used as the 3D sensor 45, such as a twin-lens or multi-lens stereo camera, an active stereo camera equipped with a light projecting unit such as a laser or a projector, or a device using the time-of-flight method.
  • The camera 43 and the 3D sensor 45 are examples of the measurement sensor in the present disclosure.
  • The discharge mechanism 5 is composed of a conveyor 51 for discharging the inspection object 7.
  • A plurality of conveyors 51 may be provided so that the inspection objects 7 can be sorted according to the pass/fail result of the inspection and the type of defect.
  • The control device 6 is a computer that controls the operations of the supply mechanism 2, the gripping mechanism 3, and the measurement mechanism 4.
  • Hereinafter, the hardware configuration (FIG. 4) and the functional configuration (FIG. 5) of the control device 6 will be described.
  • FIG. 4 is a block diagram showing a hardware configuration of the control device 6.
  • The control device 6 is realized by a general-purpose computer in which a control unit 61, a storage unit 62, a communication unit 63, an input unit 64, a monitor 65, a peripheral device I/F unit 66, a UPS 67, and the like are connected via a bus 69. However, the configuration is not limited to this, and various configurations can be adopted depending on the application and purpose.
  • The control unit 61 is composed of a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • The CPU loads programs stored in the storage unit 62, the ROM, a recording medium, or the like into a work memory area on the RAM, executes them, and drives and controls each device connected via the bus 69, thereby realizing the processing of the control device 6 described later.
  • The ROM is a non-volatile memory and permanently holds the computer's boot program, programs such as the BIOS, and data.
  • The RAM is a volatile memory that temporarily holds programs, data, and the like loaded from the storage unit 62, the ROM, a recording medium, or the like, and includes a work area used by the control unit 61 for various processes.
  • The storage unit 62 is an HDD (Hard Disk Drive) or the like, and stores the programs executed by the control unit 61, the data necessary for program execution, the OS (Operating System), and the like.
  • A control program corresponding to the OS and application programs for causing the computer to execute the processes described later are stored therein.
  • Each of these program codes is read by the control unit 61 as necessary, transferred to the RAM, and executed by the CPU to carry out the various processes.
  • The shape data 8 (for example, CAD data) of the inspection object 7 is also stored in the storage unit 62.
  • In addition, a ROS (Robot Operating System) environment and an optical system simulator are constructed in the storage unit 62; by inputting the shape data 8 of the inspection object 7 and the conditions of the optical system, the measurable area P (imaging range) within the shape data 8 can be calculated. The storage unit 62 also stores, in advance, the trained deep learner 72 used at the time of inspection.
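  • The disclosure does not detail the internals of the optical system simulator. As a rough, hypothetical illustration of the kind of check it performs, the following Python sketch classifies CAD facets as measurable from one camera pose when they lie inside the view cone, face the camera, and sit within an assumed working depth range (the function name, the 30-degree field of view, and the 0.1 to 0.5 m range are assumptions, and occlusion between facets is ignored):

```python
import numpy as np

def visible_facets(centers, normals, cam_pos, cam_dir,
                   fov_deg=30.0, near=0.1, far=0.5):
    """Rough facet-visibility test for one camera pose.

    A facet counts as measurable when it lies inside the camera's view
    cone, faces the camera, and sits within the usable depth range.
    Occlusion between facets is ignored in this sketch.
    """
    cam_dir = cam_dir / np.linalg.norm(cam_dir)
    to_facet = centers - cam_pos                    # (N, 3): camera -> facet
    dist = np.linalg.norm(to_facet, axis=1)
    view = to_facet / dist[:, None]                 # unit viewing directions

    in_cone = view @ cam_dir >= np.cos(np.radians(fov_deg / 2))
    facing = np.einsum("ij,ij->i", normals, -view) > 0.0
    in_range = (near <= dist) & (dist <= far)
    return in_cone & facing & in_range              # mask of the measurable area P
```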
  • The communication unit 63 has a communication control device, a communication port, and the like; it is a communication interface that mediates communication between the computer and the network, and controls communication with other computers via the network.
  • The network may be wired or wireless.
  • The input unit 64 is used to input data and has, for example, input devices such as a keyboard, a pointing device such as a mouse, and a numeric keypad. Operation instructions, motion instructions, data input, and the like can be given to the computer via the input unit 64.
  • The monitor 65 has a display device such as a liquid crystal panel and a logic circuit (such as a video adapter) that realizes the computer's video function in cooperation with the display device.
  • The input unit 64 and the monitor 65 may be integrated, as in a touch-panel display.
  • The peripheral device I/F (Interface) unit 66 is a port for connecting peripheral devices to the computer, and the computer transmits and receives data to and from peripheral devices via the peripheral device I/F unit 66.
  • The peripheral device I/F unit 66 is composed of USB (Universal Serial Bus), LAN, IEEE 1394, RS-232C, or the like, and usually has a plurality of peripheral device I/Fs.
  • The connection to peripheral devices may be wired or wireless.
  • The control device 6 is connected to the conveyor 21, the conveyor 51, the vision sensor 22, the gripping robot arm 31, the measuring robot arm 41, the 2D optical system 42, the 3D sensor 45, and the like via the peripheral device I/F unit 66.
  • The UPS 67 is an uninterruptible power supply that continues to supply power even when the power is cut off by a power failure or the like.
  • The bus 69 is a path that mediates the transfer of control signals, data signals, and the like between the devices.
  • The control device 6 may be configured as one computer, or may be configured so that a plurality of computers cooperate to execute the operation of the inspection device 1.
  • Here, as a simple configuration example, a case in which the control device 6 is configured as one computer will be described.
  • FIG. 5 is a block diagram showing a functional configuration of the control device 6. As shown in the figure, the control device 6 comprises the functions of a conveyor control unit 11, a work recognition unit 12, a robot control unit 13, an operation setting unit 14, a measurement unit 15, an inspection unit 16, and a data display unit 17.
  • The conveyor control unit 11 sends control commands to the drive units of the conveyors 21 and 51 to control their operation. Specifically, the conveyor control unit 11 operates the conveyor 21, on which uninspected objects 7 are placed, in the supply direction (X direction in FIG. 1) to supply an inspection object 7 to a position below the vision sensor 22. After the inspection is completed, the conveyor control unit 11 operates the conveyor 51, on which the inspected object 7 is placed, in the discharge direction (X direction in FIG. 1) to discharge the inspection object 7.
  • The work recognition unit 12 uses the vision sensor 22 to recognize the position and orientation of the inspection object 7 supplied below the vision sensor 22.
  • The operation of the gripping robot arm 31 is then controlled based on the recognized position and posture, and the inspection object 7 is gripped by the hand unit 312.
  • The robot control unit 13 sends control commands to the drive units of the joints of the gripping robot arm 31 and the measuring robot arm 41, and controls the operation of both arms.
  • Specifically, the robot control unit 13 moves the hand unit 312 of the gripping robot arm 31 to the picking position on the conveyor 21 based on the trajectory generated by the trajectory generation unit 143 (described later), and controls the hand unit 312 to grip a predetermined position of the inspection object 7. As a result, the inspection object 7 on the conveyor 21 is picked.
  • After gripping the inspection object 7 (after picking), the robot control unit 13 controls the gripping robot arm 31 and the measuring robot arm 41 along the trajectories generated by the trajectory generation unit 143 to position the inspection object 7 and the camera 43. After the inspection object 7 and the camera 43 have been positioned, the inspection object 7 is measured.
  • The robot control unit 13 also controls the arms so that the inspection object 7 is handed over between the hand unit 312a of the gripping robot arm 31a and the hand unit 312b of the gripping robot arm 31b as necessary.
  • Which of the gripping robot arms 31a and 31b grips the inspection object 7 is set in advance as a rule for each measurement position 10, and the control unit 61 performs the hand-over operation based on this rule.
  • After the inspection, the robot control unit 13 moves the hand unit 312 of the gripping robot arm 31 to the release position on the conveyor 51 based on the trajectory generated by the trajectory generation unit 143, and releases the grip of the hand unit 312 on the inspection object 7. As a result, the inspected object 7 is released onto the conveyor 51.
  • The operation setting unit 14 makes the settings necessary for motion control of the gripping robot arm 31 and the measuring robot arm 41.
  • As shown in FIG. 5, the operation setting unit 14 is composed of an inspection site setting unit 140, an inspection specification setting unit 141, a measurement position calculation unit 142 (a geometry generation unit 142a, a measurement candidate position setting unit 142b, a measurable area calculation unit 142c, and a measurement position selection unit 142d), and a trajectory generation unit 143.
  • The inspection site setting unit 140 reads the shape data 8 (for example, CAD data) of the inspection object 7 from the storage unit 62, and sets the inspection sites 80 to be inspected on the shape data 8.
  • The inspection specification setting unit 141 sets an inspection specification 9 for each inspection site 80 set by the inspection site setting unit 140.
  • The inspection specification 9 includes information on the defect type 91 and the defect specification 92.
  • The defect type 91 indicates the type of defect to be inspected, such as a "convex" defect, a "concave" defect, or a burr.
  • The defect specification 92 is a limit standard for each defect, and is used as a standard for determining whether each defect is harmful or harmless.
  • The measurement position calculation unit 142 calculates the measurement positions 10 for the inspection object 7 based on the shape data 8.
  • A measurement position 10 is the relative position of the camera 43 with respect to the shape data 8 (inspection object 7), for example, the position coordinates of the camera 43 with the center of the shape data 8 (inspection object 7) as the origin.
  • The measurement position calculation unit 142 calculates a plurality of measurement positions 10 around the shape data 8 so that all the inspection sites 80 set on the shape data 8 are measured (imaged) by the camera 43.
  • The trajectories of the gripping robot arm 31 and the measuring robot arm 41 are then generated automatically based on the measurement positions 10.
  • The measurement position calculation unit 142 includes a geometry generation unit 142a, a measurement candidate position setting unit 142b, a measurable area calculation unit 142c, and a measurement position selection unit 142d.
  • The geometry generation unit 142a generates a geometric shape 81 that encloses the shape data 8 and can define each direction of the shape data 8.
  • The measurement candidate position setting unit 142b sets one or more measurement candidate positions 101 in each direction defined by the geometric shape 81.
  • The measurable area calculation unit 142c uses the optical system simulator to calculate the measurable area P in the shape data 8 for each measurement candidate position 101 set by the measurement candidate position setting unit 142b.
  • The measurement position selection unit 142d selects the measurement positions 10 from the measurement candidate positions 101 based on the measurable area P of each measurement candidate position 101. For example, the measurement position selection unit 142d selects the measurement positions 10 so that all the inspection sites 80 (inspection target areas) are measured and the number of measurements is minimized.
  • The specific processing of the measurement position calculation unit 142 (geometry generation unit 142a, measurement candidate position setting unit 142b, measurable area calculation unit 142c, and measurement position selection unit 142d) will be described later (see FIGS. 11 to 15).
  • The trajectory generation unit 143 builds a CAD model of the inspection device 1 and its peripheral equipment on the ROS, and then generates interference-avoiding trajectories for the gripping robot arm 31 and the measuring robot arm 41 using a path planning method such as the PRM (Probabilistic Roadmap Planner) method or the RRT (Rapidly-exploring Random Tree) method.
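  • The disclosure names the PRM and RRT methods only as examples of path planning run against the ROS-based CAD model. A minimal joint-space RRT sketch, under stated assumptions (a user-supplied collision_free(q_a, q_b) edge check stands in for the CAD interference check, and no path smoothing is performed), could look like this:

```python
import numpy as np

def rrt_plan(q_start, q_goal, collision_free, joint_limits,
             step=0.1, goal_bias=0.1, max_iters=5000, rng=None):
    """Minimal joint-space RRT: grow a tree from q_start toward random
    samples (biased toward q_goal); return a joint path or None."""
    rng = rng or np.random.default_rng()
    lo, hi = joint_limits                      # arrays of per-joint limits
    q_goal = np.asarray(q_goal, float)
    nodes, parents = [np.asarray(q_start, float)], [-1]

    for _ in range(max_iters):
        target = q_goal if rng.random() < goal_bias else rng.uniform(lo, hi)
        i_near = min(range(len(nodes)),
                     key=lambda i: np.linalg.norm(nodes[i] - target))
        d = target - nodes[i_near]
        if np.linalg.norm(d) < 1e-9:
            continue
        q_new = nodes[i_near] + step * d / np.linalg.norm(d)
        if not collision_free(nodes[i_near], q_new):
            continue                           # edge would interfere: reject
        nodes.append(q_new)
        parents.append(i_near)
        if (np.linalg.norm(q_new - q_goal) < step
                and collision_free(q_new, q_goal)):
            path, i = [q_goal], len(nodes) - 1  # goal reached: backtrack
            while i >= 0:
                path.append(nodes[i])
                i = parents[i]
            return path[::-1]
    return None                                # no interference-free path found
```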
  • First, the trajectory generation unit 143 calculates the picking position of the inspection object 7 (the target position and target posture of the hand unit 312) based on the position and orientation of the inspection object 7 recognized by the work recognition unit 12, and generates the trajectory of the gripping robot arm 31 based on the picking position. By controlling the gripping robot arm 31 along this trajectory, the inspection object 7 on the conveyor 21 is gripped (picked) by the hand unit 312 of the gripping robot arm 31.
  • Next, the trajectory generation unit 143 generates the trajectories of the gripping robot arm 31 and the measuring robot arm 41 based on the measurement positions 10 calculated by the measurement position calculation unit 142.
  • That is, the trajectory generation unit 143 generates trajectories of the gripping robot arm 31 and the measuring robot arm 41 such that the relative position of the camera 43 with respect to the inspection object 7 becomes each measurement position 10 calculated by the measurement position calculation unit 142, and the inspection object 7 and the camera 43 are thereby positioned at each measurement point. As a result, the inspection object 7 can be measured and inspected without teaching by a programming pendant or the like, realizing a highly efficient inspection.
  • Finally, the trajectory generation unit 143 generates the trajectory of the gripping robot arm 31 based on the release position of the inspection object 7 (the target position and target posture of the hand unit 312). By controlling the gripping robot arm 31 along this trajectory, the inspected object 7 is released onto the conveyor 51.
  • After the inspection object 7 and the camera 43 have been positioned, the measurement unit 15 measures the inspection object 7. As shown in FIG. 5, the measurement unit 15 includes a first measurement unit 151 and a second measurement unit 152.
  • The first measurement unit 151 sends a measurement instruction to the 2D optical system 42 and measures the two-dimensional image data D1 of the inspection object 7.
  • The second measurement unit 152 sends a measurement instruction to the 3D sensor 45 and measures the three-dimensional shape data D2 of the inspection object 7 (shape measurement).
  • The inspection unit 16 inspects the inspection object 7 based on the measurement data measured by the measurement unit 15 (151, 152). As shown in FIG. 5, the inspection unit 16 is composed of a first inspection unit 161 and a second inspection unit 162.
  • The first inspection unit 161 extracts defect candidates 71 based on the two-dimensional image data D1 of the inspection object 7 measured by the first measurement unit 151.
  • Specifically, the first inspection unit 161 extracts the defect candidates 71 using a deep learner 72 trained by deep learning, a type of machine learning.
  • The deep learner 72 is a discriminator trained on image data with and without defects (learning data) prepared for each defect type, and is stored in the storage unit 62 in advance.
  • As the deep learning method, SegNet, ResNet, or a combination of SegNet and ResNet can be used, but the method is not limited to these.
  • The second inspection unit 162 performs a detailed shape inspection (dimension inspection) of the defect candidates 71 extracted by the first inspection unit 161, based on the three-dimensional shape data D2 of the inspection object 7 measured by the second measurement unit 152, and makes the final determination of whether each defect is harmful or harmless.
  • FIG. 6 is a diagram showing an outline of the inspection process.
  • First, defect candidates 71 are extracted from the two-dimensional image data D1 using the deep learner 72 (first inspection); when no defect candidate 71 is extracted, the inspection object is judged as "good" (no defects).
  • When a defect candidate 71 is extracted, a shape inspection (dimension inspection) of the defect candidate 71 is further performed using the three-dimensional shape data D2 (second inspection), and the final pass/fail judgment is made by collating the result against the defect specification 92 of the defect candidate 71.
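  • A minimal sketch of this two-stage decision flow follows; the callables extract_candidates and measure_3d and the spec limits are assumptions standing in for the deep learner 72 and the 3D-sensor processing:

```python
def two_stage_inspect(image_2d, extract_candidates, measure_3d, spec):
    """FIG. 6 flow: 2-D screening first, 3-D dimension check second."""
    candidates = extract_candidates(image_2d)   # first inspection (2-D)
    if not candidates:
        return "good"                           # no candidates: pass
    for region in candidates:                   # second inspection (3-D)
        dia_mm, height_mm = measure_3d(region)
        if dia_mm >= spec["dia_mm"] and height_mm >= spec["height_mm"]:
            return "no_good"                    # harmful defect confirmed
    return "good"                               # all candidates harmless
```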
  • FIG. 7 is a flowchart showing a flow of processing for setting the inspection site 80 and the inspection specification 9.
  • First, the control unit 61 of the control device 6 reads the shape data 8 of the inspection object 7 from the storage unit 62 and displays it on the monitor 65 (step S1).
  • FIG. 8 shows an example of the shape data 8 of the inspection object 7 (a pipe joint).
  • Next, the control unit 61 sets the inspection sites 80 to be inspected on the shape data 8 displayed on the monitor 65 (step S2).
  • This setting is performed by a user operation via the input unit 64.
  • In the example of FIG. 8, the user sets, on the shape data 8 (CAD data of the pipe joint), the inspection site 80a (outer surface of the joint body) and the inspection site 80b (outer surface of the joint end) as the inspection sites 80.
  • Next, the control unit 61 accepts the setting of an inspection specification 9 (defect type 91, defect specification 92) for each inspection site 80 set in step S2 (step S3).
  • For example, as the inspection specifications 9a (defect type 91, defect specification 92) of the inspection site 80a (outer surface of the joint body), the inspection specification 9a-1 ("convex", "φ3.0 mm, height 2.0 mm") and the inspection specification 9a-2 ("concave", "φ3.0 mm, depth 1.0 mm") are set.
  • As the inspection specifications 9b (defect type 91, defect specification 92) of the inspection site 80b (outer surface of the joint end), the inspection specification 9b-1 ("convex", "φ3.0 mm, height 2.0 mm"), the inspection specification 9b-2 ("concave", "φ3.0 mm, depth 1.0 mm"), and the inspection specification 9b-3 ("burrs", "0.5H × 2L (mm)") are set.
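  • The inspection sites 80 and inspection specifications 9 set in steps S2 and S3 could be represented, for example, by a simple data structure such as the following sketch (the class names and the string encoding of the limit standards are assumptions; the values are those of the example above):

```python
from dataclasses import dataclass

@dataclass
class DefectSpec:
    defect_type: str   # defect type 91: "convex", "concave", "burr", ...
    limit: str         # defect specification 92 (limit standard)

@dataclass
class InspectionSite:
    name: str          # inspection site 80 on the shape data 8
    specs: list        # one DefectSpec per defect type to inspect

sites = [
    InspectionSite("80a: outer surface of joint body", [
        DefectSpec("convex",  "phi 3.0 mm, height 2.0 mm"),   # 9a-1
        DefectSpec("concave", "phi 3.0 mm, depth 1.0 mm"),    # 9a-2
    ]),
    InspectionSite("80b: outer surface of joint end", [
        DefectSpec("convex",  "phi 3.0 mm, height 2.0 mm"),   # 9b-1
        DefectSpec("concave", "phi 3.0 mm, depth 1.0 mm"),    # 9b-2
        DefectSpec("burr",    "0.5H x 2L (mm)"),              # 9b-3
    ]),
]
```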
  • In the subsequent processing, the control device 6 selects (calculates) the measurement positions 10 so that all the inspection sites 80 set in step S2 are measured. Further, the control device 6 judges the quality of the inspection object 7 based on the inspection specification 9 (defect type 91, defect specification 92) set for each inspection site 80 in step S3.
  • FIG. 11 is a flowchart showing a flow of processing for calculating the measurement position 10.
  • First, the control unit 61 (geometry generation unit 142a) of the control device 6 generates a geometric shape 81 that encloses the shape data 8 of the inspection object 7 and can define each direction of the shape data 8 (step S11).
  • For example, a polyhedron 81 enclosing the shape data 8 as shown in FIG. 12, or a spherical grid 81 enclosing the shape data 8 as shown in FIG. 13, is generated.
  • In the case of the polyhedron, each direction of the shape data 8 is defined by the direction of each surface of the polyhedron (the normal direction of the surface).
  • In the case of the spherical grid, each direction of the shape data 8 is defined by the direction of each intersection of the grid lines.
  • In the following, the control unit 61 (geometry generation unit 142a) will be described as generating the polyhedron 81 shown in FIG. 12.
  • The polyhedron 81 is, for example, a regular dodecahedron or a regular icosahedron, but is not limited thereto.
  • Next, the control unit 61 (measurement candidate position setting unit 142b) sets one or more measurement candidate positions 101 in each direction defined by the geometric shape 81 (step S12).
  • Specifically, the control unit 61 sets one or more measurement candidate positions 101 (position coordinates of the camera 43) on the surface normal 84 passing through the center 83 of each surface 82 of the polyhedron 81 (geometric shape 81).
  • The imaging direction of the camera 43 placed at each measurement candidate position 101 is perpendicular to the surface 82; that is, the imaging center of the camera 43 is the center 83 of the surface 82.
  • FIG. 14 shows an example in which measurement candidate positions 101 are set on the surface normal 84 passing through the center 83 of a surface 82 of the polyhedron 81 (a regular icosahedron).
  • In this example, three measurement candidate positions 101 are set on the surface normal 84.
  • The measurement candidate positions 101 are set appropriately based on optical conditions such as the field-of-view range of the camera 43, the depth of field, and the working distance.
  • The control unit 61 (measurement candidate position setting unit 142b) sets measurement candidate positions 101 for all the surfaces 82 of the polyhedron 81. For example, when three measurement candidate positions 101 are set for each surface 82 of the regular icosahedron of FIG. 14, a total of 60 measurement candidate positions 101 are set.
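  • As a sketch of step S12 under stated assumptions (a regular icosahedron built from the golden-ratio vertex coordinates, three assumed working distances, and coordinates relative to the center of the shape data 8 at the origin), the candidate camera poses on the surface normals 84 could be computed as follows:

```python
import numpy as np
from scipy.spatial import ConvexHull

def candidate_positions(distances=(0.20, 0.25, 0.30)):
    """Measurement candidate positions 101 on the surface normals 84 of a
    regular icosahedron: one candidate per face per working distance
    (20 faces x 3 distances = 60 candidates, as in the example above)."""
    phi = (1 + 5 ** 0.5) / 2
    verts = np.array(
        [[0, s1, s2 * phi] for s1 in (-1, 1) for s2 in (-1, 1)] +
        [[s1, s2 * phi, 0] for s1 in (-1, 1) for s2 in (-1, 1)] +
        [[s1 * phi, 0, s2] for s1 in (-1, 1) for s2 in (-1, 1)], float)
    hull = ConvexHull(verts)                      # 20 triangular surfaces 82
    poses = []
    for tri in hull.simplices:
        center = verts[tri].mean(axis=0)          # surface center 83
        normal = center / np.linalg.norm(center)  # outward surface normal 84
        for d in distances:                       # positions along the normal
            poses.append((center + d * normal,    # camera position 101
                          -normal))               # imaging direction (toward 83)
    return poses

assert len(candidate_positions()) == 60
```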
  • Next, the control unit 61 (measurable area calculation unit 142c) calculates, using the optical system simulator, the measurable area P in the shape data 8 for each measurement candidate position 101 set in step S12 (step S13).
  • Next, based on the measurable areas P calculated in step S13, the control unit 61 (measurement position selection unit 142d) selects from the measurement candidate positions 101 the measurement positions 10 such that all the inspection sites 80 (inspection target areas) are measured with the minimum number of measurements (step S14).
  • Specifically, the control unit 61 (measurement position selection unit 142d) selects measurement candidate positions 101 one by one in random order from all the measurement candidate positions 101 and integrates the measurable areas P of the selected positions.
  • The region into which the measurable areas P have been merged is called the integrated region R.
  • The control unit 61 stops selecting measurement candidate positions 101 once the integrated region R covers (includes) all the inspection sites 80 (inspection target areas), and adopts the measurement candidate positions 101 selected so far as the measurement positions 10.
  • In the example of FIG. 15, the integrated region R is generated sequentially as R1, R2, and R3: the integrated region R1 is equal to the measurable area P1 of the measurement candidate position 101a; the integrated region R2 is R1 merged with the measurable area P2 of the measurement candidate position 101b; and the integrated region R3 is R2 merged with the measurable area P3 of the measurement candidate position 101c.
  • In this way, selection of measurement candidate positions 101 and integration of measurable areas P are repeated, and the selection ends once the integrated region R covers (includes) all the inspection sites 80 (inspection target areas); the measurement candidate positions 101 selected up to that point become the measurement positions 10. If the entire measurable area P of a newly selected measurement candidate position 101 is already included in the integrated region R generated so far, another measurement candidate position 101 is reselected instead. That is, the measurement positions 10 are selected so that already-measured regions are not measured redundantly.
  • If the inspection sites 80 cannot all be covered by the set measurement candidate positions 101, the control unit 61 regenerates the polyhedron 81 with an increased number of surfaces (that is, with higher directional resolution) and executes the processes of steps S11 to S14 again.
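  • A minimal sketch of this selection loop (steps S13 and S14) follows, treating each measurable area P as a set of inspection-site patch identifiers; the set representation is an assumption, since the text computes P with the optical system simulator:

```python
import random

def select_measurement_positions(candidates, inspection_target, rng=random):
    """Greedy selection of measurement positions 10.

    candidates: dict mapping candidate id -> set of patches measurable
    from that position (its measurable area P).
    inspection_target: set of all patches of the inspection sites 80.
    Returns the chosen ids, or None if coverage fails (the text then
    regenerates the polyhedron 81 with more surfaces and retries).
    """
    order = list(candidates)
    rng.shuffle(order)                   # random selection order
    covered, chosen = set(), []          # integrated region R, positions 10
    for cid in order:
        gain = candidates[cid] - covered
        if not gain:
            continue                     # fully redundant: reselect another
        chosen.append(cid)
        covered |= gain                  # integrate measurable area P into R
        if inspection_target <= covered:
            return chosen                # R covers all inspection sites 80
    return None                          # coverage failed: refine polyhedron
```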
  • FIG. 16 shows a measurement inspection table 100 that holds various inspection information and the like in association with the measurement position 10 selected in step S14.
  • In the measurement inspection table 100, the inspection site 80, the defect type 91, the defect specification 92, the lighting condition 93, the first measurement data 94, the second measurement data 95, and the inspection result 96 are held in association with each measurement position 10 (measurement positions 10-1, 10-2, ...), arranged in measurement order.
  • The inspection site 80 in each row is the inspection site, among the inspection sites 80 set in step S2 of FIG. 7, that is included in the measurable area P of the corresponding measurement position 10 (that is, the inspection site that can be measured and inspected at that measurement position 10).
  • The defect type 91 and the defect specification 92 are those associated with the inspection site 80 in step S3 of FIG. 7.
  • The lighting condition 93 specifies which illuminations 44 are to be turned on at the time of measurement, and is predetermined according to, for example, the defect type 91 ("convex", "concave", "burr").
  • In the first measurement data 94 and the second measurement data 95, the storage destination of the measurement data is recorded after the measurement is executed.
  • Information on the inspection result is recorded in the inspection result 96 after the inspection is executed.
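  • In code, the rows of the measurement inspection table 100 could be held, for example, as plain dictionaries like the following sketch (the field names are assumptions; the values follow the FIG. 16 example used later in the text):

```python
measurement_inspection_table = [
    {   # one row per measurement position 10, in measurement order
        "measurement_position": "10-1",
        "inspection_site": "site A",                       # site 80
        "defect_types": ["convex", "concave"],             # type 91
        "defect_specs": {"convex":  "phi 3.0 mm, height 2.0 mm",
                         "concave": "phi 3.0 mm, depth 1.0 mm"},
        "lighting_conditions": {"convex": "condition 1",   # condition 93
                                "concave": "condition 2"},
        "first_measurement_data": None,    # storage destination, after S41
        "second_measurement_data": None,   # storage destination, after S44
        "inspection_result": None,         # filled in at S46
    },
    # rows for measurement positions 10-2, 10-3, ... follow
]
```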
  • FIG. 17 shows an outline of the operation of the inspection device 1.
  • First, the inspection device 1 controls the trajectory of the gripping robot arm 31 to grip (pick) the inspection object 7 on the conveyor 21.
  • Next, the inspection device 1 controls the trajectories of the gripping robot arm 31 and the measuring robot arm 41 to move the inspection object 7 to the measurement point.
  • The inspection device 1 performs the measurement a plurality of times while changing the positional relationship between the inspection object 7 and the camera 43.
  • During this process, the inspection device 1 hands the inspection object 7 over between the gripping robot arms 31a and 31b as necessary.
  • Finally, the inspection device 1 controls the trajectory of the gripping robot arm 31 and releases the inspection object 7 onto the conveyor 51.
  • The details of the operation of the inspection device 1 (control device 6) will be described below with reference to the flowchart of FIG. 18.
  • First, the control unit 61 (conveyor control unit 11) of the control device 6 controls the operation of the conveyor 21 on which uninspected objects 7 are placed, and supplies an inspection object 7 to a position below the vision sensor 22 (step S21).
  • Next, the control unit 61 (work recognition unit 12) uses the vision sensor 22 to recognize the position and orientation of the inspection object 7 supplied in step S21 (step S22).
  • Next, the control unit 61 (robot control unit 13) moves the gripping robot arm 31 (31a or 31b) based on the position and posture of the inspection object 7 recognized in step S22, and controls the hand unit 312 to grip the inspection object 7 (step S23).
  • Specifically, the control unit 61 (trajectory generation unit 143) calculates a target position and target posture of the hand unit 312 at which the inspection object 7 can be gripped, based on the position and posture of the inspection object 7 recognized in step S22, and generates the trajectory of the gripping robot arm 31 based on the calculated target position and target posture of the hand unit 312.
  • The control unit 61 (robot control unit 13) then controls the gripping robot arm 31 along the generated trajectory so that the hand unit 312 grips a predetermined position of the inspection object 7. As a result, the inspection object 7 is picked by the gripping robot arm 31.
  • Next, the control unit 61 (robot control unit 13) controls the trajectories of the gripping robot arm 31 and the measuring robot arm 41, and positions the inspection object 7 and the camera 43 at the measurement point (step S24).
  • Specifically, the control unit 61 (trajectory generation unit 143) generates trajectories of the gripping robot arm 31 and the measuring robot arm 41 such that the relative position of the camera 43 with respect to the inspection object 7 becomes the measurement position 10 in the measurement inspection table 100 (see FIG. 16) (measurement position 10-1 for the first measurement).
  • There are generally multiple trajectories of the gripping robot arm 31 and the measuring robot arm 41 that make the relative position of the camera 43 with respect to the inspection object 7 equal to the measurement position 10; among them, a trajectory satisfying a predetermined condition is uniquely determined.
  • The trajectory satisfying the predetermined condition is, for example, the trajectory that minimizes the average movement amount of the gripping robot arm 31 and the measuring robot arm 41, or the trajectory that minimizes their average movement time.
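  • As a small sketch of resolving that ambiguity, the following uses total joint movement, summed along a trajectory given as a sequence of joint vectors, as a stand-in for the average-movement criterion in the text:

```python
import numpy as np

def pick_trajectory(candidate_trajectories):
    """Among candidate trajectories realizing the same measurement
    position 10, pick the one with the smallest total joint movement."""
    def total_movement(traj):
        q = np.asarray(traj, float)          # shape: (waypoints, joints)
        return float(np.linalg.norm(np.diff(q, axis=0), axis=1).sum())
    return min(candidate_trajectories, key=total_movement)
```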
  • The control unit 61 (robot control unit 13) then controls the gripping robot arm 31 and the measuring robot arm 41 along the generated trajectories, and positions the inspection object 7 and the camera 43.
  • Next, the control unit 61 (measurement unit 15, inspection unit 16) executes the measurement and inspection of the inspection object 7 (step S25).
  • First, the control unit 61 (first measurement unit 151) sends a measurement instruction to the 2D optical system 42 and measures the two-dimensional image data D1 of the inspection object 7 (step S41). Specifically, the control unit 61 refers to the measurement inspection table 100 and measures, for each defect type 91, the inspection site 80 that can be inspected at the corresponding measurement position 10. At this time, the control unit 61 may switch the lighting condition 93 according to each defect type 91.
  • In the example of FIG. 16, the inspection site 80 at the measurement position 10-1 is "site A", and the defect types 91 to be inspected are "convex" and "concave" defects.
  • The lighting condition 93 suitable for inspecting "convex" defects is "condition 1", and the lighting condition 93 suitable for inspecting "concave" defects is "condition 2". Therefore, when measuring "convex" defects at "site A", the control unit 61 (first measurement unit 151) turns on the illuminations 44 according to "condition 1" and then performs the measurement with the camera 43; when measuring "concave" defects at "site A", it turns on the illuminations 44 according to "condition 2" and then performs the measurement with the camera 43.
  • In each measurement, the control unit 61 may perform the measurement with a predetermined set of illuminations 44 turned on (for example, all the illuminations 44), or may perform the measurement while switching the illuminations 44 on one by one. In the latter case, as many two-dimensional images as there are illuminations can be acquired, each with a different illumination angle.
  • The control unit 61 (first measurement unit 151) stores the measured two-dimensional image data D1 (measurement data) of the inspection object 7 in the storage unit 62, and records its storage destination in the first measurement data 94 of the measurement inspection table 100.
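  • Using the hypothetical table-row format sketched earlier, step S41 could be expressed as the following loop (camera and set_lighting are assumed device wrappers, not names from the disclosure):

```python
def measure_2d(row, camera, set_lighting):
    """First measurement (step S41) at one measurement position 10:
    switch to the lighting condition 93 registered for each defect
    type 91, then capture a two-dimensional image."""
    images = {}
    for defect_type in row["defect_types"]:
        set_lighting(row["lighting_conditions"][defect_type])
        images[defect_type] = camera()         # 2-D image data D1
    row["first_measurement_data"] = images     # record for the table 100
    return images
```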
  • Next, the control unit 61 (first inspection unit 161) inputs the two-dimensional image data D1 of the inspection object 7 measured in step S41 into the deep learner 72, and extracts defect candidates 71 from the two-dimensional image data D1 (step S42).
  • FIG. 20 shows an extraction example of defect candidate 71.
  • FIG. 20A is an image of the inspection object 7 taken by the camera 43, and FIG. 20B is an image in which the degree of defect obtained by deep learning is drawn as a contour map superimposed on the image of FIG. 20A. As shown in FIG. 20B, for example, portions where the degree of defect exceeds a predetermined threshold value are extracted as defect candidates 71 (71a, 71b, 71c).
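  • A minimal sketch of this thresholding step, assuming the deep learner outputs a per-pixel defect-degree map and using connected-component labeling to separate the candidates (the 0.5 threshold is an assumed value):

```python
import numpy as np
from scipy import ndimage

def extract_defect_candidates(defect_map, threshold=0.5):
    """Extract defect candidates 71 from a per-pixel defect-degree map
    (the contour map of FIG. 20B): threshold, then group exceeding
    pixels into connected regions 71a, 71b, ..."""
    mask = defect_map > threshold
    labels, n = ndimage.label(mask)
    return [np.argwhere(labels == k) for k in range(1, n + 1)]
```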
  • If no defect candidate 71 is extracted in step S42 (step S43; No), the control unit 61 (first inspection unit 161) judges the inspection object 7 as "good" (no defects) and records the result in the inspection result column 96 of the measurement inspection table 100 (step S46).
  • If a defect candidate 71 is extracted in step S42 (step S43; Yes), the control unit 61 (second measurement unit 152) sends a measurement instruction to the 3D sensor 45 and measures the three-dimensional shape data D2 of the inspection object 7 (step S44).
  • At this time, the control unit 61 may control the gripping robot arm 31 or the measuring robot arm 41 to adjust the measurement position of the 3D sensor 45 with respect to the inspection object 7.
  • Next, the control unit 61 (second inspection unit 162) performs a shape inspection (dimension inspection) of the defect candidates 71 extracted in step S42, based on the three-dimensional shape data D2 of the inspection object 7 measured in step S44, and judges the quality.
  • For example, the control unit 61 recognizes the shape of a defect candidate 71 (a "convex" defect) from the three-dimensional shape data D2 and performs the shape inspection by collating it with the defect specification 92 "φ3.0 mm, height 2.0 mm" in the measurement inspection table 100: when it recognizes that the φ (diameter) of the defect candidate 71 is 3.0 mm or more and its height is 2.0 mm or more, it judges "No Good" (a harmful "convex" defect exists); otherwise, it judges "Good" (no harmful "convex" defect).
  • The control unit 61 (second inspection unit 162) records the inspection result in the inspection result column 96 of the measurement inspection table 100 (step S46).
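  • The collation against the defect specification 92 reduces to a numeric comparison; for the "convex" example above, a sketch could read as follows (the tuple encoding of the specification is an assumption):

```python
def judge_convex_defect(diameter_mm, height_mm, spec=(3.0, 2.0)):
    """Dimension inspection of one "convex" defect candidate 71 against
    defect specification 92 ("phi 3.0 mm, height 2.0 mm"): the defect
    is harmful only when both limits are reached."""
    dia_limit, height_limit = spec
    if diameter_mm >= dia_limit and height_mm >= height_limit:
        return "no_good"       # harmful "convex" defect present
    return "good"              # harmless at this inspection site
```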
  • If not all measurements and inspections have been completed (step S26; No), the control unit 61 determines whether the inspection object 7 needs to be handed over between the gripping robot arms 31a and 31b in order to perform the next measurement and inspection (step S27).
  • As described above, the rule specifying which gripping robot arm 31a or 31b grips the inspection object 7 is set in advance, and the control unit 61 determines based on this rule whether a hand-over is necessary.
  • When no hand-over is necessary (step S27; No), the control unit 61 returns to step S24 and moves the gripping robot arm 31 and the measuring robot arm 41 to the next measurement point.
  • When a hand-over is necessary (step S27; Yes), the control unit 61 (robot control unit 13) controls the arms so that the inspection object 7 is handed over between the hand unit 312a of the gripping robot arm 31a and the hand unit 312b of the gripping robot arm 31b (step S28).
  • Specifically, the control unit 61 first controls the trajectories of the gripping robot arms 31a and 31b so that the hand unit 312a and the hand unit 312b take predetermined hand-over positions and postures, thereby positioning the hand units 312a and 312b.
  • Next, the control unit 61 causes the hand unit 312b to grip a predetermined position of the inspection object 7, and after the hand unit 312b has gripped it, releases the grip of the hand unit 312a. As a result, the inspection object 7 is handed over between the hand units 312a and 312b.
  • Returning to step S24, the control unit 61 (robot control unit 13) moves the gripping robot arm 31 now holding the object and the measuring robot arm 41 to the next measurement point, and positions the inspection object 7 and the camera 43.
  • Steps S24 to S28 described above are repeated until all measurements and inspections are completed (step S26; Yes).
  • When all inspections are completed, the control unit 61 (robot control unit 13) moves the gripping robot arm 31 holding the inspection object 7 to the release position on the conveyor 51, and controls the hand unit 312 to release its grip on the inspection object 7 (step S29).
  • The control unit 61 (conveyor control unit 11) then controls the operation of the conveyor 51 on which the inspected object 7 is placed, and discharges the inspection object 7.
  • After the inspection, the control unit 61 displays information on the inspection result of the inspection object 7 on the monitor 65. For example, as shown in FIG. 21, the control unit 61 displays a captured image in which the defective portion is clearly indicated. Further, the control unit 61 may display a daily transition graph of the defective-product occurrence rate as shown in FIG. 22, or correlation analysis results showing the correlation between the defect occurrence rate predicted from manufacturing process parameters and the actual defect occurrence rate as shown in FIG. 23.
  • As described above, the inspection device 1 of the present embodiment includes the gripping robot arm 31 for gripping the inspection object 7, the measuring robot arm 41 provided with the camera 43, and the control device 6.
  • The control device 6 calculates the measurement positions 10 of the camera 43 with respect to the inspection object 7 based on the shape data 8 of the inspection object 7, and generates the trajectories of the gripping robot arm 31 and the measuring robot arm 41 based on the measurement positions 10. The control device 6 then controls the operations of the gripping robot arm 31 and the measuring robot arm 41 based on the generated trajectories, and positions the inspection object 7 and the camera 43 at each measurement point.
  • The measurement positions 10 are calculated so that all the inspection sites 80 are measured. Specifically, a polyhedron 81 enclosing the shape data 8 is generated, one or more measurement candidate positions 101 are set on the surface normal 84 passing through the center 83 of each surface 82 of the polyhedron 81, and, based on the measurable area P of each measurement candidate position 101, the measurement positions 10 are selected such that all the inspection sites 80 (inspection target areas) are measured with the minimum number of measurements. As a result, the optimum measurement positions 10 for measuring all the inspection sites 80 are determined automatically.
  • In addition, each inspection site 80 of the shape data 8 is associated with an inspection specification 9, such as the defect type 91 and the defect specification 92 to be inspected.
  • Further, the measurement and inspection of the inspection object 7 are executed in two stages. That is, defect candidates 71 are extracted from the two-dimensional image data D1 using the deep learner 72 (first inspection), and if no defect candidate 71 is extracted, the inspection object is judged as "good" (no defects). On the other hand, when defect candidates 71 are extracted, a shape inspection (dimension inspection) of the defect candidates 71 is further performed based on the three-dimensional shape data D2 (second inspection), and the final quality judgment is made.
  • Since the shape inspection (dimension inspection) is performed based on numerical data such as the dimensions, depth, and height of the defect candidates, the technical basis for the inspection results is clear. It would also be possible to perform the inspection by shape inspection alone, but since three-dimensional shape measurement by the 3D sensor 45 takes a long time, the present embodiment adopts a two-stage measurement and inspection method: a comprehensive first measurement and inspection based on two-dimensional images, followed by a detailed shape inspection (dimension inspection) based on the three-dimensional shape in the second measurement and inspection.
  • As a modification, the 2D optical system 42 and the 3D sensor 45 may be fixed to an inspection table 19 or the like of an inspection device 1A.
  • In this case, trajectory control of the measuring robot arm 41 becomes unnecessary. That is, in step S23 of FIG. 18, the control unit 61 (trajectory generation unit 143) of the control device 6 generates a trajectory of the gripping robot arm 31 such that the position of the camera 43 relative to the inspection object 7 becomes the measurement position 10. The control unit 61 (robot control unit 13) then controls the gripping robot arm 31 along the generated trajectory and positions the inspection object 7.
  • As another modification, a 3D scanner and a tracking marker may be mounted on the tip of the measuring robot arm 41, and the control unit 61 may measure the three-dimensional shape of the inspection object 7 while tracking the movement of the tracking marker with a tracking system.
  • Reference signs: 1: Inspection device, 2: Supply mechanism, 3: Gripping mechanism, 4: Measurement mechanism, 5: Discharge mechanism, 6: Control device, 7: Inspection object, 8: Shape data, 9: Inspection specification, 10: Measurement position, 11: Conveyor control unit, 12: Work recognition unit, 13: Robot control unit, 14: Operation setting unit, 15: Measurement unit, 16: Inspection unit, 17: Data display unit, 21: Conveyor, 31, 31a, 31b: Gripping robot arm, 41: Measuring robot arm, 42: 2D optical system, 43: Camera, 44: Lighting, 45: 3D sensor, 51: Conveyor, 71: Defect candidate, 72: Deep learner, 80, 80a, 80b: Inspection site, 81: Geometric shape (polyhedron, spherical grid), 82: Surface, 83: Center, 84: Surface normal, 91: Defect type, 92: Defect specification, 93: Lighting condition, 94: First measurement data, 95: Second measurement data, 96: Inspection result, 100: Measurement inspection table, 101, 101a: Measurement candidate position

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

This inspection device 1 comprises: a robot arm 31 for gripping that grips an inspection target object 7; a camera 43; a measurement position calculation unit 142 that calculates a measurement position 10 for the camera 43 relative to the inspection target object 7, on the basis of shape data 8 for the inspection target object 7; a trajectory generation unit 143 that generates a trajectory for the robot arm 31 for gripping, on the basis of the measurement position 10; a robot control unit 13 that controls the action of the robot arm 31 for gripping, on the basis of the trajectory, and positions the inspection target object 7; a measurement unit 15 that, after positioning, uses the camera 43 and measures the inspection target object 7; and an inspection unit 16 that inspects the inspection target object 7 on the basis of the measured data.

Description

Inspection device, inspection method, positioning method, and program

The present disclosure relates to an inspection device, an inspection method, a positioning method, and a program.

Conventionally, in product manufacturing lines and the like, automation of appearance inspection for scratches, chips, and other defects has been promoted using inspection devices that combine a sensor such as a camera with a robot arm.

For example, Patent Document 1 discloses an inspection device comprising a robot arm provided with a plurality of cameras; by judging the quality of the inspection object based on the image data captured by the cameras, products can be inspected with consistently high quality.

Patent Document 2 discloses a visual inspection system comprising an inspection device composed of a robot arm equipped with a sensor unit, an inspection jig composed of a robot arm equipped with a holder for holding the inspection object, and a control unit, together with a visual inspection method that uses this system to inspect the appearance of the inspection object. In particular, it proposes a visual inspection method that combines automatic extraction of appearance abnormalities by a computer with pass/fail judgment of the abnormal portions by an operator.
Patent Document 1: International Publication No. 2015/011782
Patent Document 2: JP-A-2018-59830
In Patent Documents 1 and 2, however, the motion of the robot arm must be taught for each shape of the inspection object and each inspection item using a programming pendant or the like, so preparing for an inspection requires many man-hours and is inefficient.
The present disclosure has been made in view of the above problems, and an object thereof is to provide an inspection device and the like that require no robot arm teaching work.
According to one embodiment of the present disclosure, there is provided an inspection device comprising: a gripping robot arm that grips an inspection object; a measurement sensor; a measurement position calculation unit that calculates a measurement position of the measurement sensor relative to the inspection object on the basis of shape data of the inspection object; a trajectory generation unit that generates a trajectory of the gripping robot arm on the basis of the measurement position; a robot control unit that controls the operation of the gripping robot arm on the basis of the trajectory to position the inspection object; a measurement unit that measures the inspection object with the measurement sensor after positioning; and an inspection unit that inspects the inspection object on the basis of the measured data.
According to another embodiment of the present disclosure, there is provided an inspection method in which a computer executes: a step of calculating a measurement position of a measurement sensor relative to an inspection object on the basis of shape data of the inspection object; a step of generating a trajectory of a gripping robot arm that grips the inspection object on the basis of the measurement position; a step of controlling the operation of the gripping robot arm on the basis of the trajectory to position the inspection object; a step of measuring the inspection object with the measurement sensor after positioning; and a step of inspecting the inspection object on the basis of the measured data.
According to another embodiment of the present disclosure, there is provided a program that causes a computer to function as: a measurement position calculation unit that calculates a measurement position of a measurement sensor relative to an inspection object on the basis of shape data of the inspection object; a trajectory generation unit that generates a trajectory of a gripping robot arm that grips the inspection object on the basis of the measurement position; a robot control unit that controls the operation of the gripping robot arm on the basis of the trajectory to position the inspection object; a measurement unit that measures the inspection object with the measurement sensor after positioning; and an inspection unit that inspects the inspection object on the basis of the measured data.
According to another embodiment of the present disclosure, there is provided a positioning method in which a computer executes: a step of calculating a measurement position relative to an inspection object on the basis of shape data of the inspection object; a step of generating a trajectory of a gripping robot arm that grips the inspection object on the basis of the measurement position; and a step of controlling the operation of the gripping robot arm on the basis of the trajectory to position the inspection object.
According to the present disclosure, an inspection device and the like that require no robot arm teaching work are provided.
FIG. 1: Diagram showing an example of the overall configuration of the inspection device 1
FIG. 2: Diagram showing an example of the 2D optical system 42
FIG. 3: Diagram showing an example of the 2D optical system 42
FIG. 4: Block diagram showing an example of the hardware configuration of the control device 6
FIG. 5: Block diagram showing an example of the functional configuration of the control device 6
FIG. 6: Diagram showing the outline of the inspection process
FIG. 7: Flowchart showing the flow of the process of setting the inspection sites 80 and the inspection specifications 9
FIG. 8: Diagram showing an example of the shape data 8
FIG. 9: Diagram showing an example of setting the inspection sites 80
FIG. 10: Diagram showing an example of setting the inspection specifications 9
FIG. 11: Flowchart showing the flow of the process of calculating the measurement positions 10
FIG. 12: Diagram showing an example of the geometric shape 81 (polyhedron)
FIG. 13: Diagram showing an example of the geometric shape 81 (spherical grid)
FIG. 14: Diagram showing an example of setting the measurement candidate positions 101
FIG. 15: Diagram showing how the measurable regions P of the measurement candidate positions 101 are integrated
FIG. 16: Diagram showing an example of the measurement inspection table 100
FIG. 17: Diagram showing the overall flow of the inspection
FIG. 18: Flowchart showing the flow of the inspection operation executed by the inspection device 1
FIG. 19: Flowchart showing the flow of the measurement and inspection processing
FIG. 20: Diagram showing an example of extraction of defect candidates 71
FIG. 21: Diagram showing a display example of defect locations
FIG. 22: Diagram showing a transition graph of the defective product rate
FIG. 23: Diagram showing a display example of a correlation analysis result between the defect rate predicted from manufacturing process parameters and the actual defect rate
FIG. 24: Diagram showing an example of the overall configuration of the inspection device 1A
Hereinafter, an embodiment of the present disclosure (hereinafter referred to as "the present embodiment") will be described in detail with reference to the drawings.
(1. Configuration of the inspection device 1)
FIG. 1 is a diagram showing the overall configuration of the inspection device 1 of the present embodiment. The inspection device 1 is a device that performs an appearance inspection of an inspection object 7 and is mainly composed of a supply mechanism 2, a gripping mechanism 3, a measurement mechanism 4, a discharge mechanism 5, and a control device 6.
The supply mechanism 2 includes a conveyor 21 for supplying the inspection object 7 and a vision sensor 22 for recognizing the position and orientation of the inspection object 7. The vision sensor 22 is arranged above the conveyor 21 and recognizes the position and orientation of the inspection object 7 on the conveyor 21. On the basis of the recognized position and orientation, the inspection object 7 is gripped by the gripping robot arm 31a or 31b.
The supply mechanism 2 is not limited to the illustrated example. For example, it may be composed of a container (not shown) in which inspection objects 7 are piled up and a vision sensor 22 arranged above the container. In this case as well, the position and orientation of the inspection object 7 are recognized by the vision sensor 22, and the inspection object 7 is gripped by the gripping robot arm 31a or 31b.
The supply mechanism 2 may also be composed of a tray (not shown) on which the inspection objects 7 are aligned in a fixed orientation. In this case, the position and orientation of each inspection object 7 are known in advance, so the vision sensor 22 is unnecessary; the inspection object 7 is gripped by the gripping robot arm 31a or 31b without it.
The supply mechanism 2 may also be composed of a parts feeder (not shown). In this case as well, the inspection objects 7 are aligned in a fixed orientation, so the inspection object 7 is gripped by the gripping robot arm 31a or 31b without the vision sensor 22.
The gripping mechanism 3 is composed of articulated (for example, six-axis) robot arms (gripping robot arms 31a and 31b) arranged like a pair of arms on both sides of a support column 18. The gripping robot arms 31a and 31b include arm portions 311a and 311b and hand portions 312a and 312b, respectively; the hand portions 312a and 312b are provided at the tips of the arm portions 311a and 311b.
The hand portions 312a and 312b may take any form capable of gripping the inspection object 7, but a structure in which more than half of the inspection object 7 remains exposed while gripped is desirable. With such a structure, the entire surface of the inspection object 7 can be measured with a single hand-over between the gripping robot arms 31a and 31b. Pinching-type or suction-type hands, for example, can be adopted as the hand portions 312a and 312b.
In the present disclosure, the gripping robot arm 31a or 31b may be referred to simply as the gripping robot arm 31. In this case, its arm portion is referred to as the arm portion 311 and its hand portion as the hand portion 312.
The measurement mechanism 4 is composed of an articulated (for example, six-axis) robot arm (measurement robot arm 41) arranged on top of the support column 18, a two-dimensional optical system (hereinafter "2D optical system 42"), and a three-dimensional sensor (hereinafter "3D sensor 45"). The 2D optical system 42 and the 3D sensor 45 are fixed to, for example, the tip of the measurement robot arm 41, either directly or via another member.
FIGS. 2 and 3 show the 2D optical system 42. FIG. 2 is a perspective view of the 2D optical system 42, and FIG. 3 is an XZ cross-sectional view through its center. As shown in FIGS. 2 and 3, the 2D optical system 42 is composed of a camera 43, consisting of a main body 43a and a lens 43b, and a plurality of illuminations 44 arranged in a hemispherical pattern. The camera 43 is arranged above the apex of the hemisphere and images the inspection object 7.
The camera 43 (main body 43a) is equipped with an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The illuminations 44 are, for example, white LEDs. When the inspection object 7 is imaged, the lighting of the illuminations 44 is switched, so that two-dimensional images with different illumination angles can be acquired. A plurality of cameras 43 may also be arranged.
The 3D sensor 45 is a sensor that measures the three-dimensional shape of the inspection object 7. Any sensor capable of measuring three-dimensional data of an arbitrary region may be used, such as a two-lens or multi-lens stereo camera, an active stereo camera comprising a camera and a projection unit such as a laser or a projector, or a device using the time-of-flight method.
The camera 43 and the 3D sensor 45 are examples of the measurement sensor in the present disclosure.
The discharge mechanism 5 is composed of a conveyor 51 for discharging the inspection object 7. A plurality of conveyors 51 may be provided so that inspection objects 7 can be sorted by inspection pass/fail or by defect type.
The control device 6 is a computer that controls the operations of the supply mechanism 2, the gripping mechanism 3, and the measurement mechanism 4.
The hardware configuration (FIG. 4) and the functional configuration (FIG. 5) of the control device 6 are described below.
(2. Hardware configuration of the control device 6)
FIG. 4 is a block diagram showing the hardware configuration of the control device 6. As shown in the figure, the control device 6 is realized by a general-purpose computer in which a control unit 61, a storage unit 62, a communication unit 63, an input unit 64, a monitor 65, a peripheral device I/F unit 66, a UPS 67, and the like are connected via a bus 69. The configuration is not limited to this, however, and various configurations can be adopted depending on the application and purpose.
The control unit 61 is composed of a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. The CPU loads programs stored in the storage unit 62, the ROM, a recording medium, or the like into a work memory area on the RAM, executes them, and drives and controls each device connected via the bus 69, thereby realizing the processing of the control device 6 described later.
The ROM is a non-volatile memory that permanently holds the computer's boot program, programs such as the BIOS, and data. The RAM is a volatile memory that temporarily holds programs and data loaded from the storage unit 62, the ROM, a recording medium, or the like, and provides the work area used by the control unit 61 for various processes.
The storage unit 62 is an HDD (Hard Disk Drive) or the like and stores the programs executed by the control unit 61, the data required for program execution, the OS (Operating System), and so on. The stored programs include a control program corresponding to the OS and application programs for causing the computer to execute the processing described later. These program codes are read by the control unit 61 as necessary, transferred to the RAM, and executed by the CPU to carry out the various processes.
In the present embodiment, the storage unit 62 stores the shape data 8 (for example, CAD data) of the inspection object 7. A ROS (Robot Operating System) environment is also built in the storage unit 62, and by using the ROS simulator function, the trajectories of the gripping robot arm 31 and the measurement robot arm 41 can be generated automatically.
An optical system simulator is also built in the storage unit 62; by inputting the shape data 8 of the inspection object 7 and the conditions of the optical system, the measurable region P (imaging range) on the shape data 8 can be calculated. The storage unit 62 also stores in advance the trained deep learner 72 used at inspection time.
The communication unit 63 has a communication control device, a communication port, and the like; it is a communication interface that mediates communication between the computer and a network and controls communication with other computers via the network. The network may be wired or wireless.
The input unit 64 is used for data input and includes, for example, a keyboard, a pointing device such as a mouse, and an input device such as a numeric keypad. Operation instructions, action instructions, data input, and the like can be given to the computer via the input unit 64. The monitor 65 includes a display device such as a liquid crystal panel and the logic circuits (a video adapter or the like) that realize the computer's video function in cooperation with the display device. The input unit 64 and the monitor 65 may be integrated, as in a touch panel display.
The peripheral device I/F (Interface) unit 66 is a port for connecting peripheral devices to the computer, and the computer sends and receives data to and from peripheral devices via it. The peripheral device I/F unit 66 is composed of USB (Universal Serial Bus), LAN, IEEE 1394, RS-232C, or the like, and usually has a plurality of peripheral device I/Fs. The connection to peripheral devices may be wired or wireless. Via the peripheral device I/F unit 66, the control device 6 is connected to the conveyor 21, the conveyor 51, the vision sensor 22, the gripping robot arm 31, the measurement robot arm 41, the 2D optical system 42, the 3D sensor 45, and so on.
The UPS 67 is an uninterruptible power supply that continues to supply power even when power is cut off by an outage or the like.
The bus 69 is a path that mediates the exchange of control signals, data signals, and the like between the devices.
The control device 6 may be configured as a single computer, or a plurality of computers may cooperate to execute the operation of the inspection device 1. In the following description, as a simple configuration example, the control device 6 is configured as a single computer.
(3. Functional configuration of the control device 6)
FIG. 5 is a block diagram showing the functional configuration of the control device 6. As shown in the figure, the control device 6 comprises a conveyor control unit 11, a work recognition unit 12, a robot control unit 13, an operation setting unit 14, a measurement unit 15, an inspection unit 16, and a data display unit 17.
The conveyor control unit 11 sends control commands to the drive units of the conveyors 21 and 51 to control their operation. Specifically, the conveyor control unit 11 moves the conveyor 21, on which an uninspected inspection object 7 is placed, in the supply direction (the X direction in FIG. 1) to deliver the inspection object 7 to a position below the vision sensor 22. After inspection is complete, the conveyor control unit 11 moves the conveyor 51, on which the inspected object 7 has been placed, in the discharge direction (the X direction in FIG. 1) to discharge the inspection object 7.
The work recognition unit 12 uses the vision sensor 22 to recognize the position and orientation of the inspection object 7 delivered below the vision sensor 22. On the basis of the recognized position and orientation, the gripping robot arm 31 is controlled and the inspection object 7 is gripped by the hand portion 312.
The robot control unit 13 sends control commands to the drive units of the joints of the gripping robot arm 31 and the measurement robot arm 41 to control their operation.
Specifically, the robot control unit 13 moves the hand portion 312 of the gripping robot arm 31 to the picking position on the conveyor 21 on the basis of the trajectory generated by the trajectory generation unit 143 described later, and controls the hand portion 312 to grip a predetermined position on the inspection object 7. The inspection object 7 on the conveyor 21 is thereby picked.
After the inspection object 7 is gripped (picked), the robot control unit 13 controls the trajectories of the gripping robot arm 31 and the measurement robot arm 41 on the basis of the trajectories generated by the trajectory generation unit 143, and positions the inspection object 7 and the camera 43. After positioning, the inspection object 7 is measured.
The robot control unit 13 also controls the hand-over of the inspection object 7 between the hand portion 312a of the gripping robot arm 31a and the hand portion 312b of the gripping robot arm 31b. For example, a rule specifying which of the gripping robot arms 31a and 31b grips the inspection object 7 at each measurement position 10 is defined in advance, and the control unit 61 performs the hand-over operation on the basis of this rule.
After the inspection of the inspection object 7 is completed, the robot control unit 13 moves the hand portion 312 of the gripping robot arm 31 to the release position on the conveyor 51 on the basis of the trajectory generated by the trajectory generation unit 143, and releases the grip of the hand portion 312 on the inspection object 7. The inspected object 7 is thereby released onto the conveyor 51.
The operation setting unit 14 makes the settings necessary for controlling the operation of the gripping robot arm 31 and the measurement robot arm 41. As shown in FIG. 5, the operation setting unit 14 is composed of an inspection site setting unit 140, an inspection specification setting unit 141, a measurement position calculation unit 142 (a geometry generation unit 142a, a measurement candidate position setting unit 142b, a measurable area calculation unit 142c, and a measurement position selection unit 142d), and a trajectory generation unit 143.
The inspection site setting unit 140 reads the shape data 8 (for example, CAD data) of the inspection object 7 from the storage unit 62 and sets the inspection sites 80 to be inspected on the shape data 8.
The inspection specification setting unit 141 sets an inspection specification 9 for each inspection site 80 set by the inspection site setting unit 140. The inspection specification 9 includes a defect type 91 and a defect specification 92. The defect type 91 indicates the kind of defect to be inspected, such as a "convex" defect, a "concave" defect, or a burr. The defect specification 92 is the limit criterion for each defect, used as the standard for judging whether a defect is harmful or harmless.
The specific processing of the inspection site setting unit 140 and the inspection specification setting unit 141 is described later (see FIGS. 7 to 10).
The measurement position calculation unit 142 calculates the measurement positions 10 for the inspection object 7 on the basis of the shape data 8. A measurement position 10 is a position of the camera 43 relative to the shape data 8 (inspection object 7), for example, position coordinates of the camera 43 with the center of the shape data 8 (inspection object 7) as the origin. The measurement position calculation unit 142 calculates a plurality of measurement positions 10 around the shape data 8 such that all the inspection sites 80 set on the shape data 8 are measured (imaged) by the camera 43. The trajectories of the gripping robot arm 31 and the measurement robot arm 41 are automatically generated on the basis of these measurement positions 10.
As shown in FIG. 5, the measurement position calculation unit 142 is composed of a geometry generation unit 142a, a measurement candidate position setting unit 142b, a measurable area calculation unit 142c, and a measurement position selection unit 142d.
The geometry generation unit 142a generates a geometric shape 81 that encloses the shape data 8 and that can define each direction of the shape data 8.
The measurement candidate position setting unit 142b sets one or more measurement candidate positions 101 in each direction defined by the geometric shape 81.
The measurable area calculation unit 142c calculates, for each measurement candidate position 101 set by the measurement candidate position setting unit 142b, the measurable region P on the shape data 8 using the optical system simulator.
The measurement position selection unit 142d selects the measurement positions 10 from among the measurement candidate positions 101 on the basis of the measurable region P of each measurement candidate position 101. For example, the measurement position selection unit 142d selects the measurement positions 10 such that all the inspection sites 80 (regions to be inspected) are measured and the number of measurements is minimized.
The specific processing of the measurement position calculation unit 142 (the geometry generation unit 142a, the measurement candidate position setting unit 142b, the measurable area calculation unit 142c, and the measurement position selection unit 142d) is described later (see FIGS. 11 to 15).
The trajectory generation unit 143 builds CAD models of the inspection device 1 and its peripheral equipment on ROS, and then generates interference-free trajectories for the gripping robot arm 31 and the measurement robot arm 41 using a motion planning method such as the PRM (Probabilistic Roadmap Planner) method or the RRT (Rapidly-exploring Random Tree) method.
Specifically, the trajectory generation unit 143 calculates the picking position for the inspection object 7 (the target position and target orientation of the hand portion 312) on the basis of the position and orientation of the inspection object 7 recognized by the work recognition unit 12, and generates the trajectory of the gripping robot arm 31 on the basis of the picking position. By controlling the gripping robot arm 31 along this trajectory, the inspection object 7 on the conveyor 21 is gripped (picked) by the hand portion 312 of the gripping robot arm 31.
The trajectory generation unit 143 also generates the trajectories of the gripping robot arm 31 and the measurement robot arm 41 on the basis of the measurement positions 10 calculated by the measurement position calculation unit 142. Specifically, the trajectory generation unit 143 generates the trajectories such that the position of the camera 43 relative to the inspection object 7 becomes the measurement position 10 calculated by the measurement position calculation unit 142. By controlling the gripping robot arm 31 and the measurement robot arm 41 along these trajectories, the inspection object 7 and the camera 43 are positioned at the measurement point. This makes it possible to measure and inspect the inspection object 7 without teaching via a programming pendant or the like, realizing highly efficient inspection.
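The disclosure only states that trajectories are generated on ROS with motion planners such as PRM and RRT. Purely as a hedged illustration of what issuing one such planned motion could look like in a ROS 1 / MoveIt setup, a minimal sketch follows; the move group name "measure_arm", the node name, and all pose values are hypothetical and do not come from this disclosure.

```python
# Hedged sketch: driving a measurement arm to a planned pose with MoveIt (ROS 1).
# The group name "measure_arm" and all pose values are illustrative assumptions.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("measurement_positioning", anonymous=True)

arm = moveit_commander.MoveGroupCommander("measure_arm")

target = Pose()                      # camera pose in the planning frame
target.position.x = 0.4
target.position.y = 0.0
target.position.z = 0.3
target.orientation.w = 1.0           # identity orientation, for simplicity

arm.set_pose_target(target)
ok = arm.go(wait=True)               # plan (e.g. with an RRT-based planner) and execute
arm.stop()
arm.clear_pose_targets()
rospy.loginfo("positioning %s", "succeeded" if ok else "failed")
```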
After the inspection ends, the trajectory generation unit 143 generates the trajectory of the gripping robot arm 31 on the basis of the release position of the inspection object 7 (the target position and target orientation of the hand portion 312). By controlling the gripping robot arm 31 along this trajectory, the inspected object 7 is released onto the conveyor 51.
After the inspection object 7 and the camera 43 are positioned, the measurement unit 15 measures the inspection object 7.
As shown in FIG. 5, the measurement unit 15 is composed of a first measurement unit 151 and a second measurement unit 152.
The first measurement unit 151 sends a measurement instruction to the 2D optical system 42 and measures two-dimensional image data D1 of the inspection object 7.
The second measurement unit 152 sends a measurement instruction to the 3D sensor 45 and measures three-dimensional shape data D2 of the inspection object 7 (shape measurement).
The inspection unit 16 inspects the inspection object 7 on the basis of the measurement data measured by the measurement unit 15 (151, 152).
As shown in FIG. 5, the inspection unit 16 is composed of a first inspection unit 161 and a second inspection unit 162.
The first inspection unit 161 extracts defect candidates 71 on the basis of the two-dimensional image data D1 of the inspection object 7 measured by the first measurement unit 151.
In the present embodiment, the first inspection unit 161 extracts the defect candidates 71 using a deep learner 72 trained by deep learning, a kind of machine learning. The deep learner 72 is a classifier trained on pass/fail image data (training data), with and without defects, prepared for each defect type, and is stored in advance in the storage unit 62. As the deep learning method, SegNet, ResNet, or a method combining SegNet and ResNet can be used, but the method is not limited to these.
The second inspection unit 162 performs a detailed shape inspection (dimensional inspection) of the locations of the defect candidates 71 extracted by the first inspection unit 161, on the basis of the three-dimensional shape data D2 of the inspection object 7 measured by the second measurement unit 152, and makes the final judgment of whether each defect is harmful or harmless.
FIG. 6 shows an outline of the inspection process. As shown in the figure, defect candidates 71 are first extracted from the two-dimensional image data D1 using the deep learner 72 (first inspection). If no defect candidate 71 is extracted, the inspection object is judged "good" (no defect). If a defect candidate 71 is extracted, a shape inspection (dimensional inspection) of the location of the defect candidate 71 is further performed using the three-dimensional shape data D2 (second inspection), and the result is checked against the defect specification 92 for that location to make the final pass/fail judgment.
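A minimal sketch of this two-stage decision logic is shown below, under the assumption that the 2D stage has already produced candidates and the 3D stage has already measured their sizes; all type and field names are illustrative, not from the disclosure.

```python
# Hedged sketch of the two-stage judgment in Fig. 6 (illustrative names only).
from dataclasses import dataclass

@dataclass
class Candidate:          # a defect candidate 71 from the 2D stage
    defect_type: str      # e.g. "convex", "concave", "burr"
    region: tuple         # (x, y, w, h) in image coordinates

@dataclass
class Spec:               # limit criterion (defect specification 92)
    diameter_mm: float
    height_mm: float

def inspect(candidates, measured_sizes, specs):
    """No candidates -> good; otherwise compare the 3D-measured size of each
    candidate with its limit specification to judge harmful/harmless."""
    if not candidates:
        return "good"
    for cand, (dia, h) in zip(candidates, measured_sizes):
        spec = specs[cand.defect_type]
        if dia >= spec.diameter_mm or h >= spec.height_mm:
            return "defective"            # harmful defect found
    return "good"                         # all candidates harmless

# Usage with made-up numbers: one convex candidate measured at 3.2 mm x 2.1 mm
specs = {"convex": Spec(3.0, 2.0)}
print(inspect([Candidate("convex", (10, 20, 5, 5))], [(3.2, 2.1)], specs))
```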
(4. Setting the inspection sites and inspection specifications)
With reference to FIGS. 7 to 10, the process of setting the inspection sites 80 to be inspected and the inspection specifications 9 (defect types 91, defect specifications 92) on the shape data 8 of the inspection object 7 prior to inspection is described.
FIG. 7 is a flowchart showing the flow of the process of setting the inspection sites 80 and the inspection specifications 9.
First, the control unit 61 of the control device 6 reads the shape data 8 of the inspection object 7 from the storage unit 62 and displays it on the monitor 65 (step S1).
FIG. 8 shows an example of the shape data 8 of the inspection object 7 (a pipe joint).
Next, the control unit 61 (inspection site setting unit 140) sets the inspection sites 80 to be inspected on the shape data 8 displayed on the monitor 65 (step S2). This setting is made by user operation via the input unit 64. For example, as in FIG. 9, the user sets inspection site 80a (the outer surface of the joint body) and inspection site 80b (the outer surface of the joint end) as inspection sites 80 on the shape data 8 (CAD data of the pipe joint).
Next, the control unit 61 (inspection specification setting unit 141) accepts the setting of the inspection specifications 9 (defect type 91, defect specification 92) for each inspection site 80 set in step S2 (step S3). For example, as shown in FIG. 10, for inspection site 80a (the outer surface of the joint body), inspection specification 9a-1 ("convex", "Φ3.0 mm, height 2.0 mm") and inspection specification 9a-2 ("concave", "Φ3.0 mm, depth 1.0 mm") are set as inspection specifications 9a.
For inspection site 80b (the outer surface of the joint end), inspection specification 9b-1 ("convex", "Φ3.0 mm, height 2.0 mm"), inspection specification 9b-2 ("concave", "Φ3.0 mm, depth 1.0 mm"), and inspection specification 9b-3 ("burr", "0.5H × 2L (mm)") are set as inspection specifications 9b.
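As an illustration only, such site/specification settings could be held in a simple nested structure; the keys and field names below are hypothetical, chosen to mirror the values in FIGS. 9 and 10.

```python
# Hedged sketch: inspection sites 80 and specifications 9 as plain data.
# Keys and field names are illustrative, not from the disclosure.
inspection_settings = {
    "80a_joint_body_outer": [
        {"defect_type": "convex",  "limit": {"dia_mm": 3.0, "height_mm": 2.0}},
        {"defect_type": "concave", "limit": {"dia_mm": 3.0, "depth_mm": 1.0}},
    ],
    "80b_joint_end_outer": [
        {"defect_type": "convex",  "limit": {"dia_mm": 3.0, "height_mm": 2.0}},
        {"defect_type": "concave", "limit": {"dia_mm": 3.0, "depth_mm": 1.0}},
        {"defect_type": "burr",    "limit": {"h_mm": 0.5, "l_mm": 2.0}},
    ],
}
```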
As described later, the control device 6 selects (calculates) the measurement positions 10 such that all the inspection sites 80 set in step S2 are measured. The control device 6 also judges the quality of the inspection object 7 on the basis of the inspection specifications 9 (defect type 91, defect specification 92) of each inspection site 80 set in step S3.
(5. Calculation of the measurement positions)
With reference to FIGS. 11 to 15, the process of calculating the measurement positions 10 for the inspection object 7 is described. On the basis of the measurement positions 10 calculated here, the trajectories of the gripping robot arm 31 and the measurement robot arm 41 are generated.
FIG. 11 is a flowchart showing the flow of the process of calculating the measurement positions 10.
First, the control unit 61 (geometry generation unit 142a) of the control device 6 generates a geometric shape 81 that encloses the shape data 8 of the inspection object 7 and that can define each direction of the shape data 8 (step S11). For example, it generates a polyhedron 81 enclosing the shape data 8 as shown in FIG. 12, or a spherical grid 81 enclosing the shape data 8 as shown in FIG. 13.
In the case of the polyhedron (FIG. 12), each direction of the shape data 8 is defined by the orientation of each face of the polyhedron (the normal direction of the face). In the case of the spherical grid (FIG. 13), each direction of the shape data 8 is defined by the direction of each intersection of the grid lines. In the present embodiment, the control unit 61 (geometry generation unit 142a) is described as generating the polyhedron 81 of FIG. 12. The polyhedron 81 is, for example, a regular dodecahedron or a regular icosahedron, but is not limited to these.
Next, the control unit 61 (measurement candidate position setting unit 142b) sets one or more measurement candidate positions 101 in each direction defined by the geometric shape 81 (step S12). In the present embodiment, the control unit 61 sets one or more measurement candidate positions 101 (position coordinates of the camera 43) on the face normal 84 passing through the center 83 of each face 82 of the polyhedron 81 (geometric shape 81). The imaging direction of a camera 43 placed at a measurement candidate position 101 is perpendicular to the face 82; that is, the imaging center of the camera 43 is the center 83 of the face 82.
FIG. 14 shows an example of setting measurement candidate positions 101 on the face normal 84 passing through the center 83 of one face 82 of the polyhedron 81 (a regular icosahedron). In the illustrated example, three measurement candidate positions 101 are set on the face normal 84. The measurement candidate positions 101 are set appropriately on the basis of optical conditions such as the field of view, depth of field, and working distance of the camera 43.
The control unit 61 (measurement candidate position setting unit 142b) sets measurement candidate positions 101 for all the faces 82 of the polyhedron 81. For example, if three measurement candidate positions 101 are set for each face 82 of the regular icosahedron of FIG. 14, a total of 60 measurement candidate positions 101 are set.
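As a rough sketch of step S12, the candidate camera positions can be generated by stepping outward along each face normal. The icosahedron is not constructed here; the face centers, normals, and stand-off distances below are illustrative assumptions.

```python
# Hedged sketch of step S12: camera candidates on polyhedron face normals.
import numpy as np

def candidate_positions(face_centers, face_normals, distances):
    """For each polyhedron face, place camera candidates on the outward
    face normal through the face center.  Returns an array of shape
    (n_faces * n_distances, 3)."""
    n = face_normals / np.linalg.norm(face_normals, axis=1, keepdims=True)
    cands = [face_centers + d * n for d in distances]   # one ring per distance
    return np.concatenate(cands, axis=0)

# Illustrative use: one face centered at (0, 0, 0.05) facing +Z, with three
# stand-off distances chosen from hypothetical optics limits (working distance).
centers = np.array([[0.0, 0.0, 0.05]])
normals = np.array([[0.0, 0.0, 1.0]])
print(candidate_positions(centers, normals, distances=[0.20, 0.25, 0.30]))
```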
Next, for each measurement candidate position 101 set in step S12, the control unit 61 (measurable area calculation unit 142c) calculates the measurable region P on the shape data 8 using the optical system simulator (step S13).
Then, on the basis of the measurable regions P of the measurement candidate positions 101 calculated in step S13, the control unit 61 (measurement position selection unit 142d) selects from the measurement candidate positions 101 a set of measurement positions 10 with which all the inspection sites 80 (regions to be inspected) are measured and the number of measurements is minimized (step S14).
For example, the control unit 61 (measurement position selection unit 142d) selects measurement candidate positions 101 one by one in random order from all the measurement candidate positions 101 and merges the measurable regions P of the selected positions. The region into which the measurable regions P have been merged is called the integrated region R. The control unit 61 stops selecting measurement candidate positions 101 once the integrated region R covers (includes) all the inspection sites (regions to be inspected), and selects the measurement candidate positions 101 chosen up to that point as the measurement positions 10.
This is explained concretely using FIG. 15. For example, as shown in FIG. 15, when measurement candidate positions 101a, 101b, and 101c are selected at random in that order, the integrated region R is generated successively as R1, R2, and R3. Here, the integrated region R1 is equivalent to the measurable region P1 of measurement candidate position 101a, the integrated region R2 is R1 merged with the measurable region P2 of measurement candidate position 101b, and R3 is R2 merged with the measurable region P3 of measurement candidate position 101c.
Thereafter, the selection of measurement candidate positions 101 and the merging of measurable regions P (expansion of the integrated region R) are repeated in the same way; selection stops when the integrated region R covers (includes) all the inspection sites 80 (regions to be inspected), and the measurement candidate positions 101 selected up to that point become the measurement positions 10. If the entire measurable region P of a selected measurement candidate position 101 is already contained in the integrated region R generated so far, another measurement candidate position 101 is selected instead. That is, the measurement positions 10 are selected so that regions already measured are not measured redundantly.
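A minimal sketch of this randomized cover selection follows, with measurable regions represented as sets of inspection-site patch IDs (an assumption for illustration; the disclosure computes the regions with an optical simulator).

```python
# Hedged sketch of step S14: random-order coverage of the inspection sites.
import random

def select_measurement_positions(measurable, targets, seed=0):
    """measurable: dict mapping candidate id -> set of covered patch ids.
    targets: set of patch ids that must all be covered.
    Returns the selected candidate ids (the measurement positions 10)."""
    rng = random.Random(seed)
    order = list(measurable)
    rng.shuffle(order)                 # candidates tried in random order
    covered, selected = set(), []
    for cand in order:
        gain = measurable[cand] - covered
        if not gain:                   # fully redundant -> skip, pick another
            continue
        selected.append(cand)
        covered |= measurable[cand]
        if targets <= covered:         # all inspection sites covered -> stop
            return selected
    return None                        # coverage impossible: refine polyhedron

# Illustrative use with made-up patch sets:
regions = {"101a": {1, 2}, "101b": {2, 3}, "101c": {4}}
print(select_measurement_positions(regions, targets={1, 2, 3, 4}))
```

Returning `None` corresponds to the fallback described next: regenerating the polyhedron with more faces and repeating steps S11 to S14.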
If the integrated region R obtained by merging the measurable regions P of all the measurement candidate positions 101 cannot cover all the inspection sites (regions to be inspected), the control unit 61 regenerates the polyhedron 81 with a larger number of faces (a polyhedron 81 with higher directional resolution) and executes the processing of steps S11 to S14 again.
FIG. 16 shows the measurement inspection table 100, which holds various inspection information tied to the measurement positions 10 selected in step S14. The measurement inspection table 100 holds, tied to each measurement position 10 (measurement positions 10-1, 10-2, ...) arranged in measurement order, the inspection sites 80, defect types 91, defect specifications 92, illumination conditions 93, first measurement data 94, second measurement data 95, and inspection results 96 for that measurement position 10.
The inspection sites 80 here are those inspection sites 80 set in step S2 of FIG. 7 that are contained in the measurable region P of each measurement position 10 (the sites that can be measured and inspected at that measurement position 10). The defect types 91 and defect specifications 92 are those tied to the inspection sites 80 in step S3 of FIG. 7.
The illumination condition 93 specifies which illuminations 44 are lit during measurement and is predetermined according to, for example, the defect type 91 ("convex", "concave", "burr", ...).
In the first measurement data 94 and the second measurement data 95, the storage locations of the measurement data are recorded after measurement is executed. In the inspection result 96, the inspection result information is recorded after inspection is executed.
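For illustration, one row of such a table could be modeled as below; the field names are hypothetical and mirror FIG. 16 only loosely.

```python
# Hedged sketch: one row of a measurement-inspection table like Fig. 16.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class MeasurementRow:
    measurement_position: str            # e.g. "10-1"
    inspection_sites: List[str]          # sites measurable at this position
    defect_types: List[str]              # e.g. ["convex", "concave"]
    illumination_conditions: List[str]   # e.g. ["condition 1", "condition 2"]
    first_data_path: Optional[str] = None   # filled in after 2D measurement
    second_data_path: Optional[str] = None  # filled in after 3D measurement
    result: Optional[str] = None            # filled in after inspection

row = MeasurementRow("10-1", ["site A"], ["convex", "concave"],
                     ["condition 1", "condition 2"])
```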
(6. Operation of the inspection device 1)
The operation of the inspection device 1 is described with reference to FIGS. 17 to 23.
FIG. 17 shows an outline of the operation of the inspection device 1. First, as in FIG. 17(a), the inspection device 1 controls the trajectory of the gripping robot arm 31 and grips (picks) the inspection object 7 on the conveyor 21. Next, as in FIG. 17(b), the inspection device 1 controls the trajectories of the gripping robot arm 31 and the measurement robot arm 41 and moves the inspection object 7 to the measurement point. To measure all the inspection sites of the inspection object 7, the inspection device 1 performs measurement a plurality of times while changing the positional relationship between the inspection object 7 and the camera 43. As in FIG. 17(c), the inspection device 1 hands the inspection object 7 over between the gripping robot arms 31a and 31b as necessary. After the inspection ends, as in FIG. 17(d), the inspection device 1 controls the trajectory of the gripping robot arm 31 and releases the inspection object 7 onto the conveyor 51.
The details of the operation of the inspection device 1 (control device 6) are described with reference to the flowchart of FIG. 18.
First, the control unit 61 (conveyor control unit 11) of the control device 6 operates the conveyor 21, on which an uninspected inspection object 7 is placed, and delivers the inspection object 7 to a position below the vision sensor 22 (step S21).
Next, the control unit 61 (work recognition unit 12) recognizes, with the vision sensor 22, the position and orientation of the inspection object 7 delivered in step S21 (step S22).
Next, the control unit 61 (robot control unit 13) moves the gripping robot arm 31 (31a or 31b) on the basis of the position and orientation of the inspection object 7 recognized in step S22 and controls the hand portion 312 to grip the inspection object 7 (step S23).
Specifically, the control unit 61 (trajectory generation unit 143) first calculates, on the basis of the position and orientation of the inspection object 7 recognized in step S22, a target position and target orientation of the hand portion 312 at which the inspection object 7 can be gripped, and generates the trajectory of the gripping robot arm 31 on the basis of them.
Then, the control unit 61 (robot control unit 13) controls the gripping robot arm 31 along the generated trajectory and causes the hand portion 312 to grip a predetermined position on the inspection object 7. The inspection object 7 is thereby picked by the gripping robot arm 31.
Next, the control unit 61 (robot control unit 13) controls the trajectories of the gripping robot arm 31 and the measurement robot arm 41 and positions the inspection object 7 and the camera 43 at the measurement point (step S24).
Specifically, the control unit 61 (trajectory generation unit 143) first performs trajectory calculation for the gripping robot arm 31 and the measurement robot arm 41 so that the position of the camera 43 relative to the inspection object 7 becomes the measurement position 10 in the measurement inspection table 100 (see FIG. 16) (measurement position 10-1 for the first measurement), and generates the trajectories.
Here, since a measurement position 10 only specifies the position of the camera 43 relative to the inspection object 7, it does not uniquely determine the trajectories of the gripping robot arm 31 and the measurement robot arm 41. The control unit 61 (trajectory generation unit 143) therefore uniquely determines, from among the trajectories that realize the measurement position 10, a trajectory satisfying a predetermined condition, for example, the trajectory that minimizes the average movement amount of the gripping robot arm 31 and the measurement robot arm 41, or the trajectory that minimizes their average movement time.
 The control unit 61 (robot control unit 13) then controls the gripping robot arm 31 and the measurement robot arm 41 along the generated trajectories and positions the inspection object 7 and the camera 43.
 Next, the control unit 61 (measurement unit 15, inspection unit 16) measures and inspects the inspection object 7 (step S25).
 The measurement and inspection process is described here with reference to the flowchart of FIG. 19.
 First, the control unit 61 (first measurement unit 151) sends a measurement instruction to the 2D optical system 42 and measures two-dimensional image data D1 of the inspection object 7 (step S41). Specifically, referring to the measurement inspection table 100, the control unit 61 measures the inspection site 80 that can be inspected from the current measurement position 10, once for each defect type 91. In doing so, the control unit 61 may switch the lighting condition 93 according to the defect type 91.
 For example, in FIG. 16, the inspection site 80 at measurement position 10-1 is "site A", and the defect types 91 to be inspected are "convex" and "concave" defects. The lighting condition 93 suited to inspecting "convex" defects is "condition 1", and the one suited to "concave" defects is "condition 2". Accordingly, when measuring "convex" defects at "site A", the control unit 61 (first measurement unit 151) turns on the lighting 44 according to lighting condition 93 "condition 1" and then measures with the camera 43; when measuring "concave" defects at "site A", it turns on the lighting 44 according to lighting condition 93 "condition 2" and then measures with the camera 43.
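 A hedged sketch of this table-driven 2D measurement loop (step S41) follows. The table rows mirror the FIG. 16 example, but the row layout and the lighting/camera objects are assumptions introduced for illustration only.

```python
MEASUREMENT_TABLE = [
    # (measurement position, inspection site, defect type, lighting condition)
    ("10-1", "site A", "convex",  "condition 1"),
    ("10-1", "site A", "concave", "condition 2"),
]

def measure_2d(position, lighting, camera):
    """Capture one image per defect type listed for this measurement position,
    switching the illumination condition before each capture."""
    rows = [r for r in MEASUREMENT_TABLE if r[0] == position]
    images = {}
    for _, site, defect_type, condition in rows:
        lighting.apply(condition)        # hypothetical lighting controller
        images[(site, defect_type)] = camera.capture()  # hypothetical camera API
    return images
```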
 Alternatively, regardless of the defect type 91, the control unit 61 (first measurement unit 151) may measure with a predetermined set of lights 44 turned on (for example, all lights 44), or may measure while switching the lights 44 on one at a time. In the latter case, one two-dimensional image is obtained per light, each with a different illumination angle.
 The control unit 61 (first measurement unit 151) stores the measured two-dimensional image data D1 (measurement data) of the inspection object 7 in the storage unit 62 and records the storage location in the first measurement data 94 field of the measurement inspection table 100.
 Next, the control unit 61 (first inspection unit 161) feeds the two-dimensional image data D1 measured in step S41 into the deep learning device 72 and extracts defect candidates 71 from the two-dimensional image data D1 (step S42).
 FIG. 20 shows an example of extracting defect candidates 71. FIG. 20(a) is an image of the inspection object 7 captured by the camera 43, and FIG. 20(b) is the same captured image with a contour map of the defect degree obtained by deep learning superimposed on it. As shown in FIG. 20(b), locations where the defect degree exceeds a predetermined threshold are extracted as defect candidates 71 (71a, 71b, 71c).
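 A minimal sketch of this candidate-extraction rule: pixels of the deep-learning defect map above a threshold are grouped into connected regions, each of which becomes a defect candidate. The use of scipy for the labeling and the threshold value are assumptions for illustration.

```python
import numpy as np
from scipy import ndimage

def extract_defect_candidates(defect_map, threshold=0.5):
    """defect_map: 2D array of per-pixel defect degree from the deep learner."""
    mask = defect_map > threshold        # pixels whose defect degree is too high
    labels, n = ndimage.label(mask)      # group them into connected regions
    # one bounding box per candidate region (corresponding to 71a, 71b, ...)
    return ndimage.find_objects(labels)
```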
 If no defect candidate 71 is extracted in step S42 (step S43; No), the control unit 61 (first inspection unit 161) judges the inspection object 7 to be "good" (no defect) and records the result in the inspection result 96 field of the measurement inspection table 100 (step S46).
 If a defect candidate 71 is extracted in step S42 (step S43; Yes), the control unit 61 (second measurement unit 152) sends a measurement instruction to the 3D sensor 45, measures three-dimensional shape data D2 of the inspection object 7 (step S44), stores the measured three-dimensional shape data D2 in the storage unit 62, and records the storage location in the second measurement data 95 field of the measurement inspection table 100. When measuring with the 3D sensor 45, the control unit 61 may control the gripping robot arm 31 or the measurement robot arm 41 to adjust the measurement position of the 3D sensor 45 relative to the inspection object 7.
 Next, the control unit 61 (second inspection unit 162) performs a shape inspection (dimensional inspection) of the locations of the defect candidates 71 extracted in step S42, based on the three-dimensional shape data D2 measured in step S44, and judges pass or fail.
 Suppose, for example, that a "convex" defect candidate 71 is extracted from the two-dimensional image data D1 at measurement position 10-1 in FIG. 16. In this case, the control unit 61 (second inspection unit 162) recognizes the shape of the defect candidate 71 (the "convex" defect) from the three-dimensional shape data D2 and performs the shape inspection (dimensional inspection) by checking it against the defect specification 92 of the measurement inspection table 100, "φ3.0 mm, height 2.0 mm". For example, if the control unit 61 (second inspection unit 162) finds that the defect candidate 71 has a diameter (φ) of 3.0 mm or more and a height of 2.0 mm or more, it judges the object "fail" ("convex" defect present); otherwise it judges it "good" (no "convex" defect).
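 The pass/fail rule above can be sketched as follows, assuming the candidate's diameter and height have already been estimated from the 3D shape data D2. The function name and defaults are illustrative; the spec values follow the "φ3.0 mm, height 2.0 mm" example.

```python
def judge_convex_defect(diameter_mm, height_mm,
                        spec_diameter_mm=3.0, spec_height_mm=2.0):
    # "fail" only when BOTH the diameter and the height reach the spec limits
    if diameter_mm >= spec_diameter_mm and height_mm >= spec_height_mm:
        return "NG"   # harmful "convex" defect present
    return "OK"       # within the defect specification
```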
 The control unit 61 (second inspection unit 162) then records the inspection result in the inspection result 96 field of the measurement inspection table 100 (step S46).
 Returning now to the flowchart of FIG. 18.
 If not all inspections have been completed (step S26; No), the control unit 61 determines whether the inspection object 7 must be handed over between the gripping robot arms 31a and 31b before the next measurement and inspection (step S27).
 For example, a rule specifying which of the gripping robot arms 31a and 31b holds the inspection object 7 at each measurement position 10 is defined in advance, and the control unit 61 decides from this rule whether a hand-over is needed.
 If no hand-over is needed, the control unit 61 returns to step S24 and moves the gripping robot arm 31 and the measurement robot arm 41 to the next measurement point.
 If, on the other hand, a hand-over is needed (step S27; Yes), the control unit 61 (robot control unit 13) controls the arms so that the inspection object 7 is passed between the hand unit 312a of the gripping robot arm 31a and the hand unit 312b of the gripping robot arm 31b (step S28).
 Suppose, for example, that the inspection object 7 is to be passed from the hand unit 312a of the gripping robot arm 31a to the hand unit 312b of the gripping robot arm 31b. In this case, the control unit 61 first controls the trajectories of the gripping robot arms 31a and 31b so that the hand units 312a and 312b reach the predetermined hand-over positions and orientations, thereby positioning the two hand units.
 Next, the control unit 61 causes the hand unit 312b to grip the inspection object 7 at its predetermined gripping position and, only after the hand unit 312b has gripped it, releases the grip of the hand unit 312a. The inspection object 7 is thereby transferred between the hand units 312a and 312b.
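 The grip-before-release ordering of this sequence is what keeps the workpiece held at all times. A hedged sketch, where the arm and hand interfaces are assumptions for illustration only:

```python
def hand_over(arm_a, arm_b, handover_pose_a, handover_pose_b):
    """Transfer the workpiece from arm_a's hand (312a) to arm_b's hand (312b)."""
    arm_a.move_to(handover_pose_a)   # position hand 312a at the agreed pose
    arm_b.move_to(handover_pose_b)   # position hand 312b opposite it
    arm_b.grip()                     # 312b grips the predetermined point first
    arm_a.release()                  # only then does 312a let go
```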
 After the hand-over, the process returns to step S24, and the control unit 61 (robot control unit 13) moves the gripping robot arm 31 now holding the object and the measurement robot arm 41 to the next measurement point, positioning the inspection object 7 and the camera 43.
 Steps S24 to S28 above are repeated until all inspections have been completed (step S26; Yes).
 After the inspection of the inspection object 7 is finished, the control unit 61 (robot control unit 13) moves the gripping robot arm 31 holding the inspection object 7 to the release position above the conveyor 51 and controls the hand unit 312 to release its grip on the inspection object 7 (step S29). The inspected object 7 is thereby released onto the conveyor 51. The control unit 61 (conveyor control unit 11) then drives the conveyor 51 carrying the inspected object 7 and discharges it.
 Finally, the control unit 61 (data display unit 17) displays information on the inspection result of the inspection object 7 on the monitor 67. For example, the control unit 61 displays a captured image with the defect locations marked, as shown in FIG. 21. It may also display a trend graph of the daily defect rate, as in FIG. 22, or a correlation analysis comparing the defect rate predicted from manufacturing process parameters with the actual defect rate, as in FIG. 23.
 The present embodiment has been described above with reference to the drawings. The inspection device 1 of this embodiment comprises a gripping robot arm 31 that grips the inspection object 7, a measurement robot arm 41 provided with a camera 43, and a control device 6. In particular, the control device 6 calculates the measurement positions 10 of the camera 43 relative to the inspection object 7 from the shape data 8 of the inspection object 7 and generates trajectories for the gripping robot arm 31 and the measurement robot arm 41 from those measurement positions. The control device 6 then controls the motion of the gripping robot arm 31 and the measurement robot arm 41 along the generated trajectories and positions the inspection object 7 and the camera 43 at the measurement points.
 By automatically generating the trajectories of the gripping robot arm 31 and the measurement robot arm 41 from the measurement positions 10 calculated from the shape data 8 (CAD data or the like) of the inspection object 7 in this way, the inspection object 7 can be measured and inspected without the conventional teaching via a programming pendant or the like, enabling highly efficient inspection.
 Further, according to this embodiment, the measurement positions 10 are calculated so that all inspection sites 80 are measured. Specifically, a polyhedron 81 enclosing the shape data 8 is generated, one or more measurement candidate positions 101 are set on the surface normal 84 passing through the center 83 of each face 82 of the polyhedron 81, and, based on the measurable region P of each measurement candidate position 101, measurement positions 10 are selected such that all inspection sites 80 (inspection target regions) are covered with the minimum number of measurements. The optimum measurement positions 10 for measuring all inspection sites 80 are thus determined automatically.
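 This selection step is a set-cover problem: each candidate position's measurable region is a set of inspection sites, and positions are chosen until every site is covered. The greedy strategy below is an assumed concrete approach; the embodiment only requires full coverage with a minimal number of measurements.

```python
def select_measurement_positions(candidates):
    """candidates: dict mapping candidate position id -> set of site ids
    measurable from that position. Returns a small covering set of positions."""
    uncovered = set().union(*candidates.values())
    chosen = []
    while uncovered:
        # candidate whose measurable region covers the most remaining sites
        pos = max(candidates, key=lambda p: len(candidates[p] & uncovered))
        if not candidates[pos] & uncovered:
            break                     # remaining sites cannot be covered
        chosen.append(pos)
        uncovered -= candidates[pos]
    return chosen
```

 Greedy set cover is not always exactly minimal, but it is a standard, fast approximation for this kind of coverage problem.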
 Further, according to this embodiment, inspection specifications 9, such as the defect types 91 and defect specifications 92 to be inspected, are set in advance in association with each inspection site 80 of the shape data 8. The pass/fail judgment of the inspection object 7 can therefore be made efficiently on the basis of the inspection specifications 9.
 Further, according to this embodiment, measurement and inspection of the inspection object 7 are performed in two stages. That is, defect candidates 71 are extracted from the two-dimensional image data D1 using the deep learning device 72 (first inspection); if no defect candidate 71 is extracted, the inspection object is judged "good" (no defect). If a defect candidate 71 is extracted, a shape inspection (dimensional inspection) of the candidate location is additionally performed based on the three-dimensional shape data D2 (second inspection), and the final pass/fail judgment is made.
 Conventionally, it was common practice to hand-craft rules for recognizing and classifying defects from image features extracted in advance, and to extract defect candidates with those rules. Such rule-based algorithms, however, require parameters and thresholds to be tuned for each individual condition, such as the defect form, the item, and the workpiece shape. Moreover, a great many man-hours are consumed during start-up adjustment of the inspection device and even after it enters operation, for example in repeatedly retuning parameters to reduce missed and false detections. The judgment accuracy also depends on the skill of the inspector, and there are limits to how far such an algorithm can be refined, so the more complex the inspection object, the harder it was to achieve sufficient judgment accuracy.
 In this respect, the present embodiment applies deep learning to defect candidate extraction, eliminating the laborious parameter and threshold tuning and greatly reducing development man-hours. Deep learning alone, however, makes it difficult to obtain numerical data such as the size, depth, and height of a defect, and because its internal processing is a black box, it is hard to give a technical justification for the results it produces.
 For this reason, in the present embodiment, when a defect candidate is extracted by deep learning, a shape inspection (dimensional inspection) of that candidate is additionally performed before the final pass/fail judgment. Because the shape inspection (dimensional inspection) is based on numerical data such as the size, depth, and height of the defect candidate, the technical basis of the inspection result is made explicit. Inspecting by shape inspection alone would also be possible, but three-dimensional shape measurement with the 3D sensor 45 is time-consuming; this embodiment therefore adopts a two-stage measurement/inspection method in which the first measurement and inspection performs a comprehensive check based on the two-dimensional image, and the second performs a detailed shape inspection (dimensional inspection) based on the three-dimensional shape.
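 The two-stage flow can be summarized in a few lines. This is a sketch only; the callables stand for the hypothetical helpers sketched earlier and are not the embodiment's actual interfaces.

```python
def two_stage_inspection(capture_2d, deep_learner, capture_3d, check_dims):
    image = capture_2d()                      # step S41: fast 2D measurement
    candidates = deep_learner(image)          # step S42: candidate extraction
    if not candidates:
        return "OK"                           # no candidates: pass immediately
    shape = capture_3d()                      # step S44: slower 3D scan
    # step S45: dimensional check of each candidate against the defect spec
    return "NG" if any(check_dims(shape, c) for c in candidates) else "OK"
```

 The design point is that the expensive 3D scan runs only when the cheap 2D stage has found something worth verifying.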
 Preferred embodiments of the inspection device 1 and the like according to the present disclosure have been described above with reference to the accompanying drawings, but the present disclosure is not limited to these examples.
 For example, as shown in FIG. 24, the 2D optical system 42 and the 3D sensor 45 may be fixed to the inspection table 19 or the like of an inspection device 1A. In that case, trajectory control of the measurement robot arm 41 becomes unnecessary. That is, in step S23 of FIG. 18, the control unit 61 (trajectory generation unit 143) of the control device 6 generates a trajectory for the gripping robot arm 31 such that the position of the camera 43 relative to the inspection object 7 becomes the measurement position 10, and the control unit 61 (robot control unit 13) controls the gripping robot arm 31 along the generated trajectory to position the inspection object 7.
 Alternatively, a 3D scanner and a tracking marker may be mounted on the tip of the measurement robot arm 41, and the control unit 61 may measure the three-dimensional shape of the inspection object 7 by tracking the motion of the tracking marker with a tracking system.
 It is evident that a person skilled in the art could conceive of various alterations and modifications within the scope of the technical ideas disclosed in this application, and such alterations and modifications are naturally understood to belong to the technical scope of the present disclosure.
1: Inspection device
2: Supply mechanism
2D: Measurement robot arm
3: Gripping mechanism
4: Measurement mechanism
5: Discharge mechanism
6: Control device
7: Inspection object
8: Shape data
9: Inspection specification
10: Measurement position
11: Conveyor control unit
12: Work recognition unit
13: Robot control unit
14: Operation setting unit
15: Measurement unit
16: Inspection unit
17: Data display unit
21: Conveyor
31: Gripping robot arm
31a: Gripping robot arm
31b: Gripping robot arm
41: Measurement robot arm
42: 2D optical system
43: Camera
44: Lighting
45: 3D sensor
51: Conveyor
71: Defect candidate
72: Deep learning device
80: Inspection site
80a: Inspection site
80b: Inspection site
81: Geometric shape
81: Spherical grid
81: Polyhedron
82: Face
83: Center
84: Surface normal
91: Defect type
92: Defect specification
93: Lighting condition
94: First measurement data
95: Second measurement data
96: Inspection result
100: Measurement inspection table
101: Measurement candidate position
101a: Measurement candidate position
101b: Measurement candidate position
101c: Measurement candidate position
140: Inspection site setting unit
141: Inspection specification setting unit
142: Measurement position calculation unit
142a: Geometry generation unit
142b: Measurement candidate position setting unit
142c: Measurable region calculation unit
142d: Measurement position selection unit
143: Trajectory generation unit
151: First measurement unit
152: Second measurement unit
161: First inspection unit
162: Second inspection unit
311: Arm unit
311a: Arm unit
311b: Arm unit
312: Hand unit
312a: Hand unit
312b: Hand unit
D1: Two-dimensional image data
D2: Three-dimensional shape data
P: Measurable region
P1: Measurable region
P2: Measurable region
P3: Measurable region
R: Integrated region
R1: Integrated region
R2: Integrated region

Claims (16)

  1. An inspection device comprising:
     a gripping robot arm that grips an inspection object;
     a measurement sensor;
     a measurement position calculation unit that calculates a measurement position of the measurement sensor with respect to the inspection object based on shape data of the inspection object;
     a trajectory generation unit that generates a trajectory of the gripping robot arm based on the measurement position;
     a robot control unit that controls operation of the gripping robot arm based on the trajectory to position the inspection object;
     a measurement unit that, after positioning, measures the inspection object with the measurement sensor; and
     an inspection unit that inspects the inspection object based on the measured data.

  2. The inspection device according to claim 1, wherein
     the measurement sensor is provided on a measurement robot arm,
     the trajectory generation unit generates trajectories of the gripping robot arm and the measurement robot arm, and
     the robot control unit controls operation of the gripping robot arm and the measurement robot arm based on the trajectories to position the inspection object and the measurement sensor.

  3. The inspection device according to claim 1 or 2, further comprising an inspection site setting unit that sets, in the shape data, inspection sites to be inspected, wherein
     the measurement position calculation unit calculates the measurement positions so that all of the inspection sites are measured by the measurement sensor.

  4. The inspection device according to claim 3, wherein the measurement position calculation unit includes:
     a geometry generation unit that generates a geometric shape that encloses the shape data and is capable of defining directions of the shape data;
     a measurement candidate position setting unit that sets one or more measurement candidate positions in each direction defined by the geometric shape; and
     a measurement position selection unit that selects the measurement positions from among the measurement candidate positions.

  5. The inspection device according to claim 4, wherein
     the geometry generation unit generates a polyhedron that encloses the shape data, and
     the measurement candidate position setting unit sets the measurement candidate positions on the normal direction of each face of the polyhedron.

  6. The inspection device according to claim 4 or 5, wherein the measurement position selection unit selects the measurement positions so that all of the inspection sites are measured and the number of measurements is minimized.

  7. The inspection device according to any one of claims 3 to 6, further comprising an inspection specification setting unit that sets, for the inspection sites set by the inspection site setting unit, defect types to be inspected.

  8. The inspection device according to claim 7, wherein
     the inspection specification setting unit further sets a limit criterion, which is a criterion for judging whether a defect of the defect type is harmful or harmless, and
     the inspection unit inspects the inspection object by checking the measured data against the limit criterion of the defect type.

  9. The inspection device according to claim 7 or 8, wherein the measurement unit measures the inspection object under a lighting condition corresponding to the defect type.

  10. The inspection device according to any one of claims 1 to 9, wherein the measurement unit includes:
      a first measurement unit that measures a two-dimensional image of the inspection object; and
      a second measurement unit that measures a three-dimensional shape of the inspection object.

  11. The inspection device according to claim 10, wherein the inspection unit includes:
      a first inspection unit that extracts defect candidates of the inspection object based on the two-dimensional image measured by the first measurement unit; and
      a second inspection unit that performs a shape inspection of the extracted defect candidates based on the three-dimensional shape measured by the second measurement unit.

  12. The inspection device according to any one of claims 1 to 11, further comprising a display unit that displays inspection results of the inspection unit.

  13. The inspection device according to any one of claims 1 to 12, wherein the robot control unit controls the arms so that the inspection object gripped by the gripping robot arm is gripped by another gripping robot arm.

  14. An inspection method in which a computer executes:
      a step of calculating a measurement position of a measurement sensor with respect to an inspection object based on shape data of the inspection object;
      a step of generating a trajectory of a gripping robot arm that grips the inspection object, based on the measurement position;
      a step of controlling operation of the gripping robot arm based on the trajectory to position the inspection object;
      a step of, after positioning, measuring the inspection object with the measurement sensor; and
      a step of inspecting the inspection object based on the measured data.

  15. A program causing a computer to function as:
      a measurement position calculation unit that calculates a measurement position of a measurement sensor with respect to an inspection object based on shape data of the inspection object;
      a trajectory generation unit that generates a trajectory of a gripping robot arm that grips the inspection object, based on the measurement position;
      a robot control unit that controls operation of the gripping robot arm based on the trajectory to position the inspection object;
      a measurement unit that, after positioning, measures the inspection object with the measurement sensor; and
      an inspection unit that inspects the inspection object based on the measured data.

  16. A positioning method in which a computer executes:
      a step of calculating a measurement position with respect to an inspection object based on shape data of the inspection object;
      a step of generating a trajectory of a gripping robot arm that grips the inspection object, based on the measurement position; and
      a step of controlling operation of the gripping robot arm based on the trajectory to position the inspection object.
PCT/JP2020/026010 2019-07-17 2020-07-02 Inspection device, inspection method, positioning method, and program WO2021010181A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021532783A JPWO2021010181A1 (en) 2019-07-17 2020-07-02

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-131635 2019-07-17
JP2019131635 2019-07-17

Publications (1)

Publication Number Publication Date
WO2021010181A1

Family

ID=74210477

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/026010 WO2021010181A1 (en) 2019-07-17 2020-07-02 Inspection device, inspection method, positioning method, and program

Country Status (2)

Country Link
JP (1) JPWO2021010181A1 (en)
WO (1) WO2021010181A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023127021A1 (en) * 2021-12-27 2023-07-06 株式会社ニコン Control device, control system, robot system, control method, and computer program

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001255279A (en) * 2001-01-09 2001-09-21 Hitachi Ltd Pattern defect inspection method and its device
JP2007240434A (en) * 2006-03-10 2007-09-20 Omron Corp Surface condition testing method and surface condition testing device
JP2007248241A (en) * 2006-03-15 2007-09-27 Omron Corp Inspection method and device of surface state
JP2008292430A (en) * 2007-05-28 2008-12-04 Panasonic Electric Works Co Ltd Appearance inspecting method and appearance inspecting device
WO2015008373A1 (en) * 2013-07-19 2015-01-22 富士通株式会社 Information processing device, method of calculating inspection range, and program
JP2015085493A (en) * 2013-11-01 2015-05-07 セイコーエプソン株式会社 Robot, processor, and inspection method
JP2017062154A (en) * 2015-09-24 2017-03-30 アイシン精機株式会社 Defect detection device and defect detection method
JP2018004310A (en) * 2016-06-28 2018-01-11 キヤノン株式会社 Information processing device, measurement system, information processing method and program
JP2018194542A (en) * 2017-05-17 2018-12-06 オムロン株式会社 Image processing system, image processing apparatus, and image processing program
JP2019002788A (en) * 2017-06-15 2019-01-10 リョーエイ株式会社 Metal processing surface inspection method, and metal processing surface inspection device



Also Published As

Publication number Publication date
JPWO2021010181A1 (en) 2021-01-21

Similar Documents

Publication Publication Date Title
US10551821B2 (en) Robot, robot control apparatus and robot system
US10894324B2 (en) Information processing apparatus, measuring apparatus, system, interference determination method, and article manufacturing method
US8406923B2 (en) Apparatus for determining pickup pose of robot arm with camera
US8098928B2 (en) Apparatus for picking up objects
US11407111B2 (en) Method and system to generate a 3D model for a robot scene
JP6892286B2 (en) Image processing equipment, image processing methods, and computer programs
JP5743499B2 (en) Image generating apparatus, image generating method, and program
US20180338090A1 (en) Image processing system, image processing device, and image processing program
JP6740288B2 (en) Object inspection apparatus, object inspection system, and method for adjusting inspection position
JP7481427B2 (en) Removal system and method
US9595095B2 (en) Robot system
JP2018202501A (en) Robot control device, robot, and robot system
WO2021039775A1 (en) Image processing device, image capturing device, robot, and robot system
JP6973233B2 (en) Image processing system, image processing device and image processing program
WO2021010181A1 (en) Inspection device, inspection method, positioning method, and program
JP2008168372A (en) Robot device and shape recognition method
JP2022160363A (en) Robot system, control method, image processing apparatus, image processing method, method of manufacturing products, program, and recording medium
CN116745576A (en) Restoring material properties using active illumination and cameras on robotic manipulators
JP2015085434A (en) Robot, image processing method and robot system
US10656097B2 (en) Apparatus and method for generating operation program of inspection system
US20230297068A1 (en) Information processing device and information processing method
WO2016151667A1 (en) Teaching device and method for generating control information
Fröhlig et al. Three-dimensional pose estimation of deformable linear object tips based on a low-cost, two-dimensional sensor setup and AI-based evaluation
CN115213894A (en) Robot image display method, display system, and recording medium
CN115194774A (en) Binocular vision-based control method for double-mechanical-arm gripping system

Legal Events

Date Code Title Description
     121  Ep: the epo has been informed by wipo that ep was designated in this application
          Ref document number: 20840879; Country of ref document: EP; Kind code of ref document: A1
     ENP  Entry into the national phase
          Ref document number: 2021532783; Country of ref document: JP; Kind code of ref document: A
     NENP Non-entry into the national phase
          Ref country code: DE
     122  Ep: pct application non-entry in european phase
          Ref document number: 20840879; Country of ref document: EP; Kind code of ref document: A1