WO2024089852A1 - Control device, robot system, and control method - Google Patents

Control device, robot system, and control method

Info

Publication number
WO2024089852A1
Authority
WO
WIPO (PCT)
Prior art keywords
end effector
height
workpiece
control device
robot
Prior art date
Application number
PCT/JP2022/040235
Other languages
English (en)
Japanese (ja)
Inventor
祐一郎 菊川
信夫 大石
Original Assignee
株式会社Fuji
Priority date
Filing date
Publication date
Application filed by 株式会社Fuji
Priority to PCT/JP2022/040235
Publication of WO2024089852A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/04Viewing devices
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/404Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for compensation, e.g. for backlash, overshoot, tool offset, tool wear, temperature, machine construction errors, load, inertia

Definitions

  • This specification discloses a control device, a robot system, and a control method.
  • Conventionally, a robot system has been proposed that measures, with high precision, the position of a measurement piece formed to the shape of a workpiece from an image of the measurement piece held by a robot hand and positioned over a camera, and that calculates the coordinate alignment between the robot body and the camera from the control position of the robot body and the measured position of the measurement piece (see, for example, Patent Document 1).
  • This robot system is said to be able to easily calibrate the coordinate alignment.
  • In such a robot system, a workpiece may be picked up by an end effector attached to the robot, and the workpiece may be held at a specified height and imaged in order to be detected. For this purpose, it is necessary to accurately obtain the workpiece height directly, or the height of the end effector indirectly.
  • This disclosure has been made in consideration of these issues, and its primary objective is to provide a control device, a robot system, and a control method that can further improve height accuracy.
  • To achieve the above-mentioned primary objective, the control device of the present disclosure is a control device for a robot equipped with an end effector that picks up a workpiece, and includes a control unit that executes a height detection process of causing the end effector to pick up a jig having reference portions provided at predetermined intervals, capturing an image of the jig, and detecting a height of the end effector based on the reference portions included in the captured image.
  • This control device detects the height of the end effector using the spacing of the reference portions in an image captured of a jig on which reference portions are provided at predetermined intervals. It is therefore less affected by mounting accuracy and detection sensitivity than, for example, a device that attaches a height jig to the end effector and detects the height with a touchdown sensor, and can further improve height accuracy.
  • FIG. 1 is a schematic explanatory diagram showing an example of a robot system 10.
  • FIG. 2 is an explanatory diagram showing an example of an imaging unit 14.
  • FIG. 3 is an explanatory diagram showing an example of a mounting table 17 and a calibration jig 18.
  • FIG. 4 is a perspective view showing an example of an end effector 22.
  • FIG. 5 is an explanatory diagram showing an example of correspondence information 43 stored in a storage unit 42.
  • FIG. 6 is a flowchart showing an example of a height detection processing routine.
  • FIG. 7 is a flowchart showing an example of a mounting processing routine.
  • FIG. 8 is an explanatory diagram showing an example of another robot system 10B.
  • FIG. 9 is an explanatory diagram of a robot system 100 having a touchdown sensor 119.
  • FIG. 10 shows the measurement results of mounting accuracy when using the touchdown sensor 119.
  • FIG. 11 shows the measurement results of mounting accuracy when using the calibration jig 18 of the present disclosure.
  • FIG. 1 is a schematic explanatory diagram showing an example of a robot system 10 according to the present disclosure.
  • FIG. 2 is a schematic explanatory diagram showing an example of an imaging unit 14.
  • FIG. 3 is an explanatory diagram showing an example of a mounting table 17 and a calibration jig 18.
  • FIG. 4 is a perspective view showing an example of an end effector 22.
  • FIG. 5 is an explanatory diagram showing an example of correspondence information 43 stored in a storage unit 42.
  • In the following, the left-right direction (X-axis), the front-back direction (Y-axis), and the up-down direction (Z-axis) are as shown in FIGS. 1 and 2.
  • The robot system 10 is, for example, a device that mounts a workpiece 31 on a processing object 30, and includes a working device 11 and a supply device 25.
  • The working device 11 includes a transport device 12, an imaging unit 14, a mounting table 17, a working unit 20, and a control device 40.
  • The processing object 30 may include an insertion portion into which the protrusions 33 of the workpiece 31 are inserted; examples of the processing object 30 include a substrate and a three-dimensional base material (a solid object).
  • The workpiece 31 is an electronic component having a main body 32 and protrusions 33.
  • The protrusions 33 are terminal pins, and multiple protrusions 33 are formed on the main body 32.
  • The workpiece 31 is fixed by inserting the protrusions 33 into the insertion portion of the processing object 30.
  • Each protrusion 33 has a tapered portion 34 with a tapered surface that narrows toward the tip, and a tip portion 35, which is the tip end face, formed on the tip side.
  • The protrusions 33 are also referred to below as "terminals" or "pins".
  • The transport device 12 is a unit that carries in the processing object 30, transports it, fixes it at the mounting position, and carries it out.
  • The transport device 12 transports the processing object 30 using a pair of conveyor belts spaced apart from each other.
  • The imaging unit 14 is a device that captures an image of the underside of the workpiece 31 picked up and held by the working unit 20.
  • The imaging unit 14 is disposed between the transport device 12 and the supply device 25.
  • The imaging unit 14 includes an imaging section 15 and an illumination section 16.
  • The imaging range of the imaging section 15 is above the imaging section 15, as shown in FIG. 2.
  • The imaging section 15 has an imaging element, such as a CMOS image sensor or a CCD image sensor, that generates an electric charge upon receiving light and outputs the generated charge.
  • The imaging section 15 has a depth of field at least as large as the distance between the main body 32 and the protrusions 33.
  • The illumination section 16 irradiates the workpiece 31 held by the working unit 20 with light.
  • The imaging unit 14 captures an image of the workpiece 31 when the articulated arm robot 21 holding the workpiece 31 stops above the imaging section 15, or while it is moving, and outputs the captured image to the control device 40.
  • Using this captured image, the control device 40 can inspect whether the shape and position of the workpiece 31 are normal, and can detect the amount of misalignment, such as positional or rotational deviation, of the workpiece 31 at the time it was picked up.
  • The mounting table 17 is a table on which a member is placed, and is provided adjacent to the imaging unit 14.
  • A calibration jig 18 is placed on the mounting table 17.
  • The calibration jig 18 is a jig used to calibrate the height of the end effector 22 and/or the workpiece 31 before the articulated arm robot 21 picks up the workpiece 31 and images it with the imaging unit 14.
  • The calibration jig 18 has reference portions 19 provided at predetermined intervals Lx and Ly on its underside.
  • The reference portions 19 are marks from which the respective intervals can be detected.
  • Here, the reference portions 19 are circular marks, but they are not limited to circles as long as their positions can be determined and the lengths between them can be measured.
  • The reference portions 19 may also be arranged in a checkerboard pattern. In the robot system 10, the height of the calibration jig 18 at the time of imaging is determined from the intervals between the reference portions 19, as outlined below.
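The geometry behind this is the ordinary pinhole-camera relation. As a minimal illustration (the symbols f, s, d, p, and r below are ours, not the patent's), assume the imaging section faces straight up with focal length f and pixel pitch s, and the jig is at height d above the lens. Reference portions spaced Lx apart then appear p pixels apart in the image:

$$p = \frac{f\,L_x}{s\,d}, \qquad r := \frac{L_x}{p} = \frac{s\,d}{f} \quad [\mathrm{\mu m/pixel}]$$

The resolution r thus grows linearly with the height d, with constant slope dr/dd = s/f; this is why a single "unit resolution" (μm/pixel/mm) suffices in the correspondence information 43 described later.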
  • The working unit 20 is a robot that places the workpiece 31 on the processing object 30.
  • The working unit 20 includes an articulated arm robot 21, an end effector 22, and a camera 23.
  • The articulated arm robot 21 is a work robot in which a first arm and a second arm, each rotating about a rotation axis, are connected.
  • The end effector 22 is a member that picks up the workpiece 31 and places it on the processing object 30, and is rotatably connected to the tip of the articulated arm robot 21.
  • As shown in FIG. 4, for example, the end effector 22 may include a mechanical chuck 22a that physically grasps and picks up the workpiece 31, and a suction nozzle 22b that picks up the workpiece 31 by suction under negative pressure.
  • The end effector 22 may also include only one of the mechanical chuck 22a and the suction nozzle 22b.
  • The camera 23 is fixed to the tip of the articulated arm robot 21 and captures images in the downward direction; it is used, for example, to image a workpiece 31 placed on the supply device 25 and recognize the posture of that workpiece 31.
  • The supply device 25 is a device that supplies the workpieces 31.
  • The supply device 25 includes a supply placement section 26, a vibration section 27, and a part supply section 28.
  • The supply placement section 26 is configured as a belt conveyor having a placement surface on which workpieces 31 in indefinite postures are placed and moved in the front-back direction.
  • The vibration section 27 applies vibration to the supply placement section 26 to change the postures of the workpieces 31 on the supply placement section 26.
  • The part supply section 28, disposed at the upper rear of the supply placement section 26 (not shown), is a device that stores the workpieces 31 and supplies them to the supply placement section 26.
  • The part supply section 28 releases workpieces 31 onto the supply placement section 26 periodically, or when the number of parts on the placement surface falls below a predetermined number.
  • The working unit 20 uses the articulated arm robot 21, the end effector 22, and the camera 23 to recognize and pick up those workpieces 31 on the supply placement section 26 that are in a posture allowing pickup.
  • The control device 40 is a device used in a robot equipped with an end effector 22 that picks up the workpiece 31.
  • The control device 40 has a microprocessor centered on a CPU 41, and controls the entire apparatus, including the transport device 12, the imaging unit 14, the working unit 20, and the supply device 25.
  • The CPU 41 serves as a control unit, and has both the functions of an image control unit of an image processing device that executes image processing for the imaging unit 14 and the functions of a mounting control unit of a mounting device that controls the articulated arm robot 21.
  • The control device 40 outputs control signals to the transport device 12, the imaging unit 14, the working unit 20, and the supply device 25, and receives input signals from the transport device 12, the imaging unit 14, the working unit 20, and the supply device 25.
  • The control device 40 includes a storage unit 42, a display device 47, and an input device 48.
  • The storage unit 42 is a large-capacity storage medium, such as an HDD or a flash memory, that stores various application programs and various data files. As shown in FIG. 5, the storage unit 42 stores correspondence information 43.
  • The correspondence information 43 associates the resolution calculated from the pitch of the reference portions 19 with the height of the reference portions 19.
  • The correspondence information 43 uses a unit resolution (μm/pixel/mm) that indicates the amount of change in resolution per change in height of the end effector 22; the relationship between the resolution (μm/pixel) and the height H of the workpiece 31 picked up by the end effector 22 is determined experimentally, and is set such that the height H tends to increase as the resolution increases (see the sketch below).
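As a concrete illustration of how such correspondence information might be applied, the following is a minimal Python sketch; the linear model follows the description above, but every numeric value is a hypothetical placeholder, not a value from the patent:

```python
# Minimal sketch of the correspondence information 43 (assumed linear model).
# All numeric values are hypothetical examples, not values from the patent.

REF_RESOLUTION = 20.0   # resolution at the reference height (um/pixel), hypothetical
REF_HEIGHT_MM = 100.0   # reference height H of the end effector (mm), hypothetical
UNIT_RESOLUTION = 0.05  # change in resolution per mm of height (um/pixel/mm), hypothetical

def height_from_resolution(resolution_um_per_px: float) -> float:
    """Return the height H corresponding to a measured resolution.

    The correspondence information 43 is determined experimentally such that
    the height H tends to increase as the resolution increases, i.e.:
        H = H_ref + (r - r_ref) / unit_resolution
    """
    return REF_HEIGHT_MM + (resolution_um_per_px - REF_RESOLUTION) / UNIT_RESOLUTION
```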
  • The display device 47 is a display that shows information about the robot system 10 and various input screens.
  • The input device 48 includes a keyboard and a mouse for input by the operator.
  • The control device 40 executes a height detection process of causing the end effector 22 to pick up the calibration jig 18, capturing an image of the calibration jig 18, and detecting the height of the end effector 22 based on the reference portions 19 included in the captured image.
  • The CPU 41 may also detect the height of the end effector 22 using the unit resolution, that is, the amount of change in resolution per change in height of the reference portions 19 included in the captured image.
  • The CPU 41 also executes processing to cause the imaging section 15 to image the workpiece 31 picked up by the end effector 22, using the detected height of the end effector 22, and to detect the workpiece 31.
  • The CPU 41 executes the height detection process at a specified calibration timing.
  • FIG. 6 is a flowchart showing an example of the height detection processing routine executed by the CPU 41. This routine is stored in the storage unit 42 and is executed at predetermined intervals after the robot system 10 is started. When this routine starts, the CPU 41 first determines whether it is time for height detection (S100).
  • The CPU 41 determines that it is time for height detection when a calibration process is executed immediately after the robot system 10 is started, or when recalibration is performed after a predetermined time has elapsed during operation. If it is not time for height detection, the CPU 41 ends this routine as it is.
  • Otherwise, the CPU 41 executes the height detection processing of the end effector 22 described below. Specifically, the CPU 41 controls the articulated arm robot 21 so that the end effector 22 picks up the calibration jig 18 and holds it at the imaging position of the imaging unit 14 (S110). The end effector 22 attached to the working unit 20 is the same one that actually picks up the workpiece 31. Next, the CPU 41 images the calibration jig 18 and detects the reference portions 19 (S120), counts the number of pixels between the reference portions 19 in the captured image, and divides the actual distance between the reference portions 19 by the number of pixels to obtain the resolution (S130). The CPU 41 then obtains the height H from the obtained resolution using the correspondence information 43 (S140), as in the sketch below.
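A minimal Python sketch of steps S130 to S150 under the same assumptions as above; it reuses height_from_resolution() from the earlier sketch, the mark detection of S120 is abstracted into the two pixel-coordinate arguments, and the numeric values are hypothetical:

```python
import math

PITCH_LX_UM = 10_000.0       # actual spacing Lx of the reference portions 19 (um), hypothetical
REFERENCE_HEIGHT_MM = 100.0  # reference height of the end effector (mm), hypothetical

def height_and_correction(mark_a, mark_b):
    """Sketch of S130-S150, given the pixel centers of two adjacent reference
    portions 19 already detected in the captured image (S120)."""
    # S130: count the pixels between the reference portions, then divide the
    # actual distance by the pixel count to obtain the resolution (um/pixel).
    pixels = math.hypot(mark_b[0] - mark_a[0], mark_b[1] - mark_a[1])
    resolution = PITCH_LX_UM / pixels

    # S140: obtain the height H from the resolution via the correspondence information 43.
    height = height_from_resolution(resolution)

    # S150: set a height-direction correction value that cancels the difference
    # from the reference height.
    correction = REFERENCE_HEIGHT_MM - height
    return height, correction
```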
  • In this way, the control device 40 can determine the height of the end effector 22, or the height of the calibration jig 18, from the captured image of the calibration jig 18.
  • Next, the CPU 41 uses the acquired height H to determine the difference from the reference height, sets a height-direction correction value that cancels this difference, stores the set correction value in the storage unit 42 (S150), and ends this routine.
  • In this way, the control device 40 can acquire the height of the end effector 22 in a non-contact manner by imaging the calibration jig 18 provided with the reference portions 19, and can set a correction value for correcting that height.
  • FIG. 7 is a flowchart showing an example of the mounting processing routine executed by the CPU 41.
  • The mounting processing routine is stored in the storage unit 42 and is executed in response to a start command from the operator.
  • When this routine starts, the CPU 41 first reads and acquires the mounting condition information (S200).
  • The mounting condition information includes information such as the shapes and sizes of the processing object 30 and the workpiece 31, as well as the placement positions and the number of workpieces 31 to be placed.
  • Next, the CPU 41 causes the transport device 12 to transport the processing object 30 to the mounting position and fix it there (S210), and controls the part supply section 28 to supply workpieces 31 to the supply placement section 26 (S220).
  • Next, the CPU 41 controls the articulated arm robot 21 to move the camera 23 above the supply placement section 26, causes the camera 23 to image the workpieces 31 on the supply placement section 26, and performs processing to recognize a workpiece 31 that can be picked up (S230).
  • The CPU 41 recognizes a workpiece 31 on the supply placement section 26 whose protrusions 33 face downward and whose main body 32 faces upward as a pickable part. When there is no pickable workpiece 31, the CPU 41 drives the vibration section 27 to change the postures of the workpieces 31. When there is no workpiece 31 on the supply placement section 26 at all, the CPU 41 drives the part supply section 28 to supply workpieces 31 onto the supply placement section 26; this fallback logic is sketched below.
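The fallback logic of S230 could be summarized as the following sketch; the attribute names and the helper functions (drive_vibration_section, drive_part_supply) are hypothetical stand-ins, not an API of the robot system 10:

```python
def choose_pickable_workpiece(recognized):
    """Sketch of the S230 fallback: pick, shuffle postures, or resupply.

    `recognized` is a list of recognition results from the camera 23; each
    element is assumed to expose hypothetical flags about its posture.
    """
    pickable = [w for w in recognized if w.protrusions_down and w.body_up]
    if pickable:
        return pickable[0]          # a workpiece 31 that can be picked up next
    if recognized:
        drive_vibration_section()   # vibration section 27: change the postures
    else:
        drive_part_supply()         # part supply section 28: release more workpieces 31
    return None                     # caller images the supply placement section 26 again
```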
  • Next, the CPU 41 drives the articulated arm robot 21 and the end effector 22, causing the end effector 22 to pick up the workpiece 31 using the height correction value and move it above the imaging unit 14 (S240).
  • Since the working unit 20 picks up the workpiece 31 with the height correction applied, the workpiece 31 can be picked up more reliably.
  • The CPU 41 also adjusts the imaging position and the control of the articulated arm robot 21 in advance so that the center position of the end effector 22 moves to the center of the image.
  • Next, the CPU 41 moves the end effector 22 to the imaging position using the height correction value, stops it there, and images the workpiece 31 (S250).
  • The control device 40 can thus image the workpiece 31 at the corrected position and obtain a more appropriate image.
  • Next, the CPU 41 uses the captured image to inspect the tips 35 of the protrusions 33 (S260) and determines whether the workpiece 31 is usable (S270).
  • The CPU 41 determines that the workpiece 31 is usable when the tips 35 of all the protrusions 33 are detected within a predetermined tolerance range.
  • This tolerance range may be determined by examining the relationship between the positional deviation of the tips 35 and the placement result on the processing object 30, and may be set as the range within which the workpiece 31 can be properly placed on the processing object 30; a minimal sketch of this check follows.
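A minimal sketch of the usability decision of S260 to S270, assuming the detected and nominal tip positions are available as (x, y) pairs in micrometers; the tolerance value is a hypothetical placeholder:

```python
import math

TOLERANCE_UM = 50.0  # allowable deviation of each tip 35 (um), hypothetical value

def workpiece_is_usable(detected_tips, nominal_tips) -> bool:
    """Sketch of S260-S270: usable only if the tip 35 of every protrusion 33
    is detected within the predetermined tolerance range."""
    if len(detected_tips) != len(nominal_tips):
        return False  # an undetected tip means the workpiece 31 cannot be used
    return all(
        math.hypot(dx - nx, dy - ny) <= TOLERANCE_UM
        for (dx, dy), (nx, ny) in zip(detected_tips, nominal_tips)
    )
```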
  • When the workpiece 31 is determined to be usable, the CPU 41 executes processing to place it at the predetermined position on the processing object 30 (S280). At this time, the CPU 41 may correct the position based on the positions of the tips 35 before placing the workpiece on the processing object 30. Since the working unit 20 places the workpiece 31 with the height correction applied, the workpiece 31 can be placed more reliably.
  • When the workpiece 31 is determined not to be usable, the CPU 41 stops using the workpiece 31, discards it, and notifies the operator with a message (S290). The operator is notified, for example, by displaying a message or an icon on the display device 47 indicating that the protrusions 33 of the workpiece 31 cannot be inserted into the insertion portion of the processing object 30.
  • Next, the CPU 41 determines whether there is a next workpiece 31 to be placed (S300), and if there is, executes the processing from S220 onward. That is, the CPU 41 repeatedly executes the processing of supplying workpieces 31 to the supply placement section 26 as necessary, recognizing a pickable workpiece 31, picking it up, imaging it with the imaging unit 14, determining whether it is usable, and placing it.
  • When there is no next workpiece 31, the CPU 41 determines whether there is a processing object 30 on which workpieces 31 should be placed next (S310), and if there is a next processing object 30, executes the processing from S210 onward.
  • In that case, the CPU 41 discharges the processing object 30 on which the workpieces 31 have been placed, carries in the next processing object 30, and executes the processing from S220 onward. On the other hand, when there is no next processing object 30 in S310, the CPU 41 ends this routine.
  • Here, the correspondence between the elements of this embodiment and those of this disclosure will be clarified: the control device 40 of this embodiment is an example of the control device of this disclosure, the CPU 41 is an example of the control unit, the end effector 22 is an example of the end effector, the calibration jig 18 is an example of the jig, the articulated arm robot 21 is an example of the arm robot, and the robot system 10 is an example of the robot system. Note that this embodiment also clarifies an example of the control method of this disclosure through the explanation of the operation of the robot system 10.
  • The control device 40 of this embodiment described above is used in a robot equipped with an end effector 22 that picks up a workpiece 31.
  • This control device 40 has the CPU 41 as a control unit that executes a height detection process of causing the end effector 22 to pick up the calibration jig 18 having reference portions 19 provided at predetermined intervals, imaging the calibration jig 18, and detecting the height of the end effector 22 based on the reference portions 19 included in the captured image.
  • This control device 40 detects the height H of the end effector 22 using the intervals between the reference portions 19 in an image of the calibration jig 18. Therefore, compared with, for example, a device that attaches a height jig to the end effector 22 and detects the height with a touchdown sensor, it is less affected by mounting accuracy and detection sensitivity, and height accuracy can be further improved.
  • The CPU 41, as the control unit, detects the height H of the end effector 22 using the unit resolution, that is, the amount of change in resolution per change in height of the reference portions 19 included in the captured image, so height accuracy can be further improved by using the unit resolution. Furthermore, the CPU 41 uses the detected height H of the end effector 22 to have the imaging unit image the workpiece 31 picked up by the end effector 22 and detect the workpiece 31; therefore, the accuracy of the imaging processing of the workpiece 31 can be further improved using the result of the height detection process in which the calibration jig 18 is imaged and image-processed. Furthermore, since the CPU 41 executes the height detection process at a predetermined calibration timing, it can execute the height detection process as a calibration process of the apparatus.
  • In this way, the accuracy in the height direction of the articulated arm robot 21 can be further improved by using the captured image of the calibration jig 18, in which the reference portions 19 are provided at predetermined intervals.
  • The robot system 10 also includes the articulated arm robot 21 equipped with the end effector 22 that picks up the workpiece 31, and the control device 40; like the control device 40, this robot system 10 can further improve height accuracy.
  • In the above embodiment, the control device 40 of the working device 11 has the functions of the control device of the present disclosure, but this is not particularly limiting, and a separate control device may be provided in the working device 11 or in the articulated arm robot 21.
  • In the above embodiment, the CPU 41 has the functions of the control unit, but this is not particularly limiting, and a control unit may be provided separately from the CPU 41.
  • In either case, the height H of the end effector 22 is detected using an image of the calibration jig 18, so height accuracy can be further improved.
  • In the above embodiment, the CPU 41 detects the height H of the end effector 22 using the unit resolution, but the method is not particularly limited as long as the height is detected using a captured image of the calibration jig 18.
  • For example, the correspondence information 43 may map the pitch of the reference portions 19 directly to the height H, and the CPU 41 may determine the height H directly from the pitch of the reference portions 19.
  • In the above embodiment, a height-direction correction value for the end effector 22 is calculated from the captured image of the calibration jig 18, and this correction value is used when picking up, imaging, and placing the workpiece 31; however, the use of the correction value may be omitted in one or more of these processes.
  • In the above embodiment, the workpiece 31 is provided with protrusions 33, and the protrusions 33 are inserted into the insertion portion of the processing object 30.
  • However, this is not particularly limiting: the workpiece 31 need not have protrusions 33, and the processing object 30 need not have an insertion portion.
  • That said, for a workpiece 31 having protrusions 33, height accuracy has a greater effect on mounting accuracy, so the processing disclosed herein is more meaningful in that case.
  • In the above embodiment, the end effector 22 is described as having a mechanical chuck 22a and a suction nozzle 22b, but either the mechanical chuck 22a or the suction nozzle 22b may be omitted.
  • FIG. 8 is an explanatory diagram showing an example of another robot system 10B.
  • This robot system 10B includes a mounting device 11B and a management device 60.
  • The mounting device 11B includes a transport device 12B that transports a board 30B, a supply device 25B including a feeder 29, an imaging unit 14, a mounting unit 20B, and a control device 40.
  • The mounting unit 20B includes the mounting table 17 on which the calibration jig 18 is placed, a mounting head 21B, an end effector 22B attached to the mounting head 21B, and a head moving section 24 that moves the mounting head 21B in the XY directions.
  • The control device 40 is the same as in the embodiment described above. Even with such a mounting device 11B, the height accuracy of the XY robot when placing a workpiece 31 having protrusions projecting from its main body onto the processing object 30 can be improved, and thus the accuracy of mounting the workpiece 31 onto the processing object 30 can be improved.
  • In the above embodiment, the present disclosure has been described in the form of the control device 40, but it is not particularly limited to this, and may take the form of a control method for a robot, or of a program that causes a computer to execute this control method.
  • Here, the control method of the present disclosure may be configured as follows.
  • The control method of the present disclosure is a control method executed by a computer that controls a robot equipped with an end effector that picks up a workpiece, and includes: (a) causing the end effector to pick up a jig having reference portions provided at predetermined intervals and capturing an image of the jig; and (b) executing a height detection process of detecting a height of the end effector based on the reference portions included in the obtained captured image.
  • In this control method, as with the control device described above, the height of the end effector is detected using the spacing of the reference portions in an image of a jig having reference portions provided at predetermined intervals, so height accuracy can be further improved.
  • In this control method, various aspects of the control device described above may be adopted, and steps may be added so as to realize each function of the control device described above.
  • Here, the "height of the end effector" may be the height of a predetermined position (e.g., the bottom end) of the end effector, or may be treated as synonymous with the height of the workpiece picked up by the end effector.
  • Experimental Example 1 below corresponds to a comparative example, and Experimental Example 2 corresponds to an embodiment of the present disclosure.
  • FIG. 9 is an explanatory diagram of a robot system 100 having a conventional touchdown sensor 119.
  • The robot system 100 includes an imaging unit 14 having an imaging section 15, an articulated arm robot 21, a contact jig 118, and the touchdown sensor 119.
  • In Experimental Example 1, the contact jig 118 is brought into contact with the touchdown sensor 119, and the height of the end effector 22 is detected from the contact position.
  • In Experimental Example 2, a robot system 10 that detects the height of the end effector 22 using a captured image of the calibration jig 18 was constructed (see FIGS. 1 to 5).
  • In Experimental Example 2, the height H of the end effector 22 was detected using a captured image of the calibration jig 18, and the mounting accuracy of the workpiece 31 was measured.
  • In each case, the mounting accuracy was determined while changing the angle of the workpiece 31 to 0°, 90°, 180°, and 270°.
  • FIG. 10 shows the measurement results of the mounting accuracy when the touchdown sensor 119 was used.
  • FIG. 11 shows the measurement results of the mounting accuracy when the calibration jig 18 of the present disclosure was used.
  • As these results show, the robot system 10 of the present disclosure, which detects the height of the end effector 22 using the captured image of the calibration jig 18, can perform height detection with higher accuracy and can therefore improve the mounting accuracy of the workpiece 31.
  • The control device, robot system, and control method disclosed herein can be used, for example, in the field of electronic component mounting.

Abstract

A control device according to the present disclosure is used for a robot equipped with an end effector that picks up workpieces, and comprises a control unit that executes a height detection process in which the end effector is caused to pick up a jig on which reference portions are provided at prescribed intervals, the jig is imaged, and the height of the end effector is detected on the basis of a reference portion contained in the obtained captured image.
PCT/JP2022/040235 2022-10-27 2022-10-27 Control device, robot system, and control method WO2024089852A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/040235 WO2024089852A1 (fr) 2022-10-27 2022-10-27 Control device, robot system, and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/040235 WO2024089852A1 (fr) 2022-10-27 2022-10-27 Control device, robot system, and control method

Publications (1)

Publication Number Publication Date
WO2024089852A1 (fr) 2024-05-02

Family

ID=90830381

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/040235 WO2024089852A1 (fr) Control device, robot system, and control method

Country Status (1)

Country Link
WO (1) WO2024089852A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04114607U (ja) * 1991-03-22 1992-10-09 株式会社明電舎 Robot positioning device
JPH0970781A (ja) * 1995-09-07 1997-03-18 Shinko Electric Co Ltd Three-dimensional position and orientation calibration method for an autonomous traveling robot
JPH10340112A (ja) * 1997-06-06 1998-12-22 Matsushita Electric Ind Co Ltd Robot with automatic calibration function
JP2015182144A (ja) * 2014-03-20 2015-10-22 キヤノン株式会社 Robot system and calibration method of robot system
WO2018173192A1 (fr) * 2017-03-23 2018-09-27 株式会社Fuji Method for determining parallelism of an articulated robot and device for adjusting inclination of an articulated robot


Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22963499

Country of ref document: EP

Kind code of ref document: A1