WO2020067234A1 - Dispositif de commande - Google Patents

Dispositif de commande (Control device)

Info

Publication number
WO2020067234A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
robot
arm
joints
Prior art date
Application number
PCT/JP2019/037749
Other languages
English (en)
Japanese (ja)
Inventor
浪越 孝宏
Original Assignee
日本電産株式会社 (Nidec Corporation)
Priority date
Filing date
Publication date
Application filed by 日本電産株式会社 (Nidec Corporation)
Publication of WO2020067234A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • The present invention relates to a control device that controls an articulated robot.
  • Japanese Patent Laid-Open Publication No. 2008-87074 discloses a technique in which a workpiece is picked up by a robot based on recognition and measurement of the workpiece by a visual sensor.
  • An object of the present invention is to provide a control device that can determine that one of the joints or arms of such a robot has been replaced.
  • According to an aspect of the present invention, there is provided a control device for controlling an articulated robot having a plurality of joints and an arm, the control device comprising: an imaging unit attached to a predetermined position of the articulated robot to capture a surrounding image; a control unit that controls the articulated robot to a predetermined basic posture; a comparison unit that compares the image captured by the imaging unit when the articulated robot is controlled to the predetermined basic posture with a predetermined image; and a determination unit that determines, according to the comparison result of the comparison unit, whether any of the plurality of joints and the arm has been replaced.
  • FIG. 1 is a perspective view illustrating a configuration example of a robot control system using a control device according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a configuration example of the robot control system.
  • FIG. 3 is a diagram illustrating a specific example of the marker.
  • FIG. 4 is a flowchart illustrating an example of the process at robot startup.
  • FIG. 5 is a perspective view illustrating an example of a basic posture of the robot.
  • FIG. 6 is a diagram illustrating an example of an image acquired from the imaging signal of the camera in the basic posture.
  • FIG. 7 is a conceptual diagram illustrating an example of a state in which a part of the robot arm has been replaced.
  • FIG. 8 is a perspective view illustrating the basic posture in a state in which a part of the arm has been replaced.
  • FIG. 9 is a diagram illustrating an example of an image acquired from the imaging signal of the camera in the basic posture in a state in which a part of the arm has been replaced.
  • FIG. 10 is a flowchart illustrating an example of the process at robot startup according to a second embodiment.
  • FIG. 11 is a diagram conceptually showing DH parameters.
  • FIG. 12 is a diagram illustrating an example of DH parameters stored in the setting holding unit.
  • FIG. 13 is a diagram illustrating an example of the marker image according to a third embodiment.
  • FIG. 14 is a diagram illustrating an example of an image acquired from the imaging signal of the camera in the basic posture in the third embodiment.
  • FIG. 15 is a diagram illustrating an example of an image acquired from the imaging signal of the camera in the basic posture in a state in which a part of the arm has been replaced in the third embodiment.
  • FIG. 16 is a perspective view illustrating a configuration example of the robot control system in a modification.
  • FIG. 17 is a diagram illustrating an example of an image acquired from the imaging signal of the camera in the basic posture in the modification.
  • FIG. 18 is a diagram illustrating an example of an image acquired from the imaging signal of the camera in the basic posture in a state in which a part of the arm has been replaced in the modification.
  • FIGS. 19 and 20 are perspective views illustrating configuration examples of a modification provided with a plurality of markers.
  • FIG. 1 is a perspective view illustrating a configuration example of a robot control system using the control device according to the first embodiment.
  • The robot control system includes a multi-joint robot (hereinafter simply referred to as a robot) 1 having a plurality of joints A and an arm B, and a controller 2 that controls the operation of the robot 1 in accordance with instructions from an external device.
  • The camera 3 is attached to, for example, a hand portion at the tip of the robot 1.
  • The robot control system also includes a marker (image display unit) 4 that displays a reference image for detecting the state of the robot 1.
  • The joints A of the robot 1 include a plurality of joints 31a, 31b, 32a, 32b, 33a, and 33b.
  • The joint 31a is rotatably attached to the base 30.
  • The arm section B includes a plurality of arms 35 and 36.
  • The robot 1 is modularized so that the joints 31a to 33b and parts of the arms 35 and 36 can be replaced depending on the application.
  • The controller 2 includes an image acquisition unit 21 that acquires an image from the video signal captured by the camera 3, an image analysis unit 22 that analyzes the image acquired by the image acquisition unit 21, a setting holding unit 23 that holds setting information for controlling the robot 1, and a posture calculation unit 24 that calculates the posture of the robot 1 according to an instruction signal from an external device.
  • The controller 2 further includes a drive instruction unit 25 that drives the robot 1 in accordance with the posture calculated by the posture calculation unit 24, a control unit 26 that controls the operation of the entire controller 2, and a warning instruction unit 27 that outputs a warning signal for displaying a warning.
  • The warning display according to the warning signal from the warning instruction unit 27 may be any of sound, light, and text.
  • The controller 2 can control the operation of the robot 1 by analyzing the image acquired by the image acquisition unit 21 with the image analysis unit 22.
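  • The unit structure described above can be expressed compactly in code. The following is an illustrative Python sketch, not code from the publication; all class and method names are hypothetical.

```python
# Hypothetical structural sketch of the controller units described above.
class ImageAcquisitionUnit:
    def acquire(self):
        """Grab one frame from the camera's video signal (unit 21)."""
        ...

class ImageAnalysisUnit:
    def differs(self, image, reference) -> bool:
        """Return True if `image` deviates from `reference` (unit 22)."""
        ...

class Controller:
    """Mirrors controller 2: each attribute corresponds to one unit."""
    def __init__(self, acquisition, analysis, settings, posture, drive, warning):
        self.image_acquisition_unit = acquisition    # unit 21
        self.image_analysis_unit = analysis          # unit 22
        self.setting_holding_unit = settings         # unit 23 (e.g. DH parameters)
        self.posture_calculation_unit = posture      # unit 24
        self.drive_instruction_unit = drive          # unit 25
        self.warning_instruction_unit = warning      # unit 27 (sound/light/text)
```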
  • The robot 1 includes motors 11, 12, 13, 14, 15, and 16 for driving the joints 31a to 33b, and a motor driver 17 for driving the motors 11 to 16.
  • The state of each of the motors 11 to 16 is supplied to the control unit 26 via, for example, a transmission path connected in a daisy chain.
  • A scale image is displayed on the surface of the marker 4, for example, as shown in FIG. 3.
  • This image may be printed, or it may be engraved, as long as it can be imaged by the camera 3.
  • The marker 4 is provided at a position that falls within the angle of view of the camera 3 when the robot 1 is controlled to the basic posture (a posture for acquiring a surrounding image when the robot 1 is activated).
  • When the robot 1 is started, the control unit 26 of the controller 2 executes the control process shown in FIG. 4.
  • First, in S1, the robot 1 is controlled to a predetermined basic posture.
  • The basic posture is, for example, a posture in which an image of the marker 4 can be captured by the camera 3, as shown in FIG. 5.
  • The control unit 26 supplies information on the basic posture (such as the rotation angles of the joints 31a to 33b) to the posture calculation unit 24 to control the robot 1 to the basic posture.
  • The posture calculation unit 24 calculates the posture of the robot 1 according to the instruction from the control unit 26 and causes the drive instruction unit 25 to generate a drive instruction.
  • The drive instruction unit 25 generates a drive instruction for the robot 1 according to the instruction from the posture calculation unit 24 and supplies it to the motor driver 17 of the robot 1.
  • The motor driver 17 drives the motors 11 to 16 according to the drive instruction from the drive instruction unit 25. The robot 1 is thereby controlled to the basic posture.
  • In S2, the image acquisition unit 21 acquires an image from the video signal of the camera 3.
  • In S3, the control unit 26 instructs the image analysis unit 22 to analyze the image.
  • The image analysis unit 22 holds, for example, the image captured in the basic posture at the end of the previous operation as the reference image, and compares it with the image captured in the basic posture at the current startup.
  • Alternatively, the image analysis unit 22 may hold the image captured in the basic posture at the previous startup as the reference image and compare it with the image captured in the basic posture at the current startup.
  • In S4, the control unit 26 confirms (determines) whether there is a difference between the image captured in the basic posture at the current startup and the reference image, according to the result of the analysis by the image analysis unit 22. If there is no difference from the reference image, the control unit 26 determines that none of the joints 31a to 33b and the arms 35 and 36 has been replaced, proceeds to S5 to shift to normal operation, and ends the process of FIG. 4. On the other hand, if there is a difference from the reference image, the control unit 26 determines that one of the joints 31a to 33b and the arms 35 and 36 has been replaced, proceeds to S6 to stop the robot 1 in an emergency, and ends the process of FIG. 4.
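  • The S1 to S6 flow above can be sketched as follows. This is a minimal Python sketch, assuming OpenCV is available and using hypothetical robot and camera helpers; the pixel thresholds are assumptions, not values from the publication.

```python
import cv2
import numpy as np

PIXEL_TOLERANCE = 25      # assumed per-pixel noise tolerance
CHANGED_FRACTION = 0.01   # assumed fraction of changed pixels meaning "replaced"

def images_differ(image: np.ndarray, reference: np.ndarray) -> bool:
    """S3/S4: compare the basic-posture image with the reference image."""
    diff = cv2.absdiff(image, reference)
    changed = np.count_nonzero(diff > PIXEL_TOLERANCE)
    return changed / diff.size > CHANGED_FRACTION

def startup_check(robot, camera, reference: np.ndarray) -> bool:
    robot.move_to_basic_posture()          # S1: control to the basic posture
    image = camera.capture()               # S2: acquire an image from the video signal
    if images_differ(image, reference):    # S3/S4: analyze and compare
        robot.emergency_stop()             # S6: a joint or arm was likely replaced
        return False
    return True                            # S5: shift to normal operation
```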
  • In this way, it can be determined that one of the joints 31a to 33b and the arms 35 and 36 has been replaced, based on whether there is a difference between the reference image and the image acquired from the video signal of the camera 3 when the robot 1 is controlled to the basic posture.
  • When any of the joints 31a to 33b and the arms 35 and 36 is replaced, the setting information for operating the robot 1 needs to be changed; if it is not properly changed, the operation of the robot 1 may be hindered. For this reason, in this embodiment, the robot 1 is emergency-stopped when any of the joints 31a to 33b and the arms 35 and 36 has been replaced. This notifies the user that the setting information needs to be changed.
  • In addition, a warning signal may be output by the warning instruction unit 27 to cause an external device to display a warning. This makes it possible to reliably notify the user that the setting information needs to be changed.
  • FIG. 6 shows an example of an image acquired from the imaging signal of the camera 3 in the basic posture.
  • In this image, an image 71 of a part of the hand at the tip of the robot 1 and an image 72 of the marker 4 are displayed.
  • The image analysis unit 22 stores this image as the above-described reference image.
  • As shown in FIG. 7, when the arm 36 having the length L1 is replaced with the arm 37 having the length L2, which is shorter than L1, the basic posture of the robot 1 at startup becomes as shown in FIG. 8.
  • In this case, the image acquired from the imaging signal of the camera 3 is, for example, as shown in FIG. 9, and the image 73 of the marker 4 is displayed at a position different from that in FIG. 6.
  • Next, a second embodiment will be described. The robot control system of this embodiment has the same configuration as that of the first embodiment.
  • In the first embodiment, an emergency stop is performed when it is determined from the analysis result of the image analysis unit 22 that one of the joints 31a to 33b and the arms 35 and 36 has been replaced.
  • In this embodiment, by contrast, the setting information for operating the robot 1 is automatically changed according to the analysis result of the image analysis unit 22.
  • FIG. 10 is a flowchart showing the process at startup of the robot 1 in this embodiment.
  • The processing from S1 to S5 is the same as in FIG. 4, and a description thereof is omitted.
  • If there is a difference from the reference image in S4, the control unit 26 determines that one of the joints 31a to 33b and the arms 35 and 36 has been replaced, and proceeds to S11.
  • In S11, the control unit 26 calculates, for example, DH parameters as the setting information for controlling the robot 1.
  • The control unit 26 supplies the calculated DH parameters to the setting holding unit 23 and sets them as the DH parameters used for controlling the robot 1 thereafter. It then proceeds to S5, shifts to normal operation, and ends the process of FIG. 10.
  • FIG. 11 is a diagram conceptually showing DH parameters.
  • DH is an abbreviation for Denavit-Hartenberg.
  • The DH parameters are used, for example, to obtain the position of the hand at the tip of the robot 1 from the angles of the joints 31a to 33b and the lengths of the arms 35 and 36, or, conversely, to obtain the angle of each of the joints 31a to 33b when the position of the hand is designated.
  • The DH parameters are stored in the setting holding unit 23 as tabular data, for example, as shown in FIG. 12. Since d2 and d3 in FIG. 11 are offset by the same amount, FIG. 12 shows an example in which their values are treated as 0.0 for convenience.
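  • The role of the DH parameters can be illustrated with standard forward kinematics. The sketch below is the textbook Denavit-Hartenberg transform in Python, not code from the publication; each table row (theta, d, a, alpha) produces one homogeneous transform, and their product gives the pose of the hand at the tip.

```python
import numpy as np

def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Homogeneous transform of one joint from its DH row."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows) -> np.ndarray:
    """Chain the per-joint transforms; dh_rows is a list of (theta, d, a, alpha)."""
    pose = np.eye(4)
    for row in dh_rows:
        pose = pose @ dh_transform(*row)
    return pose  # 4x4 pose of the hand at the robot tip
```

  • Replacing the arm 36 with the shorter arm 37, for example, changes the length entry of the corresponding row, which is why the hand pose, and hence the camera image, shifts in the basic posture.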
  • In this embodiment, when one of the joints 31a to 33b and the arms 35 and 36 is replaced, the setting information for controlling the robot 1 can be automatically calculated and set. For this reason, appropriate setting information can be set without stopping the robot 1 in an emergency.
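  • How the new setting information could be derived from the image is not spelled out above. One heavily simplified possibility, assuming a fronto-parallel marker of known physical size and a camera shift parallel to the image plane, is to convert the marker's pixel displacement into meters and apply it to the affected length entry; the helper below is an assumption for illustration only.

```python
import numpy as np

def estimate_length_change(ref_centroid_px, cur_centroid_px,
                           marker_width_px: float, marker_width_m: float) -> float:
    """Camera translation in meters implied by the marker's pixel shift."""
    meters_per_pixel = marker_width_m / marker_width_px
    shift_px = np.linalg.norm(np.subtract(cur_centroid_px, ref_centroid_px))
    return shift_px * meters_per_pixel

# Hypothetical use: shorten the affected DH length entry accordingly, e.g.
# dh_table[arm_row]["a"] -= estimate_length_change((320, 240), (280, 240), 100, 0.05)
```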
  • Next, a third embodiment will be described. The robot control system is configured similarly to the first and second embodiments described above.
  • In the first and second embodiments, the image of the marker 4 is the scale image shown in FIG. 3. In this embodiment, for example, a two-dimensional image pattern as shown in FIG. 13 is used instead.
  • When the reference image is in the state shown in FIG. 14, for example, and the arm 36 is replaced with the arm 37, the image acquired from the video signal of the camera 3 in the basic posture becomes as shown in FIG. 15.
  • By comparing the image in the basic posture with the reference image in the image analysis unit 22, it can be determined whether one of the joints 31a to 33b and the arms 35 and 36 has been replaced, as in the first embodiment. Further, from the change in the display positions of the images 75, 76, 77, and 78 of the marker 4, the DH parameters after the replacement can be automatically calculated and set, as in the second embodiment.
  • The image of the marker 4 is not limited to the image shown in FIG. 13; any detectable image, such as a two-dimensional barcode, may be used.
  • When a two-dimensional barcode image is used, information indicating a position may be encoded in the barcode image, for example. This makes it possible to easily detect a change in the position of the camera 3 in the basic posture.
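  • For example, a QR-style marker could carry its own position. The sketch below uses OpenCV's QRCodeDetector; the 'x,y,z' payload format is an assumption for illustration, not something specified by the publication.

```python
import cv2

def read_marker_position(image):
    """Decode an assumed 'x,y,z' (meters) payload from a QR-style marker."""
    detector = cv2.QRCodeDetector()
    payload, corners, _ = detector.detectAndDecode(image)
    if not payload:
        return None  # no marker found in the basic-posture image
    x, y, z = (float(v) for v in payload.split(","))
    return (x, y, z), corners  # encoded position plus image location of the marker
```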
  • In the embodiments described above, the marker 4 is provided, and whether one of the joints 31a to 33b and the arms 35 and 36 has been replaced is determined based on the difference between the reference image and the image acquired from the video signal of the camera 3 in the basic posture.
  • Alternatively, the marker 4 may be omitted and, for example, as shown in FIG. 16, it may be determined whether one of the joints 31a to 33b and the arms 35 and 36 has been replaced by using an image of the surroundings of the robot 1. In this case, the determination may use feature points and the like in the surrounding image.
  • The image acquired from the video signal of the camera 3 in the basic posture shown in FIG. 16 includes, for example, an image 81 of a surrounding object 80, as shown in FIG. 17.
  • In this example, the upper-left corner of the image 81 is used as a feature point 82.
  • An image acquired from the video signal of the camera 3 in the basic posture at startup after the arm 36 has been replaced with the arm 37 is, for example, as shown in FIG. 18.
  • The position of the upper-left feature point 84 in the image 83 of the object 80 changes according to the difference between the lengths of the exchanged arms 36 and 37.
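  • This marker-less variant can be sketched with off-the-shelf feature matching. The following assumes OpenCV's ORB features; thresholding the mean keypoint shift is an illustration, not the publication's stated method.

```python
import cv2
import numpy as np

def mean_feature_shift(reference: np.ndarray, image: np.ndarray) -> float:
    """Average pixel displacement of matched feature points between two images."""
    orb = cv2.ORB_create()
    kp_ref, des_ref = orb.detectAndCompute(reference, None)
    kp_img, des_img = orb.detectAndCompute(image, None)
    if des_ref is None or des_img is None:
        return float("inf")  # no usable features: treat as changed
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_img)
    if not matches:
        return float("inf")
    shifts = [np.linalg.norm(np.subtract(kp_img[m.trainIdx].pt,
                                         kp_ref[m.queryIdx].pt))
              for m in matches]
    return float(np.mean(shifts))  # pixels; compare against a tolerance
```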
  • The number of basic postures is not limited to one.
  • When a plurality of basic postures are used, for example, as shown in FIGS. 19 and 20, a plurality of markers 4A and 4B are provided at positions within the angle of view of the camera 3 in the respective basic postures. The robot 1 is then controlled to the basic posture shown in FIG. 19 to acquire an image of the marker 4A, and to the basic posture shown in FIG. 20 to acquire an image of the marker 4B.
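  • Checking several basic postures is then just a loop, as in this short sketch (hypothetical helpers; `images_differ` is from the earlier startup-check sketch):

```python
def startup_check_all(robot, camera, posture_references) -> bool:
    """posture_references maps each basic posture to its reference image."""
    for posture, reference in posture_references.items():
        robot.move_to(posture)                           # e.g. FIG. 19, then FIG. 20
        if images_differ(camera.capture(), reference):   # see earlier sketch
            return False  # a joint or arm along this chain was likely replaced
    return True
```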
  • For example, a basic posture may be set for each replaceable joint or arm.
  • DH parameters for each replacement candidate may also be held in advance, and the held DH parameters closest to those obtained from the image may be set in the setting holding unit 23.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a control device that controls a multi-joint robot having multiple joints and an arm, the control device comprising: an imaging unit that is attached to a prescribed position on the multi-joint robot and that captures an image of the surroundings; a control unit that brings the multi-joint robot to a prescribed basic posture; a comparison unit that compares a prescribed image with an image captured by the imaging unit when the multi-joint robot has been brought to the prescribed basic posture; and a determination unit that determines, based on the comparison results of the comparison unit, whether or not any of the arm and the multiple joints has been replaced.
PCT/JP2019/037749 (priority date 2018-09-28, filed 2019-09-26): Dispositif de commande, published as WO2020067234A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018184454 2018-09-28
JP2018-184454 2018-09-28

Publications (1)

Publication Number Publication Date
WO2020067234A1 (fr) 2020-04-02

Family

ID=69949430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/037749 WO2020067234A1 (fr) 2018-09-28 2019-09-26 Dispositif de commande

Country Status (1)

Country Link
WO (1) WO2020067234A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH058185A (ja) * 1991-06-29 1993-01-19 Fanuc Ltd Automatic measurement of the operating error of a robot body
JPH11204996A (ja) * 1998-01-08 1999-07-30 Matsushita Electric Ind Co Ltd Method for inspecting the nozzle mounting state in an electronic component mounting apparatus
JP2007019297A (ja) * 2005-07-08 2007-01-25 Matsushita Electric Ind Co Ltd Electronic component mounting apparatus and mounting method
JP2014148040A (ja) * 2014-05-21 2014-08-21 Seiko Epson Corp Position control method and robot
WO2015197100A1 (fr) * 2014-06-23 2015-12-30 Abb Technology Ltd Method for calibration of a robot and a robot system
WO2018029844A1 (fr) * 2016-08-11 2018-02-15 富士機械製造株式会社 Substrate working machine
WO2018061957A1 (fr) * 2016-09-28 2018-04-05 川崎重工業株式会社 Substrate carrying hand diagnostic system


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19865769; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19865769; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)