WO2022118471A1 - Robot operation device, robot operation method, and robot system - Google Patents

Robot operation device, robot operation method, and robot system

Info

Publication number
WO2022118471A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
unit
display
input
movement direction
Application number
PCT/JP2020/045281
Other languages
French (fr)
Japanese (ja)
Inventor
宗 石川
真也 矢頭
重義 稲垣
Original Assignee
株式会社Fuji
Application filed by 株式会社Fuji
Priority to PCT/JP2020/045281
Priority to JP2022566744A
Publication of WO2022118471A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/06Control stands, e.g. consoles, switchboards
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators


Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

This robot operation device comprises: an input unit for inputting a movement direction; an instruction unit that instructs a robot to move in the movement direction input to the input unit; a detection unit that detects at least one of the position and the direction of the device relative to a reference position of the robot; a display unit that displays a robot model; and a display control unit that controls the display unit so that the robot model is displayed in the posture in which the robot is seen from at least one of the position and the direction detected by the detection unit.

Description

Robot operation device, robot operation method, and robot system
This specification discloses a robot operation device, a robot operation method, and a robot system.
Conventionally, an information processing device has been proposed in which a direction instruction is given by a user's operation on an operation panel and a movable part of a robot is moved in the direction indicated by the instruction (see, for example, Patent Document 1). This information processing device first acquires an operation signal (a movement direction instruction) from the operation input unit of the operation panel. Next, the device acquires the position of the operator in the workspace coordinate system from an operator position measurement unit (an infrared sensor). The device then calculates, from the acquired operator position in the workspace coordinate system, a movement vector for the direction specified by the movement direction instruction and converts it into a movement vector in the base coordinate system. Finally, the device adds the converted movement vector to the current position of the robot to obtain the destination position in the base coordinate system, and outputs a signal indicating the obtained destination position to the robot.
Patent Document 1: Japanese Unexamined Patent Publication No. 2009-119579
As a robot operation device, it is also conceivable to equip the operation panel with a display device and to display on it, as a robot model, how the robot moves in response to the user's (operator's) movement direction instructions. However, how the robot appears to the user depends on the operator's position and orientation relative to the robot. Simply displaying a robot model on the operation panel therefore may not make the posture of the robot as seen by the operator match the posture of the robot model shown on the screen, which makes intuitive operation difficult.
The main object of the present disclosure is to enable an operator to operate a robot more intuitively.
The present disclosure employs the following means to achieve the above main object.
The gist of the robot operation device of the present disclosure is a robot operation device for operating a robot, comprising:
an input unit for inputting a movement direction;
an instruction unit that instructs the robot to move in the movement direction input to the input unit;
a detection unit that detects at least one of the position and the direction of the device relative to a reference position of the robot;
a display unit that displays a robot model; and
a display control unit that controls the display unit so that the robot model is displayed in the posture in which the robot is seen from at least one of the position and the direction detected by the detection unit.
This robot operation device of the present disclosure detects at least one of its position and its direction relative to the reference position of the robot, and displays the robot model in the posture in which the robot is seen from at least one of the detected position and direction. The posture of the robot as seen by the operator can thus be made to correspond to (match) the posture of the robot model shown on the display unit, so the operator can operate the robot more intuitively while using the operation device and looking at the displayed robot model.
FIG. 1 is an external perspective view of the robot system.
FIG. 2 is a side view of the robot body.
FIG. 3 is a block diagram showing the electrical connections among the robot body, the robot control device, and the robot operation device.
FIG. 4 is an explanatory diagram showing an example of the image display of the display unit.
FIG. 5 is an explanatory diagram illustrating the base coordinate system (robot coordinate system).
FIG. 6 is a flowchart showing an example of the display control process.
FIG. 7 is an explanatory diagram showing the coordinate conversion.
FIG. 8 is an explanatory diagram showing the posture of the robot body as seen from the robot operation device (controller coordinate system).
FIG. 9 is an explanatory diagram showing a display example of the robot model and the movement instruction buttons at the position and in the direction of the robot operation device of FIG. 8.
FIG. 10 is a flowchart showing an example of the movement instruction transmission process.
FIG. 11 is an explanatory diagram showing a case where the robot model is displayed on the display unit.
FIG. 12 is an explanatory diagram showing a case where the robot model is not displayed on the display unit.
Next, a mode for carrying out the present disclosure will be described with reference to the drawings. FIG. 1 is an external perspective view of the robot system. FIG. 2 is a side view of the robot body. FIG. 3 is a block diagram showing the electrical connections among the robot body, the robot control device, and the robot operation device.
As shown in FIG. 1, the robot system 1 includes a robot 10, which comprises a robot body 20 and a robot control device 70 (see FIG. 3), and a robot operation device 100 that operates the robot according to the user's operation input. The robot operation device 100 is used, for example, as a teaching operation device that teaches operation contents to the robot 10 by operating the robot 10 and recording the motions obtained by that operation.
The robot body 20 includes a first arm 21, a second arm 22, a base 25, a first arm drive device 35, a second arm drive device 36, a posture holding device 37, an elevating device 40, and a rotary three-axis mechanism 50. The first arm 21, the second arm 22, and the rotary three-axis mechanism 50 may collectively be referred to simply as the arm.
The base end of the first arm 21 is connected to an elevating member 26 via a first joint shaft 31. The first arm drive device 35 includes a motor 35a and an encoder 35b. The rotation shaft of the motor 35a is connected to the first joint shaft 31 via a speed reducer (not shown). The first joint shaft 31 extends in the vertical direction (Z-axis direction), and the first arm drive device 35 drives the first joint shaft 31 with the motor 35a to rotate the first arm 21 along a horizontal plane (the XY plane) about the first joint shaft 31 (horizontal swiveling). The encoder 35b is attached to the rotation shaft of the motor 35a and is configured as a rotary encoder that detects the rotational displacement of the motor 35a.
The base end of the second arm 22 is connected to the tip of the first arm 21 via a second joint shaft 32. The second arm drive device 36 includes a motor 36a and an encoder 36b. The rotation shaft of the motor 36a is connected to the second joint shaft 32 via a speed reducer (not shown). The second joint shaft 32 extends in a direction parallel to the first joint shaft 31 (Z-axis direction), and the second arm drive device 36 rotationally drives the second joint shaft 32 with the motor 36a to rotate the second arm 22 along the horizontal plane about the second joint shaft 32 (horizontal swiveling). The encoder 36b is attached to the rotation shaft of the motor 36a and is configured as a rotary encoder that detects the rotational displacement of the motor 36a.
An elevating device 40 is installed on the base 25. As shown in FIGS. 1 and 2, the elevating device 40 includes a slider 41 fixed to the elevating member 26; a guide member 42 that is fixed to the base 25, extends in the vertical direction (Z-axis direction), and guides the movement of the slider 41; a ball screw shaft 43 (elevating shaft) that extends in the vertical direction and is screwed into a ball screw nut (not shown) fixed to the slider 41; a motor 44 that rotationally drives the ball screw shaft 43; and an encoder 45 (see FIG. 3). The elevating device 40 rotates the ball screw shaft 43 with the motor 44 to raise and lower the elevating member 26 fixed to the slider 41 along the guide member 42. As the elevating member 26 rises and falls, the arm (the first arm 21, the second arm 22, and the rotary three-axis mechanism 50) rises and falls with it. The encoder 45 is configured as a linear encoder that detects the elevating position of the slider 41 (elevating member 26).
The rotary three-axis mechanism 50 is connected to the tip of the second arm 22 via a posture holding shaft 33 extending in the vertical direction. The rotary three-axis mechanism 50 includes a first rotation shaft 51, a second rotation shaft 52, and a third rotation shaft 53, which are mutually orthogonal; a first rotating device 55 that rotates the first rotation shaft 51; a second rotating device 56 that rotates the second rotation shaft 52; and a third rotating device 57 that rotates the third rotation shaft 53. The first rotation shaft 51 is supported orthogonally to the posture holding shaft 33. The second rotation shaft 52 is supported orthogonally to the first rotation shaft 51. The third rotation shaft 53 is supported orthogonally to the second rotation shaft 52. The first rotating device 55 has a motor 55a that rotationally drives the first rotation shaft 51 and an encoder 55b that is attached to the rotation shaft of the motor 55a and detects the rotational displacement of the motor 55a. The second rotating device 56 has a motor 56a that rotationally drives the second rotation shaft 52 and an encoder 56b that is attached to the rotation shaft of the motor 56a and detects the rotational displacement of the motor 56a. The third rotating device 57 has a motor 57a that rotationally drives the third rotation shaft 53 and an encoder 57b that is attached to the rotation shaft of the motor 57a and detects the rotational displacement of the motor 57a. An end effector serving as the hand of the robot body 20 is attached to the third rotation shaft 53.
The robot body 20 of the present embodiment can move its hand to an arbitrary position in an arbitrary posture by combining translational motion in three directions (the X-axis, Y-axis, and Z-axis directions), produced by the first arm drive device 35, the second arm drive device 36, and the elevating device 40, with rotational motion in three directions (pitching about the X axis, rolling about the Y axis, and yawing about the Z axis), produced by the rotary three-axis mechanism 50.
The posture holding device 37 holds the posture of the rotary three-axis mechanism 50 (the direction of the first rotation shaft 51) in a fixed orientation regardless of the postures of the first arm 21 and the second arm 22. The posture holding device 37 includes a motor 37a and an encoder 37b. The rotation shaft of the motor 37a is connected to the posture holding shaft 33 via a speed reducer (not shown). The posture holding device 37 sets a target rotation angle for the posture holding shaft 33 based on the rotation angle of the first joint shaft 31 and the rotation angle of the second joint shaft 32 so that the axial direction of the first rotation shaft 51 is always the left-right direction (X-axis direction), and drives the motor 37a so that the posture holding shaft 33 reaches the target rotation angle. This makes it possible to control the three-direction translational motion and the three-direction rotational motion independently of each other, which simplifies control.
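As an illustration only (the patent does not give the formula): if the two horizontal joint rotations add directly to the heading of the second arm's tip, the holding shaft would counter-rotate by their sum. The sign convention in this sketch is an assumption:

```python
def posture_holding_target(theta1: float, theta2: float) -> float:
    """Hypothetical target angle for the posture holding shaft 33.

    theta1, theta2: rotation angles of joint shafts 31 and 32 (radians).
    The two horizontal joint rotations add to the heading of the second
    arm's tip, so counter-rotating by their sum keeps the first rotation
    shaft 51 pointing in the fixed X-axis direction.
    """
    return -(theta1 + theta2)
```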
Although not shown, the robot control device 70 is configured as a microprocessor centered on a CPU. Detection signals from the encoders 35b, 36b, 37b, 45, 55b, 56b, and 57b are input to the robot control device 70, and the robot control device 70 outputs drive signals to the motors 35a, 36a, 37a, 44, 55a, 56a, and 57a. The robot control device 70 is also connected to the robot operation device 100 via a cable 11.
The robot body 20 configured in this way operates as follows. First, the robot control device 70 acquires a target position for the hand of the arm in the base coordinate system (X, Y, Z) set on the base 25. Next, the robot control device 70 sets target rotation angles for the joint shafts 31, 32, and so on by solving the inverse kinematics for the acquired target position. The robot control device 70 then sets torque commands by feedback control so that the rotation angles of the joint shafts 31, 32, etc., detected by the encoders 35b, 36b, etc., match their respective target rotation angles, and controls the motors 35a, 36a, etc., so that torques corresponding to the commands are output.
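In outline, one control cycle could look like the sketch below. The helper names (`inverse_kinematics`, `encoder.angle`, and so on) and the PD feedback law are assumptions for illustration; the patent specifies only that feedback control produces torque commands:

```python
def control_step(robot, target_hand_pose):
    """One control cycle: inverse kinematics, then per-joint torque feedback."""
    # Solve inverse kinematics for the target hand position/posture
    # in the base coordinate system (X, Y, Z).
    target_angles = robot.inverse_kinematics(target_hand_pose)
    for joint, target in zip(robot.joints, target_angles):
        # Feedback: drive the encoder-measured angle toward the target.
        error = target - joint.encoder.angle()
        torque = joint.kp * error - joint.kd * joint.encoder.velocity()
        joint.motor.set_torque(torque)  # torque command to the motor
```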
The robot operation device 100 is a controller for moving the robot body 20 according to the operator's operation input, and includes a display unit 110, an operation input unit 120, a position detection unit 130, a direction detection unit 140, and a processing unit 150.
The display unit 110 is a display device that displays images, such as a liquid crystal display or an organic EL display. The operation input unit 120 is a touch panel that detects operation input on the screen of the display unit 110; the operator inputs various operations by touching operation buttons (software buttons) displayed on the display unit 110. FIG. 4 is an explanatory diagram showing an example of the image display of the display unit. As shown in the figure, the display unit 110 displays a model 111 of the robot body 20 (robot model) and movement instruction buttons 112a to 112d, one per movement direction, for inputting instructions for the movement amount and movement direction of the robot body 20.
The position detection unit 130 detects the position of the robot operation device 100 with respect to the robot body 20 and includes, for example, a camera that images an object and an image processing device that processes the camera's captured image. The position detection unit 130 images the robot body 20 with the camera and processes the captured image with the image processing device to recognize the reference position of the robot body 20, thereby detecting the position of the robot operation device 100 relative to that reference position. In the present embodiment, the reference position is set at the origin O of the base coordinate system (X, Y, Z) set on the base 25, as shown in FIG. 5. The position detection unit 130 detects the position of the robot operation device 100 as a rotation angle α about the X axis and a rotation angle β about the Z axis, with the Y-axis direction of the base coordinate system (X, Y, Z) taken as 0 degrees. An ultrasonic sensor or the like may be used as the position detection unit 130 instead of a camera.
The direction detection unit 140 detects the direction (orientation) of the robot operation device 100 with respect to the reference position of the robot body 20; it may, for example, use a configuration comprising the camera and image processing device described above, or use a gyro sensor. The direction detection unit 140 detects the direction of the robot operation device 100 as a rotation angle γ about the X axis and a rotation angle δ about the Z axis, with the Y-axis direction of the base coordinate system (X, Y, Z) taken as 0 degrees.
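For illustration, the two position angles could be recovered from a detected 3-D position as in the sketch below. The sign conventions are assumptions chosen to be consistent with "the Y-axis direction taken as 0 degrees"; the direction angles γ and δ would use the same convention:

```python
import math

def position_angles(x: float, y: float, z: float):
    """Angles (alpha, beta) of the device position relative to the origin O,
    measured from the +Y direction of the base coordinate system:
    beta about the Z axis (azimuth in the XY plane) and
    alpha about the X axis (elevation out of the XY plane)."""
    beta = math.atan2(x, y)                  # 0 when the device is on the +Y axis
    alpha = math.atan2(z, math.hypot(x, y))  # 0 when the device is in the XY plane
    return alpha, beta
```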
Although not shown, the processing unit 150 is configured as a microprocessor centered on a CPU. Operation signals from the operation input unit 120, detection signals from the position detection unit 130, detection signals from the direction detection unit 140, and the like are input to the processing unit 150, and the processing unit 150 outputs display signals and the like to the display unit 110. The processing unit 150 is also communicably connected to the robot control device 70 via the cable 11, and the two exchange data and control signals.
Next, the operation of the robot system 1 of the present embodiment configured as described above will be explained, in particular the operation of moving the robot body 20 based on operation input to the robot operation device 100. FIG. 6 is a flowchart showing an example of the display control process executed by the processing unit 150. This process is executed repeatedly at predetermined time intervals.
When the display control process is executed, the processing unit 150 first acquires the rotation angle α about the X axis and the rotation angle β about the Z axis in the base coordinate system (X, Y, Z) from the position detection unit 130 (step S100), and acquires the rotation angle γ about the X axis and the rotation angle δ about the Z axis in the base coordinate system (X, Y, Z) from the direction detection unit 140 (step S110). Next, the processing unit 150 sets a rotation matrix Rαβ by the following equation (1) based on the rotation angles α and β acquired in step S100 (step S120), and sets a rotation matrix Rγδ by the following equation (2) based on the rotation angles γ and δ acquired in step S110 (step S130). The processing unit 150 then rotates the base coordinate system (X, Y, Z) using the rotation matrices Rαβ and Rγδ, converting the coordinate system from the base coordinate system (X, Y, Z) into the controller coordinate system (X', Y', Z') (step S140). FIG. 7 is an explanatory diagram showing the coordinate conversion; the example in the figure shows the base coordinate system (X, Y, Z) rotated about the Z axis by the rotation angles β and δ. As shown, the controller coordinate system (X', Y', Z') is a coordinate system in which the Z' axis coincides with the Z axis of the base coordinate system (X, Y, Z) and the X' and Y' axes are rotated about that Z axis by the sum of the rotation angle β and the rotation angle δ.
Rαβ = RZ(β)·RX(α)   (1)
Rγδ = RZ(δ)·RX(γ)   (2)

where RX(·) and RZ(·) denote the elementary rotation matrices about the X axis and the Z axis, respectively.
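A minimal sketch of steps S120 to S140, assuming the standard elementary rotation matrices and the composition order shown in equations (1) and (2) above (in the original publication the equations appear as images, so that order is an assumption):

```python
import numpy as np

def rot_x(a: float) -> np.ndarray:
    """Elementary rotation about the X axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def rot_z(a: float) -> np.ndarray:
    """Elementary rotation about the Z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[  c,  -s, 0.0],
                     [  s,   c, 0.0],
                     [0.0, 0.0, 1.0]])

def base_to_controller(p_base, alpha, beta, gamma, delta) -> np.ndarray:
    """Steps S120-S140: build the rotation matrices and express a
    base-frame point in the controller coordinate system (X', Y', Z')."""
    r_ab = rot_z(beta) @ rot_x(alpha)    # Eq. (1), assumed form
    r_gd = rot_z(delta) @ rot_x(gamma)   # Eq. (2), assumed form
    # Rotating the coordinate system by R means point coordinates
    # transform by R^T (the inverse of a rotation matrix is its transpose).
    return (r_ab @ r_gd).T @ np.asarray(p_base, dtype=float)
```

With α = γ = 0 this reduces to a single rotation about the Z axis by β + δ, matching the example of FIG. 7.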
After performing the coordinate conversion in this way, the processing unit 150 displays the robot model on the display unit 110 (step S150), displays the movement instruction buttons (step S160), and ends the display control process. FIG. 8 is an explanatory diagram showing the posture of the robot body as seen from the robot operation device (controller coordinate system). FIG. 9 is an explanatory diagram showing a display example of the robot model and the movement instruction buttons at the position and in the direction of the robot operation device of FIG. 8. The robot model 111 displayed on the display unit 110 is an image (projected image) of the robot body 20 projected onto the X'Z' plane of the controller coordinate system (X', Y', Z'). The plurality of movement instruction buttons 112a to 112f displayed on the display unit 110 are arrow images for six directions corresponding to the X-axis direction (front-back), the Y-axis direction (left-right), and the Z-axis direction (up-down) of the base coordinate system (X, Y, Z). The arrow direction of each movement instruction button 112a to 112f shows the direction obtained when the corresponding X-axis (front-back), Y-axis (left-right), or Z-axis (up-down) direction of the base coordinate system (X, Y, Z) is projected onto the X'Z' plane of the controller coordinate system (X', Y', Z'). By touching the position of one of the movement instruction buttons (operation input unit 120), the operator can thus instruct the robot body 20 to move in the direction that button indicates among the X-axis (front-back), Y-axis (left-right), and Z-axis (up-down) directions of the base coordinate system (X, Y, Z). Because the robot model 111 is displayed on the display unit 110 in the posture in which the robot body 20 is seen from the robot operation device 100, the posture of the robot body 20 as seen by the operator can be made to match the posture of the robot model 111 shown on the display unit 110. The operator can therefore operate the robot body 20 more intuitively while looking at the robot model 111 displayed on the display unit 110.
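Continuing the sketch, the on-screen arrow direction for each movement instruction button could be obtained by projecting the corresponding base-axis unit vector onto the X'Z' plane; `base_to_controller` is the helper sketched above:

```python
# Base-axis unit vectors: X (front-back), Y (left-right), Z (up-down).
BASE_AXES = {"+X": (1, 0, 0), "-X": (-1, 0, 0),
             "+Y": (0, 1, 0), "-Y": (0, -1, 0),
             "+Z": (0, 0, 1), "-Z": (0, 0, -1)}

def button_arrow_directions(alpha, beta, gamma, delta):
    """2-D on-screen arrow direction for each of the six movement
    instruction buttons 112a-112f: keep the X' (screen horizontal) and
    Z' (screen vertical) components of each transformed base axis."""
    arrows = {}
    for name, v in BASE_AXES.items():
        xp, _, zp = base_to_controller(v, alpha, beta, gamma, delta)
        arrows[name] = (xp, zp)
    return arrows
```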
Next, the operation when the operator presses a movement instruction button displayed on the display unit 110 will be explained. FIG. 10 is a flowchart showing an example of the movement instruction transmission process. This process is executed repeatedly at predetermined time intervals.
When the movement instruction transmission process is executed, the processing unit 150 first determines whether any of the movement instruction buttons 112a to 112f has been operated (step S200). This determination can be made from the coordinates at which the operation input unit 120 detected the operation input. If the processing unit 150 determines that no movement instruction button has been operated, it ends the movement instruction transmission process.
If, on the other hand, the processing unit 150 determines that one of the movement instruction buttons has been operated, it sets the movement direction and the movement amount of the robot body 20 in the controller coordinate system (X', Y', Z') (step S210). The movement direction can be set by determining which of the movement instruction buttons 112a to 112f was operated, and the movement amount can be set based on how long the movement instruction button was pressed. Next, the processing unit 150 sets the inverse matrix Rαβ⁻¹ of the rotation matrix Rαβ set in the display control process described above by the following equation (3), and the inverse matrix Rγδ⁻¹ of the rotation matrix Rγδ set in that process by the following equation (4) (step S220).
[Equations (3) and (4): formula images defining the inverse rotation matrices Rαβ⁻¹ and Rγδ⁻¹ — not reproduced in this text]
Next, the processing unit 150 uses the set inverse matrices Rαβ⁻¹ and Rγδ⁻¹ to convert the movement direction and movement amount instruction in the controller coordinate system (X', Y', Z') into an instruction in the base coordinate system (X, Y, Z) (step S220). The processing unit 150 then transmits the converted movement direction and movement amount instruction in the base coordinate system (X, Y, Z) to the robot control device 70 (step S230) and ends the movement instruction transmission process. Upon receiving the movement direction and movement amount instruction, the robot control device 70 sets a target position for the tip of the arm based on the received instruction and, by the processing described above, controls the corresponding motors so that the tip moves to the set target position. In this way, by operating the robot operating device 100, the operator can move the robot body 20 in any direction within its movable range.
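Steps S210 to S230 can be illustrated with a short sketch. It relies on the well-known property that the inverse of a rotation matrix equals its transpose, which is one plausible reading of equations (3) and (4) (their formula images are omitted above), and it assumes an order of composition that the passage does not state; the function name, the speed gain k, and the units are likewise assumptions:

```python
import numpy as np

def to_base_frame(move_ctrl: np.ndarray,
                  r_alpha_beta: np.ndarray,
                  r_gamma_delta: np.ndarray) -> np.ndarray:
    """Convert a movement vector expressed in the controller frame
    (X', Y', Z') back into the base frame (X, Y, Z).

    r_alpha_beta, r_gamma_delta: the 3x3 rotation matrices set in the
    display control process.
    """
    r_ab_inv = r_alpha_beta.T    # Rαβ⁻¹, cf. equation (3): inverse = transpose
    r_gd_inv = r_gamma_delta.T   # Rγδ⁻¹, cf. equation (4)
    # Undo the two rotations in the reverse of the (assumed) order in which
    # they were applied during the display control process.
    return r_ab_inv @ (r_gd_inv @ move_ctrl)

# Usage sketch: a press on the "+X'" arrow held for t seconds, speed gain k.
k, t = 0.05, 1.2                              # assumed units: m/s and s
move_ctrl = np.array([1.0, 0.0, 0.0]) * k * t
# move_base = to_base_frame(move_ctrl, r_alpha_beta, r_gamma_delta)
```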
Here, the correspondence between the main elements of the embodiment and the main elements of the present disclosure described in the claims will be explained. The operation input unit 120 of the present embodiment corresponds to the input unit of the present disclosure, the processing unit 150 that executes the movement instruction transmission process corresponds to the instruction unit, the display unit 110 corresponds to the display unit, the position detection unit 130 and the direction detection unit 140 correspond to the detection unit, and the processing unit 150 that executes the display control process corresponds to the display control unit.
The present disclosure is in no way limited to the embodiment described above, and it goes without saying that it can be carried out in various forms as long as they belong to the technical scope of the present disclosure.
For example, in the embodiment described above, the processing unit 150 converts the base coordinate system (X, Y, Z) into the controller coordinate system (X', Y', Z') based on the rotation angles α and β from the position detection unit 130 and the rotation angles γ and δ from the direction detection unit 140, and displays the robot model 111 as an image of the robot body 20 projected onto the X'Z' plane. However, the processing unit 150 may instead proceed as follows. First, the processing unit 150 determines whether the rotation angles α and β from the position detection unit 130 substantially match the rotation angles γ and δ from the direction detection unit 140. This determination is a process of judging whether the robot operating device 100 (the operator) is facing the robot body 20. When the processing unit 150 determines that the rotation angles α and β match the rotation angles γ and δ, it converts the base coordinate system (X, Y, Z) into the controller coordinate system (X', Y', Z') based on the rotation angles α and β (or the rotation angles γ and δ), and displays the robot model 111 and the plurality of movement instruction buttons 112a to 112f as an image of the robot body 20 projected onto the X'Z' plane. When the processing unit 150 determines that the rotation angles α and β do not match the rotation angles γ and δ, it displays neither the robot model 111 nor the movement instruction buttons 112a to 112f. Thus, when the robot operating device 100 (the operator) is facing the robot body 20, the processing unit 150 displays the robot model 111 in the posture of the robot body 20 as seen from that direction, as shown in FIG. 11. When the robot operating device 100 (the operator) is not facing the robot body 20, the processing unit 150 displays a message prompting the operator to face the robot body 20, as shown in FIG. 12, without displaying the robot model. This makes it possible to prevent the robot body 20 from moving due to an unintended operation.
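The "substantially match" test in this variant could be realized as an angle comparison with a tolerance. A minimal sketch, assuming angles in degrees and a tolerance of 10 degrees (the patent does not quantify the threshold; the helper names are hypothetical):

```python
ANGLE_TOL_DEG = 10.0  # assumed tolerance for "substantially match"

def wrapped_diff_deg(a: float, b: float) -> float:
    """Smallest absolute difference between two angles, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def operator_facing_robot(alpha: float, beta: float,
                          gamma: float, delta: float) -> bool:
    """True when the device's direction angles (gamma, delta) roughly agree
    with its position angles (alpha, beta), i.e. the operator faces the robot."""
    return (wrapped_diff_deg(alpha, gamma) <= ANGLE_TOL_DEG and
            wrapped_diff_deg(beta, delta) <= ANGLE_TOL_DEG)
```

When operator_facing_robot returns True, the display control process would show the robot model 111 and the buttons 112a to 112f; otherwise it would show the prompting message of FIG. 12.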
In the embodiment described above, the processing unit 150 also changes the display directions of the movement instruction buttons 112a to 112f when the posture of the robot model 111 is changed, but the display directions need not be changed.
In the embodiment described above, the robot operating device 100 includes, as its detection unit, the position detection unit 130 for detecting the position of the robot operating device 100 relative to the robot body 20 and the direction detection unit 140 for detecting the direction of the robot operating device 100 relative to the robot body 20. However, the robot operating device 100 may omit either the position detection unit 130 or the direction detection unit 140.
In the embodiment described above, the robot control device 70 and the robot operating device 100 are communicably connected via the cable 11, but they may instead be connected so as to communicate wirelessly.
The robot operating device of the present disclosure may also be configured as follows. In the robot operating device of the present disclosure, the input unit may be provided in the display unit, and the display control unit may perform a movement direction display in which the movement directions that can be input to the input unit are displayed at corresponding positions of the display unit, the movement direction display being changed according to the posture of the robot model displayed on the display unit. In this way, the operator of the robot can operate the robot more intuitively.
Further, in the robot operating device of the present disclosure, the instruction unit may change the movement direction input to the input unit into a movement direction relative to the reference position of the robot based on at least one of the position and the direction of the robot operating device relative to the reference position of the robot. In this way, by aligning the movement directions that can be input to the input unit with at least one of the position and the direction of the robot operating device relative to the reference position of the robot, the operator can operate the robot more intuitively.
Further, the present disclosure is not limited to the form of the robot operating device described above; it may also take the form of a robot operating method or of a robot system including a robot and the robot operating device.
The present disclosure is applicable to, among others, the manufacturing industry for robots and their operating devices.
1 robot system, 10 robot, 11 cable, 20 robot body, 21 first arm, 22 second arm, 25 base, 26 elevating member, 31 first joint axis, 32 second joint axis, 33 posture holding axis, 35 first arm drive device, 35a motor, 35b encoder, 36 second arm drive device, 36a motor, 36b encoder, 37 posture holding device, 37a motor, 37b encoder, 40 lifting device, 41 slider, 42 guide member, 43 ball screw shaft, 44 motor, 45 encoder, 50 rotary three-axis mechanism, 51 first rotation shaft, 52 second rotation shaft, 53 third rotation shaft, 55 first rotation device, 55a motor, 55b encoder, 56 second rotation device, 56a motor, 56b encoder, 57 third rotation device, 57a motor, 57b encoder, 70 robot control device, 100 robot operating device, 110 display unit, 111 robot model, 112a to 112f movement instruction buttons, 120 operation input unit, 130 position detection unit, 140 direction detection unit, 150 processing unit.

Claims (5)

1.  A robot operating device for operating a robot, the robot operating device comprising:
     an input unit configured to receive an input of a movement direction;
     an instruction unit configured to instruct the robot to move in the movement direction input to the input unit;
     a detection unit configured to detect at least one of a position and a direction relative to a reference position of the robot;
     a display unit configured to display a robot model; and
     a display control unit configured to control the display unit so that the robot model is displayed in a posture in which the robot is viewed from at least one of the position and the direction detected by the detection unit.
2.  The robot operating device according to claim 1, wherein
     the input unit is provided in the display unit, and
     the display control unit performs a movement direction display in which movement directions that can be input to the input unit are displayed at corresponding positions of the display unit, and changes the movement direction display according to the posture of the robot model displayed on the display unit.
3.  The robot operating device according to claim 1 or 2, wherein
     the instruction unit changes the movement direction input to the input unit based on at least one of the position and the direction of the robot operating device relative to the reference position of the robot.
4.  A robot operating method for operating a robot by receiving an input of a movement direction and instructing the robot to move in the input movement direction, wherein,
     in operating the robot, at least one of a position and a direction relative to a reference position of the robot is detected, and a robot model is displayed in a posture in which the robot is viewed from at least one of the detected position and direction.
5.  A robot system comprising:
     a robot; and
     a robot operating device including an input unit configured to receive an input of a movement direction, an instruction unit configured to instruct the robot to move in the movement direction input to the input unit, a detection unit configured to detect at least one of a position and a direction relative to a reference position of the robot, a display unit configured to display a robot model, and a display control unit configured to control the display unit so that the robot model is displayed in a posture in which the robot is viewed from at least one of the position and the direction detected by the detection unit.
PCT/JP2020/045281 2020-12-04 2020-12-04 Robot operation device, robot operation method, and robot system WO2022118471A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2020/045281 WO2022118471A1 (en) 2020-12-04 2020-12-04 Robot operation device, robot operation method, and robot system
JP2022566744A JPWO2022118471A1 (en) 2020-12-04 2020-12-04

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/045281 WO2022118471A1 (en) 2020-12-04 2020-12-04 Robot operation device, robot operation method, and robot system

Publications (1)

Publication Number Publication Date
WO2022118471A1 true WO2022118471A1 (en) 2022-06-09

Family

ID=81854097

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/045281 WO2022118471A1 (en) 2020-12-04 2020-12-04 Robot operation device, robot operation method, and robot system

Country Status (2)

Country Link
JP (1) JPWO2022118471A1 (en)
WO (1) WO2022118471A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011189431A (en) * 2010-03-12 2011-09-29 Denso Wave Inc Robot system
US20160346921A1 (en) * 2014-04-04 2016-12-01 Abb Schwelz Ag Portable apparatus for controlling robot and method thereof
JP2018183845A (en) * 2017-04-26 2018-11-22 ファナック株式会社 Operation device, robot system, and operation method, for operating robot
JP2019198926A (en) * 2018-05-16 2019-11-21 株式会社安川電機 Device for operation, control system, control method and program

Also Published As

Publication number Publication date
JPWO2022118471A1 (en) 2022-06-09

Similar Documents

Publication Publication Date Title
US11197730B2 (en) Manipulator system
JP6255724B2 (en) Robot and robot operation method
KR20180059888A (en) Robot teaching method and robot arm control device
WO2020090809A1 (en) External input device, robot system, control method for robot system, control program, and recording medium
CN114905487B (en) Teaching device, teaching method, and recording medium
US20220176567A1 (en) Robot instructing apparatus, teaching pendant, and method of instructing a robot
EP4046756A1 (en) Multi-joint robot
US10960542B2 (en) Control device and robot system
CN114055460B (en) Teaching method and robot system
JP4277825B2 (en) Robot teaching system
WO2022118471A1 (en) Robot operation device, robot operation method, and robot system
WO2017175340A1 (en) Optimization device and vertically articulated robot provided with same
JP2017052031A (en) Robot operation device and robot operation method
WO2018167855A1 (en) Operation screen display device
US20220250236A1 (en) Teaching device, teaching method, and recording medium
KR101765052B1 (en) Operating apparatus having flexible movement and control method thereof
US11738469B2 (en) Control apparatus, robot system, and control method
WO2023162225A1 (en) Robot control device and multijoint robot
US20210154845A1 (en) Teaching apparatus, control method, and teaching program
KR101545918B1 (en) Smart phone capable of teaching manipulator and method for teaching manipulator intuitively using the same
CN112643683B (en) Teaching method
US20230001567A1 (en) Teaching Support Device
JP7490349B2 (en) Input device, control method for input device, robot system, method for manufacturing article using robot system, control program and recording medium
JP6842668B2 (en) Input system for remote control

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20964322

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022566744

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20964322

Country of ref document: EP

Kind code of ref document: A1