WO2023248439A1 - Robot system, robot control device, and robot control program - Google Patents

Robot system, robot control device, and robot control program

Info

Publication number
WO2023248439A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
display screen
image
robot system
image displayed
Prior art date
Application number
PCT/JP2022/025181
Other languages
French (fr)
Japanese (ja)
Inventor
智 青木
Original Assignee
FANUC CORPORATION
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FANUC CORPORATION
Priority to PCT/JP2022/025181 priority Critical patent/WO2023248439A1/en
Priority to TW112121064A priority patent/TW202400375A/en
Publication of WO2023248439A1 publication Critical patent/WO2023248439A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators

Definitions

  • the embodiments mentioned in this application relate to a robot system, a robot control device, and a robot control program.
  • vision systems (robot vision) are applied to industrial robots and collaborative robots, which are made to perform predetermined operations based on the images (videos) acquired by the vision system.
  • an industrial robot will be explained as an example to simplify the description, but the present embodiment is not limited to application to industrial robots and can be widely applied to a variety of robots, including collaborative robots.
  • JP 2006-289531 A, JP 2000-079587 A, and JP 2020-093385 A
  • the image displayed on the display screen of the teaching operation panel is determined by the orientation of the camera attached to the industrial robot; therefore, depending on the orientation of the camera, the direction in which the industrial robot moves as seen by the worker operating the teaching pendant may not intuitively match the direction of movement in the image on the display screen.
  • the problem to be solved by the present invention is to provide a robot system, a robot control device, and a robot control program that eliminate the discomfort between the direction in which the actually operated robot moves and the direction of movement in the image on the display screen, so that the operation can be performed smoothly.
  • according to one embodiment, a robot system is provided that comprises: an image capturing device; a terminal device having a display screen that displays the image captured by the image capturing device; and a robot operated based on the image displayed on the display screen, wherein, when a predetermined part of the robot is operated so as to move in a first direction, a second direction in which a subject in the image displayed on the display screen moves is adjusted based on the first direction.
  • FIG. 1 is a diagram schematically showing an industrial robot system as an example of the robot system according to the present embodiment.
  • FIG. 2 is a diagram (part 1) for explaining the first example of the robot system according to the present embodiment.
  • FIG. 3 is a diagram (part 2) for explaining the first example of the robot system according to the present embodiment.
  • FIG. 4 is a diagram (part 3) for explaining the first example of the robot system according to the present embodiment.
  • FIG. 5 is a diagram (part 1) for explaining the second example of the robot system according to the present embodiment.
  • FIG. 6 is a diagram (part 2) for explaining the second example of the robot system according to the present embodiment.
  • FIG. 7 is a diagram for explaining a first application example of the robot system according to this embodiment.
  • FIG. 8 is a diagram for explaining a second application example of the robot system according to this embodiment.
  • FIG. 9 is a diagram for explaining a third application example of the robot system according to this embodiment.
  • FIG. 10 is a flowchart for explaining an example of processing of the robot control program according to this embodiment.
  • FIG. 11 is a flowchart for explaining another example of the processing of the robot control program according to this embodiment.
  • FIG. 1 is a diagram schematically showing an industrial robot system as an example of the robot system according to the present embodiment.
  • an industrial robot system 100 as an example of a robot system according to the present embodiment includes an industrial robot 1, a robot control device 2, and a teaching pendant (terminal device) 3.
  • a gripping portion (hand portion) 11A is provided at the tip of the arm 11 of the industrial robot 1; the gripping portion 11A is designed to perform predetermined processing, for example, gripping a workpiece (object) 5 placed on a workbench 4 and holding it at a predetermined position.
  • a camera (image capturing device) 12 for photographing the workpiece 5 and the like is attached near the gripping portion 11A of the arm 11, and an image (video) including the workpiece 5 photographed by this camera 12 is output to the robot control device 2.
  • reference numeral 13 indicates a ring illumination (illumination device) provided on the outer periphery of the camera 12, which irradiates the workpiece 5 with light L to obtain a clear image.
  • the robot control device 2 refers to images from the camera 12 and controls the industrial robot 1 based on, for example, a pre-installed program (software program).
  • the teaching pendant 3 includes a display screen 31 and an operation section 32, and is connected to the robot control device 2 by wired or wireless communication.
  • the teaching pendant 3 is used, for example, by an operator (teacher) who operates the operation section 32 while checking the image on the display screen 31, to teach the industrial robot 1, via the robot control device 2, a predetermined operation using the gripping portion 11A.
  • the robot control device 2 not only controls the industrial robot 1 based on commands from the teaching pendant 3, but also processes images displayed on the display screen 31 of the teaching pendant 3, etc., as will be described in detail later.
  • the example of the robot system according to the present embodiment described in detail below is realized by the processing of the robot control device 2, that is, by the robot control program executed by the arithmetic processing unit (CPU) of the robot control device 2.
  • the robot system according to the present embodiment is not limited to an industrial robot system that grips a workpiece 5 with the gripping portion 11A as shown in FIG. 1; it may be a system that performs other processing. Furthermore, as described above, the robot system according to this embodiment is not limited to industrial robot systems and can be widely applied to various robot systems, including collaborative robots.
  • FIGS. 2(a) and 2(b) and FIGS. 4(a) and 4(b) are diagrams for explaining examples of teaching processing in a robot system to which this embodiment is not applied.
  • FIG. 3 is for explaining the processing by the robot control device in the robot system of the first embodiment.
  • a worker (teacher) P operates the operation unit 32 while checking the image on the display screen 31 of the teaching pendant 3, and moves the hand portion 11A of the robot (industrial robot) 1 in the leftward direction DA (from right to left in FIG. 2(a)).
  • an image (video) captured by the camera 12 and input via the robot control device 2 is displayed on the display screen 31 of the teaching pendant 3.
  • the image displayed on the display screen 31 of the teaching pendant 3 changes depending on, for example, the orientation of the camera 12. Specifically, as shown in FIG. 2(b), even if the worker P actually moves the hand portion 11A of the robot 1 in the leftward direction DA in real space, the subject (for example, the workpiece W (5)) moves upward in the direction Db (from bottom to top in FIG. 2(b)) in the image on the display screen 31 of the teaching pendant 3.
  • the worker P may feel discomfort because the direction DA in which the hand portion 11A of the robot 1 is actually moved differs from the direction Db in which the subject moves in the image on the display screen 31. That is, the worker P feels uncomfortable when moving the hand section 11A; as a result, the teaching operation cannot be performed smoothly, which may lead to a decrease in work efficiency.
  • the robot control device 2 receives an image taken by the camera 12, rotates the image 90 degrees counterclockwise (Ra), and outputs the image to the teaching pendant 3.
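The 90-degree counterclockwise rotation that the robot control device 2 applies before outputting the image to the teaching pendant can be sketched as follows. This is a minimal illustration in Python, assuming the camera frame arrives as a NumPy array; the function name is hypothetical and not from the patent:

```python
import numpy as np

def rotate_for_display(frame: np.ndarray, quarter_turns: int = 1) -> np.ndarray:
    """Rotate a camera frame counterclockwise in 90-degree steps before
    sending it to the display (one turn corresponds to the rotation Ra)."""
    # np.rot90 rotates counterclockwise in the plane of the first two axes.
    return np.rot90(frame, k=quarter_turns, axes=(0, 1))

# A 2x3 test pattern: one counterclockwise turn yields a 3x2 image whose
# top-left pixel was the top-right pixel of the original frame.
frame = np.arange(6).reshape(2, 3)
rotated = rotate_for_display(frame)
```

With a real H x W x 3 video frame the same call applies per frame; only the rotation count would be chosen from the camera orientation.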
  • when the operator P operates the operation unit 32 while checking the image on the display screen 31 of the teaching pendant 3 and moves the hand portion 11A of the robot 1 in the leftward direction DA, the subject (workpiece W) also moves in the leftward direction Da in the image on the display screen 31.
  • that is, the worker P operates the operation section 32 of the teaching pendant 3 while checking the image after the 90-degree counterclockwise rotation Ra displayed on the display screen 31, and actually moves the hand portion 11A of the robot 1 in the direction DA.
  • the worker P can perform the teaching work smoothly without feeling any discomfort, that is, without causing a decrease in work efficiency.
  • the camera 12 is attached near the moving hand portion 11A of the robot 1, but it may instead be fixed above the robot 1 (for example, to the ceiling 6 above the robot 1); the same processing applies in either case.
  • an image taken by a fixed camera 12' is displayed on the display screen 31 of the teaching pendant 3 via the robot control device 2. If the direction in which the hand section 11A of the robot 1 moves in the image on the display screen 31 does not match the direction in which the worker P actually moves the hand section 11A, the worker P will feel discomfort.
  • this discomfort becomes even larger when, for example, the worker P presses a predetermined part of the robot 1 (for example, the hand section 11A), or actually applies force to the hand guide (guide section) 14 provided on the hand section 11A, to move the robot. That is, when the worker P actually applies force to a part of the robot 1, as will be described in detail later, the discomfort may lead to an even greater reduction in work efficiency.
  • FIGS. 5 and 6 are diagrams for explaining a second example of the robot system according to this embodiment.
  • the robot system of the second embodiment adds a coordinate system to the image displayed on the display screen 31, in addition to the processing described above.
  • a world coordinate system CA is set for the robot 1 in real space (actual).
  • this world coordinate system CA is displayed as the coordinate system Cb in the image displayed on the display screen 31 of the teaching pendant 3. Furthermore, the image on the display screen 31 is rotated 90 degrees counterclockwise to eliminate the operator's discomfort, and the world coordinate system CA in real space is added and displayed as the coordinate system Ca on the image in which the moving direction of the subject has been corrected.
  • thereby, the worker P can easily check the correspondence between the coordinate system CA in the real space photographed by the camera 12 and the coordinate system Ca in the image displayed on the display screen 31 of the teaching pendant 3.
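To keep the overlaid frame consistent, the axis vectors drawn on the screen must be rotated by the same angle as the image. A minimal sketch, assuming 2D screen coordinates with x to the right and y upward; the helper is illustrative, not from the patent:

```python
import math

def rotate_axis(vec, degrees):
    """Rotate a 2D screen-space axis vector counterclockwise by `degrees`,
    so that a coordinate-system overlay follows a rotated display image."""
    th = math.radians(degrees)
    x, y = vec
    return (x * math.cos(th) - y * math.sin(th),
            x * math.sin(th) + y * math.cos(th))

# Before rotation the image x axis points right on the screen; after the
# 90-degree counterclockwise rotation Ra it points upward.
x_axis = rotate_axis((1.0, 0.0), 90)
y_axis = rotate_axis((0.0, 1.0), 90)
```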
  • as shown in FIG. 6(a), for example, only the image before the 90-degree counterclockwise rotation and the coordinate system Cb of that image may be displayed on the display screen 31. Furthermore, the rotation direction of the image (the 90-degree counterclockwise rotation Ra) may be displayed so that the moving direction can be matched with respect to the image on the display screen 31. That is, instead of directly displaying the image shown in FIG. 6(b), the image shown in FIG. 6(a) may be displayed first, and it is also possible to select between the two displays.
  • FIG. 7 is a diagram for explaining a first application example of the robot system according to the present embodiment.
  • reference numeral 3' indicates a tablet (terminal device), and 31' indicates a display screen (display area) of a touch panel on the tablet 3'.
  • the operation section 32 of the teaching pendant 3 in FIG. 1 described above can also be provided as a touch operation area on the display screen 31' of the tablet 3', for example.
  • the tablet 3' is equipped with a GPS (Global Positioning System) function, a geomagnetic sensor (azimuth sensor), etc.
  • the tablet 3' may be a dedicated device provided by the provider of the robot system 100, but a general-purpose tablet owned by the user, or a communication terminal such as a general-purpose smartphone owned by the user, may also be used; in that case, the robot control program according to the present embodiment is installed and used on the tablet or smartphone.
  • the worker P applies force to the hand portion 11A and moves it while checking the image displayed on the display screen 31' of the tablet 3'; at this time, the worker P may also operate the touch operation area on the display screen 31' of the tablet 3'.
  • even when the worker P actually applies force to the hand portion 11A of the robot 1 to move it, by applying this embodiment, the worker P can perform the operation without feeling any discomfort, and work efficiency can be improved. In this case, the effect of improving work efficiency by eliminating discomfort in the direction in which the subject moves in the image on the display screen 31' becomes particularly large.
  • the moving direction of the hand section 11A shown in the image displayed on the display screen 31' of the tablet 3' can always be made to match the moving direction of the hand section 11A actually moved by the worker P. That is, the image displayed on the display screen 31' changes in real time according to the orientation of the tablet 3' held in the hand of the worker P, so the image does not make the worker P feel uncomfortable.
  • FIG. 8 is a diagram for explaining a second application example of the robot system according to the present embodiment.
  • reference numeral 16 indicates a holding location for charging the battery of the tablet 3' and for calibrating the position and orientation of the tablet 3' using, for example, the GPS function and the geomagnetic sensor mounted on the tablet 3'.
  • the worker P takes out and uses the tablet 3' that has been placed in the holding location 16 and whose battery has been charged and whose position and orientation have been calibrated; thereby, the robot control device 2 can recognize the position and orientation of the tablet 3'.
  • the robot control device 2 can receive, for example, output signals from encoders provided in the motors of each movable part of the robot 1, and can recognize the positions and moving directions of the camera 12 and the hand part 11A. In this way, the robot control device 2 can recognize the position and orientation of the display screen 31' of the tablet 3', the distance and direction at which the hand section 11A is located, and so on.
  • by recognizing the relative relationship between the subjects in the image displayed on the display screen 31' (the workpiece W and the predetermined part of the robot 1 that moves), it becomes possible to realize the first and second embodiments of the robot system according to the present embodiment described with reference to FIGS. 2 to 6.
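The encoder-based recognition mentioned above amounts to forward kinematics: the joint angles read from the encoders determine where the hand (and any camera attached near it) is located. The following is a deliberately simplified planar two-joint sketch; the link lengths and geometry are illustrative assumptions, not taken from the patent:

```python
import math

def planar_fk(theta1, theta2, l1=0.4, l2=0.3):
    """Planar 2-joint forward kinematics: encoder angles (radians) ->
    (x, y) position of the hand and its total orientation."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y, theta1 + theta2

# With both joints at zero the arm lies stretched along +x.
x, y, phi = planar_fk(0.0, 0.0)
```

A real controller would use the robot's full kinematic chain, but the principle of mapping encoder readings to Cartesian positions is the same.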
  • FIG. 9 is a diagram for explaining a third application example of the robot system according to the present embodiment.
  • reference numeral 14 indicates a hand guide (guide portion), and 12' indicates a camera fixed above the robot 1 (for example, on the ceiling 6).
  • the hand guide 14 is provided near the hand section 11A of the robot 1, and is used by the worker P to actually apply force to the hand guide 14 to move the hand section 11A and teach a predetermined operation.
  • the robot system shown in FIG. 9 includes both a camera 12 attached near the hand portion 11A of the robot 1 and a camera 12' fixed to the ceiling 6 above the robot 1, but it may also be the case that only the camera 12' is provided.
  • the worker P operates the operation section 32 while checking the image displayed on the display screen 31 of the teaching pendant 3 connected to the robot control device 2, and further moves (operates) the hand portion 11A of the robot 1 by applying force to the hand guide 14 with his or her own hands.
  • the processing of the image taken by the camera 12 attached near the hand part 11A of the robot 1 is as explained above, and the same applies to the processing of the image captured by the fixed camera 12'.
  • that is, when the subject photographed by the fixed camera 12' is, for example, the hand section 11A of the robot 1, the direction in which the worker P moves the hand section 11A via the hand guide 14 and the direction in which the hand section 11A moves in the image displayed on the display screen 31 of the teaching pendant 3 are determined, and the processing in the robot systems of the first and second embodiments described above can also be applied.
  • FIG. 10 is a flowchart for explaining an example of the processing of the robot control program according to the present embodiment.
  • in step ST11, the hand guide is used to move the hand part 11A of the robot in the front direction (back-and-forth direction) and in the left-right direction as seen from the worker P; then, the process proceeds to step ST12. That is, in step ST11, the hand portion (gripping portion) 11A of the robot 1 is moved in the front and left-right directions using the hand guide 14 described with reference to FIG. 9.
  • at this time, the robot control device 2 (arithmetic processing device) can accurately recognize the motion of the robot 1 based on the output signals of the encoders provided in the motors of each movable part of the robot 1 or the pulse signals controlling each motor.
  • step ST12 the relative relationship between the world coordinate system of the camera 12 (for example, the world coordinate system CA in FIG. 5) and the direction of the worker P is calculated from the motion of the robot 1. That is, the robot control device 2 calculates the positional coordinates in the world coordinate system CA in which the hand portion 11A of the robot 1 moves, and the direction in which the worker P operates the hand portion 11A of the robot 1.
  • next, in step ST13, the position and orientation of the camera with respect to the world coordinate system are calculated.
  • then, in step ST14, the relative relationship between the orientation of the camera 12 and the orientation of the worker P is obtained by coordinate transformation. That is, the robot control device 2 acquires the relationship between the coordinate system (world coordinate system) CA in real space and the coordinate system Cb in the image displayed on the display screen 31 of the teaching pendant 3.
  • the process then proceeds to step ST15, and based on the relative position and orientation, the image is rotated and displayed on the display screen 31 so that, for example, the front direction of the worker P and the upward direction of the displayed image (the image on the display screen 31) match. That is, the robot control device 2 rotates the image on the display screen 31 so that the coordinate system Cb in the displayed image becomes the coordinate system Ca.
  • as a result, the direction in which the worker P actually moves the hand portion 11A of the robot 1 matches the direction in which the subject moves in the image on the display screen 31, which eliminates the worker's discomfort and makes it possible to work smoothly.
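Steps ST12 to ST15 above can be illustrated as choosing the quarter-turn that aligns the on-screen motion direction with the direction in which the worker actually moves the hand. This is a hedged sketch: the function and the snapping to 90-degree steps are assumptions for illustration, not the patent's own algorithm:

```python
import math

def screen_rotation_degrees(worker_dir, screen_dir):
    """Counterclockwise rotation (degrees, snapped to 90-degree steps) to
    apply to the display image so that the subject's on-screen motion
    direction matches the worker's actual motion direction."""
    ang_worker = math.degrees(math.atan2(worker_dir[1], worker_dir[0]))
    ang_screen = math.degrees(math.atan2(screen_dir[1], screen_dir[0]))
    return round(((ang_worker - ang_screen) % 360.0) / 90.0) % 4 * 90

# As in FIG. 2: the hand moves left (-1, 0) but the subject moves up (0, 1)
# on the screen, so the image should be rotated 90 degrees counterclockwise.
deg = screen_rotation_degrees((-1.0, 0.0), (0.0, 1.0))
```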
  • FIG. 11 is a flowchart for explaining another example of the processing of the robot control program according to the present embodiment.
  • when another example of the processing of the robot control program starts, in step ST21, the tablet 3' on which the GPS sensor, the geomagnetic sensor, and the like are mounted is placed at a predetermined location on the robot in a predetermined orientation. That is, the position and orientation of the tablet 3' are calibrated as described with reference to FIG. 8; thereby, the robot control device 2 can recognize the relative relationship between the hand portion 11A of the robot 1 and the image on the display screen 31' of the tablet 3'.
  • next, in step ST22, information on the position and orientation of the tablet 3' is acquired using the GPS sensor, the geomagnetic sensor, and the like while the tablet 3' is being held. That is, the robot control device 2 can acquire this information from the tablet 3' connected via a wired or wireless line.
  • note that steps ST21 and ST22 can be made unnecessary once, for example, the tablet 3' has been placed on the holding location 16, the battery has been charged, and the position and orientation of the tablet 3' have been calibrated, as described with reference to FIG. 8.
  • the process then proceeds to step ST23, and the relative coordinates of the camera 12 from the robot base 15 are acquired.
  • the robot control device 2 can obtain the relative coordinates of the camera 12 with respect to the robot base 15 using, for example, output signals and control signals of encoders provided to the motors of each movable part of the robot 1.
  • the process then proceeds to step ST24 to calculate the relative position and orientation from the camera 12 to the tablet 3'. That is, the robot control device 2 calculates the relative position and orientation from the camera 12 to the tablet 3' based on the position and orientation of the tablet 3' obtained in step ST22 and the relative coordinates of the camera 12 with respect to the robot base 15 obtained in step ST23.
  • further, the image is rotated based on the relative position and orientation so that the front direction of the worker P and the upward direction of the displayed image (the image displayed on the display screen 31' of the tablet 3') coincide. That is, the robot control device 2 performs, for example, the rotation processing of the displayed image as described with reference to FIGS. 2 to 4. Thereby, the direction in which the hand portion 11A of the robot 1 is actually moved matches the direction in which the subject moves in the image on the display screen 31', so the worker P can work smoothly without feeling uncomfortable.
  • finally, in step ST26, coordinates based on the relative position and orientation are displayed on the display screen of the tablet 3'. That is, as described with reference to FIGS. 5 and 6, the coordinate system is added to the displayed image. Thereby, the worker P can easily confirm the correspondence between the coordinate system CA in the real space photographed by the camera 12 and the coordinate system Ca (Cb) in the image displayed on the display screen 31' of the tablet 3'.
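Steps ST23 and ST24 above amount to composing rigid-body transforms: the camera-to-tablet pose is obtained by inverting the base-to-camera transform and composing it with the base-to-tablet transform. A minimal 2D sketch in Python with illustrative numbers (the poses are made up for the example, not from the patent):

```python
import numpy as np

def pose2d(x, y, deg):
    """Homogeneous 2D transform for a pose (x, y, heading in degrees)."""
    th = np.radians(deg)
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

# base->camera from the robot's encoder chain (step ST23) and base->tablet
# from the calibrated GPS/geomagnetic pose (steps ST21/ST22).
base_T_cam = pose2d(0.5, 0.2, 90)
base_T_tab = pose2d(1.0, 0.2, 180)

# camera->tablet relative pose (step ST24): inv(base->cam) @ (base->tab).
cam_T_tab = np.linalg.inv(base_T_cam) @ base_T_tab
rel_heading = np.degrees(np.arctan2(cam_T_tab[1, 0], cam_T_tab[0, 0]))
```

A real system would use 3D homogeneous transforms (4x4 matrices), but the composition pattern is identical.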
  • the robot control program according to the present embodiment described above may be provided by being recorded on a computer-readable non-transitory recording medium or a non-volatile semiconductor storage device, or may be provided via wired or wireless communication.
  • the computer-readable non-transitory recording medium may be, for example, an optical disk such as a CD-ROM (Compact Disc Read Only Memory) or a DVD-ROM, a hard disk device, or a non-volatile semiconductor memory such as a PROM (Programmable Read Only Memory) or a flash memory.
  • furthermore, the program may be distributed from a server device via a wired or wireless WAN (Wide Area Network), a LAN (Local Area Network), the Internet, or the like.
  • as described above, according to the present embodiment, the robot can be operated smoothly without any discomfort between the direction in which the actually operated robot moves and the direction of movement in the image on the display screen.
  • 1 Robot; 2 Robot control device; 3 Teaching operation panel (terminal device); 3' Tablet (terminal device); 4 Workbench; 5, W Workpiece (object); 6 Ceiling (above the robot); 11 Arm; 11A Gripping portion (hand portion); 12, 12' Camera (image capturing device); 13 Ring illumination (illumination device); 14 Hand guide (guide portion); 15 Robot base; 16 Holding location; 31, 31' Display screen; 32 Operation unit; 100 Industrial robot system (robot system)

Abstract

A robot system 100 comprises: an image capturing device 12; a terminal device 3 provided with a display screen 31 that displays an image captured by the image capturing device 12; and a robot 1 operated on the basis of the image displayed on the display screen. When a prescribed location 11A of the robot 1 is operated so as to move in a first direction, a second direction in which a subject in the image displayed moves is adjusted on the basis of the first direction.

Description

Robot system, robot control device, and robot control program
The embodiments mentioned in this application relate to a robot system, a robot control device, and a robot control program.
In recent years, vision systems (robot vision) have been applied to industrial robots and collaborative robots, and systems have been provided that cause an industrial robot or the like to perform predetermined operations based on the images (videos) acquired by the vision system.
That is, by combining a robot with a vision sensor, systems have been put into practical use in which the robot is operated while images taken by a fixed camera or a camera attached to the robot are displayed on a display screen in real time. Furthermore, industrial robots to which a worker (teacher) teaches predetermined operations while checking an image on the display screen of a teaching operation panel have also been put into practical use.
Note that in this specification, in order to simplify the explanation, an industrial robot will be described as an example; however, the present embodiment is not limited to application to industrial robots and can be widely applied to a variety of robots, including collaborative robots.
Conventionally, various techniques have been proposed for teaching or remotely operating a robot while checking a displayed image.
JP 2006-289531 A; JP 2000-079587 A; JP 2020-093385 A
As mentioned above, industrial robots to which a worker teaches predetermined operations while checking an image on the display screen of a teaching operation panel have been put into practical use. However, depending on the relative orientation of the industrial robot and the worker, the image on the display screen may give the worker a sense of discomfort and impede operation.
Specifically, the image displayed on the display screen of the teaching operation panel is determined by the orientation of the camera attached to the industrial robot. Therefore, depending on the orientation of the camera, the direction in which the industrial robot moves as seen by the worker operating the teaching pendant may not intuitively match the direction of movement in the image on the display screen.
In other words, there is a sense of incongruity between the direction in which the industrial robot actually operated by the worker moves and the direction of movement in the image displayed on the teaching operation panel, and this incongruity sometimes prevents the worker from performing the teaching operation smoothly.
The problem to be solved by the present invention is to provide a robot system, a robot control device, and a robot control program that eliminate the discomfort between the direction in which the actually operated robot moves and the direction of movement in the image on the display screen, so that the operation can be performed smoothly.
According to one embodiment of the present invention, there is provided a robot system comprising: an image capturing device; a terminal device having a display screen that displays the image captured by the image capturing device; and a robot operated based on the image displayed on the display screen, wherein, when a predetermined part of the robot is operated so as to move in a first direction, a second direction in which a subject in the image displayed on the display screen moves is adjusted based on the first direction.
The objects and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims. Both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention as claimed.
FIG. 1 is a diagram schematically showing an industrial robot system as an example of the robot system according to the present embodiment. FIG. 2 is a diagram (part 1) for explaining a first example of the robot system according to the present embodiment. FIG. 3 is a diagram (part 2) for explaining the first example of the robot system according to the present embodiment. FIG. 4 is a diagram (part 3) for explaining the first example of the robot system according to the present embodiment. FIG. 5 is a diagram (part 1) for explaining a second example of the robot system according to the present embodiment. FIG. 6 is a diagram (part 2) for explaining the second example of the robot system according to the present embodiment. FIG. 7 is a diagram for explaining a first application example of the robot system according to the present embodiment. FIG. 8 is a diagram for explaining a second application example of the robot system according to the present embodiment. FIG. 9 is a diagram for explaining a third application example of the robot system according to the present embodiment. FIG. 10 is a flowchart for explaining an example of the processing of the robot control program according to the present embodiment. FIG. 11 is a flowchart for explaining another example of the processing of the robot control program according to the present embodiment.
Hereinafter, examples of the robot system, the robot control device, and the robot control program according to the present embodiment will be described in detail with reference to the accompanying drawings. FIG. 1 is a diagram schematically showing an industrial robot system as an example of the robot system according to the present embodiment.
As shown in FIG. 1, an industrial robot system 100, as an example of the robot system according to the present embodiment, includes an industrial robot 1, a robot control device 2, and a teaching pendant (terminal device) 3. A gripping portion (hand portion) 11A is provided at the tip of an arm 11 of the industrial robot 1; the gripping portion 11A grips, for example, a workpiece (object) 5 placed on a workbench 4 and performs predetermined processing on it.
A camera (image capturing device) 12 for photographing the workpiece 5 and the like is attached near the gripping portion 11A of the arm 11, and an image (video) including the workpiece 5 captured by the camera 12 is output to the robot control device 2. Here, reference numeral 13 denotes a ring light (illumination device) provided around the periphery of the camera 12, which irradiates the workpiece 5 with light L so that a clear image can be obtained. The robot control device 2 refers to the image from the camera 12 and controls the industrial robot 1 based on, for example, a pre-installed program (software program).
The teaching pendant 3 includes a display screen 31 and an operation section 32, and is connected to the robot control device 2 by wired or wireless communication. The teaching pendant 3 is used, for example, by a worker (teacher) to teach the industrial robot 1, via the robot control device 2, a predetermined operation using the gripping portion 11A, by operating the operation section 32 while checking the image on the display screen 31.
The robot control device 2 not only controls the industrial robot 1 based on commands from the teaching pendant 3, but also processes the image displayed on the display screen 31 of the teaching pendant 3, as will be described in detail later. That is, the examples of the robot system according to the present embodiment described in detail below are realized by the processing of the robot control device 2, or by a robot control program executed by an arithmetic processing unit (CPU) of the robot control device 2.
Note that the robot system according to the present embodiment is not limited to an industrial robot system that grips a workpiece 5 with the gripping portion 11A as shown in FIG. 1, and may be an industrial robot system equipped with a hand portion having various functions other than gripping. Furthermore, as described above, the robot system according to the present embodiment is not limited to industrial robot systems, and can be widely applied to various robot systems including collaborative robots.
[First Example]
FIGS. 2 to 4 are diagrams for explaining the first example of the robot system according to the present embodiment. Here, FIGS. 2(a) and 2(b) illustrate an example of a teaching process in a robot system to which the present embodiment is not applied, while FIGS. 4(a) and 4(b) illustrate an example of a teaching process in the robot system of the first example. FIG. 3 illustrates the processing performed by the robot control device in the robot system of the first example.
As shown in FIG. 2(a), for example, a worker (teacher) P operates the operation section 32 while checking the image on the display screen 31 of the teaching pendant 3, and moves the hand portion 11A of the robot (industrial robot) 1 in a leftward direction DA (from right to left in FIG. 2(a)). At this time, as shown in FIG. 2(b), an image (video) captured by the camera 12 and input via the robot control device 2 is displayed on the display screen 31 of the teaching pendant 3.
Here, the image displayed on the display screen 31 of the teaching pendant 3 changes depending on, for example, the orientation of the camera 12. Specifically, as shown in FIG. 2(b), even when the worker P actually moves the hand portion 11A of the robot 1 in the leftward direction DA (in real space), the subject in the image on the display screen 31 (for example, the workpiece W(5)) moves in an upward direction Db (from bottom to top in FIG. 2(b)).
Therefore, the worker P may feel a sense of incongruity because the direction DA in which the hand portion 11A of the robot 1 actually moves differs from the direction Db in which the subject moves in the image on the display screen 31. That is, the worker P feels uncomfortable with the movement operation of the hand portion 11A, and as a result may be unable to perform the teaching operation smoothly, which can lead to a decrease in work efficiency.
In such a case, according to the first example of the robot system of the present embodiment, the image on the display screen 31 of the teaching pendant 3 is rotated 90 degrees counterclockwise (Ra) before being displayed, as shown, for example, in FIG. 3. That is, the robot control device 2 receives the image captured by the camera 12, rotates it 90 degrees counterclockwise (Ra), and outputs the rotated image to the teaching pendant 3.
As a result, as shown in FIGS. 4(a) and 4(b), when the worker P, for example, operates the operation section 32 while checking the image on the display screen 31 of the teaching pendant 3 and moves the hand portion 11A of the robot 1 in the leftward direction DA, the subject (workpiece W) in the image on the display screen 31 also moves in the same leftward direction Da.
The worker P then operates the operation section 32 of the teaching pendant 3 while checking the image displayed on the display screen 31 after the 90-degree counterclockwise rotation Ra, and actually moves the hand portion 11A of the robot 1 in the direction DA. As a result, the worker P can carry out the teaching work smoothly, without any sense of incongruity and therefore without a decrease in work efficiency.
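The rotation applied by the robot control device 2 before the image reaches the display screen can be sketched as follows. This is a minimal illustration only, assuming the camera frame is available as a NumPy array and that a fixed 90-degree counterclockwise correction is required; in practice the controller would select the rotation from the camera pose:

```python
import numpy as np

def adjust_display_image(frame: np.ndarray, quarter_turns_ccw: int = 1) -> np.ndarray:
    """Rotate the camera frame before sending it to the display screen.

    frame: H x W (x C) image from the camera (image capturing device).
    quarter_turns_ccw: number of 90-degree counterclockwise rotations needed
    so that the on-screen motion direction matches the direction in which
    the operator actually moves the robot.
    """
    return np.rot90(frame, k=quarter_turns_ccw)

# A 2x3 test pattern: after one counterclockwise quarter turn it becomes
# 3x2, and the pixel at the top-right corner moves to the top-left corner.
img = np.array([[1, 2, 3],
                [4, 5, 6]])
rotated = adjust_display_image(img)
print(rotated.shape)  # (3, 2)
```

A real system would apply the same call per video frame before streaming the result to the teaching pendant.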
In the above description, the camera 12 is attached near the moving hand portion 11A of the robot 1, but the same applies when, for example, the camera is fixed above the robot 1 (for example, fixed to a ceiling 6 above the robot 1). As will be described in detail later with reference to FIG. 9, an image captured by a fixed camera 12' is displayed on the display screen 31 of the teaching pendant 3 via the robot control device 2. If, for example, the direction in which the hand portion 11A of the robot 1 moves in the image on the display screen 31 does not match the direction in which the worker P actually moved the hand portion 11A, the worker P will experience a sense of incongruity.
This sense of incongruity becomes even greater when, for example, the worker P operates the robot 1 by pushing a predetermined part (for example, the hand portion 11A), or when the worker P actually applies force to a hand guide (guide portion) 14 provided on the hand portion 11A to move it. That is, in work in which the worker P actually applies force to a part of the robot 1, as described in detail later with reference to FIGS. 7 and 9, the incongruity in the direction in which the subject moves in the image on the display screen 31 may cause an even greater decrease in work efficiency.
[Second Example]
FIGS. 5 and 6 are diagrams for explaining the second example of the robot system according to the present embodiment. As is clear from a comparison of FIG. 5 with FIG. 4(a), FIG. 6(a) with FIG. 3, and FIG. 6(b) with FIG. 4(b), the robot system of the second example additionally displays a coordinate system on the image shown on the display screen 31, in contrast to the robot system of the first example described above.
First, as shown in FIG. 5, a world coordinate system CA is set for the robot 1 in real space. As shown in FIG. 6(a), this world coordinate system CA is displayed as a coordinate system Cb in the image shown on the display screen 31 of the teaching pendant 3, and, in the same way as described with reference to FIG. 3, the image on the display screen 31 is rotated 90 degrees counterclockwise to eliminate the operator's sense of incongruity.
That is, as shown in FIG. 6(b), the real-space world coordinate system CA is additionally displayed as a coordinate system Ca on the image in which the direction of movement of the subject has been corrected. This allows the worker P to easily check the correspondence between the coordinate system CA in the real space captured by the camera 12 and the coordinate system Ca in the image displayed on the display screen 31 of the teaching pendant 3.
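One way to keep the overlaid coordinate system consistent with the corrected image is to apply the same planar rotation to the axis direction vectors before drawing them. A minimal sketch under that assumption, with 2-D axis vectors in screen coordinates (the function name is illustrative, not part of the described system):

```python
import math

def rotate_axis_vector(vx: float, vy: float, angle_deg: float) -> tuple:
    """Rotate a 2-D axis direction vector by the same angle applied to the
    display image, so that the drawn coordinate system (Ca) stays aligned
    with the corrected image."""
    a = math.radians(angle_deg)
    return (vx * math.cos(a) - vy * math.sin(a),
            vx * math.sin(a) + vy * math.cos(a))

# The screen X axis (1, 0), rotated 90 degrees counterclockwise,
# becomes the screen Y axis (0, 1).
x_axis = rotate_axis_vector(1.0, 0.0, 90.0)
print(round(x_axis[0], 6), round(x_axis[1], 6))  # 0.0 1.0
```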
Note that, as shown in FIG. 6(a), it is also possible, for example, to display on the display screen 31 the image before the 90-degree counterclockwise rotation together with only its coordinate system Cb. Furthermore, the rotation direction of the image used to align the directions of movement (the 90-degree counterclockwise rotation Ra) may be displayed on the image of the display screen 31. That is, instead of directly displaying the image shown in FIG. 6(b), the image shown in FIG. 6(a) may be displayed first, and the worker P may be allowed to select either the image of FIG. 6(a) or the image of FIG. 6(b).
FIG. 7 is a diagram for explaining a first application example of the robot system according to the present embodiment. In FIG. 7, reference numeral 3' denotes a tablet (terminal device), and 31' denotes a display screen (display area) of a touch panel on the tablet 3'. Note that the operation section 32 of the teaching pendant 3 in FIG. 1 described above can also be provided, for example, as a touch operation area on the display screen 31' of the tablet 3'.
The tablet 3' is equipped with a GPS (Global Positioning System) function, a geomagnetic sensor (orientation sensor), and the like. The tablet 3' may be a dedicated device supplied by the provider of the robot system 100, but a general-purpose tablet owned by the user can also be used. It is also possible to use, as the tablet 3', a communication terminal such as a general-purpose smartphone owned by the user. When a tablet or smartphone owned by the user is used as the tablet 3', the robot control program according to the present embodiment is installed on that tablet or smartphone.
As shown in FIG. 7, in the first application example of the robot system according to the present embodiment, the worker P actually applies force to the hand portion 11A of the robot 1 to move it while checking the image displayed on the display screen 31' of the tablet 3'. At this time, the worker P may also operate the touch operation area on the display screen 31' of the tablet 3'.
In this way, when the worker P actually applies force to the hand portion 11A of the robot 1 to move it, applying the present embodiment makes it possible to perform the operation without any sense of incongruity and to increase work efficiency. When the worker P pushes and moves the hand portion 11A with his or her own hand, as in this application example, the improvement in work efficiency obtained from the natural direction of movement of the subject in the image on the display screen 31' is particularly large.
Since the tablet 3' is equipped with a GPS function, a geomagnetic sensor, and the like, the direction in which the hand portion 11A actually moved by the worker P travels can always be made to match the direction in which the hand portion 11A moves in the image displayed on the display screen 31' of the tablet 3'. That is, the image displayed on the display screen 31' of the tablet 3' changes in real time according to the orientation of the display screen 31' held in the worker P's hand, so that the worker P experiences no sense of incongruity.
FIG. 8 is a diagram for explaining a second application example of the robot system according to the present embodiment. In FIG. 8, reference numeral 16 denotes a holding location used, for example, to charge the battery of the tablet 3' and to calibrate the position and orientation of the tablet 3' using the GPS function, the geomagnetic sensor, and the like mounted on the tablet 3'. The worker P takes out and uses the tablet 3' after it has been placed at the holding location 16 and its battery charging and position/orientation calibration have been completed. This allows the robot control device 2 to recognize the position and orientation of the tablet 3'.
Furthermore, the robot control device 2 can, for example, receive the output signals of encoders provided on the motors of the movable parts of the robot 1 and recognize the positions and directions of movement of the camera 12 and the hand portion 11A. Since the robot control device 2 can thus recognize the orientation (position and orientation) of the display screen 31' of the tablet 3' and the direction (distance and direction) in which the hand portion 11A is located, it can recognize the relative relationship of the subject (the workpiece W or the moving predetermined part of the robot 1) in the image displayed on the display screen 31' of the tablet 3', making it possible to realize the first and second examples of the robot system according to the present embodiment described with reference to FIGS. 2 to 6.
In the description of FIG. 8, an example was given in which the tablet 3' is placed at the holding location 16 to charge the tablet 3' and calibrate its position and orientation; however, it is also possible, for example, to perform only the calibration of the tablet 3'. That is, for example, an indicator light may be provided at a specific location on the robot 1 used for calibration, and the calibration may be performed by placing the tablet 3' in a predetermined orientation relative to the position of that indicator light.
FIG. 9 is a diagram for explaining a third application example of the robot system according to the present embodiment. In FIG. 9, reference numeral 14 denotes a hand guide (guide portion), and 12' denotes a camera fixed above the robot 1 (for example, on the ceiling 6). The hand guide 14 is provided near the hand portion 11A of the robot 1 and is used by the worker P, who actually applies force to the hand guide 14, to move the hand portion 11A and teach a predetermined operation. The robot system shown in FIG. 9 includes both the camera 12 attached near the hand portion 11A of the robot 1 and the camera 12' fixed to the ceiling 6 above the robot 1, but it may also include only the camera 12'.
The worker P operates, for example, the operation section 32 while checking the image displayed on the display screen 31 of the teaching pendant 3 connected to the robot control device 2, and further moves (operates) the hand portion 11A of the robot 1 by applying force to the hand guide 14 with his or her own hand. The processing of the image captured by the camera 12 attached near the hand portion 11A of the robot 1 is as described with reference to FIGS. 2 to 6, and the same applies to the processing of the image captured by the camera (fixed camera) 12'.
That is, with regard to the subject captured by the fixed camera 12' (for example, the hand portion 11A of the robot 1), the processing in the robot systems of the first and second examples described above can also be applied to the relationship between the direction in which the worker P moved the hand portion 11A via the hand guide 14 and the direction in which the hand portion 11A moves in the image captured by the fixed camera 12' and displayed on the display screen 31 of the teaching pendant 3.
FIG. 10 is a flowchart for explaining an example of the processing of the robot control program according to the present embodiment, for a case in which the work is performed using, for example, a teaching pendant 3 that is not equipped with a GPS function or the like.
As shown in FIG. 10, when this example of the processing of the robot control program starts, in step ST11 the hand guide is used to move the hand portion 11A of the robot in the frontal direction (front-back direction) and the left-right direction as seen from the worker P, and the process proceeds to step ST12. That is, in step ST11, the hand portion (gripping portion) 11A of the robot 1 is moved in the frontal and left-right directions using the hand guide 14 described with reference to FIG. 9.
The change in position of the hand portion 11A caused by moving the hand guide 14, provided near the hand portion 11A of the robot 1, in the frontal direction, and the change in position caused by moving the hand guide 14 in the left-right direction, can be accurately recognized by the robot control device 2 (arithmetic processing unit) based on, for example, the output signals of the encoders provided on the motors of the movable parts of the robot 1 or the pulse signals that control those motors.
In step ST12, the relative relationship between the world coordinate system of the camera 12 (for example, the world coordinate system CA in FIG. 5) and the direction of the worker P is calculated from the motion of the robot 1. That is, the robot control device 2 calculates the position coordinates in the world coordinate system CA along which the hand portion 11A of the robot 1 moves, and the direction in which the worker P operates the hand portion 11A of the robot 1.
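The calculation in steps ST11 and ST12 can be sketched as follows. This is a minimal illustration, assuming the controller records the two hand-guide motions as 3-D displacement vectors in the world coordinate system; the function and variable names are hypothetical:

```python
import numpy as np

def operator_frame_from_motions(forward_disp, left_disp):
    """Estimate the worker's orientation in the world coordinate system CA
    from two recorded hand-guide motions: one toward the worker's front and
    one toward the worker's left (step ST11).

    Returns unit vectors (front, left) expressed in world coordinates, with
    the left vector orthogonalized against the front vector to tolerate an
    imperfect hand-guided motion.
    """
    front = np.asarray(forward_disp, dtype=float)
    front /= np.linalg.norm(front)
    left = np.asarray(left_disp, dtype=float)
    left = left - np.dot(left, front) * front  # remove any front component
    left /= np.linalg.norm(left)
    return front, left

# The worker faces the world +Y direction, so the worker's "left" is world -X;
# the recorded leftward motion is slightly imperfect but is orthogonalized.
front, left = operator_frame_from_motions([0.0, 0.4, 0.0], [-0.3, 0.05, 0.0])
print(front, left)
```

Given this pair of vectors, the controller can express any camera or screen direction relative to the worker's own frame.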
Next, the process proceeds to step ST13, in which the position and orientation of the camera with respect to the world coordinate system are calculated, and then to step ST14, in which the relative relationship between the orientation of the camera 12 and the orientation of the worker P is obtained by coordinate transformation. That is, the robot control device 2 obtains the relationship between the coordinate system (world coordinate system) CA in real space and the coordinate system Cb in the image displayed on the display screen 31 of the teaching pendant 3.
The process then proceeds to step ST15, in which, based on the relative position and orientation, the image displayed on the display screen 31 is rotated so that, for example, the frontal direction of the worker P coincides with the upward direction of the display image (the image on the display screen 31). That is, the robot control device 2 rotates the image on the display screen 31 so that the coordinate system Cb in the displayed image becomes the coordinate system Ca. As a result, the direction in which the worker P actually moves the hand portion 11A of the robot 1 matches the direction in which the subject moves in the image on the display screen 31, eliminating the worker P's sense of incongruity and making smooth work possible.
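The rotation angle applied in step ST15 can be derived from the angle between the worker's frontal direction, projected into the image plane, and the screen's upward direction. A minimal 2-D sketch under that assumption (names are illustrative):

```python
import math

def display_rotation_deg(front_in_image):
    """Angle (counterclockwise, in degrees) by which to rotate the displayed
    image so that the worker's frontal direction, expressed in image
    coordinates (x right, y up), points toward the top of the screen."""
    fx, fy = front_in_image
    # Signed angle from the screen's up axis (0, 1) to the front vector.
    return math.degrees(math.atan2(fx, fy))

# If the worker's frontal direction appears as screen-right (1, 0), a
# 90-degree counterclockwise rotation brings it to screen-up (0, 1).
print(display_rotation_deg((1.0, 0.0)))
```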
FIG. 11 is a flowchart for explaining another example of the processing of the robot control program according to the present embodiment, for a case in which the work is performed using, for example, a tablet 3' equipped with a GPS function, a geomagnetic sensor, and the like.
As shown in FIG. 11, when this other example of the processing of the robot control program starts, in step ST21 the tablet 3', on which a GPS sensor, a geomagnetic sensor, and the like are mounted, is placed at a predetermined location on the robot in a predetermined orientation. That is, the position and orientation of the tablet 3' are calibrated as described with reference to FIG. 8. This allows the robot control device 2 to recognize the relative relationship between the hand portion 11A of the robot 1 and the image on the display screen 31' of the tablet 3'.
The process then proceeds to step ST22, in which information on the position and orientation of the tablet 3' is acquired by the GPS sensor, the geomagnetic sensor, and the like while the tablet 3' is being held. That is, the robot control device 2 can acquire the information of the tablet 3' from the tablet 3' connected via a wired or wireless line. Note that the processing of steps ST21 and ST22 can be made unnecessary if, as described with reference to FIG. 8, the tablet 3' has already been placed at the holding location 16 and its battery charging and position/orientation calibration have been completed.
Next, the process proceeds to step ST23, in which the coordinates of the camera 12 relative to the robot base 15 are acquired. The robot control device 2 can acquire the coordinates of the camera 12 relative to the robot base 15 from, for example, the output signals and control signals of the encoders provided on the motors of the movable parts of the robot 1. The process then proceeds to step ST24, in which the relative position and orientation from the camera 12 to the tablet 3' are calculated. That is, the robot control device 2 calculates the relative position and orientation from the camera 12 to the tablet 3' based on the position and orientation of the tablet 3' acquired in step ST22 and the coordinates of the camera 12 relative to the robot base 15 acquired in step ST23.
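The composition in steps ST23 and ST24 amounts to chaining homogeneous transforms: given the camera pose relative to the robot base and the tablet pose in the same base frame, the camera-to-tablet pose is inv(T_base_cam) composed with T_base_tablet. A minimal sketch with 4x4 matrices, assuming pure translations for simplicity (the names are illustrative):

```python
import numpy as np

def relative_pose(T_base_cam: np.ndarray, T_base_tablet: np.ndarray) -> np.ndarray:
    """Camera-to-tablet pose (step ST24), composed from the base-to-camera
    pose (step ST23) and the base-to-tablet pose (steps ST21/ST22).
    All poses are 4x4 homogeneous transforms."""
    return np.linalg.inv(T_base_cam) @ T_base_tablet

def translation(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Camera 1 m above the base, tablet 0.5 m in front of the base:
# seen from the camera, the tablet is 0.5 m forward and 1 m down.
T = relative_pose(translation(0.0, 0.0, 1.0), translation(0.0, 0.5, 0.0))
print(T[:3, 3])  # tablet position [0, 0.5, -1] in camera coordinates
```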
The process then proceeds to step ST25, in which, based on the relative position and orientation, the display image (the image displayed on the display screen 31' of the tablet 3') is rotated so that the frontal direction of the worker P coincides with the upward direction of the display image. That is, the robot control device 2 performs, for example, the rotation processing of the display image described with reference to FIGS. 2 to 4. As a result, the direction in which the hand portion 11A of the robot 1 is actually moved matches the direction in which the subject moves in the image on the display screen, so the worker P can work smoothly without any sense of incongruity.
Furthermore, the process proceeds to step ST26, in which coordinates based on the relative position and orientation are displayed on the display screen of the tablet 3'. That is, as described with reference to FIGS. 5 and 6, the coordinate system is added to the display image. This allows the worker P to easily check the correspondence between the coordinate system CA in the real space captured by the camera 12 and the coordinate system Ca (Cb) in the image displayed on the display screen 31' of the tablet 3'.
 The robot control program according to the present embodiment described above may be provided by recording it on a computer-readable non-transitory recording medium or a non-volatile semiconductor storage device, or may be provided via a wired or wireless connection. Examples of the computer-readable non-transitory recording medium include optical disks such as a CD-ROM (Compact Disc Read Only Memory) or a DVD-ROM, and hard disk devices. Examples of the non-volatile semiconductor storage device include a PROM (Programmable Read Only Memory) and flash memory. Distribution from a server device may take place via a wired or wireless WAN (Wide Area Network), a LAN (Local Area Network), the Internet, or the like.
 As described above in detail, the robot system, robot control device, and robot control program according to the present embodiment eliminate the mismatch between the direction in which the actually operated robot moves and the direction of movement in the image on the display screen, allowing the operation to be performed smoothly.
 Although the embodiments of the present disclosure have been described in detail, the present disclosure is not limited to the individual embodiments described above. Various additions, substitutions, changes, partial deletions, and the like are possible without departing from the gist of the invention, or from the idea and spirit of the present invention derived from the content described in the claims and their equivalents. For example, in the embodiments described above, the order of the operations and of the processes is shown only as an example and is not limiting; the same applies where numerical values or formulas are used in the description of the embodiments.
 1 Industrial robot (robot)
 2 Robot control device
 3 Teaching pendant (terminal device)
 3' Tablet (terminal device)
 4 Workbench
 5, W Workpiece (object)
 6 Ceiling (above the robot)
 11 Arm
 11A Gripping part (hand part)
 12, 12' Camera (image capturing device)
 13 Ring light (lighting device)
 14 Hand guide (guide part)
 15 Robot base
 16 Holding location
 31, 31' Display screen
 32 Operation unit
 100 Industrial robot system (robot system)

Claims (11)

  1.  A robot system comprising:
     an image capturing device;
     a terminal device having a display screen that displays an image captured by the image capturing device; and
     a robot operated based on the image displayed on the display screen,
     wherein, when a predetermined part of the robot is operated to move in a first direction, a second direction in which a subject moves in the image displayed on the display screen is adjusted based on the first direction.
  2.  The robot system according to claim 1, wherein, when a worker operates the predetermined part to move in the first direction, the second direction in which the subject moves in the image displayed on the display screen is adjusted based on the first direction so as not to give the worker a feeling of discomfort.
  3.  The robot system according to claim 2, wherein the robot includes a guide part provided near the predetermined part, and
     the worker moves the guide part to operate the robot and operates the terminal device while checking the image displayed on the display screen of the terminal device.
  4.  The robot system according to any one of claims 1 to 3, wherein, when the predetermined part is operated to move in the first direction, the image is rotated so that the second direction in which the subject moves in the image displayed on the display screen coincides with the first direction.
  5.  The robot system according to any one of claims 1 to 4, wherein a coordinate system of the subject is added to and displayed on the image displayed on the display screen.
  6.  The robot system according to any one of claims 1 to 5, wherein the image capturing device is attached near the predetermined part of the robot, and
     the subject includes a workpiece processed by the robot.
  7.  The robot system according to any one of claims 1 to 5, wherein the image capturing device is fixedly attached above the robot, and
     the subject includes the predetermined part of the robot.
  8.  The robot system according to any one of claims 1 to 7, wherein the terminal device is a teaching pendant, a tablet, or a smartphone for operating the robot.
  9.  The robot system according to any one of claims 1 to 7, wherein the robot is an industrial robot or a collaborative robot, and
     the predetermined part is a hand part of the industrial robot or the collaborative robot.
  10.  A robot control device that controls a robot based on an operation of a terminal device having a display screen that displays an image captured by an image capturing device,
     wherein, when a predetermined part of the robot is operated to move in a first direction, a second direction in which a subject moves in the image displayed on the display screen is adjusted based on the first direction.
  11.  A robot control program for controlling a robot based on an operation of a terminal device having a display screen that displays an image captured by an image capturing device,
     the program causing an arithmetic processing unit to execute a process of, when a predetermined part of the robot is operated to move in a first direction, adjusting, based on the first direction, a second direction in which a subject moves in the image displayed on the display screen.
PCT/JP2022/025181 2022-06-23 2022-06-23 Robot system, robot control device, and robot control program WO2023248439A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/025181 WO2023248439A1 (en) 2022-06-23 2022-06-23 Robot system, robot control device, and robot control program
TW112121064A TW202400375A (en) 2022-06-23 2023-06-06 Robot system, robot control device, and robot control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/025181 WO2023248439A1 (en) 2022-06-23 2022-06-23 Robot system, robot control device, and robot control program

Publications (1)

Publication Number Publication Date
WO2023248439A1 true WO2023248439A1 (en) 2023-12-28

Family

ID=89379317

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/025181 WO2023248439A1 (en) 2022-06-23 2022-06-23 Robot system, robot control device, and robot control program

Country Status (2)

Country Link
TW (1) TW202400375A (en)
WO (1) WO2023248439A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6311291A (en) * 1986-07-03 1988-01-18 三菱電機株式会社 Remote control type manipulator device
JP2011189431A (en) * 2010-03-12 2011-09-29 Denso Wave Inc Robot system
JP2019101476A (en) * 2017-11-28 2019-06-24 シュナイダーエレクトリックホールディングス株式会社 Operation guide system
JP2020075354A (en) * 2018-11-01 2020-05-21 キヤノン株式会社 External input device, robot system, control method for robot system, control program, and recording medium


Also Published As

Publication number Publication date
TW202400375A (en) 2024-01-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22948000

Country of ref document: EP

Kind code of ref document: A1