WO2022176928A1 - Teaching Device - Google Patents
Teaching device
- Publication number
- WO2022176928A1 (PCT/JP2022/006253)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- force sensor
- robot
- virtual image
- setting information
- virtual
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/42—Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39451—Augmented reality for robot programming
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39529—Force, torque sensor in wrist, end effector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40586—6-DOF force sensor
Definitions
- The present invention relates to a teaching device.
- Force sensors are often used when robots perform tasks such as precision fitting, gear phase matching, and surface alignment.
- In such tasks, a force sensor mounted on the robot detects the force and moment arising during the operation, and the robot is controlled so that the detected force and moment reach predetermined values.
- An example of a robot system that executes such force control is described in Patent Document 1.
- The position information of the force sensor is stored as internal data of the robot control device or the teaching device, so it can be difficult for the user or operator to verify that the position information (position and orientation) of the force sensor has been set correctly. In addition, force control may not work properly if the position information of the force sensor is set incorrectly, yet investigating the cause (troubleshooting) when the robot does not operate as intended is a difficult task that requires a high level of expertise.
- One aspect of the present disclosure is a teaching device comprising: a setting information storage unit that stores setting information defining the position and orientation of a force sensor with respect to a coordinate system set in a robot; and a virtual image superimposition display unit that superimposes and displays a virtual image representing the force sensor on a real space including the robot or a predetermined object supporting the force sensor, or on a virtual space including a model of the robot or a model of the predetermined object, such that the virtual image assumes the position and orientation according to the setting information.
- With this configuration, if there is an error in the set position and orientation of the force sensor, the user can grasp the fact quickly and easily by visually checking the virtual image of the force sensor superimposed on the real space or the virtual space.
- FIG. 1 is a diagram showing the configuration of a robot system including a teaching device according to a first embodiment.
- FIG. 2 is a diagram showing a coordinate system set in a robot and the coordinate system of a force sensor.
- FIG. 3 is a diagram showing a hardware configuration example of a robot control device and the teaching device.
- FIG. 4 is a functional block diagram of the robot control device and the teaching device.
- FIG. 5A is a diagram showing a first example of arrangement of the force sensor.
- FIG. 5B is a diagram for explaining a case where the position information of the force sensor in FIG. 5A has an error.
- FIG. 6 is a diagram showing an image in which a virtual image of the force sensor is superimposed on the real space or the virtual space when the position information of the force sensor in FIG. 5A has an error.
- FIG. 7A is a diagram showing a second example of arrangement of the force sensor.
- FIG. 7B is a diagram for explaining a case where the position information of the force sensor in FIG. 7A has an error.
- FIG. 8 is a diagram showing an image in which a virtual image of the force sensor is superimposed on the real space or the virtual space when the position information of the force sensor in FIG. 7A has an error.
- FIG. 9 is a diagram showing the equipment configuration of a robot system according to a second embodiment.
- FIG. 10 is a diagram for explaining a case where the position information of the force sensor is erroneous in the robot system of FIG. 9.
- FIG. 11 is a diagram showing an image in which a virtual image of the force sensor is superimposed on the real space or the virtual space when the position information of the force sensor of the robot system of FIG. 9 has an error.
- FIG. 12 is a diagram showing an example of a virtual image having asymmetry for the force sensor.
- FIG. 1 is a diagram showing the configuration of a robot system 100 including a teaching device 30 according to a first embodiment.
- The robot system 100 includes a robot 10, a robot control device 50 that controls the robot 10, and a teaching device 30 for teaching (programming) the robot 10.
- The robot 10 is equipped with a force sensor 21 and is configured to perform various operations (fitting, pressing, phase matching, polishing, deburring, etc.) by force control.
- The force sensor 21 is attached to the flange 11 at the tip of the arm of the robot 10 via a bracket 15, as shown in the circled enlarged view in FIG. 1.
- FIG. 1 shows a configuration in which a hand 41 is attached to the distal end side of the force sensor 21.
- The robot 10 can perform, for example, a fitting operation in which a workpiece W gripped by the hand 41 is fitted into a fitting hole of a target workpiece (not shown).
- The output value (force and moment) of the force sensor 21 is expressed as a value in the coordinate system 201 of the force sensor 21, as shown in FIG. 2.
- Coordinate systems set in the robot 10 include a robot coordinate system 101 set at the base of the robot 10 and a flange coordinate system 102 set at the flange surface (see FIG. 2).
- The robot system 100 therefore retains, as internal data, setting information that defines the relationship between the coordinate system 201 of the force sensor 21 and a coordinate system set in the robot 10.
- The setting information that defines the relationship between the coordinate system 201 of the force sensor 21 and the coordinate system set in the robot 10 is expressed, for example, as the position and orientation of the force sensor 21 (coordinate system 201):
- Force sensor position information (setting information) = (x1, y1, z1, th1, th2, th3)
- Here, x1, y1, and z1 are the X, Y, and Z coordinates of the force sensor 21 in the coordinate system of the robot 10,
- and th1, th2, and th3 are the rotation angles of the force sensor 21 about the X, Y, and Z axes of the coordinate system of the robot 10, respectively.
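- The publication does not specify how these six values are combined internally; the following is a minimal illustrative sketch (not part of the disclosure) of one conventional interpretation, in which the three angles are applied as successive rotations about the X, Y, and Z axes and the result is the pose of the force sensor relative to the robot-side coordinate system.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def setting_to_transform(x, y, z, th1, th2, th3):
    """Build a 4x4 homogeneous transform from the six setting values.

    Assumption (not stated in the publication): th1, th2 and th3 are applied
    as rotations about the X, Y and Z axes of the robot-side frame, in that
    order. The result maps points expressed in the force-sensor coordinate
    system 201 into the robot-side frame (robot coordinate system 101 or
    flange coordinate system 102).
    """
    T = np.eye(4)
    T[:3, :3] = rot_z(th3) @ rot_y(th2) @ rot_x(th1)
    T[:3, 3] = [x, y, z]
    return T

# Display example 1 below: sensor mounted dz ahead of the flange, no rotation.
T_flange_sensor = setting_to_transform(0.0, 0.0, 0.05, 0.0, 0.0, 0.0)
```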
- The position information of the force sensor 21 is entered, for example, via a setting-value input interface of the teaching device 30 and is reflected in the robot control device 50. If there is an error in the position information of the force sensor 21, the robot 10 may not operate as intended, but it is generally difficult to determine that the cause of such misbehavior is incorrect position information.
- To address this, the teaching device 30 superimposes a virtual image representing the force sensor on a real space or a virtual space, based on the setting information defining the position and orientation of the force sensor, so that when there is an error in the setting information the user can grasp the fact quickly, easily, and visually.
- FIG. 3 is a diagram showing a hardware configuration example of the robot control device 50 and the teaching device 30.
- The robot control device 50 may have the configuration of a general computer in which a memory 52 (ROM, RAM, non-volatile memory, etc.), an input/output interface 53, an operation unit 54 including various operation switches, and the like are connected to a processor 51 via a bus.
- The teaching device 30 may have the configuration of a general computer in which a memory 32 (ROM, RAM, non-volatile memory, etc.), a display unit (display) 33, an operation unit 34 composed of an input device such as a keyboard (or software keys), an input/output interface 35, and the like are connected to a processor 31 via a bus.
- The teaching device 30 may further include a camera 36 and an inertial sensor 37.
- The camera 36 is, for example, a camera that captures two-dimensional images, but other types of cameras (such as stereo cameras) may be used.
- The inertial sensor 37 is a sensor (gyro, acceleration sensor, etc.) used to estimate position by odometry.
- The teaching device 30 is connected to the robot control device 50 by wire or wirelessly.
- In this embodiment, the teaching device 30 is assumed to be a tablet terminal, but various information processing devices such as a teach pendant, a smartphone, and a personal computer can be used as the teaching device 30.
- The robot control device 50 includes a storage unit 501 that stores the operation program of the robot, settings related to the coordinate systems, and other various setting information, and an operation control unit 502 that controls the operation of the robot based on the operation program and the various setting information.
- The operation control unit 502 executes force control based on the detection value of the force sensor 21.
- In response to a request from the teaching device 30, the operation control unit 502 also provides the teaching device 30 with position and orientation information of the robot 10 (of the controlled part of the robot) and other information necessary for the teaching device 30 to execute its functions.
- The teaching device 30 has a setting information storage unit 301 that stores the position information (setting information) of the force sensor, a model data storage unit 302 that stores 3D models of the objects constituting the robot system 100 and virtual image data of the force sensor 21, a camera position/orientation estimation unit 303 that estimates the position and orientation of the camera 36, and a virtual image superimposition display unit 304.
- The setting information storage unit 301 may be configured as a non-volatile memory, or may be a temporary buffer allocated within the RAM. The setting information storage unit 301 may also store various other information regarding the settings of the robot 10.
- The virtual image superimposition display unit 304 has a function of superimposing and displaying a virtual image representing the force sensor 21 on a real space including the robot 10 or a predetermined object supporting the force sensor 21, or on a virtual space including a model of the robot 10 or a model of the predetermined object, so that the virtual image assumes the position and orientation according to the setting information. To this end, the virtual image superimposition display unit 304 includes an augmented reality image processing unit 305 having a function of generating augmented reality images, or a virtual reality image processing unit 306 having a function of generating virtual reality images.
- The camera position/orientation estimation unit 303 obtains the position and orientation of the camera 36 (teaching device 30) by the following procedure and tracks the position and orientation of the camera 36 (teaching device 30) in a coordinate system fixed in the work space (the robot coordinate system 101).
- (A1) The camera position/orientation estimation unit 303 acquires the placement position of the robot 10 in the work space (robot coordinate system 101) from the robot control device 50 or from a storage unit in the teaching device 30.
- (A2) The camera position/orientation estimation unit 303 captures, for example, a visual marker attached to the base of the robot 10 with the camera 36 (prompting the user to image the visual marker).
- A visual marker is, for example, a marker known in the art having a visual pattern from which the position and orientation of the camera can be measured from a two-dimensional image of the marker.
- (A3) By image-processing the captured image of the visual marker, the camera position/orientation estimation unit 303 determines the position and orientation of the camera 36 (teaching device 30) in the robot coordinate system and registers them in the robot coordinate system 101.
- (A4) Thereafter, the camera position/orientation estimation unit 303 obtains the movement of the camera 36 (teaching device 30) by odometry based on the output values of the inertial sensor 37 and continuously updates the position and orientation of the camera 36 (teaching device 30) in the robot coordinate system 101.
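- A rough sketch of steps (A1) to (A4) in terms of 4x4 homogeneous transforms is given below. It is only an illustration of how the registration and odometry update could be composed; the marker-detection helper `detect_marker_pose` is hypothetical and not part of the disclosure.

```python
import numpy as np

def invert(T):
    """Invert a rigid 4x4 homogeneous transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# (A1) Pose of the visual marker on the robot base, expressed in the robot
#      coordinate system 101 (obtained from the controller or from storage).
T_robot_marker = np.eye(4)

# (A2)+(A3) Registration: detect the marker in one camera image and derive
#           the camera pose in the robot coordinate system.
def register_camera(image, T_robot_marker, detect_marker_pose):
    T_cam_marker = detect_marker_pose(image)          # hypothetical helper
    return T_robot_marker @ invert(T_cam_marker)      # = T_robot_cam

# (A4) Tracking: propagate the registered pose with the incremental camera
#      motion estimated by odometry from the inertial sensor 37.
def update_camera_pose(T_robot_cam, T_prevcam_curcam):
    return T_robot_cam @ T_prevcam_curcam
```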
- The augmented reality image processing unit 305 has a function of superimposing, on the image (video) captured by the camera 36, a virtual image of the force sensor 21 in the position and orientation according to the position information (setting information) of the force sensor 21, based on the position and orientation of the camera 36 (teaching device 30) obtained from the camera position/orientation estimation unit 303.
- The virtual reality image processing unit 306 has a function of arranging models of the objects constituting the robot system 100 in the virtual space based on their actual arrangement positions, and of superimposing a virtual image of the force sensor 21 in the virtual space in the position and orientation according to the position information (setting information) of the force sensor 21.
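- As an illustration of the composition the augmented reality image processing unit 305 has to perform, the sketch below (an assumed formulation, not the disclosed implementation) combines the configured sensor pose, the current flange pose reported by the controller, and the estimated camera pose to obtain the pose at which the virtual image 21M is rendered.

```python
import numpy as np

def sensor_pose_in_camera(T_robot_flange, T_flange_sensor, T_robot_cam):
    """Pose at which the virtual image of the force sensor is drawn.

    T_robot_flange  : current flange pose from the robot control device 50
    T_flange_sensor : pose built from the setting information
    T_robot_cam     : camera pose from the camera position/orientation
                      estimation unit 303
    """
    T_robot_sensor = T_robot_flange @ T_flange_sensor     # configured sensor pose
    return np.linalg.inv(T_robot_cam) @ T_robot_sensor    # pose in the camera frame

# For virtual reality display (virtual reality image processing unit 306),
# T_robot_sensor is used directly, with a camera-derived or arbitrary viewpoint.
```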
- Display examples in which the virtual image superimposition display unit 304 displays the virtual image of the force sensor 21 according to the position information (setting information) of the force sensor 21 are described below.
- In the following display examples, the position information (setting information) of the force sensor 21 is set with respect to the flange coordinate system 102 set in the robot 10.
- (Display example 1) In display example 1, as shown in FIG. 5A, the force sensor 21 is attached to the flange surface via the bracket 15 so that the central axis of the force sensor 21 and the central axis of the flange 11 are aligned; that is, the Z axis of the flange coordinate system 102 and the Z axis of the coordinate system of the force sensor 21 coincide.
- In this case, the position information (setting information) of the force sensor 21 may be set to (0, 0, dz, 0, 0, 0) as values on the flange coordinate system 102 of the robot 10.
- Assume now that the position information (setting information) of the force sensor 21 is erroneously set to (dx, dy, dz2, 0, 0, 0). In this case, the robot control device 50 assumes that the force sensor 21 is at a position such as that shown in FIG. 5B and obtains the force and moment at the point of action accordingly.
- As shown in FIG. 6, the virtual image superimposition display unit 304 of the teaching device 30 then displays a virtual image (3D model) 21M of the force sensor 21 in the position and orientation according to the erroneous position information (dx, dy, dz2, 0, 0, 0).
- The augmented reality image processing unit 305 performs the function of displaying such an augmented reality image.
- The virtual image 21M of the force sensor 21 is displayed on the display unit 33, within the image of the real space in which the robot 10 appears, in the position and orientation according to the setting information.
- The user can therefore instantly grasp an error in the position information (setting information) of the force sensor 21 from a positional comparison between the robot 10 and the virtual image 21M on the display screen, or between the actual force sensor 21 and the virtual image 21M.
- An image representing the coordinate system 201 of the force sensor 21 may also be superimposed according to the position information (setting information) of the force sensor 21. In this case, the direction (orientation) of the position information (setting information) of the force sensor 21 becomes easier to recognize visually, and an error in the setting information can be recognized more accurately.
- Alternatively, an image representing the coordinate system 201 of the force sensor 21 may be superimposed as the virtual image of the force sensor 21, instead of the virtual image 21M of the force sensor 21.
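- What the user perceives in display example 1 is the offset between the virtual image 21M drawn from the erroneous setting and the real force sensor 21. A tiny numerical sketch with made-up values (purely illustrative, not taken from the publication):

```python
import numpy as np

# Correct mounting in display example 1: 50 mm ahead of the flange, no rotation.
correct_setting = np.array([0.000, 0.000, 0.050, 0.0, 0.0, 0.0])
# Hypothetical erroneous entry (dx, dy, dz2, 0, 0, 0).
wrong_setting   = np.array([0.020, -0.010, 0.080, 0.0, 0.0, 0.0])

offset = wrong_setting[:3] - correct_setting[:3]
print("virtual image 21M is displaced from the real sensor by", offset, "m")
# -> [ 0.02 -0.01  0.03] : a displacement the user can spot at a glance.
```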
- (Display example 2) Display example 2 is an example in which, as shown in FIG. 7A, the force sensor 21 is attached to the flange 11 via a bracket 15A having a shape that is relatively long in the horizontal direction in the figure.
- In this case, the central axis of the force sensor 21 does not coincide with the central axis of the flange 11, and the force sensor 21 is attached to the bracket 15A offset in the horizontal direction (Y-axis direction) in the figure.
- In this example, the position information (setting information) of the force sensor 21 may be set to (Dx, Dy, Dz, 0, 0, 0) as values on the flange coordinate system 102 of the robot 10.
- Assume now that the position information (setting information) of the force sensor 21 is erroneously set to (Dx2, Dy2, Dz2, th1, th2, th3), where Dx2 ≠ Dx, Dy2 ≠ Dy, Dz2 ≠ Dz, th1 ≠ 0, th2 ≠ 0, and th3 ≠ 0.
- In this case, the robot control device 50 assumes that the force sensor 21 is at the position shown in FIG. 7B, for example, and obtains the force and moment at the point of action of the force accordingly.
- As shown in FIG. 8, the virtual image superimposition display unit 304 (augmented reality image processing unit 305) of the teaching device 30 then displays a virtual image (3D model) 21M of the force sensor 21 in the position and orientation according to the erroneous position information (Dx2, Dy2, Dz2, th1, th2, th3).
- Since the virtual image 21M of the force sensor 21 is displayed on the display unit 33, within the image of the real space in which the robot 10 appears, in the position and orientation according to the setting information, the user can instantly grasp an error in the position information (setting information) of the force sensor 21 from a positional comparison between the robot 10 (or the bracket 15A) and the virtual image 21M, or between the actual force sensor 21 and the virtual image 21M.
- An image representing the coordinate system 201 of the force sensor 21 may also be superimposed according to the position information (setting information) of the force sensor 21. In this case, the direction (orientation) of the position information (setting information) of the force sensor 21 becomes easier to recognize visually, and an error in the setting information can be recognized more accurately.
- Alternatively, an image representing the coordinate system 201 of the force sensor 21 may be superimposed as the virtual image of the force sensor 21, instead of the virtual image 21M of the force sensor 21.
- Display examples 1 and 2 described above are based on augmented reality, in which the virtual image 21M of the force sensor 21 is superimposed on an image of the real space, but such images may also be displayed as virtual reality images.
- The virtual reality image processing unit 306 performs the function of such virtual reality display. In this case, the virtual reality image processing unit 306 generates an image in which the objects constituting the robot system 100 are arranged in the virtual space based on their actual arrangement information, and superimposes the virtual image 21M of the force sensor 21 so that it assumes the position and orientation according to the position information (setting information) of the force sensor 21 in the virtual space. In this case as well, the user can instantly grasp an error in the position information (setting information) of the force sensor 21 from a positional comparison between the model of the robot 10 (or the model of the bracket) and the virtual image 21M.
- Note that the camera position obtained by the camera position/orientation estimation unit 303 may be used as the viewpoint position when generating the virtual reality image, or the viewpoint may be fixed at an arbitrary position within the work space.
- In this case, the teaching device 30 may omit the camera 36, the inertial sensor 37, and the camera position/orientation estimation unit 303.
- FIG. 9 is a diagram showing the configuration of a robot system 100A according to the second embodiment.
- Whereas the force sensor 21 was mounted on the robot 10 in the first embodiment, in the second embodiment a force sensor 22 is mounted on a workbench 81 via a pedestal 82. That is, in the first embodiment the object supporting the force sensor 21 is the robot 10, while in the second embodiment the object supporting the force sensor 22 is the workbench 81 (or the pedestal 82).
- The robot system 100A includes the robot 10, the robot control device 50, and the teaching device 30. The functions of the robot control device 50 and the teaching device 30 are equivalent to those of the first embodiment.
- A hand 41 is attached to the flange 11 at the tip of the arm of the robot 10. The hand 41 grips a workpiece W1. The robot 10 performs the operation of fitting the workpiece W1 into a fitting hole in a workpiece W2 that is fixed to the workbench 81 via the force sensor 22.
- In the second embodiment, the position information (setting information) of the force sensor 22 is set, for example, as a position and orientation in the robot coordinate system 101.
- Here, the correct position and orientation of the force sensor 22 are assumed to be (x1, y1, z1, 0, 0, 0).
- Assume now that the position information (setting information) of the force sensor 22 is erroneously set to (x2, y2, z2, th1, th2, th3), where x2 ≠ x1, y2 ≠ y1, z2 ≠ z1, th1 ≠ 0, th2 ≠ 0, and th3 ≠ 0.
- In this case, the robot control device 50 assumes that the force sensor 22 is positioned as shown in FIG. 10 and obtains the force and moment at the point of action accordingly.
- As shown in FIG. 11, the virtual image superimposition display unit 304 (augmented reality image processing unit 305) of the teaching device 30 then displays a virtual image (3D model) 22M of the force sensor 22 in the position and orientation according to the erroneous position information (x2, y2, z2, th1, th2, th3).
- The virtual image 22M of the force sensor 22 is displayed, in the position and orientation according to the setting information, within the image of the real space in which the workbench 81 (or the pedestal 82), which is the predetermined object supporting the force sensor 22, the robot 10, and the like appear.
- The user can therefore instantly grasp an error in the position information (setting information) of the force sensor 22 from a positional comparison between the workbench 81 (or the pedestal 82) and the virtual image 22M on the display screen, or between the actual force sensor 22 and the virtual image 22M.
- An image representing the coordinate system 201 of the force sensor 22 may also be superimposed together with the virtual image 22M of the force sensor 22. In this case, the direction (orientation) of the position information (setting information) of the force sensor 22 becomes easier to recognize visually, and an error in the setting information can be recognized more accurately.
- Alternatively, an image representing the coordinate system 201 of the force sensor 22 may be superimposed and displayed as the virtual image of the force sensor 22.
- The display example described above is based on augmented reality, in which the virtual image 22M of the force sensor 22 is superimposed on an image of the real space, but it may also be displayed as a virtual reality image.
- In that case, the virtual reality image processing unit 306 generates an image in which the objects constituting the robot system 100A are arranged in the virtual space based on their actual arrangement information, and superimposes the virtual image 22M so that it assumes the position and orientation according to the position information (setting information) of the force sensor 22 in the virtual space. In this case as well, the user can instantly grasp an error in the position information (setting information) of the force sensor 22 from a positional comparison between the model of the workbench 81 (or the model of the pedestal 82) and the virtual image 22M.
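- Because the setting information of the second embodiment is expressed directly in the robot coordinate system 101, the rendering pose of the virtual image 22M does not involve the flange pose at all. A minimal sketch under the same assumptions as the earlier sketches:

```python
import numpy as np

def table_sensor_pose_in_camera(T_robot_sensor, T_robot_cam):
    """Second embodiment: the configured pose of force sensor 22 is already
    expressed in the robot coordinate system 101, so the pose used to render
    the virtual image 22M in the camera image is obtained directly, without
    the flange transform needed in the first embodiment."""
    return np.linalg.inv(T_robot_cam) @ T_robot_sensor
```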
- As described above, according to each of the embodiments, the user can quickly and easily grasp visually, from the virtual image of the force sensor superimposed on the real space or the virtual space, whether there is an error in the set position and orientation of the force sensor.
- In the embodiments described above, a 3D model of the force sensor and an image of the coordinate system of the force sensor were shown as examples of the virtual image of the force sensor, but various other forms of the virtual image of the force sensor are possible.
- FIG. 12 shows another example of the virtual image of the force sensor.
- The virtual image 23M of the force sensor shown in FIG. 12 is an example of a virtual image having asymmetry. Giving the virtual image of the force sensor asymmetry makes it easier to recognize an error in the orientation setting.
- The shape of the virtual image 23M is set so as to have a connector 25 and a connector mounting portion 26 on the side surface of a cylindrical body portion 24.
- As a result, the virtual image 23M has a shape that is rotationally asymmetric also about the Z axis.
- In addition, marks 211, 212, and 213 indicating the positions of the respective axes of the force sensor coordinate system may be formed on the virtual image 23M.
- The mark 211 is represented as "X-", indicating that the side on which the mark 211 is located with respect to the origin set on the central axis is the negative side of the X axis.
- The mark 212 is represented as "Y+", indicating that the side on which the mark 212 is located with respect to the origin is the positive side of the Y axis.
- The mark 213 is a display in which "Z-" and "Z+" are arranged so that "Z+" is on the lower side, indicating that the lower side along the central axis is the positive side of the Z axis.
- Such marks 211, 212, and 213 make it possible to recognize the coordinate system set in the force sensor, which makes it easier to recognize an error in the orientation of the virtual image 23M.
- FIG. 12 also illustrates the coordinate system 201 designated by the marks 211, 212, and 213.
- In the embodiments described above, the augmented reality image or the virtual reality image is displayed on the display device of the teaching device configured as a tablet terminal, but such images may also be displayed on a wearable display device.
- Alternatively, a glasses-type AR display device that superimposes a virtual image on the real scene may be used.
- The functional blocks of the teaching device shown in FIG. 4 may be realized by the processor of the teaching device executing various kinds of software stored in a storage device, or may be realized by a configuration based mainly on hardware such as an ASIC (Application Specific Integrated Circuit).
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Numerical Control (AREA)
- Manipulator (AREA)
Abstract
Description
FIG. 1 is a diagram showing the equipment configuration of a robot system 100 including a teaching device 30 according to the first embodiment. The robot system 100 includes a robot 10, a robot control device 50 that controls the robot 10, and a teaching device 30 for teaching (programming) the robot 10.
Force sensor position information (setting information) = (x1, y1, z1, th1, th2, th3)
Here, x1, y1, and z1 are the X, Y, and Z coordinates of the force sensor 21 in the coordinate system of the robot 10, and th1, th2, and th3 are the rotation angles of the force sensor 21 about the X, Y, and Z axes of the coordinate system of the robot 10, respectively.
(A1) The camera position/orientation estimation unit 303 acquires the placement position of the robot 10 in the work space (robot coordinate system 101) from the robot control device 50 or from a storage unit in the teaching device 30.
(A2) The camera position/orientation estimation unit 303 captures, for example, a visual marker attached to the base of the robot 10 with the camera 36 (prompting the user to image the visual marker). A visual marker is, for example, a marker known in the art having a visual pattern from which the position and orientation of the camera can be measured from a two-dimensional image of the marker.
(A3) By image-processing the captured image of the visual marker, the camera position/orientation estimation unit 303 determines the position and orientation of the camera 36 (teaching device 30) in the robot coordinate system and registers them in the robot coordinate system 101.
(A4) Thereafter, the camera position/orientation estimation unit 303 obtains the movement of the camera 36 (teaching device 30) by odometry based on the output values of the inertial sensor 37 and continuously updates the position and orientation of the camera 36 (teaching device 30) in the robot coordinate system 101.
In display example 1, as shown in FIG. 5A, the force sensor 21 is attached to the flange surface via the bracket 15 so that the central axis of the force sensor 21 and the central axis of the flange 11 are aligned; that is, the Z axis of the flange coordinate system 102 and the Z axis of the coordinate system of the force sensor 21 coincide. In this case, the position information (setting information) of the force sensor 21 may be set to (0, 0, dz, 0, 0, 0) as values on the flange coordinate system 102 of the robot 10.
Display example 2 is an example in which, as shown in FIG. 7A, the force sensor 21 is attached to the flange 11 via a bracket 15A having a shape that is relatively long in the horizontal direction in the figure. In this case, the central axis of the force sensor 21 does not coincide with the central axis of the flange 11, and the force sensor 21 is attached to the bracket 15A offset in the horizontal direction (Y-axis direction) in the figure. In this example, the position information (setting information) of the force sensor 21 may be set to (Dx, Dy, Dz, 0, 0, 0) as values on the flange coordinate system 102 of the robot 10.
The second embodiment is an example in which the arrangement position of the force sensor differs from that of the first embodiment described above. FIG. 9 is a diagram showing the configuration of a robot system 100A according to the second embodiment. In the first embodiment, the force sensor 21 was mounted on the robot 10, whereas in the second embodiment the force sensor 22 is mounted on the workbench 81 via a pedestal 82. That is, in the first embodiment the object supporting the force sensor 21 is the robot 10, while in the second embodiment the object supporting the force sensor 22 is the workbench 81 (or the pedestal 82).
11 Flange
15, 15A Bracket
21, 22 Force sensor
21M, 22M, 23M Virtual image
24 Body portion
25 Connector
26 Connector mounting portion
30 Teaching device
31 Processor
32 Memory
33 Display unit
34 Operation unit
35 Input/output interface
36 Camera
37 Inertial sensor
50 Robot control device
51 Processor
52 Memory
53 Input/output interface
54 Operation unit
81 Workbench
82 Pedestal
100, 100A Robot system
101 Robot coordinate system
102 Flange coordinate system
201 Coordinate system
211, 212, 213 Marks
501 Storage unit
502 Operation control unit
301 Setting information storage unit
302 Model data storage unit
303 Camera position/orientation estimation unit
304 Virtual image superimposition display unit
305 Augmented reality image processing unit
306 Virtual reality image processing unit
Claims (5)
- A teaching device comprising: a setting information storage unit that stores setting information defining a position and orientation of a force sensor with respect to a coordinate system set in a robot; and a virtual image superimposition display unit that superimposes and displays a virtual image representing the force sensor on a real space including the robot or a predetermined object supporting the force sensor, or on a virtual space including a model of the robot or a model of the predetermined object, such that the virtual image assumes the position and orientation according to the setting information.
- The teaching device according to claim 1, wherein the virtual image is an image of a 3D model of the force sensor.
- The teaching device according to claim 1 or 2, wherein the virtual image has asymmetry.
- The teaching device according to claim 3, wherein the virtual image is rotationally asymmetric about a predetermined coordinate axis of a coordinate system set in the force sensor.
- The teaching device according to any one of claims 1 to 4, wherein the virtual image includes an image representing a coordinate system set in the force sensor.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280014336.6A CN116867619A (zh) | 2021-02-18 | 2022-02-16 | 示教装置 |
DE112022000381.2T DE112022000381T5 (de) | 2021-02-18 | 2022-02-16 | Einlernvorrichtung |
JP2023500910A JPWO2022176928A1 (ja) | 2021-02-18 | 2022-02-16 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-024527 | 2021-02-18 | ||
JP2021024527 | 2021-02-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022176928A1 true WO2022176928A1 (ja) | 2022-08-25 |
Family
ID=82932256
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/006253 WO2022176928A1 (ja) | 2021-02-18 | 2022-02-16 | 教示装置 |
Country Status (5)
Country | Link |
---|---|
JP (1) | JPWO2022176928A1 (ja) |
CN (1) | CN116867619A (ja) |
DE (1) | DE112022000381T5 (ja) |
TW (1) | TW202233368A (ja) |
WO (1) | WO2022176928A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024009767A1 (ja) * | 2022-07-08 | 2024-01-11 | ソニーグループ株式会社 | キャリブレーション実行装置、キャリブレーション・システム、および方法、並びにプログラム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017104944A (ja) * | 2015-12-10 | 2017-06-15 | ファナック株式会社 | 仮想物体の画像をロボットの映像に重畳表示する映像表示装置を備えるロボットシステム |
JP2019081242A (ja) * | 2017-10-31 | 2019-05-30 | セイコーエプソン株式会社 | シミュレーション装置、制御装置およびロボット |
JP2020123049A (ja) * | 2019-01-29 | 2020-08-13 | セイコーエプソン株式会社 | 寸法単位設定方法、情報処理装置、及びロボットシステム |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5338297B2 (ja) | 2008-12-19 | 2013-11-13 | 株式会社安川電機 | ロボットの制御装置 |
-
2022
- 2022-01-24 TW TW111102938A patent/TW202233368A/zh unknown
- 2022-02-16 WO PCT/JP2022/006253 patent/WO2022176928A1/ja active Application Filing
- 2022-02-16 DE DE112022000381.2T patent/DE112022000381T5/de active Pending
- 2022-02-16 JP JP2023500910A patent/JPWO2022176928A1/ja active Pending
- 2022-02-16 CN CN202280014336.6A patent/CN116867619A/zh active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017104944A (ja) * | 2015-12-10 | 2017-06-15 | ファナック株式会社 | 仮想物体の画像をロボットの映像に重畳表示する映像表示装置を備えるロボットシステム |
JP2019081242A (ja) * | 2017-10-31 | 2019-05-30 | セイコーエプソン株式会社 | シミュレーション装置、制御装置およびロボット |
JP2020123049A (ja) * | 2019-01-29 | 2020-08-13 | セイコーエプソン株式会社 | 寸法単位設定方法、情報処理装置、及びロボットシステム |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024009767A1 (ja) * | 2022-07-08 | 2024-01-11 | ソニーグループ株式会社 | キャリブレーション実行装置、キャリブレーション・システム、および方法、並びにプログラム |
Also Published As
Publication number | Publication date |
---|---|
CN116867619A (zh) | 2023-10-10 |
DE112022000381T5 (de) | 2023-09-28 |
JPWO2022176928A1 (ja) | 2022-08-25 |
TW202233368A (zh) | 2022-09-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105666505B (zh) | 具备扩展现实对应显示器的机器人系统 | |
EP3552077B1 (en) | Systems and methods for tracking motion and gesture of heads and eyes | |
JP5850962B2 (ja) | ビジュアルフィードバックを利用したロボットシステム | |
CN106945007B (zh) | 机器人系统、机器人、以及机器人控制装置 | |
JP5471355B2 (ja) | 3次元視覚センサ | |
Lambrecht et al. | Spatial programming for industrial robots based on gestures and augmented reality | |
WO2015045834A1 (ja) | マーカ画像処理システム | |
JP2021000678A (ja) | 制御システムおよび制御方法 | |
JP6445092B2 (ja) | ロボットの教示のための情報を表示するロボットシステム | |
JP2018167334A (ja) | 教示装置および教示方法 | |
JP2018144144A (ja) | 画像処理装置、画像処理方法、及びコンピュータプログラム | |
JP6885856B2 (ja) | ロボットシステムおよびキャリブレーション方法 | |
JP5093058B2 (ja) | ロボットの座標の結合方法 | |
JP2009269134A (ja) | 視覚検査装置のシミュレーション装置 | |
WO2022176928A1 (ja) | 教示装置 | |
JP2006026790A (ja) | 教示モデル生成装置 | |
JP2023038776A (ja) | 指令値生成装置、方法、及びプログラム | |
JP7439410B2 (ja) | 画像処理装置、画像処理方法およびプログラム | |
EP4261636A1 (en) | Processing device, processing system, head mounted display, processing method, program, and storage medium | |
WO2019093299A1 (ja) | 位置情報取得装置およびそれを備えたロボット制御装置 | |
WO2022092168A1 (ja) | ロボット制御装置及びロボットシステム | |
- US20240316756A1 | Teaching device | |
Niu et al. | Eye-in-hand manipulation for remote handling: Experimental setup | |
JP2022163836A (ja) | ロボット画像の表示方法、コンピュータープログラム、及び、ロボット画像の表示システム | |
JP2003203252A (ja) | 画像表示装置及びその方法並びに記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22756246 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2023500910 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18259779 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112022000381 Country of ref document: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280014336.6 Country of ref document: CN |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22756246 Country of ref document: EP Kind code of ref document: A1 |