JP2013049102A - Robot control device and method of determining robot attitude - Google Patents

Robot control device and method of determining robot attitude

Publication number: JP2013049102A (application JP2011187343A)
Authority: JP (Japan)
Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
A robot control apparatus that makes teaching work easier under certain conditions.
When teaching is performed such that the approach vector a of the hand coordinates of the robot arm, which coincides with the line of sight of a camera, always points at one point of the imaging target, i.e., the target point, the position of the hand is taught, and the approach vector a is obtained from the vector ^{B} P _{W} indicating the target point from the base coordinate origin and the vector ^{B} P _{T} indicating the hand. The orientation vector o is determined with its norm set to “1.0”, and the normal vector n is then determined by the outer product of the approach vector a and the orientation vector o. The rotation matrix ^{B} R _{T} indicating the posture of the hand at the teaching point is obtained from the rotation matrix ^{B} R _{W} indicating the known posture of the work coordinate system and the inverse of the rotation matrix ^{T} R _{W} described by the normal, orientation, and approach vectors, and the posture of the hand determined by the rotation matrix ^{B} R _{T} is set as the posture at the teaching point.
[Selection] Figure 1
Description
The present invention relates to a robot control apparatus and a robot posture determination method for teaching a vertical articulated robot so that an approach vector in hand coordinates always faces a target point.
When teaching a robot, in general, a position along a trajectory for moving the hand of the robot is taught, and a posture to be taken by the robot at the teaching point is also taught. For example, Patent Document 1 discloses a technique related to teaching and subsequent interpolation.
When a camera is attached to the hand of a robot and moved to capture an image of a workpiece, the robot's posture during teaching is naturally constrained: the camera's line of sight must be directed so as to capture the workpiece. For this purpose, one approach is to set the teaching points as close together as possible, manually set the coordinates and posture of the hand at each teaching point, and interpolate between the teaching points in the same manner as in the prior art. Alternatively, the teaching points can be separated by a certain distance, the coordinates and posture of the hand again set manually at each point, and the interpolation between teaching points performed by an external TCP (Tool Center Point) operation, in the form of fixing the tool and gripping and manipulating the workpiece.
In any of the above methods, coordinates and postures must be determined for each teaching point. The determination of coordinates is performed in the base (basic) coordinate system, which is the coordinate system of the robot body; increasing the number of teaching points therefore takes time and effort, but the work itself is not difficult. The task of determining the posture, however, is performed in the coordinate system of the robot's hand, and since the x-y-z directions change with the orientation of the hand, the posture is difficult to grasp and each such task is very difficult to perform.
The present invention has been made in view of the above circumstances, and its object is to provide a robot control device and a robot posture determination method that make it easier to teach the posture of the hand for the task of imaging a workpiece with a camera attached to the hand of the robot.
According to the robot control apparatus of the first aspect, because the workpiece is imaged by the camera attached to the hand of the robot, it is assumed that teaching is performed on the vertical articulated robot so that the approach vector in the hand coordinates always faces the target point. In this case, when the position of the hand is taught, the first vector determining means obtains the approach vector from a vector indicating the target point from the base coordinate origin of the robot and a vector indicating the hand (teaching point) from the base coordinate origin.
The second vector determining means takes one of the orientation vector and the normal vector of the hand coordinates as the second vector, sets the norm of the second vector to “1.0”, and determines the coordinate values of the second vector based on the orthonormality conditions. The third vector determining means then determines the remaining third vector by the outer product of the approach vector and the second vector. At this stage, the normal, orient, and approach vectors of the hand coordinates have all been obtained.
The hand posture determining means then determines the posture of the hand at the teaching point from the rotation matrix indicating the known posture of the work coordinate system with respect to the base coordinate system and the inverse of the rotation matrix described by the normal, orient, and approach vectors, and the teaching posture setting means automatically sets the hand posture so determined as the posture at the teaching point. In other words, to set a teaching point so that the line of sight of the camera attached to the hand of the robot always faces the workpiece, the user only has to input the coordinates giving the position in the base coordinate system, so the user's workload can be greatly reduced. Since the posture of the robot's hand at each teaching point is calculated and set automatically, the operator does not need to teach the posture of the hand at each teaching point, and the working time can be greatly shortened.
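The chain of means described for the first aspect can be sketched numerically. The following is a minimal illustration, not the patent's implementation: the function name, the reference axis used to pick one admissible second vector, and the right-handed n = o × a convention are all assumptions.

```python
import numpy as np

def hand_posture(p_target, p_hand):
    """Sketch of the first aspect: approach vector from the two position
    vectors, one admissible orientation (second) vector, normal (third)
    vector by outer product, assembled into a hand-posture rotation matrix."""
    # First vector: approach vector along the camera line of sight
    a = np.asarray(p_target, float) - np.asarray(p_hand, float)
    a /= np.linalg.norm(a)
    # Second vector: any unit vector orthogonal to a; the reference axis
    # used to pick one is an arbitrary (assumed) choice
    ref = np.array([0.0, 0.0, 1.0]) if abs(a[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    o = ref - np.dot(ref, a) * a      # inner product a . o = 0
    o /= np.linalg.norm(o)            # norm set to "1.0"
    # Third vector by outer product; right-handed n-o-a frame assumed
    n = np.cross(o, a)
    # Posture of the hand: columns are the normal, orient, approach vectors
    return np.column_stack([n, o, a])
```

For a hand at (0, 0, 1) looking at a target at the origin, the approach column of the returned matrix points from the hand toward the target, and the matrix is orthonormal.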
According to the robot control apparatus of the second aspect, teaching is assumed to be performed under the same conditions as in the first aspect. When the position of the hand is taught, the first vector determining means determines, as the approach vector of the hand coordinates, the normal vector standing at the teaching point obtained by the outer product of any two vectors lying in a plane parallel to the imaging surface of the camera at that point. Once the approach vector is determined, the second and third vector determining means, the hand posture determining means, and the teaching posture setting means operate in the same manner as in the first aspect, so that the posture of the robot's hand at the teaching point is set automatically. The operator therefore need not teach the posture of the hand at each teaching point, and the working time can be greatly shortened.
According to the robot control apparatus of the third aspect, when the interpolation means interpolates between two teaching points to determine one or more interpolation points, the first to third vector determining means also determine the approach vector, the second vector, and the third vector for each interpolation point, and the hand posture determining means also obtains the hand posture at each interpolation point. Even when the movement locus of the robot's hand is interpolated, therefore, the hand posture at each interpolation point can be set automatically.
(First embodiment)
Hereinafter, a first embodiment of the present invention will be described with reference to FIGS. FIG. 5 shows the configuration of a control system including a vertical articulated (6-axis) robot. The robot body 1 has a 6-axis arm on a base (rotating shaft) 2, and a tool such as a hand (not shown), a camera described later, and the like are attached to the tip of the arm. A first arm 3 is rotatably connected to the base 2 via a first joint J1. The lower end of a second arm 4 extending upward is rotatably connected to the first arm 3 via a second joint J2, and a third arm 5 is rotatably connected to the tip of the second arm 4 via a third joint J3.
A fourth arm 6 is rotatably connected to the tip of the third arm 5 via a fourth joint J4, a fifth arm 7 is rotatably connected to the tip of the fourth arm 6 via a fifth joint J5, and a sixth arm 8 is rotatably connected to the fifth arm 7 via a sixth joint J6. At each of the joints J1 to J6, the corresponding arm 3 to 8 is rotationally driven by a servomotor (not shown).
A connection cable 12 connects the robot body 1 to a control device (controller; first to third vector determining means, hand posture determining means, teaching posture setting means, and interpolation means) 11. The servomotors that drive the axes of the robot body 1 are thereby controlled by the control device 11. The pad controller (input means) 13 has gripping portions 15L and 15R at both ends of a horizontally long main body 14 for the user to hold with the left and right hands, and joypads 16L and 16R that can be pressed with a thumb while the gripping portions 15L and 15R are held are disposed on it. In addition, buttons 17L and 17R that can be pressed with an index finger are arranged above the gripping portions 15L and 15R in the drawing.
A camera (imaging means) 21 using a CCD (Charge Coupled Device) or a CMOS image sensor is attached to the arm 8 of the robot body 1, and the imaging target (workpiece) 22 is captured as an image by the camera 21. A personal computer 23 is connected to the control device 11 via a cable 24.
The personal computer 23 includes a storage device (storage means) such as a memory or a hard disk. The pad controller 13 is connected to the personal computer 23 via a connection cable 18 and performs high-speed data transfer with it through a communication interface. Information such as operation signals input by operating the joypads 16L and 16R is transmitted from the pad controller 13 to the control device 11 via the personal computer 23. The camera 21 is connected to the personal computer 23 via a cable 25, and image data captured by the camera 21 is transmitted to the personal computer 23 and displayed on its display 23D.
As shown in FIG. 6, the control device 11 includes a CPU 31 as a control unit, a drive circuit 32 as drive means for driving the motor 30 of each joint, a detection circuit 33, and the like. The CPU 31 is connected to a ROM 34 storing a system program for the entire robot, a RAM 35 storing an operation program for the robot body 1, and an interface 36 for connecting a teaching pendant. In FIG. 6, the shoulder portion 5, the lower arm 6, the upper arm 7, and the wrist 8 are shown as a single block forming the movable portion, and accordingly only one motor 30, the driving source of these joints, is shown.
The detection circuit 33 detects the current position (rotation angle) and current speed (rotation speed) of each joint. A rotary encoder 37 provided on the motor 30 that drives each joint is connected to the detection circuit 33. The rotary encoder 37 serves as both a position sensor and a speed sensor: it outputs a pulse signal corresponding to the rotation angle of each motor 30, and the pulse signal is fed to the detection circuit 33. Based on the pulse signal from each rotary encoder 37, the detection circuit 33 detects the current position of each motor 30, and hence of each joint; based on the number of pulses received from each rotary encoder 37 per unit time, it detects the current speed of each motor 30, and hence of each joint; and it supplies the position and speed information to the drive circuit 32 of each motor 30 and to the CPU 31.
Each drive circuit 32 compares the position and speed command values given from the CPU 31 with the current position and current speed given from the detection circuit 33, and supplies each motor 30 with a current corresponding to the deviation to drive it. As a result, various operations are performed as the central portion of a flange (not shown), which is the tip (hand) of the robot arm, follows a path passing sequentially through the command positions.
The movement path of the robot arm tip is given by a teaching operation performed using the pad controller 13 (or a teaching pendant (not shown)). In this teaching work, a plurality of positions on the locus that the robot arm tip should follow are taught in sequence as command positions. In the conventional teaching operation, the posture of the robot arm tip at each command position is also taught; in this embodiment, the posture is determined as described later. The taught command positions and the determined postures are stored in the RAM 35. In actual robot work, the control device 11 sets a curve that smoothly interpolates between the given command positions and controls the robot so that the tip of the robot arm moves along that curve.
The position of the tip of the robot arm is indicated by where the origin of the three-dimensional coordinate system (tool coordinates) fixed to the flange lies in the robot coordinates (base coordinates). The posture of the tip of the robot arm is defined by the orientation, in the robot coordinates, of the three axis unit vectors (normal, orient, and approach vectors) of the tool coordinates fixed to the flange.
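As a concrete illustration of this definition (the column layout is an assumption, though it is the common n-o-a convention), the posture can be held as a 3×3 rotation matrix whose columns are the three unit vectors:

```python
import numpy as np

# Posture of the arm tip as a rotation matrix whose columns are the
# normal, orient, and approach unit vectors expressed in base coordinates.
n = np.array([1.0, 0.0, 0.0])
o = np.array([0.0, 1.0, 0.0])
a = np.array([0.0, 0.0, 1.0])
R = np.column_stack([n, o, a])

# Orthonormality: the three axes are mutually perpendicular unit vectors,
# so R^T R = I.
assert np.allclose(R.T @ R, np.eye(3))
```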
As shown in FIG. 5, when the imaging target 22 is captured by the camera 21 attached to the hand, which is the tip of the robot arm, it is normal to teach the posture of the hand so that the line of sight of the camera 21 always faces the imaging target (subject) 22. Under such a situation, therefore, the posture of the hand can be determined geometrically without the operator teaching it.
FIG. 1 shows the relationship between the position and posture of the hand and camera 21 (tool coordinates) and the position and posture of the imaging target 22 (work coordinates) with reference to the origin of the base coordinates, which are the coordinates of the robot body 1. FIG. 2 is a flowchart of the processing for determining the posture of the hand. The position of the work coordinates viewed from the base coordinates is indicated by the vector ^{B} P _{W} and their posture by the rotation matrix ^{B} R _{W} (the notation of vectors and rotation matrices differs from that in the figures owing to format restrictions); these are known because the imaging target 22 is fixed. The position of the tool coordinates viewed from the base coordinates is indicated by the vector ^{B} P _{T} and their posture by the rotation matrix ^{B} R _{T} ; the vector ^{B} P _{T} is determined by teaching the position of the hand (step S1).
Since the line of sight of the camera 21 faces the imaging target 22 and coincides with the direction indicated by the approach vector a (first vector) of the tool coordinates, the vector a = (a _{x} , a _{y} , a _{z} ) is calculated from the relationship between the coordinate systems shown in FIG. 1 by equation (1) (step S2, first vector determining means):

a = ( ^{B} P _{W} − ^{B} P _{T} ) / ‖ ^{B} P _{W} − ^{B} P _{T} ‖ (1)
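Step S2 amounts to normalizing the difference of the two taught position vectors. A minimal sketch (function and argument names are illustrative, not from the patent):

```python
import numpy as np

def approach_vector(bP_W, bP_T):
    """Equation (1): unit vector from the hand (teaching point) toward the
    target point, both expressed in base coordinates."""
    d = np.asarray(bP_W, float) - np.asarray(bP_T, float)
    norm = np.linalg.norm(d)
    if norm == 0.0:
        raise ValueError("teaching point coincides with the target point")
    return d / norm
```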
Here, since the rotation matrix ^{T} R _{W} relates to the posture of the tool coordinates itself, it is described by the normal vector n, the orient vector o, and the approach vector a of the tool coordinates.
The normal, orient, and approach vectors n, o, and a satisfy the orthonormality conditions. Therefore, the inner product of the approach vector a and the orient vector o is zero,
a _{x} o _{x} + a _{y} o _{y} + a _{z} o _{z} = 0 (4)
and, with the norm of the orient vector o taken as unity,
o _{x} ^{2} + o _{y} ^{2} + o _{z} ^{2} = 1 (5)
Therefore, if, for example, the norm of the orient vector o is fixed at “1.0” (unit length), the orient vector o (second vector) is determined from equations (4) and (5) (step S3, second vector determining means).
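One way to realize step S3 is sketched below. Note that equations (4) and (5) are two equations in three unknowns, so a one-parameter family of solutions remains; projecting a fixed reference axis onto the plane orthogonal to a, as done here, is one assumed way of picking a single solution and is not specified by the text.

```python
import numpy as np

def orient_vector(a):
    """One solution of equations (4) and (5): a . o = 0 and ||o|| = 1.
    The reference-axis projection used to choose among the admissible
    solutions is an illustrative assumption."""
    a = np.asarray(a, float)
    ref = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(a, ref)) > 0.9:   # a nearly parallel to z: switch reference
        ref = np.array([1.0, 0.0, 0.0])
    o = ref - np.dot(ref, a) * a    # remove the component along a -> eq. (4)
    return o / np.linalg.norm(o)    # normalize -> eq. (5)
```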
Then the normal vector n (third vector) is obtained as the outer product of the approach vector a and the orient vector o (step S4, third vector determining means):

n = a × o
Since the normal, orient, and approach vectors n, o, and a are determined in steps S2 to S4, the rotation matrix ^{T} R _{W} is determined (step S5), and the rotation matrix ^{B} R _{T} giving the posture of the hand is calculated by equation (2) (step S6, hand posture determining means):

^{B} R _{T} = ^{B} R _{W} ( ^{T} R _{W} ) ^{−1} (2)

With the above processing, once the hand position has been determined by teaching, the hand posture is determined automatically by calculation.
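Steps S5 and S6 can be sketched as follows. That ^{T} R _{W} has the n, o, a vectors as its columns is an assumption consistent with the text's description, not an explicit statement of the patent.

```python
import numpy as np

def hand_rotation(R_BW, n, o, a):
    """Assemble ^T R_W from the normal/orient/approach vectors (step S5)
    and obtain the hand posture ^B R_T = ^B R_W (^T R_W)^(-1) (step S6)."""
    R_TW = np.column_stack([n, o, a])
    # For a rotation matrix the inverse equals the transpose, which is
    # cheaper and numerically safer than a general matrix inverse.
    return np.asarray(R_BW, float) @ R_TW.T
```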
Once the position of the hand has been taught and the posture of the hand determined by calculation, interpolation between the plurality of teaching positions (command positions) is performed by a spline curve (interpolating means). Details of the spline interpolation method are disclosed in, for example, Japanese Patent Application Laid-Open No. 2007-42021, and the description thereof is omitted here.
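The spline details are left to the cited publication; as a self-contained stand-in (an assumption, not the patent's method), a Catmull-Rom segment gives an interpolating cubic through consecutive teaching positions:

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, u):
    """One Catmull-Rom segment interpolating between p1 and p2 (u in [0, 1]).
    A simple stand-in for the spline method cited above."""
    u2, u3 = u * u, u * u * u
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * u
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * u2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * u3)

# Four consecutive teaching positions; interpolation points between the
# middle two are sampled along the segment.
P = [np.array(p, float) for p in ([0, 0, 0], [1, 2, 0], [2, 0, 1], [3, 1, 1])]
interp = [catmull_rom(*P, u) for u in np.linspace(0.0, 1.0, 5)]
```

The segment passes exactly through the two middle teaching positions at u = 0 and u = 1, so the hand still visits every taught command position.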
The interpolation performed for the command positions is applied in the same way to the hand posture. FIG. 3 is a flowchart of the posture interpolation process. When teaching points have been set for position and posture (step S11), each teaching point (the tool coordinates viewed from the base coordinates) is coordinate-converted so as to obtain the position and posture of the imaging target 22 viewed from the tool (hand), i.e., the work coordinates viewed from the tool coordinates, in order to move the hand of the robot body 1 around the imaging target 22 (step S12).
Next, each taught posture at each teaching position is specified as coordinate values. Since the taught posture is given as rotation angle values (X-direction, Y-direction, and Z-direction rotation components), it is converted, using a rotation matrix, into spatial coordinate values (X-, Y-, and Z-direction posture coordinate values) (rotation matrix conversion processing, posture coordinate conversion processing means). The rotation matrix corresponding to the posture of the hand at each position is represented by the converted coordinates. That is, this rotation matrix consists of the three-dimensional coordinate values of the normal vector n (X axis), the orient vector o (Y axis), and the approach vector a (Z axis) that specify the hand posture in the base coordinates. A combination of the direction coordinate values of each teaching position and the posture coordinate values at that position corresponds to a movement target position.
Then, virtual points obtained by extending an arbitrary length along each of the X, Y, and Z axes with the teaching position as origin are taken as posture points (amplified X-, Y-, and Z-direction posture coordinate values) (step S13, posture vector amplification processing means). That is, as shown in FIG. 4, if the taught posture is regarded as direction vectors standing with the teaching position (SC) as origin, the posture is specified by determining the three-dimensional coordinate values (SN, SO, SA) of those vectors (only SO and SA are shown in FIG. 4). The “arbitrary length” may be any length that can specify the taught posture, and is, for example, n (> 2) times the unit vector of the coordinate system.
Here, the coordinate values indicating a posture represented by a rotation matrix are values close to zero. The interpolation calculation is performed by the CPU 31 built into the control device 11, which has a limited number of significant digits: if values close to zero are used from the start, the components beyond that limit, which do affect the posture at the teaching point, are swallowed up as calculation error, and the interpolated values cannot be computed so that the posture changes smoothly. To avoid this, the length of each vector is multiplied by n, bringing the values that must vary smoothly within the significant digits of the CPU 31.
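Step S13 can be sketched as follows; the function name and the particular scale factor are illustrative assumptions (the text only requires a factor n > 2).

```python
import numpy as np

def amplified_posture_points(position, R, n_scale=10.0):
    """Posture points: virtual points offset from the teaching position along
    the normal/orient/approach axes, scaled by n_scale so that interpolation
    operates on values well above the CPU's rounding threshold."""
    p = np.asarray(position, float)
    sn = p + n_scale * R[:, 0]   # amplified X-direction (normal) posture point
    so = p + n_scale * R[:, 1]   # amplified Y-direction (orient) posture point
    sa = p + n_scale * R[:, 2]   # amplified Z-direction (approach) posture point
    return sn, so, sa
```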
By the processing of step S13, three posture points per taught posture are determined as three-dimensional coordinate values, and the taught posture can then be interpolated by treating these points in the same manner as the “position”. That is, trajectory processing is performed so that the X, Y, and Z posture points are each connected as a smooth trajectory by a spline curve, like the teaching positions described above (step S14, amplified posture coordinate value connection processing means).
FIG. 4 conceptually shows the process of interpolating the teaching positions and taught postures. To avoid cluttering the illustration, only the Y and Z axes are shown for the posture points, as noted above. FIGS. 4(a) and 4(b) show the teaching positions being curve-interpolated; FIG. 4(c) shows the Y- and Z-direction posture points determined for the taught posture at each teaching position, each arrow corresponding to an orient vector or an approach vector. FIG. 4(d) shows the Y-direction posture points being curve-interpolated, and FIG. 4(e) the Z-direction posture points. Curve interpolation is performed similarly for the X-direction (normal vector n) posture points (not shown).
In step S14, the posture points are interpolated independently in each of the X, Y, and Z directions. However, since the interpolated points in these directions correspond to the respective axes of a three-dimensional coordinate frame, the relationship between them must satisfy the orthonormality conditions. Correction is therefore performed so as to satisfy those conditions (step S15). This completes the taught-posture interpolation process.
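The text does not specify how the step S15 correction is performed; projecting the interpolated frame onto the nearest rotation via an SVD is one common choice and is used here as an assumed sketch.

```python
import numpy as np

def orthonormalize(M):
    """Step S15 correction (assumed method): project an interpolated,
    slightly non-orthonormal 3x3 frame onto the nearest rotation matrix."""
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt
    if np.linalg.det(R) < 0.0:   # keep a proper, right-handed rotation
        U[:, -1] = -U[:, -1]
        R = U @ Vt
    return R
```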
Then, the coordinates of the interpolation points obtained in step S15 are transformed back into the position and posture of the tool viewed from the base (step S16), and the trajectory of the hand (the trajectory of the X-, Y-, and Z-direction posture points with the interpolation points added) is converted into angles of the axes J1 to J6 (step S17, teaching posture determination means). This completes the posture interpolation process.
As described above, according to the present embodiment, the control device 11 performs teaching such that the approach vector a, which coincides with the line of sight of the camera 21 in the hand coordinates of the robot arm, always faces one point of the imaging target 22, the target point. When the position of the hand is taught, the approach vector a is obtained from the vector ^{B} P _{W} indicating the target point from the base coordinate origin of the robot body 1 and the vector ^{B} P _{T} indicating the hand from the base coordinate origin. When the norm of the orient vector o is fixed at “1.0” and its remaining coordinate values are determined from the orthonormality conditions, the normal vector n is determined by the outer product of the approach vector a and the orient vector o.
The rotation matrix ^{B} R _{T} indicating the posture of the hand at the teaching point is obtained from the rotation matrix ^{B} R _{W} indicating the known posture of the work coordinate system with respect to the base coordinate system and the inverse of the rotation matrix ^{T} R _{W} described by the normal, orient, and approach vectors, and the posture of the hand determined by the rotation matrix ^{B} R _{T} is automatically set as the posture at the teaching point.
That is, under the condition that the approach vector a of the hand coordinates always points toward the target point, the posture of the hand of the robot body 1 at a teaching point can be obtained by calculation and set automatically. The operator therefore only has to input the coordinate values of each teaching point in the base coordinate system; there is no need to teach the posture of the hand at each teaching point, and the working time can be greatly shortened. In addition, when the control device 11 interpolates between two teaching points to determine one or more interpolation points, the approach vector a, orient vector o, and normal vector n are also determined for the interpolation points, and the hand posture at each interpolation point is obtained and set automatically, so that the line of sight of the camera 21 keeps capturing the imaging target 22 at every interpolation point.
Conventionally, each teaching point of the hand required both work to give coordinate values indicating a position and work to give a posture. According to the method of this embodiment, the “work of giving a posture” becomes unnecessary, leaving only the work of assigning coordinate values to each teaching point and of assigning the coordinates of the imaging target (workpiece) 22. Given that the “work of giving a posture” is very laborious, the substantial amount of work can be said to be half or less.
(Second embodiment)
FIGS. 7 and 8 show a second embodiment. Parts common to the first embodiment are denoted by the same reference numerals, their description is omitted, and only the differences are described below. In the second embodiment, the approach vector a is determined by a method different from that of the first embodiment. That is, the shape of the camera 21 is known, and a plane parallel to the imaging surface is specified in advance on the outer shape of the camera 21. If any two different vectors contained in this plane are taken as plane vectors s1 and s2, the approach vector a is expressed by equation (7):

a = ( s1 × s2 ) / ‖ s1 × s2 ‖ (7)
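Equation (7) can be sketched as follows; the normalization and the degenerate-input check are assumptions added for robustness (two parallel plane vectors define no unique normal).

```python
import numpy as np

def approach_from_plane(s1, s2):
    """Equation (7): the approach vector as the unit normal of the plane
    parallel to the camera's imaging surface, from two non-parallel
    in-plane vectors s1 and s2."""
    a = np.cross(np.asarray(s1, float), np.asarray(s2, float))
    norm = np.linalg.norm(a)
    if norm == 0.0:
        raise ValueError("s1 and s2 are parallel; the plane is undefined")
    return a / norm
```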
In the flowchart of FIG. 8, once the approach vector a is determined by equation (7) in step S2′ (first vector determining means), which replaces step S2, the rotation matrix ^{B} R _{T} of the hand posture can thereafter be calculated exactly as in the first embodiment.
As described above, according to the second embodiment, when the position of the hand is taught, the normal vector standing at the teaching point, obtained by the outer product of any two vectors s1 and s2 lying in a plane parallel to the imaging surface of the camera 21 at that point, is determined as the approach vector a of the hand coordinates. The orient vector o and normal vector n are determined in the same manner as in the first embodiment, and the posture of the hand at the teaching point is set automatically. Since the posture of the hand of the robot body 1 at the teaching point is obtained by calculation, the operator need not teach it at each teaching point, as in the first embodiment, and the working time can be greatly shortened.
The present invention is not limited to the embodiments described above or shown in the drawings, but can be modified or expanded as follows.
The normal vector n may be calculated first as the second vector, and the orientation vector o may be calculated as the third vector by the outer product of the approach vector a and the normal vector n.
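This modification can be sketched as below; a right-handed frame in which o = a × n is assumed, which is consistent with the ordering stated above.

```python
import numpy as np

def orient_from_normal(a, n):
    """Variant: take the normal vector n as the second vector and obtain
    the orient vector o (third vector) as the outer product of the approach
    vector a and n (right-handed frame assumed)."""
    o = np.cross(np.asarray(a, float), np.asarray(n, float))
    return o / np.linalg.norm(o)
```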
The present invention is not limited to a 6-axis robot, and can be applied to other multi-axis robots.
In the drawings, 1 is a robot body, 11 is a control device (first to third vector determining means, hand posture determining means, teaching posture setting means, interpolation means), and 21 is a camera.
Claims (6)
first vector determining means for, when the position of the hand is taught, determining the approach vector from a vector indicating the target point from the base coordinate origin of the robot and a vector indicating the hand (teaching point) from the base coordinate origin;
second vector determining means for taking one of the orientation vector and the normal vector of the hand coordinates as a second vector, setting the norm of the second vector to “1.0”, and determining the coordinate values of the second vector based on the orthonormal condition;
Third vector determining means for determining a third vector, which is the other of the orientation vector and the normal vector, by an outer product of the approach vector and the second vector;
hand posture determining means for determining the posture of the hand at the teaching point from the rotation matrix indicating the posture of the work coordinate system based on the base coordinate system and the inverse matrix of the rotation matrix described by the normal, orient, and approach vectors; and
A robot control apparatus comprising teaching posture setting means for automatically setting the hand posture determined by the hand posture determination means as the posture at the teaching point.
first vector determining means for, when the position of the hand is taught, determining, as the approach vector of the hand coordinates, a normal vector standing at the teaching point obtained by the outer product of any two vectors existing in a plane parallel to the imaging surface of the camera at the teaching point;
second vector determining means for taking one of the orientation vector and the normal vector of the hand coordinates as a second vector, setting the norm of the second vector to “1.0”, and determining the coordinate values of the second vector based on the orthonormal condition;
Third vector determining means for determining a third vector, which is the other of the orientation vector and the normal vector, by an outer product of the approach vector and the second vector;
hand posture determining means for determining the posture of the hand at the teaching point from the rotation matrix indicating the posture of the work coordinate system based on the base coordinate system and the inverse matrix of the rotation matrix described by the normal, orient, and approach vectors; and
A robot control apparatus comprising teaching posture setting means for automatically setting the hand posture determined by the hand posture determination means as the posture at the teaching point.
The robot control apparatus according to claim 1, wherein the first to third vector determining means also determine the approach vector, the second vector, and the third vector, respectively, for each interpolation point, and the hand posture determining means also obtains the hand posture at each interpolation point.
When the position of the hand is taught, the approach vector is obtained from the vector indicating the target point from the base coordinate origin of the robot and the vector indicating the hand (teaching point) from the base coordinate origin;
one of the orient vector and the normal vector of the hand coordinate system is determined as a second vector by setting the norm of the second vector to "1.0" and obtaining its coordinate values from the orthonormality conditions;
a third vector, which is the other of the orient vector and the normal vector, is determined as the outer product of the approach vector and the second vector;
the posture of the hand at the teaching point is obtained from the rotation matrix indicating the posture of the hand coordinate system with respect to the base coordinate system and the inverse of the rotation matrix described by the normal, orient, and approach vectors; and
a robot posture determination method, wherein the determined posture of the hand is automatically set as the posture at the teaching point.
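As a concrete illustration of the steps described in this method (not part of the claims themselves), the computation can be sketched in Python with NumPy. The function name, the choice of reference axis, and the use of the orient vector as the second vector are illustrative assumptions:

```python
import numpy as np

def hand_posture_from_target(p_w, p_t):
    """Sketch: derive a hand orientation whose approach vector points
    from the teaching point toward the target point. All names are
    hypothetical, not taken from the patent."""
    p_w = np.asarray(p_w, dtype=float)   # target point in base coordinates
    p_t = np.asarray(p_t, dtype=float)   # hand (teaching point) in base coordinates

    # Approach vector: unit vector from the hand toward the target.
    a = p_w - p_t
    a = a / np.linalg.norm(a)

    # Second vector (here the orient vector): a unit vector orthogonal
    # to the approach vector, built by projecting out a reference axis.
    ref = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(ref, a)) > 0.99:       # reference nearly parallel; pick another
        ref = np.array([0.0, 1.0, 0.0])
    o = ref - np.dot(ref, a) * a
    o = o / np.linalg.norm(o)            # norm set to "1.0" as in the claim

    # Third vector (normal) from the outer (cross) product.
    n = np.cross(o, a)

    # Rotation matrix in the usual normal/orient/approach column order.
    return np.column_stack((n, o, a))
```

The returned matrix is a proper rotation (orthonormal columns, determinant +1), with the approach vector in the last column.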
When the position of the hand is taught, a normal vector erected at the teaching point, obtained as the outer product of any two vectors lying in a plane parallel to the imaging surface of the camera at the teaching point, is determined as the approach vector of the hand coordinate system;
one of the orient vector and the normal vector of the hand coordinate system is determined as a second vector by setting the norm of the second vector to "1.0" and obtaining its coordinate values from the orthonormality conditions;
a third vector, which is the other of the orient vector and the normal vector, is determined as the outer product of the approach vector and the second vector;
the posture of the hand at the teaching point is obtained from the rotation matrix indicating the posture of the hand coordinate system with respect to the base coordinate system and the inverse of the rotation matrix described by the normal, orient, and approach vectors; and
a robot posture determination method, wherein the determined posture of the hand is automatically set as the posture at the teaching point.
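The alternative first step in this method, taking the approach vector as the normal to a plane parallel to the camera's imaging surface, can be sketched as follows (an illustrative fragment; the function name is hypothetical):

```python
import numpy as np

def approach_from_plane(u, v):
    """Sketch: approach vector as the unit normal to a plane, obtained
    from the outer (cross) product of any two vectors lying in that
    plane (here assumed to be parallel to the camera's imaging surface)."""
    n = np.cross(np.asarray(u, dtype=float), np.asarray(v, dtype=float))
    return n / np.linalg.norm(n)
```

For example, two in-plane vectors along the x- and y-axes yield an approach vector along the z-axis.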
The robot posture determination method according to claim 4, wherein the approach vector, the second vector, and the third vector are also determined, respectively, for each interpolation point, and
the posture of the hand at each interpolation point is also obtained.
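Applying the same determination at interpolation points amounts to recomputing the three vectors at each interpolated hand position, so the approach vector keeps pointing at the target along the whole path. A sketch under stated assumptions (linear position interpolation, orient vector as the second vector; all names are illustrative, not from the patent):

```python
import numpy as np

def interpolated_postures(p_start, p_end, p_target, steps):
    """Sketch: posture at each interpolation point between two taught
    positions, with the approach vector aimed at a fixed target."""
    p_start = np.asarray(p_start, dtype=float)
    p_end = np.asarray(p_end, dtype=float)
    p_target = np.asarray(p_target, dtype=float)
    postures = []
    for s in np.linspace(0.0, 1.0, steps):
        p = (1.0 - s) * p_start + s * p_end   # interpolated hand position
        a = p_target - p
        a = a / np.linalg.norm(a)             # approach vector at this point
        ref = np.array([0.0, 0.0, 1.0])
        if abs(ref @ a) > 0.99:               # avoid a degenerate reference
            ref = np.array([0.0, 1.0, 0.0])
        o = ref - (ref @ a) * a
        o = o / np.linalg.norm(o)             # second vector, norm "1.0"
        n = np.cross(o, a)                    # third vector by outer product
        postures.append(np.column_stack((n, o, a)))
    return postures
```

Each returned matrix is orthonormal, and its last column (the approach vector) tracks the target as the hand moves.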
Priority Applications (1)
Application Number  Priority Date  Filing Date  Title 

JP2011187343A JP2013049102A (en)  2011-08-30  2011-08-30  Robot control device and method of determining robot attitude 
Publications (1)
Publication Number  Publication Date 

JP2013049102A true JP2013049102A (en)  2013-03-14 
Family
ID=48011612
Family Applications (1)
Application Number  Title  Priority Date  Filing Date 

JP2011187343A Withdrawn JP2013049102A (en)  2011-08-30  2011-08-30  Robot control device and method of determining robot attitude 
Country Status (1)
Country  Link 

JP (1)  JP2013049102A (en) 
Cited By (6)
Publication number  Priority date  Publication date  Assignee  Title 

JP2014208400A (en) *  2014-06-10  2014-11-06  株式会社デンソーウェーブ  Robot controller and robot attitude interpolation method 
JP2015178157A (en) *  2014-03-19  2015-10-08  株式会社デンソーウェーブ  Robot control method and robot control device 
JP2015178158A (en) *  2014-03-19  2015-10-08  株式会社デンソーウェーブ  Robot control method and robot control device 
JP2015182147A (en) *  2014-03-20  2015-10-22  株式会社デンソーウェーブ  Robot control method and robot control apparatus 
CN105451461A (en) *  2015-11-25  2016-03-30  四川长虹电器股份有限公司  PCB board positioning method based on SCARA robot 
CN106272444A (en) *  2016-08-31  2017-01-04  山东中清智能科技有限公司  A method for simultaneously calibrating the hand-eye relationship and the dual-robot relationship 

2011
 2011-08-30 JP JP2011187343A patent/JP2013049102A/en not_active Withdrawn
Similar Documents
Publication  Publication Date  Title 

JP4167940B2 (en)  Robot system  
US20180257238A1 (en)  Manipulator system  
JP6468741B2 (en)  Robot system and robot system calibration method  
Kofman et al.  Teleoperation of a robot manipulator using a vision-based human-robot interface  
JP2013049102A (en)  Robot control device and method of determining robot attitude  
US20150273689A1 (en)  Robot control device, robot, robotic system, teaching method, and program  
US9193072B2 (en)  Robot and control method thereof  
JP6445092B2 (en)  Robot system displaying information for teaching robots  
JP5108032B2 (en)  Multijoint structure teaching device  
JP4591043B2 (en)  Method of gripping an arbitrarily shaped object by a robot  
JP2018167334A (en)  Teaching device and teaching method  
JP6897396B2 (en)  Control devices, robot systems and control methods  
WO2020090809A1 (en)  External input device, robot system, control method for robot system, control program, and recording medium  
JP5948914B2 (en)  Robot control apparatus, robot control method, robot control program, and robot system  
JP6322949B2 (en)  Robot control apparatus, robot system, robot, robot control method, and robot control program  
JP5428639B2 (en)  Robot control apparatus and robot teaching method  
JP5729226B2 (en)  Robot position and orientation interpolation method and robot control apparatus  
JP2014208400A (en)  Robot controller and robot attitude interpolation method  
CN112118940A (en)  Direct teaching device and direct teaching method for robot  
JP3940998B2 (en)  Robot equipment  
JP2017124468A (en)  Method of controlling robot, method of manufacturing component, robot device, program, and recording medium  
JP6268924B2 (en)  Robot and robot working method  
JP2011224745A (en)  Robot teaching device and controller for the same, and program  
JP2012135835A (en)  Robot control device, and robot posture interpolation method  
JP2003062775A (en)  Teaching system for human hand type robot 
Legal Events
Date  Code  Title  Description 

A300  Withdrawal of application because of no request for examination 
Free format text: JAPANESE INTERMEDIATE CODE: A300 Effective date: 2014-11-04 