WO2024037658A1 - Method and apparatus for controlling pointing action of robot, and electronic device and storage medium - Google Patents

Method and apparatus for controlling pointing action of robot, and electronic device and storage medium Download PDF

Info

Publication number
WO2024037658A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
robotic arm
arm
target object
target
Prior art date
Application number
PCT/CN2023/118887
Other languages
French (fr)
Chinese (zh)
Inventor
黄秋兰
谢安桓
朱世强
顾建军
王鑫
梁定坤
留云
Original Assignee
之江实验室
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 之江实验室
Publication of WO2024037658A1 publication Critical patent/WO2024037658A1/en

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the present application relates to the technical field of robot manipulator anthropomorphic movements, and in particular to a robot pointing movement control method, device, electronic equipment and storage medium.
  • This application provides a robot pointing action control method, device, electronic equipment and storage medium to realize the application of pointing to target object scenes and improve the applicability of the robot.
  • This application provides a robot pointing action control method, including:
  • the selected robotic arm is controlled to point to the target object.
  • the pose pointing to the target object includes:
  • the posture of the robotic arm end of the selected robotic arm is generated.
  • mapping the target coordinates to coordinates within the reachable range of the end of the selected robotic arm includes: obtaining the X coordinate and Y coordinate of the end of the selected robotic arm according to the angle, in the robot coordinate system, between the vector pointing from the shoulder joint of the selected robotic arm to the target object and the X-axis of the robot coordinate system, and the arm length of the selected robotic arm; and obtaining the Z coordinate of the end of the selected robotic arm according to the relationship between the Z-direction coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected robotic arm, and the arm length;
  • Generating the posture of the end of the robotic arm of the selected robotic arm based on the mapped coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the specified direction includes:
  • determining the unit vector of the vector pointing from the target object to the end of the robotic arm as the three Z-direction components forming the Z-direction vector of the attitude matrix of the end of the selected robotic arm, where the attitude matrix of the end of the selected robotic arm is a matrix with 3 rows and 3 columns;
  • when the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the right robotic arm, determining that the three Y-direction components forming the Y-direction vector of the attitude matrix are positive;
  • when the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the left robotic arm, determining that the three Y-direction components forming the Y-direction vector of the attitude matrix are negative;
  • determining, according to the right-hand rule, the three X-direction components forming the X-direction vector of the attitude matrix from the three Z-direction components and the three Y-direction components of the attitude matrix of the end of the selected robotic arm.
  • the method further includes:
  • using a measuring device to test the obtained pose of the end of the robotic arm and determining the error of the pointing action of the end of the selected robotic arm toward the target object includes:
  • determining the vector pointing from the target object to the end of the selected robotic arm;
  • determining the angle between the vector and the Z-direction vector of the attitude matrix of the end of the selected robotic arm as the error.
  • determining the trajectory of the selected robotic arm from the current posture of each joint of the selected robotic arm to the target posture of each joint includes:
  • obtaining, through trajectory interpolation, the trajectory of the end of the selected robotic arm from the current pose to the target pose of each joint.
  • the inverse kinematics is solved to obtain the target posture of each joint of the robotic arm, including:
  • trajectory interpolation is performed to control the movement of the robot arm from the current configuration of the robot to the target configuration of the robot pointing to the target object.
  • the current configuration of the robot includes the current angles of each of the joints.
  • This application provides a robot pointing action control device, including:
  • a robotic arm selection module, used to select, when the target object exceeds the reachable range of the end of the robot's robotic arm, the one of the robot's two robotic arms that is closer to the target object according to the target coordinates of the target object in the robot coordinate system;
  • a pose determination module for the end of the robotic arm, used to obtain, according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action, the pose of the end of the selected robotic arm pointing to the target object within the reachable range;
  • a trajectory determination module configured to determine the trajectory of the selected robotic arm from the current posture of each joint of the selected robotic arm to the target posture of each joint based on the obtained posture of the end of the robotic arm;
  • An action control module is used to control the selected robotic arm to point to the target object based on the trajectory.
  • This application provides an electronic device, including:
  • one or more processors;
  • a memory, used to store one or more programs;
  • where, when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the above robot pointing action control method.
  • the present application provides a computer-readable storage medium on which computer instructions are stored; when the instructions are executed by a processor, the above robot pointing action control method is implemented.
  • In the robot pointing action control method of the present application, when the target object exceeds the reachable range of the end of the robot's robotic arm, the pose of the end of the selected robotic arm pointing to the target object within the reachable range is obtained according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action, so as to complete the pointing action of the selected robotic arm toward the target object. In this way, pointing-to-target-object scenarios can be supported and the applicability of the robot is improved.
  • Figure 1 shows a schematic flowchart of a robot pointing action control method provided by an embodiment of the present application.
  • Figure 2 shows a specific flow diagram of a robot pointing action control method provided by an embodiment of the present application.
  • Figure 3 shows a simplified schematic diagram of the structure of the robot coordinate system and the target object of the robot pointing action control method provided by the embodiment of the present application.
  • Figure 4a shows a schematic diagram of the current configuration of the robot according to the robot pointing action control method provided by the embodiment of the present application.
  • Figure 4b shows a schematic diagram of the target configuration of the robot according to the robot pointing action control method provided by the embodiment of the present application.
  • FIG. 5 is a schematic flowchart of measuring action accuracy of the robot pointing action control method provided by the embodiment of the present application.
  • Figure 6 shows a module schematic diagram of a robot pointing action control device provided by an embodiment of the present application.
  • Figure 7 shows a module block diagram of an electronic device provided by an embodiment of the present application.
  • the steps of the corresponding methods are not necessarily performed in the order shown and described in this application. In some other embodiments, the methods may include more or fewer steps than those described herein. In addition, a single step described in this application may be broken down into multiple steps for description in other embodiments, and multiple steps described in this application may also be combined into a single step for description in other embodiments.
  • embodiments of the present application provide a robot pointing action control method.
  • When a target object exceeds the reachable range of the end of the robot's robotic arm, the one of the robot's two robotic arms that is closer to the target object is selected according to the target coordinates of the target object in the robot coordinate system; the pose of the end of the selected robotic arm pointing to the target object within the reachable range is obtained according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action; the trajectory of the selected robotic arm from the current pose of each joint to the target pose of each joint is determined based on the obtained pose of the end of the robotic arm; and the selected robotic arm is controlled to point to the target object according to the trajectory.
  • In this way, when the target object exceeds the reachable range of the end of the robot's robotic arm, the pose of the end of the selected robotic arm pointing to the target object within the reachable range is obtained according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action, so that the pointing action of the selected robotic arm toward the target object can be completed. Pointing-to-target-object scenarios can thus be supported and the applicability of the robot is improved.
  • the robot pointing action control method in the embodiment of the present application is applied to application scenarios in which the robot points to a target object.
  • Application scenarios where the robot points to the target object may include but are not limited to scenarios where the robot explains or delivers demonstration content in the exhibition hall.
  • Demonstration content may include but is not limited to videos, speeches, and teaching content.
  • Target objects may include, but are not limited to, specific items of the demonstration content.
  • Figure 1 shows a schematic flowchart of a robot pointing action control method provided by an embodiment of the present application.
  • the robot pointing action control method includes the following steps 110 to 140:
  • Step 110 When the target object exceeds the reachable range of the end of the robot's mechanical arm, select the robot arm closer to the target object among the two robot arms of the robot based on the target coordinates of the target object in the robot coordinate system.
  • That the target object exceeds the reachable range of the end of the robot's robotic arm means the target object is outside that reachable range.
  • By determining the target coordinates of the target object in the robot coordinate system, the target object and the robot's two robotic arms are placed in the same robot coordinate system; therefore, the selected robotic arm that is closer to the target object can be obtained conveniently and quickly.
  • the above step 110 can realize the use of a robotic arm that is closer to the target object to complete the pointing action, improve the convenience of pointing, and is more in line with user needs and more conducive to operation implementation.
  • There are multiple ways to implement step 110. In one implementation, step 110 includes: determining, according to the target coordinates of the target object in the robot coordinate system, the area where the target object is located from two pre-divided areas corresponding to the robot's two robotic arms; and determining the robotic arm corresponding to the area where the target object is located as the one of the robot's two robotic arms that is closer to the target object.
  • In another implementation, step 110 includes: determining the vectors from the target object to the shoulder joints of the two robotic arms and their mapped distances (for example, Euclidean distances) on the horizontal coordinate plane of the robot coordinate system; and determining the robotic arm whose shoulder joint is at the smaller distance as the one of the robot's two robotic arms that is closer to the target object, as sketched below.
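  • As an illustrative sketch only (not part of the patent text), the distance-based selection could look like the following Python snippet; the function name and the use of a horizontal-plane Euclidean distance between the target and each shoulder joint are assumptions for illustration.

```python
import numpy as np

def select_arm(target_r, left_shoulder_r, right_shoulder_r):
    """Pick the robotic arm whose shoulder joint is closer to the target.

    All inputs are 3D points in the robot (base) coordinate system; the
    comparison uses distances projected onto the horizontal X-Y plane.
    """
    target_xy = np.asarray(target_r, dtype=float)[:2]
    d_left = np.linalg.norm(target_xy - np.asarray(left_shoulder_r, dtype=float)[:2])
    d_right = np.linalg.norm(target_xy - np.asarray(right_shoulder_r, dtype=float)[:2])
    return "left" if d_left < d_right else "right"
```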
  • Step 120: According to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system and the predetermined rule of the pointing action, obtain the pose of the end of the selected robotic arm pointing to the target object within the reachable range.
  • the predetermined rules for pointing actions serve as constraints for pointing actions, which can provide reliable guarantee for the execution of subsequent specified actions.
  • the above step 120 can obtain the position point where the end of the selected robotic arm points to the target object within the reachable range through the mapping conversion of the target coordinates. This enables subsequent control of the selected robotic arm to point to the target object.
  • the position and posture of the robotic arm end of the selected robotic arm pointing to the target object within the reachable range includes the position and posture of the robotic arm end of the selected robotic arm in the robot coordinate system.
  • In one implementation, a position point whose distance from the shoulder joint of the selected robotic arm is less than the arm length of the selected robotic arm, taken along the direction in which the end of the selected robotic arm points toward the target object, can be determined as the position within the reachable range of the end of the selected robotic arm to which the target coordinates are mapped.
  • For example, the arm length of the selected robotic arm can be multiplied by a reduction coefficient, along the direction in which the end of the selected robotic arm points to the target object, to obtain a reduced position point within the reachable range as the position to which the target coordinates are mapped.
  • the reduction coefficient is greater than 0 and less than 1, thereby ensuring that the end of the robotic arm is within the reachable range. See below for detailed instructions.
  • Alternatively, the arm length of the selected robotic arm can be reduced by a predetermined length, along the direction in which the end of the selected robotic arm points to the target object, to obtain a reduced position point within the reachable range as the position to which the target coordinates are mapped.
  • The predetermined length is greater than 0 and less than the arm length. To make the selected robotic arm stretch out as far as possible, enlarge the pointing range of the end of the robotic arm, and still keep the end within the reachable range, the predetermined length should be as small as possible relative to the arm length. See below for detailed instructions; a sketch of both variants follows this paragraph.
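  • A minimal sketch of the mapping described above (not part of the patent text): it moves from the shoulder joint toward the target along the pointing direction by either K·L (reduction coefficient) or L − Δ (predetermined length). The function name and default values are assumptions.

```python
import numpy as np

def map_into_reach(target_r, shoulder_r, arm_length, k=0.9, delta=None):
    """Map an out-of-reach target point to a position point within reach.

    Uses the predetermined length delta (0 < delta < arm_length) if given,
    otherwise the reduction coefficient k (0 < k < 1); the default k value
    is an illustrative assumption.
    """
    direction = np.asarray(target_r, dtype=float) - np.asarray(shoulder_r, dtype=float)
    direction /= np.linalg.norm(direction)          # unit vector shoulder -> target
    reach = arm_length - delta if delta is not None else k * arm_length
    return np.asarray(shoulder_r, dtype=float) + reach * direction
```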
  • Step 130 Determine the trajectory of the selected robotic arm from the current posture of each joint of the selected robotic arm to the target posture of each joint based on the obtained posture of the end of the robotic arm. This trajectory can be used as the basis for controlling the selected robot arm towards the target object.
  • Step 140 Control the selected robotic arm to point to the target object according to the trajectory to complete the pointing action of the end of the selected robotic arm pointing to the target object.
  • Figure 2 shows a specific flow diagram of a robot pointing action control method provided by an embodiment of the present application.
  • The method may also include, but is not limited to, the following steps 101 to 104 for determining whether the target object is beyond the reachable range of the end of the robot's robotic arm:
  • Step 101 Establish the robot world coordinate system, the robot coordinate system, and the local coordinate system of each joint of the double robotic arm.
  • the robot coordinate system refers to the robot's base coordinate system.
  • Step 102: According to the angle between the vector pointing from the origin of the robot coordinate system to the coordinates of the target object in the robot coordinate system and the X-axis of the robot coordinate system, obtain the angle through which the robot chassis needs to rotate so that the target object is within the robot's frontal range.
  • step 102 can be implemented through but is not limited to the following three steps:
  • In the first step, the angle θ of the vector pointing from the robot to the target object is obtained from the Y-direction component and the X-direction component of that vector (for example, via the arctangent of their ratio). In the second step, the difference between the angle θ and the orientation angle φ of the robot's front in the world coordinate system is computed; this value may exceed [−π, π]. In the third step, this difference is normalized into [−π, π]; the angle obtained by the above calculation is the minimum angle through which the chassis needs to rotate, as sketched below.
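  • A hedged sketch of step 102 (not part of the patent text): the arctangent-based angle computation and the final wrapping step are assumptions consistent with the three sub-steps above.

```python
import math

def chassis_rotation_angle(target_w, robot_w, heading_phi):
    """Minimum chassis rotation angle so the target lies in front of the robot.

    target_w, robot_w: (x, y) positions in the world coordinate system.
    heading_phi: orientation angle of the robot's front in the world frame.
    """
    theta = math.atan2(target_w[1] - robot_w[1], target_w[0] - robot_w[0])
    delta = theta - heading_phi                          # may exceed [-pi, pi]
    return math.atan2(math.sin(delta), math.cos(delta))  # wrap into [-pi, pi]
```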
  • Step 103: According to the robot's coordinates w p_robot in the world coordinate system and the orientation angle φ, convert the target coordinates w p_target of the target object in the world coordinate system into coordinates in the robot coordinate system; specifically, this can be calculated by formulas (5)–(6), in which:
  • r p_target is the coordinates of the target object in the robot coordinate system;
  • R_robot is the attitude matrix of the robot in the world coordinate system;
  • the subscript target denotes the target object, and the subscript robot denotes the robot.
  • Step 104: Determine whether the target object is beyond the reachable range of the end of the robot's robotic arm. In one embodiment of step 104, if the target object is outside the maximum reachable range of the end of the robot's robotic arm, it is determined that the target object exceeds the reachable range. In another embodiment of step 104, if the distance between the target object and the shoulder joint of the robotic arm is greater than the arm length of the robot's robotic arm, it is determined that the target object is beyond the reachable range of the end of the robot's robotic arm.
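  • Since formulas (5)–(6) are not reproduced in the text, the following sketch assumes the usual inverse rigid-body transform for step 103 and the distance test of step 104; it is illustrative only.

```python
import numpy as np

def world_to_robot(p_target_w, p_robot_w, R_robot):
    """Express the target point in the robot (base) coordinate system.

    R_robot is the robot's attitude matrix in the world frame; the inverse
    transform below is an assumed concrete form of formulas (5)-(6).
    """
    return np.asarray(R_robot, dtype=float).T @ (
        np.asarray(p_target_w, dtype=float) - np.asarray(p_robot_w, dtype=float))

def exceeds_reach(p_target_r, p_shoulder_r, arm_length):
    """Step 104: the target is out of reach if it is farther from the
    shoulder joint than the arm length."""
    return np.linalg.norm(
        np.asarray(p_target_r, dtype=float) - np.asarray(p_shoulder_r, dtype=float)) > arm_length
```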
  • Figure 3 shows a simplified schematic diagram of the structure of the robot coordinate system and the target object of the robot pointing action control method provided by the embodiment of the present application.
  • the XY plane coordinate system is the coordinate system projected on the XY plane by the robot's base coordinate system.
  • The XY plane coordinate system includes the origin O of the XY plane coordinate system corresponding to the head position 21 of the robot, the left shoulder joint 22 of the left robotic arm and the right shoulder joint 23 of the right robotic arm, the target object 24, and the position point 25 toward which the end of the selected robotic arm points.
  • the following describes in detail how to determine the three-dimensional coordinates of the position point 25 in the robot coordinate system, that is, the X coordinate, the Y coordinate, and the Z coordinate.
  • Step 120 may include but is not limited to the following steps 121 to 123 .
  • Step 121 Map the target coordinates to coordinates within the reachable range of the end of the selected robotic arm.
  • In the first step, the X coordinate and Y coordinate of the end of the selected robotic arm are obtained from the angle, in the robot coordinate system, between the vector pointing from the shoulder joint of the selected robotic arm to the target object and the X-axis of the robot coordinate system, together with the arm length of the selected robotic arm.
  • Specifically, the angle θ between the vector pointing from the shoulder joint r p_s of the selected robotic arm to the target object r p_target and the X-axis of the robot coordinate system, together with the arm length L of the selected robotic arm, is used to obtain the X coordinate r p_e(x) and Y coordinate r p_e(y) of the end r p_e of the robotic arm.
  • The reduction coefficient K is set so that the distance between the position point 25 and the shoulder joint of the selected robotic arm is smaller than the arm length L of the selected robotic arm.
  • In the second step, the Z coordinate r p_e(z) of the end of the selected robotic arm is obtained from the relationship between the Z coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected robotic arm, and the arm length; specifically, it can be calculated by formula (8).
  • In formula (8), Δ is a value greater than zero and less than L, and can be chosen by traversal within a range.
  • Formula (8-2) covers the case in which, in the Z direction, the position of the target object is higher than the shoulder joint but lower than the total height of the shoulder joint plus the arm length, i.e., the target object itself is above the shoulder joint and still within reach of the robotic arm. In this case, Δ is subtracted from r p_target(z) of the target object, so that the end of the robotic arm points to the target object from a position slightly lower than the target object, which better matches the pointing gesture that the end of the robot's arm is meant to imitate.
  • Formula (8-3) covers the case in which, in the Z direction, the position of the target object is higher than the total height of the shoulder joint plus the arm length; Δ is then subtracted from that total height r p_s(z) + L, so that the point at which the robotic arm points is lower than the target object. In this way, the end of the selected robotic arm remains within the reachable range and can point to the target object near the maximum reach of the selected robotic arm.
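  • Formula (8) itself is not reproduced here, so the following sketch only mirrors the case analysis explained for (8-2) and (8-3); the first branch (target at or below shoulder height) is an assumption.

```python
def map_z(target_z, shoulder_z, arm_length, delta):
    """Sketch of the Z-coordinate mapping implied by formula (8).

    delta is greater than zero and less than the arm length.
    """
    if target_z <= shoulder_z:                      # assumed remaining case of formula (8)
        return target_z
    if target_z <= shoulder_z + arm_length:         # case (8-2): above shoulder, within reach
        return target_z - delta
    return shoulder_z + arm_length - delta          # case (8-3): above the maximum reach
```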
  • Step 122 Determine the designated direction of the end of the robotic arm of the selected robotic arm according to the predetermined rules of the pointing action.
  • Step 123 Generate the posture of the end of the robotic arm of the selected robotic arm based on the mapped coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the specified direction.
  • In the first step, the unit vector of the vector pointing from the target object toward the end of the robotic arm is determined as the three Z-direction components forming the Z-direction vector of the attitude matrix of the end of the selected robotic arm; the attitude matrix of the end of the selected robotic arm is a matrix with 3 rows and 3 columns.
  • the attitude matrix R of the end of the selected robotic arm is represented by the following formula (9):
  • In formula (9), the first column of R is the X-direction vector, whose three X-direction components are ax, ay and az; the second column is the Y-direction vector, whose three Y-direction components are ox, oy and oz; the third column is the Z-direction vector, whose three Z-direction components are nx, ny and nz. That is, R = [ax ox nx; ay oy ny; az oz nz].
  • The first step above can be calculated by formulas (10)–(11): the unit vector of the vector from the target object r p_target to the end r p_e of the selected robotic arm is the third column of the attitude matrix of the end of the selected robotic arm.
  • In the second step, when the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the right robotic arm, the three Y-direction components forming the Y-direction vector of the attitude matrix are determined to be positive;
  • In the third step, when the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the left robotic arm, the three Y-direction components forming the Y-direction vector of the attitude matrix are determined to be negative.
  • Figure 4a shows a schematic diagram of the current configuration of the robot according to the robot pointing action control method provided by the embodiment of the present application.
  • Figure 4b shows a schematic diagram of the target configuration of the robot according to the robot pointing action control method provided by the embodiment of the present application.
  • the dual robotic arms include a left robotic arm (not labeled in the figure) and a right robotic arm 30 .
  • The right robotic arm 30 may include, but is not limited to, the right shoulder joint 31, the right upper arm 32, the right forearm 33 and the right palm 34. The same applies to the left robotic arm, which is not described again here.
  • Step 4: According to the right-hand rule, determine the three X-direction components forming the X-direction vector of the attitude matrix from the three Z-direction components and the three Y-direction components of the attitude matrix of the end of the selected robotic arm. Compared with an implementation that solves nonlinear equations, this is more convenient and does not rely on existing libraries (such as MATLAB's nonlinear equation solvers), which can increase the calculation speed.
  • Step 4 can be carried out as follows: according to the right-hand rule, the first column of the end attitude matrix can be obtained from its second column and third column (i.e., as their cross product), as expressed by formula (16).
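  • The following sketch (not part of the patent text) assembles the 3×3 end attitude matrix R = [X | Y | Z] from the rules above. The text fixes the Z column (unit vector from target to arm end), constrains only the signs of the Y column, and obtains the X column by the right-hand rule; the auxiliary cross product used to build a candidate Y vector, and the simple sign flip, are assumptions.

```python
import numpy as np

def end_attitude_matrix(p_target_r, p_end_r, arm_side):
    """Build the end attitude matrix with columns [X, Y, Z] as described above."""
    z = np.asarray(p_end_r, dtype=float) - np.asarray(p_target_r, dtype=float)
    z /= np.linalg.norm(z)                          # third column: unit target->end vector

    # Candidate Y perpendicular to Z (assumes the pointing direction is not vertical).
    y = np.cross(z, np.array([0.0, 0.0, 1.0]))
    y /= np.linalg.norm(y)
    # The text requires the three Y components to be positive for the right arm and
    # negative for the left arm; flipping the candidate vector is a simplification.
    if (arm_side == "right" and y.sum() < 0) or (arm_side == "left" and y.sum() > 0):
        y = -y

    x = np.cross(y, z)                              # right-hand rule, cf. formula (16)
    return np.column_stack([x, y, z])
```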
  • the above step 130 may further include, but is not limited to, the following two steps.
  • the first step is to solve the inverse kinematics based on the obtained posture of the end of the robotic arm to obtain the target posture of each joint of the robotic arm.
  • the second step is to obtain the trajectory of the end of the selected robotic arm from the current posture to the target posture of each joint through trajectory interpolation.
  • The above-mentioned first step may further include, but is not limited to:
  • the target configuration includes the target angles of each joint
  • trajectory interpolation is performed to control the movement of the robot arm from the current configuration of the robot to the target configuration of the robot pointing to the target object.
  • the current configuration of the robot includes the current angles of each joint.
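  • As an illustrative sketch (not part of the patent text): the inverse-kinematics solve is left to whatever solver the arm provides, and a simple linear joint-space interpolation stands in for the trajectory interpolation; the step count is an assumption.

```python
import numpy as np

def joint_space_trajectory(q_current, q_target, steps=50):
    """Interpolate joint angles from the current configuration to the target
    configuration (the target angles come from the inverse-kinematics solution).
    """
    q_current = np.asarray(q_current, dtype=float)
    q_target = np.asarray(q_target, dtype=float)
    return [q_current + (q_target - q_current) * s
            for s in np.linspace(0.0, 1.0, steps)]
```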
  • FIG. 5 is a schematic flowchart of measuring action accuracy of the robot pointing action control method provided by the embodiment of the present application.
  • The method may include, but is not limited to, step 150: using a measuring device to test the obtained pose of the end of the robotic arm, and determining the error of the pointing action of the end of the selected robotic arm toward the target object. In this way, after the robotic arm is controlled to point to the target, the deviation of the pointing is determined, so that the accuracy of the pointing action can be measured to facilitate subsequent adjustment and control optimization.
  • Using a measuring device to test the obtained pose of the end of the robotic arm and determining the error of the pointing action of the end of the selected robotic arm toward the target object includes a first step of determining the vector pointing from the target object to the end of the selected robotic arm.
  • a measuring device can be used to test the posture of the end of the robotic arm, and based on the position of the target object and the measured position of the end of the robotic arm, the vector pointing from the target object to the end of the robotic arm is calculated.
  • the second step is to determine the angle between the vector and the Z-direction vector of the attitude matrix of the robot end of the selected robot arm as the above error.
  • step 150 may further include but is not limited to the following 1) to 3):
  • Step 1): based on the measurement results of the measuring device and the coordinates r p_target of the target object in the robot coordinate system, calculate the vector pointing from the target object to the end of the robotic arm.
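  • A minimal sketch of the error computation (not part of the patent text), using the measured end position and the Z-direction column of the end attitude matrix:

```python
import numpy as np

def pointing_error(p_target_r, p_end_measured_r, R_end):
    """Angle between the measured target-to-end vector and the Z-direction
    vector (third column) of the end attitude matrix, in radians."""
    v = np.asarray(p_end_measured_r, dtype=float) - np.asarray(p_target_r, dtype=float)
    v /= np.linalg.norm(v)
    z = np.asarray(R_end, dtype=float)[:, 2]
    return float(np.arccos(np.clip(np.dot(v, z), -1.0, 1.0)))
```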
  • Figure 6 shows a module schematic diagram of a robot pointing action control device provided by an embodiment of the present application.
  • The robot pointing action control device includes the following modules:
  • the robotic arm selection module 41 is used to select the one of the robot's dual robotic arms that is closer to the target object based on the target coordinates of the target object in the robot coordinate system when the target object exceeds the reachable range of the end of the robot's robotic arm.
  • The pose determination module 42 for the end of the robotic arm is used to obtain, based on the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action, the pose of the end of the selected robotic arm pointing to the target object within the reachable range;
  • the trajectory determination module 43 is used to determine the trajectory of the selected robotic arm from the current posture of each joint of the selected robotic arm to the target posture of each joint based on the obtained posture of the end of the robotic arm;
  • the action control module 44 is used to control the selected robotic arm to point to the target object according to the trajectory.
  • the pose determination module 42 at the end of the robotic arm includes:
  • the coordinate mapping submodule is used to map the target coordinates to coordinates within the reachable range of the end of the selected robotic arm;
  • the designated direction determination submodule is used to determine the designated direction of the end of the robotic arm of the selected robotic arm according to the predetermined rules of the pointing action;
  • The attitude generation submodule for the end of the robotic arm is used to generate the attitude of the end of the selected robotic arm based on the mapped coordinates, the shoulder joint coordinates of the shoulder joint of the selected robotic arm in the robot coordinate system, and the specified direction.
  • The coordinate mapping submodule is specifically configured to: obtain the X coordinate and Y coordinate of the end of the selected robotic arm according to the angle, in the robot coordinate system, between the vector pointing from the shoulder joint of the selected robotic arm to the target object and the X-axis of the robot coordinate system, and the arm length of the selected robotic arm; and obtain the Z coordinate of the end of the selected robotic arm according to the relationship between the Z coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected robotic arm, and the arm length;
  • the attitude generation submodule at the end of the robotic arm is specifically used for:
  • The unit vector of the vector pointing from the target object to the end of the robotic arm is determined as the three Z-direction components forming the Z-direction vector of the attitude matrix of the end of the selected robotic arm; the attitude matrix of the end of the selected robotic arm is a matrix with 3 rows and 3 columns;
  • When the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the right robotic arm, the three Y-direction components forming the Y-direction vector of the attitude matrix are determined to be positive;
  • When the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the left robotic arm, the three Y-direction components forming the Y-direction vector of the attitude matrix are determined to be negative;
  • the three X-direction components of the attitude matrix used to form the X-direction vector are determined based on the three Z-direction components and the three Y-direction components of the attitude matrix of the end of the selected robot arm.
  • The device further includes: a pointing action error determination module, configured to use a measuring device to test the obtained pose of the end of the robotic arm and determine the error of the pointing action of the end of the selected robotic arm toward the target object.
  • The pointing action error determination module is specifically used to: determine the vector pointing from the target object to the end of the selected robotic arm; and determine the angle between this vector and the Z-direction vector of the attitude matrix of the end of the selected robotic arm as the above error.
  • the trajectory determination module 43 includes:
  • the target posture determination submodule is used to solve the inverse kinematics based on the obtained posture of the end of the robotic arm, and obtain the target posture of each joint of the robotic arm;
  • the trajectory determination submodule is used to obtain the trajectory of the end of the selected robotic arm from the current posture to the target posture of each joint through trajectory interpolation.
  • The pose determination module 42 for the end of the robotic arm is specifically used to: solve inverse kinematics based on the obtained pose of the end of the robotic arm to obtain the target configuration of the selected robotic arm, where the target configuration includes the target angle of each joint; and perform trajectory interpolation based on the current configuration of the robot and the target configuration, to control the robotic arm to move from the current configuration of the robot to the target configuration of the robot pointing to the target object, where the current configuration of the robot includes the current angles of each joint.
  • This application also provides a computer-readable storage medium on which computer instructions are stored. When the instructions are executed by a processor, the above-mentioned robot pointing action control method is implemented.
  • the present application also provides an electronic device.
  • Figure 7 shows a module block diagram of the electronic device provided by an embodiment of the present application.
  • the electronic device 50 includes one or more processors 51 for implementing the robot pointing action control method as described above.
  • the electronic device 50 may include a memory 59, which may store programs that can be called by the processor 51, and may include a non-volatile storage medium. In some embodiments, electronic device 50 may include memory 58 and interface 57 . In some embodiments, the electronic device 50 may also include other hardware depending on the actual application.
  • the memory 59 in the embodiment of the present application stores a program thereon, and when the program is executed by the processor 51, it is used to implement the above-mentioned robot pointing action control method.
  • the application may take the form of a computer program product implemented on one or more memories 59 (including but not limited to disk memory, CD-ROM, optical memory, etc.) having program code embodied therein.
  • Memory 59 includes permanent and non-permanent, removable and non-removable media, and information storage may be accomplished by any method or technology.
  • Information may be computer-readable instructions, data structures, modules of programs, or other data. Examples of memory 59 include, but are not limited to: phase change memory (Phase Change RAM, PRAM), static random access memory (Static Random-Access Memory, SRAM), dynamic random access memory (Dynamic Random Access Memory, DRAM), and others.
  • Other examples include, but are not limited to: other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A method and apparatus for controlling a pointing action of a robot, and an electronic device and a storage medium. The method for controlling a pointing action of a robot comprises: when a target object exceeds a reachable range of a robotic arm end of a robot, selecting, according to target object coordinates of the target object in a robot coordinate system, a robotic arm, which is closer to the target object, from double robotic arms of the robot (110); according to the mapping of the target object coordinates, shoulder joint coordinates of a shoulder joint of the selected robotic arm in the robot coordinate system, and a predetermined rule of a pointing action, obtaining a pose of a robotic arm end of the selected robotic arm pointing to the target object within the reachable range (120); according to the obtained pose of the robotic arm end, determining a trajectory of the selected robotic arm along the current pose of each joint of the selected robotic arm to a target pose of each joint (130); and according to the trajectory, controlling the selected robotic arm to point to the target object (140).

Description

Robot pointing action control method, device, electronic equipment and storage medium

Technical Field

The present application relates to the technical field of anthropomorphic movements of robot manipulators, and in particular to a robot pointing action control method and device, an electronic device, and a storage medium.

Background

With the development of science and technology, robots have entered every aspect of people's lives; for example, many shopping malls, restaurants and exhibition halls have introduced service robots. In current human-robot interaction technology, however, voice and vision technologies are relatively mature while body-movement control is relatively lacking, which makes robots appear stiff when interacting with people. Adding body language on this basis would make robots more human-like when interacting with people.

In the related art, when the robot's robotic arm performs an action and the target object is outside the reachable range of the robotic arm, the robot does not perform any processing on the target object. As a result, the application scenarios of the robotic arm are rather limited.
Summary

The present application provides a robot pointing action control method and device, an electronic device, and a storage medium, so as to support pointing-to-target-object scenarios and improve the applicability of the robot.

The present application provides a robot pointing action control method, including:

when a target object exceeds the reachable range of the end of the robot's robotic arm, selecting, according to the target coordinates of the target object in the robot coordinate system, the one of the robot's two robotic arms that is closer to the target object;

obtaining, according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and a predetermined rule of the pointing action, the pose of the end of the selected robotic arm pointing to the target object within the reachable range;

determining, based on the obtained pose of the end of the robotic arm, the trajectory of the selected robotic arm from the current pose of each joint of the selected robotic arm to the target pose of each joint;

controlling, according to the trajectory, the selected robotic arm to point to the target object.

Further, obtaining, according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action, the pose of the end of the selected robotic arm pointing to the target object within the reachable range includes:

mapping the target coordinates to coordinates within the reachable range of the end of the selected robotic arm;

determining a designated direction of the end of the selected robotic arm according to the predetermined rule of the pointing action;

generating the posture of the end of the selected robotic arm based on the mapped coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the designated direction.
Further, mapping the target coordinates to coordinates within the reachable range of the end of the selected robotic arm includes: obtaining the X coordinate and Y coordinate of the end of the selected robotic arm according to the angle, in the robot coordinate system, between the vector pointing from the shoulder joint of the selected robotic arm to the target object and the X-axis of the robot coordinate system, and the arm length of the selected robotic arm; and obtaining the Z coordinate of the end of the selected robotic arm according to the relationship between the Z-direction coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected robotic arm, and the arm length;

and/or,

generating the posture of the end of the selected robotic arm based on the mapped coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the designated direction includes:

determining the unit vector of the vector pointing from the target object to the end of the robotic arm as the three Z-direction components forming the Z-direction vector of the attitude matrix of the end of the selected robotic arm, where the attitude matrix of the end of the selected robotic arm is a matrix with 3 rows and 3 columns;

when the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the right robotic arm, determining that the three Y-direction components forming the Y-direction vector of the attitude matrix are positive;

when the end of the selected robotic arm points to the target object, the three Z-direction components of the attitude matrix are positive, and the selected robotic arm is the left robotic arm, determining that the three Y-direction components forming the Y-direction vector of the attitude matrix are negative;

determining, according to the right-hand rule, the three X-direction components forming the X-direction vector of the attitude matrix from the three Z-direction components and the three Y-direction components of the attitude matrix of the end of the selected robotic arm.
Further, after controlling the selected robotic arm to point to the target object according to the trajectory, the method further includes:

using a measuring device to test the obtained pose of the end of the robotic arm, and determining the error of the pointing action of the end of the selected robotic arm toward the target object.

Further, using a measuring device to test the obtained pose of the end of the robotic arm and determining the error of the pointing action of the end of the selected robotic arm toward the target object includes:

determining the vector pointing from the target object to the end of the selected robotic arm;

determining the angle between the vector and the Z-direction vector of the attitude matrix of the end of the selected robotic arm as the error.

Further, determining, based on the obtained pose of the end of the robotic arm, the trajectory of the selected robotic arm from the current pose of each joint of the selected robotic arm to the target pose of each joint includes:

solving inverse kinematics based on the obtained pose of the end of the robotic arm to obtain the target posture of each joint of the robotic arm;

obtaining, through trajectory interpolation, the trajectory of the end of the selected robotic arm from the current pose to the target pose of each joint.

Further, solving inverse kinematics based on the obtained pose of the end of the robotic arm to obtain the target posture of each joint of the robotic arm includes:

solving inverse kinematics based on the obtained pose of the end of the robotic arm to obtain a target configuration of the selected robotic arm, where the target configuration includes the target angle of each joint;

performing trajectory interpolation according to the current configuration of the robot and the target configuration, to control the robotic arm to move from the current configuration of the robot to the target configuration of the robot pointing to the target object, where the current configuration of the robot includes the current angle of each joint.
The present application provides a robot pointing action control device, including:

a robotic arm selection module, configured to select, when a target object exceeds the reachable range of the end of the robot's robotic arm, the one of the robot's two robotic arms that is closer to the target object according to the target coordinates of the target object in the robot coordinate system;

a pose determination module for the end of the robotic arm, configured to obtain, according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action, the pose of the end of the selected robotic arm pointing to the target object within the reachable range;

a trajectory determination module, configured to determine, based on the obtained pose of the end of the robotic arm, the trajectory of the selected robotic arm from the current pose of each joint of the selected robotic arm to the target pose of each joint;

an action control module, configured to control the selected robotic arm to point to the target object according to the trajectory.
The present application provides an electronic device, including:

one or more processors;

a memory for storing one or more programs;

where, when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the above robot pointing action control method.

The present application provides a computer-readable storage medium on which computer instructions are stored; when the instructions are executed by a processor, the above robot pointing action control method is implemented.

In some embodiments, in the robot pointing action control method of the present application, when the target object exceeds the reachable range of the end of the robot's robotic arm, the pose of the end of the selected robotic arm pointing to the target object within the reachable range is obtained according to the mapping of the target coordinates, the shoulder joint coordinates of the selected robotic arm's shoulder joint in the robot coordinate system, and the predetermined rule of the pointing action, so as to complete the pointing action of the selected robotic arm toward the target object. In this way, pointing-to-target-object scenarios can be supported and the applicability of the robot is improved.
Description of the Drawings

Figure 1 is a schematic flowchart of a robot pointing action control method provided by an embodiment of the present application.

Figure 2 is a specific flowchart of a robot pointing action control method provided by an embodiment of the present application.

Figure 3 is a simplified structural schematic diagram of the robot coordinate system and the target object in the robot pointing action control method provided by an embodiment of the present application.

Figure 4a is a schematic diagram of the current configuration of the robot in the robot pointing action control method provided by an embodiment of the present application.

Figure 4b is a schematic diagram of the target configuration of the robot in the robot pointing action control method provided by an embodiment of the present application.

Figure 5 is a schematic flowchart of measuring the accuracy of the action in the robot pointing action control method provided by an embodiment of the present application.

Figure 6 is a schematic module diagram of a robot pointing action control device provided by an embodiment of the present application.

Figure 7 is a module block diagram of an electronic device provided by an embodiment of the present application.
具体实施方式Detailed ways
这里将详细地对示例性实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施例并不代表与本申请一个或多个实施例相一致的所有实施例。相反,它们仅是与如所附权利要求书中所详述的、本申请一个或多个实施例的一些方面相一致的装置和方法的例子。Exemplary embodiments will be described in detail herein, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings refer to the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with one or more embodiments of the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of one or more embodiments of the present application as detailed in the appended claims.
需要说明的是:在其他实施例中并不一定按照本申请示出和描述的顺序来执行相应方法的步骤。在一些其他实施例中,其方法所包括的步骤可以比本申请所描述的更多或更少。此外,本申请中所描述的单个步骤,在其他实施例中可能被分解为多个步骤进行描述;而本申请中所描述的多个步骤,在其他实施例中也可能被合并为单个步骤进行描述。It should be noted that in other embodiments, the steps of the corresponding methods are not necessarily performed in the order shown and described in this application. In some other embodiments, methods may include more or fewer steps than described herein. In addition, a single step described in this application may be broken down into multiple steps for description in other embodiments; and multiple steps described in this application may also be combined into a single step in other embodiments. describe.
To address the technical problem that the application scenarios of robotic arms are rather limited, embodiments of the present application provide a robot pointing action control method. When the target object is beyond the reachable range of the end of the robot's arm, the arm of the robot's two arms that is closer to the target object is selected according to the target coordinates of the target object in the robot coordinate system; the pose at which the end of the selected arm points to the target object within its reachable range is obtained according to the mapping of the target coordinates, the shoulder-joint coordinates of the selected arm in the robot coordinate system, and the predetermined rules of the pointing action; the trajectory of the selected arm from the current pose of each of its joints to the target pose of each joint is determined according to the obtained pose of the arm end; and the selected arm is controlled to point to the target object according to the trajectory.
In the embodiments of the present application, when the target object is beyond the reachable range of the end of the robot's arm, the pose at which the end of the selected arm points to the target object within its reachable range is obtained according to the mapping of the target coordinates, the shoulder-joint coordinates of the selected arm in the robot coordinate system, and the predetermined rules of the pointing action, so that the selected arm completes the pointing action toward the target object. In this way, scenarios that require pointing to a target object can be supported, and the applicability of the robot is improved.
The robot pointing action control method of the embodiments of the present application is applied to scenarios in which a robot points to a target object. Such scenarios may include, but are not limited to, a robot giving explanations in an exhibition hall or presenting demonstration content. The demonstration content may include, but is not limited to, videos, speeches, and teaching material. The target object may include, but is not limited to, specific items within the demonstration content.
The specific implementation of the robot pointing action control method is described in detail below.
Figure 1 is a schematic flowchart of a robot pointing action control method provided by an embodiment of the present application.
As shown in Figure 1, the robot pointing action control method includes the following steps 110 to 140.
Step 110: when the target object is beyond the reachable range of the end of the robot's arm, select the arm of the robot's two arms that is closer to the target object according to the target coordinates of the target object in the robot coordinate system.
Here, the target object being beyond the reachable range of the arm end means that the target object lies outside the reachable range of the end of the robot's arm. By determining the target coordinates of the target object in the robot coordinate system, the target object and the robot's two arms are expressed in the same robot coordinate system, so the arm that is closer to the target object can be selected conveniently and quickly.
Step 110 allows the pointing action to be completed with the arm that is closer to the target object, which makes pointing more convenient, better matches user expectations, and is easier to implement.
Step 110 can be implemented in several ways. In one implementation, step 110 includes: according to the target coordinates of the target object in the robot coordinate system, determining, from two pre-divided regions that correspond to the robot's two arms, the region in which the target object lies; and taking the arm corresponding to that region as the arm of the two arms that is closer to the target object.
In another implementation, step 110 includes: determining the projections, on the horizontal coordinate plane of the robot coordinate system, of the vectors from the target object to the shoulder joints of the two arms; and taking the arm with the shorter projected distance as the arm of the two arms that is closer to the target object.
In this embodiment, from the dimensions of the robot's shoulder joints, such as the shoulder height h and the shoulder width w, the coordinates of the left shoulder joint, ^r p_ls = (0, w, h), and of the right shoulder joint, ^r p_rs = (0, -w, h), in the robot coordinate system can be obtained. The Euclidean distance between each shoulder joint and the target object is computed in the robot coordinate system, and the arm with the smaller distance is selected as the arm that performs the pointing action.
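As an illustration only (not part of the original disclosure), the arm-selection step could be sketched in Python roughly as follows; the function name, parameter names, and return convention are assumptions made for this sketch:

```python
import numpy as np

def select_arm(target_r, shoulder_w, shoulder_h):
    """Pick the arm whose shoulder joint is closer to the target.

    target_r   -- target coordinates in the robot (base) frame, shape (3,)
    shoulder_w -- shoulder half-width w
    shoulder_h -- shoulder height h
    Returns 'left' or 'right' together with the chosen shoulder coordinates.
    """
    p_left = np.array([0.0, shoulder_w, shoulder_h])    # r p_ls
    p_right = np.array([0.0, -shoulder_w, shoulder_h])  # r p_rs
    d_left = np.linalg.norm(np.asarray(target_r) - p_left)    # Euclidean distances
    d_right = np.linalg.norm(np.asarray(target_r) - p_right)
    return ('left', p_left) if d_left <= d_right else ('right', p_right)
```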
Step 120: obtain, according to the mapping of the target coordinates, the shoulder-joint coordinates of the selected arm in the robot coordinate system, and the predetermined rules of the pointing action, the pose at which the end of the selected arm points to the target object within its reachable range.
The predetermined rules of the pointing action serve as constraints on the pointing action and provide a reliable guarantee for the subsequent execution of the pointing action.
Through the mapping of the target coordinates, step 120 obtains the position at which the end of the selected arm points to the target object within its reachable range, which makes it possible to subsequently control the selected arm to point to the target object.
The pose at which the end of the selected arm points to the target object within its reachable range includes the position and the orientation of the end of the selected arm in the robot coordinate system.
There are several ways to determine, in step 120, the position of the end of the selected arm in the robot coordinate system. In some embodiments, along the direction in which the arm end points to the target object, a point whose distance from the shoulder of the selected arm is smaller than the arm length of the selected arm is taken as the position, within the reachable range of the arm end, to which the target coordinates are mapped.
In other embodiments, along the direction in which the arm end points to the target object, the arm length of the selected arm is multiplied by a reduction coefficient to obtain a reduced point within the reachable range, which is taken as the position to which the target coordinates are mapped. The reduction coefficient is greater than 0 and less than 1, which guarantees that the arm end stays within the reachable range. Details are given below.
In still other embodiments, along the direction in which the arm end points to the target object, the arm length of the selected arm is shortened by a predetermined length to obtain a reduced point within the reachable range, which is taken as the position to which the target coordinates are mapped. The predetermined length is greater than 0 and smaller than the arm length. To keep the selected arm as extended as possible, enlarge the pointing range of the arm end, and still keep the arm end within the reachable range, the predetermined length should be as small as possible compared with the arm length. Details are given below.
Of course, other methods of determining the position of the end of the selected arm in the robot coordinate system also fall within the scope of protection of the embodiments of the present application and are not enumerated here.
Step 130: determine, according to the obtained pose of the arm end, the trajectory of the selected arm from the current pose of each of its joints to the target pose of each joint. This trajectory serves as the basis for controlling the selected arm to point to the target object.
Step 140: control the selected arm to point to the target object according to the trajectory, so as to complete the pointing action in which the end of the selected arm points to the target object.
Figure 2 is a detailed flowchart of the robot pointing action control method provided by an embodiment of the present application.
As shown in Figure 2, before step 110 the method may further include, but is not limited to, the following steps 101 to 104, which determine that the target object is beyond the reachable range of the end of the robot's arm.
Step 101: establish the robot world coordinate system, the robot coordinate system, and the local coordinate system of each joint of the two arms.
Here, the robot coordinate system refers to the robot's base coordinate system.
Step 102: obtain, from the angle between the vector from the origin of the robot coordinate system to the coordinates of the target object in the robot coordinate system and the X axis of the robot coordinate system, the angle through which the robot chassis needs to rotate so that the target object is within the frontal range of the robot.
Specifically, step 102 can be implemented by, but is not limited to, the following three sub-steps:
(1) Use the following formula (1) to compute, in the robot world coordinate system, the angle α between the vector from the robot coordinates ^w p_robot to the target coordinates ^w p_target and the X axis of the world coordinate system:

α = atan2( ^w p_target(y) − ^w p_robot(y), ^w p_target(x) − ^w p_robot(x) )    (1)

where ^w p_target(y) − ^w p_robot(y) is the Y-direction component of the vector from ^w p_robot to ^w p_target, and ^w p_target(x) − ^w p_robot(x) is its X-direction component.

(2) Use the following formulas (2) and (3) to relate the angle α to the orientation angle θ of the robot's front in the world coordinate system, and thereby obtain the angle β through which the robot chassis needs to rotate:

Δ = α − θ    (2)

β = Δ − 2π if Δ > π;  β = Δ + 2π if Δ < −π;  β = Δ otherwise    (3)

where Δ is the difference between the angle α and the orientation angle θ of the robot's front in the world coordinate system; this difference may lie outside [−π, π], and the β obtained by the above computation is the minimum angle through which the chassis needs to rotate.

(3) Use the following formula (4) to obtain the orientation angle γ of the robot in the world coordinate system after the chassis has rotated:

γ = θ + β    (4)
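A minimal sketch of the chassis-rotation computation of step 102, assuming the reconstructed formulas (1) to (4) above; the names and the wrapping expression are illustrative choices, not taken from the original:

```python
import numpy as np

def chassis_rotation(p_robot_w, p_target_w, theta):
    """Minimum chassis rotation beta and resulting heading gamma.

    p_robot_w, p_target_w -- robot and target positions in the world frame
    theta                 -- current heading of the robot front in the world frame
    """
    v = np.asarray(p_target_w[:2]) - np.asarray(p_robot_w[:2])
    alpha = np.arctan2(v[1], v[0])                    # formula (1)
    delta = alpha - theta                             # formula (2)
    beta = (delta + np.pi) % (2 * np.pi) - np.pi      # wrap to [-pi, pi), formula (3)
    gamma = theta + beta                              # formula (4)
    return beta, gamma
```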
Step 103: using the robot's coordinates ^w p_robot in the world coordinate system and the orientation angle γ, transform the target coordinates ^w p_target of the target object in the world coordinate system into coordinates in the robot coordinate system, which can be computed with the following formulas (5) and (6):

R_robot = [ cos γ  −sin γ  0;  sin γ  cos γ  0;  0  0  1 ]    (5)

^r p_target = R_robot⁻¹ ( ^w p_target − ^w p_robot )    (6)

where ^r p_target is the coordinates of the target object in the robot coordinate system, R_robot is the orientation matrix of the robot in the world coordinate system, R_robot⁻¹ is the inverse of that orientation matrix, target denotes the target object, and robot denotes the robot.
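A hedged sketch of the world-to-robot transform of step 103, assuming the chassis orientation is a pure rotation about the vertical axis as in the reconstructed formula (5); the names are illustrative:

```python
import numpy as np

def world_to_robot(p_target_w, p_robot_w, gamma):
    """Express the target position in the robot base frame (formulas (5)-(6))."""
    c, s = np.cos(gamma), np.sin(gamma)
    R_robot = np.array([[c, -s, 0.0],
                        [s,  c, 0.0],
                        [0.0, 0.0, 1.0]])   # robot orientation in the world frame
    return R_robot.T @ (np.asarray(p_target_w) - np.asarray(p_robot_w))  # R^-1 = R^T
```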
Step 104: determine that the target object is beyond the reachable range of the end of the robot's arm. In one embodiment of step 104, if the target object lies outside the maximum reachable range of the arm end, it is determined that the target object is beyond the reachable range of the arm end. In another embodiment of step 104, if the distance between the target object and either shoulder joint of the robot is greater than the arm length of the robot's arm, it is determined that the target object is beyond the reachable range of the arm end.
Figure 3 is a simplified structural diagram of the robot coordinate system and the target object in the robot pointing action control method provided by an embodiment of the present application.
As shown in Figure 3, the XY plane coordinate system is the projection of the robot base coordinate system onto the XY plane. It contains the origin O of the XY plane coordinate system, which corresponds to the head position 21 of the robot, the left shoulder joint 22 of the left arm and the right shoulder joint 23 of the right arm, the target object 24, and the point 25 within the reachable range of the end of the selected arm. The following describes in detail how the three-dimensional coordinates of point 25 in the robot coordinate system, namely its X, Y and Z coordinates, are determined.
Continuing with Figure 3, step 120 may include, but is not limited to, the following steps 121 to 123. Step 121: map the target coordinates to coordinates within the reachable range of the end of the selected arm.
In some embodiments of step 121, the first sub-step is to obtain the X and Y coordinates of the end of the selected arm from the angle between the vector from the shoulder joint of the selected arm to the target object in the robot coordinate system and the X axis of the robot coordinate system, together with the arm length of the selected arm.
For example, from the angle η between the vector from the shoulder joint ^r p_s of the selected arm to the target object ^r p_target in the robot coordinate system and the X axis of the robot coordinate system, and the arm length L of the selected arm, the X and Y coordinates ^r p_e(x) and ^r p_e(y) of the arm end ^r p_e can be obtained with the following formula (7):

^r p_e(x) = ^r p_s(x) + K·L·cos η,  ^r p_e(y) = ^r p_s(y) + K·L·sin η    (7)

where K is set so that the distance between point 25 and the shoulder joint of the selected arm is smaller than the arm length L of the selected arm.
The second sub-step is to obtain the Z coordinate ^r p_e(z) of the end of the selected arm from the relationship between the Z-direction coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected arm, and the arm length, which can be computed with the following formula (8):

^r p_e(z) = ^r p_s(z) − δ,        if ^r p_target(z) ≤ ^r p_s(z)                      (8-1)
^r p_e(z) = ^r p_target(z) − δ,   if ^r p_s(z) < ^r p_target(z) ≤ ^r p_s(z) + L      (8-2)
^r p_e(z) = ^r p_s(z) + L − δ,    if ^r p_target(z) > ^r p_s(z) + L                  (8-3)

where δ is a value greater than zero and smaller than L, and may be a value traversed within that range.
Formula (8-1) covers the case in which, in the Z direction, the target object is lower than the shoulder joint; the end of the selected arm should then be lower than the shoulder joint, so δ is subtracted from the shoulder joint's ^r p_s(z), which matches an arm end that points at a target object below the shoulder joint.
Formula (8-2) covers the case in which, in the Z direction, the target object is higher than the shoulder joint but lower than the combined height of the shoulder joint and the arm, i.e. the target object itself is above the shoulder joint and within the reach of the arm; δ is subtracted from the target object's ^r p_target(z), so that the arm end points at the target object from a position slightly below it, which better matches an arm end that points at the target object.
Formula (8-3) covers the case in which, in the Z direction, the target object is higher than the combined height of the shoulder joint and the arm; δ is subtracted from that combined height ^r p_s(z) + L, so the point at which the arm points is below the target object. In this way the end of the selected arm stays within the reachable range and points at the target object from as close to its maximum reach as possible.
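A sketch, under the reconstruction of formulas (7) and (8) given above, of how the target could be mapped to a reachable arm-end position; the default values of K and delta are illustrative assumptions only:

```python
import numpy as np

def map_to_reachable(p_target_r, p_shoulder_r, L, K=0.8, delta=0.1):
    """Map the target to a reachable arm-end position (formulas (7)-(8)).

    K (0 < K < 1) and delta (0 < delta < L) are illustrative values,
    not taken from the original disclosure.
    """
    v = np.asarray(p_target_r[:2]) - np.asarray(p_shoulder_r[:2])
    eta = np.arctan2(v[1], v[0])                  # angle to the X axis
    x = p_shoulder_r[0] + K * L * np.cos(eta)     # formula (7)
    y = p_shoulder_r[1] + K * L * np.sin(eta)
    tz, sz = p_target_r[2], p_shoulder_r[2]
    if tz <= sz:                                  # target below the shoulder, (8-1)
        z = sz - delta
    elif tz <= sz + L:                            # target within arm reach in Z, (8-2)
        z = tz - delta
    else:                                         # target above shoulder + arm, (8-3)
        z = sz + L - delta
    return np.array([x, y, z])
```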
Step 122: determine the designated direction of the end of the selected arm according to the predetermined rules of the pointing action.
Step 123: generate the orientation of the end of the selected arm from the mapped coordinates, the shoulder-joint coordinates of the selected arm in the robot coordinate system, and the designated direction.
In some embodiments of step 123, the first sub-step is to take the unit vector of the vector from the target object to the arm end as the three Z-direction components that form the Z-direction vector of the orientation matrix of the end of the selected arm, the orientation matrix of the arm end being a matrix of 3 rows and 3 columns. For example, the orientation matrix R of the end of the selected arm is expressed by the following formula (9):

R = [ ax  ox  nx;  ay  oy  ny;  az  oz  nz ]    (9)

where (ax, ay, az) is the X-direction vector, its three X-direction components forming the first column; (ox, oy, oz) is the Y-direction vector, its three Y-direction components forming the second column; and (nx, ny, nz) is the Z-direction vector, its three Z-direction components forming the third column.
The first sub-step can be computed with the following formulas (10) and (11): the unit vector of the vector from the target object ^r p_target to the end ^r p_e of the selected arm is the third column of the orientation matrix of the arm end, i.e.:

v = ^r p_e − ^r p_target    (10)

(nx, ny, nz) = v / ‖v‖    (11)

The second sub-step is: when the end of the selected arm points to the target object, the three Z-direction components of the orientation matrix are positive, and the selected arm is the right arm, the three Y-direction components that form the Y-direction vector of the orientation matrix are determined to be positive.
The third sub-step is: when the end of the selected arm points to the target object, the three Z-direction components of the orientation matrix are positive, and the selected arm is the left arm, the three Y-direction components that form the Y-direction vector of the orientation matrix are determined to be negative.
Figure 4a is a schematic diagram of the current configuration of the robot in the robot pointing action control method provided by an embodiment of the present application. Figure 4b is a schematic diagram of the target configuration of the robot in the robot pointing action control method provided by an embodiment of the present application.
As shown in Figures 4a and 4b, the center of the base of the robot's base coordinate system is O_r. The two arms include a left arm (not labeled in the figure) and a right arm 30. Taking the right arm 30 as the selected arm as an example, the right arm 30 may include, but is not limited to, a right shoulder joint 31, a right upper arm 32, a right forearm 33, and a right palm 34. The left arm is analogous and is not described again here.
Continuing with Figures 4a and 4b, the second and third sub-steps above can be implemented as follows.
According to the rule that the palm is tilted upward when the arm points to the target object, the Z-direction component of the end orientation matrix R is positive when pointing to the target object; when the right arm points to the target object, the Y-direction component of the end orientation matrix R is positive, which means the palm faces upward, while for the left arm that Y-direction component is negative. Taking the right arm shown in Figures 4a and 4b as an example, the following formula (12) is satisfied:

ox·nx + oy·ny + oz·nz = 0,  ox² + oy² + oz² = 1    (12)

The value of oz affects how strongly the palm faces upward and can be chosen according to the desired pointing effect; in this example oz = 0.4 is chosen, and the value may also float around this figure depending on the actual solution result, which gives:

ox·nx + oy·ny = −oz·nz,  ox² + oy² = 1 − oz²    (13)

Let:

c = −oz·nz    (14)

from which:

oy = ( c·ny ± |nx|·sqrt( (nx² + ny²)(1 − oz²) − c² ) ) / (nx² + ny²),  ox = ( c − oy·ny ) / nx    (15)

where the root is chosen so that oy is positive for the right arm.
The left arm is handled in the same way and is not described again here.
Sub-step 4: according to the right-hand rule, determine the three X-direction components that form the X-direction vector of the orientation matrix from the three Z-direction components and the three Y-direction components of the orientation matrix of the end of the selected arm. Compared with solving nonlinear equations, this is more convenient and does not depend on existing libraries, such as MATLAB's nonlinear equation solvers, so it can increase the computation speed.
Sub-step 4 can be implemented as follows: according to the right-hand rule, the first column of the end orientation matrix can be obtained from its second column (ox, oy, oz) and third column (nx, ny, nz), as expressed by the following formula (16):

(ax, ay, az) = (ox, oy, oz) × (nx, ny, nz)    (16)
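A sketch of how the end orientation matrix could be assembled from formulas (9) to (16) as reconstructed above; the default oz value, the choice of root for the right and left arm, and the near-zero nx fallback are assumptions made for this illustration:

```python
import numpy as np

def end_orientation(p_target_r, p_end_r, right_arm=True, oz=0.4):
    """Assemble the 3x3 arm-end orientation matrix R = [a | o | n].

    n: unit vector from the target object to the arm end (formulas (10)-(11));
    o: unit vector orthogonal to n with its Z component fixed to oz, and its
       Y component positive for the right arm, negative for the left (assumed);
    a: o x n by the right-hand rule (formula (16)).
    """
    n = np.asarray(p_end_r, dtype=float) - np.asarray(p_target_r, dtype=float)
    n = n / np.linalg.norm(n)
    c = -oz * n[2]                          # from o . n = 0 with o_z fixed to oz
    A = n[0] ** 2 + n[1] ** 2
    disc = max(A * (1.0 - oz ** 2) - c ** 2, 0.0)
    root = abs(n[0]) * np.sqrt(disc)
    candidates = [(c * n[1] + root) / A, (c * n[1] - root) / A]
    oy = max(candidates) if right_arm else min(candidates)
    if abs(n[0]) > 1e-9:
        ox = (c - oy * n[1]) / n[0]
    else:                                   # degenerate case: no X component in n
        ox = np.sqrt(max(1.0 - oz ** 2 - oy ** 2, 0.0))
    o = np.array([ox, oy, oz])
    a = np.cross(o, n)                      # first column by the right-hand rule
    return np.column_stack([a, o, n])
```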
Step 130 may further include, but is not limited to, the following two sub-steps. The first sub-step is to solve the inverse kinematics from the obtained pose of the arm end and obtain the target posture of each joint of the arm. The second sub-step is to obtain, through trajectory interpolation, the trajectory of the end of the selected arm from the current pose to the target pose of each joint.
The first sub-step may further include, but is not limited to:
solving the inverse kinematics from the obtained pose of the arm end to obtain the target configuration of the selected arm, where the target configuration includes the target angle of each joint;
performing trajectory interpolation according to the current configuration of the robot and the target configuration, and controlling the arm to move from the current configuration of the robot to the target configuration in which the robot points to the target object, where the current configuration of the robot includes the current angle of each joint.
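The original does not specify the interpolation scheme; as an illustration only, a simple linear joint-space interpolation between the current and target configurations could look like this:

```python
import numpy as np

def interpolate_trajectory(q_current, q_target, steps=50):
    """Joint-space interpolation from the current to the target configuration.

    q_current, q_target -- joint angle vectors of the selected arm
    Returns an array of shape (steps, n_joints); linear interpolation is used
    here purely as an illustration of the trajectory-interpolation step.
    """
    q_current = np.asarray(q_current, dtype=float)
    q_target = np.asarray(q_target, dtype=float)
    s = np.linspace(0.0, 1.0, steps)[:, None]   # interpolation parameter
    return q_current + s * (q_target - q_current)
```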
Figure 5 is a schematic flowchart of measuring the accuracy of the pointing action in the robot pointing action control method provided by an embodiment of the present application.
As shown in Figures 1 to 5, after step 140 the method may further include, but is not limited to, step 150: testing the obtained pose of the arm end with a measuring device and determining the error of the pointing action in which the end of the selected arm points to the target object. In this way, after the arm has been controlled to point at the target, the deviation of the arm from the target is determined and the accuracy of the pointing action is measured, which facilitates subsequent adjustment and control optimization.
In some embodiments, testing the obtained pose of the arm end with a measuring device and determining the error of the pointing action in which the end of the selected arm points to the target object includes: a first step of determining the vector from the target object to the end of the selected arm, where, specifically, a measuring device can be used to measure the pose of the arm end and the vector from the target object to the arm end is computed from the position of the target object and the measured position of the arm end; and a second step of determining the angle between this vector and the Z-direction vector of the orientation matrix of the end of the selected arm as the error.
Further, step 150 may include, but is not limited to, the following 1) to 3):
1) Use a laser tracker to measure the pose of the arm end; the coordinate system used for the measurement coincides with the robot coordinate system, so the measurement result is the actual pose (^r p_e′, ^r R_e′) of the arm end in the robot coordinate system.
2) From the measurement result in 1) and the coordinates ^r p_target of the target object in the robot coordinate system, compute the vector from the target object to the arm end, v′ = ^r p_e′ − ^r p_target.
3) The angle between the vector n′ represented by the third column of the orientation matrix ^r R_e′ and the vector v′ obtained in 2) is the error δ of the pointing action, computed with the following formula (17):

δ = arccos( (n′ · v′) / (‖n′‖ · ‖v′‖) )    (17)
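A minimal sketch of the error computation of formula (17); the argument layout (measured end position, measured 3x3 orientation matrix, target coordinates) is an assumption made for this illustration:

```python
import numpy as np

def pointing_error(p_end_measured, R_end_measured, p_target_r):
    """Angle between the measured pointing axis and the target-to-end vector."""
    v = np.asarray(p_end_measured) - np.asarray(p_target_r)  # target -> measured arm end
    n = np.asarray(R_end_measured)[:, 2]                     # third column: Z-direction vector
    cos_angle = np.dot(n, v) / (np.linalg.norm(n) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))          # error in radians
```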
Figure 6 is a block diagram of the modules of a robot pointing action control apparatus provided by an embodiment of the present application.
As shown in Figure 6, the robot pointing action control apparatus includes the following modules:
an arm selection module 41, configured to, when the target object is beyond the reachable range of the end of the robot's arm, select the arm of the robot's two arms that is closer to the target object according to the target coordinates of the target object in the robot coordinate system;
an arm-end pose determination module 42, configured to obtain, according to the mapping of the target coordinates, the shoulder-joint coordinates of the selected arm in the robot coordinate system, and the predetermined rules of the pointing action, the pose at which the end of the selected arm points to the target object within its reachable range;
a trajectory determination module 43, configured to determine, according to the obtained pose of the arm end, the trajectory of the selected arm from the current pose of each of its joints to the target pose of each joint;
an action control module 44, configured to control the selected arm to point to the target object according to the trajectory.
In some embodiments, the arm-end pose determination module 42 includes:
a coordinate mapping submodule, configured to map the target coordinates to coordinates within the reachable range of the end of the selected arm;
a designated-direction determination submodule, configured to determine the designated direction of the end of the selected arm according to the predetermined rules of the pointing action;
an arm-end orientation generation submodule, configured to generate the orientation of the end of the selected arm from the mapped coordinates, the shoulder-joint coordinates of the selected arm in the robot coordinate system, and the designated direction.
In some embodiments, the coordinate mapping submodule is specifically configured to: obtain the X and Y coordinates of the end of the selected arm from the angle between the vector from the shoulder joint of the selected arm to the target object in the robot coordinate system and the X axis of the robot coordinate system, together with the arm length of the selected arm; and obtain the Z coordinate of the end of the selected arm from the relationship between the Z-direction coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected arm, and the arm length;
and/or,
the arm-end orientation generation submodule is specifically configured to:
take the unit vector of the vector from the target object to the arm end as the three Z-direction components that form the Z-direction vector of the orientation matrix of the end of the selected arm, the orientation matrix of the end of the selected arm being a matrix of 3 rows and 3 columns;
when the end of the selected arm points to the target object, the three Z-direction components of the orientation matrix are positive, and the selected arm is the right arm, determine that the three Y-direction components that form the Y-direction vector of the orientation matrix are positive;
when the end of the selected arm points to the target object, the three Z-direction components of the orientation matrix are positive, and the selected arm is the left arm, determine that the three Y-direction components that form the Y-direction vector of the orientation matrix are negative;
determine, according to the right-hand rule, the three X-direction components that form the X-direction vector of the orientation matrix from the three Z-direction components and the three Y-direction components of the orientation matrix of the end of the selected arm.
In some embodiments, the apparatus further includes a pointing-action error determination module, configured to test the obtained pose of the arm end with a measuring device and determine the error of the pointing action in which the end of the selected arm points to the target object.
In some embodiments, the pointing-action error determination module is specifically configured to: determine the vector from the target object to the end of the selected arm; and determine the angle between this vector and the Z-direction vector of the orientation matrix of the end of the selected arm as the error.
In some embodiments, the trajectory determination module 43 includes:
a target posture determination submodule, configured to solve the inverse kinematics from the obtained pose of the arm end and obtain the target posture of each joint of the arm;
a trajectory determination submodule, configured to obtain, through trajectory interpolation, the trajectory of the end of the selected arm from the current pose to the target pose of each joint.
In some embodiments, the arm-end pose determination module 42 is specifically configured to: solve the inverse kinematics from the obtained pose of the arm end to obtain the target configuration of the selected arm, the target configuration including the target angle of each joint; and perform trajectory interpolation according to the current configuration of the robot and the target configuration, controlling the arm to move from the current configuration of the robot to the target configuration in which the robot points to the target object, the current configuration of the robot including the current angle of each joint.
For the implementation of the functions and roles of each module in the above apparatus, see the implementation of the corresponding steps in the above method; details are not repeated here.
The present application also provides a computer-readable storage medium on which computer instructions are stored; when the instructions are executed by a processor, the above robot pointing action control method is implemented.
The present application also provides an electronic device. Figure 7 is a block diagram of the modules of the electronic device provided by an embodiment of the present application.
As shown in Figure 7, the electronic device 50 includes one or more processors 51 for implementing the robot pointing action control method described above.
In some embodiments, the electronic device 50 may include a memory 59 that can store a program callable by the processor 51 and may include a non-volatile storage medium. In some embodiments, the electronic device 50 may include an internal memory 58 and an interface 57. In some embodiments, the electronic device 50 may also include other hardware depending on the actual application.
The memory 59 of the embodiments of the present application stores a program which, when executed by the processor 51, implements the robot pointing action control method described above.
The present application may take the form of a computer program product implemented on one or more memories 59 (including but not limited to disk storage, CD-ROM, optical storage, etc.) that contain program code. The memory 59 includes permanent and non-permanent, removable and non-removable media, and information storage may be accomplished by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of the memory 59 include, but are not limited to: phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The above are merely preferred embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall be included within the scope of protection of the present application.
It should also be noted that the terms "comprise", "include" or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the statement "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes the element.

Claims (10)

  1. A robot pointing action control method, comprising:
    when a target object is beyond the reachable range of the end of a robot's arm, selecting, according to target coordinates of the target object in a robot coordinate system, the arm of the robot's two arms that is closer to the target object;
    obtaining, according to a mapping of the target coordinates, shoulder-joint coordinates of the selected arm's shoulder joint in the robot coordinate system, and predetermined rules of the pointing action, a pose at which the end of the selected arm points to the target object within the reachable range;
    determining, according to the obtained pose of the arm end, a trajectory of the selected arm from the current pose of each joint of the selected arm to a target pose of each joint;
    controlling the selected arm to point to the target object according to the trajectory.
  2. The robot pointing action control method according to claim 1, wherein obtaining, according to the mapping of the target coordinates, the shoulder-joint coordinates of the selected arm's shoulder joint in the robot coordinate system, and the predetermined rules of the pointing action, the pose at which the end of the selected arm points to the target object within the reachable range comprises:
    mapping the target coordinates to coordinates within the reachable range of the end of the selected arm;
    determining a designated direction of the end of the selected arm according to the predetermined rules of the pointing action;
    generating an orientation of the end of the selected arm according to the mapped coordinates, the shoulder-joint coordinates of the selected arm's shoulder joint in the robot coordinate system, and the designated direction.
  3. The robot pointing action control method according to claim 2, wherein
    mapping the target coordinates to coordinates within the reachable range of the end of the selected arm comprises:
    obtaining the X and Y coordinates of the end of the selected arm according to the angle between the vector from the shoulder joint of the selected arm to the target object in the robot coordinate system and the X axis of the robot coordinate system, and the arm length of the selected arm;
    obtaining the Z coordinate of the end of the selected arm according to the relationship between the Z-direction coordinate of the target object in the robot coordinate system, the coordinates of the shoulder joint of the selected arm, and the arm length;
    and/or,
    generating the orientation of the end of the selected arm according to the mapped coordinates, the shoulder-joint coordinates of the selected arm's shoulder joint in the robot coordinate system, and the designated direction comprises:
    taking the unit vector of the vector from the target object to the arm end as the three Z-direction components that form the Z-direction vector of the orientation matrix of the end of the selected arm, the orientation matrix of the end of the selected arm being a matrix of 3 rows and 3 columns;
    when the end of the selected arm points to the target object, the three Z-direction components of the orientation matrix are positive, and the selected arm is the right arm, determining that the three Y-direction components that form the Y-direction vector of the orientation matrix are positive;
    when the end of the selected arm points to the target object, the three Z-direction components of the orientation matrix are positive, and the selected arm is the left arm, determining that the three Y-direction components that form the Y-direction vector of the orientation matrix are negative;
    determining, according to the right-hand rule, the three X-direction components that form the X-direction vector of the orientation matrix from the three Z-direction components and the three Y-direction components of the orientation matrix of the end of the selected arm.
  4. The robot pointing action control method according to claim 1, wherein after controlling the selected arm to point to the target object according to the trajectory, the method further comprises:
    testing the obtained pose of the arm end with a measuring device, and determining the error of the pointing action in which the end of the selected arm points to the target object.
  5. The robot pointing action control method according to claim 4, wherein testing the obtained pose of the arm end with a measuring device and determining the error of the pointing action in which the end of the selected arm points to the target object comprises:
    determining the vector from the target object to the end of the selected arm;
    determining the angle between the vector and the Z-direction vector of the orientation matrix of the end of the selected arm as the error.
  6. The robot pointing action control method according to claim 1, wherein determining, according to the obtained pose of the arm end, the trajectory of the selected arm from the current pose of each joint of the selected arm to the target pose of each joint comprises:
    solving inverse kinematics according to the obtained pose of the arm end to obtain the target posture of each joint of the arm;
    obtaining, through trajectory interpolation, the trajectory of the end of the selected arm from the current pose to the target pose of each joint.
  7. The robot pointing action control method according to claim 6, wherein solving inverse kinematics according to the obtained pose of the arm end to obtain the target posture of each joint of the arm comprises:
    solving inverse kinematics according to the obtained pose of the arm end to obtain a target configuration of the selected arm, the target configuration including the target angle of each joint;
    performing trajectory interpolation according to the current configuration of the robot and the target configuration, and controlling the arm to move from the current configuration of the robot to the target configuration of the robot pointing to the target object, the current configuration of the robot including the current angle of each joint.
  8. A robot pointing action control apparatus, comprising:
    an arm selection module, configured to, when a target object is beyond the reachable range of the end of a robot's arm, select, according to target coordinates of the target object in a robot coordinate system, the arm of the robot's two arms that is closer to the target object;
    an arm-end pose determination module, configured to obtain, according to a mapping of the target coordinates, shoulder-joint coordinates of the selected arm's shoulder joint in the robot coordinate system, and predetermined rules of the pointing action, a pose at which the end of the selected arm points to the target object within the reachable range;
    a trajectory determination module, configured to determine, according to the obtained pose of the arm end, a trajectory of the selected arm from the current pose of each joint of the selected arm to a target pose of each joint;
    an action control module, configured to control the selected arm to point to the target object according to the trajectory.
  9. An electronic device, comprising:
    one or more processors;
    a memory for storing one or more programs;
    wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1-7.
  10. A computer-readable storage medium on which computer instructions are stored, wherein, when the instructions are executed by a processor, the method according to any one of claims 1-7 is implemented.
PCT/CN2023/118887 2023-01-03 2023-09-14 Method and apparatus for controlling pointing action of robot, and electronic device and storage medium WO2024037658A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310009464.8A CN115922728B (en) 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium
CN202310009464.8 2023-01-03

Publications (1)

Publication Number Publication Date
WO2024037658A1 true WO2024037658A1 (en) 2024-02-22

Family

ID=86654337

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/118887 WO2024037658A1 (en) 2023-01-03 2023-09-14 Method and apparatus for controlling pointing action of robot, and electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN115922728B (en)
WO (1) WO2024037658A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115922728B (en) * 2023-01-03 2023-06-30 之江实验室 Robot pointing motion control method, apparatus, electronic device, and storage medium
CN116141341B (en) * 2023-04-21 2023-08-08 之江实验室 Method for realizing pointing action of five-degree-of-freedom mechanical arm meeting Cartesian space constraint

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102030141B1 (en) * 2014-06-11 2019-10-10 현대자동차(주) Method and system for controlling elbow of robot
CN105415372B (en) * 2015-12-09 2017-04-12 常州汉迪机器人科技有限公司 Multi-joint robot track planning method under constraint of safety space
US10723022B2 (en) * 2016-09-16 2020-07-28 Carbon Robotics, Inc. System and calibration, registration, and training methods
KR102001214B1 (en) * 2017-10-31 2019-10-01 충남대학교산학협력단 Apparatus and method for dual-arm robot teaching based on virtual reality
JP6720950B2 (en) * 2017-11-13 2020-07-08 株式会社安川電機 Laser processing method, controller and robot system
CN108187310B (en) * 2017-12-21 2019-05-31 东南大学 Feel that the limb motion of information and posture information is intended to understand and upper-limbs rehabilitation training robot and its control method based on power
CN109048890B (en) * 2018-07-13 2021-07-13 哈尔滨工业大学(深圳) Robot-based coordinated trajectory control method, system, device and storage medium
CN112775931B (en) * 2019-11-05 2022-06-28 深圳市优必选科技股份有限公司 Mechanical arm control method and device, computer-readable storage medium and robot
EP3824839A1 (en) * 2019-11-19 2021-05-26 Koninklijke Philips N.V. Robotic positioning of a device
CN112828885B (en) * 2020-12-30 2022-09-20 诺创智能医疗科技(杭州)有限公司 Hybrid master-slave mapping method, mechanical arm system and computer equipment
CN114310915B (en) * 2022-02-16 2022-09-09 哈尔滨工业大学 Space manipulator butt joint end tool trajectory planning method based on visual feedback
CN114952868B (en) * 2022-07-26 2022-11-11 之江实验室 7-degree-of-freedom SRS type mechanical arm control method and device and piano playing robot

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011224737A (en) * 2010-04-21 2011-11-10 Toyota Motor Corp Guide robot, guide method, and program for controlling guide
US20120173019A1 (en) * 2010-12-29 2012-07-05 Samsung Electronics Co., Ltd. Robot and control method thereof
CN107972026A (en) * 2016-10-25 2018-05-01 深圳光启合众科技有限公司 Robot, mechanical arm and its control method and device
JP2020175466A (en) * 2019-04-17 2020-10-29 アズビル株式会社 Teaching device and teaching method
CN113601504A (en) * 2021-08-04 2021-11-05 之江实验室 Robot limb action control method and device, electronic device and storage medium
CN113814988A (en) * 2021-11-24 2021-12-21 之江实验室 7-degree-of-freedom SRS type mechanical arm inverse solution analysis method and device and electronic equipment
CN115922728A (en) * 2023-01-03 2023-04-07 之江实验室 Robot pointing motion control method, device, electronic device and storage medium

Also Published As

Publication number Publication date
CN115922728B (en) 2023-06-30
CN115922728A (en) 2023-04-07

Similar Documents

Publication Title
WO2024037658A1 (en) Method and apparatus for controlling pointing action of robot, and electronic device and storage medium
US8560122B2 (en) Teaching and playback method based on control of redundancy resolution for robot and computer-readable medium controlling the same
JP6137155B2 (en) Interference avoidance method, control device, and program
WO2021128787A1 (en) Positioning method and apparatus
Chaumette et al. Visual servo control. I. Basic approaches
JP2019516146A (en) Robot obstacle avoidance control system, method, robot and storage medium
CN114800534B (en) Mechanical arm control method and device
WO2022252676A1 (en) Method and apparatus for calibrating robotic arm flange physical origin, and electronic device
KR20220155921A (en) Method for controlling a robot device
CN111216136A (en) Multi-degree-of-freedom mechanical arm control system, method, storage medium and computer
JP7262373B2 (en) Trajectory plan generation device, trajectory plan generation method, and trajectory plan generation program
Cong et al. A new decoupled control law for image-based visual servoing control of robot manipulators
JPH02224004A (en) Interference check device for moving body
JPH08108383A (en) Manipulator control device
Fomena et al. Improvements on visual servoing from spherical targets using a spherical projection model
He et al. Haptic-aided robot path planning based on virtual tele-operation
CN112476435B (en) Calibration method and calibration device for gravity acceleration direction and storage medium
Marchand et al. Visual servoing through mirror reflection
JP2019197333A (en) Path correction method and control device of multiple spindle processing machine
JP6515828B2 (en) Interference avoidance method
Astad et al. Vive for robotics: Rapid robot cell calibration
WO2022141160A1 (en) Master-slave mapping method for parallel platform, and mechanical arm system and storage medium
Park et al. Tracking on lie group for robot manipulators
CN112907669A (en) Camera pose measuring method and device based on coplanar feature points
JP2015058493A (en) Control device, robot system, robot, robot operation information generation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23854553

Country of ref document: EP

Kind code of ref document: A1