CN115922728A - Robot pointing motion control method, device, electronic device and storage medium - Google Patents


Info

Publication number
CN115922728A
CN115922728A (application CN202310009464.8A)
Authority
CN
China
Prior art keywords
mechanical arm
robot
target object
arm
tail end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310009464.8A
Other languages
Chinese (zh)
Other versions
CN115922728B (en)
Inventor
朱世强
黄秋兰
王凡
谢安桓
梁定坤
顾建军
Current Assignee
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202310009464.8A
Publication of CN115922728A
Application granted
Publication of CN115922728B
Priority to PCT/CN2023/118887 (published as WO2024037658A1)
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application provides a robot pointing motion control method and device, an electronic device, and a storage medium. The method comprises the following steps: when a target object is beyond the reach of the end of a mechanical arm of the robot, selecting, from the robot's two mechanical arms, the arm closer to the target object according to the target object's coordinates in the robot coordinate system; obtaining, from a mapping of the target object coordinates, the shoulder joint coordinates of the selected arm's shoulder joint in the robot coordinate system, and a predetermined rule of the pointing motion, an arm-end pose that points at the target object within the reachable range of the arm end; determining a trajectory from the current pose of each joint of the selected arm to the target pose of each joint according to the obtained arm-end pose; and controlling the selected arm to point at the target object along that trajectory.

Description

Robot pointing motion control method, device, electronic device and storage medium
Technical Field
The invention relates to the technical field of anthropomorphic motion of robot mechanical arms, and in particular to a robot pointing motion control method and device, an electronic device, and a storage medium.
Background
With the development of science and technology, robots have entered many aspects of daily life: shopping malls, restaurants, exhibition halls, and similar venues have introduced service robots. In current human-robot interaction technology, however, voice and vision are relatively mature while limb motion control is comparatively lacking, which makes the interaction feel rigid. Adding body language on top of voice and vision would make a robot more anthropomorphic when interacting with people.
In the related art, when the target object is beyond the reach of the end of the robot's mechanical arm, the robot takes no action toward it at all. The application scenarios of the mechanical arm are therefore limited.
Disclosure of Invention
The application provides a robot pointing motion control method and device, an electronic device, and a storage medium, which enable the scenario of pointing at a target object and improve the applicability of the robot.
The application provides a robot pointing motion control method, which comprises the following steps:
under the condition that a target object exceeds the reachable range of the tail end of a mechanical arm of the robot, selecting the mechanical arm which is closer to the target object in the two mechanical arms of the robot according to the target object coordinate of the target object in a robot coordinate system;
according to the mapping of the target object coordinate, the shoulder joint coordinate of the shoulder joint of the selected mechanical arm in the robot coordinate system and a preset rule of pointing motion, acquiring the pose of the mechanical arm of the selected mechanical arm pointing to the target object within the reachable range of the tail end of the mechanical arm;
determining a trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint according to the obtained pose of the end of the mechanical arm;
and controlling the selected mechanical arm to point to the target object according to the track.
Further, the obtaining the pose of the selected mechanical arm pointing to the target object within the reach of the mechanical arm end according to the mapping of the target object coordinates, the shoulder joint coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system and the predetermined rule of the pointing action includes:
mapping the target object coordinates to coordinates within a reach of the tail end of the selected mechanical arm;
determining the designated direction of the tail end of the mechanical arm of the selected mechanical arm according to the preset rule of the pointing action;
generating a pose of the robot arm tip of the selected robot arm based on the mapped coordinates, the shoulder joint coordinates of the shoulder joint of the selected robot arm in the robot coordinate system, and the specified direction.
Further, the mapping the target object coordinates to coordinates within reach of the end of the robot arm of the selected robot arm comprises: acquiring an X coordinate and a Y coordinate of the tail end of the mechanical arm of the selected mechanical arm according to an included angle between a vector of the shoulder joint of the selected mechanical arm pointing to the target object and an X axis of the robot coordinate system and the arm length of the selected mechanical arm in the robot coordinate system; acquiring a Z coordinate of the tail end of the mechanical arm of the selected mechanical arm according to the Z-direction coordinate of the target object in a robot coordinate system and the relation between the coordinate of the shoulder joint of the selected mechanical arm and the arm length;
and/or,
generating a pose of the robot arm tip of the selected robot arm from the mapped coordinates, shoulder joint coordinates of the shoulder joint of the selected robot arm in the robot coordinate system, and the specified direction, comprising:
determining the unit vector pointing from the target object to the end of the mechanical arm as the three Z-direction components forming the Z-axis vector of the attitude matrix of the end of the selected mechanical arm, the attitude matrix being a matrix with 3 rows and 3 columns;
determining the three Y-direction components forming the Y-axis vector of the attitude matrix to be positive when the end of the selected mechanical arm points at the target object, the three Z-direction components of the attitude matrix are positive, and the selected mechanical arm is the right arm;
determining the three Y-direction components forming the Y-axis vector of the attitude matrix to be negative when the end of the selected mechanical arm points at the target object, the three Z-direction components of the attitude matrix are positive, and the selected mechanical arm is the left arm;
and determining the three X-direction components forming the X-axis vector of the attitude matrix from its three Z-direction components and three Y-direction components according to the right-hand rule.
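The patent describes the attitude matrix only by its column vectors. As a rough Python sketch (not the patent's implementation), assuming the z-axis is the unit vector from the target object to the arm end, and taking the world Y axis (sign flipped for the left arm) as the reference direction for the y-axis:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def end_pose_matrix(target, end, arm="right"):
    """3x3 attitude matrix of the arm end: z-axis is the unit vector
    from the target object to the arm end; the y-axis sign depends on
    which arm is selected; x follows from the right-hand rule."""
    z = normalize(tuple(e - t for e, t in zip(end, target)))
    # reference direction for y: +Y for the right arm, -Y for the left arm
    # (an assumption; the patent only fixes the sign of the y components)
    ref = (0.0, 1.0, 0.0) if arm == "right" else (0.0, -1.0, 0.0)
    y = normalize(cross(z, cross(ref, z)))  # ref projected onto plane perpendicular to z
    x = cross(y, z)                         # right-hand rule: x = y x z
    # columns of the matrix are x, y, z
    return [[x[i], y[i], z[i]] for i in range(3)]
```

The columns (x, y, z) form a right-handed orthonormal frame, since x = y × z implies x × y = z.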
Further, after the controlling the selected mechanical arm to point at the target object according to the trajectory, the method further includes:
and testing the obtained pose of the end of the mechanical arm with a measuring device to determine the error of the pointing motion of the end of the selected mechanical arm toward the target object.
Further, the testing the obtained pose of the end of the mechanical arm with the measuring device to determine the error of the pointing motion includes:
determining a vector of the target object pointing to the end of the mechanical arm of the selected mechanical arm;
and determining an included angle between the vector and the Z-direction component of the attitude matrix of the tail end of the mechanical arm as the error.
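This error metric, the angle between the measured target-to-end vector and the Z-axis column of the commanded end attitude matrix, can be sketched in Python (function and variable names are illustrative):

```python
import math

def pointing_error(target, end_measured, z_axis):
    """Angle (radians) between the measured target->end vector and the
    z-axis column of the commanded end attitude matrix."""
    v = tuple(e - t for e, t in zip(end_measured, target))
    nv = math.sqrt(sum(c * c for c in v))
    nz = math.sqrt(sum(c * c for c in z_axis))
    dot = sum(a * b for a, b in zip(v, z_axis))
    c = max(-1.0, min(1.0, dot / (nv * nz)))  # clamp against rounding
    return math.acos(c)
```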
Further, the determining, according to the obtained pose of the end of the mechanical arm, a trajectory from the current pose of each joint of the selected mechanical arm to the target pose of each joint includes:
solving inverse kinematics on the obtained pose of the end of the mechanical arm to obtain the target pose of each joint of the mechanical arm;
and obtaining the trajectory from the current pose to the target pose of each joint of the selected mechanical arm through trajectory interpolation.
Further, the obtaining the target postures of the joints of the mechanical arm by solving the inverse kinematics according to the obtained pose of the end of the mechanical arm includes:
solving inverse kinematics according to the obtained pose of the tail end of the mechanical arm to obtain a target configuration of the selected mechanical arm, wherein the target configuration comprises target angles of all joints;
and performing track interpolation according to the current configuration of the robot and the target configuration, and controlling a mechanical arm to move from the current configuration of the robot to the target configuration of the robot pointing to the target object, wherein the current configuration of the robot comprises the current angle of each joint.
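The inverse-kinematics solve depends on the specific arm's geometry and is not detailed in the patent. Assuming the target configuration (joint angles) has already been obtained from it, the trajectory-interpolation step can be sketched as a simple linear joint-space interpolation:

```python
def interpolate_trajectory(current, target, steps=50):
    """Linear joint-space interpolation from the current configuration
    (list of joint angles) to the target configuration; returns a list
    of intermediate configurations including both endpoints."""
    path = []
    for i in range(steps + 1):
        s = i / steps  # interpolation parameter in [0, 1]
        path.append([c + s * (t - c) for c, t in zip(current, target)])
    return path
```

In practice a smoother profile (e.g. cubic or trapezoidal-velocity interpolation) would typically replace the linear ramp; the patent does not specify which.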
The application provides a directional action controlling means of robot includes:
the robot arm selection module is used for selecting the robot arm which is closer to the target object in the two robot arms of the robot according to the target object coordinate of the target object in the robot coordinate system under the condition that the target object exceeds the reachable range of the tail end of the robot arm of the robot;
the pose determining module of the tail end of the mechanical arm is used for obtaining the pose of the mechanical arm of the selected mechanical arm pointing to the target object within the reachable range of the tail end of the mechanical arm according to the mapping of the target object coordinate, the shoulder joint coordinate of the shoulder joint of the selected mechanical arm in the robot coordinate system and the preset rule of the pointing action;
the track determining module is used for determining tracks from the current pose of each joint of the selected mechanical arm to the target pose of each joint of the selected mechanical arm according to the obtained pose of the tail end of the mechanical arm;
and the action control module is used for controlling the selected mechanical arm to point to the target object according to the track.
The application provides an electronic equipment, includes:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as described in any one of the above.
The present application provides a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the method as described in any one of the above.
In some embodiments, when the target object is beyond the reach of the end of the robot's mechanical arm, the robot pointing motion control method of the present application obtains, from a mapping of the target object coordinates, the shoulder joint coordinates of the selected mechanical arm's shoulder joint in the robot coordinate system, and a predetermined rule of the pointing motion, an arm-end pose that points at the target object within the reachable range of the end of the selected mechanical arm, and thereby completes the pointing motion of the selected arm toward the target object. The scenario of pointing at a target object can thus be realized, and the applicability of the robot is improved.
Drawings
Fig. 1 is a schematic flowchart illustrating a robot pointing motion control method according to an embodiment of the present disclosure;
fig. 2 is a detailed flowchart of the robot pointing motion control method shown in fig. 1;
fig. 3 is a simplified schematic diagram illustrating a robot coordinate system and a target object in the robot pointing motion control method shown in fig. 1;
FIG. 4a is a schematic diagram of a current configuration of a robot of the robot pointing motion control method of FIG. 1;
FIG. 4b is a schematic diagram of a target configuration of the robot pointing motion control method shown in FIG. 1;
fig. 5 is a schematic flow chart illustrating the measurement motion accuracy of the robot pointing motion control method according to the embodiment of the present application;
fig. 6 is a block diagram of a robot pointing motion control apparatus according to an embodiment of the present invention;
fig. 7 is a block diagram illustrating an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with one or more embodiments of the present specification. Rather, they are merely examples of apparatus and methods consistent with certain aspects of one or more embodiments of the specification, as detailed in the claims which follow.
It should be noted that: in other embodiments, the steps of the corresponding methods are not necessarily performed in the order shown and described herein. In some other embodiments, the method may include more or fewer steps than those described herein. Moreover, a single step described in this specification may be broken down into multiple steps for description in other embodiments; multiple steps described in this specification may be combined into a single step in other embodiments.
In order to solve the technical problem that an application scene of a mechanical arm is relatively limited, the embodiment of the application provides a robot pointing motion control method, wherein under the condition that a target object exceeds an reachable range of the tail end of the mechanical arm of a robot, the mechanical arm closer to the target object in two mechanical arms of the robot is selected according to the target object coordinate of the target object in a robot coordinate system; according to the mapping of the target object coordinate, the shoulder joint coordinate of the shoulder joint of the selected mechanical arm in the robot coordinate system and a preset rule of pointing motion, acquiring the pose pointing to the target object in the reachable range of the tail end of the mechanical arm of the selected mechanical arm; determining the track from the current pose of each joint to the target pose of each joint of the selected mechanical arm according to the obtained pose of the tail end of the mechanical arm; and controlling the selected mechanical arm to point to the target object according to the track.
In the embodiment of the application, in the case that the target object exceeds the reachable range of the end of the mechanical arm of the robot, according to the mapping of the coordinates of the target object, the shoulder joint coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system and the preset rule of the pointing motion, the pose pointing to the target object within the reachable range of the end of the mechanical arm of the selected mechanical arm is obtained, so as to complete the pointing motion of the selected mechanical arm to the target object. Therefore, the application of pointing to the target object scene can be realized, and the applicability of the robot is improved.
The robot pointing motion control method is applied in scenarios where the robot points at a target object. Such scenarios may include, but are not limited to, a robot giving an explanation or a lecture in an exhibition hall. The presented content may include, but is not limited to, video, lecture, and instructional material; the target object may include, but is not limited to, specific items of that presented content.
The following describes a specific implementation process of the robot pointing motion control method in detail.
Fig. 1 is a schematic flowchart illustrating a robot pointing motion control method according to an embodiment of the present application.
As shown in fig. 1, the robot pointing motion control method includes the following steps 110 to 140:
and 110, under the condition that the target object exceeds the reachable range of the tail end of the mechanical arm of the robot, selecting the mechanical arm which is closer to the target object from the two mechanical arms of the robot according to the target object coordinate of the target object in the robot coordinate system. As such, the target object exceeding the reach of the end of the mechanical arm of the robot indicates that the target object is outside the reach of the end of the mechanical arm of the robot. And the target object coordinates of the target object in the robot coordinate system enable the target object and the two mechanical arms of the robot to be in the same robot coordinate system, and the selected mechanical arm can be conveniently and quickly obtained.
The step 110 can realize that the mechanical arm closer to the target object is used to complete the pointing action, improve the pointing convenience, better meet the user requirements and be more beneficial to operation and realization.
There are various ways to implement step 110. In one implementation, the first step determines, from the target object coordinates in the robot coordinate system, which of two pre-divided areas (one corresponding to each arm) the target object lies in; the second step selects the arm corresponding to that area as the arm closer to the target object.
In another implementation, the first step determines, for each arm, the projected distance on the horizontal coordinate plane of the robot coordinate system between the target object and that arm's shoulder joint; the second step selects the arm with the shorter projected distance as the arm closer to the target object.
In another specific implementation of step 110, the shoulder joint coordinates in the robot coordinate system can be obtained from the dimensions of the robot's joints, including the height h and lateral offset w of each shoulder joint: for example, the left shoulder joint coordinate r p_ls = (0, w, h) and the right shoulder joint coordinate r p_rs = (0, -w, h). The Euclidean distances from the target object to the left and right shoulder joints in the robot coordinate system are then calculated, and the arm with the smaller distance is selected to execute the pointing motion.
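The Euclidean-distance selection above can be sketched in Python, under the stated assumption that the shoulders sit at (0, ±w, h) in the robot coordinate system (the default values of w and h are illustrative, not from the patent):

```python
import math

def select_arm(target, w=0.2, h=1.2):
    """Pick the arm whose shoulder joint is closer (Euclidean distance)
    to the target object, all coordinates in the robot frame.
    Left shoulder at (0, +w, h), right shoulder at (0, -w, h)."""
    left_shoulder = (0.0, w, h)
    right_shoulder = (0.0, -w, h)
    d_left = math.dist(target, left_shoulder)
    d_right = math.dist(target, right_shoulder)
    return "left" if d_left <= d_right else "right"
```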
And step 120, acquiring the pose of the target object pointed within the reach range of the tail end of the mechanical arm of the selected mechanical arm according to the mapping of the target object coordinate, the shoulder joint coordinate of the shoulder joint of the selected mechanical arm in the robot coordinate system and the preset rule of the pointing action.
The predetermined rule of the pointing action serves as a constraint condition of the pointing action, and reliable guarantee can be provided for the execution of the subsequent specified action.
The step 120 may obtain a position point pointing to the target object within the reachable range of the end of the selected robot arm through mapping conversion of the target object coordinates. This may enable subsequent control of the selected mechanical arm to point at the target object.
The pose of the end of the selected mechanical arm pointing at the target object within the reachable range of the arm end comprises the position and the orientation of the arm end in the robot coordinate system.
There are various embodiments of determining the position of the end of the arm of the selected arm in the robot coordinate system in the above-described step 120. In some embodiments, a position point at which the distance from the shoulder of the selected robot arm is smaller than the arm length of the selected robot arm may be determined as a position to which the target object coordinates are mapped within the reach of the robot arm end of the selected robot arm in the direction in which the robot arm end of the selected robot arm points to the target object.
In other embodiments, the arm length of the selected robot arm may be multiplied by a reduction coefficient in a direction in which the robot arm end of the selected robot arm points to the target object, to obtain a reduced position point within an reachable range, which is mapped to a position within the reachable range of the robot arm end of the selected robot arm as the target object coordinate. The reduction coefficient is larger than 0 and smaller than 1, so that the tail end of the mechanical arm is located at a position within an accessible range. For a detailed description, see below.
In still other embodiments, the arm length of the selected robot arm may be reduced by a predetermined length in a direction in which the robot arm end of the selected robot arm points to the target object, resulting in a reduced position point within the reach as the target object coordinate mapping to a position within the reach of the robot arm end of the selected robot arm. Wherein the predetermined length is greater than 0 and less than the arm length. In order to ensure that the selected mechanical arm is stretched as much as possible, the pointing range of the tail end of the mechanical arm is enlarged, and the tail end of the mechanical arm is ensured to be within the reachable range, and the preset length is smaller than the arm length as much as possible. For a detailed description, see below.
Of course, other positions that can determine the end of the selected robot arm in the robot coordinate system are all within the scope of the embodiments of the present application, and are not exemplified herein.
And step 130, determining the track from the current pose of each joint of the selected mechanical arm to the target pose of each joint of the selected mechanical arm according to the obtained pose of the tail end of the mechanical arm. The trajectory of step 130 above may be used as a basis for the control of the selected robotic arm to point at the target object.
And step 140, controlling the selected mechanical arm to point to the target object according to the track so as to complete the pointing action of the tail end of the mechanical arm of the selected mechanical arm to point to the target object.
Fig. 2 is a schematic flowchart illustrating a specific flow of the robot pointing motion control method shown in fig. 1.
As shown in fig. 2, before step 110, the method may further include, but is not limited to, the following steps 101 to 104 to determine that the target object is beyond the reach of the end of the mechanical arm of the robot:
step 101, establishing a world coordinate system of the robot, a coordinate system of the robot and a local coordinate system of each joint of the two mechanical arms.
The robot coordinate system refers to a base coordinate system of the robot.
Step 102: obtain the angle of the target object relative to the front of the robot, and the angle through which the robot needs to rotate, from the included angle between the X axis of the coordinate system and the vector pointing from the coordinate-system origin to the target object coordinates.
Specifically, the step 102 can be implemented by, but not limited to, the following 3 steps:
(1) Calculate the vector from the robot coordinates w p_robot in the world coordinate system to the target object coordinates w p_target:

v = w p_target - w p_robot

and its included angle α with the X axis of the robot world coordinate system:

α = atan2(v_y, v_x)

where v_y is the Y-direction component of v and v_x is the X-direction component of v.
(2) Calculate the relation between the included angle α and the facing angle θ of the robot front in the world coordinate system, giving the angle β through which the robot chassis needs to rotate:

β = α - θ, wrapped into [-π, π]

The difference between α and the facing angle θ may fall outside [-π, π]; wrapping it into that interval makes β the minimum rotation angle of the chassis.

(3) Obtain the facing angle γ in the world coordinate system after the robot rotates:

γ = θ + β
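Steps (1) to (3) can be sketched compactly in Python, assuming planar (X, Y) world coordinates and using atan2 for the heading (function and variable names are illustrative):

```python
import math

def chassis_rotation(p_robot, p_target, theta):
    """alpha: heading of the robot->target vector in the world frame;
    beta: minimal chassis rotation, wrapped into [-pi, pi];
    gamma: facing angle after the rotation."""
    alpha = math.atan2(p_target[1] - p_robot[1], p_target[0] - p_robot[0])
    beta = (alpha - theta + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    gamma = theta + beta
    return alpha, beta, gamma
```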
Step 103: according to the robot coordinates w p_robot and the facing angle γ in the world coordinate system, convert the target object coordinates w p_target in the world coordinate system into coordinates in the robot coordinate system:

r p_target = R_robot⁻¹ (w p_target - w p_robot)

where R_robot is the attitude matrix of the robot in the world coordinate system (a rotation about the Z axis by the facing angle γ), R_robot⁻¹ is its inverse, r p_target denotes the coordinates of the target object in the robot coordinate system, the subscript "target" denotes the target object, and "robot" denotes the robot.
Step 104: determine that the target object is beyond the reach of the end of the robot's mechanical arm. In one embodiment of step 104, if the target object lies outside the maximum reachable range of the arm end, it is determined to be beyond reach. In another embodiment, if the distance between the target object and each shoulder joint of the robot is greater than the corresponding arm length, the target object is determined to be beyond the reach of the arm end.
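Steps 103 and 104 can be sketched together in Python, under the assumption (consistent with the chassis-rotation step) that the robot's attitude in the world frame is a pure yaw rotation γ about the Z axis, so the inverse of the attitude matrix equals its transpose (names are illustrative):

```python
import math

def world_to_robot(p_target_w, p_robot_w, gamma):
    """Express the target coordinates in the robot frame: translate by the
    robot position, then rotate by the inverse of the robot's yaw gamma."""
    dx = p_target_w[0] - p_robot_w[0]
    dy = p_target_w[1] - p_robot_w[1]
    c, s = math.cos(gamma), math.sin(gamma)
    # R_robot^{-1} = R_robot^T for a pure yaw rotation about Z
    return (c * dx + s * dy, -s * dx + c * dy, p_target_w[2] - p_robot_w[2])

def out_of_reach(p_target_r, p_shoulder_r, arm_length):
    """Beyond-reach test: shoulder-to-target distance exceeds arm length."""
    return math.dist(p_target_r, p_shoulder_r) > arm_length
```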
Fig. 3 is a simplified schematic diagram illustrating a robot coordinate system and a target object in the robot pointing motion control method shown in fig. 1.
As shown in fig. 3, the XY plane coordinate system is a coordinate system in which the robot base coordinate system is projected on the XY plane. Wherein the coordinate system of the XY plane includes an origin O of the XY plane coordinate system corresponding to the head position 21 representing the robot, a left shoulder joint 22 of the left arm and a right shoulder joint 23 of the right arm, a target object 24, and a position point 25 within the reach of the arm end of the selected arm, and the like. The determination of the three-dimensional coordinates of the position point 25 in the robot coordinate system, i.e. the X-coordinate, the Y-coordinate and the Z-coordinate, is explained in detail below.
Continuing with fig. 3, step 120 may include, but is not limited to, the following steps 121 through 123. And step 121, mapping the target object coordinates to coordinates within the reach range of the tail end of the mechanical arm of the selected mechanical arm.
In some embodiments of step 121, in step 1, an X coordinate and a Y coordinate of the end of the selected robot arm are obtained according to an angle between a vector of the shoulder joint of the selected robot arm pointing to the target object in the robot coordinate system and an X axis of the robot coordinate system and an arm length of the selected robot arm.
For example, step 121 obtains the X and Y coordinates of the arm end r p_e according to the included angle η between the X axis of the robot coordinate system and the vector pointing from the shoulder joint r p_s of the selected mechanical arm to the target object r p_target, together with the arm length L of the selected mechanical arm:

η = atan2(r p_target(y) - r p_s(y), r p_target(x) - r p_s(x))
r p_e(x) = r p_s(x) + K · L · cos η
r p_e(y) = r p_s(y) + K · L · sin η

where K (0 < K < 1) is set such that the distance between the position point 25 and the shoulder joint of the selected mechanical arm is smaller than the arm length L.
Step 2: obtain the Z coordinate of the end of the selected mechanical arm according to the Z-direction coordinate of the target object in the robot coordinate system and its relation to the shoulder joint coordinate and the arm length of the selected mechanical arm:

^r p_e(z) = ^r p_s(z) − δ,           if ^r p_target(z) < ^r p_s(z)                        (1)
^r p_e(z) = ^r p_target(z) − δ,      if ^r p_s(z) ≤ ^r p_target(z) < ^r p_s(z) + L        (2)
^r p_e(z) = ^r p_s(z) + L − δ,       if ^r p_target(z) ≥ ^r p_s(z) + L                    (3)

where δ is a number greater than zero and less than L, and may be chosen by traversing values within this range.

Equation (1) covers the case where the target object lies lower than the shoulder joint in the Z direction: the end of the selected mechanical arm is then also placed below the shoulder joint, at ^r p_s(z) minus δ, which is consistent with the arm end simulating pointing at a target below the shoulder. Equation (2) covers the case where the target object lies higher than the shoulder joint but lower than the sum of the shoulder joint height and the arm length, i.e. above the shoulder and within reach of the arm: subtracting δ from ^r p_target(z) lets the arm end point at the target from a point slightly below it, which better matches a simulated pointing gesture of the arm end. Equation (3) covers the case where the target object lies higher than the sum of the shoulder joint height and the arm length: subtracting δ from ^r p_s(z) + L again makes the arm point at the target from a point below it. In this way, the end of the selected mechanical arm stays within its reachable range while pointing at the target object as far out as possible within its maximum reach.
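The mapping of steps 1 and 2 — the XY projection along the shoulder-to-target direction combined with the piecewise Z rule of equations (1)-(3) — can be sketched as follows (the names and the sample values of K and δ are illustrative assumptions):

```python
import numpy as np

def map_target(target, shoulder, L, K=0.8, delta=0.05):
    """Map an out-of-reach target to a point within the arm's reach.
    K in (0, 1) scales the planar offset; delta in (0, L) is the Z offset;
    both are tunable, illustrative defaults."""
    v = np.asarray(target, float) - np.asarray(shoulder, float)
    eta = np.arctan2(v[1], v[0])            # angle with the X axis in the XY plane
    x = shoulder[0] + K * L * np.cos(eta)   # X coordinate within reach
    y = shoulder[1] + K * L * np.sin(eta)   # Y coordinate within reach
    if target[2] < shoulder[2]:             # target below the shoulder: eq. (1)
        z = shoulder[2] - delta
    elif target[2] < shoulder[2] + L:       # between shoulder and shoulder + L: eq. (2)
        z = target[2] - delta
    else:                                   # above shoulder + L: eq. (3)
        z = shoulder[2] + L - delta
    return np.array([x, y, z])
```

For a target straight ahead at 2 m and a 0.6 m arm, the mapped point lies on the shoulder-target line at 0.48 m, inside the arm's reach.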
Step 122: determine the designated direction of the end of the selected mechanical arm according to the predetermined rule of the pointing motion.
Step 123: generate the posture of the end of the selected mechanical arm according to the mapped coordinates, the shoulder joint coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and the designated direction.
In some embodiments of the above step 123, in step 1, the unit vector of the vector pointing from the target object to the end of the mechanical arm is determined as the 3 Z-direction components forming the Z-direction vector of the attitude matrix of the end of the selected mechanical arm; this attitude matrix is a matrix of 3 rows and 3 columns. Illustratively, the attitude matrix R of the end of the selected mechanical arm is represented as follows:

R = [ a  o  n ] =
    | ax  ox  nx |
    | ay  oy  ny |
    | az  oz  nz |

wherein a = (ax, ay, az)^T is the X-direction vector, whose 3 X-direction components ax, ay, az form the first column; o = (ox, oy, oz)^T is the Y-direction vector, whose 3 Y-direction components ox, oy, oz form the second column; and n = (nx, ny, nz)^T is the Z-direction vector, whose 3 Z-direction components nx, ny, nz form the third column.

Step 1 can be performed by taking the vector pointing from the target object ^r p_target to the end of the selected mechanical arm ^r p_e and normalizing it to give the third column n of the attitude matrix of the arm end, namely:

n = (^r p_e − ^r p_target) / ‖ ^r p_e − ^r p_target ‖.
Step 2: in the case where the end of the selected mechanical arm points at the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected mechanical arm is the right mechanical arm, determine the 3 Y-direction components of the attitude matrix forming the Y-direction vector to be positive.
Step 3: in the case where the end of the selected mechanical arm points at the target object, the 3 Z-direction components of the attitude matrix are positive, and the selected mechanical arm is the left mechanical arm, determine the 3 Y-direction components of the attitude matrix forming the Y-direction vector to be negative.
Fig. 4a is a schematic diagram illustrating a current configuration of a robot of the robot pointing motion control method shown in fig. 1. Fig. 4b is a schematic diagram illustrating a target configuration of the robot pointing motion control method illustrated in fig. 1.
As shown in figs. 4a and 4b, O_r is the center position of the base of the robot base coordinate system. The dual-arm robot includes a left mechanical arm (not shown) and a right mechanical arm 30. Taking the right mechanical arm 30 as the selected arm, it may include, but is not limited to, a right shoulder joint 31, a right upper arm 32, a right forearm 33, and a right palm 34. The left mechanical arm has the same structure and is not described again here.
With continuing reference to fig. 4a and 4b, the above steps 2 and 3 can be implemented by the following steps:
According to the rule that the palm tilts upward when the mechanical arm points at the target object, the third column n of the end attitude matrix has positive Z-direction components when the arm points at the target. When the right mechanical arm points at the target object, the Y-direction components of the end posture, i.e. the second column o = (ox, oy, oz)^T, are positive, indicating that the palm faces upward; when the left mechanical arm points at the target object, the Y-direction components are negative. Taking the right arm shown in figs. 4a and 4b as an example, the second column must satisfy:

ox·nx + oy·ny + oz·nz = 0    (orthogonality of o and n)
ox² + oy² + oz² = 1          (unit length of o)

The value of oz affects how far the palm tilts upward and can be selected according to the pointing effect; this embodiment selects oz = 0.4, and the value may also float around this according to the actual solution result. Setting oz = 0.4 and substituting oy from the orthogonality condition into the unit-length condition yields a quadratic in ox, so that:

ox = [ −oz·nz·nx ± ny·√((nx² + ny²)(1 − oz²) − oz²·nz²) ] / (nx² + ny²)
oy = −(oz·nz + ox·nx) / ny

with the sign branch chosen so that the Y-direction components are positive for the right arm.
the left arm is the same, and will not be described again.
Step 4: from the 3 Z-direction components and the 3 Y-direction components of the attitude matrix of the end of the selected mechanical arm, determine the 3 X-direction components of the attitude matrix forming the X-direction vector according to the right-hand rule. Compared with solving a nonlinear equation system, this is more convenient, does not depend on an existing library (such as MATLAB's nonlinear equation solver), and improves the calculation speed.

Step 4 can be implemented as follows: according to the right-hand rule, the first column a of the end attitude matrix is the cross product of the second column o and the third column n:

a = o × n, i.e. ax = oy·nz − oz·ny, ay = oz·nx − ox·nz, az = ox·ny − oy·nx.
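Steps 1 to 4 above, taken together, can be sketched as follows: the third column n is the normalized vector from the target to the arm end, the second column o is obtained by fixing oz and solving the orthogonality and unit-length constraints, and the first column is the cross product o × n. The right-versus-left branch choice and all names are our assumptions, not the patented implementation:

```python
import numpy as np

def pointing_pose(target, end_pos, oz=0.4, right_arm=True):
    """Build the 3x3 end-of-arm attitude matrix R = [a | o | n].
    n: unit vector from the target to the arm end (third column);
    o: palm direction with fixed oz, orthogonal to n and of unit length (second column);
    a: o x n by the right-hand rule (first column).
    Assumes ny != 0 and a non-negative discriminant (i.e. oz is feasible)."""
    n = np.asarray(end_pos, float) - np.asarray(target, float)
    n /= np.linalg.norm(n)
    nx, ny, nz = n
    # Solve ox*nx + oy*ny + oz*nz = 0 and ox^2 + oy^2 + oz^2 = 1 for (ox, oy):
    # substituting oy from the orthogonality condition gives a quadratic in ox.
    A = nx**2 + ny**2
    disc = A * (1.0 - oz**2) - (oz * nz)**2
    root = np.sqrt(max(disc, 0.0))
    sign = 1.0 if right_arm else -1.0         # branch choice for right/left arm (assumption)
    ox = (-oz * nz * nx + sign * ny * root) / A
    oy = -(oz * nz + ox * nx) / ny
    o = np.array([ox, oy, oz])
    a = np.cross(o, n)                         # right-hand rule: X column = Y column x Z column
    return np.column_stack([a, o, n])
```

The result is an orthonormal rotation matrix whose third column points from the target toward the arm end and whose second column has the fixed oz component.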
the step 130 may further include, but is not limited to, the following 2 steps. And 1, solving inverse kinematics according to the obtained pose of the tail end of the mechanical arm to obtain the target pose of each joint of the mechanical arm. And 2, obtaining the track from the current pose to the target pose of each joint of the tail end of the mechanical arm of the selected mechanical arm through track interpolation.
Wherein, step 1 may further include, but is not limited to:
solving inverse kinematics according to the obtained pose of the end of the mechanical arm to obtain the target configuration of the selected mechanical arm, the target configuration including the target angle of each joint;
and performing trajectory interpolation according to the current configuration and the target configuration of the robot, and controlling the mechanical arm to move from the current configuration of the robot to the target configuration in which the robot points at the target object, the current configuration of the robot including the current angle of each joint.
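The trajectory interpolation of step 2 can be sketched as simple linear interpolation in joint space between the current and target joint angles; a real controller would typically add velocity and acceleration profiles (names are illustrative):

```python
import numpy as np

def interpolate_joints(q_current, q_target, steps=50):
    """Joint-space trajectory from the current configuration to the target
    configuration, as plain linear interpolation over `steps` waypoints."""
    q_current = np.asarray(q_current, float)
    q_target = np.asarray(q_target, float)
    s = np.linspace(0.0, 1.0, steps)[:, None]    # interpolation parameter in [0, 1]
    return (1.0 - s) * q_current + s * q_target  # shape (steps, n_joints)
```

Each row of the returned array is one joint configuration along the motion from the current to the target pose.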
Fig. 5 is a schematic flow chart illustrating measurement of the accuracy of the pointing motion in the robot pointing motion control method according to the embodiment of the present application.
As shown in figs. 1 to 5, after step 140 the method may further include, but is not limited to: step 150, testing the obtained pose of the end of the mechanical arm using a measuring device, and determining the error of the pointing motion with which the end of the selected mechanical arm points at the target object. In this way, after the mechanical arm is controlled to point at the target, the deviation of the arm from the target is determined, so that the accuracy of the pointing motion can be measured, facilitating subsequent adjustment and control optimization.
In some embodiments, testing the obtained pose of the end of the mechanical arm using the measuring device to determine the error of the pointing motion comprises: first, determining the vector pointing from the target object to the end of the selected mechanical arm. Specifically, the pose of the arm end is tested using the measuring device, and the vector from the target object to the arm end is calculated from the position of the target object and the measured position of the arm end. Secondly, the included angle between this vector and the Z-direction component of the attitude matrix of the end of the selected mechanical arm is determined as the error.
Further, the above step 150 may further include, but is not limited to, the following 1) to 3):
1) Test the pose of the end of the mechanical arm using a laser tracker, with the coordinate system used for the measurement coincident with the robot coordinate system; the test result is the actual pose of the arm end in the robot coordinate system: (^r p_e′, ^r R_e′).
2) From the test result of step 1) and the coordinates ^r p_target of the target object in the robot coordinate system, calculate the vector pointing from the target object to the end of the mechanical arm:

v = ^r p_e′ − ^r p_target.

3) The included angle between the vector n′ given by the third column of the measured attitude matrix ^r R_e′ and the vector v of step 2) is the error δ of the pointing motion:

δ = arccos( (v · n′) / (‖v‖ · ‖n′‖) ).
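The error computation of steps 1) to 3) can be sketched as follows (illustrative names; the measured pose arguments stand in for the laser-tracker result):

```python
import numpy as np

def pointing_error(target, end_pos_measured, R_measured):
    """Angle (radians) between the measured pointing axis (third column of the
    measured attitude matrix) and the vector from the target object to the
    measured arm-end position."""
    v = np.asarray(end_pos_measured, float) - np.asarray(target, float)
    v /= np.linalg.norm(v)
    n = np.asarray(R_measured, float)[:, 2]          # measured Z-direction column
    cosang = np.clip(np.dot(v, n) / np.linalg.norm(n), -1.0, 1.0)
    return np.arccos(cosang)
```

An error of zero means the measured arm end points exactly along the line from the target; larger angles quantify the pointing deviation for subsequent adjustment.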
fig. 6 is a block diagram schematically illustrating a robot pointing motion control apparatus according to an embodiment of the present invention.
As shown in fig. 6, the robot pointing motion control device includes the following modules:
a mechanical arm selection module 41, configured to select, when the target object is beyond the reachable range of the end of the mechanical arm of the robot, the one of the robot's two mechanical arms that is closer to the target object, according to the target object coordinates of the target object in the robot coordinate system;
a pose determining module 42 for the end of the mechanical arm, configured to obtain the pose pointing at the target object within the reachable range of the arm end, according to the mapping of the target object coordinates, the shoulder joint coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system, and a predetermined rule of the pointing motion;
a trajectory determination module 43, configured to determine, according to the obtained pose of the end of the mechanical arm, the trajectory of each joint of the selected mechanical arm from its current pose to its target pose;
and a motion control module 44, configured to control the selected mechanical arm to point at the target object according to the trajectory.
In some embodiments, the pose determination module of the robot arm tip comprises:
the coordinate mapping submodule is used for mapping the target object coordinate to a coordinate within the reach range of the tail end of the mechanical arm selected;
the appointed direction determining submodule is used for determining the appointed direction of the tail end of the mechanical arm of the selected mechanical arm according to the preset rule of the pointing action;
and the gesture generating submodule of the tail end of the mechanical arm is used for generating the gesture of the tail end of the mechanical arm of the selected mechanical arm according to the mapped coordinates, the shoulder joint coordinates of the shoulder joint of the selected mechanical arm in the robot coordinate system and the specified direction.
In some embodiments, the coordinate mapping sub-module is specifically configured to: acquiring an X coordinate and a Y coordinate of the tail end of the mechanical arm of the selected mechanical arm according to an included angle between a vector of the shoulder joint of the selected mechanical arm pointing to the target object and an X axis of the robot coordinate system and the arm length of the selected mechanical arm in the robot coordinate system; acquiring a Z coordinate of the tail end of the mechanical arm of the selected mechanical arm according to the Z-direction coordinate of the target object in a robot coordinate system and the relation between the coordinate of the shoulder joint of the selected mechanical arm and the arm length;
and/or,
the gesture generation submodule at the tail end of the mechanical arm is specifically used for:
determining the unit vector of the target object pointing to the tail end of the mechanical arm as 3Z-direction components of the attitude matrix of the tail end of the mechanical arm of the selected mechanical arm for forming the Z-direction vector, wherein the attitude matrix of the tail end of the mechanical arm of the selected mechanical arm is a matrix with 3 rows and 3 columns;
determining that 3Y-direction components of the attitude matrix, which are used for forming a Y-direction vector, are positive when the tail end of the mechanical arm of the selected mechanical arm points to a target object, and the 3Z-direction components of the attitude matrix are positive, and the selected mechanical arm is a right mechanical arm;
determining that 3Y-direction components of the attitude matrix for forming a Y-direction vector are negative when the tail end of the mechanical arm of the selected mechanical arm points to a target object, the 3Z-direction components of the attitude matrix are positive, and the selected mechanical arm is a left mechanical arm;
and determining the 3Z-direction components and the 3Y-direction components of the attitude matrix at the tail end of the mechanical arm of the selected mechanical arm according to a right hand rule, and determining the 3X-direction components of the attitude matrix for forming an X-direction vector.
In some embodiments, after the controlling the pointing motion of the selected mechanical arm to point at the target object according to the trajectory, the apparatus further includes: and the pointing motion error determination module is used for testing the obtained pose of the tail end of the mechanical arm by using the measuring equipment to determine the pointing motion error of the tail end of the mechanical arm pointing to the target object.
In some embodiments, the error determination module of the pointing motion is specifically configured to: determining a vector of the target object pointing to the end of the mechanical arm of the selected mechanical arm; and determining an included angle between the vector and the Z-direction component of the attitude matrix of the tail end of the mechanical arm selected as the error.
In some embodiments, the trajectory determination module comprises:
the target posture determining submodule is used for solving inverse kinematics according to the obtained pose of the tail end of the mechanical arm to obtain the target posture of each joint of the mechanical arm;
and the track determining submodule is used for obtaining the track from the current pose to the target pose of each joint of the tail end of the mechanical arm of the selected mechanical arm through track interpolation.
In some embodiments, the pose determination module of the end of the robotic arm is specifically configured to: solving inverse kinematics according to the obtained pose of the tail end of the mechanical arm to obtain a target configuration of the selected mechanical arm, wherein the target configuration comprises target angles of all joints; and performing track interpolation according to the current configuration of the robot and the target configuration, and controlling a mechanical arm to move from the current configuration of the robot to the target configuration of the robot pointing to the target object, wherein the current configuration of the robot comprises the current angle of each joint.
The implementation process of the functions and actions of each module in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
Fig. 7 is a block diagram illustrating an electronic device 50 according to an embodiment of the present disclosure.
As shown in fig. 7, the electronic device 50 comprises one or more processors 51 for implementing the robot pointing motion control method as described above.
In some embodiments, the electronic device 50 may include a memory 59, and the memory 59 may store programs that may be called by the processor 51, and may include a non-volatile storage medium. In some embodiments, the electronic device 50 may include a memory 58 and an interface 57. In some embodiments, the electronic device 50 may also include other hardware depending on the actual application.
The memory 59 of the embodiment of the present application stores thereon a program for implementing the robot pointing motion control method as described above when the program is executed by the processor 51.
This application may take the form of a computer program product that is embodied on one or more memories 59 (including but not limited to disk memory, CD-ROM, optical storage, etc.) having program code embodied therein. Memory 59 includes permanent and non-permanent, removable and non-removable media and may implement any method or technology for storing information. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of memory 59 include, but are not limited to: phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
The embodiments further provide a functional service robot for performing the pointing motion to indicate a target object, the robot comprising a memory and a processor, wherein the memory stores a computer program, and the processor calls the computer program in the memory to implement the method described above.
The above description is only a preferred embodiment of the present disclosure, and should not be taken as limiting the present disclosure, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the statement that "comprises a … …" defines an element does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element.

Claims (10)

1. A robot pointing motion control method is characterized by comprising the following steps:
under the condition that a target object exceeds the reachable range of the tail end of a mechanical arm of the robot, selecting the mechanical arm which is closer to the target object in the two mechanical arms of the robot according to the target object coordinate of the target object in a robot coordinate system;
according to the mapping of the target object coordinate, the shoulder joint coordinate of the shoulder joint of the selected mechanical arm in the robot coordinate system and a preset rule of pointing motion, acquiring the pose of the mechanical arm of the selected mechanical arm pointing to the target object within the reachable range of the tail end of the mechanical arm;
determining the track from the current pose of each joint of the selected mechanical arm to the target pose of each joint of the selected mechanical arm according to the obtained pose of the tail end of the mechanical arm;
and controlling the selected mechanical arm to point to the target object according to the track.
2. A robot pointing motion control method according to claim 1, wherein the obtaining of the pose of the selected robot arm pointing to the target object within the reach of the robot arm tip from the map of the target object coordinates, the shoulder joint coordinates of the shoulder joint of the selected robot arm in the robot coordinate system, and the predetermined rule of pointing motion, comprises:
mapping the target object coordinates to coordinates within a reach of the tail end of the selected mechanical arm;
determining the designated direction of the tail end of the mechanical arm of the selected mechanical arm according to the preset rule of the pointing action;
generating a pose of the robot arm tip of the selected robot arm based on the mapped coordinates, the shoulder joint coordinates of the shoulder joint of the selected robot arm in the robot coordinate system, and the specified direction.
3. A robot pointing motion control method according to claim 2, wherein the mapping the target object coordinates to coordinates within reach of the end of the arm of the selected arm comprises: acquiring an X coordinate and a Y coordinate of the tail end of the mechanical arm of the selected mechanical arm according to an included angle between a vector of the shoulder joint of the selected mechanical arm pointing to the target object and an X axis of the robot coordinate system and the arm length of the selected mechanical arm in the robot coordinate system; acquiring a Z coordinate of the tail end of the mechanical arm of the selected mechanical arm according to the Z-direction coordinate of the target object in a robot coordinate system and the relation between the coordinate of the shoulder joint of the selected mechanical arm and the arm length;
and/or,
generating a pose of the robot arm tip of the selected robot arm from the mapped coordinates, shoulder joint coordinates of the shoulder joint of the selected robot arm in the robot coordinate system, and the specified direction, comprising:
determining the unit vector of the target object pointing to the tail end of the mechanical arm as 3Z-direction components of the attitude matrix of the tail end of the mechanical arm of the selected mechanical arm for forming the Z-direction vector, wherein the attitude matrix of the tail end of the mechanical arm of the selected mechanical arm is a matrix with 3 rows and 3 columns;
determining that 3Y-direction components of the attitude matrix, which are used for forming a Y-direction vector, are positive when the tail end of the mechanical arm of the selected mechanical arm points to a target object, and the 3Z-direction components of the attitude matrix are positive, and the selected mechanical arm is a right mechanical arm;
determining that 3Y-direction components of the attitude matrix for forming a Y-direction vector are negative when the tail end of the mechanical arm of the selected mechanical arm points to a target object, the 3Z-direction components of the attitude matrix are positive, and the selected mechanical arm is a left mechanical arm;
and determining the 3Z-direction components and the 3Y-direction components of the attitude matrix at the tail end of the mechanical arm of the selected mechanical arm according to a right hand rule, and determining the 3X-direction components of the attitude matrix for forming an X-direction vector.
4. A robot pointing motion control method according to claim 1, wherein after said controlling the selected robot arm to point to the target object in accordance with the trajectory, the method further comprises:
and testing the obtained pose of the tail end of the mechanical arm by using the measuring equipment, and determining the error of the pointing action of the tail end of the mechanical arm of the selected mechanical arm to the target object.
5. A robot pointing motion control method according to claim 4, wherein the testing the obtained pose of the tip of the robot arm using the measuring device to determine the error of the pointing motion of the tip of the robot arm of the selected robot arm to the target object, comprises:
determining a vector of the target object pointing to the end of the mechanical arm of the selected mechanical arm;
and determining an included angle between the vector and the Z-direction component of the attitude matrix of the tail end of the mechanical arm of the selected mechanical arm as the error.
6. The robot pointing motion control method according to claim 1, wherein the determining the trajectory of the robot arm end of the selected robot arm along the current pose of each joint to the target pose of each joint in accordance with the obtained pose of the robot arm end comprises:
solving inverse kinematics according to the obtained pose of the tail end of the mechanical arm to obtain the target pose of each joint of the mechanical arm;
and obtaining the track from the current pose to the target pose of each joint of the tail end of the mechanical arm of the selected mechanical arm through track interpolation.
7. The robot pointing motion control method according to claim 6, wherein the obtaining the target pose of each joint of the robot arm by solving the inverse kinematics according to the obtained pose of the end of the robot arm comprises:
solving inverse kinematics according to the obtained pose of the tail end of the mechanical arm to obtain a target configuration of the selected mechanical arm, wherein the target configuration comprises target angles of all joints;
and performing track interpolation according to the current configuration of the robot and the target configuration, and controlling a mechanical arm to move from the current configuration of the robot to the target configuration of the robot pointing to the target object, wherein the current configuration of the robot comprises the current angle of each joint.
8. A robot pointing motion control device, comprising:
the robot arm selection module is used for selecting a robot arm which is closer to a target object in two robot arms of the robot according to the target object coordinate of the target object in a robot coordinate system under the condition that the target object exceeds the reachable range of the tail end of the robot arm of the robot;
the pose determining module of the tail end of the mechanical arm is used for obtaining the pose of the target object pointed to by the tail end of the mechanical arm in the reachable range of the tail end of the mechanical arm according to the mapping of the target object coordinate, the shoulder joint coordinate of the shoulder joint of the selected mechanical arm in the robot coordinate system and the preset rule of pointing actions;
the track determining module is used for determining tracks from the current pose of each joint of the selected mechanical arm to the target pose of each joint of the selected mechanical arm according to the obtained pose of the tail end of the mechanical arm;
and the action control module is used for controlling the selected mechanical arm to point to the target object according to the track.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the method of any one of claims 1-7.
CN202310009464.8A 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium Active CN115922728B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310009464.8A CN115922728B (en) 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium
PCT/CN2023/118887 WO2024037658A1 (en) 2023-01-03 2023-09-14 Method and apparatus for controlling pointing action of robot, and electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310009464.8A CN115922728B (en) 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN115922728A true CN115922728A (en) 2023-04-07
CN115922728B CN115922728B (en) 2023-06-30

Family

ID=86654337

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310009464.8A Active CN115922728B (en) 2023-01-03 2023-01-03 Robot pointing motion control method, apparatus, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN115922728B (en)
WO (1) WO2024037658A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116141341A (en) * 2023-04-21 2023-05-23 之江实验室 Method for realizing pointing action of five-degree-of-freedom mechanical arm meeting Cartesian space constraint
WO2024037658A1 (en) * 2023-01-03 2024-02-22 之江实验室 Method and apparatus for controlling pointing action of robot, and electronic device and storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150142796A (en) * 2014-06-11 2015-12-23 현대자동차주식회사 Method and system for controlling elbow of robot
CN105415372A (en) * 2015-12-09 2016-03-23 常州汉迪机器人科技有限公司 Multi-joint robot track planning method under constraint of safety space
WO2018053430A1 (en) * 2016-09-16 2018-03-22 Carbon Robotics, Inc. System and calibration, registration, and training methods
CN109048890A (en) * 2018-07-13 2018-12-21 哈尔滨工业大学(深圳) Coordination method for controlling trajectory, system, equipment and storage medium based on robot
KR20190048589A (en) * 2017-10-31 2019-05-09 충남대학교산학협력단 Apparatus and method for dual-arm robot teaching based on virtual reality
US20190143448A1 (en) * 2017-11-13 2019-05-16 Kabushiki Kaisha Yaskawa Denki Laser machining method, controller, and robot system
WO2019119724A1 (en) * 2017-12-21 2019-06-27 东南大学 Force sense information and posture information based limb motion intention understanding and upper limb rehabilitation training robot control method
CN112775931A (en) * 2019-11-05 2021-05-11 深圳市优必选科技股份有限公司 Mechanical arm control method and device, computer readable storage medium and robot
WO2021089550A1 (en) * 2019-11-06 2021-05-14 Koninklijke Philips N.V. Robotic positioning of a device
CN112828885A (en) * 2020-12-30 2021-05-25 诺创智能医疗科技(杭州)有限公司 Hybrid master-slave mapping method, mechanical arm system and computer equipment
CN113814988A (en) * 2021-11-24 2021-12-21 之江实验室 7-degree-of-freedom SRS type mechanical arm inverse solution analysis method and device and electronic equipment
CN114310915A (en) * 2022-02-16 2022-04-12 哈尔滨工业大学 Space manipulator butt joint end tool trajectory planning method based on visual feedback
CN114952868A (en) * 2022-07-26 2022-08-30 之江实验室 7-degree-of-freedom SRS (spherical-revolute-spherical) type mechanical arm control method and device and piano playing robot

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011224737A (en) * 2010-04-21 2011-11-10 Toyota Motor Corp Guide robot, guide method, and program for controlling guide
KR101789756B1 (en) * 2010-12-29 2017-11-20 삼성전자주식회사 Robot and method for controlling the same
CN107972026B (en) * 2016-10-25 2021-05-04 河北亿超机械制造股份有限公司 Robot, mechanical arm and control method and device thereof
JP2020175466A (en) * 2019-04-17 2020-10-29 アズビル株式会社 Teaching device and teaching method
CN113601504A (en) * 2021-08-04 2021-11-05 之江实验室 Robot limb action control method and device, electronic device and storage medium
CN115922728B (en) * 2023-01-03 2023-06-30 之江实验室 Robot pointing motion control method, apparatus, electronic device, and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Bo, Li Chengqun: "Design, motion analysis and simulation of a medicine-dispensing manipulator", Machine Tool & Hydraulics, vol. 47, no. 17, pages 72 - 75 *
Deng Shun; Zhou Kangqu: "Research and simulation design of a DSP-based welding robot control algorithm", Journal of Chongqing Technology and Business University (Natural Science Edition), no. 01, pages 87 - 93 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024037658A1 (en) * 2023-01-03 2024-02-22 之江实验室 Method and apparatus for controlling pointing action of robot, and electronic device and storage medium
CN116141341A (en) * 2023-04-21 2023-05-23 之江实验室 Method for realizing pointing action of five-degree-of-freedom mechanical arm meeting Cartesian space constraint
CN116141341B (en) * 2023-04-21 2023-08-08 之江实验室 Method for realizing pointing action of five-degree-of-freedom mechanical arm meeting Cartesian space constraint

Also Published As

Publication number Publication date
WO2024037658A1 (en) 2024-02-22
CN115922728B (en) 2023-06-30

Similar Documents

Publication Publication Date Title
CN115922728B (en) Robot pointing motion control method, apparatus, electronic device, and storage medium
US20110093119A1 (en) Teaching and playback method based on control of redundancy resolution for robot and computer-readable medium controlling the same
US6798415B2 (en) Rendering collisions of three-dimensional models
CN105004353B (en) A kind of star sensor dynamic star chart emulation mode
CN110561421B (en) Mechanical arm indirect dragging demonstration method and device
CN110657799B (en) Space real-time positioning method, computer device and computer readable storage medium
Stilman et al. Augmented reality for robot development and experimentation
JP2017073753A (en) Correction method, program, and electronic apparatus
CN113696188B (en) Hand-eye calibration data acquisition method and device, electronic equipment and storage medium
Manou et al. Off-line programming of an industrial robot in a virtual reality environment
JP2015058492A (en) Control device, robot system, robot, robot operation information generation method, and program
CN111216136A (en) Multi-degree-of-freedom mechanical arm control system, method, storage medium and computer
US9652879B2 (en) Animation of a virtual object
Zarubin et al. Caging complex objects with geodesic balls
CN113409444A (en) Three-dimensional reconstruction method and device, electronic equipment and storage medium
JP2019197333A (en) Path correction method and control device of multiple spindle processing machine
CN111127661B (en) Data processing method and device and electronic equipment
Jiang et al. Error analysis and experiments of 3D reconstruction using a RGB-D sensor
Astad et al. Vive for robotics: Rapid robot cell calibration
CN115164823B (en) Method and device for acquiring gyroscope information of camera
CN113450903B (en) Human body action mapping method and device, computer equipment and storage medium
CN108733211A (en) Tracing system, its operating method, controller and computer-readable recording medium
Henriksson et al. Maximizing the use of computational resources in multi-camera feedback control
CN113705378A (en) Sample data generation method and device and electronic equipment
JP2022183723A (en) Processor, servo system, processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant