CN111300484A - Method for determining joint positioning error of robot, robot and storage medium - Google Patents

Method for determining joint positioning error of robot, robot and storage medium

Info

Publication number
CN111300484A
CN111300484A (application number CN202010177273.9A; granted as CN111300484B)
Authority
CN
China
Prior art keywords
positioning
robot
joint
determining
pose
Prior art date
Legal status
Granted
Application number
CN202010177273.9A
Other languages
Chinese (zh)
Other versions
CN111300484B (en)
Inventor
杨文超 (Yang Wenchao)
李业 (Li Ye)
Current Assignee
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Chengdu Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Cloudminds Chengdu Technologies Co., Ltd.
Priority to CN202010177273.9A
Publication of CN111300484A
Application granted
Publication of CN111300484B
Status: Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators


Abstract

The embodiments of the present invention relate to the technical field of robots, and disclose a method for determining the joint positioning error of a robot, a robot, and a storage medium. In the present invention, the method for determining the joint positioning error of a robot is applied to a robot that includes joints and an end effector, and comprises: controlling each joint of the robot to move according to a target pose of the end effector, so as to drive the end effector to move; after each joint of the robot finishes moving, acquiring image data of a first positioning tag, the first positioning tag being arranged on the surface of the end effector or on the surface of a joint; determining the actual pose of the end effector according to the image data of the first positioning tag; and determining the joint positioning error of the robot according to the actual pose and the target pose. With these embodiments, the joint positioning error can be determined accurately, and the cost of determining it is reduced.

Description

Method for determining joint positioning error of robot, robot and storage medium
Technical Field
The embodiments of the present invention relate to the technical field of robots, and in particular to a method for determining the joint positioning error of a robot, a robot, and a storage medium.
Background
With the development of robot technology, people increasingly expect robots to have a human-like appearance and to complete everyday operations such as fetching coffee or pressing elevator buttons, just as a person would. With the continuous introduction of humanoid robots, robots look more and more like humans, but they still struggle to perform delicate operations the way humans do, such as picking up an apple from a cluttered desktop or holding a cup to receive coffee: the errors of the mechanical joint movements accumulated while the robot executes an operation command cause the actual position reached by the robot arm to deviate greatly from the target position.
The inventors found that the related art has at least the following problem: at present, the error between the actual pose and the target pose of the robot is obtained through additional equipment; however, the error changes continuously as the mechanical structure moves and therefore needs to be updated regularly, which is costly and inconvenient to operate.
Disclosure of Invention
An object of embodiments of the present invention is to provide a method for determining the joint positioning error of a robot, a robot, and a storage medium, which can accurately determine the joint positioning error and reduce the cost of determining it.
To solve the above technical problem, an embodiment of the present invention provides a method for determining the joint positioning error of a robot, the method being applied to a robot that includes joints and an end effector, the method including: controlling each joint of the robot to move according to a target pose of the end effector, so as to drive the end effector to move; after each joint of the robot finishes moving, acquiring image data of a first positioning tag, the first positioning tag being arranged on the surface of the end effector or on the surface of a joint; determining the actual pose of the end effector according to the image data of the first positioning tag; and determining the joint positioning error of the robot according to the actual pose and the target pose.
Embodiments of the present invention also provide a robot, including: joints and an end effector, a first positioning tag being provided on the surface of a joint or on the end effector; at least one processor, connected to the driver of each joint so as to control the motion of each joint; and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the above method of determining the joint positioning error of a robot.
Embodiments of the present invention also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above-described method of determining joint positioning errors of a robot.
Compared with the prior art, in the embodiments of the present invention a first positioning tag is provided on the surface of a joint of the robot or on the surface of the end effector; the robot acquires image data of the first positioning tag, determines the actual pose of the end effector from that image data, and determines the joint positioning error from the actual pose and the target pose. Since the tag is carried on the robot itself and imaged by the robot, the joint positioning error can be determined accurately without additional equipment, reducing the cost of determining it.
Additionally, determining the actual pose of the end effector from the image data of the first positioning tag comprises: determining a first pose of the first positioning tag according to the image data of the first positioning tag; and determining the actual pose of the end effector according to the first pose and a prestored positional relationship between the first positioning tag and the end effector. The first pose of the first positioning tag is obtained directly from the image data of the tag, which is a simple way of positioning; from the first pose and the positional relationship between the first positioning tag and the end effector, the actual pose of the end effector can be determined quickly and accurately, in a simple and low-cost manner.
In addition, the first positioning tag comprises a plurality of pieces of positioning feature information; determining the first pose of the first positioning tag according to the image data of the first positioning tag comprises: identifying the positioning feature information in the image data, and determining two-dimensional coordinate information of the positioning feature information in the image data; constructing three-dimensional coordinate information of the positioning feature information according to the size information of the first positioning tag and the two-dimensional coordinate information; and determining the first pose of the first positioning tag according to the three-dimensional coordinate information, the two-dimensional coordinate information, and the projection relationship between the three-dimensional and two-dimensional coordinate systems. Because the first positioning tag contains a plurality of pieces of positioning feature information, it can be identified conveniently and accurately, and by determining the positions of the positioning feature information, the position and attitude of the first positioning tag can be determined quickly and accurately.
In addition, a first positioning tag is arranged on the surface of the joint, and a second positioning tag is arranged on the surface of the end effector; before controlling the movement of each joint of the robot according to the acquired target pose of the end effector, the method further comprises: acquiring a calibration image, the calibration image containing the first positioning tag and the second positioning tag; determining, from the calibration image, a first calibration pose of the first positioning tag and a second calibration pose of the second positioning tag; and calibrating the positional relationship between the first positioning tag and the second positioning tag according to the first calibration pose and the second calibration pose, taking the calibrated positional relationship as the positional relationship between the first positioning tag and the end effector. Determining the positional relationship between the first positioning tag and the end effector before controlling the robot to move ensures the accuracy of that relationship, which further improves the accuracy of the subsequently determined joint positioning error.
In addition, after determining the joint positioning error of the robot according to the actual pose and the target pose, the method further comprises: adjusting the target pose according to the joint positioning error; and controlling the movement of each joint of the robot again according to the re-determined target pose. Adjusting the target pose according to the joint positioning error improves the accuracy of the robot motion.
In addition, there may be a plurality of first positioning tags, arranged around the joint. Providing a plurality of first positioning tags around the joint avoids the problem of the first positioning tag being occluded while the robot moves.
In addition, the first positioning tag includes a two-dimensional code and/or a barcode.
In addition, the joint provided with the first positioning tag is the end joint. The end joint and the end effector belong to the same rigid body, with no other joint between them, so errors from other joints are excluded, further improving the accuracy of determining the joint positioning error.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings; in the figures, like reference numerals refer to similar elements, and the figures are not to scale unless otherwise specified.
Fig. 1 is a detailed flowchart of a method for determining joint positioning errors of a robot according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of an embodiment of determining an actual pose of an end effector according to the first embodiment of the present invention;
FIG. 3 is a detailed flowchart of a method for determining joint positioning error of a robot according to a second embodiment of the present invention;
FIG. 4 is a schematic illustration of a first positioning tag arrangement provided in accordance with a second embodiment of the present invention;
FIG. 5 is a top view of a ring formed by a plurality of first positioning tags provided in accordance with a second embodiment of the present invention;
FIG. 6 is a schematic illustration of a first positioning tag arrangement provided in accordance with a second embodiment of the invention;
fig. 7 is a schematic view of a connection structure of a robot according to a third embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the embodiments to give the reader a better understanding of the present application; however, the technical solutions claimed in the present application can be implemented without these technical details, and with various changes and modifications based on the following embodiments.
The inventors found that, during fine operations of a robot, because each joint moves with some error, the actual pose of the robot's end effector differs greatly from the target pose once the movements of several joints are superposed. At present, a high-precision joint code disc is usually installed to reduce the error of each joint; however, the error between the target pose and the actual pose still grows as the number of robot movements increases, and a high-precision code disc is very expensive, which makes the robot costly. Another approach is to determine, in advance and by means of additional equipment, the joint positioning error between the actual pose of the end effector and the target pose, and to compensate the target pose by this error when the target pose is determined. But because the joint positioning error is determined in advance by additional equipment, it drifts over time, so the end effector still fails to reach the target pose, which affects the fine operation of the robot.
A first embodiment of the present invention relates to a method of determining the joint positioning error of a robot. The method is applied to a robot that includes joints and an end effector. The specific flow is shown in fig. 1:
step 101: and controlling each joint of the robot to move according to the target pose of the end effector so as to drive the end effector to move.
Specifically, the robot comprises a plurality of joints and an end effector arranged on the end joint; the end joint may be, for example, a wrist joint or an ankle joint of the robot, and the end effector may be a palm, a sole, or the like. The target pose may include a target position and attitude of the end effector, and can be determined from the information of the target position and the specified operation. For example, if the robot is required to grab a cola bottle on a table, the target pose can be determined from the position information of the bottle and the grasping attitude.
According to the target pose, the motion parameters of each joint of the robot can be determined, so that each joint can be controlled to move according to the determined motion parameters. For example, denote the target pose by RT_goal: the position to which the robot hand needs to move is calculated from the target pose, the linkage parameters of each joint are planned according to that position, and the rotation of each joint is driven by controlling the code-disc (encoder) parameters of the joint, thereby driving the palm connected to the end joint to move.
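For illustration only, the following Python sketch shows one way step 101 could be wired up; the patent provides no code, and solve_ik, set_encoder_target and wait_until_settled are hypothetical names standing in for the joint planning and code-disc control described above.

```python
import numpy as np

# Hypothetical sketch of step 101: plan joint motion from the target pose
# RT_goal (a 4x4 matrix) and command each joint's code disc (encoder).
# `solve_ik` and the driver interface are assumptions, not APIs from the patent.
def move_to_target(rt_goal: np.ndarray, solve_ik, joint_drivers) -> None:
    joint_angles = solve_ik(rt_goal)          # plan linkage parameters per joint
    for driver, angle in zip(joint_drivers, joint_angles):
        driver.set_encoder_target(angle)      # command the joint's code disc
    for driver in joint_drivers:
        driver.wait_until_settled()           # step 102 runs after motion ends
```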
Step 102: after each joint of the robot finishes moving, image data of a first positioning tag is obtained, the first positioning tag being arranged on the surface of the end effector or the surface of a joint.
Specifically, the first positioning tag may be an image containing positioning feature information, for example, a two-dimensional code, a barcode, or the like.
In this embodiment, the first positioning tag is described taking a two-dimensional code as an example. The two-dimensional code has four vertices, each carrying identification information used for recognition, and the identification information of the four vertices can serve as the positioning feature information of the first positioning tag.
The first positioning tag may be provided on the surface of the end effector or on the surface of a joint of the robot, for example on the palm or the back of the robot hand, or on the instep. The first positioning tag may be printed directly on the surface of the end effector or of the joint, or an image printed with the first positioning tag may be affixed to that surface. The first positioning tag may also be embedded in the end effector or the joint and covered with a layer of transparent material, so that the image data of the tag can still be acquired while the transparent layer protects it from damage. The joint provided with the first positioning tag may be the end joint or another joint.
The image data of the first positioning tag can be acquired by an acquisition device on the robot; the acquisition device may be a camera, and may be arranged where the eyes of the robot are located or at another position.
Step 103: determining the actual pose of the end effector according to the image data of the first positioning tag.
In one example, the process of determining the actual pose of the end effector may employ sub-steps as shown in fig. 2.
Substep S11: determining a first pose of the first positioning tag according to the image data of the first positioning tag.
Specifically, the first positioning tag includes a plurality of pieces of positioning feature information so that it can be accurately identified. If the first positioning tag is a two-dimensional code, the information contained in its four vertices is the positioning feature information.
In one example, the process of determining the first pose of the first positioning tag may be: identifying each piece of positioning feature information in the image data and determining its two-dimensional coordinate information in the image data; constructing three-dimensional coordinate information of each piece of positioning feature information according to the size information of the first positioning tag and the two-dimensional coordinate information; and determining the first pose of the first positioning tag according to the three-dimensional coordinate information, the two-dimensional coordinate information, and the projection relationship between the three-dimensional and two-dimensional coordinate systems.
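As an illustrative sketch of this sub-step (not part of the patented method itself), the corner extraction could look as follows in Python, assuming an AprilTag-family two-dimensional code and the legacy aruco module of opencv-contrib-python (versions before 4.7):

```python
import cv2
import numpy as np

def detect_tag_corners(image_bgr: np.ndarray):
    """Return the 2D pixel coordinates P2D[4] of the first detected tag."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)  # legacy aruco API
    if ids is None:
        return None                      # no positioning tag visible in this frame
    return corners[0].reshape(4, 2)      # the four vertices, in image order
```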
The following describes the process of determining the first pose, taking as an example a first positioning tag that is a two-dimensional code arranged at the palm center of the robot:
the method comprises the steps of identifying the positioning characteristic information of the two-dimensional code, calculating the positions of four vertexes of the two-dimensional code, rapidly identifying the positioning characteristic information by adopting a visual reference library AprilTag, calculating the position information of the positioning characteristic information relative to image data, obtaining coordinates of the four vertexes of the two-dimensional code due to the fact that the two-dimensional code has the four vertexes, and recording the coordinates of the four vertexes as: p2D[4](ii) a The two-dimensional code is arranged in the palm of the robot, the shape of the shot two-dimensional code is unchanged, and the side length of the two-dimensional code is L; according to the size information of the two-dimensional code, determining the 3D coordinate P of four vertexes of the two-dimensional code3D[4]The 3D coordinates of the two-dimensional code are as shown in equations (1) to (4):
P3D[0]point3d (-0.5L, 0) formula (1);
P3D[1]point3d (0.5L, -0.5L, 0) formula (2);
P3D[2]point3d (0.5L, 0) formula (3);
P3D[4]point3d (-0.5L, 0) formula (4);
after the 3D coordinates of the four vertices of the two-dimensional code are determined, the first pose RT is solved simultaneously according to a projection relationship, which can be shown as formula (5):
Figure BDA0002411237470000061
wherein, in the formula (5), ZcIs a constant number fx、fy、cxAnd cyThe RT is a transformation matrix of the two-dimensional code under a camera coordinate system and is also a first pose of the two-dimensional code, and is an internal parameter of a collecting device (such as a camera).
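Since formula (5) is the standard pinhole projection, the first pose can in practice be recovered with a PnP solver. A minimal sketch using OpenCV's cv2.solvePnP is given below; the patent does not prescribe a particular solver, so this choice is an assumption.

```python
import cv2
import numpy as np

def tag_pose_flat(p2d: np.ndarray, L: float, fx: float, fy: float,
                  cx: float, cy: float) -> np.ndarray:
    """First pose RT (4x4) of a flat tag of side L, per formulas (1)-(5)."""
    p3d = np.array([[-0.5 * L, -0.5 * L, 0.0],   # formula (1)
                    [ 0.5 * L, -0.5 * L, 0.0],   # formula (2)
                    [ 0.5 * L,  0.5 * L, 0.0],   # formula (3)
                    [-0.5 * L,  0.5 * L, 0.0]])  # formula (4)
    K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=float)
    ok, rvec, tvec = cv2.solvePnP(p3d, p2d.astype(float), K, None)  # formula (5)
    R, _ = cv2.Rodrigues(rvec)           # rotation vector -> rotation matrix
    rt = np.eye(4)
    rt[:3, :3], rt[:3, 3] = R, tvec.ravel()
    return rt
```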
Substep S12: determining the actual pose of the end effector according to the first pose and the prestored positional relationship between the first positioning tag and the end effector.
Specifically, when the first positioning tag is disposed on the end effector, the positional relationship between the first positioning tag and the end effector may be set as: the pose of the first positioning tag is equal to the pose of the end effector. On this basis, according to the first pose and the prestored positional relationship, the actual pose of the end effector is determined to be the first pose.
Step 104: determining the joint positioning error of the robot according to the actual pose and the target pose.
Specifically, the deviation between the actual pose and the target pose is calculated, and the deviation is taken as the joint positioning error. It should be noted that, when calculating the joint positioning error, the actual pose and the target pose need to be expressed in a unified coordinate system. Since the target pose is based on the coordinate system of the robot, the actual pose can be converted from the coordinate system of the acquisition device to the coordinate system of the robot; and since the position of the acquisition device on the robot is fixed, the transformation taking the actual pose into the robot coordinate system is also fixed. For example, if H is the transformation between the camera coordinate system and the robot coordinate system, the converted actual pose is expressed as RT_real = H · RT, where RT is the actual pose before the coordinate conversion.
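Under the notation above, step 104 reduces to two matrix operations. The sketch below expresses the deviation as a relative transform, which is one reasonable reading of "deviation"; the patent does not fix a particular error representation.

```python
import numpy as np

def joint_positioning_error(rt_cam: np.ndarray, H: np.ndarray,
                            rt_goal: np.ndarray) -> np.ndarray:
    """Deviation of the actual pose from the target pose, in the robot frame."""
    rt_real = H @ rt_cam                       # RT_real = H * RT (robot frame)
    return np.linalg.inv(rt_goal) @ rt_real    # identity when the error is zero
```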
It is worth mentioning that a plurality of first positioning tags may be arranged, so as to avoid the tag being occluded while the robot moves.
Compared with the prior art, in this embodiment a first positioning tag is provided on the surface of a joint of the robot or on the surface of the end effector; the robot acquires image data of the first positioning tag, determines the actual pose of the end effector from that image data, and determines the joint positioning error from the actual pose and the target pose, so that the joint positioning error can be determined accurately and at low cost.
A second embodiment of the present invention relates to a method of determining the joint positioning error of a robot. The second embodiment is substantially the same as the first, the main difference being that in the second embodiment, after step 104 (determining the joint positioning error of the robot according to the actual pose and the target pose), the method further comprises: adjusting the target pose according to the joint positioning error; and controlling the movement of each joint of the robot again according to the re-determined target pose. The specific flow of the method is shown in fig. 3.
Step 201: controlling each joint of the robot to move according to the target pose of the end effector, so as to drive the end effector to move.
Step 202: after each joint of the robot finishes moving, image data of a first positioning tag is obtained, the first positioning tag being arranged on the surface of the end effector or the surface of a joint.
Step 203: determining the actual pose of the end effector according to the image data of the first positioning tag.
In one example, the first positioning tag is disposed on the surface of a joint, which may be any joint. There may be a plurality of first positioning tags disposed around the joint, arranged in the manner shown in fig. 4, where A1 to A3 are first positioning tags and B is the joint.
It is worth mentioning that arranging the first positioning tag on the joint effectively avoids occlusion. For example, while the robot hand moves a held object to a designated position, the hand is holding the object, so a first positioning tag arranged on the palm would be occluded.
The following describes the process of determining the first pose, taking as an example first positioning tags that are two-dimensional codes arranged around the end joint of the robot:
the method comprises the steps of identifying the positioning characteristic information of the two-dimensional code, calculating the positions of four vertexes of the two-dimensional code, rapidly identifying the positioning characteristic information by adopting a visual reference library AprilTag, calculating the position information of the positioning characteristic information relative to image data, obtaining coordinates of the four vertexes of the two-dimensional code due to the fact that the two-dimensional code has the four vertexes, and recording the coordinates of the four vertexes as: p2D[4](ii) a The two-dimensional code is arranged on a joint at the tail end of the robot, the two-dimensional code is arranged in a mode shown in fig. 4, the shot two-dimensional code is in a circular arc shape, namely a plane square when the two-dimensional code in the real world is no longer in time, as shown in fig. 5, fig. 5 is an annular top view formed by a plurality of two-dimensional codes, the side length of a single two-dimensional code is L, and the annular radius formed by the two-dimensional codes is R; the transverse length of the two-dimensional code is the chord length instead of the arc length, the chord length xL is calculated, as shown in formula (6),
Figure BDA0002411237470000071
determining the 3D of four vertexes of the two-dimensional code according to the determined chord length xL and the side length L of the two-dimensional codeCoordinate P3D[4]The 3D coordinates of the two-dimensional code are as shown in equations (7) to (10):
P3D[0]point3d (-0.5 × xL, -0.5 × L,0) formula (7);
P3D[1]point3d (0.5 × xL, -0.5 × L,0) formula (8);
P3D[2]point3d (0.5 × xL,0.5 × L,0) formula (9);
P3D[4]point3d (-0.5 × xL,0.5 × L,0) formula (10);
after the 3D coordinates of the four vertices of the two-dimensional code are determined, the first pose RT is solved simultaneously according to a projection relationship, which may be shown as formula (11):
Figure BDA0002411237470000081
wherein, in formula (11), ZcIs a constant number fx、fy、cxAnd cyThe RT is a transformation matrix of the two-dimensional code under a camera coordinate system and is also a first pose of the two-dimensional code, which is an internal parameter of a collecting device (such as a camera).
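Assuming the chord relation of formula (6), the only change from the flat-tag case is the 3D model of the vertices, as the following sketch shows; the resulting object points can be fed to the same PnP solve sketched after formula (5).

```python
import numpy as np

def curved_tag_object_points(L: float, R: float) -> np.ndarray:
    """3D vertex coordinates of a tag of side L wrapped on a ring of radius R."""
    xL = 2.0 * R * np.sin(L / (2.0 * R))          # formula (6): chord of arc L
    return np.array([[-0.5 * xL, -0.5 * L, 0.0],  # formula (7)
                     [ 0.5 * xL, -0.5 * L, 0.0],  # formula (8)
                     [ 0.5 * xL,  0.5 * L, 0.0],  # formula (9)
                     [-0.5 * xL,  0.5 * L, 0.0]]) # formula (10)
```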
The actual pose of the end effector is then determined according to the first pose and the predetermined positional relationship between the first positioning tag and the end effector.
In one example, before controlling the movement of each joint of the robot according to the acquired target pose of the end effector, the positional relationship between the first positioning tag and the end effector may be determined in advance. The process may be: acquiring a calibration image, the calibration image containing the first positioning tag and the second positioning tag; determining, from the calibration image, a first calibration pose of the first positioning tag and a second calibration pose of the second positioning tag; and calibrating the positional relationship between the first positioning tag and the second positioning tag according to the first calibration pose and the second calibration pose, taking the calibrated positional relationship as the positional relationship between the first positioning tag and the end effector.
Specifically, as shown in fig. 6, the second positioning tag h is arranged at the palm center of the robot, and the first positioning tags 0, 1, 2, …, N are arranged at the end joint (such as the wrist). The acquisition device acquires a calibration image containing two-dimensional code 1 and two-dimensional code 2 at the same time, the poses of the two codes are calculated from the calibration image, and the pose conversion relationship from two-dimensional code 2 to two-dimensional code 1 is:

RT_2→1 = (RT_2)^-1 · RT_1

Similarly, the pose conversion relationship from two-dimensional code 1 to two-dimensional code 0 can be obtained:

RT_1→0 = (RT_1)^-1 · RT_0

as well as the pose conversion relationship from two-dimensional code 0 to two-dimensional code h:

RT_0→h = (RT_0)^-1 · RT_h

Therefore, if two-dimensional code 2 is photographed from the side, the actual pose RT_h' of the palm center can be calculated as:

RT_h' = RT_2 · RT_2→1 · RT_1→0 · RT_0→h
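A sketch of this chaining in Python is given below; the variable names are illustrative, and the pairwise relative transforms follow the reconstruction of the formulas above.

```python
import numpy as np

def relative_pose(rt_a: np.ndarray, rt_b: np.ndarray) -> np.ndarray:
    """Pose of tag b expressed in the frame of tag a (both given in camera frame)."""
    return np.linalg.inv(rt_a) @ rt_b

# Offline calibration: each adjacent pair of tags observed in one calibration image.
#   rt_2to1 = relative_pose(rt_2, rt_1)
#   rt_1to0 = relative_pose(rt_1, rt_0)
#   rt_0toh = relative_pose(rt_0, rt_h)
# Online, with only wrist tag 2 visible:
#   rt_h_actual = rt_2 @ rt_2to1 @ rt_1to0 @ rt_0toh
```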
step 204: and determining joint positioning errors of the robot according to the actual pose and the target pose.
This step is substantially the same as step 104 in the first embodiment, and will not be described herein.
Step 205: adjusting the target pose according to the joint positioning error.
Specifically, the joint positioning error may be returned to the control end of the robot to compensate the target pose determined in step 201, obtaining an adjusted target pose; steps 201 to 204 may then be executed again according to the adjusted target pose, until the determined joint positioning error is within a preset range.
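One simple way to realize this loop is a pre-multiplicative compensation of the target pose, sketched below; the patent only requires that the error be fed back until it falls within the preset range, so the particular update rule and threshold are assumptions.

```python
import numpy as np

def servo_to_pose(rt_goal: np.ndarray, move_to_target, measure_actual_pose,
                  tol: float = 1e-3, max_iters: int = 10) -> None:
    """Repeat steps 201-205 until the joint positioning error is small enough."""
    target = rt_goal.copy()
    for _ in range(max_iters):
        move_to_target(target)                    # steps 201-202
        rt_real = measure_actual_pose()           # steps 202-204, robot frame
        delta = rt_real @ np.linalg.inv(rt_goal)  # deviation from the goal
        # crude combined rotation/translation threshold on the 4x4 deviation
        if np.linalg.norm(delta - np.eye(4)) < tol:
            return                                # within the preset range
        target = np.linalg.inv(delta) @ target    # step 205: adjust the target
```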
In the method for determining the joint positioning error of a robot provided in this embodiment, the target pose is adjusted according to the joint positioning error, which improves the accuracy of the robot motion.
The steps of the above methods are divided only for clarity of description; in implementation they may be combined into one step, or a step may be split into multiple steps, and as long as the same logical relationship is included, such variants are within the protection scope of this patent. Adding insignificant modifications to the algorithm or process, or introducing insignificant design changes without changing the core design of the algorithm or process, is also within the protection scope of this patent.
A third embodiment of the present invention relates to a robot. As shown in fig. 7, the robot 30 comprises: a joint 301 and an end effector 302, a first positioning tag being provided on the surface of the joint 301 or on the end effector 302; at least one processor 303, connected to the driver of each joint so as to control the motion of each joint; and a memory 304 communicatively coupled to the at least one processor 303, wherein the memory 304 stores instructions executable by the at least one processor 303, the instructions being executed by the at least one processor 303 to enable the at least one processor 303 to perform the method of determining the joint positioning error of a robot of the first or second embodiment.
The memory 304 and the processor 303 are connected by a bus, which may comprise any number of interconnected buses and bridges linking the processor 303, the memory 304 and various other circuits such as peripherals, voltage regulators and power management circuits, all of which are well known in the art and therefore not described further herein. A bus interface provides an interface between the bus and a transceiver. The transceiver may be one element or a plurality of elements, such as a plurality of receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. Data processed by the processor 303 is transmitted over a wireless medium via an antenna, which further receives data and passes it to the processor 303.
The processor 303 is responsible for managing the bus and general processing, and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management and other control functions, while the memory 304 may be used to store data used by the processor 303 in performing operations.
A fourth embodiment of the present invention relates to a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method of determining joint positioning errors of a robot according to the first or second embodiment.
Those skilled in the art will understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing related hardware; the program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples of carrying out the invention, and that in practice various changes in form and detail may be made without departing from the spirit and scope of the invention.

Claims (10)

1. A method of determining joint positioning error of a robot, for application to a robot, the robot including joints and an end effector, the method comprising:
controlling each joint of the robot to move according to the target pose of the end effector so as to drive the end effector to move;
after each joint of the robot finishes moving, acquiring image data of a first positioning tag, wherein the first positioning tag is arranged on the surface of the end effector or the surface of the joint;
determining the actual pose of the end effector according to the image data of the first positioning tag;
and determining joint positioning errors of the robot according to the actual pose and the target pose.
2. The method of determining joint positioning errors of a robot of claim 1, wherein determining the actual pose of the end effector from the image data of the first positioning tag comprises:
determining a first pose of the first positioning tag according to the image data of the first positioning tag;
and determining the actual pose of the end effector according to the first pose and the prestored position relation between the first positioning tag and the end effector.
3. The method of determining joint positioning error of a robot of claim 2, wherein the first positioning tag includes a plurality of positioning feature information;
the determining a first pose of the first positioning tag according to the image data of the first positioning tag includes:
identifying the positioning feature information in the image data, and determining two-dimensional coordinate information of the positioning feature information in the image data;
according to the size information of the first positioning tag and the two-dimensional coordinate information, three-dimensional coordinate information of the positioning feature information is constructed;
and determining a first pose of the first positioning tag according to the three-dimensional coordinate information, the two-dimensional coordinate information and the projection relation between the three-dimensional coordinate system and the two-dimensional coordinate system.
4. A method of determining joint positioning error of a robot according to claim 2 or 3, wherein the first positioning tag is provided on a surface of the joint and a second positioning tag is provided on a surface of the end effector;
before the controlling of the movements of the joints of the robot according to the acquired target poses of the end effector of the robot, the method further includes:
acquiring a calibration image, wherein the calibration image comprises the first positioning tag and the second positioning tag;
respectively determining a first calibration pose of the first positioning tag and a second calibration pose of the second positioning tag according to the calibration image;
and calibrating the position relation between the first positioning tag and the second positioning tag according to the first calibration pose and the second calibration pose, and taking the calibrated position relation as the position relation between the first positioning tag and the end effector.
5. The method of determining joint positioning errors of a robot according to any of claims 1-4, characterized in that after said determining joint positioning errors of the robot from the actual pose and the target pose, the method further comprises:
adjusting the target pose according to the joint positioning error; and controlling the movement of each joint of the robot again according to the re-determined target pose.
6. The method of determining joint positioning error of a robot according to any one of claims 1 to 5, wherein the first positioning tag is provided in plurality, and a plurality of the first positioning tags are provided around the joint.
7. The method of determining joint positioning error of a robot of any of claims 1-6, characterized in that the first positioning tag comprises: two-dimensional codes and/or bar codes.
8. Method for determining joint positioning errors of a robot according to any of the claims 1-7, characterized in that the joint provided with the first positioning tag is an end joint.
9. A robot, comprising:
a joint and an end effector, a first positioning tag being provided on the surface of the joint or on the end effector;
at least one processor, wherein the processor is respectively connected with the driver of each joint so as to control the motion of each joint; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method of determining joint positioning error of a robot as claimed in any one of claims 1 to 8.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method of determining joint positioning errors of a robot of any one of claims 1 to 8.
CN202010177273.9A 2020-03-13 2020-03-13 Method for determining joint positioning error of robot, robot and storage medium Active CN111300484B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010177273.9A CN111300484B (en) 2020-03-13 2020-03-13 Method for determining joint positioning error of robot, robot and storage medium

Publications (2)

Publication Number Publication Date
CN111300484A 2020-06-19
CN111300484B 2022-10-21

Family

Family ID: 71155125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010177273.9A Active CN111300484B (en) 2020-03-13 2020-03-13 Method for determining joint positioning error of robot, robot and storage medium

Country Status (1)

Country Link
CN (1) CN111300484B (en)



Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001277167A (en) * 2000-03-31 2001-10-09 Okayama Pref Gov Shin Gijutsu Shinko Zaidan Three-dimensional attitude recognizing method
JP2017204085A (en) * 2016-05-10 2017-11-16 トヨタ自動車株式会社 Image recognition system
CN109531568A (en) * 2018-11-29 2019-03-29 浙江树人学院 A kind of joint of mechanical arm control method
CN110370271A (en) * 2019-04-30 2019-10-25 杭州亿恒科技有限公司 The joint transmission ratio error calibration method of industrial serial manipulator
CN110421562A (en) * 2019-07-24 2019-11-08 中国地质大学(武汉) Mechanical arm calibration system and scaling method based on four item stereo visions
CN110480638A (en) * 2019-08-20 2019-11-22 南京博约智能科技有限公司 A kind of self-compensating palletizing method of articulated robot position and attitude error and its palletizing system
CN110834333A (en) * 2019-11-14 2020-02-25 中科新松有限公司 Robot hand-eye calibration method and storage medium

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114619441A (en) * 2020-12-10 2022-06-14 北京极智嘉科技股份有限公司 Robot and two-dimensional code pose detection method
CN114619441B (en) * 2020-12-10 2024-03-26 北京极智嘉科技股份有限公司 Robot and two-dimensional code pose detection method
CN114147725A (en) * 2021-12-21 2022-03-08 乐聚(深圳)机器人技术有限公司 Zero point adjustment method, device, equipment and storage medium for robot
CN114147725B (en) * 2021-12-21 2024-04-02 乐聚(深圳)机器人技术有限公司 Zero point adjustment method, device and equipment for robot and storage medium
CN114536399A (en) * 2022-01-07 2022-05-27 中国人民解放军海军军医大学第一附属医院 Error detection method based on multiple pose identifications and robot system
CN114347037A (en) * 2022-02-16 2022-04-15 中国医学科学院北京协和医院 Robot system fault detection processing method based on composite identification and robot system
CN114536401A (en) * 2022-02-16 2022-05-27 中国医学科学院北京协和医院 Robot system fault detection processing method based on multiple pose identifications and robot system
CN114347037B (en) * 2022-02-16 2024-03-29 中国医学科学院北京协和医院 Robot system fault detection processing method based on composite identification and robot system
CN114536401B (en) * 2022-02-16 2024-03-29 中国医学科学院北京协和医院 Robot system fault detection processing method based on multiple pose identifiers and robot system

Also Published As

Publication number Publication date
CN111300484B (en) 2022-10-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
    Effective date of registration: 20210207
    Address after: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai
    Applicant after: Dalu Robot Co.,Ltd.
    Address before: 610094 West Section of Fucheng Avenue, Chengdu High-tech District, Sichuan Province
    Applicant before: CLOUDMINDS (CHENGDU) TECHNOLOGIES Co.,Ltd.
CB02 Change of applicant information
    Address after: 201111 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai
    Applicant after: Dayu robot Co.,Ltd.
    Address before: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai
    Applicant before: Dalu Robot Co.,Ltd.
GR01 Patent grant
GR01 Patent grant