CN115488883B - Robot hand-eye calibration method, device and system - Google Patents

Robot hand-eye calibration method, device and system

Info

Publication number
CN115488883B
CN115488883B (application CN202211081777.6A)
Authority
CN
China
Prior art keywords
robot
tail end
controlling
target
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211081777.6A
Other languages
Chinese (zh)
Other versions
CN115488883A (en)
Inventor
刘佳君
马涛
吴哲明
邹怡蓉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qunqing Huachuang Nanjing Intelligent Technology Co ltd
Original Assignee
Qunqing Huachuang Nanjing Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qunqing Huachuang Nanjing Intelligent Technology Co ltd filed Critical Qunqing Huachuang Nanjing Intelligent Technology Co ltd
Priority claimed from application CN202211081777.6A
Publication of CN115488883A
Application granted
Publication of CN115488883B
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B25J 9/1692: Calibration of manipulator
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J 9/1697: Vision controlled systems
    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02: Sensing devices
    • B25J 19/021: Optical sensing devices
    • B25J 19/022: Optical sensing devices using lasers
    • B25J 19/023: Optical sensing devices including video camera means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the technical field of robots, and in particular to a robot hand-eye calibration method, device and system. The robot hand-eye calibration method comprises the following steps: controlling the robot to move by a visual servoing method, so that the laser spot emitted by a laser rangefinder fixedly connected to the robot end coincides in sequence with a plurality of feature points; calculating a first coordinate set of the feature points in the robot coordinate system from the rangefinder readings taken when the laser spot coincides with each feature point; acquiring a second coordinate set of the feature points in the coordinate system of the camera to be calibrated; and calculating the transformation between the camera coordinate system to be calibrated and the robot coordinate system from the first coordinate set and the second coordinate set. With the robot hand-eye calibration method, device and system provided by the invention, the robot end never touches the calibration tool during calibration, so no collision can occur; the calibration is easy to perform, and both the calibration accuracy and the efficiency are high.

Description

Robot hand-eye calibration method, device and system
Technical Field
The invention relates to the technical field of robots, in particular to a robot hand-eye calibration method, device and system.
Background
At present, many vision-assisted robots must undergo hand-eye calibration before being put into operation. In the conventional robot hand-eye calibration method, the robot is controlled so that the tip of its end probe touches feature points on a calibration plate; images are collected by the camera to be calibrated and the coordinates of the feature points are extracted by image processing, while the coordinates of the end probe in the robot base coordinate system are obtained through coordinate transformation combined with robot kinematics, thereby achieving hand-eye calibration.
However, the robot end is prone to collision when approaching a feature point, which can shift or damage the calibration plate, or elastically deform or even break the robot end. Calibration is therefore difficult and its accuracy is low. Moreover, the manual adjustment process is inefficient and demands sustained attention, which further affects the calibration accuracy.
Disclosure of Invention
The invention aims to provide a robot hand-eye calibration method, device and system that solve the technical problems of the prior-art robot hand-eye calibration methods: high calibration difficulty, calibration accuracy that is hard to guarantee, and low efficiency.
The robot hand-eye calibration method provided by the invention comprises the following steps:
controlling the robot to move by a visual servoing method, so that the laser spot emitted by a laser rangefinder fixedly connected to the robot end coincides in sequence with a plurality of feature points;
calculating a first coordinate set of the plurality of feature points in the robot coordinate system from the rangefinder readings taken when the laser spot coincides with each feature point, together with the transformation between the rangefinder coordinate system and the robot coordinate system;
acquiring a second coordinate set of the plurality of feature points in the coordinate system of the camera to be calibrated;
and calculating the transformation between the camera coordinate system to be calibrated and the robot coordinate system from the first coordinate set and the second coordinate set.
Preferably, as an implementation manner, the visual servoing method includes:
sequentially taking a plurality of feature points as target feature points, and acquiring the positions of the target feature points;
controlling the tail end of the robot to translate a first distance along a first direction, and acquiring a first position where the laser spot is currently located; then, controlling the tail end of the robot to translate a second distance along a second direction, and acquiring a second position where the laser spot is currently located, wherein the second direction is opposite to the first direction, and the difference value between the second distance and the first distance is larger than or equal to a first preset difference value;
And acquiring a first target position closest to the target characteristic point on the connecting line of the first position and the second position, and controlling the tail end of the robot to drive the laser spot to translate to the first target position.
Preferably, as an implementation manner, the visual servoing method further includes:
controlling the tail end of the robot to translate a third distance along a third direction, and acquiring a third position where the laser spot is currently located; then, controlling the tail end of the robot to translate a fourth distance along a fourth direction, and acquiring a fourth position where the laser spot is currently located, wherein the third direction is perpendicular to the first direction, the fourth direction is opposite to the third direction, and a difference value between the fourth distance and the third distance is larger than or equal to a second preset difference value;
and acquiring a second target position closest to the target characteristic point on the connecting line of the third position and the fourth position, and controlling the tail end of the robot to drive the laser spot to translate to the second target position.
Preferably, as an implementation manner, the method further includes: controlling the tail end of the robot to sequentially circularly move along the first direction, the second direction, the third direction and the fourth direction;
After each time the tail end of the robot is controlled to translate along the first direction and the second direction once in sequence, a first interval value between the first target position and the target characteristic point is obtained; each time the tail end of the robot is controlled to translate along the third direction and the fourth direction once, a second interval value between the second target position and the target characteristic point is obtained;
and when the first interval value or the second interval value meets the superposition condition, controlling the tail end of the robot to stop circulating movement.
Preferably, as an implementation manner, the method further includes:
and controlling an auxiliary camera to acquire images of the target feature points and the laser spots, and acquiring the positions of the target feature points, the positions of the laser spots, the first target position, the second target position, the first interval value and the second interval value through image processing.
Preferably, as an implementation manner, the first distance and the third distance are both greater than or equal to the radius of the circumscribed circle of the target feature point, and the first preset difference value and the second preset difference value are likewise both greater than or equal to that radius.
Preferably, as an implementation manner, after the step of obtaining the first target position closest to the target feature point on the line between the first position and the second position, the method further includes: acquiring first position information of the first target position on a connecting line of the first position and the second position, feeding back a first moving distance and a first moving direction according to the first position information, and controlling the tail end of the robot to drive the laser spot to move along the first moving direction by the first moving distance so as to enable the laser spot to translate to the first target position;
and/or, after the step of obtaining the second target position closest to the target feature point on the line between the third position and the fourth position, the method further includes: and acquiring second position information of the second target position on a connecting line of the third position and the fourth position, feeding back a second moving distance and a second moving direction according to the second position information, and controlling the tail end of the robot to drive the laser spot to move along the second moving direction for the second moving distance so as to enable the laser spot to translate to the second target position.
Preferably, as an implementation manner, the method further includes:
controlling the tail end probe of the robot to randomly move to a certain position in a space, setting a fixed point at the position, and acquiring a third coordinate of the fixed point in the robot coordinate system;
controlling the robot to run, enabling the laser light spot to coincide with the fixed point so as to obtain a set of characteristic information, wherein the characteristic information comprises a fourth coordinate of the center of the tail end of the robot in the coordinate system of the robot, a vector of the tail end of the robot axially in the coordinate system of the tail end of the robot and a reading of the laser range finder;
controlling the robot to change the pose for a plurality of times to obtain a plurality of sets of characteristic information;
and according to the third coordinate and the plurality of sets of characteristic information, calculating to obtain a conversion relation between the laser range finder coordinate system and the robot coordinate system.
The invention also provides a robot hand-eye calibration device, which comprises:
the first control module is used for controlling the robot to move through a visual servo method, so that laser light spots emitted by a laser range finder fixedly connected to the tail end of the robot are sequentially overlapped with a plurality of characteristic points;
The first calculation module is used for calculating a first coordinate set of the plurality of characteristic points in the robot coordinate system according to the reading of the laser range finder when the laser light spots are overlapped with the plurality of characteristic points and the conversion relation between the laser range finder coordinate system and the robot coordinate system;
the acquisition module is used for acquiring a second coordinate set of the plurality of feature points in a camera coordinate system to be calibrated;
and the second calculation module is used for calculating and obtaining the conversion relation between the camera coordinate system to be calibrated and the robot coordinate system according to the first coordinate set and the second coordinate set.
The invention also provides a robot hand-eye calibration system, which comprises a robot, a laser range finder, a camera to be calibrated, a calibration tool and a controller, wherein the laser range finder is fixedly connected to the tail end of the robot, a plurality of characteristic points are arranged on the calibration tool, and the robot and the camera to be calibrated are electrically connected with the controller; the controller is used for executing the robot hand-eye calibration method.
Compared with the prior art, the invention has the beneficial effects that:
the robot hand-eye calibration method, device and system provided by the invention achieve robot hand-eye calibration with no contact between the robot end and the calibration tool during the calibration process. Consequently, no collision can occur, the feature points cannot shift or break, and the robot end is unlikely to deform elastically or even fracture; calibration is therefore easy, the calibration accuracy is high, and the robot is not easily damaged. In addition, the invention controls the robot motion by a visual servoing method, so no manual adjustment is needed, which improves efficiency and further guarantees the calibration accuracy.
Drawings
In order to illustrate more clearly the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present invention; a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a first schematic flow chart of a robot hand-eye calibration method provided by an embodiment of the invention;
FIG. 2 is a first schematic flow chart of a visual servoing method provided by an embodiment of the invention;
FIG. 3 is a first schematic flow chart of a calibration method of a laser assisted system according to an embodiment of the present invention;
FIG. 4 is a second schematic flow chart of a robot hand-eye calibration method provided by an embodiment of the invention;
FIG. 5 is a second schematic flow chart of the visual servoing method provided by an embodiment of the present invention;
FIG. 6 is a second schematic flow chart of a method for calibrating a vision assistance system provided by an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a robot hand-eye calibration device according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a robot hand-eye calibration system according to an embodiment of the present invention.
Reference numerals illustrate:
701-a first control module; 702-a first computing module; 703-an acquisition module; 704-a second computing module; 801-robot; 802-laser rangefinder; 803-a camera to be calibrated; 804-feature points; 805-control means; 806-an auxiliary camera; 807-data processing means.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings; the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
The invention will now be described in further detail by way of specific examples of embodiments in connection with the accompanying drawings.
FIG. 1 is a schematic flow chart of a method for calibrating a hand and an eye of a robot in one embodiment of the invention, the method comprising:
s102, controlling the robot to move through a visual servo method, so that laser spots emitted by a laser range finder fixedly connected to the tail end of the robot are overlapped with a plurality of characteristic points in sequence.
Referring to fig. 8, a laser rangefinder 802 is fixedly connected to the end of a robot 801, so that the transformation between the robot end coordinate system and the rangefinder coordinate system remains unchanged. When the laser spot emitted by the rangefinder 802 coincides with a feature point 804, the coordinates of the laser spot are the same as those of the feature point 804. The criterion for coincidence is that the center of the laser spot coincides with the center of the feature point; of course, if the laser spot and the feature point 804 have other corresponding reference points that can be used to determine coincidence, the criterion may instead be that those reference points coincide. The plurality of feature points 804 may be arranged separately or together on the same calibration tool, and the calibration tool may be planar or three-dimensional; for example, a calibration plate may be used as the calibration tool.
S104, calculating to obtain a first coordinate set of the plurality of characteristic points in the robot coordinate system according to the reading of the laser range finder when the laser light spots are overlapped with the plurality of characteristic points and the conversion relation between the laser range finder coordinate system and the robot coordinate system.
The coordinates of the laser spot (in practice, the coordinates of its center) are obtained directly from the rangefinder reading, and when the laser spot coincides with a feature point, the coordinates of the laser spot are the coordinates of that feature point (in practice, of its center). The first coordinate set consists of the first coordinates measured when the laser spot coincides with each of the feature points in turn.
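As an illustration of how a first coordinate can be computed, the sketch below expresses a rangefinder reading as a point in the robot base frame. It assumes homogeneous 4x4 transforms, that the beam lies along the rangefinder's local z axis, and hypothetical names (`T_base_end`, `T_end_lrf`); the patent does not prescribe this exact parameterization.

```python
import numpy as np

def spot_in_robot_frame(T_base_end: np.ndarray,
                        T_end_lrf: np.ndarray,
                        reading: float) -> np.ndarray:
    """Laser-spot position in the robot base frame.

    T_base_end: 4x4 pose of the robot end in the base frame (forward kinematics).
    T_end_lrf:  4x4 fixed pose of the rangefinder in the end frame.
    reading:    measured distance, taken here along the local z axis.
    """
    p_lrf = np.array([0.0, 0.0, reading, 1.0])  # spot in the rangefinder frame
    return (T_base_end @ T_end_lrf @ p_lrf)[:3]
```

Collecting one such point per feature point, at the moment of coincidence, yields the first coordinate set.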
S106, obtaining a second coordinate set of the plurality of feature points in the camera coordinate system to be calibrated.
The camera to be calibrated photographs the plurality of feature points; the second coordinates of the feature points in its coordinate system are obtained through image processing, and these second coordinates form the second coordinate set.
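For illustration only, a pinhole back-projection is one way such image processing can yield camera-frame coordinates, given the camera intrinsic matrix `K` and a known depth for each detected feature-point center (both assumptions; the patent does not specify how the second coordinates are computed):

```python
import numpy as np

def pixel_to_camera(K: np.ndarray, uv: tuple, depth: float) -> np.ndarray:
    """Back-project a pixel (u, v) at a known depth into camera coordinates."""
    u, v = uv
    x = (u - K[0, 2]) * depth / K[0, 0]  # (u - cx) * Z / fx
    y = (v - K[1, 2]) * depth / K[1, 1]  # (v - cy) * Z / fy
    return np.array([x, y, depth])
```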
S108, according to the first coordinate set and the second coordinate set, calculating to obtain a conversion relation between the camera coordinate system to be calibrated and the robot coordinate system.
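The patent does not name a solver for this step, but with the two coordinate sets in correspondence, the rigid transformation is commonly obtained by the SVD-based least-squares (Kabsch) method, sketched here:

```python
import numpy as np

def rigid_transform(P: np.ndarray, Q: np.ndarray):
    """Least-squares rigid transform (R, t) with R @ P[i] + t ~= Q[i].

    P: Nx3 points in the camera frame (second coordinate set).
    Q: Nx3 corresponding points in the robot frame (first coordinate set).
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

At least three non-collinear feature points are needed; more points average out measurement noise.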
In summary, the robot hand-eye calibration method provided by this embodiment achieves robot hand-eye calibration with no contact between the robot end and the calibration tool during the calibration process, so no collision can occur, the feature points cannot shift or be damaged, and the robot end is unlikely to deform elastically or even fracture; calibration is therefore easy, the calibration accuracy is high, and the robot is not easily damaged. In addition, the robot is moved by a visual servoing method, so no manual adjustment is needed, which improves efficiency and further guarantees the calibration accuracy.
FIG. 2 is a schematic flow chart of a visual servoing method, which may specifically include the steps of:
s202, sequentially taking the plurality of feature points as target feature points, and acquiring the positions of the target feature points.
S204, controlling the robot end to translate a first distance along a first direction, and acquiring the first position where the laser spot is currently located; then controlling the robot end to translate a second distance along a second direction, and acquiring the second position where the laser spot is currently located, wherein the second direction is opposite to the first direction and the difference between the second distance and the first distance is greater than or equal to a first preset difference value.
When the robot end moves according to these steps, the laser spot moves along a straight line parallel to the first direction, from its starting point to a position at a certain distance from that starting point. The first direction is preferably the positive x-axis direction of the robot end coordinate system and the second direction the negative x-axis direction, which makes the robot easier to control. The translation of the robot end is in fact a coordinated motion of all the robot joints that causes the end to translate.
S206, acquiring a first target position closest to the target feature point on the connection line of the first position and the second position, and controlling the tail end of the robot to drive the laser spot to translate to the first target position.
After the robot moves so that its end drives the laser spot to the first target position, the distance between the laser spot and the target feature point is reduced, bringing the spot closer to, or into coincidence with, the feature point. Specifically, the first position information of the first target position on the line connecting the first and second positions is obtained; a first moving distance and a first moving direction are fed back from this information; and the robot end is controlled to drive the laser spot by the first moving distance along the first moving direction, translating the spot to the first target position.
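The geometric core of steps S204 to S206, finding the point on the line through the first and second positions nearest the target and the feedback move needed to reach it, can be sketched as follows (a 2-D illustration with hypothetical names; in the embodiment the positions would come from image processing):

```python
import numpy as np

def nearest_on_line(p1: np.ndarray, p2: np.ndarray, target: np.ndarray):
    """Nearest point to `target` on the line through p1 and p2, plus the
    signed distance to move from p2 along the line to reach it (the sign
    encodes the moving direction fed back to the robot)."""
    u = (p2 - p1) / np.linalg.norm(p2 - p1)   # unit direction of the line
    nearest = p1 + np.dot(target - p1, u) * u # orthogonal projection
    move = np.dot(nearest - p2, u)            # signed feedback move from p2
    return nearest, move
```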
S208, controlling the tail end of the robot to translate a third distance along a third direction, and acquiring a third position where the laser spot is currently located; and then, controlling the tail end of the robot to translate a fourth distance along a fourth direction, and acquiring a fourth position where the laser spot is currently located, wherein the third direction is perpendicular to the first direction, the fourth direction is opposite to the third direction, and the difference value between the fourth distance and the third distance is greater than or equal to a second preset difference value.
When the robot end moves according to the above steps, the laser spot moves along a straight line perpendicular to the first direction, starting from the first target position it reached in step S206 and ending at a position a certain distance away. The third direction is preferably the positive y-axis direction of the robot end coordinate system and the fourth direction the negative y-axis direction, which makes the robot easier to control.
S210, a second target position closest to the target feature point on the connecting line of the third position and the fourth position is obtained, and the tail end of the robot is controlled to drive the laser spot to translate to the second target position.
After the robot moves so that its end drives the laser spot to the second target position, the distance between the laser spot and the target feature point is further reduced, bringing the spot closer to, or into coincidence with, the feature point. Specifically, the second position information of the second target position on the line connecting the third and fourth positions is obtained; a second moving distance and a second moving direction are fed back from this information; and the robot end is controlled to drive the laser spot by the second moving distance along the second moving direction, translating the spot to the second target position.
In fact, after the robot end has moved once along each of the first, second, third and fourth directions in turn, the laser spot center will usually not yet coincide with the target feature point. The robot end can therefore be controlled to move cyclically along the first, second, third and fourth directions in sequence, that is, steps S204 to S210 are executed repeatedly in order.
While steps S204 to S210 are repeated, each time the robot end has translated along the first and second directions in turn (that is, each time step S206 is executed), the first interval value between the current first target position and the target feature point is acquired; and each time the robot end has translated along the third and fourth directions in turn (that is, each time step S210 is executed), the second interval value between the current second target position and the target feature point is acquired. Specifically, the first and second interval values may be obtained by image processing.
When the acquired first or second interval value satisfies the coincidence condition, the robot end is controlled to stop the cyclic motion. That is, if after some execution of step S206 the first interval satisfies the coincidence condition, the laser spot is judged to coincide with the target feature point and need not approach it further, so the cyclic motion can be stopped; correspondingly, if after some execution of step S210 the second interval satisfies the coincidence condition, the laser spot is likewise judged to coincide with the target feature point and the cyclic motion is stopped. The coincidence condition may be that the interval value equals 0, or that it is smaller than some threshold close to 0. This iterative search for the minimum interval value allows the robot to align precisely with the feature points, which helps guarantee high hand-eye calibration accuracy.
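The cyclic motion described above can be illustrated by a small 2-D simulation: alternating line searches along two perpendicular directions, stopping when the distance to the target meets the coincidence condition. The step lengths and the direct observation of the spot position are simplifying assumptions made for this sketch:

```python
import numpy as np

def servo_to_target(spot, target, step=2.0, tol=1e-9, max_cycles=50):
    """Cycle the spot along +/-x then +/-y, each time jumping to the point
    on the scanned line nearest the target, until coincidence."""
    spot = np.asarray(spot, dtype=float)
    target = np.asarray(target, dtype=float)
    for _ in range(max_cycles):
        for axis in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
            p1 = spot + step * axis          # first (or third) translation
            p2 = p1 - 2.5 * step * axis      # opposite, longer translation
            u = (p1 - p2) / np.linalg.norm(p1 - p2)
            spot = p2 + np.dot(target - p2, u) * u  # nearest point on line
        if np.linalg.norm(spot - target) <= tol:    # coincidence condition
            return spot
    return spot
```

With exact observations and orthogonal directions the simulation converges in one cycle; in practice, measurement noise is what makes the repeated cycles of S204 to S210 necessary.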
Preferably, referring to fig. 8, an auxiliary camera 806 may be added and installed near the feature points 804 so that they lie within its field of view. The auxiliary camera 806 is controlled to collect images of the target feature point and the laser spot in the initial state, and the initial position coordinates of the target feature point center and the laser spot center in the auxiliary camera coordinate system are obtained through image processing. Similarly, when the end of the robot 801 moves to the first and second positions, the auxiliary camera 806 is controlled to collect the laser spot image in the current state, and the first and second position coordinates of the laser spot center in the auxiliary camera coordinate system are obtained through image processing. The coordinates of the first target position closest to the target feature point on the line connecting the first and second positions, and of the second target position closest to the target feature point on the line connecting the third and fourth positions, can likewise be obtained through image processing. It should be noted that, unlike an industrial camera, the auxiliary camera 806 has a smaller field of view, so its image has more pixels per unit length; the coordinates obtained for the laser spot center and the feature point center are therefore more accurate, which helps improve the calibration accuracy.
Specifically, the first distance is set to be greater than or equal to the radius of the circumscribed circle of the target feature point, and the first and second preset difference values are set to be greater than or equal to that radius. This confines the approach range to the vicinity of the marker point, which increases the operation speed while reducing the wear caused by the servo motion of the robot.
In practice, the third distance may be set to be consistent with the first distance, and the fourth distance may be set to be consistent with the second distance, and the second preset difference may be consistent with the first preset difference.
Fig. 3 is a schematic flowchart of a method for calibrating the laser auxiliary system, which may be used before step S102 to calibrate the laser auxiliary system in advance and obtain the conversion relation between the laser range finder coordinate system and the robot coordinate system used in step S104. The calibration method specifically includes the following steps:
S302, controlling the tail end probe of the robot to randomly move to a certain position in space, setting a fixed point at the position, and acquiring a third coordinate of the fixed point in the robot coordinate system.
The fixed point is the position at which the tail end probe of the robot stays after moving randomly to a certain position; it is not preset. That is, the fixed point does not yet exist while the robot is moving, so the tail end probe cannot touch it and collision is avoided, which guarantees the calibration precision and protects the robot from damage. In addition, in this step the robot motion may be controlled either automatically or manually.
S304, controlling the robot to run, enabling the laser spot to coincide with the fixed point, and obtaining a set of characteristic information, wherein the characteristic information comprises a fourth coordinate of the center of the tail end of the robot in a coordinate system of the robot, a vector of the axial direction of the tail end of the robot in the coordinate system of the tail end of the robot and a reading of the laser range finder.
S306, controlling the robot to change the pose for a plurality of times, and obtaining a plurality of sets of characteristic information.
The pose of the robot is changed so that the laser spot coincides with the fixed point under different pose states of the robot. Each time the pose is changed and the laser spot is again made to coincide with the fixed point, one set of characteristic information is acquired; changing the pose several times therefore yields several sets of characteristic information. In this step, the robot motion can be controlled automatically or manually.
And S308, calculating to obtain the conversion relation between the laser range finder coordinate system and the robot coordinate system according to the third coordinate and the plurality of sets of characteristic information.
According to the third coordinate and the multiple sets of characteristic information, the coordinates of the origin of the laser range finder coordinate system in the robot tail end coordinate system and the vector of the laser indication direction in the robot tail end coordinate system are calculated; the conversion relation between the laser range finder coordinate system and the robot coordinate system is then finally obtained from the conversion relation between the robot tail end coordinate system and the robot coordinate system.
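The relation described above can be written, for each pose i with tail end rotation R_i, translation t_i and range reading d_i, as R_i (o + d_i v) + t_i = p, where p is the third coordinate of the fixed point and o, v are the rangefinder origin and beam direction in the tail end frame. This is linear in (o, v), so a least-squares solve suffices. A minimal sketch under those assumptions (the function name and data layout are illustrative, not from the patent):

```python
import numpy as np

def calibrate_rangefinder(p_fixed, poses, readings):
    """Recover the rangefinder origin `o` and unit beam direction `v`
    in the robot tail end frame from the fixed-point constraint
        R_i @ (o + d_i * v) + t_i = p_fixed
    for every pose i, which is linear in the stacked unknown [o, v].

    poses    -- list of (R, t): tail end rotation (3x3), translation (3,)
    readings -- list of rangefinder readings d_i, one per pose
    """
    A_rows, b_rows = [], []
    for (R, t), d in zip(poses, readings):
        A_rows.append(np.hstack([R, d * R]))  # 3x6 block: [R, d*R] @ [o; v]
        b_rows.append(p_fixed - t)
    x, *_ = np.linalg.lstsq(np.vstack(A_rows), np.concatenate(b_rows),
                            rcond=None)
    o, v = x[:3], x[3:]
    return o, v / np.linalg.norm(v)  # normalise the beam direction
```

In this idealised noise-free form, two poses with distinct readings already determine the six unknowns; the several pose changes used in the patent over-determine the system and reduce the influence of measurement noise.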
FIG. 4 is a schematic flow chart of a robot hand-eye calibration method according to an embodiment of the invention, the method comprising:
S401, starting;
S402, calibrating a vision auxiliary system, wherein the vision auxiliary system comprises an auxiliary camera and a laser range finder fixedly connected to the tail end of the robot;
S403, controlling the laser range finder to emit laser;
S404, controlling the robot to move, and giving an initial position of the laser spot;
specifically, the initial position of the laser spot is close to the characteristic point.
S405, controlling an auxiliary camera to collect the laser spot center and the characteristic point center;
S406, servo-controlling the robot to move so that the center of the laser spot moves to the center of the characteristic point;
S407, acquiring coordinates of the feature point center in a robot coordinate system;
Specifically, according to the reading of the laser range finder and the conversion relation between the laser range finder coordinate system and the robot coordinate system, the coordinates of the laser spot center in the robot coordinate system can be calculated, which gives the coordinates of the feature point center in the robot coordinate system. There are a plurality of feature points, and each can in turn be taken as the target feature point according to the above steps, so that a series of coordinates of feature point centers in the robot coordinate system is obtained.
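As a small illustration of that coordinate chain (the names o, v, R_end, t_end are assumptions for illustration): the spot lies at distance d, the rangefinder reading, along the beam from the rangefinder origin, both expressed in the tail end frame, and is then mapped into the robot base frame through the current tail end pose:

```python
import numpy as np

def spot_in_robot_frame(o, v, d, R_end, t_end):
    """Laser-spot position in the robot base frame: the spot sits at
    distance `d` (rangefinder reading) along the unit beam direction
    `v` from the beam origin `o` (both in the tail end frame); the
    tail end pose (R_end, t_end) maps it into the base frame."""
    p_end = o + d * v              # spot in the robot tail end frame
    return R_end @ p_end + t_end   # spot in the robot base frame
```

Because the spot coincides with the feature point center at this moment, the returned coordinate is also the feature point center in the robot coordinate system.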
S408, acquiring coordinates of the feature point center in a camera coordinate system to be calibrated;
specifically, a camera to be calibrated is controlled to shoot a series of feature points, and coordinates of the centers of the series of feature points in a coordinate system of the camera to be calibrated are obtained through image processing.
S409, calibration is completed;
specifically, according to the coordinates of a series of feature point centers in the robot coordinate system and the coordinates of a series of feature point centers in the camera coordinate system to be calibrated, the conversion relation between the robot coordinate system and the camera coordinate system to be calibrated is obtained through calculation.
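The patent does not name the fitting procedure, but a standard way to obtain the rigid conversion relation between two paired sets of 3-D centers is the SVD-based (Kabsch) solution, sketched here purely for illustration:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) with R @ P_i + t ~= Q_i for
    paired 3-D points (rows of P and Q), via the SVD/Kabsch method."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
    t = cQ - R @ cP
    return R, t
```

Applied with P the feature point centers in the camera-to-be-calibrated coordinate system and Q the same centers in the robot coordinate system, (R, t) is the sought conversion relation; at least three non-collinear feature points are needed.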
S410, ending.
Fig. 5 is a schematic flowchart of a servo control method, which may be specifically used to control the motion of the robot to align the center of the laser spot with the center of the feature point in the step S406, where the servo control method includes:
S501, start.
S502, obtaining coordinates of the laser spot center P_0 and the feature point center O in the auxiliary camera coordinate system;
Specifically, the coordinates of the laser spot center P_0 and the feature point center O in the auxiliary camera coordinate system can be obtained by image processing.
S503, controlling the robot to translate along the x-axis of the tail end coordinate system by a distance d_1 in the positive direction and a distance d_2 in the negative direction, reaching points P_xn1 and P_xn2 respectively;
S504, obtaining the shortest distance L_min from O to the line through P_xn1 and P_xn2, and the coordinates of the shortest-distance point P_n1;
Specifically, after each movement of the robot, the auxiliary camera is controlled to collect the laser spot image, so as to obtain the coordinates in the auxiliary camera coordinate system of the laser spot center P_xn1 after the first translation and of the laser spot center P_xn2 after the second translation, where d_1 is not smaller than the radius R of the circumscribed circle of the feature point and |d_2 − d_1| ≥ R; further, the shortest distance L_min from the feature point center O to the line through P_xn1 and P_xn2, and the coordinates of the point P_n1 on that line corresponding to the shortest distance, are obtained by image processing.
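The shortest distance L_min and its foot point can be computed by projecting the feature point center onto the line through the two spot positions. An illustrative sketch in 2-D pixel coordinates (function name assumed, not from the patent):

```python
import numpy as np

def closest_point_on_line(O, P1, P2):
    """Foot of the perpendicular from the feature point center O to the
    line through the two spot positions P1 and P2, together with the
    shortest distance L_min (all inputs are 2-D pixel coordinates)."""
    O, P1, P2 = (np.asarray(p, dtype=float) for p in (O, P1, P2))
    u = P2 - P1
    s = np.dot(O - P1, u) / np.dot(u, u)  # scalar projection onto the line
    foot = P1 + s * u                     # the shortest-distance point
    return foot, float(np.linalg.norm(O - foot))
```

The foot point corresponds to P_n1 (or P_n2 for the y-axis pass), and the returned distance is compared against the preset threshold.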
S505, controlling the robot to move so that the laser spot center translates to the point P_n1, which serves as a new starting point;
Specifically, from the position information of P_n1 on the line through P_xn1 and P_xn2, a moving distance D is fed back to the robot, and the robot is controlled to move so that the laser spot center translates to the point P_n1.
S506, judging whether L_min ≤ threshold, where threshold is a preset critical value;
If not, executing step S507; if yes, step S511 is performed.
S507, controlling the robot to translate along the y-axis of the tail end coordinate system by a distance d_1 in the positive direction and a distance d_2 in the negative direction, reaching points P_yn1 and P_yn2 respectively;
S508, obtaining the shortest distance L_min from O to the line through P_yn1 and P_yn2, and the coordinates of the shortest-distance point P_n2;
Specifically, after each movement of the robot, the auxiliary camera is controlled to collect the laser spot image, so as to obtain the laser spot center positions P_yn1 and P_yn2 after the two translations; further, the shortest distance L_min from the feature point center O to the line through P_yn1 and P_yn2, and the coordinates of the point P_n2 on that line corresponding to the shortest distance, are obtained by image processing.
S509, controlling the robot to move so that the laser spot center translates to the point P_n2, which serves as a new starting point;
Specifically, from the position information of P_n2 on the line through P_yn1 and P_yn2, a moving distance D is fed back to the robot, and the robot is controlled to move so that the laser spot center translates to the point P_n2.
S510, judging whether L_min ≤ threshold;
If not, returning to execute step S503; if yes, executing step S511. The value of n is the number of times the corresponding step has been executed: for example, when step S503 is executed for the first time, n = 1; when it is executed for the second time, n = 2, and so on.
S511, it is determined that the laser spot center coincides with the feature point center.
S512, ending.
Fig. 6 is a schematic flowchart of a calibration method for a vision assistance system, which may be specifically used to calibrate the vision assistance system in the step S402 described above:
S601, controlling the robot to move, so that the probe at the tail end of the robot randomly moves to a certain position in space, setting a fixed point at the position, and recording the coordinates of the fixed point in the robot coordinate system;
S602, controlling the robot to drive the laser range finder to move so that the laser spot falls on the fixed point set in step S601, recording the coordinates of the center of the tail end of the robot in the robot coordinate system and the vector of the axial direction of the tail end of the robot in the robot coordinate system, and recording the reading of the laser range finder at that moment;
wherein, the robot end center is defined as a robot sixth axis center, and the robot end axial direction is defined as a robot sixth axis axial direction.
S603, controlling the robot to change at least 5 poses, and repeating the step S602;
S604, calculating, according to the results recorded in the above steps, the coordinates of the origin of the laser range finder coordinate system in the robot tail end coordinate system and the vector of the laser indication direction in the robot tail end coordinate system, and further obtaining the conversion relation between the laser range finder coordinate system and the robot coordinate system;
S605, calibration is completed.
FIG. 7 is a diagram of a robot hand-eye calibration device according to an embodiment of the present invention, comprising:
the first control module 701 is configured to control the movement of the robot by using a visual servo method, so that a laser spot emitted by a laser range finder fixedly connected to the end of the robot sequentially coincides with a plurality of feature points;
the first calculation module 702 is configured to calculate a first coordinate set of the plurality of feature points in the robot coordinate system according to the readings of the laser rangefinder when the laser light spot is overlapped with the plurality of feature points and the conversion relationship between the laser rangefinder coordinate system and the robot coordinate system;
an obtaining module 703, configured to obtain a second coordinate set of the plurality of feature points in the camera coordinate system to be calibrated;
the second calculating module 704 is configured to calculate a conversion relationship between the camera coordinate system to be calibrated and the robot coordinate system according to the first coordinate set and the second coordinate set.
The robot hand-eye calibration device provided by this embodiment can realize robot hand-eye calibration without the tail end of the robot contacting the calibration tool during the calibration process, so collisions cannot occur, the characteristic points are not shifted or damaged, and the tail end of the robot is not deformed or broken; the calibration is therefore easy, the calibration precision is high, and the robot is not easily damaged. In addition, the robot is controlled to move by a visual servo method without manual adjustment, which improves efficiency and further guarantees the calibration accuracy.
Optionally, as an embodiment, the first control module 701 specifically includes:
the first control unit is used for sequentially taking the plurality of characteristic points as target characteristic points and acquiring the positions of the target characteristic points; the tail end of the robot is controlled to translate a first distance along a first direction, and a first position where a laser spot is currently located is obtained; then, controlling the tail end of the robot to translate a second distance along a second direction, and acquiring a second position where the laser spot is currently located, wherein the second direction is opposite to the first direction, and the difference value between the second distance and the first distance is larger than or equal to a first preset difference value; and acquiring a first target position closest to the target characteristic point on the connecting line of the first position and the second position and a first interval value between the first target position and the target characteristic point, and controlling the tail end of the robot to drive the laser spot to translate to the first target position.
The second control unit is used for controlling the tail end of the robot to translate a third distance along a third direction and acquiring a third position where the laser spot is currently located; then, controlling the tail end of the robot to translate a fourth distance along a fourth direction, and acquiring a fourth position where the laser spot is currently located, wherein the third direction is perpendicular to the first direction, the fourth direction is opposite to the third direction, and the difference value between the fourth distance and the third distance is greater than or equal to a second preset difference value; and acquiring a second target position closest to the target characteristic point on the connecting line of the third position and the fourth position and a second interval value between the second target position and the target characteristic point, and controlling the tail end of the robot to drive the laser spot to translate to the second target position.
And the third control unit is used for controlling the tail end of the robot to circularly move along the first direction, the second direction, the third direction and the fourth direction in sequence, and controlling the tail end of the robot to stop circularly moving when the first interval value or the second interval value meets the superposition condition.
Optionally, as an embodiment, the robot hand-eye calibration device provided in this embodiment may further include:
the second control module is used for controlling the probe at the tail end of the robot to randomly move to a certain position in a space, setting a fixed point at the position and obtaining a third coordinate of the fixed point in a robot coordinate system;
the third control module is used for controlling the robot to run, so that the laser light spots coincide with the fixed points to obtain a set of characteristic information, wherein the characteristic information comprises a fourth coordinate of the center of the tail end of the robot in a coordinate system of the robot, a vector of the axial direction of the tail end of the robot in the coordinate system of the tail end of the robot and the reading of the laser range finder;
and the fourth control module is used for controlling the robot to change the pose for a plurality of times and obtaining a plurality of sets of characteristic information.
And the third calculation module is used for calculating and obtaining the conversion relation between the coordinate system of the laser range finder and the coordinate system of the robot according to the third coordinates and the plurality of sets of characteristic information.
Fig. 8 is a diagram of a robot hand-eye calibration system according to an embodiment of the present invention, which includes a robot 801, a laser rangefinder 802, a camera 803 to be calibrated, a calibration tool and a controller, wherein the laser rangefinder 802 is fixedly connected to the end of the robot 801, a plurality of feature points 804 are disposed on the calibration tool, and the robot 801 and the camera 803 to be calibrated are electrically connected to the controller; the controller is used for executing the robot hand-eye calibration method.
The robot hand-eye calibration system provided in this embodiment further includes an auxiliary camera 806 for collecting laser spot images and feature point images.
The controller may specifically include a control device 805 for controlling the movement of the robot 801, and a data processing device 807, where the camera 803 to be calibrated and the auxiliary camera 806 may be connected to the data processing device 807, and the data processing device 807 may acquire images acquired by the auxiliary camera 806 and the camera 803 to be calibrated, and acquire corresponding data information through image processing; the control means 805 is connected to the data processing means 807 and the robot 801, respectively, so that the control means 805 can control the robot 801 to perform a corresponding movement based on the data information obtained by the data processing means 807.
Of course, it will be appreciated by those skilled in the art that all or part of the methods in the above embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above method embodiments, where the storage medium may be a memory, a magnetic disk, an optical disk, or the like.
Although the present invention is disclosed above, it is not limited thereto. Various changes and modifications may be made by one skilled in the art without departing from the spirit and scope of the invention, and the scope of the invention should accordingly be determined by the appended claims.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the present specification, the embodiments are described in a progressive manner, each embodiment focusing on its differences from the others; for identical or similar parts between the embodiments, reference may be made to one another.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (7)

1. The robot hand-eye calibration method is characterized by comprising the following steps of:
controlling the robot to move through a visual servo method, so that laser light spots emitted by a laser range finder fixedly connected to the tail end of the robot are sequentially overlapped with a plurality of characteristic points;
according to the reading of the laser range finder when the laser light spots are overlapped with the characteristic points and the conversion relation between the laser range finder coordinate system and the robot coordinate system, calculating to obtain a first coordinate set of the characteristic points in the robot coordinate system;
acquiring a plurality of second coordinate sets of the characteristic points in a camera coordinate system to be calibrated;
according to the first coordinate set and the second coordinate set, calculating to obtain a conversion relation between the camera coordinate system to be calibrated and the robot coordinate system;
the visual servoing method comprises the following steps:
sequentially taking a plurality of feature points as target feature points, and acquiring the positions of the target feature points;
controlling the tail end of the robot to translate a first distance along a first direction, and acquiring a first position where the laser spot is currently located; then, controlling the tail end of the robot to translate a second distance along a second direction, and acquiring a second position where the laser spot is currently located, wherein the second direction is opposite to the first direction, and the difference value between the second distance and the first distance is larger than or equal to a first preset difference value;
acquiring a first target position closest to the target characteristic point on a connecting line of the first position and the second position, and controlling the tail end of the robot to drive the laser spot to translate to the first target position;
controlling the tail end of the robot to translate a third distance along a third direction, and acquiring a third position where the laser spot is currently located; then, controlling the tail end of the robot to translate a fourth distance along a fourth direction, and acquiring a fourth position where the laser spot is currently located, wherein the third direction is perpendicular to the first direction, the fourth direction is opposite to the third direction, and a difference value between the fourth distance and the third distance is larger than or equal to a second preset difference value;
acquiring a second target position closest to the target characteristic point on a connecting line of the third position and the fourth position, and controlling the tail end of the robot to drive the laser spot to translate to the second target position;
controlling the tail end of the robot to sequentially circularly move along the first direction, the second direction, the third direction and the fourth direction;
after each time the tail end of the robot is controlled to translate along the first direction and the second direction once in sequence, a first interval value between the first target position and the target characteristic point is obtained; each time the tail end of the robot is controlled to translate along the third direction and the fourth direction once, a second interval value between the second target position and the target characteristic point is obtained;
And when the first interval value or the second interval value meets the superposition condition, controlling the tail end of the robot to stop circulating movement.
2. The robot hand-eye calibration method according to claim 1, further comprising:
and controlling an auxiliary camera to acquire images of the target feature points and the laser spots, and acquiring the positions of the target feature points, the positions of the laser spots, the first target position, the second target position, the first interval value and the second interval value through image processing.
3. The robot hand-eye calibration method according to claim 1 or 2, wherein the first distance and the third distance are equal to or greater than a radius of a circumscribed circle of the target feature point, and the first preset difference value and the second preset difference value are equal to or greater than the radius of the circumscribed circle of the target feature point.
4. The robot hand-eye calibration method according to claim 1 or 2, wherein after the step of acquiring a first target position closest to the target feature point on a line connecting the first position and the second position, the visual servoing method further comprises: acquiring first position information of the first target position on a connecting line of the first position and the second position, feeding back a first moving distance and a first moving direction according to the first position information, and controlling the tail end of the robot to drive the laser spot to move along the first moving direction by the first moving distance so as to enable the laser spot to translate to the first target position;
And/or, after the step of obtaining the second target position closest to the target feature point on the line between the third position and the fourth position, the visual servoing method further includes: and acquiring second position information of the second target position on a connecting line of the third position and the fourth position, feeding back a second moving distance and a second moving direction according to the second position information, and controlling the tail end of the robot to drive the laser spot to move along the second moving direction for the second moving distance so as to enable the laser spot to translate to the second target position.
5. The robot hand-eye calibration method according to claim 1 or 2, characterized in that the robot hand-eye calibration method further comprises:
controlling the tail end probe of the robot to randomly move to a certain position in a space, setting a fixed point at the position, and acquiring a third coordinate of the fixed point in the robot coordinate system;
controlling the robot to run, enabling the laser light spot to coincide with the fixed point so as to obtain a set of characteristic information, wherein the characteristic information comprises a fourth coordinate of the center of the tail end of the robot in the coordinate system of the robot, a vector of the tail end of the robot axially in the coordinate system of the tail end of the robot and a reading of the laser range finder;
controlling the robot to change the pose for a plurality of times to obtain a plurality of sets of characteristic information;
and according to the third coordinate and the plurality of sets of characteristic information, calculating to obtain a conversion relation between the laser range finder coordinate system and the robot coordinate system.
6. A robot hand-eye calibration device, comprising:
the first control module is used for controlling the robot to move through a visual servo method, so that laser light spots emitted by a laser range finder fixedly connected to the tail end of the robot are sequentially overlapped with a plurality of characteristic points;
the first calculation module is used for calculating a first coordinate set of the plurality of characteristic points in the robot coordinate system according to the reading of the laser range finder when the laser light spots are overlapped with the plurality of characteristic points and the conversion relation between the laser range finder coordinate system and the robot coordinate system;
the acquisition module is used for acquiring a plurality of second coordinate sets of the characteristic points in a camera coordinate system to be calibrated;
the second calculation module is used for calculating and obtaining the conversion relation between the camera coordinate system to be calibrated and the robot coordinate system according to the first coordinate set and the second coordinate set;
The first control module includes:
the first control unit is used for sequentially taking the plurality of characteristic points as target characteristic points and acquiring the positions of the target characteristic points; the device is also used for controlling the tail end of the robot to translate a first distance along a first direction and acquiring a first position where the laser spot is currently located; the method is also used for controlling the tail end of the robot to translate a second distance along a second direction and obtaining a second position where the laser spot is currently located, wherein the second direction is opposite to the first direction, and the difference value between the second distance and the first distance is larger than or equal to a first preset difference value; the laser spot translation device is also used for acquiring a first target position closest to the target characteristic point on the connecting line of the first position and the second position, and controlling the tail end of the robot to drive the laser spot to translate to the first target position;
the second control unit is used for controlling the tail end of the robot to translate a third distance along a third direction and acquiring a third position where the laser spot is currently located; the method is also used for controlling the tail end of the robot to translate a fourth distance along a fourth direction and obtaining a fourth position where the laser spot is currently located, wherein the third direction is perpendicular to the first direction, the fourth direction is opposite to the third direction, and the difference value between the fourth distance and the third distance is larger than or equal to a second preset difference value; the laser spot is further used for acquiring a second target position closest to the target characteristic point on the connecting line of the third position and the fourth position, and controlling the tail end of the robot to drive the laser spot to translate to the second target position;
A third control unit for controlling the robot tip to move cyclically in the first direction, the second direction, the third direction and the fourth direction in sequence; the first distance value between the first target position and the target feature point is obtained after the tail end of the robot is controlled to translate along the first direction and the second direction once in sequence; the method is further used for obtaining a second distance value between the second target position and the target characteristic point after each control of the tail end of the robot sequentially translates along the third direction and the fourth direction once; and the robot terminal is further used for controlling the robot terminal to stop circulating movement when the first interval value or the second interval value meets the superposition condition.
7. A robot hand-eye calibration system, characterized by comprising a robot, a laser range finder, a camera to be calibrated, a calibration tool and a controller, wherein the laser range finder is fixedly connected to the tail end of the robot, a plurality of feature points are arranged on the calibration tool, and the robot and the camera to be calibrated are electrically connected to the controller;
the controller is configured to perform the robot hand-eye calibration method of any one of claims 1-5.
CN202211081777.6A 2022-09-06 2022-09-06 Robot hand-eye calibration method, device and system Active CN115488883B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211081777.6A CN115488883B (en) 2022-09-06 2022-09-06 Robot hand-eye calibration method, device and system

Publications (2)

Publication Number Publication Date
CN115488883A CN115488883A (en) 2022-12-20
CN115488883B true CN115488883B (en) 2023-11-07

Family

ID=84468592

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211081777.6A Active CN115488883B (en) 2022-09-06 2022-09-06 Robot hand-eye calibration method, device and system

Country Status (1)

Country Link
CN (1) CN115488883B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103426166A (en) * 2013-07-09 2013-12-04 杭州电子科技大学 Robot hand-eye co-location method based on laser and single eye
CN105014678A (en) * 2015-07-16 2015-11-04 深圳市得意自动化科技有限公司 Robot hand-eye calibration method based on laser range finding
CN106839979A (en) * 2016-12-30 2017-06-13 上海交通大学 The hand and eye calibrating method of line structured laser sensor
CN107253190A (en) * 2017-01-23 2017-10-17 梅卡曼德(北京)机器人科技有限公司 The device and its application method of a kind of high precision machines people trick automatic camera calibration
CN110197461A (en) * 2019-06-06 2019-09-03 上海木木聚枞机器人科技有限公司 A kind of coordinate transformation relation determines method, apparatus, equipment and storage medium
CN111238368A (en) * 2020-01-15 2020-06-05 中山大学 Three-dimensional scanning method and device
CN111272098A (en) * 2020-03-28 2020-06-12 新蔚来智能科技(深圳)有限公司 Calibration method and calibration device for laser sensor mounting position

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021012124A1 (en) * 2019-07-19 2021-01-28 西门子(中国)有限公司 Robot hand-eye calibration method and apparatus, computing device, medium and product

Also Published As

Publication number Publication date
CN115488883A (en) 2022-12-20

Similar Documents

Publication Publication Date Title
CN110238849B (en) Robot hand-eye calibration method and device
CN108161936B (en) Optimized robot calibration method and device
CN110666798B (en) Robot vision calibration method based on perspective transformation model
CN113825980A (en) Robot eye calibration method, device, computing equipment, medium and product
US20140233094A1 (en) Microscope system and storage medium
CN105014678A (en) Robot hand-eye calibration method based on laser range finding
JP2005201824A (en) Measuring device
CN114174006A (en) Robot eye calibration method, device, computing equipment, medium and product
CN111012506A (en) Robot-assisted puncture surgery end tool center calibration method based on stereoscopic vision
CN110936378B (en) Robot hand-eye relation automatic calibration method based on incremental compensation
CN110370316A (en) It is a kind of based on the robot TCP scaling method vertically reflected
CN112620926A (en) Welding spot tracking method and device and storage medium
CN112082477A (en) Universal tool microscope three-dimensional measuring device and method based on structured light
CN114714029A (en) Automatic arc welding method and device for aluminium alloy
JP2016097474A (en) Robot and robot system
CN115488883B (en) Robot hand-eye calibration method, device and system
US11922616B2 (en) Alignment device
WO2018188442A1 (en) Imaging method, device and system
CN114670199A (en) Identification positioning device, system and real-time tracking system
CN108152829B (en) Two-dimensional laser radar mapping device with linear guide rail and mapping method thereof
CN112565591A (en) Automatic focusing lens calibration method, electronic equipment and storage medium
JP2019188467A (en) Recording device, welding support device, recording method and program
CN114654457B (en) Multi-station precise alignment method for mechanical arm with long-short vision distance guidance
CN116276938A (en) Mechanical arm positioning error compensation method and device based on multi-zero visual guidance
JP2017102405A (en) Microscope, image pasting method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 103, Building A, No. 59 Wangqiao Road, Xiongzhou Street, Liuhe District, Nanjing City, Jiangsu Province, 211500

Applicant after: Qunqing Huachuang (Nanjing) Intelligent Technology Co.,Ltd.

Address before: Room 0106-674, Floor 1, No. 26, Shangdi Information Road, Haidian District, Beijing 100085

Applicant before: Qunqing Huachuang (Beijing) Intelligent Technology Co.,Ltd.

GR01 Patent grant