CN111890354A - Robot hand-eye calibration method, device and system - Google Patents

Robot hand-eye calibration method, device and system

Info

Publication number
CN111890354A
CN111890354A (application number CN202010603344.7A)
Authority
CN
China
Prior art keywords
laser
calibration
robot
frame
mechanical arm
Prior art date
Legal status
Granted
Application number
CN202010603344.7A
Other languages
Chinese (zh)
Other versions
CN111890354B (en)
Inventor
罗法蕾
Current Assignee
Peking University
Original Assignee
Peking University
Priority date
Filing date
Publication date
Application filed by Peking University
Priority claimed from CN202010603344.7A
Publication of CN111890354A
Application granted
Publication of CN111890354B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095 Means or methods for testing manipulators

Abstract

The application provides a robot hand-eye calibration method, device and system. The method includes: controlling a laser calibration frame to generate a laser calibration point set for calibrating the robot, wherein the laser calibration point set includes a plurality of calibration points formed by intersecting laser beams; controlling the front end of a mechanical arm of the robot to sequentially touch the plurality of calibration points in the laser calibration point set; shooting, each time the front end of the mechanical arm touches a calibration point, a calibration image of the front end of the mechanical arm touching that calibration point; determining a first conversion matrix between the pixel coordinates of the calibration points in the calibration images and their physical coordinates in a calibration frame coordinate system; and performing hand-eye calibration on the robot according to the first conversion matrix. Because the calibration points are optical signals rather than a physical calibration plate, the method causes no physical damage to the mechanical arm, which both ensures calibration accuracy and helps preserve the motion precision and service life of the mechanical arm.

Description

Robot hand-eye calibration method, device and system
Technical Field
The application relates to the technical field of robots, in particular to a robot hand-eye calibration method, device and system.
Background
A robot is a machine that executes work automatically: it can accept human commands, run pre-programmed programs, and act according to principles formulated with artificial intelligence techniques.
To meet the manufacturing industry's need for transformation and upgrading, intelligent industrial robots with integrated vision systems are increasingly used in modern factories. In the robotics field, vision-based pose estimation that helps a mechanical arm grasp targets accurately has important applications, and hand-eye calibration is one of its most basic and critical problems. Simply put, the aim of hand-eye calibration is to obtain the relationship between the robot coordinate system and the camera coordinate system, so that vision recognition results can finally be transferred into the robot coordinate system.
In the prior art, hand-eye calibration of a robot is mainly realized with a calibration plate. The frequent physical contact between the mechanical arm and the calibration plate causes physical damage to both, which affects calibration accuracy as well as the motion precision and service life of the mechanical arm.
Disclosure of Invention
The application aims to provide a robot hand-eye calibration method, device and system.
The application provides a robot hand-eye calibration method in a first aspect, which includes:
controlling a laser calibration frame to generate a laser calibration point set for calibrating the robot, wherein the laser calibration point set includes a plurality of calibration points formed by intersecting laser beams;
controlling the front end of a mechanical arm of the robot to sequentially touch the plurality of calibration points in the laser calibration point set;
shooting a calibration image of the front end of the mechanical arm touching the calibration point each time the front end of the mechanical arm touches the calibration point;
determining a first conversion matrix of pixel coordinates of the calibration point in the calibration image and physical coordinates in a calibration frame coordinate system;
and calibrating the hand and the eye of the robot according to the first conversion matrix.
In some embodiments of the first aspect of the present application, the laser calibration frame includes a frame body and a plurality of laser emitters disposed along the frame body, wherein laser beams emitted by the plurality of laser emitters intersect to form a plurality of calibration points within the frame body.
In some embodiments of the first aspect of the present application, the laser calibration frame further includes laser receivers arranged in one-to-one correspondence with the laser emitters, each laser receiver receiving the laser beam emitted by its corresponding laser emitter;
after controlling the front end of the mechanical arm of the robot to sequentially touch the plurality of calibration points in the laser calibration point set, the method further includes:
detecting whether the front end of the mechanical arm has touched a calibration point according to change information of the laser beam received by each laser receiver.
In some embodiments of the first aspect of the present application, the method further comprises:
determining two laser receivers with weakened received laser beams according to the change information of the laser beams received by each laser receiver when the front end of the mechanical arm touches the calibration point each time;
and determining the physical coordinates of the calibration point touched by the front end of the mechanical arm in a calibration frame coordinate system according to the intersection position of the laser beams received by the two laser receivers.
A second aspect of the present application provides a robot hand-eye calibration device, including:
the laser calibration point set generating module is used for controlling a laser calibration frame to generate a laser calibration point set used for calibrating the robot, wherein the laser calibration point set comprises a plurality of calibration points formed by intersecting laser beams;
the calibration point touch module is used for controlling the front end of a mechanical arm of the robot to sequentially touch a plurality of the calibration points in the laser calibration point set;
the calibration image shooting module is used for shooting a calibration image of the front end of the mechanical arm touching the calibration point each time the front end of the mechanical arm touches the calibration point;
the conversion matrix determining module is used for determining a first conversion matrix of the pixel coordinates of the calibration point in the calibration image and the physical coordinates in the calibration frame coordinate system;
and the hand-eye calibration module is used for performing hand-eye calibration on the robot according to the first conversion matrix.
In some embodiments of the second aspect of the present application, the laser calibration frame includes a frame body and a plurality of laser emitters disposed along the frame body, wherein laser beams emitted by the plurality of laser emitters intersect to form a plurality of calibration points within the frame body.
In some embodiments of the second aspect of the present application, the laser calibration frame further includes laser receivers arranged in one-to-one correspondence with the laser emitters, each laser receiver receiving the laser beam emitted by its corresponding laser emitter;
the device further comprises:
and the touch detection module is used for detecting whether the front end of the mechanical arm touches the calibration point or not according to the change information of the laser beam received by each laser receiver.
In some embodiments of the second aspect of the present application, the apparatus further comprises:
the change receiver determining module is used for determining two laser receivers which are weakened in received laser beams according to the change information of the laser beams received by each laser receiver when the front end of the mechanical arm touches the calibration point each time;
and the physical coordinate determination module is used for determining the physical coordinates of the calibration point touched by the front end of the mechanical arm in a calibration frame coordinate system according to the intersection position of the laser beams received by the two laser receivers.
In a third aspect, the present application provides a robot for performing the method provided in the first aspect of the present application to achieve hand-eye calibration.
A fourth aspect of the present application provides a robot hand-eye calibration system, including the robot and a laser calibration frame;
the robot is connected with the laser calibration frame;
the laser calibration frame is used for generating, under the control of the robot, a laser calibration point set used for calibrating the robot;
the robot is used for realizing hand-eye calibration by executing the method provided in the first aspect of the present application based on the laser calibration frame.
Compared with the prior art, the robot hand-eye calibration method provided by the application controls a laser calibration frame to generate a laser calibration point set for calibrating the robot, the laser calibration point set including a plurality of calibration points formed by intersecting laser beams; controls the front end of the mechanical arm of the robot to sequentially touch the plurality of calibration points; shoots a calibration image each time the front end of the mechanical arm touches a calibration point; determines a first conversion matrix between the pixel coordinates of the calibration points in the calibration images and their physical coordinates in the calibration frame coordinate system; and performs hand-eye calibration on the robot according to the first conversion matrix. The laser calibration point set replaces the traditional calibration plate. Because the calibration points are optical signals, there is no physical contact with, and hence no physical damage to, the mechanical arm, which both ensures calibration accuracy and helps preserve the motion precision and service life of the mechanical arm.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 illustrates a flow chart of a robot hand-eye calibration method provided by some embodiments of the present application;
FIG. 2 illustrates a schematic diagram of a laser calibration frame provided by some embodiments of the present application;
FIG. 3 illustrates a schematic diagram of a hand-eye calibration principle provided by some embodiments of the present application;
FIG. 4 illustrates a schematic diagram of a robotic hand-eye calibration apparatus provided in some embodiments of the present application;
fig. 5 illustrates a schematic diagram of a robot provided by some embodiments of the present application.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
It is to be noted that, unless otherwise specified, technical or scientific terms used herein shall have the ordinary meaning as understood by those skilled in the art to which this application belongs.
In addition, the terms "first" and "second", etc. are used to distinguish different objects, rather than to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
The embodiment of the application provides a robot hand-eye calibration method, device and system, which are described in an exemplary manner by combining the embodiment and the accompanying drawings.
Referring to fig. 1, which shows a flowchart of a robot hand-eye calibration method provided in some embodiments of the present application, as shown in fig. 1, the robot hand-eye calibration method may include the following steps:
step S101: controlling a laser calibration frame to generate a laser calibration point set for calibrating the robot, wherein the laser calibration point set includes a plurality of calibration points formed by intersecting laser beams;
step S102: controlling the front end of a mechanical arm of the robot to sequentially touch the plurality of calibration points in the laser calibration point set;
step S103: shooting a calibration image of the front end of the mechanical arm touching the calibration point each time the front end of the mechanical arm touches the calibration point;
step S104: and determining a first conversion matrix of the pixel coordinates of the calibration point in the calibration image and the physical coordinates in a calibration frame coordinate system.
Step S105: and calibrating the hand and the eye of the robot according to the first conversion matrix.
Compared with the prior art, the robot hand-eye calibration method provided by the embodiment of the application generates a laser calibration point set for calibrating the robot by controlling the laser calibration frame, the laser calibration point set including a plurality of calibration points formed by intersecting laser beams; controls the front end of the mechanical arm of the robot to sequentially touch the plurality of calibration points; shoots a calibration image each time the front end of the mechanical arm touches a calibration point; determines a first conversion matrix between the pixel coordinates of the calibration points in the calibration images and their physical coordinates in the calibration frame coordinate system; and performs hand-eye calibration on the robot according to the first conversion matrix. The laser calibration point set replaces the traditional calibration plate; because the calibration points are optical signals, there is no physical contact with, and hence no physical damage to, the mechanical arm, which both ensures calibration accuracy and helps preserve the motion precision and service life of the mechanical arm.
For ease of understanding, refer to fig. 2, which illustrates a schematic diagram of a laser calibration frame provided in some embodiments of the present application, and as shown in the figure, the laser calibration frame includes a frame body and a plurality of laser emitters arranged along the frame body, and laser beams emitted by the plurality of laser emitters intersect to form a plurality of calibration points in the frame body.
The outline of the frame body may be a square as shown in fig. 2, or a rectangle, a circle, an ellipse, or any other shape; as long as the emission direction of each laser beam is stable, the physical coordinates of each calibration point in the calibration frame coordinate system can be determined.
It is easy to understand that when the laser beams are mutually perpendicular, the position of a calibration point can be conveniently determined from the position where the beams perpendicularly intersect, which simplifies the calculation.
Furthermore, when the frame body is square or rectangular, the position coordinates of each laser emitter and each laser receiver are easier to determine, which makes it convenient to determine the spatial position of each laser beam and thus the position of each calibration point, further improving calculation and calibration speed.
In some preferred embodiments, as shown in fig. 2, the frame body is rectangular, and laser emitters are disposed on two adjacent, mutually perpendicular sides of the frame body, each emitting a laser beam perpendicular to the side on which it is located. The laser emitters on the same side may be disposed at equal intervals, forming a regularly arranged array of calibration points, which helps improve calibration speed and accuracy.
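The regularly spaced calibration point array described above can be sketched as follows; this is a minimal illustration, and the frame dimensions, emitter counts, and spacing are assumed values, not taken from the application:

```python
# Emitters on one side fire vertical beams (constant x); emitters on the
# adjacent perpendicular side fire horizontal beams (constant y). Each
# vertical/horizontal beam pair intersects at one calibration point.
def calibration_grid(frame_w, frame_h, n_x, n_y):
    """(x, y) calibration points, in the calibration frame coordinate system,
    formed by n_x equally spaced vertical and n_y horizontal beams."""
    spacing_x = frame_w / (n_x + 1)
    spacing_y = frame_h / (n_y + 1)
    xs = [spacing_x * (i + 1) for i in range(n_x)]
    ys = [spacing_y * (j + 1) for j in range(n_y)]
    return [(x, y) for y in ys for x in xs]

# 3 vertical x 3 horizontal beams give a 9-point array inside a 400 x 400 frame.
points = calibration_grid(frame_w=400.0, frame_h=400.0, n_x=3, n_y=3)
```

With three emitters per side this reproduces the familiar 9-point layout used by the 9-point calibration algorithm mentioned later.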
In step S103, the camera of the robot may be used to shoot a calibration image of the front end of the mechanical arm touching the calibration point. The shooting may be triggered manually: for example, when an operator observes that the front end of the mechanical arm touches a calibration point, the camera of the robot is triggered to shoot the calibration image.
In addition, an embodiment of the present application further provides an implementation that automatically detects the touch and automatically triggers shooting of the calibration image. On the basis of any of the foregoing implementations, in some modified implementations, as shown in fig. 2, the laser calibration frame further includes laser receivers arranged in one-to-one correspondence with the laser emitters, each laser receiver receiving the laser beam emitted by its corresponding laser emitter;
after controlling the front end of the mechanical arm of the robot to sequentially touch the plurality of calibration points in the laser calibration point set, the method further includes:
detecting whether the front end of the mechanical arm has touched a calibration point according to change information of the laser beam received by each laser receiver.
It is easy to understand that each laser receiver should be disposed on the side opposite to its corresponding laser emitter, so that under normal conditions the emitted laser beam falls on the receiver's window. When a beam is blocked, the laser signal received by the corresponding receiver changes (for example, the light intensity weakens, the luminous flux decreases, or the light energy drops), indicating that the front end of the mechanical arm has touched the calibration point. Therefore, by detecting change information such as light intensity, luminous flux, and light energy, whether the front end of the mechanical arm touches a calibration point can be detected accurately in real time, which in turn triggers the shooting of the calibration image, realizing automatic detection of the touch action and automatic shooting of the calibration image.
It should be noted that, because each calibration point is formed by the intersection of at least two laser beams, a touch should be confirmed only after the laser beams received by at least two laser receivers are detected to weaken or disappear, so as to ensure the accuracy of detection.
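The detection rule above (confirm a touch only when at least two received beams weaken) can be sketched as follows; the intensity threshold and the sample readings are illustrative assumptions, not values from the application:

```python
def weakened_beams(baseline, current, drop_ratio=0.5):
    """Indices of receivers whose received intensity has fallen below
    drop_ratio of the unobstructed baseline intensity."""
    return [i for i, (b, c) in enumerate(zip(baseline, current)) if c < b * drop_ratio]

def touch_detected(baseline, current, drop_ratio=0.5):
    """Each calibration point is formed by at least two intersecting beams,
    so a touch is confirmed only when at least two beams are weakened."""
    return len(weakened_beams(baseline, current, drop_ratio)) >= 2

baseline = [1.0, 1.0, 1.0, 1.0]                                # unobstructed intensities
one_blocked = touch_detected(baseline, [1.0, 0.2, 1.0, 1.0])   # a single weakened beam: no touch
two_blocked = touch_detected(baseline, [1.0, 0.2, 0.1, 1.0])   # two weakened beams: touch
```

A positive detection would then trigger the shooting of the calibration image.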
Hand-eye calibration requires the physical coordinates of each calibration point in the calibration frame coordinate system, so these coordinates must be determined. The physical coordinates in step S104 may be measured manually by a worker and then provided to a computer for processing.
In addition, the present application provides an implementation of automatically measuring the physical coordinates, and on the basis of any of the foregoing implementations, in some modified implementations, the method may further include:
determining two laser receivers with weakened received laser beams according to the change information of the laser beams received by each laser receiver when the front end of the mechanical arm touches the calibration point each time;
and determining the physical coordinates of the calibration point touched by the front end of the mechanical arm in a calibration frame coordinate system according to the intersection position of the laser beams received by the two laser receivers.
Because the position and posture of the laser calibration frame can be kept fixed during the calibration process, the physical coordinates of each calibration point in the calibration frame coordinate system can be determined easily and accurately in this way. This embodiment realizes automatic detection of the physical coordinates of the calibration points, improves the efficiency of detection and of the whole calibration process, and reduces the errors to which manual measurement of physical coordinates is prone.
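Once the two weakened receivers identify the two blocking beams, the touched point's physical coordinates follow from the beam intersection; a numpy sketch, where the emitter positions and directions are assumed illustrative values:

```python
import numpy as np

def beam_intersection(p1, d1, p2, d2):
    """Intersection of two laser beams in the calibration-frame plane.
    Each beam is an emitter position p plus a direction d; solves
    p1 + t1*d1 == p2 + t2*d2 for the crossing point."""
    A = np.column_stack([np.asarray(d1, float), -np.asarray(d2, float)])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * np.asarray(d1, float)

# A vertical beam entering at x = 120 and a horizontal beam entering at y = 80
# cross at the calibration point (120, 80) in the calibration frame coordinate system.
point = beam_intersection(p1=(120.0, 0.0), d1=(0.0, 1.0),
                          p2=(0.0, 80.0), d2=(1.0, 0.0))
```

For the perpendicular-beam frame of fig. 2 the result is simply (x of the vertical beam, y of the horizontal beam), but the same solve also covers non-perpendicular beam layouts.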
In the embodiments of the present application, any calibration-plate-based calibration algorithm provided in the prior art, for example the 9-point calibration algorithm, may be used; the laser calibration frame provided in the present application simply replaces the conventional calibration plate, thereby realizing hand-eye calibration of the robot.
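As a sketch of how such a calibration-point-based algorithm works, the 9-point case reduces to fitting, by least squares, an affine map from pixel coordinates to physical coordinates in the calibration frame coordinate system. The correspondences below are synthetic illustrative data, not measurements from the application; in practice a library routine such as OpenCV's estimateAffine2D could be used instead:

```python
import numpy as np

def fit_affine(pixel_pts, physical_pts):
    """Least-squares 2x3 affine transform A with physical ~= A @ [u, v, 1]."""
    pixel_pts = np.asarray(pixel_pts, float)
    physical_pts = np.asarray(physical_pts, float)
    ones = np.ones((len(pixel_pts), 1))
    X = np.hstack([pixel_pts, ones])                       # N x 3 design matrix
    B, *_ = np.linalg.lstsq(X, physical_pts, rcond=None)   # 3 x 2 coefficients
    return B.T                                             # 2 x 3 affine matrix

# Synthetic 9-point grid: physical coordinates are the pixels scaled by
# 0.5 mm/px and shifted by (10, 20) mm, so the fit should recover that map.
pix = [(u, v) for v in (0, 100, 200) for u in (0, 100, 200)]
phys = [(0.5 * u + 10.0, 0.5 * v + 20.0) for u, v in pix]
A = fit_affine(pix, phys)
```

The recovered matrix plays the role of the first conversion matrix restricted to the calibration plane.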
For example, refer to fig. 3, which illustrates the hand-eye calibration principle provided by some embodiments of the present application for an Eye-in-Hand robot, i.e., a configuration in which the camera is fixed on the mechanical arm; steps S104 and S105 may be implemented with reference to it. For an Eye-in-Hand robot, the purpose of hand-eye calibration is to obtain the spatial position relationship between the camera and the front end of the mechanical arm. Because the camera is bound to the front end of the mechanical arm, the pose of the camera relative to the front end is constant; that is, the transformation matrix between the camera coordinate system and the front-end coordinate system is constant. Denote this matrix, a 4 × 4 homogeneous transformation matrix, by $^{end}T_{cam}$.

Once the pose of the laser calibration frame is determined at the start of calibration, it is fixed and then kept stationary relative to the robot base; that is, the pose of the calibration frame coordinate system relative to the base coordinate system is fixed, with transformation matrix $^{base}T_{cal}$.

The transformation between the front-end coordinate system of the mechanical arm and the base coordinate system is also known at each pose, generally given by the robot manufacturer; denote it $^{base}T_{end}$.

Denote the transformation matrix between the camera coordinate system and the calibration frame coordinate system, i.e., the first conversion matrix, by $^{cal}T_{cam}$.

Closing the kinematic chain then gives the formula:

$^{base}T_{end} \cdot {}^{end}T_{cam} = {}^{base}T_{cal} \cdot {}^{cal}T_{cam}$

By moving the front end of the mechanical arm, and with it the camera, several times, calibration images can be shot from different positions and angles, and a first conversion matrix $^{cal}T_{cam}$ can be obtained for each pose from the pixel coordinates of the calibration points in the calibration image and their physical coordinates in the calibration frame coordinate system.

In some embodiments, let $P_{cal}$ denote the physical coordinates of a calibration point in the calibration frame coordinate system and $P_{cam}$ the pixel coordinates of that calibration point obtained from the calibration image captured by the camera; then:

$P_{cal} = {}^{cal}T_{cam} \cdot P_{cam}$

By shooting multiple groups of calibration images, a system of matrix equations is obtained and solved. Because $^{base}T_{cal}$ and $^{end}T_{cam}$ are fixed throughout, the transformation matrix $^{end}T_{cam}$ between the camera coordinate system and the front-end coordinate system of the mechanical arm can be conveniently obtained, so the calibration relation of the robot is determined and hand-eye calibration is realized.
Through this embodiment, the calibration relation of the robot can be determined accurately and hand-eye calibration realized, with calculation that is simple, efficient, and easy to implement.
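The transformation chain of this derivation can be checked numerically. The sketch below builds synthetic poses (all numeric values are illustrative assumptions, not data from the application) and recovers the constant camera-to-front-end transform by inverting the chain:

```python
import numpy as np

def hom(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation by angle a (radians) about the z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Synthetic ground truth: the constant (unknown) camera-to-front-end transform
# and the fixed calibration-frame-to-base transform.
T_end_cam = hom(rot_z(0.3), [0.02, 0.0, 0.10])
T_base_cal = hom(rot_z(-0.1), [0.5, 0.1, 0.0])

# One robot pose: front-end-to-base transform, as reported by the controller.
T_base_end = hom(rot_z(1.2), [0.3, -0.2, 0.4])

# Camera observation implied by the closed chain
# T_base_end @ T_end_cam == T_base_cal @ T_cal_cam:
T_cal_cam = np.linalg.inv(T_base_cal) @ T_base_end @ T_end_cam

# Recover the camera-to-front-end transform from the known quantities only.
T_end_cam_est = np.linalg.inv(T_base_end) @ T_base_cal @ T_cal_cam
```

With several such poses, the same relation yields an over-determined system whose least-squares solution averages out measurement noise.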
In the foregoing embodiment, a robot hand-eye calibration method is provided, and correspondingly, the present application also provides a robot hand-eye calibration device. The robot hand-eye calibration device provided by the embodiment of the application can implement the robot hand-eye calibration method, and the robot hand-eye calibration device can be implemented through software, hardware or a software and hardware combined mode. For example, the robot hand-eye calibration device may comprise integrated or separate functional modules or units to perform the corresponding steps of the above-described methods. Please refer to fig. 4, which illustrates a schematic diagram of a robot hand-eye calibration apparatus according to some embodiments of the present application. Since the apparatus embodiments are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for relevant points. The device embodiments described below are merely illustrative.
As shown in fig. 4, the robot hand-eye calibration apparatus 10 may include:
the laser calibration point set generating module 101 is configured to control a laser calibration frame to generate a laser calibration point set used for calibrating the robot, where the laser calibration point set includes a plurality of calibration points formed by intersecting laser beams;
a calibration point touch module 102, configured to control a front end of a manipulator arm of the robot to sequentially touch a plurality of calibration points in the laser calibration point set;
a calibration image shooting module 103, configured to shoot a calibration image of the front end of the mechanical arm touching the calibration point each time the front end of the mechanical arm touches the calibration point;
a transformation matrix determining module 104, configured to determine a first transformation matrix of pixel coordinates of the calibration point in the calibration image and physical coordinates in a calibration frame coordinate system;
and the hand-eye calibration module 105 is used for performing hand-eye calibration on the robot according to the first conversion matrix.
In some variations of embodiments of the present application, the laser calibration frame includes a frame body and a plurality of laser emitters disposed along the frame body, and laser beams emitted by the plurality of laser emitters intersect to form a plurality of calibration points in the frame body.
In some modified embodiments of the present application, the laser calibration frame further includes laser receivers arranged in one-to-one correspondence with the laser emitters, each laser receiver receiving the laser beam emitted by its corresponding laser emitter;
the apparatus 10 further comprises:
and the touch detection module is used for detecting whether the front end of the mechanical arm touches the calibration point or not according to the change information of the laser beam received by each laser receiver.
In some variations of embodiments of the present application, the apparatus further comprises:
the change receiver determining module is used for determining two laser receivers which are weakened in received laser beams according to the change information of the laser beams received by each laser receiver when the front end of the mechanical arm touches the calibration point each time;
and the physical coordinate determination module is used for determining the physical coordinates of the calibration point touched by the front end of the mechanical arm in a calibration frame coordinate system according to the intersection position of the laser beams received by the two laser receivers.
The robot hand-eye calibration device 10 provided in the embodiments of the present application and the robot hand-eye calibration method provided in the foregoing embodiments of the present application arise from the same inventive concept and have the same beneficial effects.
The embodiment of the present application further provides a robot corresponding to the robot hand-eye calibration method provided by the foregoing embodiment, and the robot may be used to execute the robot hand-eye calibration method provided by any of the foregoing embodiments of the present application to achieve hand-eye calibration.
Please refer to fig. 5, which illustrates a schematic diagram of a robot provided in some embodiments of the present application. As shown in fig. 5, the robot 2 includes: the system comprises a processor 200, a memory 201, a bus 202 and a communication interface 203, wherein the processor 200, the communication interface 203 and the memory 201 are connected through the bus 202; the memory 201 stores a computer program that can be executed on the processor 200, and the processor 200 executes the robot hand-eye calibration method provided by any one of the foregoing embodiments when executing the computer program.
The memory 201 may include a high-speed Random Access Memory (RAM) and may further include a non-volatile memory, such as at least one disk memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 203 (which may be wired or wireless), which may use the Internet, a wide area network, a local area network, a metropolitan area network, and the like.
The bus 202 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. The memory 201 is used for storing a program, and the processor 200 executes the program after receiving an execution instruction; the robot hand-eye calibration method disclosed in any of the foregoing embodiments of the present application may be applied to, or implemented by, the processor 200.
The processor 200 may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 200. The processor 200 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be directly embodied as being performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM, an EPROM, or a register. The storage medium is located in the memory 201, and the processor 200 reads the information in the memory 201 and completes the steps of the above method in combination with its hardware.
The robot provided by the embodiment of the present application shares the same inventive concept and the same beneficial effects as the robot hand-eye calibration method provided by the foregoing embodiments of the present application.
The embodiment of the present application further provides a robot hand-eye calibration system corresponding to the robot hand-eye calibration method provided by the foregoing embodiments, where the robot hand-eye calibration system may include: a robot and a laser calibration frame;
the robot is connected with the laser calibration frame;
the laser calibration frame is used for generating, under the control of the robot, a laser calibration point set for calibrating the robot;
and the robot is used for achieving hand-eye calibration by executing, based on the laser calibration frame, the robot hand-eye calibration method provided by any of the foregoing embodiments of the present application.
For the related contents, reference may be made to the foregoing description of the method embodiments, and details are not repeated here.
The robot hand-eye calibration system provided by the embodiment of the present application shares the same inventive concept and the same beneficial effects as the robot hand-eye calibration method provided by the foregoing embodiments of the present application.
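As a worked illustration of the first conversion matrix between pixel coordinates of the calibration points and their physical coordinates in the calibration frame coordinate system: for coplanar calibration points this matrix can be modeled as a 3×3 planar homography estimated by the standard direct linear transform (DLT) from four or more touched points. This is a hedged sketch, not the patent's prescribed formulation; the point correspondences below are synthetic examples.

```python
import numpy as np

def estimate_first_conversion_matrix(pixels, physical):
    """DLT estimate of a homography H mapping pixel (u, v) to
    calibration-frame (x, y), from at least 4 point correspondences."""
    rows = []
    for (u, v), (x, y) in zip(pixels, physical):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # The homography is the right null vector of the stacked system.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def pixel_to_physical(H, uv):
    """Apply the homography to one pixel coordinate."""
    p = H @ np.array([uv[0], uv[1], 1.0])
    return p[:2] / p[2]

# Synthetic correspondences generated by x = 0.5u + 10, y = 0.5v + 20.
pixels = [(0, 0), (100, 0), (0, 100), (100, 100), (50, 50)]
physical = [(10, 20), (60, 20), (10, 70), (60, 70), (35, 45)]
H = estimate_first_conversion_matrix(pixels, physical)
print(np.round(pixel_to_physical(H, (100, 100)), 3))  # ≈ [60. 70.]
```

In practice a robust estimator over many touched points (for example an OpenCV-style `findHomography` with RANSAC) would replace this minimal DLT; for non-coplanar calibration points a full camera projection model would be needed instead.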
It should be noted that the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present application and shall be covered by the claims and the specification.

Claims (10)

1. A robot hand-eye calibration method is characterized by comprising the following steps:
controlling a laser calibration frame to generate a laser calibration point set for calibrating the robot, wherein the laser calibration point set comprises a plurality of calibration points formed by intersecting laser beams;
controlling the front end of a mechanical arm of the robot to sequentially touch a plurality of the calibration points in the laser calibration point set;
shooting a calibration image of the front end of the mechanical arm touching the calibration point each time the front end of the mechanical arm touches the calibration point;
determining a first conversion matrix of pixel coordinates of the calibration point in the calibration image and physical coordinates in a calibration frame coordinate system;
and calibrating the hand and the eye of the robot according to the first conversion matrix.
2. The method of claim 1, wherein the laser calibration frame comprises a frame body and a plurality of laser emitters disposed along the frame body, and laser beams emitted by the plurality of laser emitters intersect to form the plurality of calibration points within the frame body.
3. The method according to claim 2, wherein the laser calibration frame further comprises laser receivers arranged in one-to-one correspondence with the laser emitters, and each laser receiver receives the laser beam emitted by its corresponding laser emitter;
after the controlling of the front end of the mechanical arm of the robot to sequentially touch the plurality of calibration points in the laser calibration point set, the method further comprises:
detecting, according to the change information of the laser beam received by each laser receiver, whether the front end of the mechanical arm touches the calibration point.
4. The method of claim 3, further comprising:
determining, according to the change information of the laser beam received by each laser receiver, the two laser receivers whose received laser beams are weakened each time the front end of the mechanical arm touches the calibration point;
and determining the physical coordinates of the calibration point touched by the front end of the mechanical arm in the calibration frame coordinate system according to the intersection position of the laser beams received by the two laser receivers.
5. A robot hand-eye calibration device, characterized in that the device comprises:
the laser calibration point set generating module is used for controlling a laser calibration frame to generate a laser calibration point set used for calibrating the robot, wherein the laser calibration point set comprises a plurality of calibration points formed by intersecting laser beams;
the calibration point touch module is used for controlling the front end of a mechanical arm of the robot to sequentially touch a plurality of the calibration points in the laser calibration point set;
the calibration image shooting module is used for shooting a calibration image of the front end of the mechanical arm touching the calibration point each time the front end of the mechanical arm touches the calibration point;
the conversion matrix determining module is used for determining a first conversion matrix of the pixel coordinates of the calibration point in the calibration image and the physical coordinates in the calibration frame coordinate system;
and the hand-eye calibration module is used for performing hand-eye calibration on the robot according to the first conversion matrix.
6. The apparatus of claim 5, wherein the laser calibration frame comprises a frame body and a plurality of laser emitters disposed along the frame body, and laser beams emitted by the plurality of laser emitters intersect to form the plurality of calibration points within the frame body.
7. The device of claim 6, wherein the laser calibration frame further comprises laser receivers arranged in one-to-one correspondence with the laser emitters, and each laser receiver receives the laser beam emitted by its corresponding laser emitter;
the device further comprises:
and the touch detection module is used for detecting whether the front end of the mechanical arm touches the calibration point or not according to the change information of the laser beam received by each laser receiver.
8. The apparatus of claim 7, further comprising:
the change receiver determining module is used for determining, according to the change information of the laser beam received by each laser receiver, the two laser receivers whose received laser beams are weakened each time the front end of the mechanical arm touches the calibration point;
and the physical coordinate determination module is used for determining the physical coordinates of the calibration point touched by the front end of the mechanical arm in the calibration frame coordinate system according to the intersection position of the laser beams received by the two laser receivers.
9. A robot, characterized in that the robot is adapted to perform the method of any of claims 1 to 4 for hand-eye calibration.
10. A robot hand-eye calibration system, comprising: a robot and a laser calibration frame;
the robot is connected with the laser calibration frame;
the laser calibration frame is used for generating, under the control of the robot, a laser calibration point set for calibrating the robot;
the robot is used for realizing hand-eye calibration by executing the method of any one of claims 1 to 4 based on the laser calibration frame.
CN202010603344.7A 2020-06-29 2020-06-29 Robot hand-eye calibration method, device and system Active CN111890354B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010603344.7A CN111890354B (en) 2020-06-29 2020-06-29 Robot hand-eye calibration method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010603344.7A CN111890354B (en) 2020-06-29 2020-06-29 Robot hand-eye calibration method, device and system

Publications (2)

Publication Number Publication Date
CN111890354A true CN111890354A (en) 2020-11-06
CN111890354B CN111890354B (en) 2022-01-11

Family

ID=73206500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010603344.7A Active CN111890354B (en) 2020-06-29 2020-06-29 Robot hand-eye calibration method, device and system

Country Status (1)

Country Link
CN (1) CN111890354B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116852382A (en) * 2023-09-04 2023-10-10 青岛理工大学 System and method for quickly adjusting tail end gesture of shaft hole assembly robot

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5148591A (en) * 1981-05-11 1992-09-22 Sensor Adaptive Machines, Inc. Vision target based assembly
CN102034238A (en) * 2010-12-13 2011-04-27 西安交通大学 Multi-camera system calibrating method based on optical imaging test head and visual graph structure
CN203274682U (en) * 2013-05-04 2013-11-06 长春工业大学 Special-purpose calibration board for light surface of line-structured light
CN105014678A (en) * 2015-07-16 2015-11-04 深圳市得意自动化科技有限公司 Robot hand-eye calibration method based on laser range finding
CN105773609A (en) * 2016-03-16 2016-07-20 南京工业大学 Robot kinematics calibration method based on vision measurement and distance error model
CN106197321A (en) * 2016-07-06 2016-12-07 太原科技大学 Projector calibrating method based on red blue gridiron pattern scaling board
CN106839979A (en) * 2016-12-30 2017-06-13 上海交通大学 The hand and eye calibrating method of line structured laser sensor
CN107255443A (en) * 2017-07-14 2017-10-17 北京航空航天大学 Binocular vision sensor field calibration method and device under a kind of complex environment
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device
CN107816942A (en) * 2017-09-28 2018-03-20 中国东方电气集团有限公司 A kind of planar dimension measurement method based on cross structure light vision system
CN108269286A (en) * 2016-12-30 2018-07-10 中国空气动力研究与发展中心超高速空气动力研究所 Polyphaser pose correlating method based on combination dimensional mark
CN108717715A (en) * 2018-06-11 2018-10-30 华南理工大学 A kind of line-structured light vision system automatic calibration method for arc welding robot
CN108972544A (en) * 2018-06-21 2018-12-11 华南理工大学 A kind of vision laser sensor is fixed on the hand and eye calibrating method of robot
CN109443214A (en) * 2018-12-19 2019-03-08 广东工业大学 A kind of scaling method of structured light three-dimensional vision, device and measurement method, device
CN109623822A (en) * 2018-12-28 2019-04-16 芜湖哈特机器人产业技术研究院有限公司 Robotic Hand-Eye Calibration method
CN110370272A (en) * 2019-06-20 2019-10-25 重庆大学 It is a kind of based on the robot TCP calibration system vertically reflected
CN110757462A (en) * 2019-11-15 2020-02-07 上海威士顿信息技术股份有限公司 Robot hand-eye calibration method, system and storage medium
CN110815201A (en) * 2018-08-07 2020-02-21 广明光电股份有限公司 Method for correcting coordinates of robot arm
CN111199542A (en) * 2019-12-30 2020-05-26 季华实验室 Accurate positioning method for tooling plate
CN111307033A (en) * 2018-12-12 2020-06-19 成都蒸汽巨人机器人科技有限公司 Industrial robot depth vision sensor calibration board and calibration method

Also Published As

Publication number Publication date
CN111890354B (en) 2022-01-11

Similar Documents

Publication Publication Date Title
CN109807885B (en) Visual calibration method and device for manipulator and intelligent terminal
CN107687855B (en) Robot positioning method and device and robot
CN110599541A (en) Method and device for calibrating multiple sensors and storage medium
CN109910000B (en) Calibration and operation of vision-based steering systems
JP5366789B2 (en) Input indication tool, control method therefor, and coordinate input device
JP2005056140A (en) Coordinate input device, and its control method and program
US5340060A (en) Rendezvous docking optical sensor system
JPWO2007037227A1 (en) POSITION INFORMATION DETECTING DEVICE, POSITION INFORMATION DETECTING METHOD, AND POSITION INFORMATION DETECTING PROGRAM
US20180272539A1 (en) Information processing apparatus, system, information processing method, and manufacturing method
CN111890354B (en) Robot hand-eye calibration method, device and system
US8355012B2 (en) Scanning method for determining a touch position of a touch input apparatus
CN111624612A (en) Verification method and verification system of time-of-flight camera module
JPH06137840A (en) Automatic calibration device for visual sensor
JP2020075327A (en) Control system
CN109470201B (en) Method for operating a hand-held laser distance measuring device and hand-held laser distance measuring device
CN111890355B (en) Robot calibration method, device and system
US20220254044A1 (en) Ranging device and ranging method
US11587260B2 (en) Method and apparatus for in-field stereo calibration
US9377897B2 (en) Control of coordinate input apparatus based on light distribution and moving amounts of sensor units
CN113804222A (en) Positioning accuracy testing method, device, equipment and storage medium
CN112809668A (en) Method, system and terminal for automatic hand-eye calibration of mechanical arm
CN117274399A (en) Camera calibration method, sewing equipment and computer readable storage medium
JP5445064B2 (en) Image processing apparatus and image processing program
JPH0545117A (en) Optical method for measuring three-dimensional position
JP2005173684A (en) Optical coordinate input device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant