CN116091619A - Calibration method, device, equipment and medium


Info

Publication number: CN116091619A
Application number: CN202211688574.3A
Authority: CN (China)
Legal status: Pending (the legal status is an assumption, not a legal conclusion)
Prior art keywords: transformation matrix, space transformation, target, dimensional coordinate, calibration
Other languages: Chinese (zh)
Inventors: 王永昊, 吴斌, 安吉文, 胡尊亭, 张大伟
Current assignee: Beijing Natong Medical Robot Technology Co., Ltd.
Original assignee: Beijing Natong Medical Robot Technology Co., Ltd.

Classifications

    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • A61B34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/30 — Surgical robots
    • A61B34/70 — Manipulators specially adapted for use in surgery
    • A61B2034/2057 — Details of tracking cameras
    • A61B2034/2059 — Mechanical position encoders
    • A61B2034/2065 — Tracking using image or pattern recognition
    • G06T2207/10004 — Still image; photographic image

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Robotics (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiments of the disclosure relate to a calibration method, device, equipment and medium, wherein the method comprises the following steps: acquiring a first three-dimensional coordinate point set corresponding to all marking points in the calibration plate coordinate system; controlling a camera to shoot the calibration plate in an arbitrary pose to obtain multiple frames of images, and processing the frames to obtain a second three-dimensional coordinate point set corresponding to each frame; calculating, based on the first and second three-dimensional coordinate point sets, a first space transformation matrix between the camera coordinate system corresponding to each frame and the calibration plate coordinate system; processing the plurality of first space transformation matrices to obtain a first target space transformation matrix; acquiring a second space transformation matrix between the end of the mechanical arm and the mechanical arm base; and calculating, based on the first target space transformation matrix and the second space transformation matrix, a third space transformation matrix between the end of the mechanical arm and the camera. By adopting this technical scheme, the loss of coordinate accuracy caused by image noise is alleviated to a certain extent, and the hand-eye calibration accuracy is improved.

Description

Calibration method, device, equipment and medium
Technical Field
The disclosure relates to the technical field of optical navigation and robotics, and in particular to a calibration method, device, equipment and medium.
Background
As intelligent medical equipment, surgical robots can complete fine surgical operations in body cavities, blood vessels and nerve-dense areas, and are characterized by good stability, flexible operation, accurate movement, and hand-eye coordination. From the viewpoint of clinical application, surgical robots can be classified into neurosurgical robots, orthopedic surgical robots, laparoscopic surgical robots, and the like. A surgical robot for intramuscular injection (hereinafter referred to as an injection robot) can accurately reach a designated injection position, slowly advance the needle to a suitable injection depth, and deliver the gas or liquid in the syringe into the human body to achieve a therapeutic effect. The injection robot can make the needle pierce the skin precisely at a specified entry angle and smoothly deliver a precise dose of medicine to the designated subcutaneous depth, thereby improving the injection success rate.
However, an important prerequisite for an autonomous injection function is equipping the system with an accurate and stable surgical optical navigation system: one that achieves millimeter-level identification and navigation of the injection marking points, accurately acquires the skin-plane information at the injection point, and computes the optimal injection angle in three-dimensional space. Accurately calibrating the optical navigation equipment and the mechanical arm, and solving the spatial conversion relations between the different coordinate systems, is therefore an important premise and foundation for ensuring system accuracy.
Disclosure of Invention
In order to solve the above technical problems or at least partially solve the above technical problems, the present disclosure provides a calibration method, device, apparatus and medium.
The embodiment of the disclosure provides a calibration method, wherein a camera is arranged at the end of a mechanical arm of a robot, a calibration plate is fixed relative to the mechanical arm base of the robot, and a plurality of marking points are arranged on the calibration plate; the method comprises the following steps:
acquiring a first three-dimensional coordinate point set corresponding to all marking points in a coordinate system of a calibration plate;
controlling the camera to shoot the calibration plate under any pose to obtain multi-frame images, and processing each frame of images to obtain a second three-dimensional coordinate point set corresponding to each frame of images;
calculating the first three-dimensional coordinate point set and the second three-dimensional coordinate point set based on a preset calculation algorithm to obtain a first space transformation matrix between a camera coordinate system corresponding to each frame of the image and the calibration plate coordinate system;
processing based on a plurality of first space transformation matrixes to obtain a first target space transformation matrix;
acquiring a second space transformation matrix between the end of the mechanical arm and the mechanical arm base, and calculating based on the first target space transformation matrix and the second space transformation matrix to obtain a third space transformation matrix between the end of the mechanical arm and the camera.
The embodiment of the disclosure also provides a calibration device, wherein the camera is mounted at the end of the mechanical arm of the robot, the calibration plate is fixed relative to the mechanical arm base of the robot, and a plurality of marking points are arranged on the calibration plate. The device comprises:
the first acquisition module is used for acquiring a first three-dimensional coordinate point set corresponding to all the marking points in the coordinate system of the calibration plate;
the shooting module is used for controlling the camera to shoot the calibration plate under any pose to obtain multi-frame images;
the first processing module is used for processing the images of each frame to obtain a second three-dimensional coordinate point set corresponding to the images of each frame;
the first calculation module is used for calculating the first three-dimensional coordinate point set and the second three-dimensional coordinate point set based on a preset calculation algorithm to obtain a first space transformation matrix between a camera coordinate system corresponding to each frame of the image and the calibration plate coordinate system;
the second processing module is used for processing based on the plurality of first space transformation matrixes to obtain a first target space transformation matrix;
the second acquisition module is used for acquiring a second space transformation matrix between the end of the mechanical arm and the mechanical arm base;
the second calculation module is used for calculating based on the first target space transformation matrix and the second space transformation matrix to obtain a third space transformation matrix between the end of the mechanical arm and the camera.
The embodiment of the disclosure also provides an electronic device, which comprises: a processor; a memory for storing the processor-executable instructions; the processor is configured to read the executable instructions from the memory and execute the instructions to implement the calibration method as provided in the embodiments of the present disclosure.
The embodiments of the present disclosure also provide a computer-readable storage medium storing a computer program for executing the calibration method as provided by the embodiments of the present disclosure.
Compared with the prior art, the technical scheme provided by the embodiments of the disclosure has the following advantages. According to the calibration scheme of the embodiments, the camera is mounted at the end of the mechanical arm of the robot, the calibration plate is fixed relative to the mechanical arm base of the robot, and a plurality of marking points are arranged on the calibration plate. A first three-dimensional coordinate point set corresponding to all marking points in the calibration plate coordinate system is acquired; the camera is controlled to shoot the calibration plate in an arbitrary pose to obtain multiple frames of images, and each frame is processed to obtain its corresponding second three-dimensional coordinate point set; the first and second three-dimensional coordinate point sets are calculated with a preset calculation algorithm to obtain a first space transformation matrix between the camera coordinate system corresponding to each frame and the calibration plate coordinate system; the plurality of first space transformation matrices are processed to obtain a first target space transformation matrix; a second space transformation matrix between the end of the mechanical arm and the mechanical arm base is acquired; and a third space transformation matrix between the end of the mechanical arm and the camera is calculated based on the first target space transformation matrix and the second space transformation matrix. Therefore, by collecting multiple frames of images and effectively utilizing the information in them, a more accurate and stable space transformation matrix can be obtained, the calculation errors caused by image noise are overcome to a great extent, and the hand-eye calibration precision is improved.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a schematic flow chart of a calibration method according to an embodiment of the disclosure;
FIG. 2 is a flow chart of another calibration method according to an embodiment of the disclosure;
FIG. 3 is a schematic diagram of a hand-eye calibration provided in an embodiment of the present disclosure;
FIG. 4 is a schematic view of a calibration plate according to an embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of a calibration device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration only and are not intended to limit the scope of protection of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Definitions of other relevant terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that such references should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Generally, to complete the hand-eye calibration process, two types of data need to be acquired for calculating the hand-eye calibration matrix: the space transformation matrix from the calibration plate to the camera, and the space transformation matrix from the end of the mechanical arm to the mechanical arm base, both under different poses. The calibration-plate-to-camera space transformation matrix must be determined from calibration plate images shot by the camera, and if only a single image or a single frame is used for the calculation, it is difficult to avoid calculation errors caused by temporally random image noise. In the embodiments of the disclosure, by collecting multiple frames of images and effectively utilizing the information in them, a more accurate and stable space transformation matrix can be obtained, the calculation errors caused by image noise are overcome to a great extent, and the accuracy of the hand-eye calibration result is improved.
Take as an example an RGBD camera (Red, Green, Blue plus Depth, i.e., a color camera with depth sensing) and a two-dimensional-code calibration plate. The RGBD camera can directly obtain, for a pixel [p, q] in the acquired two-dimensional RGB image, its coordinates [x, y, z] in the camera's three-dimensional coordinate system. Suppose the calibration plate carries n rows and m columns of two-dimensional codes; 4 corner coordinates can be identified for each square code, so 4×m×n marking points can be identified in total, and the calibration plate is regarded as an ideal plane. The X and Y axes are therefore constructed in this plane, and a right-handed three-dimensional coordinate system based on the calibration plate is constructed with the direction perpendicular to the plane as the Z axis. From the design dimensions of the calibration plate, the first three-dimensional coordinate points of all marking points in the calibration plate coordinate system can be obtained; the second three-dimensional coordinate points of all marking points seen in the RGB image can likewise be obtained in the camera coordinate system. The transformation matrix M between the two sets of marking-point coordinates can then be solved; M is the space transformation matrix from the calibration plate to the camera.
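The pixel-to-camera mapping described above follows the pinhole back-projection model. A minimal sketch in Python (the intrinsics fx, fy, cx, cy and the sample values are illustrative assumptions, not taken from the patent; in practice the RGBD camera's SDK supplies the intrinsics or the registered point cloud directly):

```python
import numpy as np

def pixel_to_camera(p, q, depth, fx, fy, cx, cy):
    """Back-project pixel (p, q) with depth z into the camera frame.

    fx, fy, cx, cy are the camera intrinsics (focal lengths and
    principal point); RGBD SDKs typically expose these directly.
    """
    z = depth
    x = (p - cx) * z / fx
    y = (q - cy) * z / fy
    return np.array([x, y, z])

# With illustrative intrinsics, a pixel at the principal point
# maps onto the optical axis (x = y = 0).
pt = pixel_to_camera(320.0, 240.0, 0.5, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```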
Specifically, whenever a frame of RGBD image is acquired, its D (Depth) channel is susceptible to noise, so the three-dimensional coordinates of the marking points in the camera coordinate system, and ultimately the transformation matrix between the two coordinate sets, are also susceptible to noise. In the embodiments of the disclosure, the three-dimensional coordinates of each marking point in the camera coordinate system are computed by combining the information of multiple acquired frames; an effective mean of each marking point is computed according to a given rule and used as its multi-frame-optimized three-dimensional coordinate, and finally the resulting coordinate set is used to compute the space transformation matrix from the calibration plate to the camera. This method overcomes, to a certain extent, the loss of coordinate accuracy caused by image noise and improves the accuracy of hand-eye calibration. A detailed description follows with reference to fig. 1.
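The "effective mean according to a certain rule" is not specified further in the text; the sketch below shows one plausible rule, gating each marking point's per-frame samples by their deviation from the per-axis median before averaging (the 5 mm threshold is an assumed value):

```python
import numpy as np

def effective_mean(samples, max_dev=0.005):
    """Average one marking point's 3D coordinate over several frames,
    discarding samples that deviate from the median by more than
    max_dev (metres). The patent's exact rule is not specified;
    this median-gated mean is one plausible choice."""
    samples = np.asarray(samples)          # shape (n_frames, 3)
    med = np.median(samples, axis=0)
    keep = np.linalg.norm(samples - med, axis=1) <= max_dev
    return samples[keep].mean(axis=0)

frames = [[0.100, 0.200, 0.500],
          [0.101, 0.199, 0.501],
          [0.100, 0.200, 0.620]]          # last sample is a depth outlier
```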
Fig. 1 is a schematic flow chart of a calibration method according to an embodiment of the disclosure, where the method may be performed by a calibration device, and the device may be implemented by software and/or hardware, and may be generally integrated in an electronic device. As shown in fig. 1, the method includes:
and 101, acquiring a first three-dimensional coordinate point set corresponding to all the marking points in the coordinate system of the calibration plate.
In the embodiment of the disclosure, a camera is mounted at the end of the mechanical arm of a robot, a calibration plate is fixed relative to the mechanical arm base of the robot, and a plurality of marking points are arranged on the calibration plate. The calibration plate and the marking points can be chosen according to the application scene.
In the embodiment of the disclosure, the X and Y axes are constructed in the calibration plate plane, and the calibration plate coordinate system is constructed by taking the direction perpendicular to the plane as the Z axis.
In the embodiment of the disclosure, there are various ways of obtaining the first three-dimensional coordinate point set corresponding to all the marking points in the calibration plate coordinate system, and in a specific embodiment, the first three-dimensional coordinate point of each marking point in the calibration plate coordinate system is obtained, and the first three-dimensional coordinate point set is obtained based on all the first three-dimensional coordinate points.
Step 102: control the camera to shoot the calibration plate in an arbitrary pose to obtain multiple frames of images, and process each frame to obtain its corresponding second three-dimensional coordinate point set.
Step 103: calculate the first and second three-dimensional coordinate point sets based on a preset calculation algorithm to obtain a first space transformation matrix between the camera coordinate system corresponding to each frame and the calibration plate coordinate system.
In the embodiment of the disclosure, the camera can be controlled to shoot the calibration plate in an arbitrary pose to obtain multiple frames of images; that is, multiple frames are shot in one pose. The camera can be moved to a given pose by moving the mechanical arm; in this pose the camera must be able to capture a clear image of the calibration plate. With the mechanical arm fixed, multiple frames of images are collected, and a second three-dimensional coordinate set of the marking points in the camera coordinate system is computed from each frame.
In the embodiment of the disclosure, each frame of image is processed to obtain its corresponding second three-dimensional coordinate point set; that is, the second three-dimensional coordinate point of each marking point is extracted from each frame, and the second set is obtained from the second three-dimensional coordinate points of all marking points.
In the embodiment of the disclosure, the calculation algorithm may select settings according to application scene requirements, such as a least square method, a maximum likelihood method, and the like.
In a specific embodiment, the rotation component parameters and translation component parameters of the first space transformation matrix are taken as unknowns; each first three-dimensional coordinate point in the first set, each second three-dimensional coordinate point in the second set, and the rotation and translation component parameters are evaluated under the calculation algorithm to obtain a minimum error value; the rotation component and translation component parameters corresponding to the minimum error value are taken as the target rotation component and target translation component, and the first space transformation matrix is determined based on them.
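The error minimization described in this embodiment has a standard closed-form least-squares solution over rigid transforms (the Kabsch/SVD method). The text names only "a preset calculation algorithm" such as least squares, so the following is one common choice rather than the patent's specific procedure:

```python
import numpy as np

def rigid_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B,
    i.e. minimizing sum ||R a_i + t - b_i||^2, via the Kabsch/SVD
    closed form."""
    A, B = np.asarray(A), np.asarray(B)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)              # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```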
Step 104: process the plurality of first space transformation matrices to obtain a first target space transformation matrix.
In the embodiment of the disclosure, the plurality of first space transformation matrices may be processed in several ways to obtain the first target space transformation matrix. In one embodiment, the plurality of first space transformation matrices are averaged directly to obtain the first target space transformation matrix. In another embodiment, each first space transformation matrix is decomposed into a rotation component and a translation component; the rotation components are converted to Euler angle representations while the translation vectors are kept unchanged; the Euler angles and the translation vectors are averaged to obtain an Euler angle mean and a translation mean; the means are converted back to obtain a target rotation component and a target translation component, and the first target space transformation matrix is determined based on them. The above two approaches are merely examples.
Step 105: acquire a second space transformation matrix between the end of the mechanical arm and the mechanical arm base, and calculate based on the first target space transformation matrix and the second space transformation matrix to obtain a third space transformation matrix between the end of the mechanical arm and the camera.
In the embodiment of the disclosure, the second space transformation matrix is usually already well calibrated for a given robot and can be queried directly, for example based on set parameters such as the robot model.
In the embodiment of the disclosure, there are multiple ways to calculate the third space transformation matrix between the end of the mechanical arm and the camera from the first target space transformation matrix and the second space transformation matrix. For example, multiple groups of first target and second space transformation matrices are collected, solving relation equations linking the first target, second and third space transformation matrices are constructed, and the matrix relation equations built from the multiple groups are solved to obtain the third space transformation matrix. Alternatively, multiple groups of first target and second space transformation matrices are input directly into a preset calculation formula or a pre-trained model to solve for the third space transformation matrix.
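The solving relation equations mentioned above rest on an eye-in-hand consistency constraint: with the calibration plate fixed relative to the base, T_board→base = T_end→base · T_cam→end · T_board→cam must be identical for every pose, and the unknown third space transformation matrix X = T_cam→end is precisely what makes it constant. The sketch below numerically checks this constraint on synthetic poses; a full solver (e.g., Tsai–Lenz, as implemented in libraries such as OpenCV's cv2.calibrateHandEye) is not reproduced here:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def handeye_residual(T_end2base_list, T_board2cam_list, X):
    """Spread of T_board2base = T_end2base @ X @ T_board2cam across poses.
    Zero (up to noise) iff X is the true end-to-camera hand-eye matrix."""
    Ts = [A @ X @ B for A, B in zip(T_end2base_list, T_board2cam_list)]
    return max(np.abs(T - Ts[0]).max() for T in Ts[1:])
```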
According to the calibration scheme provided by the embodiment of the disclosure, the camera is mounted at the end of the mechanical arm of the robot, the calibration plate is fixed relative to the mechanical arm base of the robot, and a plurality of marking points are arranged on the calibration plate. A first three-dimensional coordinate point set corresponding to all marking points in the calibration plate coordinate system is acquired; the camera is controlled to shoot the calibration plate in an arbitrary pose to obtain multiple frames of images, and each frame is processed to obtain its corresponding second three-dimensional coordinate point set; the first and second three-dimensional coordinate point sets are calculated with a preset calculation algorithm to obtain a first space transformation matrix between the camera coordinate system corresponding to each frame and the calibration plate coordinate system; the plurality of first space transformation matrices are processed to obtain a first target space transformation matrix; a second space transformation matrix between the end of the mechanical arm and the mechanical arm base is acquired; and a third space transformation matrix between the end of the mechanical arm and the camera is calculated based on the first target space transformation matrix and the second space transformation matrix. Therefore, by collecting multiple frames of images and effectively utilizing the information in them, a more accurate and stable space transformation matrix can be obtained, the calculation errors caused by image noise are overcome to a great extent, and the hand-eye calibration precision is improved.
Fig. 2 is a schematic flow chart of another calibration method according to an embodiment of the present disclosure, where the calibration method is further optimized based on the foregoing embodiment. As shown in fig. 2, the method includes:
and 201, constructing a calibration plate coordinate system by using a horizontal axis and a vertical axis of a calibration plate plane and using the vertical direction of the calibration plate plane as a vertical axis, acquiring first three-dimensional coordinate points of each marking point under the calibration plate coordinate system, and acquiring a first three-dimensional coordinate point set based on all the first three-dimensional coordinate points.
Illustratively, as shown in FIG. 3, the camera is mounted at the end of the robot arm so that the camera can capture images following the movement of the end of the arm. The calibration plate needs to be fixed relative to the base of the robot arm, for example, it can be fixed on a wall.
For example, as shown in fig. 4, if the calibration plate combines 4 rows and 6 columns of two-dimensional codes, it carries 96 feature corner points. As an example, each two-dimensional code is a 40 mm square and the spacing between adjacent codes is 10 mm; these values can be adjusted according to the application scene.
Specifically, the calibration plate itself is generally considered to be an ideal planar structure. Therefore, the X axis and the Y axis are constructed in the plane of the calibration plate, and the direction perpendicular to the plane is taken as the Z axis, forming the three-dimensional calibration plate coordinate system.
Specifically, the three-dimensional coordinate set of the 96 feature corner points under the calibration plate coordinate system is generated as (P_1, P_2, ..., P_n)_board with n = 96, where each P is a three-dimensional vector representing the three-dimensional coordinates of one point.
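The board-frame corner set above can be generated programmatically. The sketch below (Python with NumPy) assumes a hypothetical layout of 4×6 square codes of 40 mm with 10 mm gaps, each code contributing its 4 corners; the function name, corner ordering, and origin placement are illustrative, not the patent's specification:

```python
import numpy as np

def board_corner_points(rows=4, cols=6, marker_mm=40.0, gap_mm=10.0):
    """Generate the first 3-D coordinate point set: the corners of every
    two-dimensional code in the calibration plate coordinate system.
    The plate is treated as an ideal plane, so every Z coordinate is 0."""
    pts = []
    pitch = marker_mm + gap_mm  # spacing between origins of adjacent codes
    for r in range(rows):
        for c in range(cols):
            x0, y0 = c * pitch, r * pitch  # lower-left corner of one code
            pts += [(x0, y0, 0.0),
                    (x0 + marker_mm, y0, 0.0),
                    (x0 + marker_mm, y0 + marker_mm, 0.0),
                    (x0, y0 + marker_mm, 0.0)]
    return np.asarray(pts)  # 4*6 codes * 4 corners = 96 points, shape (96, 3)
```

For the 4×6 board this yields the (P_1, ..., P_96)_board set; the corner ordering is a convention the corner detector must match.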
Step 202, controlling a camera to shoot the calibration plate under any pose to obtain multi-frame images, and processing each frame of images to obtain a second three-dimensional coordinate point set corresponding to each frame of images.
Step 203, acquiring rotation component parameters and translation component parameters of the first space transformation matrix, and calculating each first three-dimensional coordinate point in the first three-dimensional coordinate point set, each second three-dimensional coordinate point in the second three-dimensional coordinate point set, the rotation component parameters and the translation component parameters based on a calculation algorithm to obtain a minimum error value.
Step 204, taking the rotation component parameter and the translation component parameter corresponding to the minimum error value as a target rotation component and a target translation component, and determining the first space transformation matrix based on the target rotation component and the target translation component.
Step 205, decomposing the plurality of first space transformation matrices into rotation components and translation components respectively, converting the rotation components into Euler angle representations while keeping the translation vectors unchanged, and averaging the plurality of Euler angles and the plurality of translation vectors to obtain an Euler angle mean value and a translation component mean value.
Step 206, performing inverse conversion based on the Euler angle mean value and the translation component mean value to obtain a target rotation component and a target translation component, and determining the first target space transformation matrix based on the target rotation component and the target translation component.
Specifically, the mechanical arm is moved to a pose at which the camera can clearly capture the image of the calibration plate; with the mechanical arm held fixed, k frames of images are acquired (k is a positive integer), and from each frame the three-dimensional coordinate set of the marking points under the camera coordinate system is calculated as follows:
    (P_1, P_2, ..., P_n)_camera^(i),   i = 1, 2, ..., k
Further, based on the three-dimensional coordinate set of the 96 feature corner points under the calibration plate coordinate system and the corresponding three-dimensional coordinate set under the camera coordinate system, the classical singular value decomposition method is applied to the conventional least squares problem shown in formula (1):

    (R, t) = argmin_{R,t} Σ_{i=1..n} || R·P_i^board + t − P_i^camera ||²    (1)
Finally, the first space transformation matrix M_board2camera from the calibration plate coordinate system to the camera coordinate system is solved for each frame, as shown in formula (2):

    M_board2camera = [ R   t ]
                     [ 0   1 ]    (2)
where R is a 3×3 rotation component and t is a 3×1 translation component.
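The least squares problem of formula (1) has a closed-form SVD solution, the classical Kabsch/Umeyama procedure. A minimal NumPy sketch (the function name is illustrative; the point sets are given as n×3 arrays of matched points):

```python
import numpy as np

def rigid_transform_svd(P_board, P_cam):
    """Least-squares rigid transform (R, t) minimising
    sum_i || R p_i + t - q_i ||^2, solved in closed form via SVD."""
    cb, cc = P_board.mean(axis=0), P_cam.mean(axis=0)
    H = (P_board - cb).T @ (P_cam - cc)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                           # proper rotation, det(R) = +1
    t = cc - R @ cb
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, t
    return M                                     # 4x4 M_board2camera
```

The sign-correction matrix D guards against a reflection when the point configuration is near-degenerate.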
Specifically, each of the k M_board2camera matrices is decomposed into a rotation component R and a translation component t. Because the matrix representation of R must satisfy strict orthogonality constraints, the rotation component R is first converted from matrix form into an Euler angle representation (α, β, γ) (the rotation angle order is designated as XYZ, i.e. R = Rx(α)·Ry(β)·Rz(γ)). Writing r_ij for the element of R in row i and column j, the extraction formulas (3)-(4) are, for the non-degenerate case |β| < 90°:

    β = arcsin(r_13)    (3)

    α = atan2(−r_23, r_33),    γ = atan2(−r_12, r_11)    (4)
Specifically, the translation vector t keeps its form unchanged. The Euler angle mean (α, β, γ)_mean and the translation component mean t_mean are solved over the k groups, and the result is finally converted back into the M_board2camera matrix representation, obtaining the time-domain-optimized M_board2camera. The conversion is shown in formula (5):

    M_board2camera = [ R(α_mean, β_mean, γ_mean)   t_mean ]
                     [ 0   0   0                   1      ]    (5)

where R(α_mean, β_mean, γ_mean) = Rx(α_mean)·Ry(β_mean)·Rz(γ_mean).
step 207, acquiring setting parameters of the robot, and acquiring a second space transformation matrix from a preset space transformation matrix database based on the setting parameters.
Step 208, acquiring a target number of first target space transformation matrices and second space transformation matrices, and determining a solving relation equation based on the first target space transformation matrix, the second space transformation matrix and the space transformation matrix to be solved.
Step 209, constructing a matrix relation equation based on the target number of solving relation equations, and solving the matrix relation equation for the space transformation matrix to be solved, obtaining the third space transformation matrix.
In the embodiment of the disclosure, the shooting and calculation processing of the calibration plate by the camera is repeated a target number of times under different poses, obtaining the target number of first target space transformation matrices.
Specifically, the second space transformation matrix M_end2base from the tail end of the mechanical arm to the mechanical arm base is obtained.
Steps 202 to 207 are repeated; each time, the mechanical arm is moved to a different pose while ensuring that the camera can clearly capture the calibration plate image, yielding multiple corresponding pairs (M_board2camera, M_end2base).
For the acquired sets of corresponding pairs (M_board2camera, M_end2base), the following solving relation equation holds for any two poses:
    M_end2base^(i) · M_camera2end · M_board2camera^(i) = M_end2base^(j) · M_camera2end · M_board2camera^(j)
This is converted into the classical solution problem, i.e. the matrix relation equation AX = XB, where for each pair of poses (i, j):

    A = (M_end2base^(j))^(−1) · M_end2base^(i),   B = M_board2camera^(j) · (M_board2camera^(i))^(−1),   X = M_camera2end
The problem can be solved by the Tsai (Tsai-Lenz) method, finally yielding the hand-eye calibration matrix, namely the third space transformation matrix M_camera2end, which completes the hand-eye calibration process.
The calibration scheme provided by the embodiments of the disclosure constructs a horizontal axis and a vertical axis in the calibration plate plane and builds the calibration plate coordinate system by taking the direction perpendicular to the plane as the third axis; it obtains the first three-dimensional coordinate point of each marking point under this coordinate system and forms the first three-dimensional coordinate point set from all such points. The camera is controlled to shoot the calibration plate at an arbitrary pose to obtain multiple frames of images, and each frame is processed to obtain its second three-dimensional coordinate point set. Rotation component parameters and translation component parameters of the first space transformation matrix are acquired; each first three-dimensional coordinate point, each second three-dimensional coordinate point, and these parameters are calculated by the calculation algorithm to obtain a minimum error value; the rotation and translation component parameters corresponding to the minimum error value are taken as the target rotation component and target translation component, from which the first space transformation matrix is determined. The plurality of first space transformation matrices are each decomposed into rotation components and translation components; the rotation components are converted into Euler angle representations while the translation vectors are kept unchanged, and the plurality of Euler angles and translation vectors are averaged to obtain an Euler angle mean value and a translation component mean value. Inverse conversion based on these mean values yields a target rotation component and a target translation component, from which the first target space transformation matrix is determined. Setting parameters of the robot are acquired, and the second space transformation matrix is obtained from a preset space transformation matrix database based on them. A target number of first target space transformation matrices and second space transformation matrices are acquired; a solving relation equation is determined based on the first target space transformation matrix, the second space transformation matrix and the space transformation matrix to be solved; a matrix relation equation is constructed from the target number of solving relation equations, and solving it for the space transformation matrix to be solved yields the third space transformation matrix. By adopting this technical scheme and collecting multiple frames of images, the information in the frames is effectively utilized, a more accurate and stable space transformation matrix can be obtained, the calculation error caused by image noise is solved to a great extent, and the hand-eye calibration precision is improved.
Fig. 5 is a schematic structural diagram of a calibration device according to an embodiment of the present disclosure, where the device may be implemented by software and/or hardware, and may be generally integrated in an electronic device. As shown in fig. 5, the camera is installed at the end of the mechanical arm of the robot, the calibration plate is fixed relative to the mechanical arm base of the robot, and the calibration plate has a plurality of mark points thereon, and the device includes:
the first obtaining module 301 is configured to obtain a first three-dimensional coordinate point set corresponding to all the marking points in the coordinate system of the calibration plate;
the shooting module 302 is configured to control the camera to shoot the calibration plate under any pose, so as to obtain a multi-frame image;
the first processing module 303 is configured to process the image of each frame to obtain a second three-dimensional coordinate point set corresponding to the image of each frame;
the first calculation module 304 is configured to calculate the first three-dimensional coordinate point set and the second three-dimensional coordinate point set based on a preset calculation algorithm, so as to obtain a first space transformation matrix between a camera coordinate system corresponding to each frame of the image and the calibration board coordinate system;
a second processing module 305, configured to perform processing based on a plurality of the first spatial transformation matrices to obtain a first target spatial transformation matrix;
A second obtaining module 306, configured to obtain a second spatial transformation matrix from the end of the mechanical arm to the mechanical arm base;
a second calculation module 307, configured to calculate based on the first target spatial transformation matrix and the second spatial transformation matrix, and obtain a third spatial transformation matrix between the tail end of the mechanical arm and the camera.
Optionally, the first obtaining module 301 is specifically configured to:
constructing a horizontal axis and a vertical axis in the calibration plate plane, and constructing the calibration plate coordinate system by taking the direction perpendicular to the calibration plate plane as the third axis;
acquiring a first three-dimensional coordinate point of each marking point under the coordinate system of the calibration plate;
and obtaining the first three-dimensional coordinate point set based on all the first three-dimensional coordinate points.
Optionally, the first computing module 304 is specifically configured to:
acquiring a rotation component parameter and a translation component parameter of the first space transformation matrix;
calculating each first three-dimensional coordinate point in the first three-dimensional coordinate point set, each second three-dimensional coordinate point in the second three-dimensional coordinate point set, the rotation component parameter and the translation component parameter based on the calculation algorithm to obtain a minimum error value;
Taking the rotation component parameter and the translation component parameter corresponding to the minimum error value as a target rotation component and a target translation component;
the first spatial transformation matrix is determined based on the target rotational component and the target translational component.
Optionally, the second obtaining module 306 is specifically configured to:
acquiring setting parameters of the robot;
and acquiring the second space transformation matrix from a preset space transformation matrix database based on the setting parameters.
Optionally, the second processing module 305 is specifically configured to:
decomposing a plurality of the first spatial transformation matrices into rotation components and translation components, respectively;
converting the rotation components into Euler angle representations while keeping the translation vectors unchanged, and averaging the plurality of Euler angles and the plurality of translation vectors to obtain an Euler angle mean value and a translation component mean value;
and performing inverse conversion based on the Euler angle mean value and the translation component mean value to obtain a target rotation component and a target translation component, and determining the first target space transformation matrix based on the target rotation component and the target translation component.
Optionally, the second computing module 307 is specifically configured to:
Acquiring a target number of the first target space transformation matrix and the second space transformation matrix;
determining a solution relation equation based on the first target spatial transformation matrix, the second spatial transformation matrix and a spatial transformation matrix to be solved;
and constructing a matrix relation equation based on the solving relation equation of the target quantity, and solving the space transformation matrix to be solved by the matrix relation equation to obtain the third space transformation matrix.
Optionally, acquiring the target number of first target space transformation matrices includes: repeating, the target number of times, the shooting and calculation processing of the calibration plate by the camera under different poses, so as to obtain the target number of first target space transformation matrices.
The calibration device provided by the embodiment of the disclosure can execute the calibration method provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the execution method.
Embodiments of the present disclosure also provide a computer program product comprising a computer program/instruction which, when executed by a processor, implements the calibration method provided by any of the embodiments of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring now in particular to fig. 6, a schematic diagram of an electronic device 400 suitable for use in implementing embodiments of the present disclosure is shown. The electronic device 400 in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the electronic device 400 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 401, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage means 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the electronic device 400 are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
In general, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, magnetic tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 shows an electronic device 400 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via communications device 409, or from storage 408, or from ROM 402. The above-described functions defined in the calibration method of the embodiment of the present disclosure are performed when the computer program is executed by the processing device 401.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients, servers may communicate using any currently known or future developed network protocol, such as HTTP (Hyper Text Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring a first three-dimensional coordinate point set corresponding to all marking points in a coordinate system of a calibration plate, controlling a camera to shoot the calibration plate under any pose to obtain multi-frame images, processing each frame of images to obtain a second three-dimensional coordinate point set corresponding to each frame of images, calculating the first three-dimensional coordinate point set and the second three-dimensional coordinate point set based on a preset calculation algorithm to obtain a first space transformation matrix between the coordinate system of the camera corresponding to each frame of images and the coordinate system of the calibration plate, processing based on a plurality of first space transformation matrices to obtain a first target space transformation matrix, acquiring a second space transformation matrix between the tail end of the mechanical arm and the base of the mechanical arm, and calculating based on the first target space transformation matrix and the second space transformation matrix to obtain a third space transformation matrix between the tail end of the mechanical arm and the camera.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, the present disclosure provides an electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement any of the calibration methods provided in the present disclosure.
According to one or more embodiments of the present disclosure, the present disclosure provides a computer-readable storage medium storing a computer program for performing any one of the calibration methods provided by the present disclosure.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by substituting the above features with technical features having similar functions disclosed in the present disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (10)

1. A calibration method, characterized in that a camera is mounted at the tail end of a mechanical arm of a robot, a calibration plate is fixed relative to a mechanical arm base of the robot, and a plurality of mark points are arranged on the calibration plate, the method comprising the following steps:
Acquiring a first three-dimensional coordinate point set corresponding to all marking points in a coordinate system of a calibration plate;
controlling the camera to shoot the calibration plate under any pose to obtain multi-frame images, and processing each frame of images to obtain a second three-dimensional coordinate point set corresponding to each frame of images;
calculating the first three-dimensional coordinate point set and the second three-dimensional coordinate point set based on a preset calculation algorithm to obtain a first space transformation matrix between a camera coordinate system corresponding to each frame of the image and the calibration plate coordinate system;
processing based on a plurality of first space transformation matrixes to obtain a first target space transformation matrix;
and acquiring a second space transformation matrix between the tail end of the mechanical arm and the mechanical arm base, and calculating based on the first target space transformation matrix and the second space transformation matrix to obtain a third space transformation matrix between the tail end of the mechanical arm and the camera.
2. The method according to claim 1, wherein the step of obtaining the first three-dimensional coordinate point set corresponding to all the mark points in the coordinate system of the calibration plate includes:
constructing a horizontal axis and a vertical axis in the calibration plate plane, and constructing the calibration plate coordinate system by taking the direction perpendicular to the calibration plate plane as the third axis;
Acquiring a first three-dimensional coordinate point of each marking point under the coordinate system of the calibration plate;
and obtaining the first three-dimensional coordinate point set based on all the first three-dimensional coordinate points.
3. The calibration method according to claim 1, wherein the calculating the first three-dimensional coordinate point set and the second three-dimensional coordinate point set based on the preset calculation algorithm to obtain a first space transformation matrix between the camera coordinate system corresponding to each frame of the image and the calibration plate coordinate system includes:
acquiring a rotation component parameter and a translation component parameter of the first space transformation matrix;
calculating each first three-dimensional coordinate point in the first three-dimensional coordinate point set, each second three-dimensional coordinate point in the second three-dimensional coordinate point set, the rotation component parameter and the translation component parameter based on the calculation algorithm to obtain a minimum error value;
taking the rotation component parameter and the translation component parameter corresponding to the minimum error value as a target rotation component and a target translation component;
the first spatial transformation matrix is determined based on the target rotational component and the target translational component.
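The claim does not name the preset calculation algorithm; a common choice for finding the rotation and translation that minimize the error between two paired 3D point sets is the SVD-based Kabsch method. A minimal sketch under that assumption (function name and numpy usage are illustrative):

```python
import numpy as np

def rigid_registration(P, Q):
    """Least-squares rigid transform (R, t) with R @ p + t ~ q for paired
    point sets P, Q of shape (N, 3), via the SVD-based Kabsch method."""
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    error = np.linalg.norm(P @ R.T + t - Q)        # residual at the minimum
    return R, t, error
```

Applied per frame with P as the first point set (plate frame) and Q as the second point set (camera frame), the returned (R, t) plays the role of the first space transformation matrix and `error` the minimum error value.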
4. The calibration method according to claim 1, wherein the acquiring a second space transformation matrix between the robot arm end and the robot arm base comprises:
acquiring setting parameters of the robot;
and acquiring the second space transformation matrix from a preset space transformation matrix database based on the setting parameters.
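A minimal sketch of the preset space transformation matrix database of claim 4, assuming it is keyed by the robot's setting parameters; the key names and matrix values below are illustrative placeholders, not real robot data:

```python
import numpy as np

# Hypothetical preset database: robot setting parameters -> end-to-base transform.
TRANSFORM_DB = {
    ("arm_model_a", "joint_config_1"): np.array([
        [1.0, 0.0, 0.0, 0.10],
        [0.0, 1.0, 0.0, 0.00],
        [0.0, 0.0, 1.0, 0.45],
        [0.0, 0.0, 0.0, 1.00],
    ]),
}

def lookup_second_transform(model, joint_config):
    """Fetch the second space transformation matrix for given setting parameters."""
    key = (model, joint_config)
    if key not in TRANSFORM_DB:
        raise KeyError(f"no preset transform for {key}")
    return TRANSFORM_DB[key]
```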
5. The calibration method according to claim 1, wherein the processing based on the plurality of first space transformation matrices to obtain a first target space transformation matrix comprises:
decomposing each of the plurality of first space transformation matrices into a rotation component and a translation component;
converting each rotation component into an Euler angle representation while leaving the translation components unchanged, and averaging the plurality of Euler angles and the plurality of translation components to obtain an Euler angle mean value and a translation component mean value;
and performing the inverse conversion based on the Euler angle mean value and the translation component mean value to obtain a target rotation component and a target translation component, and determining the first target space transformation matrix based on the target rotation component and the target translation component.
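The averaging of claim 5 can be sketched as follows, assuming a Z-Y-X Euler convention (the claim does not fix one); averaging Euler angles directly is only meaningful when the transforms are close together, as they are for repeated measurements of one pose:

```python
import numpy as np

def matrix_to_euler_zyx(R):
    """Rotation matrix -> (yaw, pitch, roll) Euler angles, Z-Y-X convention."""
    pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    yaw = np.arctan2(R[1, 0], R[0, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return np.array([yaw, pitch, roll])

def euler_zyx_to_matrix(e):
    """Inverse conversion: (yaw, pitch, roll) -> rotation matrix."""
    yaw, pitch, roll = e
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def average_transforms(Ts):
    """Average 4x4 transforms: mean Euler angles + mean translation.
    Valid only for transforms close together (no angle wrap-around)."""
    eulers = np.array([matrix_to_euler_zyx(T[:3, :3]) for T in Ts])
    trans = np.array([T[:3, 3] for T in Ts])
    T_mean = np.eye(4)
    T_mean[:3, :3] = euler_zyx_to_matrix(eulers.mean(axis=0))
    T_mean[:3, 3] = trans.mean(axis=0)
    return T_mean
```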
6. The calibration method according to claim 1, wherein the calculating based on the first target space transformation matrix and the second space transformation matrix to obtain a third space transformation matrix between the robot arm end and the camera comprises:
acquiring a target number of first target space transformation matrices and second space transformation matrices;
determining a solution relation equation based on the first target space transformation matrix, the second space transformation matrix and a space transformation matrix to be solved;
and constructing a matrix relation equation from the target number of solution relation equations, and solving the matrix relation equation for the space transformation matrix to be solved to obtain the third space transformation matrix.
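With the plate fixed to the base, each pair of poses yields the classical hand-eye relation A X = X B, where X is the end-to-camera transform to be solved. The claims do not name a solver; one well-known approach in the style of Park–Martin solves the rotation from rotation logarithms via SVD and the translation by linear least squares. A sketch under that assumption:

```python
import numpy as np

def rot_log(R):
    """Axis-angle vector (rotation 'logarithm') of a 3x3 rotation matrix."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-10:
        return np.zeros(3)
    return theta / (2.0 * np.sin(theta)) * np.array(
        [R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])

def solve_ax_xb(As, Bs):
    """Solve A_i X = X B_i for the 4x4 transform X (Park-Martin style)."""
    # Rotation: R_X maps beta_i = log(R_B_i) onto alpha_i = log(R_A_i).
    H = sum(np.outer(rot_log(B[:3, :3]), rot_log(A[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(H)
    Rx = Vt.T @ np.diag([1, 1, np.linalg.det(Vt.T @ U.T)]) @ U.T
    # Translation: (R_A - I) t_X = R_X t_B - t_A, stacked over all pairs.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.hstack([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

Here each A is a relative end-to-base motion and each B the corresponding relative camera-to-plate motion; at least two pairs with non-parallel rotation axes are needed for a unique solution.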
7. The calibration method according to claim 6, wherein the acquiring a target number of first target space transformation matrices comprises:
repeating, for the target number of times, the controlling of the camera to photograph the calibration plate under different poses and the associated calculation processing, so as to obtain the target number of first target space transformation matrices.
8. A calibration device, wherein a camera is mounted at the end of a robot arm, a calibration plate is fixed relative to the base of the robot arm, and a plurality of mark points are provided on the calibration plate, the device comprising:
the first acquisition module is used for acquiring a first three-dimensional coordinate point set corresponding to all the mark points in the calibration plate coordinate system;
the shooting module is used for controlling the camera to shoot the calibration plate under any pose to obtain multi-frame images;
the first processing module is used for processing each frame of image to obtain a second three-dimensional coordinate point set corresponding to each frame of image;
the first calculation module is used for calculating the first three-dimensional coordinate point set and the second three-dimensional coordinate point set based on a preset calculation algorithm to obtain a first space transformation matrix between a camera coordinate system corresponding to each frame of the image and the calibration plate coordinate system;
the second processing module is used for processing the plurality of first space transformation matrices to obtain a first target space transformation matrix;
the second acquisition module is used for acquiring a second space transformation matrix between the robot arm end and the robot arm base;
and the second calculation module is used for calculating based on the first target space transformation matrix and the second space transformation matrix to obtain a third space transformation matrix between the robot arm end and the camera.
9. An electronic device, the electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to read the executable instructions from the memory and execute the instructions to implement the calibration method according to any one of claims 1-7.
10. A computer-readable storage medium, wherein the storage medium stores a computer program for executing the calibration method according to any one of claims 1-7.
CN202211688574.3A 2022-12-27 2022-12-27 Calibration method, device, equipment and medium Pending CN116091619A (en)

Publications (1)

Publication Number Publication Date
CN116091619A (en) 2023-05-09

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117598787A (en) * 2024-01-08 2024-02-27 上海卓昕医疗科技有限公司 Medical instrument navigation method, device, equipment and medium based on medical image

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012180034A (en) * 2011-03-02 2012-09-20 Nec Corp Target standardizing device and target standardizing method
US20170178349A1 (en) * 2015-12-18 2017-06-22 The Johns Hopkins University Method for deformable 3d-2d registration using multiple locally rigid registrations
CN107081755A (en) * 2017-01-25 2017-08-22 上海电气集团股份有限公司 A kind of robot monocular vision guides the automatic calibration device of system
CN109658460A (en) * 2018-12-11 2019-04-19 北京无线电测量研究所 A kind of mechanical arm tail end camera hand and eye calibrating method and system
CN110842914A (en) * 2019-10-15 2020-02-28 上海交通大学 Hand-eye calibration parameter identification method, system and medium based on differential evolution algorithm
CN111965624A (en) * 2020-08-06 2020-11-20 北京百度网讯科技有限公司 Calibration method, device and equipment for laser radar and camera and readable storage medium
CN113034612A (en) * 2021-03-25 2021-06-25 奥比中光科技集团股份有限公司 Calibration device and method and depth camera
CN114022570A (en) * 2022-01-05 2022-02-08 荣耀终端有限公司 Method for calibrating external parameters between cameras and electronic equipment
CN114147728A (en) * 2022-02-07 2022-03-08 杭州灵西机器人智能科技有限公司 Universal robot eye on-hand calibration method and system
CN114260899A (en) * 2021-12-29 2022-04-01 广州极飞科技股份有限公司 Hand-eye calibration method and device, electronic equipment and computer readable storage medium
CN114663495A (en) * 2022-03-21 2022-06-24 Oppo广东移动通信有限公司 Calibration method and apparatus, head-mounted display device, and computer-readable storage medium
CN114700953A (en) * 2022-04-29 2022-07-05 华中科技大学 Particle swarm hand-eye calibration method and system based on joint zero error
CN114792345A (en) * 2022-06-27 2022-07-26 杭州蓝芯科技有限公司 Calibration method based on monocular structured light system
CN114905509A (en) * 2022-04-28 2022-08-16 伯朗特机器人股份有限公司 Hand-eye calibration method, robot system and storage medium
CN115284292A (en) * 2022-08-19 2022-11-04 亿嘉和科技股份有限公司 Mechanical arm hand-eye calibration method and device based on laser camera


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG, Jie et al., "Camera calibration method based on a stereo target", Journal of Southeast University (Natural Science Edition), 31 May 2011 (2011-05-31), pages 1-6 *
YUAN, Yun et al., "A stepwise method for calibrating camera parameters", Laser & Optoelectronics Progress, 31 December 2010 (2010-12-31), pages 1-7 *


Similar Documents

Publication Publication Date Title
US10699431B2 (en) Method and apparatus for generating image generative model
CN116079697B (en) Monocular vision servo method, device, equipment and medium based on image
EP4270315A1 (en) Method and device for processing three-dimensional video, and storage medium
CN109754464B (en) Method and apparatus for generating information
CN116077182B (en) Medical surgical robot control method, device, equipment and medium
CN109784304A (en) Method and apparatus for marking dental imaging
CN109683710B (en) A kind of palm normal vector determines method, apparatus, equipment and storage medium
CN112818898B (en) Model training method and device and electronic equipment
WO2019155903A1 (en) Information processing device and method
CN113129366B (en) Monocular SLAM initialization method and device and electronic equipment
CN116091619A (en) Calibration method, device, equipment and medium
CN109816791B (en) Method and apparatus for generating information
US20240193804A1 (en) Image processing method and apparatus, electronic device, and storage medium
CN116091620A (en) Calibration method, device, equipment and medium
CN110717467A (en) Head pose estimation method, device, equipment and storage medium
CN112880675B (en) Pose smoothing method and device for visual positioning, terminal and mobile robot
CN110349109B (en) Fisheye distortion correction method and system and electronic equipment thereof
CN114116081B (en) Interactive dynamic fluid effect processing method and device and electronic equipment
CN117252914A (en) Training method and device of depth estimation network, electronic equipment and storage medium
CN114693860A (en) Highlight rendering method, highlight rendering device, highlight rendering medium and electronic equipment
CN114049403A (en) Multi-angle three-dimensional face reconstruction method and device and storage medium
CN110097622B (en) Method and device for rendering image, electronic equipment and computer readable storage medium
CN113570659A (en) Shooting device pose estimation method and device, computer equipment and storage medium
CN111768443A (en) Image processing method and device based on mobile camera
CN115994978A (en) Normal vector adjustment method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination