CN112907727B - Calibration method, device and system of relative transformation matrix - Google Patents

Calibration method, device and system of relative transformation matrix Download PDF

Info

Publication number
CN112907727B
CN112907727B CN202110099130.5A CN202110099130A CN112907727B CN 112907727 B CN112907727 B CN 112907727B CN 202110099130 A CN202110099130 A CN 202110099130A CN 112907727 B CN112907727 B CN 112907727B
Authority
CN
China
Prior art keywords
dimensional
point
laser
coordinate system
dimensional coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110099130.5A
Other languages
Chinese (zh)
Other versions
CN112907727A (en
Inventor
王润之
万文辉
邸凯昌
刘召芹
王晔盺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Information Research Institute of CAS filed Critical Aerospace Information Research Institute of CAS
Priority to CN202110099130.5A priority Critical patent/CN112907727B/en
Publication of CN112907727A publication Critical patent/CN112907727A/en
Application granted granted Critical
Publication of CN112907727B publication Critical patent/CN112907727B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P10/00Technologies related to metal processing
    • Y02P10/25Process efficiency

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Manufacturing & Machinery (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the technical field of vision fusion, in particular to a calibration method, a calibration device and a calibration system of a relative transformation matrix, wherein the method comprises the steps of obtaining a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system and obtaining three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object; determining coordinate conversion from a model coordinate system to a laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates; acquiring an initial image of a target object, wherein the initial image is acquired by a single-point laser range finder positioned at an origin of the laser three-dimensional coordinate system; determining a three-dimensional coordinate point under a laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image; and determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points. And (3) calibrating a relative transformation matrix by utilizing a three-dimensional reconstruction result, so that the use of a checkerboard plate is avoided.

Description

Calibration method, device and system of relative transformation matrix
Technical Field
The invention relates to the technical field of visual fusion, in particular to a calibration method, device and system of a relative transformation matrix.
Background
The vision fusion system of the single-point laser range finder and the camera has been widely applied to various fields in modern life, such as city model construction, augmented reality, target tracking, automatic driving and the like. In the vision fusion system, the single-point laser range finder and the camera are two sets of independent data acquisition equipment, and to combine the information acquired by the two sets of independent data acquisition equipment, the relative position relationship between the single-point laser range finder and the camera needs to be accurately calculated, namely the relative transformation matrix between the single-point laser range finder and the camera needs to be calibrated.
In order to meet the requirement, the relative transformation matrix calibration method between the single-point laser range finder and the camera is proposed in the prior art, and coordinates of laser points at a plurality of positions are obtained by moving the checkerboard calibration plate a plurality of times, and the positions of the single-point laser range finder are fixed, so that the laser three-dimensional points are on a straight line. Therefore, a linear equation of the laser beam in a space can be constructed through the coordinates of the laser three-dimensional points, and then the direction and the coordinates of the laser points under the three-dimensional coordinate system of the camera can be obtained through formula calculation, so that the calibration of a relative transformation matrix between the camera and the single-point laser range finder is completed.
However, the method has two constraint conditions, firstly, an equation can be constructed only by a plurality of laser points obtained by moving the checkerboard calibration plate for a plurality of times on the same straight line, so that along with the increase of the distance between the calibration plate and the laser range finder, the error is increased, and secondly, the laser points must be marked on the plane where the checkerboard calibration plate is located. When the two conditions are not met in the actual operation process, the calibration method of the relative transformation matrix between the single-point laser range finder and the camera is invalid.
Disclosure of Invention
In view of the above, the embodiments of the present invention provide a method, an apparatus, and a system for calibrating a relative transformation matrix, so as to solve the problem of calibrating a relative transformation matrix between a single-point laser range finder and a camera.
According to a first aspect, an embodiment of the present invention provides a calibration method for a relative transformation matrix, where the calibration method includes:
acquiring a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system, and acquiring three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object;
determining coordinate conversion from the model coordinate system to the laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates;
Acquiring an initial image of the target object, wherein the initial image is acquired by a camera of a single-point laser range finder positioned at the origin of the laser three-dimensional coordinate system;
determining a three-dimensional coordinate point under the laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image;
and determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points.
According to the calibration method for the relative transformation matrix, provided by the embodiment of the invention, the relative transformation matrix is calibrated by utilizing the coordinates of the plurality of three-dimensional control points and the three-dimensional model coordinates of each three-dimensional control point under the model coordinate system, namely, the calibration of the relative transformation matrix is performed by utilizing the three-dimensional reconstruction result, so that the use of checkerboard plates is avoided, and the high-precision three-dimensional reconstruction can improve the calibration precision of the relative transformation matrix; further, since the initial image is acquired by the camera when the single-point laser range finder is located at the origin of the laser three-dimensional coordinate system, the position and the posture of the initial image in the laser three-dimensional coordinate system can be regarded as a relative transformation matrix of the camera relative to the laser range finder in the vision fusion system.
With reference to the first aspect, in a first implementation manner of the first aspect, the acquiring three-dimensional control point coordinates under the plurality of laser three-dimensional coordinate systems includes:
acquiring the laser three-dimensional coordinate system;
and controlling the single-point laser range finder to move under the laser three-dimensional coordinate system, and obtaining a measurement result of the single-point laser range finder to obtain the three-dimensional control point coordinate.
According to the calibration method of the relative transformation matrix, provided by the embodiment of the invention, the movement of the single-point laser range finder is controlled under the laser three-dimensional coordinate system, so that the three-dimensional control point coordinates are obtained, and are used for coordinate conversion from the follow-up model coordinate system to the laser three-dimensional coordinate system, and the three-dimensional control point coordinates are combined with three-dimensional reconstruction, so that the calibration of the relative transformation matrix is realized.
With reference to the first embodiment of the first aspect, in a second implementation of the first aspect, the controlling the single-point laser rangefinder to move under the laser three-dimensional coordinate system and obtaining a measurement result of the single-point laser rangefinder, to obtain the three-dimensional control point coordinate includes:
controlling the single-point laser range finder to move on a motion track, recording two-dimensional movement information of the single-point laser range finder, and obtaining a first direction coordinate and a second direction coordinate, wherein the motion track corresponds to the first direction and the second direction of the laser three-dimensional coordinate system, and the first direction is perpendicular to the second direction;
Controlling the single-point laser range finder to measure distance and record a measurement result to obtain a third direction coordinate, wherein the third direction is perpendicular to the first direction and the second direction;
and obtaining the three-dimensional control point coordinate by using the first direction coordinate, the second direction coordinate and the third direction coordinate.
According to the calibration method of the relative transformation matrix, disclosed by the embodiment of the invention, the movement of the single-point laser range finder is controlled under the laser three-dimensional coordinate system so as to calibrate the three-dimensional control point coordinates, so that the determination process of the three-dimensional control point coordinates is simplified, and the calibration efficiency is improved.
With reference to the first embodiment of the first aspect, in a third implementation of the first aspect, the obtaining three-dimensional model coordinates of each three-dimensional control point in a model coordinate system of the target object includes:
controlling the target object to rotate, and collecting control point images of the target object during each ranging, wherein the control point images are provided with laser spots;
based on the acquired control point images, carrying out three-dimensional modeling on the target object to obtain a three-dimensional model of the target object under the model coordinate system;
and determining three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of the target object based on the three-dimensional model.
According to the calibration method of the relative transformation matrix, provided by the embodiment of the invention, the control point images are collected during distance measurement, and the coordinates of each control point can be identified in the three-dimensional model by utilizing the control point images and the three-dimensional model, so that the three-dimensional model coordinates of each three-dimensional control point under the model coordinate system can be determined, and the reliability of the three-dimensional model coordinate determination is ensured.
With reference to the first aspect, or any one of the first to third implementation manners of the first aspect, in a fourth implementation manner of the first aspect, the determining a relative transformation matrix of the camera with respect to the single-point laser rangefinder according to the two-dimensional coordinate point and the three-dimensional coordinate point includes:
acquiring an internal parameter of the camera;
and calculating the relative transformation matrix based on the internal parameters of the camera, the two-dimensional coordinate points and the three-dimensional coordinate points.
According to the calibration method of the relative transformation matrix, which is provided by the embodiment of the invention, the calculation of the relative transformation matrix is performed by combining the internal parameters of the camera, so that the calculation process is simplified, and the calibration efficiency is improved.
With reference to the first aspect, in a fifth implementation manner of the first aspect, the method further includes:
acquiring a first distance from the single-point laser range finder to a laser range finding point;
And determining a second distance from the camera to the laser ranging point and coordinates of the laser ranging point on an image acquired by the camera based on the measurement result and the relative transformation matrix.
According to a second aspect, an embodiment of the present invention further provides a calibration device for a relative transformation matrix, where the calibration device includes:
the first acquisition module is used for acquiring a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system and acquiring three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object;
the first determining module is used for determining coordinate conversion from the model coordinate system to the laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates;
the second acquisition module is used for acquiring an initial image of the target object, wherein the initial image is acquired by a camera of the single-point laser range finder positioned at the origin of the laser three-dimensional coordinate system;
the second determining module is used for determining a three-dimensional coordinate point under the laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image;
And the third determining module is used for determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points.
The calibration device for the relative transformation matrix provided by the embodiment of the invention utilizes the coordinates of the plurality of three-dimensional control points and the three-dimensional model coordinates of each three-dimensional control point under the model coordinate system to calibrate the relative transformation matrix, namely, utilizes the three-dimensional reconstruction result to calibrate the relative transformation matrix, thereby avoiding the use of checkerboard plates, and improving the calibration precision of the relative transformation matrix by high-precision three-dimensional reconstruction; further, since the initial image is acquired by the camera when the single-point laser range finder is located at the origin of the laser three-dimensional coordinate system, the position and the posture of the initial image in the laser three-dimensional coordinate system can be regarded as a relative transformation matrix of the camera relative to the laser range finder in the vision fusion system.
According to a third aspect, an embodiment of the present invention provides an electronic device, including: the device comprises a memory and a processor, wherein the memory and the processor are in communication connection, the memory stores computer instructions, and the processor executes the computer instructions so as to execute the calibration method of the relative transformation matrix in the first aspect or any implementation manner of the first aspect.
According to a fourth aspect, an embodiment of the present invention provides a computer readable storage medium storing computer instructions for causing a computer to perform the method for calibrating a relative transformation matrix according to the first aspect or any implementation manner of the first aspect.
According to a fifth aspect, an embodiment of the present invention further provides a calibration system for a relative transformation matrix, the calibration system including:
the vision fusion system comprises a single-point laser range finder and a camera;
the movement track corresponds to a first direction and a second direction of a laser three-dimensional coordinate system, and the first direction is perpendicular to the second direction;
the electronic device of the third aspect of the present invention is connected to the vision fusion system, and the electronic device is configured to control an action of the vision fusion system to determine a relative transformation matrix of the camera with respect to the single-point laser range finder.
The calibration system of the relative transformation matrix provided by the embodiment of the invention does not need to use a checkerboard, but uses a high-precision three-dimensional reconstruction result to calibrate the relative transformation matrix, and the obtained calibration result has higher precision and better reliability.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a calibration system for a relative transformation matrix in an embodiment of the invention;
FIG. 2 is a flow chart of a method of calibrating a relative transformation matrix according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method of calibrating a relative transformation matrix according to an embodiment of the present invention;
FIG. 4 is a flowchart of a method for calculating correspondence between image feature points according to an embodiment of the present invention;
FIG. 5 is a flow chart of three-dimensional modeling according to an embodiment of the present invention;
FIG. 6 is a flow chart of a method of calibrating a relative transformation matrix according to an embodiment of the present invention;
FIG. 7 is a block diagram of a calibration device for a relative transformation matrix according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The calibration method of the relative transformation matrix provided by the embodiment of the invention is applied to a vision fusion system formed by a camera and a single-point laser range finder, wherein the relative position relation between the camera and the single-point laser range finder in the vision fusion system is fixed, and the distance value acquired by the single-point laser range finder is the distance D from a laser range finding target point to the emission origin of the single-point laser range finder 1 In practical application, it is often required that the distance D from the target point to the camera is measured by the laser 2 And distance D 2 And the corresponding pixel point coordinates p (u, v) on the camera image. To obtain the distance D 2 And pixel coordinates p (u, v), the relative pose relationship between the single-point laser range finder and the camera in the vision fusion system, namely the relative transformation matrix between the single-point laser three-dimensional coordinate system and the camera three-dimensional coordinate system, is required to be obtained.
In order to solve the problems and needs, the embodiment of the invention provides a method for effectively marking a relative transformation matrix between a single-point laser range finder and a camera.
The calibration method provided by the embodiment of the invention can calculate the relative transformation matrix between the laser three-dimensional coordinate system and the camera three-dimensional coordinate system, thereby obtaining the distance D from the laser ranging target point to the camera according to the relative transformation matrix 2 Combining with the internal parameter matrix of the camera to obtain D 2 And corresponding pixel point coordinates p (u, v) on the camera pixel coordinate system. Therefore, the calibration method of the relative transformation matrix provided by the embodiment of the invention can be widely applied to various application scenes such as monocular vision quantity measurement, asteroid topography three-dimensional mapping and the like, and the specific application scene of the calibration method is not limited in the embodiment of the invention.
The embodiment of the invention provides a calibration system of a relative transformation matrix, which comprises a visual fusion system, a motion track and electronic equipment. As described above, the vision fusion system includes a single point laser rangefinder and a camera. Wherein, the relative position relation between single-point laser range finder and camera is fixed.
The movement track corresponds to a first direction and a second direction of a laser three-dimensional coordinate system, and the first direction is perpendicular to the second direction. Specifically, the motion trajectory is a trajectory under a two-dimensional coordinate system, and the two-dimensional coordinate system may be a projection coordinate system of the laser three-dimensional coordinate system on a certain plane. For example, as shown in fig. 1, the first initial laser rangefinder emission point is taken as the origin of the laser three-dimensional coordinate system, the laser three-dimensional coordinate system is directed to the right in the X-axis and downward in the Y-axis, the Z-axis is directed to the front of the laser emission, and the two track directions of the movement track are parallel to the X-axis and the Y-axis of the laser three-dimensional coordinate system. The motion track is a two-axis track, and the two directions of the track are mutually perpendicular. Specifically, the single-point laser rangefinder 10 and the camera 20 form the visual fusion system, which moves on the motion trajectory 30 under the control of electronics (not shown in fig. 1) to obtain three-dimensional control point coordinates for subsequent calibration of the relative transformation matrix. As shown in fig. 1, the visual fusion system is placed on a two-axis motion trajectory, in front of which is a spherical target object for three-dimensional reconstruction, with a diameter of about 1 meter. In the control point construction process, the laser point of the laser range finder in the vision fusion system needs to be ensured to strike on a spherical target object.
Further, in three-dimensional reconstruction of the target object, as shown in fig. 1, the target object 40 may be fixed on the base 50, and rotation of the target object 40 is controlled to acquire a plurality of images for three-dimensional modeling.
It should be noted that, the movement track in the embodiment of the present invention is not limited to the one shown in fig. 1, but may be in other forms, and is not limited to any specific one, and may be set correspondingly according to practical situations, so long as it is ensured that the movement track corresponds to a projection coordinate system of a laser three-dimensional coordinate system on a plane, where the plane is a plane perpendicular to the laser emission direction.
The electronic equipment is connected with the vision fusion system and is used for controlling the action of the vision fusion system so as to determine the relative transformation matrix of the camera relative to the single-point laser range finder. The electronic device may be a computer, tablet or other terminal with data processing capabilities, which is not limited in any way herein.
The electronic device is used for executing the calibration method of the relative transformation matrix in the embodiment of the invention so as to determine the relative transformation matrix of the camera relative to the single-point laser range finder. Specific calibration methods will be described in detail below.
According to an embodiment of the present invention, there is provided an embodiment of a method for calibrating a relative transformation matrix, it being noted that the steps shown in the flowchart of the figures may be performed in a computer system, such as a set of computer executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that shown or described herein.
The embodiment provides a calibration method of a relative transformation matrix, which can be used for the electronic equipment, such as a mobile phone, a tablet personal computer and the like. Fig. 1 is a flowchart of a calibration method of a relative transformation matrix according to an embodiment of the present invention, as shown in fig. 1, the flowchart includes the following steps:
s11, acquiring a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system and acquiring three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object.
The three-dimensional control point coordinates are three-dimensional coordinates of laser spots formed by laser emitted by the single-point laser range finder on a target object in the moving process of the single-point laser range finder in the vision fusion system. The laser three-dimensional coordinate system can be that the origin of coordinates is determined first, and then three directions which are mutually perpendicular are the X-axis, Y-axis and Z-axis directions of the laser three-dimensional coordinate system. The single-point laser range finder moves in the three-dimensional coordinate system, so that the three-dimensional control point coordinates are obtained.
For example, the initial position of the single-point laser range finder is taken as the origin of coordinates in a laser three-dimensional coordinate system, and the laser irradiation direction is defined as the Z-axis direction, so that the formed laser spot, namely the Z-coordinate of the three-dimensional control point, is the measurement result of the single-point laser range finder. Further, by recording the positions of the single-point laser rangefinder in the X direction and the Y direction, the X coordinate and the Y coordinate of the three-dimensional control point can be determined.
The electronic device can acquire a plurality of three-dimensional control point coordinates, and the three-dimensional control point coordinates are obtained under a laser three-dimensional coordinate system.
Further, for the target object, the three-dimensional model of the target object can be obtained by performing three-dimensional modeling on the target object. The single-point laser range finder emits laser to the target object to obtain three-dimensional control point coordinates, laser spots are formed on the target object, the positions of the laser spots can be marked on the target object, and then images containing marking information are collected through a camera and used for subsequent three-dimensional modeling, so that the positions of the laser spots can be represented in a formed three-dimensional model, and the three-dimensional model coordinates of the three-dimensional control point under a model coordinate system can be obtained.
Or the single-point laser range finder can select infrared laser range finding when emitting laser to the target object to obtain the coordinates of the three-dimensional control point, then red laser spots are formed on the target object, at the moment, the current image of the target object is collected, and the obtained current image comprises the red laser spots, namely the control point. And each control point is included in the three-dimensional model formed subsequently, and accordingly, the three-dimensional model coordinates of each three-dimensional control point under the model coordinate system can be determined.
It should be noted that, the coordinates of the three-dimensional control points and the model coordinate system of each three-dimensional control point in the target object may be obtained in real time by the electronic device, or may be stored in the electronic device in advance, or may be obtained by the electronic device in other manners, and the specific obtaining manner is not limited at all, and may be set correspondingly according to actual situations.
This step will be described in detail hereinafter, and will not be described in detail here.
And S12, determining coordinate conversion from a model coordinate system to a laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates.
The three-dimensional control point coordinates and the corresponding three-dimensional model coordinates represent the three-dimensional coordinates of the same control point under different coordinates, so that the conversion of the model coordinate system and the laser three-dimensional coordinate system can be determined by utilizing coordinate conversion.
After the electronic device determines the conversion from the model coordinate system to the laser three-dimensional coordinate system, the coordinate of the target object under the model coordinate system can be converted into the coordinate under the laser three-dimensional coordinate system. The electronic equipment can obtain the coordinates of each pixel point of the target object under the laser three-dimensional coordinates.
S13, acquiring an initial image of the target object.
The initial image is acquired by a camera when the single-point laser range finder is positioned at the origin of the laser three-dimensional coordinate system.
When the electronic device acquires the coordinate of a first initial control point (the origin of the laser three-dimensional coordinate system), a target object image, namely the initial image img_1, is acquired by using the camera. When the image img_1 is acquired, the single-point laser range finder is not moved and still is positioned on the original point of the laser three-dimensional coordinate system, and the relative position of the camera is unchanged, so that the pose of the initial image img_1 under the laser three-dimensional coordinate system is the pose of the camera under the laser three-dimensional coordinate system.
S14, determining a three-dimensional coordinate point under a laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image.
The initial image is acquired by the camera, and then the two-dimensional coordinate points of each pixel point on the initial image are two-dimensional coordinate points under the coordinate system of the camera. In S12 described above, the electronic device has obtained the three-dimensional coordinates of the target object at each pixel point of the laser three-dimensional coordinate system. Then, the corresponding two-dimensional coordinate point and the corresponding three-dimensional coordinate point can be determined first, and the relative transformation matrix of the camera relative to the single-point range finder can be determined.
Optionally, the electronic device may determine the two-dimensional coordinate point for matching on the initial image, and then determine the position of the two-dimensional coordinate in the three-dimensional model, so as to determine the three-dimensional coordinate point corresponding to the two-dimensional coordinate point. The position of the two-dimensional coordinates in the three-dimensional model may be determined automatically or manually.
Under the condition of automatic determination, the electronic equipment selects key characteristic points on the initial image, analyzes the three-dimensional model and determines the positions of the key characteristic points, so that the three-dimensional coordinate points corresponding to the two-dimensional coordinate points can be determined. Of course, the electronic device may also determine the three-dimensional coordinate point corresponding to the two-dimensional coordinate point in other manners.
And S15, determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points.
As described above, the two-dimensional coordinate point is a coordinate point in the camera coordinate system, the three-dimensional coordinate point is a coordinate in the laser three-dimensional coordinate system, and the relative transformation matrix of the camera relative to the single-point laser range finder can be determined by using the correspondence between the two-dimensional coordinate point and the three-dimensional coordinate point.
This step will be described in detail later in detail.
According to the calibration method for the relative transformation matrix, the relative transformation matrix is calibrated by utilizing the three-dimensional control point coordinates and the three-dimensional model coordinates of each three-dimensional control point under the model coordinate system, namely, the relative transformation matrix is calibrated by utilizing the three-dimensional reconstruction result, so that the use of a checkerboard is avoided, and the high-precision three-dimensional reconstruction can improve the calibration precision of the relative transformation matrix; further, since the initial image is acquired by the camera when the single-point laser range finder is located at the origin of the laser three-dimensional coordinate system, the position and the posture of the initial image in the laser three-dimensional coordinate system can be regarded as a relative transformation matrix of the camera relative to the laser range finder in the vision fusion system.
In this embodiment, a method for calibrating a relative transformation matrix is provided, which may be used in the above electronic device, such as a mobile phone, a tablet computer, etc., and fig. 3 is a flowchart of a method for calibrating a relative transformation matrix according to an embodiment of the present invention, as shown in fig. 3, where the flowchart includes the following steps:
s21, acquiring a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system and acquiring three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of the target object.
Specifically, the step S21 includes the following steps:
s211, acquiring a laser three-dimensional coordinate system.
As described above, the laser three-dimensional coordinate system may be that the origin of coordinates is determined first, and then three directions perpendicular to each other are the X-axis, Y-axis and Z-axis directions of the laser three-dimensional coordinate system, where the Z-axis direction is the laser emitting direction of the single-point laser range finder.
S212, controlling the single-point laser range finder to move under a laser three-dimensional coordinate system, and obtaining a measurement result of the single-point laser range finder to obtain three-dimensional control point coordinates.
The electronic equipment controls the single-point laser range finder to move under the laser three-dimensional coordinate system, specifically controls the single-point laser range finder to move along an X axis or a Y axis of the laser three-dimensional coordinate system, records the X axis and the Y axis coordinates and the measurement result of the single-point laser range finder, and obtains a Z axis coordinate, so that the three-dimensional control point coordinate can be obtained.
As an alternative implementation manner of this embodiment, the step S212 may include the following steps:
(1) And controlling the single-point laser range finder to move on the moving track, and recording the two-dimensional movement information of the single-point laser range finder to obtain a first direction coordinate and a second direction coordinate.
The movement track corresponds to the first direction and the second direction of the laser three-dimensional coordinate system, and the first direction is perpendicular to the second direction.
As shown in fig. 1, the moving track includes two track directions perpendicular to each other, and the two track directions are respectively the X-axis direction and the Y-axis direction of the laser three-dimensional coordinate system. The electronic equipment controls the single-point laser range finder to move on the motion track, and records the movement information, and the obtained first direction coordinate and second direction coordinate are the X coordinate and Y coordinate of the control point.
(2) And controlling the single-point laser range finder to measure the distance and recording the measurement result to obtain the third-direction coordinate.
The third direction is perpendicular to the first direction and the second direction.
The electronic equipment controls the single-point laser range finder to move once, so that the single-point laser range finder is controlled to perform range finding, and a measurement result is obtained. At this time, the obtained measurement result is the Z coordinate of the control point.
(3) And obtaining the three-dimensional control point coordinates by using the first direction coordinates, the second direction coordinates and the third direction coordinates.
The electronic equipment can obtain the three-dimensional control point coordinates of the control point by using the obtained first direction coordinates, second direction coordinates and third direction coordinates.
And the movement of the single-point laser range finder is controlled under the laser three-dimensional coordinate system so as to calibrate the three-dimensional control point coordinates, thereby simplifying the determination process of the three-dimensional control point coordinates and improving the calibration efficiency.
S213, controlling the target object to rotate, and collecting control point images of the target object during each ranging.
The control point image is provided with a laser spot.
In step (2) of S212, when the electronic device controls the single-point laser range finder to measure the distance, the electronic device controls the camera to collect a control point image of the target object, where the control point image includes the laser spot.
When the electronic equipment controls the single-point laser range finder to measure distance, the electronic equipment can control the target object to rotate after finishing each distance measurement, and collect images of the target object for subsequent three-dimensional modeling.
Specifically, the electronic device moves the vision fusion system for multiple times to perform laser ranging, so as to obtain a plurality of control point coordinates under the laser three-dimensional coordinate system, and in this embodiment, 12 control points are used. When the laser range finder is used for measuring the distance, a camera is used for acquiring a target object image, the position of each laser spot on the target object can be seen on the image, the follow-up of the position of a three-dimensional control point on the three-dimensional point cloud can be conveniently found in the three-dimensional reconstruction part of the target object.
And S214, carrying out three-dimensional modeling on the target object based on the acquired control point image to obtain a three-dimensional model of the target object under a model coordinate system.
Specifically, the three-dimensional reconstruction of the target object comprises image feature point corresponding relation calculation and three-dimensional point cloud generation based on a motion recovery structure.
Image feature point correspondence calculation
1. As shown in fig. 4, in combination with the schematic diagram of the visual fusion system shown in fig. 1, a spherical target object is first rotated at a constant speed by using a rotating base, and a plurality of images covering the surface of the target object are acquired by using a camera in the rotating process.
2. And extracting SIFT feature points from the acquired image.
3. And performing feature matching on the extracted SIFT feature points to obtain an initial matching point pair.
4. The basis matrix is estimated using the resulting initial matching point pairs.
5. And refining the initial matching point pairs by using the basic matrix constraint, and removing the mismatching point pairs.
6. And outputting the corresponding relation of the feature points between every two images.
(II) three-dimensional point cloud generation based on motion restoration structure
1. As shown in fig. 5, the feature point correspondence result between the images obtained in the image feature point correspondence calculation step is input first.
2. A set of image pairs with a sufficient number of matching points is selected to begin three-dimensional reconstruction.
3. Eight-point-algorithm (Eight-point-algorithm) is used to estimate the relative transformation matrix between two images.
4. Three-dimensional points of the initial model are calculated by means of triangulation.
5. And iteratively adding a new image, calculating three-dimensional points obtained by the image through the step 3 and the step 4, and fusing the newly obtained three-dimensional points with the three-dimensional points obtained before.
6. And outputting a three-dimensional reconstruction result of the spherical target object, wherein the obtained three-dimensional dense point cloud is also model coordinates, and the three-dimensional dense point cloud does not have scale information.
S215, determining three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of the target object based on the three-dimensional model.
The image for three-dimensional modeling comprises laser spots, so that the electronic equipment can determine three-dimensional model coordinates of each three-dimensional control point under the model coordinate system of the target object by using the established three-dimensional model.
S22, determining coordinate conversion from the model coordinate system to the laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates.
Please refer to the embodiment S12 shown in fig. 2 in detail, which is not described herein.
S23, acquiring an initial image of the target object.
The initial image is acquired by a camera when the single-point laser range finder is positioned at the origin of the laser three-dimensional coordinate system.
Please refer to the embodiment S13 shown in fig. 2 in detail, which is not described herein.
S24, determining a three-dimensional coordinate point under a laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image.
Please refer to the embodiment S14 shown in fig. 2 in detail, which is not described herein.
S25, determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points.
Please refer to the embodiment S15 shown in fig. 2 in detail, which is not described herein.
According to the calibration method of the relative transformation matrix, the movement of the single-point laser range finder is controlled under the laser three-dimensional coordinate system, so that three-dimensional control point coordinates are obtained, the three-dimensional control point coordinates are used for coordinate conversion from a follow-up model coordinate system to the laser three-dimensional coordinate system, and the three-dimensional control point coordinates are combined with three-dimensional reconstruction, so that the calibration of the relative transformation matrix is realized. Further, control point images are acquired during distance measurement, and coordinates of each control point can be marked in the three-dimensional model by using the control point images and the three-dimensional model, so that three-dimensional model coordinates of each three-dimensional control point under a model coordinate system can be determined, and reliability of three-dimensional model coordinate determination is ensured.
In this embodiment, a method for calibrating a relative transformation matrix is provided, which may be used in the above electronic device, such as a mobile phone, a tablet computer, etc., fig. 6 is a flowchart of a method for calibrating a relative transformation matrix according to an embodiment of the present invention, as shown in fig. 6, where the flowchart includes the following steps:
s31, acquiring a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system and acquiring three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object.
Please refer to the embodiment S21 shown in fig. 3 in detail, which is not described herein.
S32, determining coordinate conversion from the model coordinate system to the laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates.
Please refer to the embodiment S22 shown in fig. 3 in detail, which is not described herein.
S33, acquiring an initial image of the target object.
The initial image is acquired by a camera when the single-point laser range finder is positioned at the origin of the laser three-dimensional coordinate system.
Please refer to the embodiment S23 shown in fig. 3 in detail, which is not described herein.
S34, determining a three-dimensional coordinate point under a laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image.
Please refer to the embodiment S24 shown in fig. 3 in detail, which is not described herein.
And S35, determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points.
Specifically, the step S35 includes the following steps:
s351, acquiring internal parameters of the camera.
The internal parameters of the camera may be obtained by calibrating the internal parameters of the camera, for example, the internal parameters of the camera may be represented by an internal parameter matrix. Specifically, a camera of the vision fusion system may be used to capture multiple images at multiple distances and angles, and the images may be processed by a camera calibration procedure to obtain an intrinsic matrix K of the camera 3×3 The form is shown as a formula (1):
and S352, calculating a relative transformation matrix based on the internal parameters of the camera, the two-dimensional coordinate points and the three-dimensional coordinate points.
After the electronic device acquires the internal parameters of the camera, the position and the posture of the initial image under the laser three-dimensional coordinate system, namely the relative transformation matrix of the camera relative to the laser range finder, can be solved by utilizing the corresponding relation between the two-dimensional coordinate point and the three-dimensional coordinate point. The relationship between the three-dimensional feature points and the two-dimensional feature points on the initial image can be represented by the formula (2):
The coordinate transformation matrix T in the above formula 4×4 The relative transformation matrix of the camera relative to the single-point laser range finder is obtained by matching the homogeneous coordinates P of the three-dimensional characteristic points in the laser three-dimensional coordinate system L (X,Y,Z,1) T Conversion to the coordinate P in the three-dimensional coordinate system of the camera C (X',Y',Z',1) T 。K 3×3 Representing the internal parameters of the camera, which can be obtained through calibration of the internal parameters of the camera. The transformation matrix T of the camera relative to the single-point laser range finder can be determined only by at least 3 groups of three-dimensional characteristic points and two-dimensional characteristic points 4×4 ,T 4×4 The first 3 rows and the first 3 columns are the initial image postures, and the first 3 columns of the 4 th row are the initial image positions.
S36, acquiring a first distance from the single-point laser range finder to a laser range finding point.
In the vision fusion system combining the camera and the single-point laser range finder, the relative position relationship between the camera and the single-point laser range finder is fixed, and the distance value acquired by the single-point laser range finder is the distance D from the laser range finding point to the emission origin of the single-point laser range finder 1 In the practical application process, the distance D from the laser ranging point to the camera is often required 2 And distance D 2 And the corresponding pixel point coordinates p (u, v) on the monocular camera image. To obtain the distance D 2 And pixel point coordinates p (u, v), the relative pose relation between the single-point laser range finder and the camera in the vision fusion system needs to be calculated, namely, the relative transformation matrix between the single-point laser three-dimensional coordinate system and the camera three-dimensional coordinate system.
Further, single point laser ranging in a vision fusion systemThe instrument measures a first distance D between the origin of the single-point laser range finder and a laser ranging point by emitting laser 1 And the measured first distance D 1 And sending the data to the electronic equipment. The subsequent electronic device is based on the acquired first distance D 1 Calculate the second distance D 2
S37, determining a second distance from the camera to the laser ranging point and coordinates of the laser ranging point on an image acquired by the camera based on the measurement result and the relative transformation matrix.
Because the relative positional relationship between the camera and the single-point laser range finder in the vision fusion system is fixed, the relative transformation matrix T of the three-dimensional coordinate system of the camera relative to the three-dimensional coordinate system of the laser 4×4 Is also unchanged. Distance D obtained by laser range finder 1 Seen as a three-dimensional point P in a laser three-dimensional coordinate system L (0,0,D 1 ,1) T Then according to the utilization formula (3), the three-dimensional point P can be calculated L (0,0,D 1 ,1) T Coordinates P in a three-dimensional coordinate system of a camera C (X C ,Y C ,Z C ,1) T
Further, a second distance D from the laser ranging point to the camera can be calculated by using the formula (4) 2
Correspondingly, when the single-point laser range finder measures the first distance, the camera simultaneously collects the image of the target body at the moment. The electronic equipment calculates a second distance D between the laser ranging point and the camera 2 Then, the coordinates p (u, v) of the corresponding pixel points of the laser ranging points on the acquired image can be determined by using parameters in the camera, and specifically can be calculated by adopting a formula (5):
With the calibration method of the relative transformation matrix described above, the relative transformation matrix is calculated in combination with the camera's internal parameters, which simplifies the calculation process and improves calibration efficiency.
In a specific application example of this embodiment, the calibration method of the relative transformation matrix according to the present invention mainly includes the following three parts: a laser three-dimensional coordinate system control point construction part, a target object three-dimensional reconstruction part, and a relative transformation matrix calculation part.
1. The laser three-dimensional coordinate system control point construction part includes: the vision fusion system is mounted on a two-axis motion track, and a three-dimensional coordinate system is constructed with the initial emission point of the single-point laser range finder as the origin; the X axis of the coordinate system points to the right, the Y axis points downward, the Z axis points along the laser emission direction, and the two track directions of the two-axis motion track are parallel to the X and Y axes of the laser three-dimensional coordinate system. While the coordinate of the first control point is acquired, an initial image img_1 of the target object is acquired with the camera. The vision fusion system is then moved along the motion track; its position on the track is recorded as the X and Y coordinates of a three-dimensional control point, and the average of multiple distance measurements by the single-point laser range finder is taken as the Z coordinate of that control point. Moving the vision fusion system and measuring the distance repeatedly yields a plurality of control point coordinates in the laser three-dimensional coordinate system. Each time the single-point laser range finder measures a distance, the camera acquires an image of the target object; the position of each laser spot on the target object is visible in these images, which helps the subsequent target object three-dimensional reconstruction part quickly locate the three-dimensional control points in the three-dimensional point cloud.
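A minimal sketch of assembling one control point under the axis convention just described (names are illustrative, not from the patent):

```python
import numpy as np

def make_control_point(track_x, track_y, range_samples):
    """One laser-frame control point: the two-axis track supplies the X
    and Y coordinates, and the Z coordinate is the average of repeated
    single-point range measurements taken at that track position."""
    return np.array([track_x, track_y, float(np.mean(range_samples))])

# e.g. the first control point, taken at the origin of the laser frame:
p0 = make_control_point(0.0, 0.0, [1.9998, 2.0003, 2.0001])
```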
2. The target object three-dimensional reconstruction part includes: the target object is placed on a base rotating at a constant speed, and a plurality of images covering the surface of the target object are acquired with the camera; the acquired images are processed through the correspondences of image feature points, the three-dimensional reconstruction of the target object model is completed via the three-dimensional point cloud generation step of structure from motion, and the control points obtained in the laser three-dimensional coordinate system are used to convert the three-dimensional point cloud from the model coordinate system to the laser three-dimensional coordinate system.
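The patent names this model-to-laser coordinate conversion but not the algorithm that realizes it; one standard choice is Umeyama's closed-form similarity alignment, sketched below under that assumption (inputs are matched (n, 3) arrays of control-point coordinates):

```python
import numpy as np

def umeyama(model_pts, laser_pts):
    """Similarity transform (s, R, t) with laser ≈ s * R @ model + t,
    estimated from matched control points. Recovering the scale s is
    what gives the dense point cloud real metric scale."""
    mu_m = model_pts.mean(axis=0)
    mu_l = laser_pts.mean(axis=0)
    var_m = ((model_pts - mu_m) ** 2).sum(axis=1).mean()
    cov = (laser_pts - mu_l).T @ (model_pts - mu_m) / len(model_pts)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                     # guard against a reflection
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / var_m
    t = mu_l - s * R @ mu_m
    return s, R, t
```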
3. The relative transformation matrix calculation part includes: in the laser three-dimensional coordinate system control point construction part, when the coordinate of the first control point (the origin of the laser three-dimensional coordinate system) is acquired, the initial image img_1 of the target object is acquired with the camera. When img_1 is acquired, the single-point laser range finder has not moved and is still at the origin of the laser three-dimensional coordinate system, the target object has not rotated, and the relative position of the camera is unchanged, so the pose of the initial image img_1 in the laser three-dimensional coordinate system is exactly the pose of the camera in the laser three-dimensional coordinate system. Based on this analysis, the embodiment of the present invention finds a number of two-dimensional points and their pixel coordinates in the initial image img_1, together with the corresponding three-dimensional points and their coordinates on the dense point cloud model, and then solves the pose of the initial image img_1 in the laser three-dimensional coordinate system with the Efficient Perspective-n-Point method (EPnP); this pose is the pose of the camera of the vision fusion system in the laser three-dimensional coordinate system, i.e., the relative transformation matrix between the single-point laser range finder and the camera. Using the relative transformation matrix, the distance D_1 from the laser ranging point to the emission origin of the single-point laser range finder yields the distance D_2 from the laser ranging point to the monocular camera, and the camera's internal parameters then give the corresponding pixel coordinates p(u, v) on the monocular camera image.
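The toy example below chains the sketches from this section end to end; all numbers are hypothetical, and umeyama, estimate_relative_transform and laser_point_to_camera are the illustrative helpers defined earlier (OpenCV is assumed available for the PnP step):

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])                  # toy intrinsics, not calibrated values

# Parts 1-2: align toy SfM model coordinates to laser-frame control points.
laser_pts = np.array([[0.0, 0.0, 2.0], [0.1, 0.0, 2.0], [0.0, 0.1, 2.0],
                      [0.1, 0.1, 2.1], [0.05, 0.02, 1.95]])
model_pts = (laser_pts - np.array([0.1, 0.2, 0.3])) / 2.0   # unscaled "model" frame
s, R, t = umeyama(model_pts, laser_pts)                     # recovers the scale and offset

# Part 3: pose of img_1 from 2D-3D matches on the rescaled cloud.
pts3d = s * (R @ model_pts.T).T + t                         # back in the laser frame
pts2d = np.array([(K @ (p / p[2]))[:2] for p in pts3d])     # toy projections of the points
T = estimate_relative_transform(pts3d, pts2d, K)

# Application: a new range reading D_1 -> camera distance D_2 and pixel p(u, v).
D2, (u, v) = laser_point_to_camera(2.0, T, K)
print(f"D2 = {D2:.4f}, pixel = ({u:.1f}, {v:.1f})")
```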
The technical scheme in the embodiment of the invention has the following advantages:
1. Compared with the traditional scheme of calibrating with a checkerboard calibration board, the calibration method does not need a checkerboard; instead, it uses a high-precision three-dimensional reconstruction result to calibrate the relative transformation matrix, and the obtained calibration result has higher precision and better reliability.
2. In conventional relative transformation matrix calibration schemes for a monocular camera and a laser range finder, the relative transformation matrix is calibrated first, and the calibration result is then used for three-dimensional reconstruction and other applications, so the accuracy of the relative transformation matrix calibration limits the accuracy of the subsequent three-dimensional reconstruction. The calibration method in the embodiment of the present invention completes the calibration of the relative transformation matrix using the three-dimensional reconstruction result while constructing the high-precision three-dimensional reconstruction, so the precision of the calibration result does not affect the precision of the three-dimensional reconstruction; on the contrary, the high-precision three-dimensional reconstruction improves the precision of the relative transformation matrix calibration.
3. In some camera self-calibration methods based on three-dimensional reconstruction, the three-dimensional point cloud used is a sparse model without scale information. In the embodiment of the present invention, a plurality of control points are used during calibration to convert the dense three-dimensional point cloud from model coordinates to the laser three-dimensional coordinate system, so the point cloud carries real scale information and mapping work such as distance measurement can be performed directly on it.
In this embodiment, a calibration device for the relative transformation matrix is further provided. The device is used to implement the foregoing embodiments and preferred implementations; what has already been described is not repeated here. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
The present embodiment provides a calibration device for a relative transformation matrix, as shown in fig. 7, including:
a first obtaining module 41, configured to obtain a plurality of three-dimensional control point coordinates in a laser three-dimensional coordinate system and obtain three-dimensional model coordinates of each three-dimensional control point in a model coordinate system of the target object;
A first determining module 42, configured to determine coordinate conversion from the model coordinate system to the laser three-dimensional coordinate system by using the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates;
a second obtaining module 43, configured to obtain an initial image of the target object, where the initial image is acquired by the camera when the single-point laser range finder is located at the origin of the laser three-dimensional coordinate system;
a second determining module 44, configured to determine a three-dimensional coordinate point in the laser three-dimensional coordinate system corresponding to a two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image;
and a third determining module 45, configured to determine a relative transformation matrix of the camera with respect to the single-point laser rangefinder according to the two-dimensional coordinate point and the three-dimensional coordinate point.
The calibration device for the relative transformation matrix provided in this embodiment calibrates the relative transformation matrix using the coordinates of a plurality of three-dimensional control points and the three-dimensional model coordinates of each control point in the model coordinate system, that is, it calibrates the relative transformation matrix using the three-dimensional reconstruction result; this avoids the use of a checkerboard, and high-precision three-dimensional reconstruction can improve the calibration precision of the relative transformation matrix. Further, since the initial image is acquired by the camera when the single-point laser range finder is located at the origin of the laser three-dimensional coordinate system, the pose of the initial image in the laser three-dimensional coordinate system can be regarded as the relative transformation matrix of the camera relative to the laser range finder in the vision fusion system.
The calibration device for the relative transformation matrix in this embodiment is presented in the form of functional units, where a unit may be an ASIC circuit, a processor and memory executing one or more software or firmware programs, and/or other devices that provide the above-described functionality.
Further functional descriptions of the above respective modules are the same as those of the above corresponding embodiments, and are not repeated here.
The embodiment of the invention also provides an electronic device equipped with the calibration device of the relative transformation matrix shown in fig. 7.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device according to an alternative embodiment of the present invention. As shown in fig. 8, the electronic device may include: at least one processor 51, such as a CPU (Central Processing Unit); at least one communication interface 53; a memory 54; and at least one communication bus 52. The communication bus 52 is used to enable connected communication between these components. The communication interface 53 may include a display (Display) and a keyboard (Keyboard); optionally, the communication interface 53 may further include a standard wired interface and a wireless interface. The memory 54 may be high-speed volatile random access memory (RAM) or non-volatile memory, such as at least one disk memory. Optionally, the memory 54 may also be at least one storage device located remotely from the processor 51. The processor 51 may implement the device described in connection with fig. 7; the memory 54 stores an application program, and the processor 51 invokes the program code stored in the memory 54 to perform any of the method steps described above.
The communication bus 52 may be a peripheral component interconnect (PCI) bus or an extended industry standard architecture (EISA) bus, among others. The communication bus 52 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 8, but this does not mean that there is only one bus or only one type of bus.
The memory 54 may include volatile memory, such as random-access memory (RAM); the memory may also include non-volatile memory, such as flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the memory 54 may also include a combination of the above types of memory.
The processor 51 may be a central processing unit (CPU), a network processor (NP), or a combination of a CPU and an NP.
The processor 51 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
Optionally, the memory 54 is also used for storing program instructions. The processor 51 may invoke program instructions to implement the calibration method of the relative transformation matrix as shown in the embodiments of fig. 2, 3 and 6 of the present application.
The embodiment of the application also provides a non-transitory computer storage medium storing computer-executable instructions that can perform the calibration method of the relative transformation matrix in any of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage medium may also include a combination of the above types of memory.
Although embodiments of the present application have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the application, and such modifications and variations fall within the scope of the application as defined by the appended claims.

Claims (6)

1. A calibration method of a relative transformation matrix, characterized by comprising the following steps:
Acquiring a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system, and acquiring three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object;
determining coordinate conversion from the model coordinate system to the laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates;
acquiring an initial image of the target object, wherein the initial image is acquired by a camera when a single-point laser range finder is positioned at the origin of the laser three-dimensional coordinate system;
determining a three-dimensional coordinate point under the laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image;
determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points;
the acquiring a plurality of three-dimensional control point coordinates under the laser three-dimensional coordinate system comprises the following steps:
acquiring the laser three-dimensional coordinate system;
controlling the single-point laser range finder to move under the laser three-dimensional coordinate system, and obtaining a measurement result of the single-point laser range finder to obtain the three-dimensional control point coordinate;
The controlling the single-point laser range finder to move under the laser three-dimensional coordinate system and obtaining the measurement result of the single-point laser range finder to obtain the three-dimensional control point coordinate comprises the following steps:
controlling the single-point laser range finder to move on a motion track, recording two-dimensional movement information of the single-point laser range finder, and obtaining a first direction coordinate and a second direction coordinate, wherein the motion track corresponds to the first direction and the second direction of the laser three-dimensional coordinate system, and the first direction is perpendicular to the second direction;
controlling the single-point laser range finder to measure distance and record a measurement result to obtain a third direction coordinate, wherein the third direction is perpendicular to the first direction and the second direction;
obtaining the three-dimensional control point coordinate by using the first direction coordinate, the second direction coordinate and the third direction coordinate;
the obtaining the three-dimensional model coordinates of each three-dimensional control point under the model coordinate system of the target object comprises the following steps:
controlling the target object to rotate, and collecting control point images of the target object during each ranging, wherein the control point images are provided with laser spots;
Based on the acquired control point images, carrying out three-dimensional modeling on the target object to obtain a three-dimensional model of the target object under the model coordinate system;
determining three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object based on the three-dimensional model;
the determining the relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate point and the three-dimensional coordinate point comprises the following steps:
acquiring an internal parameter of the camera;
and calculating the relative transformation matrix based on the internal parameters of the camera, the two-dimensional coordinate points and the three-dimensional coordinate points.
2. The calibration method according to claim 1, characterized in that the method further comprises:
acquiring a first distance from the single-point laser range finder to a laser ranging point;
and determining a second distance from the camera to the laser ranging point and coordinates of the laser ranging point on an image acquired by the camera based on the measurement result and the relative transformation matrix.
3. A calibration device for a relative transformation matrix, the calibration device comprising:
the first acquisition module is used for acquiring a plurality of three-dimensional control point coordinates under a laser three-dimensional coordinate system and acquiring three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object;
The first determining module is used for determining coordinate conversion from the model coordinate system to the laser three-dimensional coordinate system by utilizing the three-dimensional control point coordinates and the corresponding three-dimensional model coordinates;
the second acquisition module is used for acquiring an initial image of the target object, wherein the initial image is acquired by a camera when the single-point laser range finder is positioned at the origin of the laser three-dimensional coordinate system;
the second determining module is used for determining a three-dimensional coordinate point under the laser three-dimensional coordinate system corresponding to the two-dimensional coordinate point based on the two-dimensional coordinate point of the target object on the initial image;
the third determining module is used for determining a relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate points and the three-dimensional coordinate points;
the acquiring a plurality of three-dimensional control point coordinates under the laser three-dimensional coordinate system comprises the following steps:
acquiring the laser three-dimensional coordinate system;
controlling the single-point laser range finder to move under the laser three-dimensional coordinate system, and obtaining a measurement result of the single-point laser range finder to obtain the three-dimensional control point coordinate;
the controlling the single-point laser range finder to move under the laser three-dimensional coordinate system and obtaining the measurement result of the single-point laser range finder to obtain the three-dimensional control point coordinate comprises the following steps:
Controlling the single-point laser range finder to move on a motion track, recording two-dimensional movement information of the single-point laser range finder, and obtaining a first direction coordinate and a second direction coordinate, wherein the motion track corresponds to the first direction and the second direction of the laser three-dimensional coordinate system, and the first direction is perpendicular to the second direction;
controlling the single-point laser range finder to measure distance and record a measurement result to obtain a third direction coordinate, wherein the third direction is perpendicular to the first direction and the second direction;
obtaining the three-dimensional control point coordinate by using the first direction coordinate, the second direction coordinate and the third direction coordinate;
the obtaining the three-dimensional model coordinates of each three-dimensional control point under the model coordinate system of the target object comprises the following steps:
controlling the target object to rotate, and collecting control point images of the target object during each ranging, wherein the control point images are provided with laser spots;
based on the acquired control point images, carrying out three-dimensional modeling on the target object to obtain a three-dimensional model of the target object under the model coordinate system;
determining three-dimensional model coordinates of each three-dimensional control point under a model coordinate system of a target object based on the three-dimensional model;
The determining the relative transformation matrix of the camera relative to the single-point laser range finder according to the two-dimensional coordinate point and the three-dimensional coordinate point comprises the following steps:
acquiring an internal parameter of the camera;
and calculating the relative transformation matrix based on the internal parameters of the camera, the two-dimensional coordinate points and the three-dimensional coordinate points.
4. An electronic device, comprising:
a memory and a processor, the memory and the processor being communicatively coupled to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the method of calibrating a relative transformation matrix according to any of claims 1-2.
5. A computer readable storage medium storing computer instructions for causing a computer to perform the method of calibrating a relative transformation matrix according to any one of claims 1-2.
6. A calibration system for a relative transformation matrix, the calibration system comprising:
a vision fusion system, wherein the vision fusion system comprises a single-point laser range finder and a camera;
a motion track, wherein the motion track corresponds to a first direction and a second direction of a laser three-dimensional coordinate system, and the first direction is perpendicular to the second direction; and
the electronic device of claim 4, connected to the vision fusion system, for controlling the actions of the vision fusion system to determine a relative transformation matrix of the camera with respect to the single-point laser range finder.
CN202110099130.5A 2021-01-25 2021-01-25 Calibration method, device and system of relative transformation matrix Active CN112907727B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110099130.5A CN112907727B (en) 2021-01-25 2021-01-25 Calibration method, device and system of relative transformation matrix

Publications (2)

Publication Number Publication Date
CN112907727A CN112907727A (en) 2021-06-04
CN112907727B true CN112907727B (en) 2023-09-01

Family

ID=76120187

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110099130.5A Active CN112907727B (en) 2021-01-25 2021-01-25 Calibration method, device and system of relative transformation matrix

Country Status (1)

Country Link
CN (1) CN112907727B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538591B (en) * 2021-06-18 2024-03-12 深圳奥锐达科技有限公司 Calibration method and device for distance measuring device and camera fusion system
CN113538592B (en) * 2021-06-18 2023-10-27 深圳奥锐达科技有限公司 Calibration method and device for distance measuring device and camera fusion system
CN113720259A (en) * 2021-08-23 2021-11-30 河北鹰眼智能科技有限公司 Stereoscopic vision positioning method
CN113945204B (en) * 2021-10-26 2022-11-29 西北工业大学 Space point cloud measuring system and calibration and reconstruction method
CN114152201B (en) * 2021-11-04 2023-10-17 深圳橙子自动化有限公司 Laser altimeter calibration method and device, electronic equipment and storage medium
CN114758016B (en) * 2022-06-15 2022-09-13 超节点创新科技(深圳)有限公司 Camera equipment calibration method, electronic equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107726975A (en) * 2017-09-20 2018-02-23 大连理工大学 A kind of error analysis method of view-based access control model stitching measure
WO2020173052A1 (en) * 2019-02-28 2020-09-03 未艾医疗技术(深圳)有限公司 Three-dimensional image measurement method, electronic device, storage medium, and program product
CN110245599A (en) * 2019-06-10 2019-09-17 深圳市超准视觉科技有限公司 A kind of intelligent three-dimensional weld seam Auto-searching track method
CN111505606A (en) * 2020-04-14 2020-08-07 武汉大学 Method and device for calibrating relative pose of multi-camera and laser radar system
CN111862238A (en) * 2020-07-23 2020-10-30 中国民航大学 Full-space monocular light pen type vision measurement method
CN111998772A (en) * 2020-08-05 2020-11-27 浙江大学 Pixel-level target positioning method based on laser and monocular vision fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on the calibration method of the relative pose between a camera and a laser tracker; Fan Baixing; Yang Juqing; Zhou Weihu; Li Xiangyun; Engineering of Surveying and Mapping (No. 09); full text *

Also Published As

Publication number Publication date
CN112907727A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN112907727B (en) Calibration method, device and system of relative transformation matrix
CN112223302B (en) Rapid calibration method and device of live working robot based on multiple sensors
CN110842901B (en) Robot hand-eye calibration method and device based on novel three-dimensional calibration block
CN111862180B (en) Camera set pose acquisition method and device, storage medium and electronic equipment
WO2018201677A1 (en) Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
CN111678521B (en) Method and system for evaluating positioning accuracy of mobile robot
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN110419208B (en) Imaging system, imaging control method, image processing apparatus, and computer readable medium
CN112492292B (en) Intelligent visual 3D information acquisition equipment of free gesture
CN113870366B (en) Calibration method and calibration system of three-dimensional scanning system based on pose sensor
CN112254680B (en) Multi freedom's intelligent vision 3D information acquisition equipment
CN112229323A (en) Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method
CN112254675A (en) Space occupancy rate acquisition and judgment equipment and method containing moving object
KR101735325B1 (en) Apparatus for registration of cloud points
CN112254638B (en) Intelligent visual 3D information acquisition equipment that every single move was adjusted
CN112253913B (en) Intelligent visual 3D information acquisition equipment deviating from rotation center
JP2003006618A (en) Method and device for generating three-dimensional model and computer program
CN112257535B (en) Three-dimensional matching equipment and method for avoiding object
WO2018134866A1 (en) Camera calibration device
CN114926542A (en) Mixed reality fixed reference system calibration method based on optical positioning system
JP2004170277A (en) 3-dimensional measurement method, 3-dimensional measurement system, image processing apparatus, and computer program
CN117249764B (en) Vehicle body positioning method and device and electronic equipment
CN116233392B (en) Calibration method and device of virtual shooting system, electronic equipment and storage medium
CN115115684B (en) Calibration method, system, electronic device and computer readable storage medium
CN115222826B (en) Three-dimensional reconstruction method and device with changeable relative poses of structured light and camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant