CN114359401A - Calibration method, system and equipment - Google Patents

Calibration method, system and equipment

Info

Publication number: CN114359401A
Application number: CN202111536170.8A
Authority: CN (China)
Prior art keywords: calibration, camera, terminal, parameter, points
Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Other languages: Chinese (zh)
Inventors: 蒋成, 丁勇, 李云强, 陈颖, 罗苇
Current Assignee: Shenzhen Jimu Yida Science And Technology Co ltd (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original Assignee: Shenzhen Jimu Yida Science And Technology Co ltd
Application filed by Shenzhen Jimu Yida Science And Technology Co ltd
Priority to CN202111536170.8A
Publication of CN114359401A (pending legal status)

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a calibration method, system and device in which an auxiliary camera is added. Because both the frame and the resolution of the auxiliary camera are larger than those of the scanner camera, a wider field of view can be captured during acquisition, which smooths out the differences in the number of feature corner points extracted at different positions on the calibration plate image and reduces the number of calibration plate images that must be collected. At the same time, uniform and sufficient feature points can be extracted, so the calibration result of the projection device is more accurate, the number of calibration passes is reduced, and calibration efficiency is improved. Furthermore, during multi-camera joint calibration the multi-camera constraints are more stable, so the error in the adjustment (bundle adjustment) calculation is smaller and the camera calibration precision is higher. Likewise, the more feature points there are, the smaller the computational error during calibration and the higher the calibration precision.

Description

Calibration method, system and equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a calibration method, system and device.
Background
The primary task of computer vision is to recover information about objects in the real three-dimensional world from captured images, which makes it particularly important to establish a geometric model of how an object is mapped from the three-dimensional world onto the imaging plane of a camera.
A coded-structured-light 3D scanner mainly comprises a camera and a projection device. Its working principle is that structured light is projected from the projection device onto an object and captured by the sensor to form an image; the distance from each point of the object to the camera plane is then calculated from the distortion of the coding pattern and the internal and external parameters of the sensor. Therefore, the most critical part of the process is to obtain these internal and external parameters by calibrating the sensor and the projection device.
When a coded-structured-light 3D scanner is calibrated, common calibration methods include Zhang Zhengyou's planar calibration method, Tsai's two-step calibration method and the like. These methods need to capture many calibration images, are not efficient, and have relatively large calibration errors.
Disclosure of Invention
The invention aims to provide a calibration method, a calibration system and calibration equipment, which solve the problem of how to calibrate a coded-structured-light 3D scanner with high efficiency and high precision.
In order to achieve the above object, the present invention provides the following technical solutions:
In a first aspect, a calibration method is provided, where the method is applied to a calibration system, and the calibration system includes: a terminal, a 3D scanner and an auxiliary camera, the 3D scanner comprises a scanner camera and a projection device, the frame of the auxiliary camera is larger than that of the scanner camera, and the resolution of the auxiliary camera is greater than that of the scanner camera;
the method comprises the following steps:
a first acquisition step: the terminal acquires calibration plate images collected by the auxiliary camera and the scanner camera;
a first extraction step: the terminal extracts feature corner points from the calibration plate images;
a scanner camera calibration step: the terminal calculates a first internal parameter and a first external parameter according to the feature corner points, where the first internal parameter is an internal parameter of the scanner camera and the first external parameter is an external parameter of the scanner camera;
a second acquisition step: the terminal acquires projection images collected by the auxiliary camera and the scanner camera, where the projection images are images projected by the projection device onto a whiteboard;
a second extraction step: the terminal extracts feature points from the projection images;
a projection device calibration step: the terminal calculates a second internal parameter and a second external parameter according to the first internal parameter, the first external parameter and the feature points, where the second internal parameter is an internal parameter of the projection device and the second external parameter is an external parameter of the projection device;
an output step: the terminal outputs a calibration file according to the first internal parameter, the first external parameter, the second internal parameter and the second external parameter.
In a second aspect, a calibration system is provided. The calibration system comprises a terminal, a 3D scanner and an auxiliary camera; the 3D scanner comprises a scanner camera and a projection device, the frame of the auxiliary camera is larger than that of the scanner camera, and the resolution of the auxiliary camera is greater than that of the scanner camera;
the scanner camera is used for acquiring calibration plate images and projection images;
the auxiliary camera is used for acquiring calibration plate images and projection images;
the projection device is used for projecting the coded pattern onto the whiteboard;
the terminal is used for executing the following steps:
a first acquisition step: acquiring calibration plate images collected by the auxiliary camera and the scanner camera;
a first extraction step: extracting feature corner points from the calibration plate images;
a scanner camera calibration step: calculating a first internal parameter and a first external parameter according to the feature corner points, where the first internal parameter is an internal parameter of the scanner camera and the first external parameter is an external parameter of the scanner camera;
a second acquisition step: acquiring projection images collected by the auxiliary camera and the scanner camera, where the projection images are images projected by the projection device onto a whiteboard;
a second extraction step: extracting feature points from the projection images;
a projection device calibration step: calculating a second internal parameter and a second external parameter according to the first internal parameter, the first external parameter and the feature points, where the second internal parameter is an internal parameter of the projection device and the second external parameter is an external parameter of the projection device;
an output step: outputting a calibration file according to the first internal parameter, the first external parameter, the second internal parameter and the second external parameter.
In a third aspect, a computer device is provided, comprising:
a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of:
a first acquisition step: acquiring calibration plate images collected by the auxiliary camera and the scanner camera;
a first extraction step: extracting feature corner points from the calibration plate images;
a scanner camera calibration step: calculating a first internal parameter and a first external parameter according to the feature corner points, where the first internal parameter is an internal parameter of the scanner camera and the first external parameter is an external parameter of the scanner camera;
a second acquisition step: acquiring projection images collected by the auxiliary camera and the scanner camera, where the projection images are images projected by the projection device onto a whiteboard;
a second extraction step: extracting feature points from the projection images;
a projection device calibration step: calculating a second internal parameter and a second external parameter according to the first internal parameter, the first external parameter and the feature points, where the second internal parameter is an internal parameter of the projection device and the second external parameter is an external parameter of the projection device;
an output step: outputting a calibration file according to the first internal parameter, the first external parameter, the second internal parameter and the second external parameter.
According to the calibration method, system and device, the calibration plate images collected by the auxiliary camera and the scanner camera are acquired, the feature corner points of those images are extracted, and the first internal parameter and the first external parameter are then calculated from the corner points. An auxiliary camera is added; because both its frame and its resolution are larger than those of the scanner camera, a wider field of view can be captured during acquisition, which smooths out the differences in the number of feature corner points extracted at different positions on the calibration plate image, reduces the number of calibration plate images collected, and improves calibration efficiency. During multi-camera joint calibration, the multi-camera constraints are more stable, so the error after the adjustment calculation is smaller and the camera calibration precision is higher.
Next, the projection images collected by the auxiliary camera and the scanner camera are acquired, the feature points of those images are extracted, and the second internal parameter and the second external parameter are calculated from the first internal parameter, the first external parameter and the feature points. Because both the frame and the resolution of the auxiliary camera are larger than those of the scanner camera, a wider field of view can be captured and uniform, sufficient feature points can be extracted, so the calibration result of the projection device is more accurate, the number of calibration passes is reduced and calibration efficiency improves. Moreover, the more feature points there are, the smaller the computational error during calibration and the higher the calibration precision.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Wherein:
FIG. 1 is a flow chart of a calibration method in one embodiment;
FIG. 2 is a system diagram of a calibration system in one embodiment;
FIG. 3 is a schematic diagram of the camera frames in one embodiment;
FIG. 4 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It is noted that the terms "comprises," "comprising," and "having" and any variations thereof in the description and claims of this application and the drawings described above are intended to cover non-exclusive inclusions. For example, a process, method, terminal, product, or apparatus that comprises a list of steps or elements is not limited to the listed steps or elements but may alternatively include other steps or elements not listed or inherent to such process, method, product, or apparatus. In the claims, the description and the drawings of the specification of the present application, relational terms such as "first" and "second", and the like, may be used solely to distinguish one entity/action/object from another entity/action/object without necessarily requiring or implying any actual such relationship or order between such entities/actions/objects.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
An algorithm that builds a three-dimensional model from images involves the following stages: camera recovery → model recovery → stereo correspondence. Camera recovery refers to solving for the internal parameters and the external parameters of the camera used when the image was taken. A coded-structured-light 3D scanner mainly consists of a camera and a projection device; its working principle is that structured light is projected from the projection device onto an object and captured by the sensor to form an image, and the distance from each point of the object to the camera plane is then calculated from the distortion of the coding pattern and the internal and external parameters of the sensor. Therefore, the most critical part of the process is to obtain these internal and external parameters by calibrating the sensor and the projection device.
With traditional calibration methods, the feature corner points and feature points extracted from the calibration plate images and projection images collected by the camera are few and unevenly distributed, so more calibration plate images and projection images have to be collected. Efficiency is therefore low, the computed precision is not high enough, and calibration has to be repeated several times.
As shown in fig. 1, a calibration method is provided, which is applied to a calibration system. The calibration system includes a terminal 201, a 3D scanner and an auxiliary camera 202; the 3D scanner includes a scanner camera 203 and a projection device 204, the frame of the auxiliary camera is larger than that of the scanner camera, and the resolution of the auxiliary camera is greater than that of the scanner camera;
the method comprises the following steps:
a first acquisition step 101: the terminal 201 acquires calibration board images acquired by the auxiliary camera 202 and the scanner camera 203.
Before the auxiliary camera 202 and the scanner camera 203 collect calibration plate images, the equipment needs to be set up and adjusted. Specifically, the positions of the 3D scanner and the auxiliary camera 202 are fixed so that the auxiliary camera 202, the scanner camera 203 and the projection device 204 form a triangular arrangement, and the lens focal lengths of the auxiliary camera 202, the scanner camera 203 and the projection device 204 are adjusted so that the image at each focal length is sharpest. Once the equipment has been set up, the auxiliary camera 202 and the scanner camera 203 begin to capture calibration plate images. The calibration plate used by the auxiliary camera 202 and the scanner camera 203 can be made of glass onto which a customized pattern is printed; the auxiliary camera 202 and the scanner camera 203 photograph this flat plate carrying a pattern array with fixed spacing, and a geometric model of the auxiliary camera 202 and the scanner camera 203 can be obtained through a calibration algorithm, yielding high-precision measurement and reconstruction results. The flat plate carrying the fixed-spacing pattern array is the calibration plate. Within the depth of field of the auxiliary camera 202 and the scanner camera 203 (depth of field is the range of distances in front of and behind the subject within which the camera lens or other imager can form a sharp image; after focusing, a clear image is formed within a range in front of and behind the focal point, and that range of distances is the depth of field), images of the calibration plate are collected in different poses from near to far. Staying within the depth of field ensures that the collected images are sharp enough and that enough feature corner points can be extracted from every image collected by the auxiliary camera 202 and the scanner camera 203. After the auxiliary camera 202 and the scanner camera 203 have collected the calibration plate images, the terminal 201 acquires them. The terminal 201 may be a computer or a tablet.
A first extraction step 102: the terminal 201 extracts the feature corner points of the calibration plate images.
After the terminal 201 obtains the calibration plate images, it extracts the feature corner points in them. Feature corners are intersections between contours. For the same scene they remain fairly stable even when the viewing angle changes, and the pixels in the neighborhood of such a point show large changes in gradient direction or gradient magnitude.
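By way of illustration only, the corner extraction of step 102 could be realized with OpenCV as sketched below. The sketch assumes a chessboard-style calibration plate with a known number of inner corners (`pattern_size`), which is an assumption rather than something the application specifies.

```python
import cv2

# Minimal sketch of feature corner extraction, assuming a chessboard-style
# calibration plate; pattern_size is the (columns, rows) of inner corners.
def extract_corners(image_path, pattern_size=(11, 8)):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(img, pattern_size)
    if not found:
        return None
    # Refine to sub-pixel accuracy: the corners are gradient intersections,
    # so a small search window around each detection is enough.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(img, corners, (11, 11), (-1, -1), criteria)
    return corners.reshape(-1, 2)
```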
Scanner camera calibration step 103: the terminal 201 calculates a first internal parameter and a first external parameter according to the feature corner points, where the first internal parameter is an internal parameter of the scanner camera 203, and the first external parameter is an external parameter of the scanner camera 203.
As shown in fig. 3, 301 is the frame of the auxiliary camera 202 and 302 is the frame of the scanner camera 203; the frame of the auxiliary camera 202 is larger than that of the scanner camera 203. The frame refers to the area of the image sensor in the camera: the larger the frame, i.e. the larger the sensor area, the larger the imaging area, the more pixels there are and the finer the image; put simply, the better the image quality. The resolution of the auxiliary camera 202 is also greater than that of the scanner camera 203, so more feature corner points are extracted from the calibration plate images collected by the auxiliary camera 202 than from those collected by the scanner camera 203, and the corner points extracted from the auxiliary camera's images are more accurate and more evenly distributed. The terminal 201 calculates the first internal parameter and the first external parameter from the feature corner points as follows: the terminal 201 obtains the image coordinates of the feature corner points; the terminal 201 calculates the coordinates of the corresponding (homologous) points seen by the scanner camera 203 and the auxiliary camera 202 from these image coordinates; the terminal 201 establishes the mapping between the image coordinates of the feature corner points and the coordinates of the corresponding points; and the terminal 201 calculates the first internal parameter and the first external parameter from this mapping. The internal parameters are the focal length of the sensor lens, the offset of the optical center and the distortion; the external parameters comprise a rotation parameter and a translation parameter of the sensor and describe its relative pose. In camera calibration the calibration plate serves as a fixed reference coordinate system. It consists of regularly arranged known points whose horizontal and vertical spacings are known, so their three-dimensional coordinates are known; these points lie at the grid intersections on the calibration plate and appear as corner features in the image. The coordinates of the corresponding points of the scanner camera 203 and the auxiliary camera 202 are then computed from the image coordinates of the feature corner points by a stereo matching algorithm, the mapping between the two is established, and the internal and external parameters of the scanner camera 203 are calculated using standard stereo-matching theory. Adding the auxiliary camera 202, with its larger frame and higher resolution, to assist the calibration of the scanner camera 203 smooths out the differences in the number of feature corner points extracted at different positions on the calibration plate image, makes the extracted corner points more accurate, reduces the number of calibration plate images that need to be collected, and improves calibration efficiency.
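A minimal sketch of how this calibration step could look with OpenCV is given below. It assumes per-view arrays `obj_pts` (known 3D plate coordinates, z = 0 in the plate frame) and `pts_scan` / `pts_aux` (the matching corner image coordinates in the two cameras); these names, and the use of cv2.calibrateCamera plus cv2.stereoCalibrate, are illustrative choices, not the application's prescribed implementation.

```python
import cv2

# Sketch, under the assumptions stated above. All point arrays are float32.
def calibrate_scanner_camera(obj_pts, pts_scan, pts_aux, size_scan, size_aux):
    # Single-camera calibration: first internal parameters (camera matrix,
    # distortion) and per-view external parameters for each camera.
    err_s, K_s, dist_s, rvecs_s, tvecs_s = cv2.calibrateCamera(
        obj_pts, pts_scan, size_scan, None, None)
    err_a, K_a, dist_a, rvecs_a, tvecs_a = cv2.calibrateCamera(
        obj_pts, pts_aux, size_aux, None, None)

    # Joint two-camera calibration: the corresponding (homologous) points seen
    # by both cameras constrain the relative rotation R and translation T.
    err, K_s, dist_s, K_a, dist_a, R, T, E, F = cv2.stereoCalibrate(
        obj_pts, pts_scan, pts_aux, K_s, dist_s, K_a, dist_a,
        size_scan, flags=cv2.CALIB_FIX_INTRINSIC)
    # err_s, err_a and err are RMS reprojection errors and can feed the
    # qualification check described later.
    return K_s, dist_s, rvecs_s, tvecs_s, R, T
```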
After each camera has been calibrated individually, the cameras are calibrated jointly and an adjustment (bundle adjustment) calculation is performed: the adjustment iterates until the error converges, and the error decreases with each iteration. Therefore, when the auxiliary camera 202 with its larger frame is added to the multi-camera joint calibration, the multi-camera constraints are firmer, the error after the adjustment calculation is smaller, and the calibration precision of the scanner camera 203 is higher.
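The joint refinement can be illustrated with a small bundle-adjustment-style residual minimized with scipy's least_squares. The parameterization below (one relative pose plus one plate pose per view, intrinsics held fixed) and all variable names are assumptions made for the sketch.

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

# Residual for a two-camera joint refinement. x holds the relative pose of
# the auxiliary camera w.r.t. the scanner camera (rvec, tvec), followed by
# one plate pose (rvec, tvec) per view.
def residuals(x, obj_pts, pts_scan, pts_aux, K_s, dist_s, K_a, dist_a):
    r_rel, t_rel = x[:3], x[3:6].reshape(3, 1)
    R_rel, _ = cv2.Rodrigues(r_rel)
    out = []
    for i, obj in enumerate(obj_pts):
        rv = x[6 + 6 * i: 9 + 6 * i]
        tv = x[9 + 6 * i: 12 + 6 * i].reshape(3, 1)
        # Reprojection residual in the scanner camera.
        proj_s, _ = cv2.projectPoints(obj, rv, tv, K_s, dist_s)
        out.append((proj_s.reshape(-1, 2) - pts_scan[i]).ravel())
        # Compose the plate pose with the relative pose for the auxiliary camera.
        R_plate, _ = cv2.Rodrigues(rv)
        rv_a, _ = cv2.Rodrigues(R_rel @ R_plate)
        tv_a = R_rel @ tv + t_rel
        proj_a, _ = cv2.projectPoints(obj, rv_a, tv_a, K_a, dist_a)
        out.append((proj_a.reshape(-1, 2) - pts_aux[i]).ravel())
    return np.concatenate(out)

# result = least_squares(residuals, x0, method="lm",
#                        args=(obj_pts, pts_scan, pts_aux, K_s, dist_s, K_a, dist_a))
# The extra camera adds one more residual block per view, so the minimum is
# better constrained and the error converges to a smaller value.
```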
Second acquisition step 104: the terminal 201 acquires projection images acquired by the auxiliary camera 202 and the scanner camera 203, and the projection images are images projected on the whiteboard by the projection device 204.
The 3D scanner used in the present application is mainly a coded-structured-light 3D scanner, whose structure mainly consists of a camera and a projection device 204. The structured-light code is projected from the projection device 204 onto the object and captured by the sensor to form an image, and the distance between each point of the object and the camera plane is then calculated from the distortion of the code pattern and the internal and external parameters of the sensor. The auxiliary camera 202 and the scanner camera 203 capture the image that the projection device 204 projects onto the whiteboard; the whiteboard is used to present the pattern projected by the projector. During acquisition, projection patterns on the whiteboard must be collected in different poses from near to far within the depth of field of the cameras, and the terminal 201 then acquires the projection images collected by the auxiliary camera 202 and the scanner camera 203.
Second extraction step 105: the terminal 201 extracts the feature points of the projection image.
After the terminal 201 acquires the projection images collected by the auxiliary camera 202 and the scanner camera 203, it extracts the feature points of the projection images. A feature point is a point where the image gray value changes drastically, or a point of large curvature on an image edge (i.e. the intersection of two edges).
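As an illustration, a Shi-Tomasi detector matches this description of feature points; the actual detector would depend on the coded pattern projected by the projection device 204, which the application does not pin down here, so the sketch below is only an assumption.

```python
import cv2

# Hedged sketch of the second extraction step using Shi-Tomasi corners
# (points of strong gray-value change / edge intersections).
def extract_projection_features(image_path, max_points=2000):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    pts = cv2.goodFeaturesToTrack(img, maxCorners=max_points,
                                  qualityLevel=0.01, minDistance=5)
    if pts is None:
        return None
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    pts = cv2.cornerSubPix(img, pts, (5, 5), (-1, -1), criteria)
    return pts.reshape(-1, 2)
```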
Projector device calibration step 106: the terminal 201 calculates a second internal parameter and a second external parameter according to the first internal parameter, the first external parameter and the feature point, wherein the second internal parameter is an internal parameter of the projection apparatus 204, and the second external parameter is an external parameter of the projection apparatus 204.
The terminal 201 calculates the second internal parameter and the second external parameter from the first internal parameter, the first external parameter and the feature points as follows: the terminal 201 obtains the image coordinates of the feature points; the terminal 201 calculates a homography matrix from the first internal parameter, the first external parameter and the feature points; the terminal 201 calculates the three-dimensional coordinates of the feature points from the homography matrix; and the terminal 201 calculates the second internal parameter and the second external parameter from the image coordinates of the feature points and their three-dimensional coordinates. Since the first internal parameter and the first external parameter of the scanner camera 203 were obtained in the scanner camera calibration step 103 and the feature points were extracted in the second extraction step 105, the image coordinates of the feature points are known. A homography matrix can be solved from the first internal parameter, the first external parameter and the feature points with a random sample consensus (RANSAC) algorithm, the three-dimensional coordinates of the feature points are computed through the homography, and the internal and external parameters of the projection apparatus 204 are calculated from the feature point coordinates on the object plane and the three-dimensional coordinates of the corresponding space points. In RANSAC, the more feature points there are, the better the result and the smaller the error. Because both the frame and the resolution of the auxiliary camera 202 are greater than those of the scanner camera 203, more feature points are extracted from the projection images collected by the auxiliary camera 202 than from those collected by the scanner camera 203, so the calibration error of the projection device 204 is smaller and the calibration result more accurate. Extracting uniform and sufficient feature points also makes the calibration result of the projection device 204 more accurate, reduces the number of calibration passes and improves calibration efficiency.
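The sketch below illustrates this step with OpenCV. It follows the same idea but with one substitution: rather than lifting the feature points to 3D through the homography itself, it uses the RANSAC homography only to reject mismatches and recovers the 3D coordinates by triangulating between the two calibrated cameras. The input names (`pts_proj` for the projector-pattern coordinates assumed to come from decoding the coded pattern, `P_s`/`P_a` for the camera projection matrices built from the first internal/external parameters, `K_proj_guess` for a rough projector intrinsic guess) are assumptions made for the example.

```python
import cv2
import numpy as np

# Hypothetical sketch of the projection device calibration step.
def calibrate_projector(pts_scan, pts_aux, pts_proj, P_s, P_a,
                        proj_size, K_proj_guess):
    obj_pts, img_pts = [], []
    for s, a, p in zip(pts_scan, pts_aux, pts_proj):
        # RANSAC homography between the two camera views rejects mismatched
        # feature points (the whiteboard is planar, so inliers must obey it).
        H, mask = cv2.findHomography(s, a, cv2.RANSAC, 3.0)
        inl = mask.ravel().astype(bool)
        # Triangulate the surviving features into 3D with the calibrated pair.
        X_h = cv2.triangulatePoints(P_s, P_a, s[inl].T, a[inl].T)
        X = (X_h[:3] / X_h[3]).T.astype(np.float32)
        obj_pts.append(X)
        img_pts.append(p[inl].astype(np.float32))
    # Treat the projector as an inverse camera: 3D whiteboard points versus
    # their projector-pixel coordinates. A rough intrinsic guess is needed
    # because the 3D points are not expressed in a z = 0 plane.
    err, K_p, dist_p, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, proj_size, K_proj_guess, None,
        flags=cv2.CALIB_USE_INTRINSIC_GUESS)
    return K_p, dist_p, rvecs, tvecs
```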
An output step 107: the terminal 201 outputs a calibration file according to the first internal parameter, the first external parameter, the second internal parameter, and the second external parameter.
After the internal parameters and external parameters of the scanner camera 203 and the projection apparatus 204 have been calculated, they are written into the calibration file and the calibration is considered complete.
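For illustration, the calibration file could be written as a YAML file with OpenCV's FileStorage; the file name and key names below are invented for the sketch and are not taken from the application.

```python
import cv2

# Minimal sketch of the output step: write the four parameter sets to YAML.
def write_calibration(path, K_cam, dist_cam, R_cam, T_cam,
                      K_proj, dist_proj, R_proj, T_proj):
    fs = cv2.FileStorage(path, cv2.FILE_STORAGE_WRITE)
    fs.write("camera_matrix", K_cam)        # first internal parameter
    fs.write("camera_dist", dist_cam)
    fs.write("camera_R", R_cam)             # first external parameter
    fs.write("camera_T", T_cam)
    fs.write("projector_matrix", K_proj)    # second internal parameter
    fs.write("projector_dist", dist_proj)
    fs.write("projector_R", R_proj)         # second external parameter
    fs.write("projector_T", T_proj)
    fs.release()

# write_calibration("scanner_calibration.yaml", K_s, dist_s, R, T, K_p, dist_p, R_p, T_p)
```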
In this calibration method, the calibration plate images collected by the auxiliary camera 202 and the scanner camera 203 are acquired, the feature corner points of those images are extracted, and the first internal parameter and the first external parameter are calculated from the corner points. Because both the frame and the resolution of the auxiliary camera 202 are larger than those of the scanner camera 203, a wider field of view can be captured during acquisition, which smooths out the differences in the number of feature corner points extracted at different positions on the calibration plate image, reduces the number of calibration plate images collected, and improves calibration efficiency. During multi-camera joint calibration, the multi-camera constraints are more stable, so the error after the adjustment calculation is smaller and the camera calibration precision is higher.
Then, the projection images collected by the auxiliary camera 202 and the scanner camera 203 are acquired, the feature points of those images are extracted, and the second internal parameter and the second external parameter are calculated from the first internal parameter, the first external parameter and the feature points. Because both the frame and the resolution of the auxiliary camera 202 are larger than those of the scanner camera 203, a wider field of view can be captured and uniform, sufficient feature points can be extracted, so the calibration result of the projection device 204 is more accurate, the number of calibration passes is reduced and calibration efficiency improves. Moreover, the more feature points there are, the smaller the computational error during calibration and the higher the calibration precision.
In one embodiment, the terminal 201 calculating the first internal parameter and the first external parameter according to the feature corner points includes: the terminal 201 obtains the image coordinates of the feature corner points; the terminal 201 calculates the coordinates of the corresponding points of the scanner camera 203 and the auxiliary camera 202 from these image coordinates; the terminal 201 establishes the mapping between the image coordinates of the feature corner points and the coordinates of the corresponding points; and the terminal 201 calculates the first internal parameter and the first external parameter from this mapping.
Specifically, the internal parameters are the focal length of the sensor lens, the offset of the optical center and the distortion; the external parameters comprise a rotation parameter and a translation parameter of the sensor and describe its relative pose. In camera calibration the calibration plate serves as a fixed reference coordinate system. It consists of regularly arranged known points whose horizontal and vertical spacings are known, so their three-dimensional coordinates are known; these points lie at the grid intersections on the calibration plate and appear as corner features in the image. The coordinates of the corresponding points of the scanner camera 203 and the auxiliary camera 202 are then computed from the image coordinates of the feature corner points by a stereo matching algorithm, the mapping between the two is established, and the internal and external parameters of the scanner camera 203 are calculated using standard stereo-matching theory. Adding the auxiliary camera 202, with its larger frame and higher resolution, to assist the calibration of the scanner camera 203 smooths out the differences in the number of feature corner points extracted at different positions on the calibration plate image, makes the extracted corner points more accurate, reduces the number of calibration plate images collected, and improves calibration efficiency.
In an embodiment, after the terminal 201 calculates the first internal parameter and the first external parameter from the coordinates of the corresponding points, the method further includes: the terminal 201 calculates a first calibration error of the scanner camera 203; the terminal 201 determines whether the first calibration error is lower than a first preset value; if so, the calibration is considered qualified and the method jumps to the second acquisition step; if not, the calibration is considered unqualified and the method jumps back to the first acquisition step 101.
After the terminal 201 has calibrated the scanner camera 203, it must also check whether the calibration result is accurate. The first calibration error is computed by back-projecting the collected two-dimensional calibration plate images to obtain the coordinates of the corresponding three-dimensional points; since the three-dimensional coordinates of the points on the calibration plate are known, the error is the difference between the back-projected three-dimensional coordinates and the known three-dimensional coordinates of the plate points. This error is then compared with the first preset value, which can be chosen freely, for example 0.1. If the error is smaller than the first preset value, the calibration is considered qualified; otherwise it is considered unqualified. By establishing a quantitative evaluation standard, the terminal verifies whether the calibration result is qualified and proceeds to the next step according to the judgment, which ensures that the calibration result is qualified.
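A sketch of this check is given below: each detected corner is back-projected onto the calibration plate plane using the estimated pose and compared with its known plate coordinates, and the mean error is tested against the first preset value (0.1 in the example above). The function and variable names are assumptions.

```python
import cv2
import numpy as np

# Back-projection error: intersect each viewing ray with the plate plane
# (z = 0 in the plate frame) and compare with the known plate coordinates.
def first_calibration_error(obj_pts, img_pts, rvecs, tvecs, K, dist):
    errs = []
    for obj, img, rv, tv in zip(obj_pts, img_pts, rvecs, tvecs):
        R, _ = cv2.Rodrigues(rv)
        t = np.asarray(tv, dtype=float).reshape(3)
        n = R[:, 2]                               # plate normal in camera frame
        rays = cv2.undistortPoints(img.reshape(-1, 1, 2), K, dist).reshape(-1, 2)
        for (x, y), X_known in zip(rays, obj):
            ray = np.array([x, y, 1.0])
            s = n.dot(t) / n.dot(ray)             # ray/plane intersection depth
            X_cam = s * ray
            X_plate = R.T @ (X_cam - t)           # back into the plate frame
            errs.append(np.linalg.norm(X_plate - X_known))
    return float(np.mean(errs))

# qualified = first_calibration_error(...) < 0.1  # then jump to step 104
```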
In one embodiment, the terminal 201 calculating the first calibration error of the scanner camera 203 includes: the terminal 201 obtains the first three-dimensional coordinates of the feature corner points; the terminal 201 calculates the second three-dimensional coordinates of the feature corner points from their image coordinates; and the terminal 201 calculates the error between the first three-dimensional coordinates and the second three-dimensional coordinates.
That is, the first calibration error is obtained by back-projecting the collected two-dimensional calibration plate images to compute the coordinates of the corresponding three-dimensional points.
In one embodiment, the terminal 201 calculating the second internal parameter and the second external parameter according to the first internal parameter, the first external parameter and the feature points includes: the terminal 201 obtains the image coordinates of the feature points; the terminal 201 calculates a homography matrix from the first internal parameter, the first external parameter and the feature points; the terminal 201 calculates the three-dimensional coordinates of the feature points from the homography matrix; and the terminal 201 calculates the second internal parameter and the second external parameter from the image coordinates and the three-dimensional coordinates of the feature points.
Specifically, since the first internal parameter and the first external parameter of the scanner camera 203 were obtained in the scanner camera calibration step 103 and the feature points were extracted in the second extraction step 105, the image coordinates of the feature points are known. A homography matrix can be solved from the first internal parameter, the first external parameter and the feature points with the random sample consensus (RANSAC) algorithm, the three-dimensional coordinates of the feature points are computed through the homography, and the internal and external parameters of the projection apparatus 204 are calculated from the feature point coordinates on the object plane and the three-dimensional coordinates of the corresponding space points. In RANSAC, the more feature points there are, the better the result and the smaller the error. Because both the frame and the resolution of the auxiliary camera 202 are greater than those of the scanner camera 203, more feature points are extracted from the projection images collected by the auxiliary camera 202 than from those collected by the scanner camera 203, so the calibration error of the projection device 204 is smaller and the calibration result more accurate. Extracting uniform and sufficient feature points also makes the calibration result of the projection device 204 more accurate, reduces the number of calibration passes and improves calibration efficiency.
In one embodiment, after the terminal 201 calculates the second internal parameter and the second external parameter from the first internal parameter, the first external parameter and the feature points, the method further includes:
the terminal 201 calculates a second calibration error of the projection device 204; the terminal 201 determines whether the second calibration error is lower than a second preset value; if so, the calibration is considered qualified and the method jumps to the output step 107; if not, the calibration is considered unqualified and the method jumps back to the second acquisition step.
After the terminal 201 has calibrated the projection apparatus 204, it must also check whether the calibration result is accurate. The coordinates of the three-dimensional points corresponding to the collected two-dimensional projection images are computed by back-projection; since the three-dimensional coordinates of the points on the whiteboard are known, the error is the difference between the back-projected three-dimensional coordinates and the known three-dimensional coordinates of the whiteboard points. This error is compared with the second preset value, which can be chosen freely, for example 0.1. If the error is smaller than the second preset value, the calibration is considered qualified; otherwise it is considered unqualified. By establishing a quantitative evaluation standard, the terminal verifies whether the calibration result is qualified and proceeds to the next step according to the judgment, which ensures that the calibration result is qualified.
In one embodiment, after the outputting step 107, the method further includes: a judgment step 108: the terminal 201 establishes a quantitative evaluation standard, and judges whether the calibration file is qualified according to the quantitative evaluation standard.
After the terminal 201 outputs the calibration file, it must also check whether the calibration file is accurate. Specifically: the terminal 201 obtains the three-dimensional space points corresponding to the feature points; the terminal 201 constructs a triangular mesh from those three-dimensional space points; the terminal 201 obtains the number of complete triangles in the mesh; and the terminal 201 determines whether this completeness of the triangular mesh is greater than a third preset value. The third preset value can be chosen freely, for example 10. By establishing this quantitative evaluation standard, the terminal verifies whether the calibration file is qualified, ensuring that the calibration result is qualified.
In one embodiment, the quantitative evaluation standard includes: the terminal 201 obtains the three-dimensional space points corresponding to the feature points; the terminal 201 constructs a triangular mesh from those points; the terminal 201 obtains the number of complete triangles in the mesh; and the terminal 201 determines whether this completeness of the triangular mesh is greater than a third preset value.
The third preset value can be chosen freely, for example 10. By establishing this quantitative evaluation standard, the terminal verifies whether the calibration file is qualified, ensuring that the calibration result is qualified.
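One possible reading of this standard is sketched below: triangulate the reconstructed three-dimensional space points (which lie close to a plane on the whiteboard) with a Delaunay triangulation and count the complete triangles. The interpretation of "completeness" as the triangle count, and the projection onto the dominant plane, are assumptions of the sketch.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hedged sketch of the quantitative evaluation standard; third_preset = 10
# follows the example value given in the text.
def calibration_file_qualified(points_3d, third_preset=10):
    pts = np.asarray(points_3d, dtype=float)
    # Project onto the two principal directions of the (nearly planar) point set.
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    plane_coords = centered @ vt[:2].T
    tri = Delaunay(plane_coords)
    complete_triangles = len(tri.simplices)
    return complete_triangles > third_preset
```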
As shown in fig. 2, a calibration system is proposed. The calibration system comprises a terminal 201, a 3D scanner and an auxiliary camera 202; the 3D scanner comprises a scanner camera 203 and a projection device 204, the frame of the auxiliary camera 202 is larger than that of the scanner camera 203, and the resolution of the auxiliary camera 202 is greater than that of the scanner camera 203;
the scanner camera 203 is used for acquiring calibration plate images and projection images;
the auxiliary camera 202 is used for acquiring calibration plate images and projection images;
the projection device 204 is used for projecting the code pattern onto the whiteboard;
the terminal 201 is configured to perform the following steps:
a first acquisition step 101: acquiring the calibration plate images collected by the auxiliary camera 202 and the scanner camera 203;
a first extraction step 102: extracting feature corner points from the calibration plate images;
a scanner camera calibration step 103: calculating a first internal parameter and a first external parameter according to the feature corner points, where the first internal parameter is an internal parameter of the scanner camera 203 and the first external parameter is an external parameter of the scanner camera 203;
a second acquisition step 104: acquiring the projection images collected by the auxiliary camera 202 and the scanner camera 203, where the projection images are images projected by the projection device 204 onto a whiteboard;
a second extraction step 105: extracting feature points from the projection images;
a projection device calibration step 106: calculating a second internal parameter and a second external parameter according to the first internal parameter, the first external parameter and the feature points, where the second internal parameter is an internal parameter of the projection device 204 and the second external parameter is an external parameter of the projection device 204;
an output step 107: and outputting a calibration file according to the first internal parameter, the first external parameter, the second internal parameter and the second external parameter.
In the calibration system, the calibration plate images collected by the auxiliary camera 202 and the scanner camera 203 are acquired, the feature corner points of those images are extracted, and the first internal parameter and the first external parameter are calculated from the corner points. Because both the frame and the resolution of the auxiliary camera 202 are larger than those of the scanner camera 203, a wider field of view can be captured during acquisition, which smooths out the differences in the number of feature corner points extracted at different positions on the calibration plate image, reduces the number of calibration plate images collected, and improves calibration efficiency. During multi-camera joint calibration, the multi-camera constraints are more stable, so the error after the adjustment calculation is smaller and the camera calibration precision is higher.
Then, the projection images collected by the auxiliary camera 202 and the scanner camera 203 are acquired, the feature points of those images are extracted, and the second internal parameter and the second external parameter are calculated from the first internal parameter, the first external parameter and the feature points. Because both the frame and the resolution of the auxiliary camera 202 are larger than those of the scanner camera 203, a wider field of view can be captured and uniform, sufficient feature points can be extracted, so the calibration result of the projection device 204 is more accurate, the number of calibration passes is reduced and calibration efficiency improves. Moreover, the more feature points there are, the smaller the computational error during calibration and the higher the calibration precision.
In one embodiment, the terminal 201 calculating the first internal parameter and the first external parameter according to the feature corner points includes: the terminal 201 obtains the image coordinates of the feature corner points; the terminal 201 calculates the coordinates of the corresponding points of the scanner camera 203 and the auxiliary camera 202 from these image coordinates; the terminal 201 establishes the mapping between the image coordinates of the feature corner points and the coordinates of the corresponding points; and the terminal 201 calculates the first internal parameter and the first external parameter from this mapping.
In an embodiment, after the terminal 201 calculates the first internal parameter and the first external parameter from the coordinates of the corresponding points, the method further includes: the terminal 201 calculates a first calibration error of the scanner camera 203; the terminal 201 determines whether the first calibration error is lower than a first preset value; if so, the calibration is considered qualified and the process jumps to the second acquisition step; if not, the calibration is considered unqualified and the process jumps back to the first acquisition step 101.
In one embodiment, the terminal 201 calculating the first calibration error of the scanner camera 203 includes: the terminal 201 obtains the first three-dimensional coordinates of the feature corner points; the terminal 201 calculates the second three-dimensional coordinates of the feature corner points from their image coordinates; and the terminal 201 calculates the error between the first and second three-dimensional coordinates.
In one embodiment, the terminal 201 calculating the second internal parameter and the second external parameter according to the first internal parameter, the first external parameter and the feature points includes: the terminal 201 obtains the image coordinates of the feature points; the terminal 201 calculates a homography matrix from the first internal parameter, the first external parameter and the feature points; the terminal 201 calculates the three-dimensional coordinates of the feature points from the homography matrix; and the terminal 201 calculates the second internal parameter and the second external parameter from the image coordinates and the three-dimensional coordinates of the feature points.
In one embodiment, after the terminal 201 calculates the second internal parameter and the second external parameter from the first internal parameter, the first external parameter and the feature points, the method further includes: the terminal 201 calculates a second calibration error of the projection device 204; the terminal 201 determines whether the second calibration error is lower than a second preset value; if so, the calibration is considered qualified and the process jumps to the output step 107; if not, the calibration is considered unqualified and the process jumps back to the second acquisition step.
In one embodiment, after the output step 107, the method further includes a judgment step 108: the terminal 201 establishes a quantitative evaluation standard and judges whether the calibration file is qualified according to that standard.
In one embodiment, the quantitative evaluation standard includes: the terminal 201 obtains the three-dimensional space points corresponding to the feature points; the terminal 201 constructs a triangular mesh from those points; the terminal 201 obtains the number of complete triangles in the mesh; and the terminal 201 determines whether this completeness of the triangular mesh is greater than a third preset value.
As shown in fig. 4, the computer device includes a processor, a memory and a network interface connected by a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program which, when executed by the processor, enables the processor to implement the calibration method described above. The internal memory may also store a computer program which, when executed by the processor, causes the processor to perform the calibration method described above. Those skilled in the art will appreciate that the configuration shown in fig. 4 is a block diagram of only a portion of the configuration related to the present application and does not limit the devices to which the present application applies; a particular device may include more or fewer components than shown, combine certain components, or arrange the components differently.
In an embodiment, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, causes the processor to carry out the steps of the calibration method described above.
It is understood that the above calibration method, apparatus and storage medium belong to one general inventive concept, and the embodiments are applicable to each other.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A calibration method applied to a calibration system, characterized in that the calibration system comprises: a terminal, a 3D scanner and an auxiliary camera, the 3D scanner comprises a scanner camera and a projection device, the frame of the auxiliary camera is larger than that of the scanner camera, and the resolution of the auxiliary camera is greater than that of the scanner camera;
the method comprises the following steps:
a first acquisition step: the terminal acquires calibration plate images acquired by the auxiliary camera and the scanner camera;
a first extraction step: extracting characteristic angular points of the calibration plate image by the terminal;
calibrating a scanner camera: the terminal calculates a first internal parameter and a first external parameter according to the characteristic corner points, wherein the first internal parameter is an internal parameter of a scanner camera, and the first external parameter is an external parameter of the scanner camera;
a second acquisition step: the terminal acquires projection images acquired by the auxiliary camera and the scanner camera, wherein the projection images are images projected by the projection device onto a whiteboard;
a second extraction step: the terminal extracts feature points from the projection images;
calibrating the projection device: the terminal calculates a second internal parameter and a second external parameter according to the first internal parameter, the first external parameter and the feature points, wherein the second internal parameter is an internal parameter of the projection device, and the second external parameter is an external parameter of the projection device;
an output step: the terminal outputs a calibration file according to the first internal parameter, the first external parameter, the second internal parameter and the second external parameter.
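The first acquisition and extraction steps can be illustrated with a minimal Python/OpenCV sketch. The board geometry (BOARD_SIZE, SQUARE_SIZE), the file-path interface, and the use of a chessboard-style calibration plate are illustrative assumptions rather than requirements of the claim; any detector that returns sub-pixel corner coordinates would play the same role.

```python
import cv2
import numpy as np

BOARD_SIZE = (11, 8)   # inner corners per row/column (assumed board layout)
SQUARE_SIZE = 10.0     # assumed corner spacing of the calibration plate, in mm

def extract_corners(image_path, board_size=BOARD_SIZE):
    """Detect and sub-pixel-refine the characteristic corner points of one calibration plate image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(img, board_size)
    if not found:
        return None
    corners = cv2.cornerSubPix(
        img, corners, winSize=(5, 5), zeroZone=(-1, -1),
        criteria=(cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    return corners.reshape(-1, 2)

# Known 3D coordinates of the plate corners (planar plate, Z = 0).
board_points = np.zeros((BOARD_SIZE[0] * BOARD_SIZE[1], 3), np.float32)
board_points[:, :2] = np.mgrid[0:BOARD_SIZE[0], 0:BOARD_SIZE[1]].T.reshape(-1, 2) * SQUARE_SIZE
```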
2. The calibration method according to claim 1, wherein the calculating, by the terminal, of the first internal parameter and the first external parameter according to the characteristic corner points comprises:
the terminal acquires the image coordinates of the characteristic corner points;
the terminal calculates the coordinates of corresponding points between the scanner camera and the auxiliary camera according to the image coordinates;
the terminal establishes a mapping relationship between the image coordinates of the characteristic corner points and the coordinates of the corresponding points;
and the terminal calculates the first internal parameter and the first external parameter according to the mapping relationship.
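One way to realize the mapping and calibration of claim 2 is to calibrate each camera from its corner/plate correspondences and then recover the pose between the scanner camera and the auxiliary camera from the corner points they observe in common. A sketch under that assumption, reusing `extract_corners` and `board_points` from the previous example; `image_pairs` and both image sizes are assumed placeholders:

```python
# Calibrate each camera from the corner/plate mapping, then recover the relative
# pose between the two cameras from the corner points they both observe.
obj_pts, scan_pts, aux_pts = [], [], []
for scan_path, aux_path in image_pairs:        # image_pairs: assumed list of image-path pairs
    c_scan = extract_corners(scan_path)
    c_aux = extract_corners(aux_path)
    if c_scan is None or c_aux is None:
        continue
    obj_pts.append(board_points)
    scan_pts.append(c_scan)
    aux_pts.append(c_aux)

scan_size = (1280, 1024)                       # assumed scanner-camera resolution
aux_size = (4000, 3000)                        # assumed (larger) auxiliary-camera resolution

# First internal parameter: camera matrix and distortion of the scanner camera
# (the auxiliary camera is calibrated the same way).
rms_scan, K_scan, dist_scan, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, scan_pts, scan_size, None, None)
rms_aux, K_aux, dist_aux, _, _ = cv2.calibrateCamera(
    obj_pts, aux_pts, aux_size, None, None)

# First external parameter: rotation R and translation T between the scanner camera
# and the auxiliary camera, estimated from the corresponding corner points.
ret, K_scan, dist_scan, K_aux, dist_aux, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, scan_pts, aux_pts, K_scan, dist_scan, K_aux, dist_aux, scan_size,
    flags=cv2.CALIB_FIX_INTRINSIC)
```

Fixing the per-camera intrinsics during the stereo step (CALIB_FIX_INTRINSIC) is one reasonable choice; joint refinement of all parameters is equally possible.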
3. The calibration method according to claim 2, wherein after the terminal calculates the first internal parameter and the first external parameter according to the coordinates of the corresponding points, the method further comprises:
the terminal calculates a first calibration error of the scanner camera;
the terminal judges whether the first calibration error is lower than a first preset value;
if yes, the calibration is deemed qualified, and the method proceeds to the second acquisition step;
if not, the calibration is deemed unqualified, and the method returns to the first acquisition step.
4. The calibration method according to claim 3, wherein the calculating, by the terminal, of the first calibration error of the scanner camera comprises:
the terminal acquires first three-dimensional coordinates of the characteristic corner points;
the terminal calculates second three-dimensional coordinates of the characteristic corner points according to the image coordinates of the characteristic corner points;
and the terminal calculates the error between the first three-dimensional coordinates and the second three-dimensional coordinates.
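Claims 3 and 4 describe an accept/retry loop driven by a three-dimensional error. A minimal sketch under the assumptions of the previous example, taking the known plate corners as the first three-dimensional coordinates and the stereo-triangulated points as the second; the image points are assumed to be already undistorted and the threshold value is illustrative:

```python
# First calibration error: compare the known plate corners (first 3D coordinates)
# with the corners triangulated from the two cameras (second 3D coordinates).
def projection_matrices(K1, K2, R, T):
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])   # scanner camera as reference frame
    P2 = K2 @ np.hstack([R, T.reshape(3, 1)])             # auxiliary camera
    return P1, P2

def first_calibration_error(P1, P2, scan_xy, aux_xy, rvec, tvec):
    # Second 3D coordinates: triangulated from the image coordinates of one view
    # (points assumed undistorted for brevity).
    pts4d = cv2.triangulatePoints(P1, P2, scan_xy.T, aux_xy.T)
    pts3d = (pts4d[:3] / pts4d[3]).T
    # First 3D coordinates: the known plate corners transformed into camera space.
    R_board, _ = cv2.Rodrigues(rvec)
    board_cam = board_points @ R_board.T + tvec.reshape(1, 3)
    return float(np.linalg.norm(pts3d - board_cam, axis=1).mean())

FIRST_PRESET = 0.1   # assumed acceptance threshold, in the same unit as SQUARE_SIZE
P1, P2 = projection_matrices(K_scan, K_aux, R, T)
err = first_calibration_error(P1, P2, scan_pts[0], aux_pts[0], rvecs[0], tvecs[0])
calibration_qualified = err < FIRST_PRESET   # otherwise re-acquire calibration plate images
```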
5. The calibration method according to claim 1, wherein the calculating, by the terminal, of the second internal parameter and the second external parameter according to the first internal parameter, the first external parameter and the feature points comprises:
the terminal acquires the image coordinates of the feature points;
the terminal calculates a homography matrix according to the first internal parameter, the first external parameter and the feature points;
the terminal calculates the three-dimensional coordinates of the feature points according to the homography matrix;
and the terminal calculates the second internal parameter and the second external parameter according to the image coordinates of the feature points and the three-dimensional coordinates of the feature points.
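One common way to realize claim 5 is to treat the projector as an inverse camera: build the plane-induced homography from the first internal and external parameters, lift the detected feature points onto the whiteboard plane to obtain their three-dimensional coordinates, and then calibrate the projector with the same routine used for the cameras. In the sketch below, `projector_views` (detected camera coordinates, the known projector-pixel coordinates of the projected features, and the per-pose rvec/tvec) and the projector resolution are assumed placeholders:

```python
def image_to_plane_homography(K, rvec, tvec):
    # Plane-induced homography built from the first internal/external parameters
    # (whiteboard assumed to be the Z = 0 plane of the board frame); inverted so
    # that it maps camera image coordinates onto the plane.
    R_, _ = cv2.Rodrigues(rvec)
    H_plane_to_img = K @ np.column_stack([R_[:, 0], R_[:, 1], tvec.reshape(3)])
    return np.linalg.inv(H_plane_to_img)

obj_pts_proj, img_pts_proj = [], []
for cam_xy, proj_xy, rvec_i, tvec_i in projector_views:   # assumed per-pose data
    H = image_to_plane_homography(K_scan, rvec_i, tvec_i)
    plane_xy = cv2.perspectiveTransform(
        cam_xy.reshape(-1, 1, 2).astype(np.float32), H).reshape(-1, 2)
    # 3D coordinates of the projected feature points: they lie on the plane (Z = 0).
    obj_pts_proj.append(np.hstack([plane_xy, np.zeros((len(plane_xy), 1), np.float32)]))
    img_pts_proj.append(proj_xy.astype(np.float32))

proj_size = (1920, 1080)                                   # assumed projector resolution
# Second internal/external parameters: calibrate the projector exactly like a camera,
# using the projector-pixel coordinates as its "image points".
rms_p, K_proj, dist_proj, rvecs_p, tvecs_p = cv2.calibrateCamera(
    obj_pts_proj, img_pts_proj, proj_size, None, None)
```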
6. The calibration method according to claim 5, wherein after the terminal calculates the second internal parameter and the second external parameter according to the first internal parameter, the first external parameter and the feature points, the method further comprises:
the terminal calculates a second calibration error of the projection device;
the terminal judges whether the second calibration error is lower than a second preset value;
if yes, the calibration is deemed qualified, and the method proceeds to the output step;
if not, the calibration is deemed unqualified, and the method returns to the second acquisition step.
7. The calibration method according to claim 1, further comprising, after the output step:
a judging step: the terminal establishes a quantitative evaluation criterion and judges, according to the quantitative evaluation criterion, whether the calibration file is qualified.
8. The calibration method according to claim 7, wherein the quantitative evaluation criterion comprises:
the terminal acquires the three-dimensional space points corresponding to the feature points;
the terminal constructs a triangular mesh according to the three-dimensional space points;
the terminal acquires the number of complete triangles in the triangular mesh;
and the terminal judges whether the number of complete triangles is greater than a third preset value.
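Claims 7 and 8 can be illustrated by triangulating the reconstructed three-dimensional points and counting how many triangles pass a simple completeness test. Interpreting "complete" as triangles whose edges stay below a length threshold is an assumption of this sketch, as are `reconstructed_points`, the edge threshold, and the third preset value:

```python
from scipy.spatial import Delaunay

def count_complete_triangles(points_3d, max_edge=5.0):
    # Triangulate the XY projection of the reconstructed points and keep only
    # triangles whose three 3D edges are all shorter than max_edge.
    tri = Delaunay(points_3d[:, :2])
    complete = 0
    for a, b, c in tri.simplices:
        edges = (np.linalg.norm(points_3d[a] - points_3d[b]),
                 np.linalg.norm(points_3d[b] - points_3d[c]),
                 np.linalg.norm(points_3d[c] - points_3d[a]))
        complete += all(e < max_edge for e in edges)
    return complete

THIRD_PRESET = 500   # assumed minimum number of complete triangles
calibration_file_qualified = count_complete_triangles(reconstructed_points) > THIRD_PRESET
```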
9. A calibration system, characterized in that the calibration system comprises a terminal, an auxiliary camera and a 3D scanner, wherein the 3D scanner comprises a scanner camera and a projection device, the frame of the auxiliary camera is larger than that of the scanner camera, and the resolution of the auxiliary camera is greater than that of the scanner camera;
the scanner camera is used for acquiring calibration plate images and projection images;
the auxiliary camera is used for acquiring calibration plate images and projection images;
the projection device is used for projecting a coded pattern onto the whiteboard;
the terminal is used for executing the following steps:
a first acquisition step: acquiring calibration plate images acquired by the auxiliary camera and the scanner camera;
a first extraction step: extracting characteristic corner points from the calibration plate images;
calibrating a scanner camera: calculating a first internal parameter and a first external parameter according to the characteristic corner points, wherein the first internal parameter is an internal parameter of the scanner camera, and the first external parameter is an external parameter of the scanner camera;
a second acquisition step: acquiring projection images acquired by the auxiliary camera and the scanner camera, wherein the projection images are images projected by the projection device onto the whiteboard;
a second extraction step: extracting feature points from the projection images;
calibrating the projection device: calculating a second internal parameter and a second external parameter according to the first internal parameter, the first external parameter and the feature points, wherein the second internal parameter is an internal parameter of the projection device, and the second external parameter is an external parameter of the projection device;
an output step: and outputting a calibration file according to the first internal parameter, the first external parameter, the second internal parameter and the second external parameter.
10. A computer device, comprising a memory and a processor, wherein the memory stores a computer program which, when executed by the processor, causes the processor to perform the steps of the calibration method according to any one of claims 1 to 8.
CN202111536170.8A 2021-12-15 2021-12-15 Calibration method, system and equipment Pending CN114359401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111536170.8A CN114359401A (en) 2021-12-15 2021-12-15 Calibration method, system and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111536170.8A CN114359401A (en) 2021-12-15 2021-12-15 Calibration method, system and equipment

Publications (1)

Publication Number Publication Date
CN114359401A true CN114359401A (en) 2022-04-15

Family

ID=81099325

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111536170.8A Pending CN114359401A (en) 2021-12-15 2021-12-15 Calibration method, system and equipment

Country Status (1)

Country Link
CN (1) CN114359401A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115526941A (en) * 2022-11-25 2022-12-27 海伯森技术(深圳)有限公司 Calibration device and calibration method for telecentric camera

Similar Documents

Publication Publication Date Title
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
CN109737874B (en) Object size measuring method and device based on three-dimensional vision technology
US11985293B2 (en) System and methods for calibration of an array camera
CN110717942B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN111091063B (en) Living body detection method, device and system
CN109510948B (en) Exposure adjusting method, exposure adjusting device, computer equipment and storage medium
CN106683070B (en) Height measuring method and device based on depth camera
US9886759B2 (en) Method and system for three-dimensional data acquisition
JP6363863B2 (en) Information processing apparatus and information processing method
CN109360246B (en) Stereoscopic vision three-dimensional displacement measurement method based on synchronous subarea search
CN111145271B (en) Method and device for determining accuracy of camera parameters, storage medium and terminal
WO2019232793A1 (en) Two-camera calibration method, electronic device and computer-readable storage medium
CN112257713A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN112753047B (en) Method and system for in-loop calibration and target point setting of hardware of camera and related equipment
CN114359401A (en) Calibration method, system and equipment
CN116222425A (en) Three-dimensional reconstruction method and system based on multi-view three-dimensional scanning device
CN111445513B (en) Plant canopy volume acquisition method and device based on depth image, computer equipment and storage medium
CN109658459B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN116777769A (en) Method and device for correcting distorted image, electronic equipment and storage medium
JP2007122328A (en) Distortion aberration correction device and distortion aberration correction method
CN115719384A (en) Imaging method, device, system and storage medium of three-dimensional imaging system
CN108827157B (en) Laser measurement verification method, device, system, equipment and storage medium
JP2006023133A (en) Instrument and method for measuring three-dimensional shape
CN117422650B (en) Panoramic image distortion correction method and device, electronic equipment and medium
Sun et al. Blind calibration for focused plenoptic cameras

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination