NL2031065B1 - Method for verifying precision of calibration parameter, medium and electronic device - Google Patents

Method for verifying precision of calibration parameter, medium and electronic device

Info

Publication number
NL2031065B1
NL2031065B1 (application NL2031065A)
Authority
NL
Netherlands
Prior art keywords
points
point
reference points
reconstruction
matching
Prior art date
Application number
NL2031065A
Other languages
Dutch (nl)
Other versions
NL2031065A (en)
Inventor
Zheng Guoyan
Zhao Yuyun
Wu Junyang
Original Assignee
Univ Shanghai Jiaotong
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Shanghai Jiaotong
Publication of NL2031065A
Application granted
Publication of NL2031065B1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/344Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75Determining position or orientation of objects or cameras using feature-based methods involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10116X-ray image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure provides a method for verifying precision of a calibration parameter, a medium and an electronic device. The method includes: obtaining a first image and a second image of a mold; obtaining, according to pixel coordinates of first reference points in the first image, coordinates of the first reference points in a coordinate system of a first camera as first coordinates; obtaining, according to pixel coordinates of second reference points in the second image, coordinates of the second reference points in a coordinate system of a second camera as second coordinates; transforming the first reference points and the second reference points into a same coordinate system according to a to-be-verified calibration parameter; performing matching according to a relative position relationship between the first reference points and the second reference points, to obtain matching point pairs; obtaining three-dimensional reconstruction points corresponding to the matching point pairs; and obtaining precision of the calibration parameter according to a relative position relationship between the three-dimensional reconstruction points and the reference points. The method can thereby verify the precision of the calibration parameter.

Description

METHOD FOR VERIFYING PRECISION OF CALIBRATION PARAMETER,
MEDIUM AND ELECTRONIC DEVICE
Field of Disclosure
The present disclosure relates to a data processing method, in particular, to a method for verifying precision of a calibration parameter, a medium and an electronic device.
Description of Related Arts
Biplanar radiograph is a low-dose and non-invasive imaging technology, and has become a preferred choice for orthopedic motion tracking and kinematic analysis due to its high temporal and spatial resolution. When used in combination with 2D-3D registration, this technology can directly measure in vivo joint kinematics, which provides clinicians and engineers with a biomechanical perspective to understand potential mechanisms, obtain surgical advice, and improve prosthesis design.
The biplanar radiography technology may relate to two or more X-ray machine photos captured at different angles. Therefore, two or more cameras need to be calibrated to obtain a calibration parameter. The calibration can be implemented by using a conventional calibration method. However, in actual applications, the conventional calibration method can only provide the calibration parameter, but cannot verify the precision of the calibration parameter. As a result, the accuracy of the calibration result cannot be ensured, which may lead to errors in experiments or actual applications.
Summary of the Present Disclosure
The present disclosure provides a method for verifying precision of a calibration parameter, a medium and an electronic device, to resolve the foregoing problems.
In a first aspect, the present disclosure provides a method for verifying precision of a calibration parameter, the method including: obtaining a first image and a second image of a
mold, a plurality of reference points is disposed on the mold, and the first image and the second image are captured at different angles; obtaining, according to pixel coordinates of first reference points in the first image, coordinates of the first reference points in a coordinate system of a first camera as first coordinates, the first reference points are points corresponding to the reference points in the first image; obtaining, according to pixel coordinates of second reference points in the second image, coordinates of the second reference points in a coordinate system of a second camera as second coordinates, the second reference points are points corresponding to the reference points in the second image; transforming the first reference points and the second reference points into a same coordinate system according to a to-be-verified calibration parameter; performing matching according to a relative position relationship between the first reference points and the second reference points, to obtain matching point pairs; obtaining three-dimensional reconstruction points corresponding to the matching point pairs; and obtaining precision of the calibration parameter according to a relative position relationship between the three-dimensional reconstruction points and the reference points.
In an embodiment of the first aspect, an implementation method for performing matching according to the relative position relationship between the first reference points and the second reference points to obtain the matching point pairs comprises: obtaining a plurality of first straight lines according to the first reference points and a first optical center, the first optical center corresponds to an optical center of the first camera, and each of the first straight lines passes through the first optical center and a corresponding first reference point, obtaining a plurality of second straight lines according to the second reference points and a second optical center, the second optical center corresponds to an optical center of the second camera, and each of the second straight lines passes through the second optical center and a corresponding second reference point, obtaining matching straight line pairs according to a relative position relationship between the first straight lines and the second straight lines; and obtaining the matching point pairs according to the first reference points and the second reference points corresponding to the matching straight line pairs.
In an embodiment of the first aspect, an implementation method for obtaining the matching point pairs further comprises: executing a matching point pair update sub-method to
update the matching point pairs, the matching point pair update sub-method comprises: performing, for any of the matching point pairs, three-dimensional reconstruction of the matching point pair, to obtain a corresponding first reconstruction point, performing point cloud registration on the first reconstruction point and the reference points according to geometric features of the mold, to obtain a transformation matrix; performing three-dimensional reconstruction on first reference points and second reference points in to-be-determined points, to obtain to-be-determined reconstruction points; transforming the to-be-determined reconstruction points and the reference points into a same coordinate system according to the transformation matrix; obtaining second reconstruction points according to a relative position relationship between the to-be-determined reconstruction points and the reference points, and updating the matching point pairs according to to-be-determined points corresponding to the second reconstruction points.
In an embodiment of the first aspect, an implementation method for performing, for any of the matching point pairs, three-dimensional reconstruction of the matching point pair, to obtain a corresponding first reconstruction point comprises: obtaining a first straight line and a second straight line corresponding to the matching point pair, and obtaining a point with a minimum sum of distances to the two straight lines as the first reconstruction point corresponding to the matching point pair.
In an embodiment of the first aspect, the mold is a pyramid-shaped mold, the reference points comprise a spire point and base points of the pyramid-shaped mold, and an implementation method for performing point cloud registration on the first reconstruction point and the reference points according to the geometric features of the mold, to obtain the transformation matrix comprises: obtaining reconstruction coordinates of mold feature points according to coordinates of the first reconstruction point, the mold feature points comprise the spire point, one of the base points, and a base center point; performing pose estimation on the reconstruction coordinates and true coordinates of the mold feature points, to obtain an initialized transformation matrix; transforming first reconstruction points according to the initialized transformation matrix; and performing point cloud registration according to the transformed first reconstruction points and the reference points, to obtain the transformation matrix.
In an embodiment of the first aspect, after the updating the matching point pairs according to the to-be-determined points corresponding to the second reconstruction points, the implementation method for obtaining the matching point pairs further comprises: repeatedly executing the matching point pair update sub-method based on the updated matching point pairs, until the matching point pairs meet an end condition.
In an embodiment of the first aspect, an implementation method for obtaining matching straight line pairs according to the relative position relationship between the first straight lines and the second straight lines comprises: performing first matching based on the first straight lines, to obtain a first matching result, the first matching result comprises each first straight line and the second straight line closest to each first straight line; performing second matching based on the second straight lines, to obtain a second matching result, the second matching result comprises each second straight line and the first straight line closest to each second straight line; obtaining to-be-determined matching straight line pairs according to the first matching result and the second matching result; and obtaining the matching straight line pairs from the to-be-determined matching straight line pairs according to ratios between shortest distances and second shortest distances of straight lines, the straight lines comprise the first straight lines and the second straight lines.
In an embodiment of the first aspect, an implementation method for obtaining precision of the calibration parameter according to the relative position relationship between the three-dimensional reconstruction points and the reference points comprises: performing point cloud registration on the three-dimensional reconstruction points and the reference points, and obtaining a root mean square error between the three-dimensional reconstruction points and the corresponding reference points as the precision of the calibration parameter.
In a second aspect, the present disclosure provides a computer-readable storage medium storing a computer program. The computer program implements the above method when executed by a processor.
In a third aspect, the present disclosure provides an electronic device. The electronic device includes a memory and a processor, the memory stores a computer program, and the processor is in communication connection with the memory and performs the above method
for verifying precision when invoking the computer program.
As described above, the method for verifying precision in one or more embodiments of the present disclosure has the following beneficial effects.
Based on reference points in a mold, the method for verifying precision performs coordinate system transformation, point matching, and three-dimensional reconstruction on first reference points and second reference points to obtain three-dimensional reconstruction points, and obtain precision of a calibration parameter according to a relative position relationship between the three-dimensional reconstruction points and the reference points.
Therefore, the method for verifying precision can verify the precision of the calibration parameter, which facilitates relevant personnel to obtain precision information of the calibration parameter, thereby avoiding errors in experiments or actual applications.
Brief Description of the Drawings
Fig. 1 is a flowchart of a method for verifying precision according to an embodiment of the present disclosure.
Fig. 2A is a detailed flowchart of step S15 in a method for verifying precision according to an embodiment of the present disclosure.
Fig. 2B is a flowchart of a matching point pair update sub-method in a method for verifying precision according to an embodiment of the present disclosure.
Fig. 3A is an exemplary diagram of a mold in a method for verifying precision according to an embodiment of the present disclosure.
Fig. 3B is a detailed flowchart of step S22 in a method for verifying precision according to an embodiment of the present disclosure.
Fig. 4 is a detailed flowchart of step S153 in a method for verifying precision according to an embodiment of the present disclosure.
Fig. 5 is a flowchart of a method for verifying precision according to an embodiment of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Description of Reference Numerals of Elements
600 Electronic device
610 Memory
620 Processor
630 Display
S11 to S17 Steps
S151 to S154 Steps
S21 to S26 Steps
S221 to S224 Steps
S1531 to S1534 Steps
S501 to S514 Steps
Detailed Description of the Preferred Embodiments
The implementation modes of the present disclosure are described below through specific embodiments. Those skilled in the art can easily understand other advantages and effects of the present disclosure from the contents disclosed in this specification. The present disclosure can also be implemented or applied through other different specific implementation modes, and various modifications or changes can be made to the details in this specification based on different viewpoints and applications without departing from the spirit of the present disclosure. It should be noted that the following embodiments and the features in the embodiments can be combined with one another provided that no conflict arises.
It should be noted that the drawings provided in the following embodiments only schematically describe the basic concept of the present disclosure. They therefore illustrate only the components related to the present disclosure and are not drawn according to the numbers, shapes and sizes of the components in actual implementation; the configuration, number and scale of each component in actual implementation may be changed freely, and the component layout may be more complex. In addition,
the relational terms herein, such as "first" and "second", are used only to differentiate an entity or operation from another entity or operation, and do not require or imply any actual relationship or sequence between these entities or operations.
The conventional calibration method can only provide a calibration parameter, but cannot verify the precision of the calibration parameter. In this way, the accuracy of the calibration result cannot be guaranteed, which may lead to errors in experiments or actual applications. To solve this problem, referring to Fig. 1, a method for verifying precision is provided in an embodiment of the present disclosure, the method for verifying precision including:
S11. Obtain a first image and a second image of a mold, a plurality of reference points is disposed on the mold, and the first image and the second image are captured at different angles. The reference points are points that can mark geometric features of the mold, for example, a vertex, a center point, and/or a center of gravity.
Particularly, in a biplanar radiography scene, the first image and the second image may be X-ray photos captured at different angles. For example, the first image may be an orthographic X-ray photo, and the second image may be a lateral X-ray photo.
S12. Obtain, according to pixel coordinates of first reference points in the first image, coordinates of the first reference points in a coordinate system of a first camera as first coordinates, the first reference points are points corresponding to the reference points in the first image, the coordinate system of the first camera corresponds to the first image and is a coordinate system established with an optical center of the first camera as an origin, and the first camera is a camera capturing the first image. The first coordinates may be obtained by performing coordinate transformation on the pixel coordinates of the first reference points in the first image. Specifically, for any of the first reference points, the corresponding first coordinates are $P_{c1} = K_1^{-1} \times P_1$, wherein $K_1$ is an internal parameter of the first camera, and $P_1$ is the homogeneous pixel coordinates of the first reference point in the first image.
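As an illustrative sketch only (not part of the patent), the back-projection described above could look as follows in Python, assuming $K_1$ is the 3x3 intrinsic matrix of the first camera and the pixel coordinates are used in homogeneous form; the numeric values are placeholders:

import numpy as np

def pixel_to_camera(pixel_uv, K):
    # Back-project a pixel (u, v) into the camera coordinate system:
    # returns K^{-1} @ [u, v, 1]^T, a point on the viewing ray at unit depth.
    uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    return np.linalg.inv(K) @ uv1

# Hypothetical intrinsic matrix and one detected first reference point
K1 = np.array([[1500.0, 0.0, 512.0],
               [0.0, 1500.0, 512.0],
               [0.0, 0.0, 1.0]])
first_coordinates = pixel_to_camera((640.0, 300.0), K1)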
S13. Obtain, according to pixel coordinates of second reference points in the second image, coordinates of the second reference points in a coordinate system of a second camera
as second coordinates, the second reference points are points corresponding to the reference points in the second image, the coordinate system of the second camera corresponds to the second image and is a coordinate system established with an optical center of the second camera as an origin, and the second camera is a camera capturing the second image. The second coordinates may be obtained by performing coordinate transformation on the pixel coordinates of the second reference points in the second image based on an internal parameter of the second camera, and a specific implementation thereof is similar to that of the first coordinates.
S14. Transform the first reference points and the second reference points into a same coordinate system according to a to-be-verified calibration parameter. The calibration parameter is a camera external parameter obtained by correction or calibration, for example, a rotation transformation matrix $R$ and/or a translation transformation matrix $t$. In actual applications, the first reference points may be transformed into the second coordinate system through the calibration parameter and the first coordinates. For example, the first reference points are converted into the second coordinate system by the formula $P_2 = R \times P_1 + t$, wherein $P_2$ is the coordinates of the first reference points in the second coordinate system, and $P_1$ is the coordinates of the first reference points in the first coordinate system. In addition, the second reference points may also be transformed into the first coordinate system through the calibration parameter and the second coordinates.
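A minimal sketch of this transformation step, assuming the calibration parameter is given as a 3x3 rotation matrix R and a translation vector t (the values shown are placeholders, not from the patent):

import numpy as np

def to_second_camera(points_cam1, R, t):
    # Apply P2 = R @ P1 + t to an (N, 3) array of first-camera points.
    return points_cam1 @ R.T + t

R = np.eye(3)                    # hypothetical rotation between the cameras
t = np.array([0.0, 0.0, 100.0])  # hypothetical translation (same units as the points)
points_cam2 = to_second_camera(np.array([[10.0, 5.0, 800.0]]), R, t)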
S15. Perform matching according to a relative position relationship between the first reference points and the second reference points, to obtain matching point pairs. Each of the matching point pairs comprises one of the first reference points and one of the second reference points, and in an ideal case, the first reference point and the second reference point comprised in each of the matching point pairs correspond to a same reference point, that is, the first reference point and the second reference point comprised in each of the matching point pairs are obtained by respectively projecting the same reference point in different projection directions.
S16. Obtain three-dimensional reconstruction points corresponding to the matching point pairs. Specifically, the three-dimensional reconstruction points may be obtained by
performing three-dimensional reconstruction on points in the matching point pairs.
S17. Obtain precision of the calibration parameter according to a relative position relationship between the three-dimensional reconstruction points and the reference points.
Specifically, the three-dimensional reconstruction points are obtained by performing three-dimensional reconstruction according to the matching point pairs, so their positions depend on the precision of the calibration parameter. The relative position relationship between the three-dimensional reconstruction points and the reference points therefore reflects the precision of the calibration parameter, and the precision may be obtained according to the relative position relationship between the two. In an ideal case, when the calibration parameter is accurate, the positions of the three-dimensional reconstruction points are the same as the positions of the reference points.
According to the above description, in this embodiment, based on reference points in a mold, the method for verifying precision performs coordinate system transformation, point matching, and three-dimensional reconstruction on first reference points and second reference points to obtain three-dimensional reconstruction points, and obtain precision of a calibration parameter according to a relative position relationship between the three-dimensional reconstruction points and the reference points. Therefore, the method for verifying precision can verify the precision of the calibration parameter, which facilitates relevant personnel to obtain precision information of the calibration parameter, thereby avoiding errors in experiments or actual applications.
It should be noted that, the first image and the second image in this embodiment are not limited to X-ray photos, and may alternatively be images captured by a binocular camera, and the like. In addition, although this embodiment only provides the method for verifying precision of a calibration parameter for two images, the method may alternatively be applied to three or more images in actual applications. This is not limited in the present disclosure.
In an embodiment of the present disclosure, the reference points are dots, and the method for verifying precision further comprises: detecting the first image and the second image by using a circle detection algorithm, to obtain the first reference points and the second
reference points. The circle detection algorithm is, for example, a circle detection algorithm based on Hough Transform.
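For illustration only, such circular reference points could be detected with OpenCV's Hough circle transform; the parameter values below are assumptions that would need tuning for real radiographs:

import cv2
import numpy as np

def detect_reference_points(image_path):
    # Detect circular reference points and return their pixel centers (u, v).
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    img = cv2.medianBlur(img, 5)  # suppress noise before the Hough transform
    circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=30, minRadius=3, maxRadius=30)
    if circles is None:
        return np.empty((0, 2))
    return circles[0, :, :2]      # third column of each entry is the radius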
Please refer to Fig. 2A, in an embodiment of the present disclosure, an implementation method for performing matching according to the relative position relationship between the first reference points and the second reference points, to obtain matching point pairs comprises:
S151. Obtain a plurality of first straight lines according to the first reference points and a first optical center, the first optical center corresponds to the optical center of the first camera, and each of the first straight lines passes through the first optical center and a corresponding first reference point.
Specifically, if the same coordinate system in S14 is the coordinate system of the first camera, the first optical center is the optical center of the first camera; and if the same coordinate system in S14 is the coordinate system of the second camera, the optical center of the first camera is transformed into the coordinate system of the second camera according to the calibration parameter, so that the first optical center is obtained.
S152. Obtain a plurality of second straight lines according to the second reference points and a second optical center, the second optical center corresponds to the optical center of the second camera, and each of the second straight lines passes through the second optical center and a corresponding second reference point.
Specifically, if the same coordinate system in S14 is the coordinate system of the second camera, the second optical center is the optical center of the second camera; and if the same coordinate system in S14 is the coordinate system of the first camera, the optical center of the second camera is transformed into the coordinate system of the first camera according to the calibration parameter, so that the second optical center is obtained.
S153. Obtain matching straight line pairs according to a relative position relationship between the first straight lines and the second straight lines. Specifically, in the same coordinate system, if a first reference point and a second reference point corresponding to any reference point P are respectively P1 and P2, the straight line formed by P1 and the first optical center is L1, and the straight line formed by P2 and the second optical center is L2. In an ideal case, L1 and L2 intersect at the reference point P, that is, both straight lines L1 and L2 correspond to the same reference point P. In a non-ideal case, a small distance d may exist between the near-intersection point of the straight lines L1 and L2 and the reference point P. Therefore, the matching straight line pairs may be obtained according to the relative position relationship between the first straight lines and the second straight lines.
S154. Obtain the matching point pairs according to the first reference points and the second reference points corresponding to the matching straight line pairs. Specifically, any matching straight line pair C comprises a first straight line L3 and a second straight line L4, the first straight line L3 passes through a first reference point P3, and the second straight line
L4 passes through a second reference point P4. Therefore, the first reference point P3 is the first reference point corresponding to the matching straight line pair C, the second reference point P4 is the second reference point corresponding to the matching straight line pair C, and the first reference point P3 and the second reference point P4 are a matching point pair (P3,
P4) corresponding to the matching straight line pair C. Through this manner, all matching point pairs in the first reference points and the second reference points may be obtained, and all points other than the matching point pairs are obtained as to-be-determined points, the to-be-determined points comprise first reference points and second reference points.
Optionally, after step S154, the method for verifying precision of this embodiment further comprises: executing a matching point pair update sub-method to update the matching point pair, the matching point pair update sub-method is used for obtaining new matching point pairs from the to-be-determined points. Referring to Fig. 2B, the matching point pair update sub-method in this embodiment comprises:
S21. Perform, for any of the matching point pairs, three-dimensional reconstruction of the matching point pair, to obtain a corresponding first reconstruction point; that is, each matching point pair yields one first reconstruction point through three-dimensional reconstruction.
Optionally, in this embodiment, an implementation method for performing, for any of the matching point pairs, three-dimensional reconstruction of the matching point pair, to obtain a corresponding first reconstruction point comprises: obtaining a first straight line and a second straight line corresponding to the matching point pair, and obtaining a point with a
minimum sum of distances to the two straight lines as the first reconstruction point corresponding to the matching point pair. For example, for the matching point pair (P3, P4) in step S154, the corresponding first straight line and second straight line are respectively L3 and
L4, and a three-dimensional point with a minimum sum of Euclidean distances to the two straight lines L3 and L4 obtained in a three-dimensional space is a first reconstruction point corresponding to the matching point pair (P3, P4). Through this manner, first reconstruction points corresponding to all the matching point pairs can be obtained.
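A sketch of this triangulation step (illustrative, with assumed variable names): the midpoint of the common perpendicular segment between the two straight lines is one point attaining the minimum sum of distances to both lines.

import numpy as np

def triangulate_pair(o1, d1, o2, d2):
    # o1, d1: a point on the first straight line (e.g. the first optical
    # center) and its direction; o2, d2: the same for the second straight line.
    # Returns the midpoint of the common perpendicular segment.
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:              # nearly parallel lines: simple fallback
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))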
S22. Perform point cloud registration on the first reconstruction point and the reference points according to geometric features of the mold, to obtain a transformation matrix. The first reconstruction point may be used as a point cloud, and the reference points may be used as another point cloud. Point cloud registration is performed between the two point clouds to obtain the transformation matrix, and the transformation matrix may be, for example, a rotation transformation matrix and/or a translation transformation matrix.
S23. Perform three-dimensional reconstruction on first reference points and second reference points in the to-be-determined points, to obtain to-be-determined reconstruction points. Specifically, the to-be-determined reconstruction points may be obtained by performing three-dimensional reconstruction on the first reference points and the second reference points in pairs, that is, the three-dimensional reconstruction is respectively performed on each of the first reference points and each of the second reference points in the to-be-determined points, to obtain all the to-be-determined reconstruction points. Each of the to-be-determined reconstruction points corresponds to one of the first reference points and one of the second reference points.
In this step, a manner of the three-dimensional reconstruction comprises: for example, for any first reference point P5 and any second reference point P6, respectively obtaining a straight line L5 determined by P5 and the first optical center and a straight line L6 determined by P6 and the second optical center, and obtaining a point with a minimum sum of distances to L5 and L6 in a three-dimensional space as a to-be-determined reconstruction point corresponding to P5 and P6. Through this manner, to-be-determined reconstruction points corresponding to all the to-be-determined points may be obtained.
S24. Transform the to-be-determined reconstruction points and the reference points into a same coordinate system according to the transformation matrix.
S25. Obtain second reconstruction points according to a relative position relationship between the to-be-determined reconstruction points and the reference points, to update the matching point pairs. The second reconstruction points are three-dimensional reconstruction points obtained by performing three-dimensional reconstruction on matching point pairs in the to-be-determined points. Specifically, if a first reference point and a second reference point are matching points, a to-be-determined reconstruction point corresponding to the two is a correct three-dimensional reconstruction point, and a distance between the to-be-determined reconstruction point and a true reference point may be close. Therefore, whether the to-be-determined reconstruction point is the correct three-dimensional reconstruction point may be determined according to whether the distance between the to-be-determined reconstruction point and the true reference point is smaller than a distance threshold, all correct three-dimensional reconstruction points may be selected as the second reconstruction points, and the distance threshold may be determined according to actual needs.
S26. Update the matching point pairs according to the to-be-determined points corresponding to the second reconstruction points. As mentioned above, each of the second reconstruction points is a correct three-dimensional reconstruction point, and a first reference point and a second reference point corresponding to the second reconstruction point match each other, namely, form a matching point pair. Therefore, part or all of the matching point pairs in the to-be-determined points may be obtained according to the second reconstruction points, and part or all of the matching point pairs in the to-be-determined points are added into the existing matching point pairs to achieve the update of the existing matching point pairs.
In an embodiment of the present disclosure, the mold is a pyramid-shaped mold, and the reference points comprise a spire point and base points of the pyramid-shaped mold.
Preferably, Fig. 3A is an exemplary diagram of the pyramid-shaped mold. The pyramid-shaped mold comprises 17 reference points, which are respectively the spire point and 4 points on each side, and a point farthest away from the spire point on each side is a base point.
Please refer to Fig. 3B, in this embodiment, an implementation method for performing point cloud registration on the first reconstruction point and the reference points according to geometric features of the mold, to obtain a transformation matrix comprises:
S221. Obtain reconstruction coordinates of mold feature points according to coordinates of the first reconstruction point, the mold feature points comprise the spire point, one of the base points, and a base center point, and the base point in the mold feature points is preferably a base point farthest away from the spire point.
Specifically, coordinate variances of the first reconstruction points in the three dimensions x, y, and z are respectively calculated; the variance in the pyramid height direction may be significantly different from those in the other two directions. Therefore, the height direction of the pyramid-shaped mold may be obtained according to the variances, so that the first reconstruction point corresponding to the spire point and the first reconstruction points corresponding to the base points may be determined according to coordinate features of the first reconstruction points. Coordinates of the first reconstruction point corresponding to the spire point are the reconstruction coordinates of the spire point, and coordinates of the first reconstruction points corresponding to the base points are the reconstruction coordinates of the base points. In addition, the reconstruction coordinates of the base center point may be obtained based on the reconstruction coordinates of the base points, for example, by averaging the reconstruction coordinates of the base points.
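One possible heuristic for this step, written as a sketch; the selection rules below are an assumed interpretation, not prescribed by the patent:

import numpy as np

def find_mold_feature_points(points):
    # points: (N, 3) array of first reconstruction points.
    # The axis with the largest coordinate variance is taken here as the
    # pyramid height direction; the point most isolated along that axis is
    # treated as the spire, the 4 points farthest from the spire as base
    # points, and the base center as their mean.
    height_axis = int(np.argmax(points.var(axis=0)))
    heights = points[:, height_axis]
    spire_idx = int(np.argmax(np.abs(heights - heights.mean())))
    spire = points[spire_idx]
    others = np.delete(points, spire_idx, axis=0)
    base_points = others[np.argsort(np.linalg.norm(others - spire, axis=1))[-4:]]
    base_center = base_points.mean(axis=0)
    return spire, base_points, base_center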
S222. Perform pose estimation on the reconstruction coordinates and true coordinates of the mold feature points, to obtain an initialized transformation matrix. Specifically, because the mold is known, true coordinates of the spire point and the base points on the mold are also known, and true coordinates of the base center point may be obtained by the true coordinates of the base points. Based on this, in step S222, pose estimation may be performed according to the reconstruction coordinates and the true coordinates of the mold feature points, to obtain the initialized transformation matrix.
S223. Transform the first reconstruction points according to the initialized transformation matrix.
S224. Perform point cloud registration according to the transformed first reconstruction points and the reference points, to obtain the transformation matrix.
Optionally, in this embodiment, $P_{cloud} = \{p_i \mid p_i \in \mathbb{R}^3, i = 1, 2, 3\}$ is set as a reconstruction point cloud, wherein the coordinates of the points $p_1$, $p_2$ and $p_3$ are respectively the reconstruction coordinates of the spire point, the reconstruction coordinates of the base point, and the reconstruction coordinates of the base center point, the spire point, the base point, and the base center point being comprised in the mold feature points; and $Q_{cloud} = \{q_i \mid q_i \in \mathbb{R}^3, i = 1, 2, 3\}$ is a true point cloud, wherein the coordinates of the points $q_1$, $q_2$ and $q_3$ are respectively the true coordinates of the spire point, the true coordinates of the base point, and the true coordinates of the base center point that are comprised in the mold feature points. It should be noted that, in this embodiment, because the pyramid-shaped mold is rotationally symmetrical, points in $P_{cloud}$ and $Q_{cloud}$ may be matched one by one. In addition, $R_{cloud}$ and $t_{cloud}$ are respectively set as a rotation transformation matrix and a translation transformation matrix in point cloud registration, and $f(R_{cloud}, t_{cloud})$ represents an error between the reconstruction point cloud $P_{cloud}$ and the true point cloud $Q_{cloud}$ under the rotation transformation matrix $R_{cloud}$ and the translation transformation matrix $t_{cloud}$.
Therefore, solving the initialized transformation matrix may be converted into solving an optimal solution $(R^*_{cloud}, t^*_{cloud})$ that meets $\min f(R_{cloud}, t_{cloud})$, wherein $f(R_{cloud}, t_{cloud}) = \sum_{i=1}^{3} \left\| R_{cloud} \times p_i + t_{cloud} - q_i \right\|^2$.
Optionally, a method for solving the above optimization problem in this embodiment comprises the following steps.
First, obtain a centroid $\bar{p}$ of the reconstruction point cloud and a centroid $\bar{q}$ of the true point cloud, wherein $\bar{p} = \frac{1}{3}\sum_{i=1}^{3} p_i$ and $\bar{q} = \frac{1}{3}\sum_{i=1}^{3} q_i$. Based on this, de-centroid coordinates $p_i' = p_i - \bar{p}$, $i = 1, 2, 3$, of each point in the reconstruction point cloud are obtained, and de-centroid coordinates $q_i' = q_i - \bar{q}$, $i = 1, 2, 3$, of each point in the true point cloud are obtained.
Then, calculate a rotation transformation matrix according to the optimization problem $R^*_{cloud} = \arg\min_{R_{cloud}} \sum_{i=1}^{3} \left\| q_i' - R_{cloud} \, p_i' \right\|^2$, wherein an optimal solution of $R_{cloud}$ may be obtained by singular value decomposition. Specifically, a matrix $W = \sum_{i=1}^{3} q_i' \, p_i'^{T}$ is defined, and $W = U \Sigma V^{T}$ may be obtained by performing singular value decomposition on the matrix $W$, wherein $\Sigma$ is a diagonal matrix formed by the singular values, the diagonal elements are arranged in descending order, and $U$ and $V$ are orthogonal matrices. When $W$ is non-singular, the optimal rotation transformation matrix is $R^*_{cloud} = U \times V^{T}$, and the optimal translation transformation matrix $t^*$ may then be obtained according to $t^* = \bar{q} - R^*_{cloud} \times \bar{p}$. The optimal rotation transformation matrix $R^*_{cloud}$ and the optimal translation transformation matrix $t^*$ are the initialized transformation matrix in S222.
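A compact sketch of this SVD-based alignment (the standard Kabsch solution, with a determinant check added to guard against reflections, which the text above does not discuss):

import numpy as np

def rigid_align(P, Q):
    # Find R, t minimizing sum_i || R @ p_i + t - q_i ||^2 for corresponding
    # (N, 3) point sets P (reconstruction coordinates) and Q (true coordinates).
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    W = (Q - q_bar).T @ (P - p_bar)        # W = sum_i q_i' p_i'^T
    U, _, Vt = np.linalg.svd(W)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt                         # proper rotation
    t = q_bar - R @ p_bar
    return R, t

# Usage with the three mold feature points (spire, base point, base center):
# R0, t0 = rigid_align(np.stack([p1, p2, p3]), np.stack([q1, q2, q3]))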
Optionally, $X = \{x_i \mid x_i \in \mathbb{R}^3, i = 1, 2, \ldots, m\}$ is set as a reconstruction point cloud formed by the transformed first reconstruction points, wherein $m$ is the quantity of first reconstruction points, and $Y = \{y_i \mid y_i \in \mathbb{R}^3, i = 1, 2, \ldots, n\}$ is a true point cloud formed by the reference points corresponding to the transformed first reconstruction points. Therefore, in this embodiment, an implementation method for performing point cloud registration according to the transformed first reconstruction points and the reference points, to obtain a transformation matrix comprises: first, calculate a closest point set: selecting a point $x_i \in X$ in the reconstruction point cloud $X$, and obtaining a corresponding point $y_i \in Y$ at a shortest distance from $x_i$ in the true point cloud $Y$, to use $x_i$ and $y_i$ as a matching point pair.
Then, perform point cloud registration by using a method similar to that in step S222, and find an optimal solution $(R^*_{cloud}, t^*_{cloud})$ that meets $\min f(R_{cloud}, t_{cloud})$ as the transformation matrix. In this embodiment, $f(R_{cloud}, t_{cloud}) = \sum_{i=1}^{m} \left\| R_{cloud} \times x_i + t_{cloud} - y_i \right\|^2$.
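A hedged sketch of this closest-point registration as a basic iterative-closest-point (ICP) loop; the helper mirrors the SVD alignment above and the iteration count is an arbitrary choice:

import numpy as np

def _rigid_align(P, Q):
    # Least-squares (R, t) with R @ p_i + t approximating q_i (SVD solution).
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    W = (Q - q_bar).T @ (P - p_bar)
    U, _, Vt = np.linalg.svd(W)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    return R, q_bar - R @ p_bar

def icp_refine(X, Y, R, t, iterations=20):
    # X: (m, 3) transformed first reconstruction points, Y: (n, 3) reference
    # points, (R, t): initial transformation. Each iteration pairs every point
    # of X with its nearest reference point and re-solves the alignment.
    for _ in range(iterations):
        X_t = X @ R.T + t
        dists = np.linalg.norm(X_t[:, None, :] - Y[None, :, :], axis=2)
        R, t = _rigid_align(X, Y[np.argmin(dists, axis=1)])
    return R, t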
In an embodiment of the present disclosure, after the updating the matching point pairs according to the to-be-determined points corresponding to the second reconstruction points, the implementation method for obtaining the matching point pairs further comprises: repeatedly executing the matching point pair update sub-method based on the updated matching point pairs, until the matching point pairs meet an end condition. The end condition is that, for example, a quantity of matching point pairs reaches a preset quantity threshold, and the quantity threshold may be set according to actual needs.
Please refer to Fig. 4, in an embodiment of the present disclosure, an implementation method for obtaining matching straight line pairs according to the relative position relationship between the first straight lines and the second straight lines comprises:
S1531: perform first matching based on the first straight lines, to obtain a first matching result, the first matching result comprises each first straight line and the second straight line closest to each first straight line.
Specifically, for any first straight line L7 and any second straight line L8, a direction vector $e_7 = (e_{x7}, e_{y7}, e_{z7})$ may be obtained according to the first reference point and the first optical center corresponding to L7, and a direction vector $e_8 = (e_{x8}, e_{y8}, e_{z8})$ may be obtained according to the second reference point and the second optical center corresponding to L8. Then, a normal vector $n = e_7 \times e_8$ perpendicular to both the straight lines L7 and L8 may be found by using the cross product. A point is randomly selected on each of the straight lines L7 and L8, and the projection of the line segment formed by the two points onto the normal vector is the distance between the straight lines L7 and L8. In this embodiment, the coordinates of the first optical center and the second optical center are known. Therefore, the two optical centers may be selected as fixed points M1 and M2 on the straight lines L7 and L8. In this case, the distance between L7 and L8 is $d = \left| \overrightarrow{M_1 M_2} \cdot n \right| / \| n \|$.
Distances between the first straight lines and the second straight lines may be obtained based on the above method, and the second straight line closest to each first straight line may be obtained according to these distances, to obtain the first matching result; the first matching result may be regarded as a set of a plurality of straight line pairs, and each of the straight line pairs in the first matching result comprises a first straight line and the second straight line closest to that first straight line.
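A short sketch of this distance computation, assuming each line is given by a fixed point (the optical center) and a direction vector:

import numpy as np

def line_distance(m1, e1, m2, e2):
    # Distance between two 3D lines: d = |(M2 - M1) . n| / ||n||, with n = e1 x e2.
    n = np.cross(e1, e2)
    norm_n = np.linalg.norm(n)
    if norm_n < 1e-12:                      # parallel lines: point-to-line distance
        diff = m2 - m1
        return np.linalg.norm(diff - (diff @ e1) / (e1 @ e1) * e1)
    return abs((m2 - m1) @ n) / norm_n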
S1532. Perform second matching based on the second straight lines, to obtain a second matching result, the second matching result comprises each second straight line and the first straight line closest to each second straight line.
Specifically, the first straight line closest to each second straight line may be obtained according to the distances between the first straight lines and the second straight lines, to obtain the second matching result, the second matching result may be regarded as a set of a plurality of straight line pairs, and each of the straight line pairs in the second matching result comprises a second straight line and a first straight line closest to the second straight line.
S1533. Obtain to-be-determined matching straight line pairs according to the first matching result and the second matching result. For example, an intersection of the first matching result and the second matching result may be selected as the to-be-determined matching straight line pairs. That is, if a second straight line closest to a first straight line L9 is L10, and a first straight line closest to the second straight line L10 is L9, the L9 and the L10 form a to-be-determined matching straight line pair; otherwise, the L9 and the L10 cannot form the to-be-determined matching straight line pair.
S1534. Obtain the matching straight line pairs from the to-be-determined matching straight line pairs according to ratios between shortest distances and second shortest distances of straight lines, the straight lines comprise the first straight lines and the second straight lines.
For example, if L9 and L10 form a to-be-determined matching straight line pair, the second straight line L11 at the second shortest distance from L9 and the first straight line L12 at the second shortest distance from L10 are obtained, and the distance $d$ between L9 and L10, the distance $d_2$ between L9 and L11, and the distance $d_3$ between L10 and L12 are obtained. If the ratio of $d$ to $d_2$ is smaller than a preset ratio threshold (for example, 1/4), and the ratio of $d$ to $d_3$ is also smaller than the ratio threshold, the to-be-determined matching straight line pair formed by L9 and L10 is a matching straight line pair; otherwise, the to-be-determined matching straight line pair formed by L9 and L10 is not a matching straight line pair.
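The mutual-closest check of S1531 to S1533 together with this ratio test can be sketched as follows (illustrative only; the threshold 0.25 reflects the 1/4 example above):

import numpy as np

def match_straight_lines(dist, ratio_threshold=0.25):
    # dist: (n1, n2) matrix, dist[i, j] = distance between first straight line i
    # and second straight line j. Keep (i, j) when the two lines are mutually
    # closest and the shortest/second-shortest ratio is below the threshold
    # for both of them.
    pairs = []
    for i in range(dist.shape[0]):
        j = int(np.argmin(dist[i]))
        if int(np.argmin(dist[:, j])) != i:
            continue
        row, col = np.sort(dist[i]), np.sort(dist[:, j])
        if len(row) > 1 and row[0] >= ratio_threshold * row[1]:
            continue
        if len(col) > 1 and col[0] >= ratio_threshold * col[1]:
            continue
        pairs.append((i, j))
    return pairs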
In an embodiment of the present disclosure, an implementation method for obtaining precision of the calibration parameter according to the relative position relationship between the three-dimensional reconstruction points and the reference points comprises: performing point cloud registration on the three-dimensional reconstruction points and the reference points; and obtaining a root mean square error between the three-dimensional reconstruction points and the corresponding reference points as the precision of the calibration parameter. A manner of the point cloud registration performed on the three-dimensional reconstruction points and the reference points is the same as the above manner of the point cloud registration performed on the first reconstruction points and the reference points, and details are not described herein again.
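For completeness, the precision value itself reduces to a root mean square error after registration; a minimal sketch, assuming the registration result (R, t) and matched point arrays are already available:

import numpy as np

def calibration_precision(reconstructed, reference, R, t):
    # RMSE between the registered three-dimensional reconstruction points and
    # the corresponding reference points, reported as the calibration precision.
    aligned = reconstructed @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - reference) ** 2, axis=1))))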
Please refer to Fig. 5, in an embodiment of the present disclosure, the method for verifying precision comprises:
S501. Obtain a first image and a second image of a mold. In this embodiment, the mold is a pyramid-shaped mold, and comprises 17 circular reference points, and the first image and the second image are respectively, for example, photos captured by an orthographic X-ray machine and a lateral X-ray machine.
S502. Detect first reference points in the first image and second reference points in the second image. For example, the first reference points and the second reference points may be detected by using a circle detection algorithm based on Hough Transform.
S503. Obtain coordinates of the first reference points in a coordinate system of a first camera as first coordinates according to pixel coordinates of the first reference points in the first image.
S504. Obtain coordinates of the second reference points in a coordinate system of a
second camera as second coordinates according to pixel coordinates of the second reference points in the second image.
S505. Transform the first reference points into the coordinate system of the second camera according to a to-be-verified calibration parameter and the first coordinates.
S506. Obtain a plurality of first straight lines according to the first reference points and a first optical center, and obtain a plurality of second straight lines according to the second reference points and a second optical center.
S507. Obtain matching straight line pairs according to distances between the first straight lines and the second straight lines. The matching straight line pairs may be obtained by using steps S1531 to S1534 shown in Fig. 4.
S508. Obtain matching point pairs according to the matching straight line pairs, and obtain first reference points and second reference points other than the matching point pairs as to-be-determined points.
S509. Perform, for any of the matching point pairs, three-dimensional reconstruction of the matching point pair, to obtain a corresponding first reconstruction point. Specifically, an implementation method for performing, for any of the matching point pairs, three-dimensional reconstruction of the matching point pair, to obtain a corresponding first reconstruction point comprises: obtaining a first straight line and a second straight line corresponding to the matching point pair, and obtaining a point with a minimum sum of distances to the two straight lines as the first reconstruction point corresponding to the matching point pair.
S510. Perform point cloud registration on the first reconstruction point and the reference points according to geometric features of the mold, to obtain a transformation matrix. The transformation matrix may be obtained by using steps S221 to S224 shown in
Fig. 3B.
S511. Perform three-dimensional reconstruction on the to-be-determined points in pairs to obtain to-be-determined reconstruction points, and transform the to-be-determined reconstruction points into a world coordinate system by using the transformation matrix.
S512. Obtain shortest distances between the to-be-determined reconstruction points and the reference points, and obtain matching point pairs in the to-be-determined points according to the shortest distances. Specifically, if the shortest distance between a to-be-determined reconstruction point and the reference points is smaller than a distance threshold, the first reference point and the second reference point corresponding to the to-be-determined reconstruction point form a matching point pair. All matching point pairs in the to-be-determined points may be obtained in this manner.
S513. Repeatedly execute steps S509 to S513 based on all currently obtained matching point pairs, until all matching point pairs are obtained.
S514. Perform three-dimensional reconstruction according to all the matching point pairs to obtain final three-dimensional reconstruction points, and obtain a root mean square error between the final three-dimensional reconstruction points and the reference points as precision of the calibration parameter.
Based on the above description on the method for verifying precision, the present disclosure further provides a computer-readable storage medium, storing a computer program, the computer program, when executed by a processor, implementing the method for verifying precision shown in Fig. 1 or Fig. 5.
Based on the above description on the method for verifying precision, the present disclosure further provides an electronic device. Specifically, referring to Fig. 6, an electronic device 600 comprises a memory 610 and a processor 620. The memory 610 stores a computer program, and the processor 620 is in communication connection with the memory 610, and is configured to perform the method for verifying precision shown in Fig. 1 or Fig. 5 when invoking the computer program.
Optionally, the electronic device 600 further comprises a display 630, and the display 630 is in communication with the memory 610 and the processor 620, and is configured to display related GUI interactive interfaces of the method for verifying precision.
The protection scope of the method for verifying precision according to the present disclosure is not limited to the execution sequence of the steps listed in this embodiment, and all solutions implemented by adding or replacing a step in the prior art according to the
principle of the present disclosure fall within the protection scope of the present disclosure.
In conclusion, based on reference points in a mold, the method for verifying precision in the present disclosure performs coordinate system transformation, point matching, and three-dimensional reconstruction on first reference points and second reference points to obtain three-dimensional reconstruction points, and obtain precision of a calibration parameter according to a relative position relationship between the three-dimensional reconstruction points and the reference points. Therefore, the method for verifying precision can verify the precision of the calibration parameter, which facilitates relevant personnel to obtain precision information of the calibration parameter, thereby avoiding errors in experiments or actual applications. Therefore, the present disclosure effectively overcomes various defects in the prior art, and has a high value in industrial use.
The above embodiments only exemplarily illustrate the principles and effects of the present disclosure, but are not used to limit the present disclosure. Any person skilled in the art may make modifications or changes on the foregoing embodiments without departing from the spirit and scope of the present disclosure. Therefore, all equivalent modifications or changes made by a person of ordinary skill in the art without departing from the spirit and technical idea of the present disclosure shall be covered by the claims of the present disclosure.

Claims (10)

23 001917P-NL Octrooiconclusies23 001917P-NL Patent Claims 1. Een methode voor het verifiéren van de precisie zoals van een kalibratieparameter, bestaande uit: het verkrijgen van een eerste beeld en een tweede beeld van een matrijs, waarbij een veelheid van referentiepunten op de matrijs is geplaatst, en het eerste beeld en het tweede beeld onder verschillende hoeken zijn vastgelegd; het verkrijgen, volgens pixelcoördinaten van eerste referentiepunten in het eerste beeld, van coördinaten van de eerste referentiepunten in een coördinatensysteem van een eerste camera als eerste coördinaten, waarbij de eerste referentiepunten punten zijn die overeenkomen met de referentiepunten in het eerste beeld; het verkrijgen, volgens pixelcoördinaten van tweede referentiepunten in het tweede beeld, van coördinaten van de tweede referentiepunten in een coördinatensysteem van een tweede camera als tweede coördinaten, waarbij de tweede referentiepunten punten zijn die overeenkomen met de referentiepunten in het tweede beeld; het omzetten van de eerste referentiepunten en de tweede referentiepunten in eenzelfde coördinatensysteem volgens een te verifiëren kalibratieparameter het uitvoeren van matching volgens een relatieve positieverhouding tussen de eerste referentiepunten en de tweede referentiepunten, om matching-puntenparen te verkrijgen; het verkrijgen van driedimensionale reconstructiepunten die overeenkomen met de overeenkomende puntenparen; en verkrijgen van de nauwkeurigheid van de kalibratieparameter op basis van een relatieve positieverhouding tussen de driedimensionale reconstructiepunten en de referentiepunten.1. A method for verifying the precision of a calibration parameter, comprising: acquiring a first image and a second image of a mold with a plurality of reference points placed on the mold, and the first image and the second image captured at different angles; obtaining, according to pixel coordinates of first reference points in the first image, coordinates of the first reference points in a coordinate system of a first camera as first coordinates, the first reference points being points corresponding to the reference points in the first image; obtaining, according to pixel coordinates of second reference points in the second image, coordinates of the second reference points in a coordinate system of a second camera as second coordinates, the second reference points being points corresponding to the reference points in the second image; converting the first reference points and the second reference points into a same coordinate system according to a calibration parameter to be verified; performing matching according to a relative position relationship between the first reference points and the second reference points, to obtain pairs of matching points; obtaining three-dimensional reconstruction points corresponding to the matched pairs of points; and obtaining the accuracy of the calibration parameter based on a relative position relationship between the three-dimensional reconstruction points and the reference points. 2. 
2. The method for verifying precision according to claim 1, wherein an implementation method for performing matching according to the relative position relationship between the first reference points and the second reference points, to obtain the matching point pairs, comprises:
obtaining a plurality of first straight lines according to the first reference points and a first optical center, wherein the first optical center is the optical center of the first camera, and each of the first straight lines passes through the first optical center and a corresponding first reference point;
obtaining a plurality of second straight lines according to the second reference points and a second optical center, wherein the second optical center is the optical center of the second camera, and each of the second straight lines passes through the second optical center and a corresponding second reference point;
obtaining matching straight-line pairs according to a relative position relationship between the first straight lines and the second straight lines; and
obtaining the matching point pairs according to the first reference points and the second reference points corresponding to the matching straight-line pairs.
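A minimal sketch of the construction used in claim 2, assuming the optical centers and the back-projected reference points are already expressed in one common coordinate system; the helper names are hypothetical. Each first line can then be paired with the second line at the smallest line-to-line distance, which claim 7 refines with a two-way check and a ratio test.

```python
import numpy as np

def view_rays(optical_center, camera_points):
    """One ray per detected reference point, passing through the optical center."""
    dirs = camera_points - optical_center
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    return [(optical_center, d) for d in dirs]

def line_line_distance(p1, d1, p2, d2, eps=1e-12):
    """Shortest distance between two 3-D lines, each given by a point and a direction."""
    n = np.cross(d1, d2)
    n_norm = np.linalg.norm(n)
    if n_norm < eps:                                   # (nearly) parallel lines
        diff = p2 - p1
        return np.linalg.norm(diff - (diff @ d1) / (d1 @ d1) * d1)
    return abs((p2 - p1) @ n) / n_norm
```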
3. The method for verifying precision according to claim 2, wherein an implementation method for obtaining the matching point pairs further comprises: executing a matching-point-pair updating sub-method to update the matching point pairs, wherein the matching-point-pair updating sub-method comprises:
performing, for one of the matching point pairs, three-dimensional reconstruction on the matching point pair, to obtain a corresponding first reconstruction point;
performing point cloud registration on the first reconstruction points and the reference points according to geometric features of the mold, to obtain a transformation matrix;
performing three-dimensional reconstruction on first reference points and second reference points among points to be determined, to obtain reconstruction points to be determined;
transforming the reconstruction points to be determined and the reference points into a same coordinate system according to the transformation matrix;
obtaining second reconstruction points according to a relative position relationship between the reconstruction points to be determined and the reference points; and
updating the matching point pairs according to the points to be determined that correspond to the second reconstruction points.
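Claims 3 and 5 call for a point cloud registration that yields a rigid transformation matrix, but they do not name an algorithm. The closed-form Kabsch alignment below, which assumes the two point sets are already in corresponding order, is therefore only one plausible way to realize the pose-estimation/registration step; an iterative scheme such as ICP could equally serve for the refinement.

```python
import numpy as np

def rigid_transform(src, dst):
    """Closed-form (Kabsch) rigid transform aligning an Nx3 set src onto a corresponding Nx3 set dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t                                                  # dst ≈ (R @ src.T).T + t
```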
4. The method for verifying precision according to claim 3, wherein an implementation method for performing, for one of the matching point pairs, three-dimensional reconstruction on the matching point pair, to obtain the corresponding first reconstruction point, comprises:
obtaining a first straight line and a second straight line corresponding to the matching point pair; and
obtaining a point with a minimum sum of distances to the two straight lines as the first reconstruction point corresponding to the matching point pair.

5. The method for verifying precision according to claim 3, wherein the mold is a pyramid-shaped mold, the reference points comprise an apex point and base points of the pyramid-shaped mold, and an implementation method for performing point cloud registration on the first reconstruction points and the reference points according to the geometric features of the mold, to obtain the transformation matrix, comprises:
obtaining reconstruction coordinates of mold feature points according to coordinates of the first reconstruction points, wherein the mold feature points comprise the apex point, one of the base points and a base center point;
performing pose estimation on the reconstruction coordinates and actual coordinates of the mold feature points, to obtain an initialized transformation matrix;
transforming the first reconstruction points according to the initialized transformation matrix; and
performing point cloud registration according to the transformed first reconstruction points and the reference points, to obtain the transformation matrix.
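The "point with a minimum sum of distances to the two straight lines" in claim 4 is conventionally realized as the midpoint of the common perpendicular between the two viewing rays (any point on that segment attains the minimum, and the midpoint is the usual choice). A sketch under that reading, with assumed parameter names:

```python
import numpy as np

def midpoint_triangulation(p1, d1, p2, d2, eps=1e-12):
    """Midpoint of the common perpendicular of two rays given as (point, direction) pairs;
    one minimizer of the summed distances to both lines."""
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < eps:
        raise ValueError("viewing rays are (nearly) parallel")
    s = (b * e - c * d) / denom        # parameter of the closest point on line 1
    u = (a * e - b * d) / denom        # parameter of the closest point on line 2
    return 0.5 * ((p1 + s * d1) + (p2 + u * d2))
```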
6. The method for verifying precision according to claim 3, wherein, after updating the matching point pairs according to the points to be determined that correspond to the second reconstruction points, the implementation method for obtaining the matching point pairs further comprises:
repeatedly executing the matching-point-pair updating sub-method according to the updated matching point pairs, until the matching point pairs satisfy a termination condition.

7. The method for verifying precision according to claim 2, wherein an implementation method for obtaining the matching straight-line pairs according to the relative position relationship between the first straight lines and the second straight lines comprises:
performing first matching according to the first straight lines, to obtain a first matching result, wherein the first matching result comprises each first straight line and the second straight line closest to that first straight line;
performing second matching according to the second straight lines, to obtain a second matching result, wherein the second matching result comprises each second straight line and the first straight line closest to that second straight line;
obtaining matching straight-line pairs to be determined according to the first matching result and the second matching result; and
obtaining the matching straight-line pairs from the matching straight-line pairs to be determined according to ratios between shortest distances and second-shortest distances of the straight lines, wherein the straight lines comprise the first straight lines and the second straight lines.
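One way to read the two-way matching and the shortest/second-shortest ratio criterion of claim 7 is a mutual-nearest-neighbour filter followed by a Lowe-style ratio test over the matrix of line-to-line distances; the 0.8 threshold below is an assumed value, not one stated in the patent.

```python
import numpy as np

def mutual_matches_with_ratio(dist, ratio_thresh=0.8):
    """Mutual nearest-neighbour matching with a shortest/second-shortest ratio test.

    dist[i, j] is the distance between first straight line i and second straight line j.
    """
    nn_12 = dist.argmin(axis=1)          # closest second line for each first line
    nn_21 = dist.argmin(axis=0)          # closest first line for each second line
    pairs = []
    for i, j in enumerate(nn_12):
        if nn_21[j] != i:                # keep mutual nearest neighbours only
            continue
        row = np.sort(dist[i])
        if len(row) > 1 and row[1] > 0 and row[0] / row[1] > ratio_thresh:
            continue                     # ambiguous match, discard
        pairs.append((i, int(j)))
    return pairs
```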
8. The method for verifying precision according to claim 1, wherein an implementation method for obtaining the precision of the calibration parameter according to the relative position relationship between the three-dimensional reconstruction points and the reference points comprises:
performing point cloud registration on the three-dimensional reconstruction points and the reference points; and
obtaining a root-mean-square error between the three-dimensional reconstruction points and the corresponding reference points as the precision of the calibration parameter.

9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the method for verifying precision according to any one of claims 1 to 8.

10. An electronic device, comprising:
a memory storing a computer program; and
a processor in communication connection with the memory, configured to execute the method for verifying precision according to any one of claims 1 to 8 when the computer program is invoked.
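Claim 8 reports precision as a root-mean-square error over corresponding points after registration. Assuming the reconstruction points have already been brought into the reference frame (for example with the rigid alignment sketched after claim 3), the metric itself reduces to the function below; a small value indicates that the calibration parameter under test reproduces the known mold geometry faithfully.

```python
import numpy as np

def rmse(reconstructed, reference):
    """Root-mean-square error between corresponding Nx3 point sets."""
    return float(np.sqrt(np.mean(np.sum((reconstructed - reference) ** 2, axis=1))))
```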
NL2031065A 2021-09-02 2022-02-24 Method for verifying precision of calibration parameter, medium and electronic device NL2031065B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111027674.7A CN113674333B (en) 2021-09-02 2021-09-02 Precision verification method and medium for calibration parameters and electronic equipment

Publications (2)

Publication Number Publication Date
NL2031065A NL2031065A (en) 2022-05-09
NL2031065B1 true NL2031065B1 (en) 2023-06-16

Family

ID=78548146

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2031065A NL2031065B1 (en) 2021-09-02 2022-02-24 Method for verifying precision of calibration parameter, medium and electronic device

Country Status (2)

Country Link
CN (1) CN113674333B (en)
NL (1) NL2031065B1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2638524A2 (en) * 2010-11-09 2013-09-18 The Provost, Fellows, Foundation Scholars, & the other members of Board, of the College of the Holy & Undiv. Trinity of Queen Elizabeth near Dublin Method and system for recovery of 3d scene structure and camera motion from a video sequence
CN108230402B (en) * 2018-01-23 2021-09-21 北京易智能科技有限公司 Three-dimensional calibration method based on triangular pyramid model
CN109919911B (en) * 2019-01-26 2023-04-07 中国海洋大学 Mobile three-dimensional reconstruction method based on multi-view photometric stereo
CN109945853B (en) * 2019-03-26 2023-08-15 西安因诺航空科技有限公司 Geographic coordinate positioning system and method based on 3D point cloud aerial image
CN110363838B (en) * 2019-06-06 2020-12-15 浙江大学 Large-visual-field image three-dimensional reconstruction optimization method based on multi-spherical-surface camera model
CN110443840A (en) * 2019-08-07 2019-11-12 山东理工大学 The optimization method of sampling point set initial registration in surface in kind
CN111724446B (en) * 2020-05-20 2023-05-02 同济大学 Zoom camera external parameter calibration method for three-dimensional reconstruction of building
CN112991464B (en) * 2021-03-19 2023-04-07 山东大学 Point cloud error compensation method and system based on three-dimensional reconstruction of stereoscopic vision

Also Published As

Publication number Publication date
NL2031065A (en) 2022-05-09
CN113674333A (en) 2021-11-19
CN113674333B (en) 2023-11-07

Similar Documents

Publication Publication Date Title
US11544874B2 (en) System and method for calibration of machine vision cameras along at least three discrete planes
US20200065995A1 (en) System and method for tying together machine vision coordinate spaces in a guided assembly environment
CN109448055B (en) Monocular vision posture determining method and system
JP4230525B2 (en) Three-dimensional shape measuring method and apparatus
EP3108266B1 (en) Estimation and compensation of tracking inaccuracies
JP6573419B1 (en) Positioning method, robot and computer storage medium
WO2015132981A1 (en) Position measurement device and position measurement method
EP3421930B1 (en) Three-dimensional shape data and texture information generation system, photographing control program, and three-dimensional shape data and texture information generation method
JP6092530B2 (en) Image processing apparatus and image processing method
JP6928392B2 (en) Search method and system for vascular correspondence in multi-angle contrast
Park et al. Active calibration of camera-projector systems based on planar homography
JP2016019194A (en) Image processing apparatus, image processing method, and image projection device
BR112015013804B1 (en) measuring system for three-dimensional measurement of an underwater structure, method for laser triangulation of an underwater structure and non-transient computer-readable medium coded with instructions
WO2004044522A1 (en) Three-dimensional shape measuring method and its device
US9990739B1 (en) Method and device for fisheye camera automatic calibration
JP7151879B2 (en) Camera calibration device, camera calibration method, and program
JP2017144498A (en) Information processor, control method of information processor, and program
ES2924701T3 (en) On-screen position estimation
JP2008309595A (en) Object recognizing device and program used for it
JP4764896B2 (en) Camera calibration apparatus, camera calibration method, camera calibration program, and recording medium recording the program
NL2031065B1 (en) Method for verifying precision of calibration parameter, medium and electronic device
JP6065670B2 (en) Three-dimensional measurement system, program and method.
JP2014032161A (en) Image forming device and method
CN113963057B (en) Imaging geometric relation calibration method and device, electronic equipment and storage medium
US20240016550A1 (en) Scanner for intraoperative application