CN117036509A - Combined calibration method, device, equipment and storage medium - Google Patents

Combined calibration method, device, equipment and storage medium

Info

Publication number
CN117036509A
CN117036509A (application CN202311166267.3A)
Authority
CN
China
Prior art keywords
parameter
camera
measurement unit
frame
inertial measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311166267.3A
Other languages
Chinese (zh)
Inventor
晏凌云
薛利荣
熊传进
张沛尧
田蓉
赵斐斐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Huace Satellite Technology Co ltd
Shanghai Shuangwei Navigation Technology Co ltd
Shanghai Huace Navigation Technology Ltd
Original Assignee
Wuhan Huace Satellite Technology Co ltd
Shanghai Shuangwei Navigation Technology Co ltd
Shanghai Huace Navigation Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Huace Satellite Technology Co ltd, Shanghai Shuangwei Navigation Technology Co ltd, Shanghai Huace Navigation Technology Ltd filed Critical Wuhan Huace Satellite Technology Co ltd
Priority to CN202311166267.3A priority Critical patent/CN117036509A/en
Publication of CN117036509A publication Critical patent/CN117036509A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • G06T2207/30208Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Gyroscopes (AREA)

Abstract

The invention discloses a joint calibration method, device, equipment and storage medium, comprising the following steps: acquiring calibration plate data information of a calibration plate; initializing a first parameter and a second parameter between the camera and the inertial measurement unit according to the calibration plate data information; and performing integrated processing according to the first parameter and the second parameter to jointly calibrate the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit. This technical scheme solves the problems that the internal and external parameters are not calibrated synchronously and that the steps are complex, simplifies the workflow of calibrating the internal and external parameters of the camera and the IMU, and improves the global consistency of the internal and external parameters of the camera and the IMU.

Description

Combined calibration method, device, equipment and storage medium
Technical Field
The present invention relates to the field of vision processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for joint calibration.
Background
With the continuous improvement of computer and sensor hardware, using sensors to acquire external data, accurately estimate the sensor's position and attitude, and thereby localize and navigate a carrier has gradually developed from a limited requirement of professional fields into a ubiquitous requirement of the public. Among these techniques, VIO/VI-SLAM, which performs localization and mapping with two sensors, a camera and an inertial measurement unit (Inertial Measurement Unit, IMU), has developed rapidly. Before the VIO/VI-SLAM technique is used, the internal and external parameters of the camera and the IMU need to be determined by a calibration system.
The commonly used camera internal parameter calibration method is Zhang's calibration method, while external parameter calibration methods are divided into offline calibration and online calibration. Offline calibration photographs a calibration plate to back-calculate the camera pose, fits the camera pose trajectory, differentiates the pose to obtain the camera's acceleration and angular velocity, establishes a constraint relation with the acceleration and angular velocity measured by the IMU, and finally optimizes to obtain the camera-IMU external parameters. Online calibration fuses the motion states of the IMU and the camera during motion and directly solves the external parameters as an unknown state quantity; it is simple and convenient to operate, but the camera internal parameters still need to be known in advance, and estimating the camera motion from the epipolar constraint introduces errors, so the overall precision is lower than that of offline calibration. In the calibration methods currently adopted, the calibration of the camera and IMU internal and external parameters is carried out step by step: the internal parameters are determined first, and the external parameters are then determined based on the internal parameters, so the whole calibration procedure is complex.
Disclosure of Invention
The invention provides a combined calibration method, device, equipment and storage medium, which solve the problems that the internal and external parameters are not calibrated synchronously and that the steps are complex, simplify the workflow of calibrating the internal and external parameters of the camera and the IMU, and improve the global consistency of the internal and external parameters of the camera and the IMU.
In a first aspect, an embodiment of the present disclosure provides a joint calibration method, including:
acquiring calibration plate data information of a calibration plate;
initializing a first parameter of a camera and a second parameter between the camera and an inertial measurement unit according to the calibration plate data information, wherein the first parameter comprises an internal parameter of the camera and a related parameter thereof, and the second parameter comprises an external parameter between the camera and the inertial measurement unit and a related parameter thereof;
and carrying out integrated processing according to the first parameter and the second parameter, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit.
In a second aspect, embodiments of the present disclosure provide a joint calibration device, including:
the data information determining module is used for obtaining the calibration plate data information of the calibration plate;
the related parameter determining module is used for initializing a first parameter and a second parameter between the camera and the inertial measurement unit according to the calibration plate data information, wherein the first parameter comprises an internal parameter of the camera and related parameters thereof, and the second parameter comprises an external parameter between the camera and the inertial measurement unit and related parameters thereof;
and the combined calibration module is used for carrying out integrated processing according to the first parameter and the second parameter, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform a joint calibration method as provided by the embodiments of the first aspect described above.
In a fourth aspect, embodiments of the present disclosure provide a computer readable storage medium, where computer instructions are stored, where the computer instructions are configured to cause a processor to execute a joint calibration method provided in the foregoing first aspect embodiment.
According to the combined calibration method, device, equipment and storage medium described above, calibration plate data information of the calibration plate is acquired; a first parameter and a second parameter between the camera and the inertial measurement unit are initialized according to the calibration plate data information; and integrated processing is performed according to the first parameter and the second parameter to jointly calibrate the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit. This technical scheme solves the problems that the internal and external parameters are not calibrated synchronously and that the steps are complex, simplifies the workflow of calibrating the internal and external parameters of the camera and the IMU, and improves the global consistency of the internal and external parameters of the camera and the IMU.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a joint calibration method according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a joint calibration method according to a second embodiment of the present invention;
fig. 3 is an exemplary display diagram of checkerboard corner points involved in a joint calibration method according to a second embodiment of the present invention;
FIG. 4 is an exemplary diagram of a constraint relationship between a camera and an inertial measurement unit involved in a joint calibration method according to a second embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a joint calibration device according to a third embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and "target" in the description and claims of the present invention and in the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a joint calibration method according to an embodiment of the present invention, where the method may be performed by a joint calibration device, and the device may be implemented in hardware and/or software.
As shown in fig. 1, the method includes:
s101, acquiring calibration plate data information of the calibration plate.
In this embodiment, calibration plate data information can be understood as the data about the checkerboard calibration plate obtained after a time-synchronized visual-inertial system photographs the calibration checkerboard. It includes calibration plate image information from the three-dimensional checkerboard calibration plate images acquired by the camera and IMU data information acquired by the inertial measurement unit. It will be appreciated that the time-synchronized visual-inertial system includes a camera and an inertial measurement unit. Calibration plate image information can be understood as the corner pixel coordinate information obtained by extracting the checkerboard corners from the calibration plate images. IMU data information includes the frame attitude and frame position acquired at each data frame, as well as the inter-frame attitude change information between two adjacent frames or from the initial frame to the current frame.
Specifically, a time-synchronized visual-inertial system is aimed at the calibration checkerboard for shooting. During shooting, the whole system is fully translated and rotated to ensure that every axis of the IMU is excited and that the calibration plate remains fully within the camera's field of view throughout the motion. Images and IMU data are collected for about one minute, yielding a sufficient number of frames of calibration plate images and IMU data. The calibration plate images are then processed to extract the pixel coordinates of the checkerboard corners, giving the calibration plate image information, and the IMU data are parsed to obtain the frame attitude and frame position acquired at each data frame and the inter-frame attitude change information between two adjacent frames or from the initial frame to the current frame, completing the initial data acquisition for joint calibration.
S102, initializing a first parameter and a second parameter between the camera and the inertial measurement unit according to the calibration plate data information, wherein the first parameter comprises an internal parameter of the camera and a related parameter of the internal parameter, and the second parameter comprises an external parameter between the camera and the inertial measurement unit and a related parameter of the external parameter.
In this embodiment, the first parameter can be understood as parameter information for the camera internal parameters, including the camera internal parameter and its related parameters; the camera internal parameter here is the initial value of the camera internal parameters, and its related parameters include the camera pose and the inverse matrix of the camera pose relative to the world coordinate system established by the calibration plate. The second parameter can be understood as parameter information for the joint external parameters between the camera and the inertial measurement unit, including the external parameter and its related parameters; the external parameter is the external parameter rotation matrix between the camera and the inertial measurement unit, and its related parameters include the initial value of the gyroscope zero bias of the inertial measurement unit and the velocity and gravitational acceleration of the inertial measurement unit.
Specifically, data processing and calculation are performed separately on the calibration plate image information and the IMU data information in the calibration plate data information. From the checkerboard corner information extracted from the calibration plate images, a series of processing and calculation steps yields the initial value of the camera internal parameters, the camera pose, and the inverse matrix of the camera pose relative to the world coordinate system established by the calibration plate. From the frame attitude and frame position acquired at each data frame, the inter-frame attitude change information between two adjacent frames or from the initial frame to the current frame, and the constraint relation between adjacent data frames, a further series of processing and calculation steps yields the external parameter rotation matrix between the camera and the inertial measurement unit, the initial value of the gyroscope zero bias of the inertial measurement unit, and the velocity and gravitational acceleration of the inertial measurement unit.
S103, carrying out integrated processing according to the first parameter and the second parameter, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit.
In this embodiment, the target internal parameters may be understood as more accurate internal parameter data with respect to the camera internal parameters that are roughly determined at the time of initialization. The target outlier may be understood as more accurate outlier data relative to the joint outlier between the camera and the inertial measurement unit that is coarsely determined at initialization.
Specifically, the visual residual of re-projecting the corners of the three-dimensional calibration plate relative to the camera is determined according to the first parameter, and the IMU residuals of velocity, position, attitude and zero bias between two adjacent frames, relating the camera and the inertial measurement unit, are determined according to the second parameter. The visual residual and the IMU residual are combined to construct an error model based on a residual formula; a large amount of frame data is calculated and iterated according to the error model, multi-frame optimization is accumulated, and finally the target internal parameter of the camera and the target external parameter between the camera and the IMU are obtained simultaneously from the optimization result, realizing the joint calibration between the camera and the inertial measurement unit.
In the embodiment, the calibration plate data information of the calibration plate is obtained; initializing a first parameter and a second parameter between the camera and the inertial measurement unit according to the calibration plate data information; and carrying out integrated processing according to the first parameter and the second parameter, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit. According to the technical scheme, the problems that the internal and external parameters are not calibrated synchronously and the steps are complex are solved, the operation flow of the calibration of the internal and external parameters of the camera and the IMU is simplified, and the global consistency of the internal and external parameters of the camera and the IMU is improved.
Example two
Fig. 2 is a flowchart of a joint calibration method according to a second embodiment of the present invention, where any of the above embodiments is further optimized, and the method may be applied to a case of joint calibration of a camera and an inertial measurement unit, where the method may be performed by a joint calibration device, and the device may be implemented in hardware and/or software.
As shown in fig. 2, the method includes:
s201, acquiring calibration plate data information of the calibration plate.
S202, initializing a first parameter according to checkerboard corner information in calibration plate data information.
In this embodiment, the checkerboard corner information can be understood as relevant parameter information of the corner of the checkerboard calibration board, including pixel coordinate information of the corner.
Specifically, after the calibration plate data information is determined for each frame of calibration plate image, multi-frame accumulated calculation is performed according to the image pixel coordinates of the checkerboard corners in the calibration plate data information, the actual position coordinates of the checkerboard corners on the calibration plate, and the correspondence between the two sets of coordinates, completing the initialization of first parameter information such as the initial value of the camera internal parameters, the camera pose, and the inverse matrix of the camera pose relative to the world coordinate system established by the calibration plate.
Further, initializing a first parameter according to checkerboard corner information in calibration board data information, including:
s2021, determining corner position coordinates and corner pixel coordinates according to the checkerboard corner information in the calibration plate data information.
In this embodiment, the corner position coordinates may be understood as actual position coordinates of the corner points of the checkerboard in the checkerboard calibration plate image. Corner pixel coordinates can be understood as pixel value coordinates of the corner points of the checkerboard in the checkerboard calibration plate image.
Specifically, the geometric relationship of the three-dimensional calibration plate is used to obtain the three-dimensional coordinates of each corner, i.e. the corner position coordinates $[X, Y, Z]^T$, where $X, Y, Z$ are the coordinate values along the three axes of the three-dimensional coordinate system and $T$ denotes the matrix transpose; in homogeneous coordinates, $[X, Y, Z]^T$ can be written as $[X, Y, Z, 1]^T$. Checkerboard corner extraction is performed on the captured picture containing the calibration plate to obtain the corner pixel coordinates $[\hat{u}, \hat{v}]^T$, which in homogeneous coordinates can be written as $[\hat{u}, \hat{v}, 1]^T$, where $\hat{u}$ and $\hat{v}$ are the pixel values of the corner position.
Fig. 3 is an exemplary display diagram of checkerboard corners involved in a joint calibration method according to a second embodiment of the present invention. Let the grid size of the calibration plate be 40 mm and the three-dimensional coordinates of the upper-left corner be (0, 0, 0); the coordinates of the lower-right corner are then (80 mm, 80 mm, 0).
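As a minimal sketch of generating such corner position coordinates, assuming a 3×3 inner-corner grid with 40 mm squares and the upper-left corner at the origin (illustrative values matching the Fig. 3 example, not prescribed by the patent):

```python
import numpy as np

square = 0.04      # 40 mm square size, in metres (assumed)
rows, cols = 3, 3  # assumed inner-corner grid for illustration
# Upper-left corner at the world origin; the plate lies in the z = 0 plane.
corners_world = np.array(
    [[c * square, r * square, 0.0] for r in range(rows) for c in range(cols)]
)
```

With this layout, `corners_world[-1]` is the lower-right corner at (0.08 m, 0.08 m, 0), matching the example above.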
S2022, initializing a first parameter according to the corresponding relation between the angular point position coordinates and the angular point pixel coordinates.
In this embodiment, the corner pixel coordinates $[\hat{u}, \hat{v}, 1]^T$ of each frame of the calibration plate image and the corner position coordinates $[X, Y, Z, 1]^T$ are placed in one-to-one correspondence. By calculating over the corner position coordinate and corner pixel coordinate pairs of the multi-frame calibration plate images, the camera poses and the initial value $K$ of the camera internal parameters are back-calculated, where $K$ is the camera internal parameter matrix, $k$ denotes the frame index of the calibration plate image, $T^{c_k}_w$ is the inverse matrix of the pose of the world coordinate system established by the calibration plate relative to the camera at frame $k$, $w$ denotes the world coordinate system, and $c$ denotes the camera.
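The projection relation behind this back-calculation can be sketched numerically as follows; the internal parameter matrix, pose and corner values are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Pinhole model: s * [u, v, 1]^T = K @ (R @ X + t), relating a corner's
# world coordinates [X, Y, Z]^T to its pixel coordinates.
K = np.array([[800.0,   0.0, 320.0],   # assumed focal lengths / principal point
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # camera axes aligned with plate (assumed)
t = np.array([0.0, 0.0, 0.5])          # plate 0.5 m in front of the camera

X = np.array([0.04, 0.04, 0.0])        # one corner of a 40 mm checkerboard
uvw = K @ (R @ X + t)
u, v = uvw[:2] / uvw[2]
```

Zhang's method effectively runs this relation in reverse: given many corner/pixel pairs over many frames, it solves for $K$ and the per-frame pose.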
S203, initializing a second parameter between the camera and the inertial measurement unit according to the inter-frame attitude change in the calibration plate data information and the data frame information.
In this embodiment, the inter-frame pose change may be understood as a change in IMU pose in IMU data of two adjacent frames or a change in IMU pose in IMU data from an initial frame (0 th frame) to a current frame (k+1 th frame). The data frame information may be understood as information related to IMU data of each frame, including frame pose and frame position.
Specifically, according to the change of IMU attitude between two adjacent frames of IMU data in the calibration plate data information, or from the initial frame to the current frame, together with the frame attitude and frame position of each frame of IMU data, and combined with the constraint relation between two adjacent data frames, the initialization of second parameter information such as the external parameter rotation matrix between the camera and the inertial measurement unit, the initial value of the gyroscope zero bias of the inertial measurement unit, and the velocity and gravitational acceleration of the inertial measurement unit is completed.
Further, determining a second parameter between the camera and the inertial measurement unit according to the inter-frame pose change in the calibration plate data information and the data frame information, including:
s2031, initializing an external parameter rotation matrix between the camera and the inertial measurement unit and a gyro zero offset initial value of the inertial measurement unit according to the inter-frame posture change in the calibration plate data information.
In this embodiment, the extrinsic rotation matrix may be understood as an initial extrinsic between the camera and the inertial measurement unit, and the initial value of zero bias of the gyroscope may be understood as a measurement error of the gyroscope in inertial navigation.
Specifically, according to the calibration plate data information, the inter-frame attitude change of the camera between every two adjacent frames, the inter-frame attitude change of the IMU between every two adjacent frames, and the attitude change of the camera between the initial frame and the current frame are obtained. A constraint relation between the camera and the IMU is established from these, the external parameter rotation matrix between the camera and the inertial measurement unit is calculated and determined according to the constraint relation, and a least squares problem is constructed based on the constraint relation to determine the initial value of the gyroscope zero bias in the inertial measurement unit.
Further, determining a gyro zero offset initial value of the inertial measurement unit relative to the camera according to the inter-frame attitude change in the calibration plate data information includes:
a1, determining an external parameter rotation matrix between the camera and the inertia measurement unit according to the first inter-frame posture change of the camera in two adjacent frames and the second inter-frame posture change of the inertia measurement unit in two adjacent frames in the calibration plate data information.
In this embodiment, the first inter-frame pose change may be understood as a pose change of the camera between acquisition of two adjacent frames of calibration plate images. The second inter-frame attitude change is the attitude change of the inertial measurement unit between acquisition of two adjacent frames of IMU data.
Specifically, let $q^{c_k}_{c_{k+1}}$ denote the first inter-frame attitude change of the camera from frame $k$ to frame $k+1$, and $q^{i_k}_{i_{k+1}}$ the second inter-frame attitude change of the IMU from frame $k$ to frame $k+1$, where $c$ denotes the camera, $i$ denotes IMU inertial navigation, and $q$ denotes the rotation-related attitude expressed as a quaternion.
Here, $q^{i_k}_{i_{k+1}}$ is calculated by IMU pre-integration; specifically,
$$q^{i_k}_{i_{k+1}} = \int_{t_k}^{t_{k+1}} \frac{1}{2}\, q^{i_k}_{t} \otimes \begin{bmatrix} 0 \\ \omega_t \end{bmatrix} \mathrm{d}t,$$
where $i$ denotes the IMU inertial measurement unit, $t$ is the frame-data acquisition time, $\otimes$ denotes quaternion multiplication, and $\omega_t$ is the angular velocity output by the gyroscope at time $t$.
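A numerical sketch of this pre-integration, assuming a [w, x, y, z] quaternion convention and a simple first-order integrator (neither is prescribed by the patent):

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

# Integrate dq/dt = 0.5 * q (x) [0, w_t] for a constant angular velocity of
# 90 deg/s about z over one second; the result is the half-angle quaternion
# for a 45-degree rotation angle parameter (i.e. a 90-degree rotation).
omega = np.array([0.0, 0.0, np.pi / 2])
q = np.array([1.0, 0.0, 0.0, 0.0])
dt = 1e-4
for _ in range(10000):
    q = q + 0.5 * qmul(q, np.concatenate(([0.0], omega))) * dt
    q = q / np.linalg.norm(q)  # keep the quaternion unit-norm
```

A real pre-integration would use measured gyro samples between the two image frames instead of a constant `omega`.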
Here, $q^{c_k}_{c_{k+1}}$ is obtained by back-calculation from the poses of two adjacent camera frames, i.e. by multiplying $T^{c_k}_w$ with the inverse matrix of $T^{c_{k+1}}_w$, where $T^{c_k}_w$ is the inverse matrix of the pose of the world coordinate system established by the calibration plate relative to the camera at frame $k$, and $T^{c_{k+1}}_w$ is the inverse matrix of that pose at frame $k+1$.
Specifically, Fig. 4 is an exemplary diagram of the constraint relationship between the camera and the inertial measurement unit involved in a joint calibration method according to a second embodiment of the present invention. As shown in Fig. 4, from the first inter-frame attitude change $q^{c_k}_{c_{k+1}}$ of the camera from frame $k$ to frame $k+1$ and the second inter-frame attitude change $q^{i_k}_{i_{k+1}}$ of the IMU from frame $k$ to frame $k+1$, the constraint relation
$$q^{c}_{i} \otimes q^{i_k}_{i_{k+1}} = q^{c_k}_{c_{k+1}} \otimes q^{c}_{i}$$
is established, and the external parameter rotation $q^{c}_{i}$ between the camera ($c$) and the IMU ($i$) is solved from the known first inter-frame attitude change of the camera and second inter-frame attitude change of the IMU.
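One common way to solve a constraint of this form (assumed here; the patent does not prescribe a particular solver) is to rewrite it as a homogeneous linear system in the quaternion and take the SVD null vector. A self-contained sketch with synthetic rotations, using the [w, x, y, z] convention:

```python
import numpy as np

def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def left_mat(q):
    # L(q) such that q (x) p == L(q) @ p
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def right_mat(q):
    # R(q) such that p (x) q == R(q) @ p
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def axis_angle_q(axis, angle):
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

rng = np.random.default_rng(0)
q_ci_true = axis_angle_q(np.array([1.0, 2.0, 3.0]), 0.7)  # assumed extrinsic

# Synthetic inter-frame rotations: the IMU observes the camera rotation
# conjugated by the (unknown) extrinsic rotation.
blocks = []
for _ in range(6):
    q_c = axis_angle_q(rng.normal(size=3), rng.uniform(0.1, 1.0))
    q_i = qmul(qmul(qconj(q_ci_true), q_c), q_ci_true)
    # Constraint q_c (x) q_ci = q_ci (x) q_i  =>  (L(q_c) - R(q_i)) q_ci = 0
    blocks.append(left_mat(q_c) - right_mat(q_i))

A = np.vstack(blocks)
_, _, Vt = np.linalg.svd(A)
q_ci = Vt[-1] / np.linalg.norm(Vt[-1])  # null vector, up to sign
```

With noisy real data, the same SVD picks the right singular vector of the smallest singular value as the least-squares estimate of $q^c_i$.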
b1, determining a gyro zero offset initial value of the inertial measurement unit relative to the camera according to the external parameter rotation matrix, the second inter-frame posture change and a third inter-frame posture change of the camera in the calibration plate data information, wherein the third inter-frame posture change is the posture change of the camera acquisition image from the initial frame to the current frame.
In this embodiment, the third inter-frame pose change may be understood as a pose change from the initial frame to the current frame during the process of capturing all calibration plate images by the camera. The initial frame is the 0 th frame, and the current frame is the k+1th frame.
Specifically, according to the external parameter rotation $q^{c}_{i}$, the second inter-frame attitude change $q^{i_k}_{i_{k+1}}$, and the third inter-frame attitude change $q^{c_0}_{c_{k+1}}$ of the camera from frame 0 to frame $k+1$ in the calibration plate data information, a least squares problem is constructed by establishing the relation
$$\min_{\delta b_g} \sum_{k \in \mathcal{B}} \left\| \left(q^{c_0}_{c_{k+1}}\right)^{-1} \otimes q^{c_0}_{c_k} \otimes q^{i_k}_{i_{k+1}} \otimes \begin{bmatrix} 1 \\ \tfrac{1}{2}\, J^{q}_{b_g}\, \delta b_g \end{bmatrix} \right\|^2,$$
where $b_g$ is the gyroscope zero bias ($b$ for bias, $g$ for gyroscope), $\mathcal{B}$ is the set of all image frames, $J^{q}_{b_g}$ is the Jacobian matrix of the pre-integrated rotation with respect to the gyroscope bias, and the camera attitude changes are related to the IMU frame through the external parameter rotation $q^{c}_{i}$. Considering only the imaginary part of the quaternion in this expression and solving, the initial value $\delta b_g$ of the gyroscope zero bias is obtained after calculation.
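Under the small-angle approximation this residual is linear in $\delta b_g$, so it reduces to ordinary linear least squares. A toy sketch with synthetic angular velocities and an assumed constant bias (illustrative values only):

```python
import numpy as np

# Small-angle model: the gyro-integrated rotation vector exceeds the
# camera-derived one by roughly (bias * dt), so stacking intervals gives a
# linear least-squares problem for the bias correction delta_bg.
rng = np.random.default_rng(2)
b_true = np.array([0.01, -0.02, 0.005])  # assumed gyroscope bias, rad/s
dt = 0.5
J_blocks, residuals = [], []
for _ in range(5):
    w = rng.normal(size=3)               # true angular velocity, rad/s
    theta_imu = (w + b_true) * dt        # gyro-integrated rotation vector
    theta_cam = w * dt                   # rotation recovered from camera poses
    J_blocks.append(np.eye(3) * dt)      # d(theta_imu)/d(bias)
    residuals.append(theta_imu - theta_cam)

J = np.vstack(J_blocks)
r = np.concatenate(residuals)
delta_bg, *_ = np.linalg.lstsq(J, r, rcond=None)
```

In the full method the Jacobian blocks come from the pre-integration ($J^{q}_{b_g}$) rather than the identity scaled by $\Delta t$ used in this toy version.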
S2032, initializing a speed initial value and a gravity acceleration initial value of the inertial measurement unit according to data frame information in the calibration plate data information.
In this embodiment, the data frame information may be understood as the specific information of each frame of IMU data, including the frame pose and the frame position. The speed initial value and the gravitational acceleration initial value are the velocity and the gravitational acceleration of the IMU inertial measurement unit when in motion.
Specifically, according to the known frame pose and frame position in each frame of IMU data while the inertial measurement unit is in motion, and in combination with the set speed and position constraint relation between two adjacent frames, a series of initial values to be estimated are determined. These are converted into a least squares problem and solved, obtaining the IMU speed initial value and the gravitational acceleration initial value and thereby realizing the initialization of the speed and gravitational acceleration of the inertial measurement unit.
Further, initializing the speed initial value and the gravitational acceleration initial value of the inertial measurement unit according to the data frame information in the calibration plate data information includes:
and initializing and determining a speed initial value and a gravity acceleration initial value of the inertial measurement unit according to the frame position and the frame pose of the inertial measurement unit in the calibration plate data information, in combination with the set constraint relation between two adjacent data frames.
Specifically, for the IMU, the k-th frame and the (k+1)-th frame satisfy the following speed and position constraint relations:
p_{k+1} = p_k + v_k·Δt + ½·g^w·Δt² + R_k·α̂_{k,k+1}
v_{k+1} = v_k + g^w·Δt + R_k·β̂_{k,k+1}
where p_k, v_k and R_k are respectively the position, speed and attitude of the k-th frame in the world coordinate system w, and g^w is the gravitational acceleration in the world coordinate system w. Here p_k, R_k, α̂_{k,k+1} and β̂_{k,k+1} are known quantities, while v_k and g^w are the unknowns to be solved, and Δt is the interval time of two adjacent frames of data acquisition. Putting all initial values to be estimated on the right and letting X = [v_0, v_1, …, v_n, g^w]^T, the two constraints are rewritten as ẑ_{k,k+1} = H_{k,k+1}·X + n_{k,k+1}.
In the above, ẑ_{k,k+1} summarizes the calculation of all known quantities, H_{k,k+1} is the equation coefficient matrix built from Δt and identity blocks I, X is the unknown vector to be solved, and n_{k,k+1} is the error caused by noise. This is converted into a linear least squares problem min_X ‖ ẑ − H·X ‖², which is solved using Cholesky decomposition of the normal equations Hᵀ·H·X = Hᵀ·ẑ, obtaining the IMU speed initial value and the gravitational acceleration initial value.
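The linear system and its Cholesky solution can be sketched in a simplified world-frame form (the body-frame rotations R of the full formulation are dropped, i.e. taken as identity; alpha and beta stand for the known pre-integrated terms; all names are illustrative):

```python
import numpy as np

def init_velocity_gravity(positions, alphas, betas, dt):
    """Stack the per-interval constraints
        p[k+1] - p[k] = v[k]*dt + 0.5*g*dt^2 + alpha[k]
        v[k+1] - v[k] = g*dt + beta[k]
    into H x = z with x = [v_0 .. v_K, g], and solve the normal
    equations H^T H x = H^T z by Cholesky decomposition."""
    K = len(alphas)                 # number of intervals
    n = 3 * (K + 1) + 3             # unknowns: K+1 velocities + gravity
    I = np.eye(3)
    H, z = [], []
    for k in range(K):
        # position constraint: v[k]*dt + 0.5*g*dt^2 = dp - alpha
        row = np.zeros((3, n))
        row[:, 3*k:3*k+3] = dt * I
        row[:, -3:] = 0.5 * dt**2 * I
        H.append(row)
        z.append(positions[k+1] - positions[k] - alphas[k])
        # velocity constraint: -v[k] + v[k+1] - g*dt = beta
        row = np.zeros((3, n))
        row[:, 3*k:3*k+3] = -I
        row[:, 3*(k+1):3*(k+1)+3] = I
        row[:, -3:] = -dt * I
        H.append(row)
        z.append(betas[k])
    H, z = np.vstack(H), np.concatenate(z)
    A, b = H.T @ H, H.T @ z
    L = np.linalg.cholesky(A)       # A = L L^T, SPD for >= 2 intervals
    x = np.linalg.solve(L.T, np.linalg.solve(L, b))
    return x[:-3].reshape(-1, 3), x[-3:]   # velocities, gravity
```

With at least two intervals the system is fully determined, so on noise-free synthetic data the gravity vector and the frame velocities are recovered exactly.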
S2033, determining the external parameter rotation matrix, the gyro zero offset initial value, the speed initial value, and the gravitational acceleration initial value as the second parameter between the camera and the inertial measurement unit.
S204, determining a visual residual term relative to the camera according to the first parameter.
In this embodiment, the visual residual term r_c may be understood as the error of the camera vision, i.e. the re-projection error.
Specifically, according to the first parameter, namely the camera internal parameter initial value K and the inverse matrix of the pose of the world coordinate system established by the camera relative to the calibration plate, the pixel corner coordinates [u_{k+1}, v_{k+1}, 1]^T of the (k+1)-th frame obtained by re-projection, the corner coordinates [X, Y, Z, 1]^T of the calibration plate in the three-dimensional world, and the transformation matrix T_{k,k+1} between the k-th frame and the (k+1)-th frame of the IMU, the re-projection relation
s·[u_{k+1}, v_{k+1}, 1]^T = K·(R_{k+1}·[X, Y, Z]^T + t_{k+1})
is applied, where the camera pose (R_{k+1}, t_{k+1}) of the (k+1)-th frame is obtained by chaining the world-to-camera pose with the IMU inter-frame transformation T_{k,k+1} and the camera–IMU extrinsic. The visual residual term r_c relative to the camera is then determined as the difference between the observed and the re-projected corner pixel coordinates.
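A minimal sketch of the re-projection residual with a plain pinhole model, collapsing the pose chain into a single rotation R and translation t (names and numbers are illustrative):

```python
import numpy as np

def reprojection_residual(K, R, t, X_world, uv_observed):
    """Visual residual r_c: observed pixel corner minus the corner
    re-projected through intrinsics K and camera pose (R, t)."""
    X_cam = R @ X_world + t          # world point into the camera frame
    x = K @ X_cam                    # homogeneous pixel coordinates
    uv_proj = x[:2] / x[2]           # perspective division
    return uv_observed - uv_proj

K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
X = np.array([0.1, -0.05, 2.0])      # a calibration-plate corner (m)
uv = np.array([345.0, 227.5])        # matching pixel observation
r_c = reprojection_residual(K, R, t, X, uv)
```

For this consistent point/observation pair the residual is zero; any pixel offset in the observation shows up directly in r_c.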
S205, determining an inertial navigation residual term relative to the inertial measurement unit according to the second parameter.
In this embodiment, the inertial navigation residual term r_i may be understood as the error of the inertial measurement unit IMU.
Specifically, according to the second parameter, namely the external parameter rotation matrix, the gyro zero offset initial value, the speed initial value and the gravitational acceleration initial value of the IMU, together with the pose and position information, the inertial navigation residual term relative to the inertial measurement unit is determined as
r_i = [δα, δθ, δβ, δb_a, δb_g]^T
where δα, δθ and δβ are respectively the position, attitude and speed pre-integration residuals of the IMU between the k-th frame and the (k+1)-th frame, δb_a is the acceleration zero offset residual, and δb_g is the gyroscope zero offset residual.
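The five blocks of r_i can be sketched as follows, in a simplified form that omits the bias-Jacobian correction of the pre-integrated terms (alpha, beta, gamma are the known pre-integrated measurements; all names are illustrative):

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_conj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
        [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
        [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)]])

def imu_residual(p_k, v_k, q_k, ba_k, bg_k,
                 p_k1, v_k1, q_k1, ba_k1, bg_k1,
                 alpha, beta, gamma, g, dt):
    """15-dim inertial residual [d_alpha, d_theta, d_beta, d_ba, d_bg]."""
    Rk = quat_to_rot(q_k)
    d_alpha = Rk.T @ (p_k1 - p_k - v_k*dt - 0.5*g*dt**2) - alpha
    d_beta  = Rk.T @ (v_k1 - v_k - g*dt) - beta
    e = quat_mul(quat_conj(gamma), quat_mul(quat_conj(q_k), q_k1))
    d_theta = 2.0 * e[1:]            # imaginary part of the error quaternion
    return np.concatenate([d_alpha, d_theta, d_beta,
                           ba_k1 - ba_k, bg_k1 - bg_k])
```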
S206, performing integrated residual term optimization processing on the visual residual term and the inertial navigation residual term according to the set error model, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit.
In this embodiment, the set error model may be understood as a calculation formula set in advance for performing the residual term optimization process.
Specifically, according to the set error model
min Σ_{k∈B} ‖ r_i ‖² + Σ_{(l,j)∈C} ‖ r_c ‖²
the sum of r_i and r_c is driven toward zero; the set error model is solved, and the integrated residual term optimization processing is completed over the data of a large number of frames, so that the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit are jointly calibrated, making the time-synchronized visual inertial system more accurate. Here, B is the data frame set and C is the image frame set.
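A toy illustration of the integrated optimization, not the patent's full problem: a single focal length and a single gyro bias are refined jointly by Gauss–Newton on the stacked visual and inertial residuals. Because both residuals are linear in the unknowns here, one iteration already reaches the minimum; every name and number is illustrative:

```python
import numpy as np

def residuals(theta, pts, u_obs, w_true, w_meas, cx=320.0):
    """Stacked residual vector [r_c; r_i] for unknowns theta = (f, b)."""
    f, b = theta
    r_c = u_obs - (f * pts[:, 0] / pts[:, 1] + cx)   # visual residuals
    r_i = w_meas - (w_true + b)                      # inertial residuals
    return np.concatenate([r_c, r_i])

def jacobian(theta, pts, w_true):
    """Analytic Jacobian of the stacked residual w.r.t. (f, b)."""
    J = np.zeros((len(pts) + len(w_true), 2))
    J[:len(pts), 0] = -pts[:, 0] / pts[:, 1]         # d r_c / d f
    J[len(pts):, 1] = -1.0                           # d r_i / d b
    return J

def gauss_newton(theta0, *args, iters=5):
    """Minimize ||r(theta)||^2 by repeated normal-equation steps."""
    theta = np.array(theta0, dtype=float)
    for _ in range(iters):
        r = residuals(theta, *args)
        J = jacobian(theta, args[0], args[2])
        theta = theta - np.linalg.solve(J.T @ J, J.T @ r)
    return theta
```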
In this embodiment, the first parameter is initialized according to the checkerboard corner information in the calibration plate data information; the second parameter between the camera and the inertial measurement unit is initialized according to the inter-frame pose change and the data frame information in the calibration plate data information; the visual residual term relative to the camera is determined according to the first parameter; the inertial navigation residual term relative to the inertial measurement unit is determined according to the second parameter; and the integrated residual term optimization processing is performed on the visual residual term and the inertial navigation residual term according to the set error model, so that the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit are jointly calibrated. This technical scheme solves the problems of asynchronous calibration of the internal and external parameters and complex steps, simplifies the operation flow of calibrating the internal and external parameters of the camera and the IMU, and improves the global consistency of the internal and external parameters of the camera and the IMU.
Example III
Fig. 5 is a schematic structural diagram of a joint calibration device according to a third embodiment of the present invention. As shown in fig. 5, the apparatus includes:
the data information determining module 31 is configured to obtain calibration plate data information of the calibration plate;
a related parameter determining module 32, configured to initialize a first parameter and a second parameter between the camera and the inertial measurement unit according to the calibration plate data information, where the first parameter includes an internal parameter of the camera and a related parameter thereof, and the second parameter includes an external parameter between the camera and the inertial measurement unit and a related parameter thereof;
and the joint calibration module 33 is used for performing integrated processing according to the first parameter and the second parameter, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit.
The joint calibration device of the above technical scheme solves the problems of asynchronous calibration of the internal and external parameters and complex operation steps, simplifies the operation flow of calibrating the internal and external parameters of the camera and the IMU, and improves the global consistency of the internal and external parameters of the camera and the IMU.
Optionally, the related parameter determining module 32 includes:
the internal parameter determining submodule is used for initializing a first parameter according to the checkerboard corner information in the calibration plate data information;
and the external parameter determination submodule is used for initializing a second parameter between the camera and the inertial measurement unit according to the inter-frame pose change and the data frame information in the calibration plate data information.
Optionally, the internal parameter determining submodule is specifically configured to:
respectively determining corner point position coordinates and corner point pixel coordinates according to the checkerboard corner information in the calibration plate data information;
and initializing a first parameter according to the correspondence between the corner point position coordinates and the corner point pixel coordinates.
Optionally, the external parameter determination submodule includes:
the gyro zero offset determining unit is used for initializing an external parameter rotation matrix between the camera and the inertial measurement unit and a gyro zero offset initial value of the inertial measurement unit according to the inter-frame pose change in the calibration plate data information;
the speed determining unit is used for initializing the speed initial value and the gravitational acceleration initial value of the inertial measurement unit according to the data frame information in the calibration plate data information;
and the related parameter determining unit is used for determining the external parameter rotation matrix, the gyro zero offset initial value, the speed initial value and the gravity acceleration initial value as the second parameter between the camera and the inertial measurement unit.
Optionally, the gyro zero offset determining unit is specifically configured to:
determining an external parameter rotation matrix between the camera and the inertial measurement unit according to the first inter-frame pose change of the camera in two adjacent frames and the second inter-frame pose change of the inertial measurement unit in two adjacent frames in the calibration plate data information;
and determining a gyro zero offset initial value of the inertial measurement unit according to the external parameter rotation matrix, the second inter-frame pose change, and a third inter-frame pose change of the camera in the calibration plate data information, wherein the third inter-frame pose change is the pose change of the images acquired by the camera from an initial frame to a current frame.
Optionally, the speed determining unit is specifically configured to:
and initializing a speed initial value and a gravity acceleration initial value of the inertial measurement unit relative to the camera according to the frame position and the frame pose of the inertial measurement unit in the calibration plate data information, in combination with the set constraint relation between two adjacent data frames.
Optionally, the joint calibration module 33 is specifically configured to:
determining a visual residual term relative to the camera according to the first parameter;
determining inertial navigation residual terms relative to the inertial measurement unit according to the second parameters;
and performing integrated residual term optimization processing on the visual residual term and the inertial navigation residual term according to a set error model, and jointly calibrating a target internal parameter of the camera and a target external parameter between the camera and the inertial measurement unit.
The joint calibration device provided by the embodiment of the invention can execute the joint calibration method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 6 shows a schematic diagram of an electronic device 40 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 6, the electronic device 40 includes at least one processor 41 and a memory communicatively connected to the at least one processor 41, such as a read-only memory (ROM) 42 and a random access memory (RAM) 43. The memory stores a computer program executable by the at least one processor, and the processor 41 may perform various suitable actions and processes according to the computer program stored in the ROM 42 or the computer program loaded from the storage unit 48 into the RAM 43. In the RAM 43, various programs and data required for the operation of the electronic device 40 may also be stored. The processor 41, the ROM 42 and the RAM 43 are connected to each other via a bus 44. An input/output (I/O) interface 45 is also connected to the bus 44.
Various components in electronic device 40 are connected to I/O interface 45, including: an input unit 46 such as a keyboard, a mouse, etc.; an output unit 47 such as various types of displays, speakers, and the like; a storage unit 48 such as a magnetic disk, an optical disk, or the like; and a communication unit 49 such as a network card, modem, wireless communication transceiver, etc. The communication unit 49 allows the electronic device 40 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 41 may be various general and/or special purpose processing components with processing and computing capabilities. Some examples of processor 41 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 41 performs the various methods and processes described above, such as a joint calibration method.
In some embodiments, a joint calibration method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 48. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 40 via the ROM 42 and/or the communication unit 49. When the computer program is loaded into RAM 43 and executed by processor 41, one or more of the steps of a joint calibration method described above may be performed. Alternatively, in other embodiments, the processor 41 may be configured to perform a joint calibration method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system that overcomes the defects of high management difficulty and weak service scalability in traditional physical hosts and VPS services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A joint calibration method, characterized by comprising the following steps:
acquiring calibration plate data information of a calibration plate;
initializing a first parameter of a camera and a second parameter between the camera and an inertial measurement unit according to the calibration plate data information, wherein the first parameter comprises an internal parameter of the camera and a related parameter thereof, and the second parameter comprises an external parameter between the camera and the inertial measurement unit and a related parameter thereof;
and carrying out integrated processing according to the first parameter and the second parameter, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit.
2. The method of claim 1, wherein initializing a first parameter of a camera and a second parameter between the camera and an inertial measurement unit based on the calibration plate data information comprises:
initializing a first parameter according to the checkerboard corner information in the calibration plate data information;
and initializing a second parameter between the camera and the inertial measurement unit according to the inter-frame pose change and the data frame information in the calibration plate data information.
3. The method of claim 2, wherein initializing the first parameter based on the checkerboard corner information in the calibration plate data information comprises:
respectively determining corner point position coordinates and corner point pixel coordinates according to the checkerboard corner information in the calibration plate data information;
and initializing the first parameter according to the correspondence between the corner point position coordinates and the corner point pixel coordinates.
4. The method of claim 2, wherein initializing a second parameter between a camera and an inertial measurement unit based on the inter-frame pose change and data frame information in the calibration plate data information comprises:
initializing an external parameter rotation matrix between the camera and the inertial measurement unit and a gyro zero offset initial value of the inertial measurement unit according to the inter-frame pose change in the calibration plate data information;
initializing a speed initial value and a gravity acceleration initial value of the inertial measurement unit according to the data frame information in the calibration plate data information;
and determining the external parameter rotation matrix, the gyro zero offset initial value, the speed initial value, and the gravity acceleration initial value as the second parameter between the camera and the inertial measurement unit.
5. The method of claim 4, wherein initializing the external parameter rotation matrix between the camera and the inertial measurement unit and the gyro zero offset initial value of the inertial measurement unit according to the inter-frame pose change in the calibration plate data information comprises:
determining an external parameter rotation matrix between the camera and the inertial measurement unit according to the first inter-frame pose change of the camera in two adjacent frames and the second inter-frame pose change of the inertial measurement unit in two adjacent frames in the calibration plate data information;
and determining a gyro zero offset initial value of the inertial measurement unit according to the external parameter rotation matrix, the second inter-frame pose change, and a third inter-frame pose change of the camera in the calibration plate data information, wherein the third inter-frame pose change is the pose change of the images acquired by the camera from an initial frame to a current frame.
6. The method of claim 4, wherein initializing the speed initial value and the gravity acceleration initial value of the inertial measurement unit according to the data frame information in the calibration plate data information comprises:
initializing the speed initial value and the gravity acceleration initial value of the inertial measurement unit according to the frame position and the frame pose of the inertial measurement unit in the calibration plate data information, in combination with the set constraint relation between two adjacent data frames.
7. The method of claim 1, wherein the integrating process based on the first parameter and the second parameter, in combination with calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit, comprises:
determining a visual residual term relative to the camera according to the first parameter;
determining inertial navigation residual terms relative to the inertial measurement unit according to the second parameters;
and performing integrated residual term optimization processing on the visual residual term and the inertial navigation residual term according to a set error model, and jointly calibrating a target internal parameter of the camera and a target external parameter between the camera and the inertial measurement unit.
8. A joint calibration device, comprising:
the data information determining module is used for obtaining the calibration plate data information of the calibration plate;
the related parameter determining module is used for initializing a first parameter and a second parameter between the camera and the inertial measurement unit according to the calibration plate data information, wherein the first parameter comprises an internal parameter of the camera and related parameters thereof, and the second parameter comprises an external parameter between the camera and the inertial measurement unit and related parameters thereof;
and the combined calibration module is used for carrying out integrated processing according to the first parameter and the second parameter, and jointly calibrating the target internal parameter of the camera and the target external parameter between the camera and the inertial measurement unit.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform a joint calibration method according to any one of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores computer instructions for causing a processor to implement a joint calibration method according to any one of claims 1-7 when executed.
CN202311166267.3A 2023-09-11 2023-09-11 Combined calibration method, device, equipment and storage medium Pending CN117036509A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311166267.3A CN117036509A (en) 2023-09-11 2023-09-11 Combined calibration method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311166267.3A CN117036509A (en) 2023-09-11 2023-09-11 Combined calibration method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117036509A true CN117036509A (en) 2023-11-10

Family

ID=88626555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311166267.3A Pending CN117036509A (en) 2023-09-11 2023-09-11 Combined calibration method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117036509A (en)

Similar Documents

Publication Publication Date Title
CN107255476B (en) Indoor positioning method and device based on inertial data and visual features
US20190371003A1 (en) Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium
US10247556B2 (en) Method for processing feature measurements in vision-aided inertial navigation
CN108871311B (en) Pose determination method and device
US20210183100A1 (en) Data processing method and apparatus
CN110879400A (en) Method, equipment and storage medium for fusion positioning of laser radar and IMU
CN110660098B (en) Positioning method and device based on monocular vision
CN111060138B (en) Calibration method and device, processor, electronic equipment and storage medium
CN111609868A (en) Visual inertial odometer method based on improved optical flow method
US11042984B2 (en) Systems and methods for providing image depth information
CN109767470B (en) Tracking system initialization method and terminal equipment
CN113361365B (en) Positioning method, positioning device, positioning equipment and storage medium
CN114013449A (en) Data processing method and device for automatic driving vehicle and automatic driving vehicle
CN109040525B (en) Image processing method, image processing device, computer readable medium and electronic equipment
CN111784834A (en) Point cloud map generation method and device and electronic equipment
CN117232499A (en) Multi-sensor fusion point cloud map construction method, device, equipment and medium
CN113610702B (en) Picture construction method and device, electronic equipment and storage medium
Cheng et al. AR-based positioning for mobile devices
CN117437348A (en) Computing device and model generation method
CN115311624B (en) Slope displacement monitoring method and device, electronic equipment and storage medium
CN117036509A (en) Combined calibration method, device, equipment and storage medium
CN115773759A (en) Indoor positioning method, device and equipment of autonomous mobile robot and storage medium
CN115727871A (en) Track quality detection method and device, electronic equipment and storage medium
CN112991445B (en) Model training method, gesture prediction method, device, equipment and storage medium
CN115239758A (en) Timestamp correction method, apparatus, device, medium, and computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination