CN113587924B - Shooting system calibration method, shooting system calibration device, computer equipment and storage medium - Google Patents

Shooting system calibration method, shooting system calibration device, computer equipment and storage medium

Info

Publication number
CN113587924B
CN113587924B CN202110664964.6A CN202110664964A
Authority
CN
China
Prior art keywords
rotation
time delay
value
calibration
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110664964.6A
Other languages
Chinese (zh)
Other versions
CN113587924A
Inventor
赖东东
谢亮
谭明朗
付伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Insta360 Innovation Technology Co Ltd
Original Assignee
Insta360 Innovation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Insta360 Innovation Technology Co Ltd filed Critical Insta360 Innovation Technology Co Ltd
Priority to CN202110664964.6A priority Critical patent/CN113587924B/en
Publication of CN113587924A publication Critical patent/CN113587924A/en
Application granted granted Critical
Publication of CN113587924B publication Critical patent/CN113587924B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices

Abstract

The application relates to a shooting system calibration method, a shooting system calibration apparatus, computer equipment and a storage medium. The method comprises the following steps: determining a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit; and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay. The vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by shooting with the shooting system under rotation, and the rotation error represents the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit. Because the reference value of the time delay is determined before the time delay and the rotation external parameter between the vision system and the inertial measurement unit are calibrated, the time-delay deviation is kept small, which avoids the inaccurate calibration result, or even calibration failure, that an excessive time-delay deviation would cause.

Description

Shooting system calibration method, shooting system calibration device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of shooting calibration technologies, and in particular, to a shooting system calibration method and apparatus, a computer device, and a storage medium.
Background
A shooting system typically contains both an IMU (Inertial Measurement Unit) and a vision system. For example, a device that performs positioning and navigation carries a shooting system: the vision system in the shooting system captures the surrounding environment, while the IMU in the shooting system is used to solve for the device attitude. In other words, functions such as positioning and navigation can only be implemented on the premise that the relevant parameters between the vision system and the IMU have been precisely determined. Therefore, the shooting system needs to be calibrated to determine the relevant parameters between the vision system and the IMU.
In the prior art, calibration of the shooting system generally either calibrates only the time delay between the vision system and the IMU, or calibrates the time delay and the rotation external parameter simultaneously on the premise that a calibration-board video is used, or performs coarse calibration of the time delay by b-spline alignment, or constructs a residual equation and iteratively optimizes it so as to calibrate the rotation external parameter and the time delay simultaneously. In practice, when the timestamp interval between calibration video frames is large, or when a video shot without a calibration board is used, these calibration methods fail easily, and the calibration results they produce are inaccurate and error-prone.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a photographing system calibration method, apparatus, computer device, and storage medium that can solve the problem of inaccurate calibration results.
A method for calibrating a photographing system, the method comprising:
determining a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit;
and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by shooting with the shooting system under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
In one embodiment, determining the reference value of the time delay based on the rotational error between the vision system and the inertial measurement unit includes:
updating the first initial value of the time delay to obtain a first updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the first updated value of the time delay, repeating the updating and determining processes to determine the rotation error meeting a first preset condition, and taking the first updated value corresponding to the rotation error meeting the first preset condition as a reference value of the time delay.
In one embodiment, updating the first initial value of the time delay to obtain a first updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the first updated value of the time delay, and repeating the updating and determining processes to determine the rotation error satisfying a first preset condition, including:
updating the first initial value according to a preset step length until the updated first updated value is not located in the first time delay interval, wherein the first initial value is located in the first time delay interval;
determining a rotation error corresponding to a first updated value obtained by each update;
and determining the minimum value of all the determined rotation errors, and taking the minimum value as the rotation error meeting the first preset condition.
In one embodiment, determining a rotation error corresponding to a first updated value obtained by each update includes:
for a first updated value obtained by each update, adjusting a time point corresponding to each frame of image in the calibration video according to the first updated value;
acquiring a first rotation matrix between two adjacent frames of images in the calibration video, and acquiring a second rotation matrix corresponding to the two adjacent frames of images in the calibration video after the time point is adjusted, wherein the second rotation matrix is determined based on the measured value of the inertial measurement unit in a time period corresponding to the two adjacent frames of images;
And determining a rotation error corresponding to the first updated value according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images.
In one embodiment, determining the rotation error corresponding to the first update value according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images includes:
calculating the axis-angle difference corresponding to every two adjacent frames of images according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images;
and calculating the rotation error corresponding to the first updated value according to the axis-angle difference corresponding to every two adjacent frames of images.
In one embodiment, calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay includes:
determining a second time delay interval according to the reference value of the time delay, and determining a second initial value of the time delay in the second time delay interval;
updating the second initial value of the time delay to obtain a second updated value of the time delay, acquiring a reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the second updated value of the time delay and the reference value of the rotation external parameter, repeating the processes of updating the time delay and determining the rotation external parameter and the rotation error corresponding to the updated time delay until a rotation error meeting a second preset condition is determined, and taking the second updated value of the time delay and the reference value of the rotation external parameter corresponding to the rotation error meeting the second preset condition as a calibration result of the shooting system.
In one embodiment, obtaining the reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay includes:
according to the second updated value of the time delay, determining a hand-eye calibration equation set, wherein a rotation external parameter in the hand-eye calibration equation set is an unknown quantity;
and solving the hand-eye calibration equation set to obtain the reference value of the rotation external parameter.
A camera system calibration apparatus, the apparatus comprising:
the determining module is used for determining a reference value of the time delay according to the rotation error between the vision system and the inertial measurement unit;
the calibration module is used for calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by shooting with the shooting system under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
A computer device comprising a memory storing a computer program and a processor which when executing the computer program performs the steps of:
determining a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit;
and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by shooting with the shooting system under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
determining a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit;
and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by shooting with the shooting system under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
The shooting system calibration method, the shooting system calibration apparatus, the computer equipment and the storage medium determine the reference value of the time delay according to the rotation error between the vision system and the inertial measurement unit, and then calibrate the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay. Because the reference value of the time delay is determined before the time delay and the rotation external parameter are calibrated, the time-delay deviation is kept small, which avoids the inaccurate calibration result, or even calibration failure, that an excessive time-delay deviation would cause. In addition, the time delay is first coarsely calibrated, and the time delay and the rotation external parameter are then finely calibrated. Because the optimization range of the time delay is reduced, the amount of computation can be reduced and the calibration efficiency improved.
Drawings
FIG. 1 is a flow chart of a calibration method of a photographing system according to an embodiment;
FIG. 2 is a flow chart of a calibration method of a photographing system according to another embodiment;
FIG. 3 is a schematic diagram of delay adjustment in one embodiment;
FIG. 4 is a schematic diagram of a hand-eye calibration formula in one embodiment;
FIG. 5 is a graph illustrating time delay versus average error in one embodiment;
FIG. 6 is a block diagram of a calibration device of a photographing system according to an embodiment;
fig. 7 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like, as used herein, may be used to describe various elements, but these elements are not limited by the terms unless otherwise specified. The terms are only used to distinguish one element from another. For example, a third preset threshold and a fourth preset threshold may be the same or different without departing from the scope of the present application.
A shooting system typically contains both an IMU (Inertial Measurement Unit) and a vision system. For example, a device that performs positioning and navigation carries a shooting system: the vision system in the shooting system captures the surrounding environment, while the IMU in the shooting system is used to solve for the device attitude. In other words, functions such as positioning and navigation can only be implemented on the premise that the relevant parameters between the vision system and the IMU have been precisely determined. Therefore, the shooting system needs to be calibrated to determine the relevant parameters between the vision system and the IMU.
In the related art, calibration of the shooting system generally either calibrates only the time delay between the vision system and the IMU, or calibrates the time delay and the rotation external parameter simultaneously on the premise that a calibration-board video is used, or performs coarse calibration of the time delay by b-spline alignment, or constructs a residual equation and iteratively optimizes it so as to calibrate the rotation external parameter and the time delay simultaneously. In practice, when the timestamp interval between calibration video frames is large, or when a video shot without a calibration board is used, these calibration methods fail easily, and the calibration results they produce are inaccurate and error-prone.
In view of the above problems in the related art, an embodiment of the present invention provides a shooting system calibration method, which may be applied to a processing device such as a server; the processing device may also be, but is not limited to, various personal computers, notebook computers, and the like. It should be noted that terms such as "plural" mentioned in the embodiments of the present application mean "at least two".
Before explaining the specific embodiments of the present application, the main application scenario of the present application is described. The shooting system calibration method is mainly applied to scenarios in which a shooting system is calibrated, where the shooting system may be composed of a vision system and an inertial measurement unit. The shooting system may be a shooting device such as a camera, which is not specifically limited by the embodiment of the present invention. Taking the shooting system being a camera as an example, the vision system is the imaging module in the camera, and the inertial measurement unit may likewise be disposed in the camera. In combination with the foregoing embodiments, in one embodiment, referring to fig. 1, a shooting system calibration method is provided. The method is described as being applied to a terminal, with the terminal as the execution subject, and comprises the following steps:
101. Determining a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit;
102. and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by shooting with the shooting system under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
Before step 101 is executed, the internal parameters and distortion parameters of the shooting system may be calibrated in advance, and the shooting system shoots the calibration video while in a rotating state. When the calibration video is shot, the shooting system should, as far as possible, rotate to some extent in the yaw, pitch and roll directions, so that the calibration video covers the various factors that influence calibration and the subsequent calibration based on it is more accurate. The shooting system may rotate at a low speed, and furthermore may rotate at a variable speed. Low-speed rotation is used because, if the rotation is too fast, the calibration video becomes blurred; for two adjacent frames of images, feature points need to be extracted in the subsequent process to obtain the first rotation matrix, and a blurred calibration video makes the extracted feature points inaccurate, which affects the extraction and tracking precision. Variable-speed rotation is used because the motion of the shooting system during actual shooting does not change uniformly; variable-speed motion is the normal motion change. Therefore, letting the shooting system rotate at a variable speed allows the calibration video to cover the factor of variable-speed rotation, so that calibration based on the calibration video is more accurate.
In addition, according to the calibration video, a series of rotation states determined based on the vision system can be calculated, embodying the rotation state change between every two adjacent frames of images in the calibration video. The series of rotation states may be represented by a time series of different time periods; each time period may be the time period corresponding to two adjacent frames of images in the calibration video, and each time period corresponds to one rotation state. The rotation state change between every two adjacent frames of images describes the rotation the shooting system undergoes in the time period corresponding to those two frames. That is, the calculated series of rotation states determined based on the vision system in effect describes, from the perspective of the vision system, what rotation the shooting system has undergone.
Similarly, a series of rotation states determined based on the inertial measurement unit may be calculated; this series of rotation states may likewise be represented by a time series of different time periods, with one rotation state per time period. The different time periods of the time series may be the time periods corresponding to every two adjacent frames of images in the calibration video. The rotation state determined based on the inertial measurement unit for each time period describes, from the perspective of the inertial measurement unit, what rotation the shooting system has undergone.
Since both describe what rotation the shooting system has undergone during the different time periods of the time series, and if the time series corresponding to the two are the same, the rotation states determined by the two should be identical in the ideal case. Because of the time delay, however, there is a deviation between the rotation states determined by the two, namely a rotation error, and different time delays lead to different rotation errors. Based on this principle, the time delay can be determined from the rotation error between the vision system and the inertial measurement unit. Accordingly, when step 101 is performed, q discrete time-delay values may be chosen and the rotation error for each time-delay value calculated, and a time-delay value with a small rotation error selected as the reference value of the time delay. In summary, the process of step 101 may be understood as coarse calibration of the time delay. Specifically, in step 101 the time delay can be coarsely calibrated by b-spline alignment, although calibration errors occur easily when the timestamp interval is large; alternatively, the time delay can be coarsely calibrated by linear solving.
In step 102, the time delay and the rotation external parameter may be calibrated according to the reference value of the time delay. The calibration may be performed by constructing and solving a hand-eye calibration equation so that the two are calibrated simultaneously; the embodiment of the invention is not particularly limited in this respect. That is, besides constructing a hand-eye calibration equation, other equations may be constructed and solved, for example a time-delay calibration equation and a rotation-external-parameter calibration equation may be constructed and solved separately, or a residual equation may be constructed and iteratively optimized. The process of step 102 may be understood as fine calibration of the time delay and the rotation external parameter.
According to the method provided by the embodiment of the invention, the reference value of the time delay is determined according to the rotation error between the vision system and the inertial measurement unit, and the time delay and the rotation external parameter between the vision system and the inertial measurement unit are then calibrated according to the reference value of the time delay. Because the reference value of the time delay is determined before the time delay and the rotation external parameter are calibrated, the time-delay deviation is kept small, which avoids the inaccurate calibration result, or even calibration failure, that an excessive time-delay deviation would cause. In addition, the time delay is first coarsely calibrated, and the time delay and the rotation external parameter are then finely calibrated. Because the optimization range of the time delay is reduced, the amount of computation can be reduced and the calibration efficiency improved.
In combination with the foregoing embodiments, in one embodiment, the embodiment of the present invention does not specifically limit the manner in which the reference value of the time delay is determined based on the rotation error between the vision system and the inertial measurement unit, which includes but is not limited to: updating the first initial value of the time delay to obtain a first updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the first updated value of the time delay, repeating the updating and determining processes to determine a rotation error meeting a first preset condition, and taking the first updated value corresponding to the rotation error meeting the first preset condition as the reference value of the time delay.
In the above process, the first preset condition is mainly used to measure whether the rotation error determined based on the first updated value of the time delay is small enough, and the first preset condition may be set as required; the embodiment of the present invention does not specifically limit it. The first preset condition may be that the rotation error falls within a first preset range, or that the rotation error gradually converges to a certain value; the embodiment of the present invention is not particularly limited in this respect. In addition, the first initial value of the time delay may be selected within a certain range, such as [-maxT, maxT], and then updated. The update may be performed by random selection, which is not limited by the embodiment of the present invention.
According to the method provided by the embodiment of the invention, the first initial value of the time delay is updated to obtain a first updated value of the time delay, the rotation error between the vision system and the inertial measurement unit is determined according to the first updated value, and the updating and determining processes are repeated to find the rotation error meeting the first preset condition; the first updated value corresponding to that rotation error is taken as the reference value of the time delay. In this way, different first updated values of the time delay can be screened to find the one that meets the first preset condition and use it as the reference value, and since the first preset condition measures whether the rotation error determined from a given first updated value is small enough, the determined reference value is more accurate.
In combination with the foregoing embodiments, in one embodiment, the manner of updating the first initial value of the time delay to obtain a first updated value, determining a rotation error between the vision system and the inertial measurement unit according to the first updated value, and repeating the updating and determining processes to determine a rotation error satisfying the first preset condition is not particularly limited by the embodiment of the present invention. Referring to fig. 2, the process includes:
201. Updating the first initial value according to a preset step length until the updated first updated value is not located in the first time delay interval, wherein the first initial value is located in the first time delay interval;
202. determining a rotation error corresponding to a first updated value obtained by each update;
203. and determining the minimum value of all the determined rotation errors, and taking the minimum value as the rotation error meeting the first preset condition.
In the above process, when the first initial value is updated, the preset step length may be accumulated onto the first initial value, with each accumulated value taken as a first updated value of the time delay; alternatively, after accumulating the preset step length, several values around the accumulated value may also be selected as first updated values of the time delay. Taking the first time-delay interval [-maxT, maxT] as an example, the first initial value may lie in this interval, and the accumulation proceeds until the accumulated value exceeds the upper limit of the interval.
In addition, the preset step length may be set according to the calibration resolution res; for example, the preset step length may be an integer multiple of res, i.e. m*res. It should be appreciated that the smaller the preset step length, the smaller the change in the first updated value obtained at each update of the time delay, and thus the higher the calibration accuracy; correspondingly, the number of calibration steps and the amount of computation increase. That is, the smaller the preset step length, the higher the calibration accuracy.
According to the method provided by the embodiment of the invention, the first initial value is updated according to the preset step length until the updated first updated value is no longer located in the first time-delay interval, the rotation error corresponding to each first updated value is determined, and the minimum of all the determined rotation errors is taken as the rotation error meeting the first preset condition. In this way, the different first updated values of the time delay can be screened based on their rotation errors, and the first updated value meeting the first preset condition can be determined and used as the reference value of the time delay. Since the first preset condition measures whether the rotation error determined from a given first updated value is small enough, and the first updated value is repeatedly updated within the first time-delay interval according to the preset step length until a value meeting the first preset condition is found, calibration errors caused by an excessively large timestamp interval between the vision system and the inertial measurement unit can be avoided, and the determined reference value is more accurate.
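As an aid to understanding, the following Python sketch illustrates one possible form of the coarse time-delay search described above; the names max_t, res, m and the callable rotation_error_for_delay are placeholders assumed for illustration and are not part of the original disclosure.

```python
import numpy as np

def coarse_calibrate_delay(rotation_error_for_delay, max_t, res, m=1):
    """Sweep candidate delays in [-max_t, max_t] with step m*res and
    return the delay whose rotation error is smallest.

    rotation_error_for_delay: callable dt -> scalar rotation error,
    e.g. built from the calibration video and IMU data as in the text.
    """
    step = m * res
    candidates = np.arange(-max_t, max_t + step, step)
    errors = [rotation_error_for_delay(dt) for dt in candidates]
    best = int(np.argmin(errors))          # minimum rotation error
    return candidates[best], errors[best]  # reference value of the delay
```

An exhaustive sweep of this kind trades a modest amount of extra computation for robustness against local minima of the rotation error.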
In combination with the foregoing embodiments, in one embodiment, the manner of determining the rotation error corresponding to the first updated value obtained at each update is not specifically limited by the embodiment of the present invention, and includes but is not limited to: for the first updated value obtained at each update, adjusting the time point corresponding to each frame of image in the calibration video according to the first updated value; acquiring a first rotation matrix between every two adjacent frames of images in the calibration video, and acquiring a second rotation matrix corresponding to every two adjacent frames of images in the calibration video after the time points have been adjusted, wherein the second rotation matrix is determined based on the measured values of the inertial measurement unit in the time period corresponding to the two adjacent frames of images; and determining the rotation error corresponding to the first updated value according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images.
As shown in fig. 3, the black solid arrows in the upper half of fig. 3 are the time points corresponding to the images, which may be represented by the sampling time points of the images, and the smaller gray dotted arrows are the sampling times of the inertial measurement unit. If the time delay between the vision system and the inertial measurement unit is dt, then whether the vision system lags the inertial measurement unit or the inertial measurement unit lags the vision system, reference may be made to the gray solid arrows in the lower half of fig. 3. Suppose the first black solid arrow from the left in the upper half of fig. 3 represents the sampling time point of the first frame image and the second black solid arrow represents the sampling time point of the second frame image; the first gray solid arrow from the left in the lower half of fig. 3 also represents the sampling time point of the first frame image, and the second gray solid arrow likewise represents the sampling time point of the second frame image, except that the sampling time points of the latter are delayed by dt.
When the first rotation matrix between every two adjacent frames of images in the calibration video is obtained, the pixel correspondence between the two frames can be obtained by a direct method or a feature-point method, and the first rotation matrix between the two frames can then be calculated by decomposing the essential matrix or the fundamental matrix using the epipolar geometric constraint. It should be noted that, if the vision system uses rolling-shutter exposure, the feature points at different positions in an image may not be exposed at the same time; in order to make the calculated first rotation matrix between images closer to the middle moment of image exposure, the extracted feature points should be distributed as uniformly as possible over the image before the first rotation matrix is calculated from them.
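As one hedged illustration of the feature-point route just described, the sketch below uses OpenCV to estimate the relative rotation between two adjacent grayscale frames via the essential matrix; the intrinsic matrix K, the ORB features and the RANSAC threshold are assumptions for the example, not requirements of the method.

```python
import cv2
import numpy as np

def relative_rotation(img1, img2, K):
    """Estimate the first rotation matrix between two adjacent frames
    from matched feature points and the essential matrix."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R  # rotation of the camera between the two frames
```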
Through the above process, the time series and the relative rotation between every two adjacent frames of images in the calibration video can be obtained, which may be denoted {time, R^{v}_{c_i}}; R^{v}_{c_i} in effect represents the first rotation matrix corresponding to each of the different time periods. Here, time denotes the time series composed of the time periods corresponding to every two adjacent frames of images in the calibration video, and R^{v}_{c_i} denotes the relative rotation between every two adjacent frames of images, i.e. the first rotation matrix. In R^{v}_{c_i}, v denotes the vision system coordinate system, c denotes a camera frame, and i denotes the frame index; if i starts from frame 0, the maximum value of i is the total number of frames in the calibration video minus 2.
As can be seen from fig. 3, although the time point corresponding to each frame of image in the calibration video is adjusted, the first rotation matrix between every two adjacent frames of images is determined according to the feature points in the images, so that the first rotation matrix is not affected by the time point corresponding to the images. The second rotation matrix corresponding to each two adjacent frames of images is affected by the corresponding time point of the images.
With reference to fig. 3, suppose the sampling time point of the first frame image in the upper half of fig. 3 is the 1st second, the sampling time point of the second frame image is the 1.5th second, and the vision system lags the inertial measurement unit by 0.25 second. Then the measured values of the inertial measurement unit between the 1st second and the 1.5th second should be used when acquiring the second rotation matrix corresponding to the first and second frame images. After the time point corresponding to each frame of image in the calibration video is adjusted, the measured values of the inertial measurement unit between the 1.25th second and the 1.75th second are used instead when acquiring the second rotation matrix corresponding to the first and second frame images.
When the second rotation matrix corresponding to every two adjacent frames of images in the adjusted calibration video is obtained, the measured values of the inertial measurement unit within the time period corresponding to the two adjacent frames of images can be pre-integrated to obtain the second rotation matrix R^{i}_{b_i} corresponding to the two adjacent frames of images, where b denotes the inertial measurement unit coordinate system, the lower-right i denotes the frame index described above, and the upper-right i indicates that the relative rotation is obtained from the inertial measurement unit.
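A minimal sketch of such a pre-integration, assuming the gyroscope output is available as timestamped angular-velocity samples in the body frame and that a simple first-order integration is sufficient for illustration:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def preintegrate_gyro(timestamps, gyro, t_start, t_end):
    """Integrate angular velocities (rad/s, body frame) between t_start
    and t_end to obtain the second rotation matrix for that time period."""
    R = np.eye(3)
    for k in range(len(timestamps) - 1):
        t0, t1 = timestamps[k], timestamps[k + 1]
        if t1 <= t_start or t0 >= t_end:
            continue
        # clip the sample interval to [t_start, t_end]
        dt = min(t1, t_end) - max(t0, t_start)
        # rotation increment from the angular velocity over dt
        R = R @ Rotation.from_rotvec(gyro[k] * dt).as_matrix()
    return R
```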
After the time point corresponding to each frame of image in the calibration video is adjusted, the time series of the calibration video and the first rotation matrix between every two adjacent frames of images can be represented by {time+dt, R^{v}_{c_i}}. Similarly, the second rotation matrix between every two adjacent frames of images in the time series of the calibration video can be represented by {time+dt, R^{i}_{b_i}}, where time+dt denotes the time series formed after updating the time delay between the vision system and the inertial measurement unit.
For the same time period, such as the time period between the first frame image and the second frame image, the corresponding first rotation matrix R^{v}_{c_0} and second rotation matrix R^{i}_{b_0} can be obtained. The first rotation matrix describes, from the perspective of the vision system, what rotation the shooting system has undergone between the first frame image and the second frame image, whereas the second rotation matrix describes, from the perspective of the inertial measurement unit, what rotation the shooting system has undergone between the first frame image and the second frame image. The difference between the first rotation matrix and the second rotation matrix indicates the rotation error produced in the time period between the first frame image and the second frame image, where the rotation error can be represented by the degree of difference between the two matrices.
Through the above process, the difference between the first rotation matrix and the second rotation matrix corresponding to every two adjacent frames of images can be calculated, and the differences corresponding to all pairs of adjacent frames are accumulated to obtain the rotation error corresponding to the time delay dt. The time delay dt here is the first updated value obtained at each update.
According to the method provided by the embodiment of the invention, the time point corresponding to each frame of image in the calibration video is adjusted according to the first updated value obtained at each update; the first rotation matrix between every two adjacent frames of images in the calibration video and the second rotation matrix corresponding to every two adjacent frames in the adjusted calibration video are acquired; and the rotation error corresponding to the first updated value is determined from the first and second rotation matrices. In this way, different first updated values of the time delay can be screened to find the one meeting the first preset condition and use it as the reference value of the time delay, and since the first preset condition measures whether the rotation error determined from a given first updated value is small enough, the determined reference value is more accurate.
In combination with the foregoing embodiments, in one embodiment, the manner of determining the rotation error corresponding to the first updated value according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images is not specifically limited by the embodiment of the present invention, and includes but is not limited to: calculating the axis-angle difference corresponding to every two adjacent frames of images according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images; and calculating the rotation error corresponding to the first updated value according to the axis-angle difference corresponding to every two adjacent frames of images.
Wherein, the above calculation process can refer to the following formula (1):

Err(dt) = \frac{1}{n} \sum_{i=0}^{n-1} \left\| \theta\!\left(R^{v}_{c_i}\right) - \theta\!\left(R^{i}_{b_i}\right) \right\|    (1)

In the above formula (1), \theta(R^{v}_{c_i}) denotes the axis angle determined from the first rotation matrix between the i-th frame image and the (i+1)-th frame image, and \theta(R^{i}_{b_i}) denotes the axis angle determined from the second rotation matrix corresponding to the same two frames of images. The difference between the two axis angles is the axis-angle difference corresponding to the two adjacent frames of images, and n+1 denotes the total number of frames in the calibration video.
According to the method provided by the embodiment of the invention, the axis-angle difference corresponding to every two adjacent frames of images is calculated from the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images, and the rotation error corresponding to the first updated value is then calculated from these axis-angle differences. In this way, the different first updated values of the time delay can be screened based on their rotation errors, and the first updated value meeting the first preset condition can be determined and used as the reference value of the time delay. Since the first preset condition measures whether the rotation error determined from a given first updated value is small enough, and the first updated value is repeatedly updated within the first time-delay interval according to the preset step length until a value meeting the first preset condition is found, calibration errors caused by an excessively large timestamp interval between the vision system and the inertial measurement unit can be avoided, and the determined reference value is more accurate.
In combination with the foregoing embodiments, in one embodiment, the manner of calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay is not specifically limited by the embodiment of the present invention, and includes but is not limited to: determining a second time-delay interval according to the reference value of the time delay, and determining a second initial value of the time delay in the second time-delay interval; updating the second initial value of the time delay to obtain a second updated value of the time delay, acquiring a reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the second updated value of the time delay and the reference value of the rotation external parameter, repeating the processes of updating the time delay and determining the rotation external parameter and the rotation error corresponding to the updated time delay until a rotation error meeting a second preset condition is determined, and taking the second updated value of the time delay and the reference value of the rotation external parameter corresponding to the rotation error meeting the second preset condition as the calibration result of the shooting system.
The above process of determining the reference value of the time delay is in fact a coarse calibration of the time delay. Therefore, in the content provided by the embodiment of the present invention, the range of the second time-delay interval is much smaller than the range of the first time-delay interval in the above process. Taking rawdt as the reference value of the time delay, the second time-delay interval may be represented by [rawdt - 2*m*res, rawdt + 2*m*res], where res denotes the calibration resolution and m is the integer multiple.
Similar to the first preset condition, the second preset condition may be that the rotation error falls within a second preset range, or that the rotation error gradually converges to a certain value; the embodiment of the present invention is not particularly limited in this respect. It should be noted that, unlike the above coarse calibration process for obtaining the reference value of the time delay, the content provided by the embodiment of the present invention calibrates the time delay and the rotation external parameter simultaneously, which amounts to a further fine calibration. Therefore, the second preset condition may be set more strictly than the first preset condition; the stricter standard may be reflected in the second preset condition ensuring that the determined rotation error is smaller than the rotation error determined by the first preset condition.
According to the method provided by the embodiment of the invention, the second time-delay interval is determined according to the reference value of the time delay, and the second initial value of the time delay is determined in the second time-delay interval. The second initial value of the time delay is updated to obtain a second updated value of the time delay; the reference value of the rotation external parameter between the vision system and the inertial measurement unit is acquired according to the second updated value of the time delay; the rotation error between the vision system and the inertial measurement unit is determined according to the second updated value of the time delay and the reference value of the rotation external parameter; the processes of updating the time delay and determining the rotation external parameter and rotation error corresponding to the updated time delay are repeated until a rotation error meeting the second preset condition is determined; and the second updated value of the time delay and the reference value of the rotation external parameter corresponding to that rotation error are taken as the calibration result of the shooting system. Because the time delay is first coarsely calibrated and the time delay and the rotation external parameter are then finely calibrated, the optimization range of the time delay is reduced, the amount of computation can be reduced, and the calibration efficiency is improved.
In addition, because the corresponding rotation external parameter can be determined based on each second updated value of the time delay, and the corresponding rotation error is determined based on each second updated value and its rotation external parameter, the time delay and the rotation external parameter are screened at the same time, and the second updated value and rotation external parameter meeting the second preset condition are determined as the calibration result. Since the second preset condition measures whether the rotation error determined from a given second updated value of the time delay is small enough, the determined calibration result is more accurate.
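A sketch of the fine-calibration loop under these assumptions is given below; solve_extrinsic and rotation_error_with_extrinsic stand for the hand-eye solution and the error evaluation described in the following embodiments, and their names are illustrative only.

```python
import numpy as np

def fine_calibrate(raw_dt, res, m, solve_extrinsic, rotation_error_with_extrinsic):
    """Search the second delay interval [raw_dt - 2*m*res, raw_dt + 2*m*res];
    for each candidate delay solve the hand-eye extrinsic and keep the pair
    with the smallest rotation error.

    solve_extrinsic: callable dt -> 3x3 rotation extrinsic R_b_c
    rotation_error_with_extrinsic: callable (dt, R_b_c) -> scalar error
    """
    best = (None, None, np.inf)
    for dt in np.arange(raw_dt - 2 * m * res, raw_dt + 2 * m * res + res, res):
        R_b_c = solve_extrinsic(dt)
        err = rotation_error_with_extrinsic(dt, R_b_c)
        if err < best[2]:
            best = (dt, R_b_c, err)
    return best  # (calibrated delay, calibrated rotation extrinsic, error)
```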
In combination with the foregoing embodiments, in one embodiment, the manner of obtaining the reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay is not specifically limited by the embodiment of the present invention, and includes but is not limited to: determining a hand-eye calibration equation set according to the second updated value of the time delay, wherein the rotation external parameter in the hand-eye calibration equation set is the unknown quantity; and solving the hand-eye calibration equation set to obtain the reference value of the rotation external parameter.
From the rotation transformation relation, a hand-eye calibration formula AX = XB can be constructed. In combination with the content in the above example, the formula can specifically be the following formula (2):

R^{v}_{c_i} \, R_{b\_c} = R_{b\_c} \, R^{i}_{b_i}    (2)

In the above formula, the definitions of R^{v}_{c_i} and R^{i}_{b_i} can refer to the content of the above embodiments, and R_{b\_c} is the rotation external parameter to be solved, which is also a matrix. It should be noted that R^{v}_{c_i}, R^{i}_{b_i} and R_{b\_c} are all 3*3 matrices. The transformation process of the hand-eye calibration can refer to fig. 4.
As can be seen from the foregoing embodiments, on the premise that the second updated value of the time delay is known, the first rotation matrix between every two adjacent frames of images in the calibration video and the second rotation matrix corresponding to every two adjacent frames of images can be obtained. Therefore, every two adjacent frames of images in the calibration video can be used to construct one hand-eye calibration equation, so that a hand-eye calibration equation set is obtained. The number of equations in the hand-eye calibration equation set is the total number of frames in the calibration video minus 1.
The process of solving this equation set can be converted into the following least-squares problem; specifically, the following formula (3) can be referred to:

\min_{R_{b\_c}} \sum_{i=0}^{n-1} \rho\!\left( \left\| \theta\!\left( R^{v}_{c_i} R_{b\_c} \left( R_{b\_c} R^{i}_{b_i} \right)^{T} \right) \right\| \right)    (3)

In the above formula, the left-hand matrix of formula (2) is in fact multiplied by the transpose of the right-hand matrix. Ideally, if the axis-angle difference were 0, the product of the two would be the identity matrix. Because of the rotation error, the identity matrix is generally not obtained, and the product of the two can therefore represent this rotation error. In formula (3), \rho denotes a robust kernel function, i.e. a weight, which can be set according to the actual calculation.
In other words, finding the R_{b\_c} that minimizes the accumulated error yields the rotation external parameter to be solved. Based on this principle, when the hand-eye calibration equation set is solved to obtain the reference value of the rotation external parameter, it can be solved by iterative optimization, or by converting it into an axis-angle or quaternion form and solving a linear equation set; the embodiment of the present invention is not particularly limited in this respect. Considering that the quaternion solving process is relatively fast, for ease of understanding the embodiment of the present invention linearly solves the hand-eye calibration equation set in quaternion form, and the specific process is as follows:
First, the hand-eye calibration equation set is converted into the form of quaternion left-multiplication and right-multiplication matrices; specifically, the following formula (4) can be referred to:

\left( \left[ q^{v}_{c_i} \right]_{L} - \left[ q^{i}_{b_i} \right]_{R} \right) q_{b\_c} = 0    (4)

In the above formula (4), [\,\cdot\,]_{L} denotes converting a quaternion into its 4*4 left-multiplication matrix, and [\,\cdot\,]_{R} denotes converting a quaternion into its 4*4 right-multiplication matrix.
Accumulating the 1st to n-th linear equations gives the following homogeneous equation set; specifically, the following formula (5) can be referred to:

Q_{n} \, q_{b\_c} = 0, \quad Q_{n} = \begin{bmatrix} [q^{v}_{c_0}]_{L} - [q^{i}_{b_0}]_{R} \\ \vdots \\ [q^{v}_{c_{n-1}}]_{L} - [q^{i}_{b_{n-1}}]_{R} \end{bmatrix}    (5)

In the above equation (5), the coefficient matrix Q_n of the homogeneous equation set has dimension 4n*4, and since the number of image frames used for calibration is generally large, there may be several hundred or even a thousand frames. Singular value decomposition of Q_n therefore takes a lot of time. According to the derivation, however, the equation set Q_n q_{b\_c} = 0 has the same solution as Q_n^{T} Q_n q_{b\_c} = 0. Thus, equation (5) can be converted into solving Q_n^{T} Q_n q_{b\_c} = 0, where Q_n^{T} Q_n has dimension 4*4 and its singular value decomposition is fast.
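The sketch below shows one way to carry out this quaternion-based linear solution; the left/right multiplication matrices follow the standard Hamilton convention, accumulating Q_n^T Q_n before the 4*4 singular value decomposition mirrors the reduction described above, and quaternion sign (hemisphere) handling is omitted for brevity.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def quat_left(q):
    """4x4 left-multiplication matrix of quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def quat_right(q):
    """4x4 right-multiplication matrix of quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def solve_hand_eye(R_vis_list, R_imu_list):
    """Linearly solve the rotation extrinsic from pairs of first/second rotation matrices."""
    M = np.zeros((4, 4))
    for R_v, R_b in zip(R_vis_list, R_imu_list):
        q_v = Rotation.from_matrix(R_v).as_quat()  # scipy order (x, y, z, w)
        q_b = Rotation.from_matrix(R_b).as_quat()
        q_v = np.r_[q_v[3], q_v[:3]]               # reorder to (w, x, y, z)
        q_b = np.r_[q_b[3], q_b[:3]]
        A = quat_left(q_v) - quat_right(q_b)
        M += A.T @ A                               # accumulate Q_n^T Q_n
    _, _, Vt = np.linalg.svd(M)
    q = Vt[-1]                                     # smallest singular vector, (w, x, y, z)
    return Rotation.from_quat(np.r_[q[1:], q[0]]).as_matrix()
```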
In combination with the above embodiments, after the reference value of the rotation external parameter between the vision system and the inertial measurement unit is obtained according to the second updated value of the time delay, the rotation error between the vision system and the inertial measurement unit may be determined according to the second updated value of the time delay and the reference value of the rotation external parameter. In connection with the above embodiment, the error formula for determining the rotation error may refer to the following formula (6):
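The error formula image is not reproduced either; a hedged reconstruction consistent with the average axis-angle error described below for fig. 5 (same assumed notation, n adjacent-frame pairs, Log(·) as in formula (3)) is:

$$
\mathrm{Err}(dt,\,R_{b\_c})=\frac{1}{n}\sum_{k=1}^{n}\left\|\operatorname{Log}\!\left(\left(R_{b}^{k}R_{b\_c}\right)\left(R_{b\_c}R_{c}^{k}\right)^{\top}\right)\right\| \qquad (6)
$$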
Of course, in actual implementation, the error formula for determining the rotation error can also take various forms in combination with the above hand-eye calibration formula; for example, refer to the following formula (7):
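As one hypothetical example of such a variation (the patent's own formula image is not reproduced), the error can be written directly in the quaternion form used for the linear solve:

$$
\mathrm{Err}(dt,\,R_{b\_c})=\frac{2}{n}\sum_{k=1}^{n}\arccos\!\Big(\big|\big\langle\, q_{b}^{k}\otimes q_{b\_c},\; q_{b\_c}\otimes q_{c}^{k}\,\big\rangle\big|\Big) \qquad (7)
$$

Here ⟨·,·⟩ is the quaternion dot product; for unit quaternions this yields the same rotation-angle error as formula (6), only expressed with the quantities already used in the quaternion solve.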
According to the method provided by the embodiment of the invention, the hand-eye calibration equation set is determined according to the second updated value of the time delay, with the rotation external parameter in the hand-eye calibration equation set as the unknown quantity, and the equation set is solved to obtain the reference value of the rotation external parameter. Because a linear solving method is adopted when obtaining the calibration results of the time delay and the rotation external parameter, the method avoids falling into a local minimum during optimization of the time delay and the rotation external parameter, which would otherwise make the calibration result insufficiently accurate or even cause subsequent calibration to fail.
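To make the overall flow concrete, the following sketch (hypothetical helper names, not the patent's code) grid-searches candidate time delays in the second time delay interval, linearly solves the rotation external parameter for each candidate, and keeps the candidate with the smallest average rotation error. solve_extrinsic is assumed to return a 3×3 rotation matrix, for example by converting the quaternion from the sketch above.

```python
import numpy as np

def rotation_error(R_cam_pairs, R_imu_pairs, R_bc):
    """Mean axis-angle error in the spirit of formula (6)."""
    errs = []
    for R_c, R_b in zip(R_cam_pairs, R_imu_pairs):
        R_err = (R_b @ R_bc) @ (R_bc @ R_c).T   # identity when perfectly consistent
        cos_angle = np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0)
        errs.append(np.arccos(cos_angle))
    return float(np.mean(errs))

def calibrate(candidate_delays, cam_rotations_at, imu_rotations_at, solve_extrinsic):
    """For each candidate delay dt, shift the image timestamps, solve R_bc linearly,
    score it, and return the (error, dt, R_bc) tuple with the smallest error."""
    best = None
    for dt in candidate_delays:
        R_cam = cam_rotations_at(dt)          # per-frame-pair camera rotations after shifting by dt
        R_imu = imu_rotations_at(dt)          # matching IMU rotations over the shifted intervals
        R_bc = solve_extrinsic(R_cam, R_imu)  # linear hand-eye solve
        err = rotation_error(R_cam, R_imu, R_bc)
        if best is None or err < best[0]:
            best = (err, dt, R_bc)
    return best
```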
In combination with the above examples, after solving the errors corresponding to the different values of the time delay, the curve of time delay versus error may refer to fig. 5: the values dt of all the time delays used in calibration are taken as the abscissa, and the average error Err corresponding to each dt is taken as the ordinate. Fig. 5 shows the result of an actual calibration; the abscissa is the time delay dt of the vision system relative to the inertial measurement unit, in seconds, and the ordinate is the average error calculated with the error formula under each dt, in radians. The lower and the sharper the lowest point of the curve, the better the effect and the higher the precision of the time delay calibration. The calibration was tested with data from different calibration-precision test tools. The euroc data are mainly captured with global-shutter exposure; for the time delay and rotation external parameters calibrated with the euroc data, the corresponding calibration results can be referred to in the following table 1:
TABLE 1
Calibration sequence number                  1       2       3       4       5       6       7
Time delay/ms                               -0.5    -0.5    -0.5    -0.5    -0.5    -0.5    -0.5
Rotation external parameter error/degree     0.309   0.272   0.106   0.360   0.296   0.192   0.114
The evo data are mainly captured with rolling-shutter exposure; for the time delay and rotation external parameters calibrated with the evo data, the corresponding calibration results can be referred to in the following table 2:
TABLE 2
Calibration sequence number                  1       2       3       4       5       6       7
Time delay/ms                               -1.5    -1.5    -0.5    -1.5    -2.0    -1.5    -1.5
Rotation external parameter error/degree     0.104   0.388   0.373   0.328   0.418   0.392   0.314
From tables 1 and 2 above, it can be seen that, when tested on data with sufficient rotation, the time delay between the vision system and the inertial measurement unit calibrated with the euroc data is relatively stable, essentially -0.5 ms. The evo data mostly give calibration results between -0.0015 s and 0 s, with occasional values of -0.002 s and -0.001 s, and the calibration error is basically within 2 ms. The calibration results on the evo data are not as stable as those on the euroc data, which may be related to the fact that the evo data are captured with a rolling-shutter camera. In addition, the error of the rotation external parameter is an angle difference: the error corresponding to the rotation external parameter calibrated with the euroc data is about 0.3 degrees, and the error between the rotation external parameter calibrated with the evo data and the result of the anti-shake group calibration is about 0.5 degrees. The evo data are captured with a rolling-shutter camera, and the euroc data with a global-shutter camera.
It should be understood that, although the steps in the flowcharts of figs. 1 and 2 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited to that order, and the steps may be executed in other orders. Moreover, at least a portion of the steps in figs. 1 and 2 may include a plurality of sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps.
In combination with the foregoing embodiments, in one embodiment, as shown in fig. 6, there is provided a calibration device for a photographing system, including: a determining module 601 and a calibrating module 602, wherein:
a determining module 601, configured to determine a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit;
the calibration module 602 is configured to calibrate the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, where the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by the shooting system shooting under rotation, and the rotation error is used to represent the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
In one embodiment, the determining module 601 is configured to update the first initial value of the time delay to obtain a first updated value of the time delay, determine a rotation error between the vision system and the inertial measurement unit according to the first updated value of the time delay, repeat the updating and determining processes to determine the rotation error satisfying the first preset condition, and use the first updated value corresponding to the rotation error satisfying the first preset condition as the reference value of the time delay.
In one embodiment, the determining module 601 includes:
the updating unit is used for updating the first initial value according to a preset step length until the updated first updated value is not located in the first time delay interval, wherein the first initial value is located in the first time delay interval;
the first determining unit is used for determining a rotation error corresponding to a first updated value obtained by updating each time;
and the second determining unit is used for determining the minimum value of all the determined rotation errors and taking the minimum value as the rotation error meeting the first preset condition.
In one embodiment, the first determining unit includes:
the adjusting subunit is used for adjusting the corresponding time point of each frame of image in the calibration video according to the first updated value obtained by updating each time;
the acquisition subunit is used for acquiring a first rotation matrix between two adjacent frames of images in the calibration video, and acquiring a second rotation matrix corresponding to the two adjacent frames of images in the calibration video after the time point is adjusted, wherein the second rotation matrix is determined based on the measured value of the inertial measurement unit in a time period corresponding to the two adjacent frames of images;
and the determining subunit is used for determining the rotation error corresponding to the first update value according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images.
In one embodiment, the determining subunit is configured to calculate, according to a first rotation matrix between each two adjacent frames of images and a second rotation matrix corresponding to each two adjacent frames of images, a shaft angle difference value corresponding to each two adjacent frames of images; and calculating a rotation error corresponding to the first updated value according to the shaft angle difference value corresponding to each two adjacent frames of images.
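As one plausible reading of the "shaft angle difference" used in this first, extrinsic-free stage (an assumption, since the formula is not reproduced here), the sketch below compares only the rotation angles of the first and second rotation matrices, which are invariant to the coordinate frame; the function names are illustrative.

```python
import numpy as np

def rotation_angle(R):
    """Rotation angle in radians of a 3x3 rotation matrix."""
    return float(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)))

def delay_error(R_cam_pairs, R_imu_pairs):
    """Average absolute difference between the per-pair rotation angles of the
    vision system and the inertial measurement unit; no extrinsic rotation needed."""
    diffs = [abs(rotation_angle(R_c) - rotation_angle(R_b))
             for R_c, R_b in zip(R_cam_pairs, R_imu_pairs)]
    return float(np.mean(diffs))
```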
In one embodiment, the calibration module 602 includes:
the third determining unit is used for determining a second time delay interval according to the reference value of the time delay and determining a second initial value of the time delay in the second time delay interval;
the acquisition unit is used for updating the second initial value of the time delay to obtain a second updated value of the time delay, acquiring a reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the second updated value of the time delay and the reference value of the rotation external parameter, repeating the processes of updating the time delay and determining the rotation external parameter and the rotation error corresponding to the updated time delay to determine the rotation error meeting a second preset condition, and taking the second updated value of the time delay and the reference value of the rotation external parameter corresponding to the rotation error meeting the second preset condition as a calibration result of the shooting system.
In one embodiment, the obtaining unit is configured to determine a hand-eye calibration equation set according to the second updated value of the time delay, where the rotation external parameter is an unknown quantity; and solving the hand eye calibration equation set to obtain the reference value of the rotation external parameter.
For specific limitations of the shooting system calibration device, reference may be made to the above limitations of the shooting system calibration method, which will not be repeated here. The modules in the shooting system calibration device can be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware in, or independent of, a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is used for storing time delay and rotation external parameters. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a photographing system calibration method.
It will be appreciated by those skilled in the art that the structure shown in fig. 7 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
determining a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit;
and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by the shooting system shooting under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
In one embodiment, the processor when executing the computer program further performs the steps of:
updating the first initial value of the time delay to obtain a first updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the first updated value of the time delay, repeating the updating and determining processes to determine the rotation error meeting a first preset condition, and taking the first updated value corresponding to the rotation error meeting the first preset condition as a reference value of the time delay.
In one embodiment, the processor when executing the computer program further performs the steps of: updating the first initial value according to a preset step length until the updated first updated value is not located in the first time delay interval, wherein the first initial value is located in the first time delay interval; determining a rotation error corresponding to a first updated value obtained by each update; and determining the minimum value of all the determined rotation errors, and taking the minimum value as the rotation error meeting the first preset condition.
In one embodiment, the processor when executing the computer program further performs the steps of: for a first updated value obtained by each update, adjusting a time point corresponding to each frame of image in the calibration video according to the first updated value; acquiring a first rotation matrix between two adjacent frames of images in the calibration video, and acquiring a second rotation matrix corresponding to the two adjacent frames of images in the calibration video after the time point is adjusted, wherein the second rotation matrix is determined based on the measured value of the inertial measurement unit in a time period corresponding to the two adjacent frames of images; and determining a rotation error corresponding to the first updated value according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images.
In one embodiment, the processor when executing the computer program further performs the steps of: calculating the shaft angle difference value corresponding to each two adjacent frames of images according to the first rotation matrix between each two adjacent frames of images and the second rotation matrix corresponding to each two adjacent frames of images; and calculating a rotation error corresponding to the first updated value according to the shaft angle difference value corresponding to each two adjacent frames of images.
In one embodiment, the processor when executing the computer program further performs the steps of: determining a second time delay interval according to the reference value of the time delay, and determining a second initial value of the time delay in the second time delay interval; updating the second initial value of the time delay to obtain a second updated value of the time delay, acquiring a reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the second updated value of the time delay and the reference value of the rotation external parameter, repeating the processes of updating the time delay and determining the rotation external parameter and the rotation error corresponding to the updated time delay to determine the rotation error meeting a second preset condition, and taking the second updated value of the time delay and the reference value of the rotation external parameter corresponding to the rotation error meeting the second preset condition as a calibration result of the shooting system.
In one embodiment, the processor when executing the computer program further performs the steps of: according to the second updated value of the time delay, determining a hand-eye calibration equation set, wherein a rotation external parameter in the hand-eye calibration equation set is an unknown quantity; and solving the hand eye calibration equation set to obtain the reference value of the rotation external parameter.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
determining a reference value of the time delay according to a rotation error between the vision system and the inertial measurement unit;
and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by the shooting system shooting under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
In one embodiment, the computer program when executed by the processor further performs the steps of: updating the first initial value of the time delay to obtain a first updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the first updated value of the time delay, repeating the updating and determining processes to determine the rotation error meeting a first preset condition, and taking the first updated value corresponding to the rotation error meeting the first preset condition as a reference value of the time delay.
In one embodiment, the computer program when executed by the processor further performs the steps of: updating the first initial value according to a preset step length until the updated first updated value is not located in the first time delay interval, wherein the first initial value is located in the first time delay interval; determining a rotation error corresponding to a first updated value obtained by each update; and determining the minimum value of all the determined rotation errors, and taking the minimum value as the rotation error meeting the first preset condition.
In one embodiment, the computer program when executed by the processor further performs the steps of: for a first updated value obtained by each update, adjusting a time point corresponding to each frame of image in the calibration video according to the first updated value; acquiring a first rotation matrix between two adjacent frames of images in the calibration video, and acquiring a second rotation matrix corresponding to the two adjacent frames of images in the calibration video after the time point is adjusted, wherein the second rotation matrix is determined based on the measured value of the inertial measurement unit in a time period corresponding to the two adjacent frames of images; and determining a rotation error corresponding to the first updated value according to the first rotation matrix between every two adjacent frames of images and the second rotation matrix corresponding to every two adjacent frames of images.
In one embodiment, the computer program when executed by the processor further performs the steps of: calculating the shaft angle difference value corresponding to each two adjacent frames of images according to the first rotation matrix between each two adjacent frames of images and the second rotation matrix corresponding to each two adjacent frames of images; and calculating a rotation error corresponding to the first updated value according to the shaft angle difference value corresponding to each two adjacent frames of images.
In one embodiment, the computer program when executed by the processor further performs the steps of: determining a second time delay interval according to the reference value of the time delay, and determining a second initial value of the time delay in the second time delay interval; updating the second initial value of the time delay to obtain a second updated value of the time delay, acquiring a reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the second updated value of the time delay and the reference value of the rotation external parameter, repeating the processes of updating the time delay and determining the rotation external parameter and the rotation error corresponding to the updated time delay to determine the rotation error meeting a second preset condition, and taking the second updated value of the time delay and the reference value of the rotation external parameter corresponding to the rotation error meeting the second preset condition as a calibration result of the shooting system.
In one embodiment, the computer program when executed by the processor further performs the steps of: according to the second updated value of the time delay, determining a hand-eye calibration equation set, wherein a rotation external parameter in the hand-eye calibration equation set is an unknown quantity; and solving the hand eye calibration equation set to obtain the reference value of the rotation external parameter.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-volatile computer readable storage medium which, when executed, may include the flows of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, or the like. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM), and the like.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered to be within the scope of this description.
The above examples merely represent several embodiments of the present application, which are described specifically and in detail, but they are not to be construed as limiting the scope of the invention. It should be noted that those of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present application, and these all fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (7)

1. A method for calibrating a photographing system, the method comprising:
updating a first initial value according to a preset step length until a first updated value obtained by updating is not located in a first time delay interval, wherein the first initial value is located in the first time delay interval;
for a first updated value obtained by each update, adjusting a time point corresponding to each frame of image in the calibration video according to the first updated value;
Acquiring a first rotation matrix between two adjacent frames of images in the calibration video, and acquiring a second rotation matrix corresponding to the two adjacent frames of images in the calibration video after the time point is adjusted, wherein the second rotation matrix is determined based on the measured value of the inertial measurement unit in a time period corresponding to the two adjacent frames of images;
determining a rotation error corresponding to the first update value according to a first rotation matrix between every two adjacent frames of images and a second rotation matrix corresponding to every two adjacent frames of images;
determining the minimum value of all the determined rotation errors, taking the minimum value as the rotation error meeting a first preset condition, and taking a first updated value corresponding to the rotation error meeting the first preset condition as a reference value of time delay;
and calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, wherein the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by the shooting system shooting under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
2. The method according to claim 1, wherein determining the rotation error corresponding to the first update value according to the first rotation matrix between each two adjacent frames of images and the second rotation matrix corresponding to each two adjacent frames of images comprises:
calculating the shaft angle difference value corresponding to each two adjacent frames of images according to the first rotation matrix between each two adjacent frames of images and the second rotation matrix corresponding to each two adjacent frames of images;
and calculating a rotation error corresponding to the first updated value according to the shaft angle difference value corresponding to each two adjacent frames of images.
3. The method according to claim 1, wherein calibrating the delay and rotation parameters between the vision system and the inertial measurement unit based on the reference value of the delay comprises:
determining a second time delay interval according to the reference value of the time delay, and determining a second initial value of the time delay in the second time delay interval;
updating the second initial value of the time delay to obtain a second updated value of the time delay, acquiring a reference value of the rotation external parameter between the vision system and the inertial measurement unit according to the second updated value of the time delay, determining a rotation error between the vision system and the inertial measurement unit according to the second updated value of the time delay and the reference value of the rotation external parameter, repeating the processes of updating the time delay and determining the rotation external parameter and the rotation error corresponding to the updated time delay to determine the rotation error meeting a second preset condition, and taking the second updated value of the time delay and the reference value of the rotation external parameter corresponding to the rotation error meeting the second preset condition as a calibration result of the shooting system.
4. A method according to claim 3, wherein the obtaining a reference value for a rotation parameter between the vision system and the inertial measurement unit from the second updated value of the time delay comprises:
determining a hand-eye calibration equation set according to the second updated value of the time delay, wherein a rotation external parameter in the hand-eye calibration equation set is an unknown quantity;
and solving the hand-eye calibration equation set to obtain a reference value of the rotation external parameter.
5. A camera system calibration apparatus, the apparatus comprising:
the updating unit is used for updating the first initial value according to a preset step length until the updated first updated value is not located in a first time delay interval, wherein the first initial value is located in the first time delay interval;
the adjustment subunit is used for adjusting the corresponding time point of each frame of image in the calibration video according to the first updated value obtained by updating each time;
the acquisition subunit is used for acquiring a first rotation matrix between two adjacent frames of images in the calibration video and acquiring a second rotation matrix corresponding to the two adjacent frames of images in the calibration video after the time point is adjusted, wherein the second rotation matrix is determined based on the measured value of the inertial measurement unit in a time period corresponding to the two adjacent frames of images;
A determining subunit, configured to determine a rotation error corresponding to the first update value according to a first rotation matrix between every two adjacent frames of images and a second rotation matrix corresponding to every two adjacent frames of images;
a second determining unit, configured to determine a minimum value among all the determined rotation errors, and take the minimum value as a rotation error that satisfies a first preset condition; the first updated value corresponding to the rotation error meeting the first preset condition is used as a reference value of time delay;
the calibration module is used for calibrating the time delay and the rotation external parameter between the vision system and the inertial measurement unit according to the reference value of the time delay, the vision system and the inertial measurement unit are coupled in the same shooting system, the rotation error is determined based on a calibration video, the calibration video is obtained by the shooting system shooting under rotation, and the rotation error is used for representing the degree of inconsistency between the rotation state determined based on the vision system and the rotation state determined based on the inertial measurement unit.
6. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 4 when the computer program is executed.
7. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 4.
CN202110664964.6A 2021-06-16 2021-06-16 Shooting system calibration method, shooting system calibration device, computer equipment and storage medium Active CN113587924B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110664964.6A CN113587924B (en) 2021-06-16 2021-06-16 Shooting system calibration method, shooting system calibration device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113587924A CN113587924A (en) 2021-11-02
CN113587924B true CN113587924B (en) 2024-03-29

Family

ID=78243696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110664964.6A Active CN113587924B (en) 2021-06-16 2021-06-16 Shooting system calibration method, shooting system calibration device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113587924B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107655473A (en) * 2017-09-20 2018-02-02 南京航空航天大学 Spacecraft based on SLAM technologies is with respect to autonomous navigation system
CN109073407A (en) * 2017-10-26 2018-12-21 深圳市大疆创新科技有限公司 Drift scaling method, equipment and the unmanned vehicle of Inertial Measurement Unit
CN110880189A (en) * 2018-09-06 2020-03-13 舜宇光学(浙江)研究院有限公司 Combined calibration method and combined calibration device thereof and electronic equipment
CN111242860A (en) * 2020-01-07 2020-06-05 影石创新科技股份有限公司 Super night scene image generation method and device, electronic equipment and storage medium
CN111553956A (en) * 2020-05-20 2020-08-18 北京百度网讯科技有限公司 Calibration method and device of shooting device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150204901A1 (en) * 2014-01-17 2015-07-23 Caterpillar Inc. System and method for determining ground speed of machine

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Innovation and reflection on key intelligentization technologies of the China-Russia East-Route Natural Gas Pipeline; Wang Zhensheng, et al.; Oil & Gas Storage and Transportation; Vol. 39, No. 7; pp. 730-739 *

Also Published As

Publication number Publication date
CN113587924A (en) 2021-11-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant