CN110728716B - Calibration method and device and aircraft - Google Patents
- Publication number: CN110728716B (application CN201910834056.XA)
- Authority: CN (China)
- Prior art keywords: imu, pose, camera, parameters, coordinate system
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/85: Stereo camera calibration (G: Physics; G06: Computing; G06T: Image data processing or generation, in general; G06T7/00: Image analysis; G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration)
- G06T7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters (G06T7/20: Analysis of motion)
- G06T2207/30244: Camera pose (G06T2207/00: Indexing scheme for image analysis or image enhancement; G06T2207/30: Subject of image; context of image processing)
Abstract
The application discloses a calibration method, a calibration device and an aircraft. The method is used for an aircraft provided with a camera and an IMU and comprises the following steps: acquiring first pose parameters of the camera, the first pose parameters comprising position and attitude parameters of the camera in a world coordinate system; acquiring second pose parameters of the IMU, the second pose parameters comprising position and attitude parameters of the IMU in the world coordinate system; constructing a system state model according to the second pose parameters; constructing a system measurement model according to the first pose parameters, the second pose parameters and the state model; constructing a nonlinear Kalman filter according to the system state model and the system measurement model; and filtering the first and second pose parameters with the nonlinear Kalman filter and outputting a filtering convergence value, the filtering convergence value comprising calibration estimates of the camera and IMU parameters.
Description
Technical Field
The application relates to the technical field of robot vision, in particular to a calibration method and device and an aircraft.
Background
With the development of unmanned aerial vehicle (UAV) technology, UAVs are widely applied in fields such as disaster-area rescue and geological exploration.
The camera and the IMU (Inertial Measurement Unit) are important components of an aircraft. The camera is prone to failure under rapid motion, illumination changes and similar conditions, whereas the IMU obtains the robot's internal motion information at high frequency and is not affected by the surrounding environment, thereby compensating for the camera's weaknesses. Meanwhile, the camera captures rich environmental information and completes loop detection and loop correction through visual matching, effectively correcting the accumulated drift error of the IMU. The fusion of the camera and the IMU is therefore considered to have great potential for low-cost, high-precision positioning and image measurement.
Taking camera calibration as an example: in image measurement, to determine the correlation between the three-dimensional geometric position of a point on the surface of a spatial object and its corresponding point in the image, a geometric model of camera imaging must be established; the parameters of this geometric model are the camera parameters. The process of solving for these parameters is called camera calibration, and the accuracy of the calibration result directly affects the accuracy of the measurement result.
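The geometric model referred to above can be made concrete with a minimal pinhole-camera sketch; the intrinsic values in K below (focal lengths and principal point) are invented for illustration and are exactly the kind of parameters that camera calibration estimates:

```python
import numpy as np

# Minimal pinhole-camera model (illustrative values; calibration's job is
# to estimate the entries of K for a real camera).
K = np.array([[500.0,   0.0, 320.0],    # fx,  0, cx
              [  0.0, 500.0, 240.0],    #  0, fy, cy
              [  0.0,   0.0,   1.0]])

def project(point_cam):
    """Project a 3-D point given in the camera frame to pixel coordinates."""
    uvw = K @ point_cam          # homogeneous image coordinates
    return uvw[:2] / uvw[2]      # perspective division
```

A point on the optical axis of this hypothetical camera projects to the principal point (cx, cy); errors in the estimated model parameters therefore translate directly into measurement errors, as the paragraph above states.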
Traditional camera-IMU parameter calibration is performed off-line: it either requires the sensor module to remain stationary and then rotate around the IMU, or requires solving a structurally complex optimization problem to calibrate the camera-IMU extrinsic parameters and the IMU bias.
However, in practical use of the camera and the IMU, the IMU bias may drift slowly under the influence of factors such as temperature. Meanwhile, owing to mechanical vibration and manufacturing-process limitations, the camera and the IMU are not strictly rigidly connected while the equipment is running, so the rotation and translation of the extrinsic parameters undergo small changes. As a result, traditional camera and IMU parameter calibration methods are of low accuracy.
Therefore, how to realize on-line calibration of the camera and IMU parameters is a hot research topic for those skilled in the art.
Disclosure of Invention
The application mainly aims to provide a calibration method, a calibration device and an aircraft, and aims to realize accurate online calibration of parameters of a camera and an IMU.
In order to achieve the above object, the present application provides a calibration method applied to an aircraft provided with a camera and an IMU, the method comprising:
acquiring a first pose parameter of the camera, wherein the first pose parameter comprises a position parameter and a pose parameter of the camera in a world coordinate system;
acquiring second pose parameters of the IMU, wherein the second pose parameters comprise position parameters and pose parameters of the IMU in a world coordinate system;
constructing a system state model according to the second pose parameters;
constructing a system measurement model according to the first pose parameter, the second pose parameter and the state model;
constructing a nonlinear Kalman filter according to the system state model and the system measurement model;
and filtering the first pose parameter and the second pose parameter by using the nonlinear Kalman filter, and outputting a filtering convergence value, wherein the filtering convergence value comprises calibration estimation values of the camera and the IMU parameters.
Preferably, the constructing a system state model according to the second pose parameter includes:
defining a system state according to the second pose parameter;
and constructing a system state model according to the system state.
Preferably, the system state is:

$$X=\left[\,{}^{G}p_{I}^{T}\;\; {}^{G}_{I}\bar{q}^{\,T}\;\; {}^{G}v_{I}^{T}\;\; b_{a}^{T}\;\; b_{g}^{T}\;\; {}^{I}r_{C}^{T}\;\; {}^{I}_{C}\bar{q}^{\,T}\,\right]^{T}$$

wherein $T$ denotes the vector transpose; ${}^{G}p_{I}$ is the position of the IMU in the world coordinate system; ${}^{G}_{I}\bar{q}$ is the attitude of the IMU in the world coordinate system; ${}^{G}v_{I}$ is the velocity of the IMU in the world coordinate system; $b_{a}$ is the bias of the IMU acceleration measurement; $b_{g}$ is the bias of the IMU angular-velocity measurement; ${}^{I}r_{C}$ is the translational component of the camera-IMU extrinsic parameters; and ${}^{I}_{C}\bar{q}$ is the rotational component of the camera-IMU extrinsic parameters.
Preferably, the establishing of the system state model according to the system state includes:

acquiring the derivatives of ${}^{G}p_{I}$, ${}^{G}_{I}\bar{q}$, ${}^{G}v_{I}$, $b_{a}$, $b_{g}$, ${}^{I}r_{C}$ and ${}^{I}_{C}\bar{q}$ with respect to time to build the system state model.
Preferably, the constructing of the system measurement model according to the first pose parameter, the second pose parameter and the state model includes:

selecting the coordinates of any spatial point in the camera coordinate system to construct the system measurement model, wherein the system measurement model is:

$${}^{C}p = C\!\left({}^{I}_{C}\bar{q}\right)\left(C\!\left({}^{G}_{I}\bar{q}\right)\left({}^{G}p-{}^{G}p_{I}\right)-{}^{I}r_{C}\right)$$

wherein ${}^{C}p$ is the position of the spatial point in the camera coordinate system; $C({}^{I}_{C}\bar{q})$ is the conversion matrix from the IMU coordinate system to the camera coordinate system; $C({}^{G}_{I}\bar{q})$ is the conversion matrix from the world coordinate system to the IMU coordinate system; ${}^{G}p$ is the position of the spatial point in the world coordinate system; and ${}^{I}r_{C}$ is the translational component of the camera-IMU extrinsic parameters.
The application also provides a calibration device. The device is arranged on an aircraft that is also provided with a camera and an IMU, and the calibration device comprises:
the first pose module is used for acquiring first pose parameters of the camera, wherein the first pose parameters comprise position parameters and pose parameters of the camera in a world coordinate system;
the second pose module is used for acquiring second pose parameters of the IMU, wherein the second pose parameters comprise position parameters and pose parameters of the IMU in a world coordinate system;
the first modeling module is used for constructing a system state model according to the second pose parameters;
the second modeling module is used for constructing a system measurement model according to the first pose parameter, the second pose parameter and the state model;
the third modeling module is used for constructing a nonlinear Kalman filter according to the system state model and the system measurement model;
and the filtering module is used for filtering the first pose parameters and the second pose parameters by using the nonlinear Kalman filter and outputting a filtering convergence value, wherein the filtering convergence value comprises calibration estimates of the camera and IMU parameters.
Preferably, the first modeling module is further configured to:
defining a system state according to the second pose parameter;
and constructing a system state model according to the system state.
Preferably, the system state is:

$$X=\left[\,{}^{G}p_{I}^{T}\;\; {}^{G}_{I}\bar{q}^{\,T}\;\; {}^{G}v_{I}^{T}\;\; b_{a}^{T}\;\; b_{g}^{T}\;\; {}^{I}r_{C}^{T}\;\; {}^{I}_{C}\bar{q}^{\,T}\,\right]^{T}$$

wherein $T$ denotes the vector transpose; ${}^{G}p_{I}$ is the position of the IMU in the world coordinate system; ${}^{G}_{I}\bar{q}$ is the attitude of the IMU in the world coordinate system; ${}^{G}v_{I}$ is the velocity of the IMU in the world coordinate system; $b_{a}$ is the bias of the IMU acceleration measurement; $b_{g}$ is the bias of the IMU angular-velocity measurement; ${}^{I}r_{C}$ is the translational component of the camera-IMU extrinsic parameters; and ${}^{I}_{C}\bar{q}$ is the rotational component of the camera-IMU extrinsic parameters.
Preferably, the first modeling module is further configured to:

acquire the derivatives of ${}^{G}p_{I}$, ${}^{G}_{I}\bar{q}$, ${}^{G}v_{I}$, $b_{a}$, $b_{g}$, ${}^{I}r_{C}$ and ${}^{I}_{C}\bar{q}$ with respect to time to build the system state model.
The application also provides an aircraft comprising a fuselage, an arm connected with the fuselage, a power device arranged on the arm, a camera connected with the fuselage, an IMU connected with the camera, and a vision chip in communication connection with the camera and the IMU, wherein the vision chip further comprises:
a memory and a processor;
the memory is used for storing a calibration program executable by a computer;
the processor is used for calling the calibration program executable by the computer to realize the calibration method.
Compared with the prior art, the calibration method provided by the application acquires first pose parameters of the camera, the first pose parameters comprising position and attitude parameters of the camera in a world coordinate system; acquires second pose parameters of the IMU, the second pose parameters comprising position and attitude parameters of the IMU in the world coordinate system; constructs a system state model according to the second pose parameters; constructs a system measurement model according to the first pose parameters, the second pose parameters and the state model; constructs a nonlinear Kalman filter according to the system state model and the system measurement model; and filters the first and second pose parameters with the nonlinear Kalman filter, outputting a filtering convergence value that comprises calibration estimates of the camera and IMU parameters. Accurate on-line calibration of the camera and IMU parameters is thus realized using the obtained calibration estimates.
Drawings
Fig. 1 is a schematic view of a scenario in which an aircraft is communicatively connected to a terminal device according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating steps of a calibration method according to an embodiment of the present application;
FIG. 3 is a flow chart of the substeps of step S12 in FIG. 2;
FIG. 4 is a block diagram schematically illustrating a calibration apparatus according to an embodiment of the present application;
fig. 5 is a schematic block diagram of an aircraft according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules that are expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that the description of "first", "second", etc. in this disclosure is for descriptive purposes only and is not to be construed as indicating or implying a relative importance or implying an indication of the number of technical features being indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, but it is necessary to base that the technical solutions can be realized by those skilled in the art, and when the technical solutions are contradictory or cannot be realized, the combination of the technical solutions should be considered to be absent and not within the scope of protection claimed in the present application.
The application provides a calibration method, a calibration device and an aircraft. The calibration method is applied to an aircraft provided with a camera and an IMU: first pose parameters of the camera are acquired, the first pose parameters comprising position and attitude parameters of the camera in a world coordinate system; second pose parameters of the IMU are acquired, the second pose parameters comprising position and attitude parameters of the IMU in the world coordinate system; a system state model is constructed according to the second pose parameters; a system measurement model is constructed according to the first pose parameters, the second pose parameters and the state model; a nonlinear Kalman filter is constructed according to the system state model and the system measurement model; and the first and second pose parameters are filtered with the nonlinear Kalman filter to output a filtering convergence value comprising calibration estimates of the camera and IMU parameters, whereby accurate on-line calibration of the camera and IMU parameters is realized.
Referring to fig. 1, fig. 1 shows an aircraft 10 according to the present application, the aircraft 10 being communicatively connected to a terminal device 30, wherein the terminal device 30 is used for controlling the aircraft 10.
The aircraft 10 may be a rotorcraft, such as a quad-rotor, a six-rotor, or a fixed-rotor aircraft. The terminal device 30 is, for example, a smart phone, a tablet computer, a remote controller, etc. A user may interact with the terminal device 30 via one or more user interaction devices of any suitable type, which may be a mouse, keys, touch screen, etc.
The aircraft 10 includes a fuselage 101, an arm 102 connected to the fuselage 101, a power plant 103 provided on the arm 102, and a control system (not shown) provided in the fuselage 101. The power plant 103 provides thrust, lift, etc. for the flight of the aircraft 10, and the control system is the nerve center of the aircraft 10 and may include a number of functional modules, such as a flight control system, a tracking system, a path planning system, and other systems with specific functions. The flight control system includes various types of sensors, such as IMUs, gyroscopes and accelerometers, for controlling the attitude of the aircraft 10. The path planning system is configured to plan a flight path of the aircraft 10 based on the position of the tracked target and to instruct the flight control system to control the flight attitude of the aircraft 10 so that the aircraft 10 flies along the specified path. The tracking system comprises a camera 104 connected with the fuselage 101 and a vision chip provided on the fuselage 101; the camera 104 is in communication connection with the vision chip, the camera 104 is used for capturing media data such as images or videos of the target to be tracked, and the vision chip is used for identifying the target to be tracked from the media data so as to generate corresponding tracking control instructions. The camera 104 may be a high-definition digital camera or other image capturing device and may be disposed at any suitable position for capturing images; in some embodiments, the camera 104 is mounted on the bottom of the fuselage 101 through a gimbal. In some embodiments, the vision chip may also be provided on the arm 102.
The vision chip can select and track the target by utilizing the target frame according to the characteristics of the target. In some application scenarios of the aircraft 10, a terminal device 30 is also included, and the target frame may be transmitted to the aircraft 10 via the terminal device 30. Specifically, the terminal device 30 may display a picture taken by the aircraft 10, and the user performs frame selection on the target to be tracked in the picture to obtain an initial target frame, and then uploads the initial target frame to the aircraft 10.
The communication connection between the aircraft 10 and the terminal device 30 may be established through wireless communication modules arranged inside each of them, such as signal receivers and signal transmitters, which upload or download data/commands. In other embodiments, the initial target frame may also be stored in advance in a memory device or the vision chip of the aircraft 10.
Referring to fig. 2, fig. 2 is a schematic diagram of a calibration method according to the present application, where the method is applied to an aircraft 10 and is performed by a vision chip of the aircraft 10, and the method includes:
step S10: and acquiring a first pose parameter of the camera, wherein the first pose parameter comprises a position parameter and a pose parameter of the camera in a world coordinate system.
The aircraft 10 obtains first pose parameters of the camera 104 provided on the aircraft 10, the first pose parameters being the position and attitude parameters of the camera 104 in a world coordinate system: the position parameters characterize the position coordinates of the camera 104 in the world coordinate system, and the attitude parameters characterize the attitude angle of the camera 104 in the world coordinate system. The camera 104 can thus be located in the world coordinate system by acquiring its position and attitude parameters.
Step S11: and acquiring second pose parameters of the IMU, wherein the second pose parameters comprise position parameters and pose parameters of the IMU in a world coordinate system.
The aircraft 10 obtains second pose parameters of the IMU provided on the aircraft 10, the second pose parameters being the position and attitude parameters of the IMU in the world coordinate system: the position parameters characterize the position coordinates of the IMU in the world coordinate system, and the attitude parameters characterize the attitude angle of the IMU in the world coordinate system. The IMU can thus be located in the world coordinate system by acquiring its position and attitude parameters.
Step S12: and constructing a system state model according to the second pose parameters.
Referring to fig. 3, in some embodiments, the constructing a system state model according to the second pose parameter includes:
step S121: and defining a system state according to the second pose parameter.
And defining a system state according to the second pose parameters of the IMU so as to correlate the second pose parameters of the IMU with the system state.
Illustratively, the system state is defined as X, where X is expressed as:

$$X=\left[\,{}^{G}p_{I}^{T}\;\; {}^{G}_{I}\bar{q}^{\,T}\;\; {}^{G}v_{I}^{T}\;\; b_{a}^{T}\;\; b_{g}^{T}\;\; {}^{I}r_{C}^{T}\;\; {}^{I}_{C}\bar{q}^{\,T}\,\right]^{T}$$

wherein $T$ denotes the vector transpose; ${}^{G}p_{I}$ is the position of the IMU in the world coordinate system; ${}^{G}_{I}\bar{q}$ is the attitude of the IMU in the world coordinate system; ${}^{G}v_{I}$ is the velocity of the IMU in the world coordinate system; $b_{a}$ is the bias of the IMU acceleration measurement; $b_{g}$ is the bias of the IMU angular-velocity measurement; ${}^{I}r_{C}$ is the translational component of the camera-IMU extrinsic parameters; and ${}^{I}_{C}\bar{q}$ is the rotational component of the camera-IMU extrinsic parameters.
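As a sketch, the state above can be laid out as one flat vector; the slice positions and the quaternion convention (x, y, z, w; 23 scalars in total) are assumptions for illustration, since the patent only lists the components:

```python
import numpy as np

# Assumed layout of the filter state X (quaternion attitudes give 23 scalars):
#   0:3   G_p_I  IMU position in the world frame
#   3:7   q_GI   IMU attitude quaternion (world -> IMU)
#   7:10  G_v_I  IMU velocity in the world frame
#   10:13 b_a    accelerometer measurement bias
#   13:16 b_g    gyroscope (angular-velocity) measurement bias
#   16:19 I_r_C  camera-IMU extrinsic translation
#   19:23 q_IC   camera-IMU extrinsic rotation quaternion
STATE_DIM = 23

def initial_state():
    x = np.zeros(STATE_DIM)
    x[3:7] = [0.0, 0.0, 0.0, 1.0]     # identity quaternion (x, y, z, w)
    x[19:23] = [0.0, 0.0, 0.0, 1.0]   # identity extrinsic rotation
    return x
```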
Step S122: and constructing a system state model according to the system state.
The system state model consists of the derivatives of all vectors in the system state with respect to time; the derivatives of ${}^{G}p_{I}$, ${}^{G}_{I}\bar{q}$, ${}^{G}v_{I}$, $b_{a}$, $b_{g}$, ${}^{I}r_{C}$ and ${}^{I}_{C}\bar{q}$ with respect to time are acquired to build the system state model.

The system state model is as follows:

$$\begin{aligned}
{}^{G}\dot p_{I} &= {}^{G}v_{I}\\
{}^{G}_{I}\dot{\bar q} &= \tfrac{1}{2}\,\Omega\!\left({}^{I}\omega\right)\,{}^{G}_{I}\bar{q}\\
{}^{G}\dot v_{I} &= C\!\left({}^{G}_{I}\bar{q}\right)^{T}\,{}^{I}a + {}^{G}g\\
\dot b_{a} &= n_{a\omega}\\
\dot b_{g} &= n_{g\omega}\\
{}^{I}\dot r_{C} &= 0_{3\times 1}\\
{}^{I}_{C}\dot{\bar q} &= 0_{4\times 1}
\end{aligned}$$

wherein $\dot{(\cdot)}$ denotes the derivative with respect to time; $\Omega(\cdot)$ is the quaternion-rate matrix; $n_{a\omega}$ and $n_{g\omega}$ are preset noise values; $0_{3\times 1}$ and $0_{4\times 1}$ are zero vectors; ${}^{I}\omega$ is the angular velocity of the IMU; ${}^{I}a$ is the acceleration of the IMU; and ${}^{G}g$ is the gravitational acceleration.
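The kinematic part of the state model can be sketched as a single Euler integration step; the helper names and the (x, y, z, w) quaternion convention are assumptions, and the bias random walks are omitted here because their noise enters through the filter's process covariance rather than the deterministic model:

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix for a unit quaternion (x, y, z, w)."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def propagate(p, v, q_GI, a_meas, b_a, dt, g=np.array([0.0, 0.0, -9.81])):
    """One Euler step of the position/velocity part of the state model:
    p integrates v, and v integrates the bias-corrected acceleration
    rotated into the world frame plus gravity; biases and the
    camera-IMU extrinsics are held constant."""
    R_IG = quat_to_rot(q_GI)                 # world -> IMU rotation
    a_world = R_IG.T @ (a_meas - b_a) + g    # C(q)^T (a - b_a) + g
    return p + v * dt, v + a_world * dt
```

A stationary IMU measuring exactly the reaction to gravity leaves position and velocity unchanged, which is a quick consistency check on the signs.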
Step S13: and constructing a system measurement model according to the first pose parameter, the second pose parameter and the state model.
The measurement model relates an observation value to the state quantities. In this embodiment, the coordinates of an arbitrary spatial point observed by the camera are used as the observation value to construct the system measurement model.
Illustratively, the coordinates of any spatial point in the camera coordinate system are selected to construct the system measurement model, wherein the system measurement model is:

$${}^{C}p = C\!\left({}^{I}_{C}\bar{q}\right)\left(C\!\left({}^{G}_{I}\bar{q}\right)\left({}^{G}p-{}^{G}p_{I}\right)-{}^{I}r_{C}\right)$$

wherein ${}^{C}p$ is the position of the spatial point in the camera coordinate system; $C({}^{I}_{C}\bar{q})$ is the conversion matrix from the IMU coordinate system to the camera coordinate system; $C({}^{G}_{I}\bar{q})$ is the conversion matrix from the world coordinate system to the IMU coordinate system; ${}^{G}p$ is the position of the spatial point in the world coordinate system; and ${}^{I}r_{C}$ is the translational component of the camera-IMU extrinsic parameters.
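The measurement prediction can be sketched as a pair of frame changes; the exact sign convention for the extrinsic translation (here taken as the camera position expressed in the IMU frame) is an assumption, since the source gives only the symbol glossary:

```python
import numpy as np

def point_in_camera(R_IC, R_GI, p_world, p_imu_world, r_IC):
    """Predicted position of a world point in the camera frame:
    world -> IMU via R_GI and the IMU position, then IMU -> camera
    via R_IC and the extrinsic translation r_IC."""
    p_imu_frame = R_GI @ (p_world - p_imu_world)
    return R_IC @ (p_imu_frame - r_IC)
```

With identity rotations and zero offsets the point passes through unchanged, which is a quick sanity check on the frame chain.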
Step S14: and constructing a nonlinear Kalman filter according to the system state model and the system measurement model.
The nonlinear Kalman filter may be an extended Kalman filter, an unscented Kalman filter, or a particle filter.
Step S15: and filtering the first pose parameter and the second pose parameter by using the nonlinear Kalman filter, and outputting a filtering convergence value, wherein the filtering convergence value comprises calibration estimation values of the camera and the IMU parameters.
Since the camera 104 and the IMU provided on the aircraft 10 move together with the aircraft, the system state equations are updated in real time during the movement of the aircraft 10.
In some embodiments, the filtering processing is performed on the first pose parameter and the second pose parameter by using the nonlinear kalman filter, and the outputting a filtering convergence value may be:
acquiring initialization parameters of the nonlinear Kalman filter and initializing the nonlinear Kalman filter accordingly, wherein the initialization parameters are set by the user;
filtering the system state model at time k with the initialized nonlinear Kalman filter to predict the system state model at time k+1 and the covariance matrix at time k+1, wherein the system state model at time k is constructed according to the second pose parameters at time k;
outputting an estimate of the system state model at time k+1 according to the predicted value of the system state model at time k+1 and the measured value of the system measurement model at time k+1, and updating the system state model and the covariance matrix, wherein the system measurement model is constructed according to the first pose parameters and the second pose parameters;
and checking the system state model and the covariance matrix after N iterations of updating: if the predicted value of the covariance matrix is smaller than a preset error value, judging that the nonlinear Kalman filter has converged and outputting the filtering convergence value, wherein the filtering convergence value comprises calibration estimates of the camera and IMU parameters, with which the camera and IMU parameters can be accurately calibrated on line.
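The initialize/predict/update/convergence procedure above can be condensed into a small driver loop; the `ekf` object, its `predict`/`update` methods and the trace-of-covariance convergence test are generic assumptions standing in for whichever nonlinear Kalman filter (extended, unscented, or particle variant) is chosen:

```python
import numpy as np

def run_until_converged(ekf, measurements, eps=1e-4, max_iter=1000):
    """Iterate predict/update; declare convergence once the covariance
    (summarised here by its trace) falls below a preset error value."""
    for k, z in enumerate(measurements):
        ekf.predict()                # propagate the state model k -> k+1
        ekf.update(z)                # correct with the measurement at k+1
        if np.trace(ekf.P) < eps:
            return ekf.x             # filtering convergence value
        if k + 1 >= max_iter:
            break
    return ekf.x                     # best estimate if not converged
```

The returned state then carries the calibration estimates (extrinsics and biases) alongside the pose states.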
Referring to fig. 4, the present application further provides a calibration device 20, the calibration device 20 is disposed on the aircraft 10, the aircraft 10 is further provided with a camera 104 and an IMU, the calibration device 20 is communicatively connected with the camera 104 and the IMU, and the calibration device 20 includes:
a first pose module 201, configured to obtain first pose parameters of the camera, where the first pose parameters include position parameters and pose parameters of the camera in a world coordinate system;
a second pose module 202, configured to obtain second pose parameters of the IMU, where the second pose parameters include a position parameter and a pose parameter of the IMU in a world coordinate system;
a first modeling module 203, configured to construct a system state model according to the second pose parameter;
a second modeling module 204, configured to construct a system measurement model according to the first pose parameter, the second pose parameter, and the state model;
a third modeling module 205, configured to construct a nonlinear kalman filter according to the system state model and the system measurement model;
the filtering module 206 performs filtering processing on the first pose parameter and the second pose parameter by using the nonlinear kalman filter, and outputs a filtering convergence value, where the filtering convergence value includes calibration estimation values of the camera and IMU parameters.
In some embodiments, the first modeling module is further configured to:
defining a system state according to the second pose parameter;
and constructing a system state model according to the system state.
In some embodiments, the system state is:
$$X=\left[\,{}^{G}p_{I}^{T}\;\; {}^{G}_{I}\bar{q}^{\,T}\;\; {}^{G}v_{I}^{T}\;\; b_{a}^{T}\;\; b_{g}^{T}\;\; {}^{I}r_{C}^{T}\;\; {}^{I}_{C}\bar{q}^{\,T}\,\right]^{T}$$

wherein $T$ denotes the vector transpose; ${}^{G}p_{I}$ is the position of the IMU in the world coordinate system; ${}^{G}_{I}\bar{q}$ is the attitude of the IMU in the world coordinate system; ${}^{G}v_{I}$ is the velocity of the IMU in the world coordinate system; $b_{a}$ is the bias of the IMU acceleration measurement; $b_{g}$ is the bias of the IMU angular-velocity measurement; ${}^{I}r_{C}$ is the translational component of the camera-IMU extrinsic parameters; and ${}^{I}_{C}\bar{q}$ is the rotational component of the camera-IMU extrinsic parameters.
In some embodiments, the first modeling module is further configured to:

acquire the derivatives of ${}^{G}p_{I}$, ${}^{G}_{I}\bar{q}$, ${}^{G}v_{I}$, $b_{a}$, $b_{g}$, ${}^{I}r_{C}$ and ${}^{I}_{C}\bar{q}$ with respect to time to build the system state model.
In some embodiments, the second modeling module is further configured to:
select the coordinates of any spatial point in the camera coordinate system to construct the system measurement model, wherein the system measurement model is:

$${}^{C}p = C\!\left({}^{I}_{C}\bar{q}\right)\left(C\!\left({}^{G}_{I}\bar{q}\right)\left({}^{G}p-{}^{G}p_{I}\right)-{}^{I}r_{C}\right)$$

wherein ${}^{C}p$ is the position of the spatial point in the camera coordinate system; $C({}^{I}_{C}\bar{q})$ is the conversion matrix from the IMU coordinate system to the camera coordinate system; $C({}^{G}_{I}\bar{q})$ is the conversion matrix from the world coordinate system to the IMU coordinate system; ${}^{G}p$ is the position of the spatial point in the world coordinate system; and ${}^{I}r_{C}$ is the translational component of the camera-IMU extrinsic parameters.
Referring to fig. 5, in some embodiments, the vision chip of the aircraft 10 further includes a memory 105 and a processor 106, and the memory 105 is electrically connected to the processor 106.
The memory 105 includes at least one type of readable storage medium, such as flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), magnetic memory, a magnetic disk or an optical disk. In some embodiments, the memory 105 may be an internal storage unit of the aircraft 10, such as a hard disk of the aircraft 10. In other embodiments, the memory 105 may also be an external storage device of the aircraft 10, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card provided on the aircraft 10. The memory 105 can be used not only to store application software installed on the aircraft 10 and various types of data, such as the code of a computer-readable calibration program, but also to temporarily store data that has been output or is to be output.
The processor 106 may be a central processing unit (Central Processing Unit, CPU), controller, microcontroller, microprocessor or other data processing chip in some embodiments, and the processor 106 may call program code or process data stored in the memory 105 to perform the calibration method described above.
The foregoing description is only of preferred embodiments of the present application and is not intended to limit the scope of the application; any equivalent structural or process transformation made using the contents of this specification and the drawings, or any direct or indirect application in other related technical fields, is likewise included within the scope of protection of the present application.
Claims (9)
1. A calibration method applied to an aircraft provided with a camera and an IMU, the method comprising:
acquiring a first pose parameter of the camera, wherein the first pose parameter comprises a position parameter and an attitude parameter of the camera in a world coordinate system;
acquiring a second pose parameter of the IMU, wherein the second pose parameter comprises a position parameter and an attitude parameter of the IMU in the world coordinate system;
constructing a system state model according to the second pose parameters;
constructing a system measurement model according to the first pose parameter, the second pose parameter and the state model, comprising:
selecting the coordinates of any spatial point under the camera coordinate system to construct the system measurement model, wherein the system measurement model is:

^C P = ^C_I R ( ^I_G R ( ^G P − ^G P_I ) − ^I r_C )

wherein ^C P is the position of the spatial point in the camera coordinate system; ^C_I R is the conversion matrix from the IMU coordinate system to the camera coordinate system; ^I_G R is the conversion matrix from the world coordinate system to the IMU coordinate system; ^G P is the position of the spatial point in the world coordinate system; ^G P_I is the position of the IMU in the world coordinate system; and ^I r_C is the translational component of the camera–IMU extrinsic parameters;
constructing a nonlinear Kalman filter according to the system state model and the system measurement model; the nonlinear Kalman filter is an extended Kalman filter, an unscented Kalman filter or a particle filter;
and filtering the first pose parameter and the second pose parameter by using the nonlinear Kalman filter and outputting a filtering convergence value, which comprises:
acquiring initialization parameters of the nonlinear Kalman filter and initializing the nonlinear Kalman filter according to the initialization parameters, wherein the initialization parameters are initial nonlinear Kalman filter parameters set by a user;
filtering the system state model at time k according to the initialized nonlinear Kalman filter to predict the predicted value of the system state model at time k+1 and the predicted value of the covariance matrix at time k+1, wherein the system state model at time k is constructed according to the second pose parameter at time k;
outputting an estimated value of the system state model at time k+1 according to the predicted value of the system state model at time k+1 and the measured value of the system measurement model at time k+1, and updating the system state model and the covariance matrix; and detecting the system state model and the covariance matrix after N iterative updates, N being a positive integer; if the predicted value of the covariance matrix is smaller than a preset error value, judging that the nonlinear Kalman filter has converged and outputting a filtering convergence value, wherein the filtering convergence value comprises calibrated estimated values of the camera and IMU parameters.
2. The method of claim 1, wherein the constructing a system state model according to the second pose parameters comprises:
defining a system state according to the second pose parameter;
and constructing a system state model according to the system state.
3. The method of claim 2, wherein the system state is:

X = [ ^G P_I^T, ^G q_I^T, ^G V_I^T, b_a^T, b_g^T, ^I r_C^T, ^I q_C^T ]^T

wherein T denotes the vector transpose; ^G P_I is the position of the IMU in the world coordinate system; ^G q_I is the attitude of the IMU in the world coordinate system; ^G V_I is the velocity of the IMU in the world coordinate system; b_a is the bias of the IMU acceleration measurement; b_g is the bias of the IMU angular velocity measurement; ^I r_C is the translational component of the camera–IMU extrinsic parameters; and ^I q_C is the rotational component of the camera–IMU extrinsic parameters.
4. The method of claim 3, wherein said building said system state model from said system state comprises:
acquiring the derivatives of said ^G P_I, said ^G q_I, said ^G V_I, said b_a, said b_g, said ^I r_C and said ^I q_C with respect to time to build the system state model.
5. A calibration device, the calibration device being provided in an aircraft, the aircraft further being provided with a camera and an IMU, and the calibration device being in communication connection with the camera and the IMU, wherein the calibration device comprises:
the first pose module, configured to acquire first pose parameters of the camera, wherein the first pose parameters comprise position parameters and attitude parameters of the camera in a world coordinate system;
the second pose module, configured to acquire second pose parameters of the IMU, wherein the second pose parameters comprise position parameters and attitude parameters of the IMU in the world coordinate system;
the first modeling module is used for constructing a system state model;
the second modeling module is configured to construct a system measurement model according to the first pose parameter, the second pose parameter and the state model, and includes:
selecting the coordinates of any spatial point under the camera coordinate system to construct the system measurement model, wherein the system measurement model is:

^C P = ^C_I R ( ^I_G R ( ^G P − ^G P_I ) − ^I r_C )

wherein ^C P is the position of the spatial point in the camera coordinate system; ^C_I R is the conversion matrix from the IMU coordinate system to the camera coordinate system; ^I_G R is the conversion matrix from the world coordinate system to the IMU coordinate system; ^G P is the position of the spatial point in the world coordinate system; ^G P_I is the position of the IMU in the world coordinate system; and ^I r_C is the translational component of the camera–IMU extrinsic parameters;
the third modeling module is used for constructing a nonlinear Kalman filter according to the system state model and the system measurement model; the nonlinear Kalman filter is an extended Kalman filter, an unscented Kalman filter or a particle filter;
the filtering module, configured to filter the first pose parameter and the second pose parameter by using the nonlinear Kalman filter and to output a filtering convergence value, which comprises: acquiring initialization parameters of the nonlinear Kalman filter and initializing the nonlinear Kalman filter according to the initialization parameters, wherein the initialization parameters are initial nonlinear Kalman filter parameters set by a user;
filtering the system state model at time k according to the initialized nonlinear Kalman filter to predict the predicted value of the system state model at time k+1 and the predicted value of the covariance matrix at time k+1, wherein the system state model at time k is constructed according to the second pose parameter at time k;
outputting an estimated value of the system state model at time k+1 according to the predicted value of the system state model at time k+1 and the measured value of the system measurement model at time k+1, and updating the system state model and the covariance matrix; detecting the system state model and the covariance matrix after N iterative updates, N being a positive integer; if the predicted value of the covariance matrix is smaller than a preset error value, judging that the nonlinear Kalman filter has converged and outputting a filtering convergence value, wherein the filtering convergence value comprises calibrated estimated values of the camera and IMU parameters.
6. The calibration device of claim 5, wherein the first modeling module is further configured to:
defining a system state according to the second pose parameter;
and constructing a system state model according to the system state.
7. The calibration device of claim 6, wherein the system state is:

X = [ ^G P_I^T, ^G q_I^T, ^G V_I^T, b_a^T, b_g^T, ^I r_C^T, ^I q_C^T ]^T

wherein T denotes the vector transpose; ^G P_I is the position of the IMU in the world coordinate system; ^G q_I is the attitude of the IMU in the world coordinate system; ^G V_I is the velocity of the IMU in the world coordinate system; b_a is the bias of the IMU acceleration measurement; b_g is the bias of the IMU angular velocity measurement; ^I r_C is the translational component of the camera–IMU extrinsic parameters; and ^I q_C is the rotational component of the camera–IMU extrinsic parameters.
8. The calibration device of claim 7, wherein the second modeling module is further configured to:
acquiring the derivatives of said ^G P_I, said ^G q_I, said ^G V_I, said b_a, said b_g, said ^I r_C and said ^I q_C with respect to time to build the system state model.
9. An aircraft, comprising a fuselage, an arm connected to the fuselage, a power device provided on the arm, a camera connected to the fuselage, an IMU connected to the camera, and a vision chip in communication connection with the camera and the IMU, wherein the vision chip further comprises:
a memory and a processor;
the memory is used for storing a calibration program executable by a computer;
the processor is configured to call the calibration program executable by a computer to implement the calibration method of any one of claims 1 to 4.
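The initialize/predict/update/convergence procedure recited in claim 1 can be sketched with a toy scalar example. The identity state transition and all matrices below are stand-ins for illustration, not the patent's system or measurement models:

```python
import numpy as np

def run_filter(z_seq, H, Q, R, x0, P0, eps):
    """Minimal Kalman-style loop mirroring claim 1: initialize, predict
    the state and covariance at time k+1, update with the k+1 measurement,
    and declare convergence once the covariance falls below a preset
    error value eps. The state transition is identity, since calibration
    parameters are constant."""
    x, P = x0.copy(), P0.copy()
    for k, z in enumerate(z_seq):
        # predict: identity transition, covariance grows by process noise Q
        x_pred = x
        P_pred = P + Q
        # update with the measurement at time k+1
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x_pred + K @ (z - H @ x_pred)
        P = (np.eye(len(x)) - K @ H) @ P_pred
        # convergence test on the covariance, as in the claim
        if np.trace(P) < eps:
            return x, P, k + 1                     # converged after k+1 iterations
    return x, P, len(z_seq)
```

Feeding repeated measurements of a constant parameter drives the estimate toward that value while the covariance shrinks, so the loop terminates before exhausting the measurements.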
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910834056.XA CN110728716B (en) | 2019-09-04 | 2019-09-04 | Calibration method and device and aircraft |
PCT/CN2020/113257 WO2021043214A1 (en) | 2019-09-04 | 2020-09-03 | Calibration method and device, and unmanned aerial vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910834056.XA CN110728716B (en) | 2019-09-04 | 2019-09-04 | Calibration method and device and aircraft |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110728716A CN110728716A (en) | 2020-01-24 |
CN110728716B true CN110728716B (en) | 2023-11-17 |
Family
ID=69218913
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910834056.XA Active CN110728716B (en) | 2019-09-04 | 2019-09-04 | Calibration method and device and aircraft |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110728716B (en) |
WO (1) | WO2021043214A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110728716B (en) * | 2019-09-04 | 2023-11-17 | 深圳市道通智能航空技术股份有限公司 | Calibration method and device and aircraft |
CN113687336A (en) * | 2021-09-09 | 2021-11-23 | 北京斯年智驾科技有限公司 | Radar calibration method and device, electronic equipment and medium |
CN114549656A (en) * | 2022-02-14 | 2022-05-27 | 希姆通信息技术(上海)有限公司 | Calibration method for AR (augmented reality) glasses camera and IMU (inertial measurement Unit) |
CN117990112B (en) * | 2024-04-03 | 2024-07-02 | 中国人民解放军海军工程大学 | Unmanned aerial vehicle photoelectric platform target positioning method based on robust unscented Kalman filtering |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108731670A (en) * | 2018-05-18 | 2018-11-02 | 南京航空航天大学 | Inertia/visual odometry combined navigation locating method based on measurement model optimization |
CN109341724A (en) * | 2018-12-04 | 2019-02-15 | 中国航空工业集团公司西安航空计算技术研究所 | A kind of Airborne Camera-Inertial Measurement Unit relative pose online calibration method |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8676498B2 (en) * | 2010-09-24 | 2014-03-18 | Honeywell International Inc. | Camera and inertial measurement unit integration with navigation data feedback for feature tracking |
WO2019080052A1 (en) * | 2017-10-26 | 2019-05-02 | 深圳市大疆创新科技有限公司 | Attitude calibration method and device, and unmanned aerial vehicle |
CN108711166B (en) * | 2018-04-12 | 2022-05-03 | 浙江工业大学 | Monocular camera scale estimation method based on quad-rotor unmanned aerial vehicle |
CN110728716B (en) * | 2019-09-04 | 2023-11-17 | 深圳市道通智能航空技术股份有限公司 | Calibration method and device and aircraft |
2019
- 2019-09-04 CN CN201910834056.XA patent/CN110728716B/en active Active
2020
- 2020-09-03 WO PCT/CN2020/113257 patent/WO2021043214A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108731670A (en) * | 2018-05-18 | 2018-11-02 | 南京航空航天大学 | Inertia/visual odometry combined navigation locating method based on measurement model optimization |
CN109341724A (en) * | 2018-12-04 | 2019-02-15 | 中国航空工业集团公司西安航空计算技术研究所 | A kind of Airborne Camera-Inertial Measurement Unit relative pose online calibration method |
Non-Patent Citations (2)
Title |
---|
Multi-sensor Navigation Algorithm Using Monocular Camera, IMU and GPS for Large Scale Augmented Reality; Taragay et al.; 2012 IEEE International Symposium on Mixed and Augmented Reality; 2013-01-07; p. 72 para. 4, p. 74 para. 1 *
Jiang Jing et al.; Section 3.2.2, "Kalman Filtering"; Motion Sensor Target Tracking Technology; 2017; pp. 64-66. *
Also Published As
Publication number | Publication date |
---|---|
CN110728716A (en) | 2020-01-24 |
WO2021043214A1 (en) | 2021-03-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110728716B (en) | Calibration method and device and aircraft | |
CN109696183B (en) | Calibration method and device of inertia measurement unit | |
CN111415387B (en) | Camera pose determining method and device, electronic equipment and storage medium | |
US9013617B2 (en) | Gyroscope conditioning and gyro-camera alignment | |
WO2020253260A1 (en) | Time synchronization processing method, electronic apparatus, and storage medium | |
CN108036785A (en) | A kind of aircraft position and orientation estimation method based on direct method and inertial navigation fusion | |
CN109540126A (en) | A kind of inertia visual combination air navigation aid based on optical flow method | |
CN107014371A (en) | UAV integrated navigation method and apparatus based on the adaptive interval Kalman of extension | |
US20180075614A1 (en) | Method of Depth Estimation Using a Camera and Inertial Sensor | |
CN111338383B (en) | GAAS-based autonomous flight method and system, and storage medium | |
CN110268445A (en) | It is calibrated automatically using the camera of gyroscope | |
WO2018182524A1 (en) | Real time robust localization via visual inertial odometry | |
CN113551665B (en) | High-dynamic motion state sensing system and sensing method for motion carrier | |
CN112050806B (en) | Positioning method and device for moving vehicle | |
CN104848861A (en) | Image vanishing point recognition technology based mobile equipment attitude measurement method | |
CN111247389B (en) | Data processing method and device for shooting equipment and image processing equipment | |
CN105324792A (en) | Method for estimating the angular deviation of a mobile element relative to a reference direction | |
CN114111776A (en) | Positioning method and related device | |
CN110720113A (en) | Parameter processing method and device, camera equipment and aircraft | |
Mebarki et al. | Image moments-based velocity estimation of UAVs in GPS denied environments | |
WO2020019175A1 (en) | Image processing method and apparatus, and photographing device and unmanned aerial vehicle | |
CN113256728B (en) | IMU equipment parameter calibration method and device, storage medium and electronic device | |
JP7008736B2 (en) | Image capture method and image capture device | |
CN108225316B (en) | Carrier attitude information acquisition method, device and system | |
Zhao et al. | Distributed filtering-based autonomous navigation system of UAV |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information |
Address after: 518055 Shenzhen, Guangdong, Nanshan District Xili street, No. 1001, Zhiyuan Road, B1 9. Applicant after: Shenzhen daotong intelligent Aviation Technology Co.,Ltd. Address before: 518055 Shenzhen, Guangdong, Nanshan District Xili street, No. 1001, Zhiyuan Road, B1 9. Applicant before: AUTEL ROBOTICS Co.,Ltd. |
|
GR01 | Patent grant | ||
GR01 | Patent grant |