CN112136137A - Parameter optimization method and device, control equipment and aircraft - Google Patents

Parameter optimization method and device, control equipment and aircraft

Info

Publication number
CN112136137A
CN112136137A
Authority
CN
China
Prior art keywords
image frame
aircraft
initial value
image
relative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980030587.1A
Other languages
Chinese (zh)
Inventor
高文良
叶长春
周游
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112136137A publication Critical patent/CN112136137A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

A parameter optimization method, a parameter optimization device, a control device and an aircraft are provided. The method includes the following steps: acquiring a set of image frames captured by the camera unit (103) during flight of the aircraft; determining multiple groups of matched feature point sets in the image frame set, where each group of matched feature points corresponds to one space point and includes multiple matched feature points; and optimizing associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit (104) at the time each image frame of the image frame set was acquired and according to the image positions of the matched feature points in the matched feature point sets on the corresponding image frames of the image frame set. The associated parameters include at least one of a relative attitude parameter between the camera unit (103) and the inertial measurement unit (104) and an internal parameter of the camera unit (103), so that more accurate external and internal parameters can be obtained.

Description

Parameter optimization method and device, control equipment and aircraft
Technical Field
The invention relates to the technical field of electronics, in particular to a parameter optimization method and device, control equipment and an aircraft.
Background
As a versatile movable platform, an aircraft can serve the task requirements of different users. For example, a camera can be mounted on the aircraft to capture images in places that people cannot reach, or to monitor a large environment; as another example, an agricultural spraying device can be mounted on the aircraft so that pesticide can be sprayed over farmland quickly.
To complete a user's flight tasks smoothly and safely, in addition to deploying reasonable flight control strategies on the flight controller, sensing devices such as a binocular camera, an IMU (Inertial Measurement Unit) and a compass are arranged on the aircraft. The sensing data they acquire (including environment images, attitude data of the aircraft, and the like) are used to determine the flight state of the aircraft, so that the aircraft can be controlled safely and reliably and the user's tasks can be completed.
In general, a camera unit and an inertial measurement unit are deployed on an aircraft, and the flight environment and the aircraft's own condition are monitored based on the data of these two units. In the prior art, the internal parameters of the camera unit and the relative attitude parameters (i.e., external parameters) between the camera unit and the inertial measurement unit are stored in the aircraft when it leaves the factory, and the aircraft uses the camera unit and the inertial measurement unit based on these stored internal and external parameters. However, during use of the aircraft the internal and external parameters change, which may prevent the aircraft from accurately monitoring the flight environment and its own condition.
Disclosure of Invention
The embodiment of the invention provides a parameter optimization method, a parameter optimization device, a control device and an aircraft, so that the associated parameters required for fusing the camera unit and the inertial measurement unit can be optimized during the flight of the aircraft.
In one aspect, an embodiment of the present invention provides a parameter optimization method applied to an aircraft, where the aircraft includes a camera unit and an inertial measurement unit, and the method includes:
acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one spatial point and comprises a plurality of matched feature points;
optimizing correlation parameters stored by an aircraft according to sensing data of the inertial measurement unit when each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frame of the image frame set;
wherein the associated parameters include: at least one of a relative pose parameter between the camera unit and the inertial measurement unit and an internal reference of the camera unit.
In another aspect, an embodiment of the present invention further provides a parameter optimization apparatus, where the apparatus is applied to an aircraft, where the aircraft includes a camera unit and an inertial measurement unit, and the apparatus includes:
the acquisition module is used for acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
a determining module, configured to determine multiple sets of matched feature point sets in the image frame set, where each set of matched feature point set corresponds to one spatial point, and each set of matched feature point set includes multiple matched feature points;
the processing module is used for optimizing the correlation parameters stored by the aircraft according to the sensing data of the inertial measurement unit during the acquisition of each image frame of the image frame set and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set;
wherein the associated parameters include: at least one of a relative pose parameter between the camera unit and the inertial measurement unit and an internal reference of the camera unit.
In yet another aspect, an embodiment of the present invention further provides a control device, where the control device is connected to an aircraft, where the aircraft includes a camera unit and an inertial measurement unit, and the control device includes a storage device and a processor;
the storage device is used for storing program instructions;
the processor invokes the program instructions to:
Acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one spatial point and comprises a plurality of matched feature points;
optimizing correlation parameters stored by an aircraft according to sensing data of the inertial measurement unit when each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frame of the image frame set;
wherein the associated parameters include: at least one of a relative pose parameter between the camera unit and the inertial measurement unit and an internal reference of the camera unit.
In yet another aspect, an embodiment of the present invention further provides an aircraft, where the aircraft includes:
a camera unit;
an inertial measurement unit;
the power assembly is used for providing power for driving the aircraft to move;
storage means for storing program instructions;
a controller that invokes the program instructions to:
Acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one spatial point and comprises a plurality of matched feature points;
optimizing correlation parameters stored by an aircraft according to sensing data of the inertial measurement unit when each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frame of the image frame set;
wherein the associated parameters include: at least one of a relative pose parameter between the camera unit and the inertial measurement unit and an internal reference of the camera unit.
The embodiment of the invention optimizes the relative attitude parameters (i.e., external parameters) between the camera unit and the inertial measurement unit and the internal parameters of the camera unit based on the image sequence acquired by the aircraft during flight and the data acquired by the inertial measurement unit during image acquisition. The aircraft can therefore obtain more accurate external and internal parameters during flight, which facilitates subsequent flight processing such as positioning and speed measurement based on these accurate external and internal parameters.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of an aircraft in accordance with an embodiment of the present invention;
FIG. 2 is a diagram illustrating a parameter optimization process according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a parameter optimization method according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart diagram of a particular optimization method of an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a parameter optimization apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a control device according to an embodiment of the present invention.
Detailed Description
A conventional aircraft generally includes a body, a power assembly, flight assistance devices, a power supply module and other components. The flight assistance devices include many sensing devices, such as a monocular or binocular camera (also referred to as a camera unit), an IMU composed of sensors such as an acceleration sensor and a gyroscope, a barometer for determining the flight altitude, and a compass for determining the flight heading. Through these structures, the aircraft can better perform flight missions and provide the user with various desired services. Fig. 1 is a schematic structural diagram of an aircraft according to an embodiment of the present invention. The sensing devices mainly involved in the embodiment of the present invention include a camera unit 103 and an inertial measurement unit 104. The camera unit 103 may be mounted on the airframe 101 of the aircraft directly or through a gimbal, and is used for capturing environment images. The IMU 104 may be disposed inside the airframe 101; for example, as indicated by the dotted line in fig. 1, the IMU 104 is disposed in a cavity inside the housing forming the airframe 101, close to or spaced from other structural components of the aircraft. Of course, the IMU 104 may also be disposed at other positions, such as being fixed on the landing gear of the aircraft. The power assembly 102 of the aircraft in fig. 1 is a rotor-based power assembly for powering the aircraft. In the embodiment of the present invention, the aircraft may be a rotorcraft as shown in fig. 1, or a fixed-wing aircraft.
In one embodiment, the aircraft can be positioned and its attitude and speed measured by fusing the data of the camera unit and the IMU with a VIO (visual-inertial odometry) module on the aircraft. Current visual-inertial odometry systems that use a camera and an IMU fall largely into two categories: loosely coupled and tightly coupled. A loosely coupled visual-inertial odometry system performs motion state estimation separately through an independent visual motion estimation module (for the camera unit) and an inertial navigation motion estimation module (for the IMU), and then fuses the outputs of the two modules to obtain the final aircraft pose information. A tightly coupled visual-inertial odometry system directly fuses the raw data of the camera unit and the IMU, estimating jointly so that the two sensors constrain and complement each other; this makes full use of the characteristics of the sensors and achieves high accuracy.
At the same time, fusing the visual observations of the camera unit with the output of the IMU through visual-inertial odometry yields accurate aircraft pose information, and measurement drift caused by the bias of the IMU is less likely to occur.
In a vision system, three-dimensional rays correspond one-to-one to two-dimensional pixel coordinates, and this correspondence is described by the internal parameters of the camera unit, which may include the focal length, the optical center, distortion parameters, and the like. Whether the camera internal parameters are accurate determines the precision of the conversion from two-dimensional pixel coordinates to three-dimensional ray information, i.e., the precision of the conversion between the three-dimensional space and the two-dimensional image space.
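To make the role of the internal parameters concrete, the following is a minimal sketch (not taken from the patent) of a pinhole projection with radial and tangential distortion; the parameter names fx, fy, cx, cy, k1, k2, p1, p2 and the sample values are illustrative assumptions.

```python
import numpy as np

def project_point(P_cam, fx, fy, cx, cy, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Project a 3D point given in the camera frame to pixel coordinates.

    Illustrates how focal length (fx, fy), principal point (cx, cy) and
    radial/tangential distortion (k1, k2, p1, p2) map a 3D ray to a 2D pixel.
    """
    X, Y, Z = P_cam
    x, y = X / Z, Y / Z                      # normalized image coordinates
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2    # radial distortion factor
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    u = fx * x_d + cx                        # apply focal length and principal point
    v = fy * y_d + cy
    return np.array([u, v])

# Example: a point 5 m in front of the camera
print(project_point(np.array([0.2, -0.1, 5.0]), fx=400.0, fy=400.0, cx=320.0, cy=240.0))
```

If the stored fx, fy, cx, cy or distortion coefficients drift from their true values, every pixel-to-ray conversion computed this way is biased, which is why the accuracy of the internal parameters matters.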
In a visual inertial odometer, the camera needs to be synchronized in time with the output of the IMU inertial measurement unit to ensure that there is an acceptable, stable data delay. The external parameter, i.e. the relative spatial relationship (relative position and relative attitude), between the two sensor coordinate systems of the camera unit and the IMU directly determines the accuracy of the visual inertial odometer.
In the embodiment of the invention, the internal parameters of the camera unit and the external parameters between the camera unit and the IMU are optimized in a corresponding parameter optimization mode, so that better internal parameters and external parameters can be obtained in the use process of the aircraft.
Referring to fig. 2, which is a schematic diagram of a parameter optimization processing flow according to an embodiment of the present invention, the parameter optimization of the embodiment of the present invention broadly includes the following. In S201, the pose of the camera unit is calculated using the internal parameters of the camera unit recorded in the aircraft and a tightly coupled algorithm, where the recorded internal parameters are the factory default values, the values obtained after the last optimization, or values adjusted and configured by the user as needed. Based on these internal parameters and the tightly coupled algorithm mentioned above, the pose of the camera unit can be calculated, which includes the relative rotation and translation between the camera unit and the IMU, i.e., what can be understood as the external parameters between the camera unit and the IMU. When processing based on the internal parameters and the tightly coupled algorithm, the image sequence of the camera unit and the information of the IMU need to be acquired, key frames are selected from the image sequence, and feature points are extracted and matched; the feature point matching across key frames may be performed by corner detection, the Scale Invariant Feature Transform (SIFT) algorithm, the KLT algorithm, or the like. Three-dimensional reconstruction is then performed based on the camera poses and the matched feature points obtained by matching, to obtain a sparse point map.
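The feature extraction and tracking step mentioned above could, for instance, use OpenCV's corner detector together with the pyramidal KLT tracker, as in the sketch below; this is only an illustration under the assumption that two consecutive grayscale frames are available, not the patent's own implementation.

```python
import cv2
import numpy as np

def track_features(prev_gray, curr_gray, max_corners=300):
    """Detect corners in the previous frame and track them into the current frame
    with the pyramidal Lucas-Kanade (KLT) optical flow, returning matched pairs."""
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                       qualityLevel=0.01, minDistance=10)
    if prev_pts is None:
        return np.empty((0, 2)), np.empty((0, 2))
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      prev_pts, None)
    good = status.reshape(-1) == 1           # keep only successfully tracked points
    return prev_pts.reshape(-1, 2)[good], curr_pts.reshape(-1, 2)[good]
```

The returned pixel pairs are the kind of matched feature points that the subsequent triangulation and sparse point map construction operate on.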
In S202, it is determined whether optimization of the relevant associated parameters is required, the associated parameters including at least one of the relative attitude parameters between the camera unit and the inertial measurement unit and the internal parameters of the camera unit. If the judgment result is yes, S203 is executed; if the judgment result is no, the next image frame from the camera unit and the corresponding IMU data are acquired and S201 is executed again. In one embodiment, whether it is suitable to optimize the associated parameters can be determined by judging the motion state and the state of the obtained sparse point map.
In S203, the recorded internal parameters of the camera unit and the external parameters between the camera unit and the IMU calculated in S201 are used as initial values, and a further optimization solution is performed. The optimized internal parameters of the camera unit and the optimized external parameters between the camera unit and the IMU can thus be obtained. After the optimization is complete, the optimized internal and external parameters may be stored in the aircraft, so that flight control strategies or other data processing functions that require the internal and external parameters can invoke them. After S203 is executed, the system may wait for the next optimization of the internal and/or external parameters, for example the arrival of the next optimization period, or the detection of an impact on the aircraft that may have shifted the camera unit and the inertial sensor, and then execute S201 to S203 again (or, specifically, S301 to S303 below) to optimize the internal and external parameters.
The above-described optimization process may be implemented in a dedicated control device, may be performed by the flight controller of the aircraft, or may be performed by processing means provided in the camera unit.
Referring to fig. 3, which is a schematic flow chart of a parameter optimization method according to an embodiment of the present invention, the method is applied to an aircraft that includes a camera unit and an IMU, where the camera unit and the IMU may be fixedly connected relative to each other. The method according to the embodiment of the invention may also be performed by a dedicated control device, by the flight controller of the aircraft, or by a processing apparatus provided in the camera unit. The method is used for optimizing associated parameters between the camera unit and the inertial measurement unit included on the aircraft; in one embodiment the associated parameters include at least one of a relative attitude parameter between the camera unit and the inertial measurement unit and an internal parameter of the camera unit. The method includes the steps shown in fig. 3, described below.
During the flight of the aircraft, a large number of environment images can be captured by the camera unit, which is connected to the aircraft directly or through a gimbal. In S301, an image frame set acquired by the camera unit during the flight of the aircraft is obtained. In one embodiment, all images acquired by the camera unit may be added to the image frame set, to facilitate a complete and accurate analysis. In another embodiment, only some image frames of special significance may be added as key frames, and the key frames are added to the image frame set for the subsequent analysis of the embodiment of the present invention. That is, S301 may specifically include: acquiring original image frames captured by the camera unit during the flight of the aircraft; and obtaining the image frame set from the original image frames, where the image frame set includes the image frames that are selected from the original image frames and satisfy a key frame condition. For an image frame set constructed in this way, the key frames satisfy the following conditions.
First, the relative translation amount between adjacent image frames in the image frame set satisfies a key frame condition, where satisfying the key frame condition includes: the relative translation amount is greater than or equal to a preset translation threshold. In one embodiment, the translation amount represents the distance the camera unit translated while successively capturing the adjacent image frames. That is to say, any two adjacent image frames in the image frame set are captured after the camera has moved a certain distance; setting a reasonable movement distance for selecting key frames into the image frame set saves software and hardware resources during processing and helps ensure the accuracy of the subsequent optimization.
Second, the relative rotation amount between adjacent image frames in the image frame set satisfies a key frame condition, where satisfying the key frame condition includes: the relative rotation amount is greater than or equal to a preset rotation threshold. The relative rotation amount represents the angle the camera unit rotated while successively capturing the adjacent image frames. That is to say, any two adjacent image frames in the image frame set are captured after the camera has rotated by a certain angle; setting a reasonable angle for selecting key frames into the image frame set likewise saves software and hardware resources and helps ensure the accuracy of the subsequent optimization.
Third, the number of matched feature points detected on an image frame in the image frame set satisfies a key frame condition, which includes: the number of detected matched feature points is greater than or equal to a first preset number threshold. Here, the matched feature points on an image frame are feature points that are extracted from the image frame and successfully tracked and matched against the N image frames preceding it. In one embodiment, the matched feature points may be determined from the original image frames by the corner detection, SIFT or KLT algorithms mentioned above. The more matched feature points there are, the more pronounced the optimization effect and the better the finally obtained parameters.
Fourth, the number of feature points detected on an image frame in the image frame set satisfies a key frame condition, which includes: the number of detected feature points is greater than or equal to a second preset number threshold. Feature point detection only finds meaningful points present in the current image; these feature points may or may not have matches in one or more other image frames. Similarly, the more feature points are obtained, the more pronounced the optimization effect and the better the finally obtained parameters.
In a specific implementation, a key frame is selected from the original image frames to obtain an image frame set, and any one or more of the first, second, third, and fourth described criteria are satisfied between the image frames selected into the image frame set.
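As an illustration of how the key frame conditions above could be checked in practice, the following sketch combines the four conditions; the threshold values and the exact way the conditions are combined are assumptions, since the embodiment only requires that one or more of the conditions be satisfied.

```python
import numpy as np

def is_key_frame(rel_translation, rel_rotation_deg, n_matched, n_detected,
                 t_min=0.2, r_min_deg=5.0, min_matched=30, min_detected=80):
    """Decide whether a candidate frame should be added to the image frame set.

    rel_translation / rel_rotation_deg: camera motion since the previous key frame.
    n_matched / n_detected: matched and detected feature point counts on the frame.
    Thresholds and the AND/OR combination below are illustrative assumptions.
    """
    cond_translation = np.linalg.norm(rel_translation) >= t_min
    cond_rotation = abs(rel_rotation_deg) >= r_min_deg
    cond_matched = n_matched >= min_matched
    cond_detected = n_detected >= min_detected
    return (cond_translation or cond_rotation) and cond_matched and cond_detected

# Example: 0.5 m of translation, 2 degrees of rotation, 45 matches, 120 detections
print(is_key_frame(np.array([0.5, 0.0, 0.0]), 2.0, 45, 120))
```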
After the image frame set is obtained, processing is performed on the basis of the image frame set. In S302, multiple groups of matched feature point sets in the image frame set are determined, where each group of matched feature points corresponds to one space point and includes multiple matched feature points. A space point is the specific three-dimensional position, in the world coordinate system, that is calculated from a matched feature point on an image frame of the image frame set; the space points determined from the matched feature points can be obtained in a SLAM (Simultaneous Localization And Mapping) process. Each group of matched feature points includes a target matched feature point, where a target matched feature point is one for which the proportion of image frames in the image frame set that contain the point is greater than a preset proportion threshold, for example more than 80%.
In S303, based on the sensing data of the inertial measurement unit at the time each image frame of the image frame set was acquired, and based on the image positions of the matched feature points of the multiple matched feature point sets on the corresponding image frames of the image frame set, the associated parameters stored in the aircraft are optimized.
In an optional implementation, after the optimization is complete, the aircraft can be controlled according to the optimized associated parameters, including controlling the flight position, flight speed and the like of the aircraft. Subsequently, when the spatial position of the aircraft in the current environment is calculated based on the data of the camera unit and the IMU, the optimized associated parameters are used in the calculation, so that the aircraft is positioned accurately and the flight position and flight speed of the aircraft can be controlled.
In an optional implementation, a trigger condition for starting the optimization may also be set. In one embodiment, the method may further include: acquiring flight state data of the aircraft at the times the camera unit captured multiple image frames of the image frame set; and determining, according to the flight state data, whether the flight state of the aircraft is greater than or equal to a preset flight state change threshold. Specifically, the flight state and whether it satisfies the condition may be determined before S303 is performed. The flight state data may be the speed of the aircraft when the camera unit captured adjacent image frames; when the speed changes, the IMU has accurate data output, i.e., the IMU is sufficiently excited, and the associated parameters can be optimized. In one embodiment, the speed of the aircraft when the camera unit captured adjacent image frames is denoted $v_i$, and the variance of the speed is calculated; when the variance of the speed is greater than a certain threshold, it is considered appropriate to optimize the associated parameters. The variance of the speed is calculated as follows:

$$\sigma_v^2 = \frac{1}{n}\sum_{i=1}^{n}\left(v_i - \bar{v}\right)^2, \qquad \bar{v} = \frac{1}{n}\sum_{i=1}^{n} v_i$$
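A minimal sketch of this excitation check, assuming the per-key-frame speeds are available as scalars and using an illustrative threshold value:

```python
import numpy as np

def imu_sufficiently_excited(speeds, var_threshold=0.25):
    """Return True when the variance of the aircraft speed over the key frames
    exceeds a threshold, i.e. when the IMU is considered sufficiently excited
    for the associated parameters to be optimized."""
    speeds = np.asarray(speeds, dtype=float)
    return speeds.var() > var_threshold      # var() computes mean((v - v_bar)**2)

print(imu_sufficiently_excited([4.8, 5.1, 6.7, 3.2, 7.4]))   # True for this sample
```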
Based on the variance calculated in the above manner, S303 may include: when the flight state parameter is greater than or equal to the preset flight state change threshold, optimizing the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set was acquired and according to the image positions of the matched feature points in the matched feature point sets on the corresponding image frames of the image frame set.
In an embodiment, please refer to fig. 4, which is a flowchart illustrating a specific optimization method according to an embodiment of the present invention, where the method according to the embodiment of the present invention corresponds to the above-mentioned S303. In the embodiment of the present invention, the method may specifically include the following steps.
S401: determining initial pose values according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set was acquired, where the initial pose values include: an initial relative translation value and an initial relative rotation value between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set was acquired (a simplified sketch of how such initial values can be integrated from IMU samples follows after S402).
S402: and according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image position of the matched feature point in the matched feature point set on the corresponding image frame in the image frame set, operating an optimization algorithm to optimize the stored associated parameters so as to obtain the optimized associated parameters, wherein the optimization algorithm is configured according to the reprojection error of the space point.
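The initial pose values of S401 come from the IMU sensing data. One heavily simplified way to obtain them is naive dead reckoning between two image timestamps, as sketched below; bias handling, noise modeling and the compass aiding mentioned later are omitted, and all names are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def integrate_imu(R_wb0, v0, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
    """Dead-reckon body orientation, velocity and position in the world frame
    from gyroscope and accelerometer samples (no bias or noise handling).

    R_wb0: initial body-to-world rotation (scipy Rotation); v0: initial world
    velocity; gyro/accel: (N, 3) body-frame samples; dt: sample period [s].
    Returns (R_wb, p, v); world-to-body initial values are then R_wb.inv() and
    -R_wb.inv().apply(p).
    """
    R_wb, v, p = R_wb0, np.asarray(v0, dtype=float).copy(), np.zeros(3)
    for w, a in zip(gyro, accel):
        R_wb = R_wb * R.from_rotvec(np.asarray(w) * dt)  # apply gyro increment
        a_world = R_wb.apply(a) + g                      # gravity-compensated accel
        p = p + v * dt + 0.5 * a_world * dt * dt
        v = v + a_world * dt
    return R_wb, p, v
```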
In the embodiment of the invention, the optimization is performed based on the reprojection error, the initial relative translation value and the initial relative rotation value, to obtain optimized relative translation and rotation values. In one embodiment, the optimization algorithm configured according to the reprojection error, used to optimize the initial relative translation and rotation values, can be expressed by the following Equation 2:

$$\left\{R_{bw}^{i},\ t_{bw}^{i},\ P_i\right\} = \arg\min \sum_{i} \left\| \pi\!\left(K,\ R_{ex}\!\left(R_{bw}^{i} P_i + t_{bw}^{i}\right) + t_{ex}\right) - p_i \right\|^2$$

Here the projective transformation is abbreviated as $p' = \pi(R P_i + t)$, where $\pi$ denotes the projection function that maps a space point $P_i$ in three-dimensional space onto the i-th image frame captured by the camera unit through rotation, translation and projection.
$P_i$ is the three-dimensional coordinate of the matched feature point, i.e., the space point corresponding to the matched feature point set to which that matched feature point belongs.
$p_i$ is the pixel coordinate (i.e., two-dimensional image coordinate) of the matched feature point on the i-th image frame; it denotes the position of the matched feature point of the matched feature point set on the corresponding image frame of the image frame set.
$\bar{R}_{bw}^{i}$ and $\bar{t}_{bw}^{i}$ denote the initial values of the relative rotation and translation from the world coordinate system to the current body coordinate system, i.e., $\bar{R}_{bw}^{i}$ corresponds to the initial relative rotation value mentioned above and $\bar{t}_{bw}^{i}$ corresponds to the initial relative translation value mentioned above; they are calculated from the sensing data sensed by the IMU. In one embodiment, $\bar{R}_{bw}^{i}$ and $\bar{t}_{bw}^{i}$ may be determined based on the sensing data of the IMU and the sensor data of the compass.
$R_{ex}, t_{ex}$ denote the default external parameters between the camera and the IMU, and $K$ denotes the default internal parameters of the camera unit, which include the focal length $f$, the principal point coordinates $c$, the radial distortion coefficient $k$ and the tangential distortion coefficient $p$. In the embodiment of the present invention, the external and internal parameters used in the camera projection are default values stored in the aircraft, either factory defaults or the values stored in the storage device of the aircraft after the previous optimization.
$\arg$ indicates the parameters (the goal) of the optimization, namely $R_{bw}^{i}$, $t_{bw}^{i}$ and $P_i$. The optimization process based on Equation 2 can be understood as first using the default camera parameters $K$, $R_{ex}$, $t_{ex}$ to solve for $R_{bw}^{i}$, $t_{bw}^{i}$ and $P_i$. The main quantities being optimized are the relative attitude parameters of the transformation from the world coordinate system to the current body coordinate system, i.e., $R_{bw}^{i}$ and $t_{bw}^{i}$; $P_i$ is optional, or it can be considered that optimizing the associated parameters optionally optimizes $P_i$.
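The sketch below illustrates the kind of computation Equation 2 describes for a single image frame: with the stored internal parameters K and external parameters R_ex, t_ex held fixed, the world-to-body pose is refined by minimizing the reprojection error, starting from the IMU-derived initial values. Distortion is omitted for brevity and the function names, the rotation-vector parameterization and the use of scipy's least_squares solver are assumptions; this is an illustrative reconstruction, not the patent's code.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

def reproj_residuals(x, points_w, pixels, K, R_ex, t_ex):
    """Residuals of Equation 2 for one frame.

    x = [rotvec(3), t(3)] parameterizes the world-to-body pose; points_w are
    the space points P_i (N, 3); pixels are their observations p_i (N, 2).
    K and R_ex are 3x3 arrays, t_ex is a length-3 array.
    """
    R_bw, t_bw = R.from_rotvec(x[:3]), x[3:6]
    P_body = R_bw.apply(points_w) + t_bw          # world -> body
    P_cam = (R_ex @ P_body.T).T + t_ex            # body -> camera (external parameters)
    proj = (K @ P_cam.T).T
    proj = proj[:, :2] / proj[:, 2:3]             # perspective division
    return (proj - pixels).ravel()

def refine_pose(rotvec0, t0, points_w, pixels, K, R_ex, t_ex):
    """Refine the IMU-derived initial pose by minimizing the reprojection error."""
    x0 = np.hstack([rotvec0, t0])
    sol = least_squares(reproj_residuals, x0,
                        args=(points_w, pixels, K, R_ex, t_ex))
    return R.from_rotvec(sol.x[:3]), sol.x[3:6]
```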
In an embodiment, the S402 may further specifically include: running an optimization algorithm according to the initial value of the relative translation amount, the initial value of the relative rotation amount, the image position of the matched feature point in the matched feature point set on the corresponding image frame in the image frame set and the stored associated parameters to obtain the three-dimensional coordinates of the space point, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount; and operating the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
That is to say, in the specific optimization process, initial optimization may be performed first, to obtain the three-dimensional coordinates of the initially optimized spatial point, the initial value of the relative translation amount after initial optimization, and the initial value of the relative rotation amount after initial optimization, and further optimization of all the associated parameters may be performed on the basis that the parameters obtained by initial optimization are used as new initial values.
Firstly, performing primary optimization by using the initial value of the relative translation amount, the initial value of the relative rotation amount, the image position of the matching feature point in the matching feature point set on the corresponding image frame in the image frame set, and the stored associated parameters (actually using the internal parameters and the external parameters in the stored associated parameters) through the formula 2 to obtain each primary optimized parameter (the primary optimized parameters include the three-dimensional coordinates of the space point, the optimized initial value of the relative translation amount, and the optimized initial value of the relative rotation amount).
Second, the parameters from the initial optimization are taken as new initial values, and the optimization algorithm is run again to obtain new associated parameters. The new associated parameters may include not only the internal parameters $K$ of the camera unit and the external parameters $R_{ex}, t_{ex}$ of the camera unit, but also the three-dimensional coordinates $P_i$ of the space points and the relative translation $t_{bw}^{i}$ and relative rotation $R_{bw}^{i}$ between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set was acquired.
In one embodiment, the re-executed optimization algorithm is similar to Equation 2 above and follows the same principle, except that the new optimization may need to solve for more parameters; see Equation 3 below:

$$\left\{K,\ R_{ex},\ t_{ex},\ R_{bw}^{i},\ t_{bw}^{i},\ P_i\right\} = \arg\min \sum_{i} \left\| \pi\!\left(K,\ R_{ex}\!\left(R_{bw}^{i} P_i + t_{bw}^{i}\right) + t_{ex}\right) - p_i \right\|^2$$
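Equation 3 differs from Equation 2 mainly in how many variables the solver adjusts. The sketch below only shows one possible packing and unpacking of the enlarged parameter vector (internal parameters, external parameters, per-frame poses and space points); the residual function from the previous sketch could then be evaluated against the unpacked values. The layout and names are assumptions.

```python
import numpy as np

def pack_params(fx, fy, cx, cy, rotvec_ex, t_ex, frame_poses, points_w):
    """Concatenate all Equation-3 variables into one vector for the solver.

    frame_poses: list of (rotvec, t) world-to-body poses, one per key frame;
    points_w: (N, 3) array of space points P_i.
    """
    blocks = [np.array([fx, fy, cx, cy]), rotvec_ex, t_ex]
    for rotvec, t in frame_poses:
        blocks += [rotvec, t]
    blocks.append(np.asarray(points_w).ravel())
    return np.hstack(blocks)

def unpack_params(x, n_frames, n_points):
    """Inverse of pack_params: recover intrinsics, extrinsics, poses and points."""
    fx, fy, cx, cy = x[0:4]
    rotvec_ex, t_ex = x[4:7], x[7:10]
    poses, offset = [], 10
    for _ in range(n_frames):
        poses.append((x[offset:offset + 3], x[offset + 3:offset + 6]))
        offset += 6
    points_w = x[offset:offset + 3 * n_points].reshape(n_points, 3)
    return (fx, fy, cx, cy), rotvec_ex, t_ex, poses, points_w
```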
After the desired associated parameters are obtained through the optimization, it may then be determined whether the optimized associated parameters are valid, that is, whether the associated parameters obtained after the final optimization can be used directly. Determining whether they are valid may include, for example, judging whether the number of reliable feature points has increased, whether the three-dimensional position coordinates of the reliable feature points are more stable and change less, whether the estimated speed is smoother, and whether the integral of the speed fits the position better. If the optimized associated parameters are valid, the self-calibration of the internal parameters, external parameters and other parameters of the camera unit is considered successful, and the associated parameters can be used by other components.
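A minimal sketch of such a validity check, covering two of the criteria listed above (the reliable feature point count and the consistency between the velocity integral and the position); the choice of criteria and the threshold values are assumptions.

```python
import numpy as np

def optimization_is_valid(n_reliable_before, n_reliable_after,
                          velocities, positions, dt, max_drift=0.5):
    """Accept the optimized associated parameters if the number of reliable
    feature points did not decrease and integrating the estimated velocity
    reproduces the estimated positions to within max_drift metres."""
    if n_reliable_after < n_reliable_before:
        return False
    velocities, positions = np.asarray(velocities), np.asarray(positions)
    integrated = positions[0] + np.cumsum(velocities[:-1] * dt, axis=0)  # simple Euler integral
    drift = np.linalg.norm(integrated - positions[1:], axis=1)
    return bool(np.all(drift <= max_drift))
```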
In one embodiment, a variation of the three-dimensional coordinates of the spatial point obtained when the optimization algorithm is executed may also be determined, and when the variation is smaller than or equal to a preset variation threshold, the re-optimization is performed, that is, the following steps are performed: and operating the optimization algorithm according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount, and optimizing the stored associated parameters.
The process of determining the amount of change in the three-dimensional coordinates of the space points of a given matched feature point set can be described with reference to the following.
The matching feature points on the image frames in the image frame set described above may be understood as reliable feature points, which are obtained as follows.
All feature points on an image frame in the image frame set are traversed, and it is determined whether the largest reprojection error of a feature point is small enough (smaller than a certain threshold) and whether the feature point appears often enough in the image frame set (more than a certain threshold, for example tracking and matching succeed in 80% of the key frames). For example, if there are 100 image frames in the image frame set and a certain target feature point has a matched feature point in 85 of them, and at the same time the reprojection error of the feature point is smaller than the threshold, the feature point is considered a reliable feature point. The reliable feature point and the matched feature points on the other image frames constitute one group of matched feature points, and all of these feature points belong to the matched feature points of that matched feature point set.
The amount of change of the three-dimensional position of each point in each matched feature point set during the optimization is determined according to the three-dimensional coordinates of the space points corresponding to all or some of the matched feature point sets; this is the amount of change of the three-dimensional coordinates of the space points mentioned above. When the change is small enough (smaller than a certain change threshold), re-optimization is considered possible, and based on Equation 3 the optimization yields not only the internal parameters $K$ of the camera unit and the external parameters $R_{ex}, t_{ex}$ between the camera unit and the inertial measurement unit, but also the three-dimensional coordinates $P_i$ of the space points and the relative translation $t_{bw}^{i}$ and relative rotation $R_{bw}^{i}$ between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set was acquired.
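The reliable feature point selection and the change check described above might look like the following sketch; the track representation, the 80% presence ratio and the threshold values are illustrative assumptions.

```python
import numpy as np

def select_reliable_tracks(tracks, n_key_frames, max_reproj_err=2.0, presence_ratio=0.8):
    """Keep only feature tracks observed in enough key frames and whose largest
    reprojection error is small enough.

    tracks: list of dicts with keys 'n_observations' and 'reproj_errors' (pixels).
    """
    reliable = []
    for track in tracks:
        seen_enough = track['n_observations'] >= presence_ratio * n_key_frames
        accurate = max(track['reproj_errors']) <= max_reproj_err
        if seen_enough and accurate:
            reliable.append(track)
    return reliable

def points_converged(points_prev, points_curr, max_change=0.02):
    """True when the space points moved less than max_change (metres) between
    successive optimization runs, so the joint optimization of Equation 3 may start."""
    change = np.linalg.norm(np.asarray(points_curr) - np.asarray(points_prev), axis=1)
    return bool(np.all(change <= max_change))
```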
The embodiment of the invention can optimize the external parameters between the camera unit and the inertial measurement unit and the internal parameters of the camera unit based on the image sequence acquired by the aircraft during flight and the data acquired by the inertial measurement unit during image acquisition, so that more accurate external and internal parameters can be obtained, which facilitates subsequent flight processing such as positioning and speed measurement of the aircraft based on the accurate external and internal parameters.
Referring to fig. 5, which is a schematic structural diagram of a parameter optimization apparatus according to an embodiment of the present invention, the apparatus is applied to an aircraft that includes a camera unit and an inertial measurement unit, and is mainly used for optimizing the internal parameters of the camera unit and the external parameters between the camera unit and the inertial measurement unit. In the embodiment of the invention, the apparatus includes the following modules.
An obtaining module 501, configured to obtain an image frame set acquired by the camera unit in a flight process of the aircraft; a determining module 502, configured to determine multiple sets of matched feature point sets in the image frame set, where each set of matched feature point set corresponds to one spatial point, and each set of matched feature point set includes multiple matched feature points; the processing module 503 is configured to optimize the correlation parameters stored in the aircraft according to the sensing data of the inertial measurement unit when the inertial measurement unit collects each image frame of the image frame set and according to the image positions of the matching feature points in the matching feature point set on the corresponding image frame of the image frame set; wherein the associated parameters include: at least one of a relative pose parameter between the camera unit and the inertial measurement unit and an internal reference of the camera unit.
In one embodiment, the processing module 503 is further configured to control the aircraft according to the optimized associated parameter.
In one embodiment, the processing module 503 is further configured to acquire flight status data of the aircraft when the camera unit acquires a plurality of image frames in the image frame set; determining whether the flight state parameter of the aircraft is larger than or equal to a preset flight state change threshold value or not according to the flight state data;
and the processing module 503 is specifically configured to, when the flight state parameter is equal to or greater than a preset flight state change threshold, optimize the correlation parameter stored in the aircraft according to the sensing data of the inertial measurement unit at the time of acquiring each image frame of the image frame set, and according to the image position of the matching feature point in the matching feature point set on the corresponding image frame of the image frame set.
In one embodiment, the obtaining module 501 is specifically configured to obtain a raw image frame acquired by the camera unit during a flight of the aircraft; and obtaining an image frame set according to the original image frames, wherein the image frame set comprises image frames which are selected from the original image frames and meet the key frame condition.
In one embodiment, the relative amount of translation between adjacent image frames in the set of image frames satisfies a key frame condition, the relative amount of translation satisfying the key frame condition comprising: the relative translation amount is greater than or equal to a preset translation amount threshold.
In one embodiment, the amount of relative rotation between adjacent image frames in the set of image frames satisfies a key frame condition, the amount of relative rotation satisfying the key frame condition comprising: the relative rotation amount is greater than or equal to a preset rotation amount threshold.
In one embodiment, the number of matched feature points detected on an image frame of the set of image frames satisfies a keyframe condition, the number of matched feature points satisfying the keyframe condition comprising: the number of detected matching feature points is greater than or equal to a first preset number threshold.
In one embodiment, the number of feature points detected in the image frames of the set of image frames satisfies a keyframe condition, wherein the number of feature points satisfying the keyframe condition comprises the number of matching feature points being greater than or equal to a second preset number threshold.
In an embodiment, the processing module 503 is specifically configured to determine a pose initial value according to sensing data of the inertial measurement unit at the time of acquisition of each image frame of the image frame set, where the pose initial value includes: acquiring a relative translation amount initial value and a relative rotation amount initial value of a world coordinate system and a machine body coordinate system corresponding to an inertial measurement unit when each image frame of the image frame set is acquired; and according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the position of the corresponding image frame of the feature point in the image frame set, operating an optimization algorithm to optimize the stored associated parameters to obtain the optimized associated parameters, wherein the optimization algorithm is configured according to the reprojection error of the space point.
In an embodiment, the processing module 503 is specifically configured to run an optimization algorithm according to the initial value of the relative translation amount, the initial value of the relative rotation amount, the position of the feature point in the corresponding image frame in the image frame set, and the stored associated parameters, so as to obtain the three-dimensional coordinates of the spatial point, the optimized initial value of the relative translation amount, and the optimized initial value of the relative rotation amount; and operating the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
In one embodiment, the processing module 503 is further configured to determine a variation of three-dimensional coordinates of the spatial point obtained when the optimization algorithm is executed; and the processing module 503 is specifically configured to, when the variation is smaller than or equal to a preset variation threshold, operate the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the spatial point, the optimized initial value of the relative translational amount, and the optimized initial value of the relative rotational amount.
In the embodiment of the present invention, reference may be made to the description of relevant contents in the embodiments corresponding to fig. 1 to fig. 4, which is not repeated herein.
The embodiment of the invention can optimize the external parameters between the camera unit and the inertial measurement unit and the internal parameters of the camera unit based on the image sequence acquired by the aircraft during flight and the data acquired by the inertial measurement unit during image acquisition, so that more accurate external and internal parameters can be obtained, which facilitates subsequent flight processing such as positioning and speed measurement of the aircraft based on the accurate external and internal parameters.
Referring to fig. 6, which is a schematic structural diagram of a control device according to an embodiment of the present invention, the control device in the embodiment of the present invention includes a storage device 601 and a processor 602. The processor 602 is connected to the storage device 601 and is further connected to an external camera unit and inertial measurement unit, and is configured to receive the data of the camera unit and the inertial measurement unit and to optimize the associated parameters. In other embodiments, the control device may itself include the storage device 601, the processor 602, a camera unit and an inertial measurement unit; that is, the camera unit, the inertial measurement unit, the storage device 601 and the processor 602 form a complete product through which services such as positioning and speed estimation are provided for a movable platform such as an aircraft, an intelligent robot or an autonomous vehicle.
The storage 601 may include a volatile memory (volatile memory), such as a random-access memory (RAM); the storage device 601 may also include a non-volatile memory (non-volatile memory), such as a flash memory (flash memory), a solid-state drive (SSD), or the like; the storage means 601 may also comprise a combination of memories of the kind described above.
The processor 602 may be a Central Processing Unit (CPU). The processor 602 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or the like. The PLD may be a field-programmable gate array (FPGA), a General Array Logic (GAL), or the like.
Optionally, the storage device 601 is also used for storing program instructions. The processor 602 may invoke the program instructions to implement the process steps corresponding to fig. 2, fig. 3, and fig. 4 of the present application.
In one embodiment, the processor 602 invokes the program instructions to:
Acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one spatial point and comprises a plurality of matched feature points;
optimizing correlation parameters stored by an aircraft according to sensing data of the inertial measurement unit when each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frame of the image frame set;
wherein the associated parameters include: at least one of a relative pose parameter between the camera unit and the inertial measurement unit and an internal reference of the camera unit.
In one embodiment, the processor 602 is further configured to control the aircraft according to the optimized associated parameter.
In one embodiment, the processor 602 is further configured to acquire flight status data of the aircraft when the camera unit acquires a plurality of image frames in the image frame set; determining whether the flight state parameter of the aircraft is larger than or equal to a preset flight state change threshold value or not according to the flight state data;
and the processor 602 is configured to, when the flight state parameter is equal to or greater than a preset flight state change threshold, optimize the correlation parameter stored in the aircraft according to the sensing data of the inertial measurement unit at the time of acquiring each image frame of the image frame set, and according to the image position of the matching feature point in the matching feature point set on the corresponding image frame of the image frame set.
In one embodiment, the processor 602, when being configured to acquire the set of image frames acquired by the camera unit during the flight of the aircraft, is configured to acquire raw image frames acquired by the camera unit during the flight of the aircraft; and obtaining an image frame set according to the original image frames, wherein the image frame set comprises image frames which are selected from the original image frames and meet the key frame condition.
In one embodiment, the relative amount of translation between adjacent image frames in the set of image frames satisfies a key frame condition, the relative amount of translation satisfying the key frame condition comprising: the relative translation amount is greater than or equal to a preset translation amount threshold.
In one embodiment, the amount of relative rotation between adjacent image frames in the set of image frames satisfies a key frame condition, the amount of relative rotation satisfying the key frame condition comprising: the relative rotation amount is greater than or equal to a preset rotation amount threshold.
In one embodiment, the number of matched feature points detected on an image frame of the set of image frames satisfies a keyframe condition, the number of matched feature points satisfying the keyframe condition comprising: the number of detected matching feature points is greater than or equal to a first preset number threshold.
In one embodiment, the number of feature points detected in the image frames of the set of image frames satisfies a keyframe condition, wherein the number of feature points satisfying the keyframe condition comprises the number of matching feature points being greater than or equal to a second preset number threshold.
In one embodiment, when optimizing the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set, the processor 602 is configured to:
determine an initial value of a pose quantity according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired, wherein the initial value of the pose quantity comprises an initial value of the relative translation amount and an initial value of the relative rotation amount between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set is acquired;
and run an optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, so as to obtain the optimized associated parameters, wherein the optimization algorithm is configured according to the reprojection errors of the spatial points.
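For illustration only, the kind of reprojection residual such an optimization algorithm could be built on is sketched below; the frame conventions (the IMU supplies the body pose in the world frame as the initial value, the extrinsic parameters give the camera pose in the body frame) and all names are assumptions:

    # Hypothetical reprojection residual for one observation of one spatial point.
    import numpy as np

    def reprojection_residual(point_w, observed_px, R_wb, t_wb, R_bc, t_bc, K):
        """Difference between the projected image position of a spatial point and
        the image position of the matched feature point that observed it."""
        point_b = R_wb.T @ (point_w - t_wb)   # world frame -> body frame (IMU pose)
        point_c = R_bc.T @ (point_b - t_bc)   # body frame -> camera frame (extrinsics)
        u = K[0, 0] * point_c[0] / point_c[2] + K[0, 2]   # pinhole projection with
        v = K[1, 1] * point_c[1] / point_c[2] + K[1, 2]   # intrinsics fx, fy, cx, cy
        return np.array([u, v]) - observed_px

Summing the squared residuals over all matched feature points and all key frames gives the cost that such an optimization algorithm minimizes with respect to the associated parameters.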
In one embodiment, when running the optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, the processor 602 is configured to:
run the optimization algorithm according to the initial value of the relative translation amount, the initial value of the relative rotation amount, the image positions of the matched feature points on the corresponding image frames in the image frame set and the stored associated parameters, so as to obtain the three-dimensional coordinates of the spatial points, an optimized initial value of the relative translation amount and an optimized initial value of the relative rotation amount;
and run the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the spatial points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
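A sketch of this two-stage scheme using SciPy's least_squares is given below; the parameterization (rotation vectors, a 6-vector of extrinsics, a 4-vector of intrinsics) and the helper layout are assumptions, not the disclosed implementation:

    # Hypothetical two-stage refinement with SciPy, following the same frame
    # conventions as the residual sketch above. "extr" packs the camera-IMU
    # extrinsics as a rotation vector plus translation (6 values); "intr" packs
    # the intrinsics fx, fy, cx, cy (4 values). All names are illustrative.
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation as Rot

    def project(point_w, rvec_wb, t_wb, rvec_bc, t_bc, fx, fy, cx, cy):
        p_b = Rot.from_rotvec(rvec_wb).inv().apply(point_w - t_wb)  # world -> body
        p_c = Rot.from_rotvec(rvec_bc).inv().apply(p_b - t_bc)      # body -> camera
        return np.array([fx * p_c[0] / p_c[2] + cx, fy * p_c[1] / p_c[2] + cy])

    def residuals(points, poses, extr, intr, obs):
        # obs: list of (frame index, point index, observed image position)
        out = []
        for f, j, px in obs:
            rvec_wb, t_wb = poses[f]
            out.append(project(points[j], rvec_wb, t_wb,
                               extr[:3], extr[3:], *intr) - px)
        return np.concatenate(out)

    def stage_one(points0, poses0, extr0, intr0, obs):
        # Stage 1: keep the stored associated parameters fixed and refine the
        # spatial points and the per-frame pose initial values from the IMU.
        n_f, n_p = len(poses0), len(points0)
        x0 = np.concatenate([np.concatenate([np.asarray(r), np.asarray(t)])
                             for r, t in poses0]
                            + [np.asarray(p) for p in points0])
        def unpack(x):
            poses = [(x[6 * i:6 * i + 3], x[6 * i + 3:6 * i + 6]) for i in range(n_f)]
            return x[6 * n_f:].reshape(n_p, 3), poses
        sol = least_squares(lambda x: residuals(*unpack(x), extr0, intr0, obs), x0)
        return unpack(sol.x)

    def stage_two(points, poses, extr0, intr0, obs):
        # Stage 2: keep the refined points and poses fixed and refine the
        # associated parameters (extrinsics and intrinsics) themselves.
        x0 = np.concatenate([np.asarray(extr0), np.asarray(intr0)])
        sol = least_squares(lambda x: residuals(points, poses, x[:6], x[6:10], obs), x0)
        return sol.x[:6], sol.x[6:10]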
In one embodiment, the processor 602 is further configured to:
determine the variation of the three-dimensional coordinates of the spatial points obtained when the optimization algorithm is run;
and run the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the spatial points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount when the variation is less than or equal to a preset variation threshold.
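A minimal sketch of this convergence gate on the spatial-point coordinates, with a placeholder threshold value:

    # Illustrative check: the refinement of the stored associated parameters is
    # only run once the spatial points have stopped moving much between
    # successive runs of the optimization.
    import numpy as np

    def points_converged(points_prev, points_curr, threshold=1e-3):
        """True when the largest per-point change between two successive runs of
        the optimization is at or below the preset variation threshold."""
        diff = np.asarray(points_curr, float) - np.asarray(points_prev, float)
        return float(np.max(np.linalg.norm(diff, axis=1))) <= threshold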
In the embodiment of the present invention, the specific implementation of the processor 602 may refer to the description of the related contents in the embodiments corresponding to fig. 1 to fig. 4, which is not repeated herein.
The embodiments of the invention can optimize the extrinsic parameters between the camera unit and the inertial measurement unit and the intrinsic parameters of the camera unit based on the image sequence acquired during flight of the aircraft and the data acquired by the inertial measurement unit at the time of image acquisition. More accurate extrinsic and intrinsic parameters can thus be obtained, which facilitates subsequent flight processing such as positioning and speed measurement of the aircraft based on these accurate parameters.
An embodiment of the present invention further provides an aircraft, whose structure may be as shown in fig. 1. In addition to the power assembly 101, the body 102, the camera unit 103, and the inertial measurement unit 104 shown in fig. 1, the aircraft further includes a storage device, a power module, a communication module, and a controller; the controller may be a separate controller or a flight controller integrated with the corresponding functions.
The storage device may include a volatile memory, such as a random access memory (RAM); the storage device may also include a non-volatile memory, such as a flash memory or a solid-state drive (SSD); the storage device may also include a combination of the above kinds of memories.
The processor may be a central processing unit (CPU), and may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or the like. The PLD may be a field-programmable gate array (FPGA), generic array logic (GAL), or the like. The power assembly 101 is used for providing power for driving the aircraft to move. In addition to the rotors and motors shown in fig. 1, the power assembly 101 may also include an electronic speed controller (ESC) and the like. The power assembly 101 may include four, six, or eight rotors, or the like, or may be a fixed-wing power assembly.
Storage means for storing program instructions; the controller calls the program instruction for
Acquiring an image frame set acquired by the camera unit during flight of the aircraft;
determining a plurality of matched feature point sets in the image frame set, wherein each matched feature point set corresponds to one spatial point and comprises a plurality of matched feature points;
optimizing associated parameters stored by the aircraft according to sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set;
wherein the associated parameters include at least one of: a relative pose parameter between the camera unit and the inertial measurement unit, and intrinsic parameters of the camera unit.
In one embodiment, the controller is further configured to control the aircraft according to the optimized associated parameters.
In one embodiment, the controller is further configured to acquire flight state data of the aircraft while the camera unit captures the image frames in the image frame set, and to determine, according to the flight state data, whether a flight state parameter of the aircraft is greater than or equal to a preset flight state change threshold;
and the controller is configured to, when the flight state parameter is greater than or equal to the preset flight state change threshold, optimize the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set.
In one embodiment, when acquiring the image frame set acquired by the camera unit during flight of the aircraft, the controller is configured to acquire raw image frames acquired by the camera unit during flight of the aircraft, and to obtain the image frame set from the raw image frames, wherein the image frame set comprises image frames that are selected from the raw image frames and satisfy a key frame condition.
In one embodiment, the relative translation amount between adjacent image frames in the image frame set satisfies the key frame condition, where satisfying the key frame condition includes: the relative translation amount is greater than or equal to a preset translation amount threshold.
In one embodiment, the relative rotation amount between adjacent image frames in the image frame set satisfies the key frame condition, where satisfying the key frame condition includes: the relative rotation amount is greater than or equal to a preset rotation amount threshold.
In one embodiment, the number of matched feature points detected on an image frame in the image frame set satisfies the key frame condition, where satisfying the key frame condition includes: the number of detected matched feature points is greater than or equal to a first preset number threshold.
In one embodiment, the number of feature points detected on an image frame in the image frame set satisfies the key frame condition, where satisfying the key frame condition includes: the number of detected feature points is greater than or equal to a second preset number threshold.
In one embodiment, when optimizing the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set, the controller is configured to:
determine an initial value of a pose quantity according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired, wherein the initial value of the pose quantity comprises an initial value of the relative translation amount and an initial value of the relative rotation amount between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set is acquired;
and run an optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, so as to obtain the optimized associated parameters, wherein the optimization algorithm is configured according to the reprojection errors of the spatial points.
In one embodiment, when running the optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, the controller is configured to:
run the optimization algorithm according to the initial value of the relative translation amount, the initial value of the relative rotation amount, the image positions of the matched feature points on the corresponding image frames in the image frame set and the stored associated parameters, so as to obtain the three-dimensional coordinates of the spatial points, an optimized initial value of the relative translation amount and an optimized initial value of the relative rotation amount;
and run the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the spatial points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
In one embodiment, the controller is further configured to:
determine the variation of the three-dimensional coordinates of the spatial points obtained when the optimization algorithm is run;
and run the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the spatial points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount when the variation is less than or equal to a preset variation threshold.
In the embodiment of the present invention, the specific implementation of the controller may refer to the description of the related contents in the embodiments corresponding to fig. 1 to fig. 4, which is not repeated herein.
The embodiments of the invention can optimize the extrinsic parameters between the camera unit and the inertial measurement unit and the intrinsic parameters of the camera unit based on the image sequence acquired during flight of the aircraft and the data acquired by the inertial measurement unit at the time of image acquisition. More accurate extrinsic and intrinsic parameters can thus be obtained, which facilitates subsequent flight processing such as positioning and speed measurement of the aircraft based on these accurate parameters.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
The above disclosure is intended to be illustrative of only some embodiments of the invention, and is not intended to limit the scope of the invention.

Claims (34)

1. A parameter optimization method, applied to an aircraft comprising a camera unit and an inertial measurement unit, the method comprising:
acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one space point and comprises a plurality of matched feature points;
optimizing associated parameters stored by the aircraft according to sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set;
wherein the associated parameters include at least one of: a relative pose parameter between the camera unit and the inertial measurement unit, and intrinsic parameters of the camera unit.
2. The method of claim 1, further comprising: and controlling the aircraft according to the optimized associated parameters.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
acquiring flight state data of the aircraft when the camera unit collects a plurality of image frames in the image frame set;
determining, according to the flight state data, whether a flight state parameter of the aircraft is greater than or equal to a preset flight state change threshold;
and when the flight state parameter is greater than or equal to the preset flight state change threshold, optimizing the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set.
4. The method of any one of claims 1-3, wherein the acquiring of the image frame set acquired by the camera unit during flight of the aircraft comprises:
acquiring raw image frames acquired by the camera unit during flight of the aircraft;
and obtaining the image frame set from the raw image frames, wherein the image frame set comprises image frames that are selected from the raw image frames and satisfy a key frame condition.
5. The method of claim 4, wherein a relative amount of translation between adjacent image frames in the set of image frames satisfies a key frame condition, the relative amount of translation satisfying a key frame condition comprising: the relative translation amount is greater than or equal to a preset translation amount threshold.
6. The method of claim 4 or 5, wherein a relative rotation amount between adjacent image frames in the set of image frames satisfies a key frame condition, the relative rotation amount satisfying the key frame condition comprising: the relative rotation amount is greater than or equal to a preset rotation amount threshold.
7. The method of any of claims 4-6, wherein the number of matched feature points detected on an image frame in the set of image frames satisfies a keyframe condition, the number of matched feature points satisfying the keyframe condition comprising: the number of detected matching feature points is greater than or equal to a first preset number threshold.
8. The method according to any of claims 4-7, wherein the number of feature points detected in an image frame of the set of image frames satisfies a key frame condition, wherein satisfying the key frame condition comprises: the number of detected feature points is greater than or equal to a second preset number threshold.
9. The method according to any one of claims 1 to 8, wherein the optimizing of the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set comprises:
determining an initial value of a pose quantity according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired, wherein the initial value of the pose quantity comprises: an initial value of the relative translation amount and an initial value of the relative rotation amount between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set is acquired;
and according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image position of the matched feature point in the matched feature point set on the corresponding image frame in the image frame set, operating an optimization algorithm to optimize the stored associated parameters so as to obtain the optimized associated parameters, wherein the optimization algorithm is configured according to the reprojection error of the space point.
10. The method according to claim 9, wherein the operating an optimization algorithm to optimize the stored associated parameters according to the initial values of the relative translation amount, the relative rotation amount, and the image positions of the matched feature points in the set of matched feature points on the corresponding image frames in the set of image frames comprises:
running an optimization algorithm according to the initial value of the relative translation amount, the initial value of the relative rotation amount, the image position of the matched feature point in the matched feature point set on the corresponding image frame in the image frame set and the stored associated parameters to obtain a three-dimensional coordinate of the space point, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount;
and operating the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
11. The method of claim 10, wherein the method further comprises:
determining the variation of the three-dimensional coordinates of the space points obtained when the optimization algorithm is operated;
the optimizing the stored associated parameters by operating the optimization algorithm according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount includes:
and when the variation is smaller than or equal to a preset variation threshold, operating the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
12. A parameter optimization device, characterized in that it is applied to an aircraft comprising a camera unit and an inertial measurement unit, said device comprising:
the acquisition module is used for acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
the determining module is used for determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one space point and comprises a plurality of matched feature points;
the processing module is used for optimizing the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set;
wherein the associated parameters include at least one of: a relative pose parameter between the camera unit and the inertial measurement unit, and intrinsic parameters of the camera unit.
13. A control device, wherein the control device is connected to an aircraft, the aircraft comprising a camera unit and an inertial measurement unit, the control device comprising a memory device and a processor;
the storage device is used for storing program instructions;
the processor, calling the program instructions, is configured to:
acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one space point and comprises a plurality of matched feature points;
optimizing associated parameters stored by the aircraft according to sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set;
wherein the associated parameters include at least one of: a relative pose parameter between the camera unit and the inertial measurement unit, and intrinsic parameters of the camera unit.
14. The control apparatus of claim 13, wherein the processor is further configured to control the aircraft according to the optimized associated parameter.
15. The control apparatus according to claim 13 or 14,
the processor is further configured to acquire flight state data of the aircraft while the camera unit captures the image frames in the image frame set, and to determine, according to the flight state data, whether a flight state parameter of the aircraft is greater than or equal to a preset flight state change threshold;
and the processor is configured to, when the flight state parameter is greater than or equal to the preset flight state change threshold, optimize the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set.
16. The control device of any one of claims 13-15, wherein the processor, when configured to acquire the set of image frames acquired by the camera unit during flight of the aircraft, is configured to:
acquiring a raw image frame acquired by the camera unit in the flight process of the aircraft;
and obtaining an image frame set according to the original image frames, wherein the image frame set comprises image frames which are selected from the original image frames and meet the key frame condition.
17. The control device of claim 16, wherein a relative amount of translation between adjacent image frames in the set of image frames satisfies a key frame condition, the relative amount of translation satisfying the key frame condition comprising: the relative translation amount is greater than or equal to a preset translation amount threshold.
18. The control device according to claim 16 or 17, wherein a relative rotation amount between adjacent image frames in the set of image frames satisfies a key frame condition, the relative rotation amount satisfying the key frame condition including: the relative rotation amount is greater than or equal to a preset rotation amount threshold.
19. The control device according to any one of claims 16 to 18, wherein the number of matched feature points detected on an image frame of the set of image frames satisfies a keyframe condition, the number of matched feature points satisfying the keyframe condition comprising: the number of detected matching feature points is greater than or equal to a first preset number threshold.
20. The control device according to any one of claims 16 to 19, wherein the number of feature points detected in an image frame of the set of image frames satisfies a key frame condition, wherein satisfying the key frame condition comprises: the number of detected feature points is greater than or equal to a second preset number threshold.
21. The control device of any one of claims 13-20, wherein the processor, when configured to optimize the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set, is configured to:
determine an initial value of a pose quantity according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired, wherein the initial value of the pose quantity comprises: an initial value of the relative translation amount and an initial value of the relative rotation amount between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set is acquired;
and run an optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, so as to obtain the optimized associated parameters, wherein the optimization algorithm is configured according to the reprojection error of the space point.
22. The control device according to claim 21, wherein the processor, when configured to run the optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, is configured to:
run the optimization algorithm according to the initial value of the relative translation amount, the initial value of the relative rotation amount, the image positions of the matched feature points on the corresponding image frames in the image frame set and the stored associated parameters, so as to obtain the three-dimensional coordinates of the space points, an optimized initial value of the relative translation amount and an optimized initial value of the relative rotation amount;
and run the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
23. The control device of claim 22, wherein the processor is further configured to:
determine the variation of the three-dimensional coordinates of the space points obtained when the optimization algorithm is operated;
the optimizing the stored associated parameters by operating the optimization algorithm according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount includes:
and when the variation is smaller than or equal to a preset variation threshold, operating the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
24. An aircraft, characterized in that it comprises:
a camera unit;
an inertial measurement unit;
the power assembly is used for providing power for driving the aircraft to move;
storage means for storing program instructions;
a controller calling the program instructions for
Acquiring an image frame set acquired by the camera unit in the flight process of the aircraft;
determining a plurality of groups of matched feature point sets in the image frame set, wherein each group of matched feature point set corresponds to one space point and comprises a plurality of matched feature points;
optimizing associated parameters stored by the aircraft according to sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set;
wherein the associated parameters include at least one of: a relative pose parameter between the camera unit and the inertial measurement unit, and intrinsic parameters of the camera unit.
25. The aircraft of claim 24, wherein the controller is further configured to control the aircraft according to the optimized associated parameter.
26. The aircraft of claim 24 or 25,
the controller is further configured to acquire flight state data of the aircraft while the camera unit captures the image frames in the image frame set, and to determine, according to the flight state data, whether a flight state parameter of the aircraft is greater than or equal to a preset flight state change threshold;
and the controller is configured to: when the flight state parameter is greater than or equal to the preset flight state change threshold, optimize the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set.
27. The aircraft of any of claims 24-26, wherein said controller, when configured to acquire a set of image frames acquired by said camera unit during flight of said aircraft, is configured to:
acquiring a raw image frame acquired by the camera unit in the flight process of the aircraft;
and obtaining an image frame set according to the original image frames, wherein the image frame set comprises image frames which are selected from the original image frames and meet the key frame condition.
28. The aircraft of claim 27, wherein a relative amount of translation between adjacent image frames in the set of image frames satisfies a keyframe condition, the relative amount of translation satisfying a keyframe condition comprising: the relative translation amount is greater than or equal to a preset translation amount threshold.
29. The aerial vehicle of claim 27 or 28, wherein a relative rotation amount between adjacent image frames in the set of image frames satisfies a keyframe condition, the relative rotation amount satisfying the keyframe condition comprising: the relative rotation amount is greater than or equal to a preset rotation amount threshold.
30. The aircraft of any of claims 27-29, wherein the number of matched feature points detected on an image frame in the set of image frames satisfies a keyframe condition, the number of matched feature points satisfying the keyframe condition comprising: the number of detected matching feature points is greater than or equal to a first preset number threshold.
31. The aircraft of any of claims 27-30, wherein the number of feature points detected in an image frame of the set of image frames satisfies a key frame condition, wherein satisfying the key frame condition comprises: the number of detected feature points is greater than or equal to a second preset number threshold.
32. The aircraft of any one of claims 24 to 31, wherein the controller, when configured to optimize the associated parameters stored by the aircraft according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired and according to the image positions of the matched feature points in the matched feature point set on the corresponding image frames of the image frame set, is configured to:
determine an initial value of a pose quantity according to the sensing data of the inertial measurement unit at the time each image frame of the image frame set is acquired, wherein the initial value of the pose quantity comprises: an initial value of the relative translation amount and an initial value of the relative rotation amount between the world coordinate system and the body coordinate system corresponding to the inertial measurement unit at the time each image frame of the image frame set is acquired;
and run an optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, so as to obtain the optimized associated parameters, wherein the optimization algorithm is configured according to the reprojection error of the space point.
33. The aircraft of claim 32, wherein the controller, when configured to run the optimization algorithm to optimize the stored associated parameters according to the initial value of the relative translation amount, the initial value of the relative rotation amount and the image positions of the matched feature points on the corresponding image frames in the image frame set, is configured to:
run the optimization algorithm according to the initial value of the relative translation amount, the initial value of the relative rotation amount, the image positions of the matched feature points on the corresponding image frames in the image frame set and the stored associated parameters, so as to obtain the three-dimensional coordinates of the space points, an optimized initial value of the relative translation amount and an optimized initial value of the relative rotation amount;
and run the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
34. The aircraft of claim 33, wherein the controller is further configured to:
determine the variation of the three-dimensional coordinates of the space points obtained when the optimization algorithm is operated;
the optimizing the stored associated parameters by operating the optimization algorithm according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount includes:
and when the variation is smaller than or equal to a preset variation threshold, operating the optimization algorithm to optimize the stored associated parameters according to the three-dimensional coordinates of the space points, the optimized initial value of the relative translation amount and the optimized initial value of the relative rotation amount.
CN201980030587.1A 2019-10-29 2019-10-29 Parameter optimization method and device, control equipment and aircraft Pending CN112136137A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/114098 WO2021081774A1 (en) 2019-10-29 2019-10-29 Parameter optimization method and apparatus, control device, and aircraft

Publications (1)

Publication Number Publication Date
CN112136137A true CN112136137A (en) 2020-12-25

Family

ID=73849172

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980030587.1A Pending CN112136137A (en) 2019-10-29 2019-10-29 Parameter optimization method and device, control equipment and aircraft

Country Status (2)

Country Link
CN (1) CN112136137A (en)
WO (1) WO2021081774A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115272494B (en) * 2022-09-29 2022-12-30 腾讯科技(深圳)有限公司 Calibration method and device for camera and inertial measurement unit and computer equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780601B (en) * 2016-12-01 2020-03-27 北京未动科技有限公司 Spatial position tracking method and device and intelligent equipment
CN106874616B (en) * 2017-03-06 2021-04-20 北京经纬恒润科技股份有限公司 Parameter optimization adjustment method and system
CN107747941B (en) * 2017-09-29 2020-05-15 歌尔股份有限公司 Binocular vision positioning method, device and system
US10645364B2 (en) * 2017-11-14 2020-05-05 Intel Corporation Dynamic calibration of multi-camera systems using multiple multi-view image frames
CN110044354B (en) * 2019-03-28 2022-05-20 东南大学 Binocular vision indoor positioning and mapping method and device
CN110068326B (en) * 2019-04-29 2021-11-30 京东方科技集团股份有限公司 Attitude calculation method and apparatus, electronic device, and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112950715A (en) * 2021-03-04 2021-06-11 杭州迅蚁网络科技有限公司 Visual positioning method and device for unmanned aerial vehicle, computer equipment and storage medium
CN112950715B (en) * 2021-03-04 2024-04-30 杭州迅蚁网络科技有限公司 Visual positioning method and device of unmanned aerial vehicle, computer equipment and storage medium
CN114782550A (en) * 2022-04-25 2022-07-22 高德软件有限公司 Camera calibration method, device, electronic equipment and program product
CN117151311A (en) * 2023-10-31 2023-12-01 天津云圣智能科技有限责任公司 Mapping parameter optimization processing method and device, electronic equipment and storage medium
CN117151311B (en) * 2023-10-31 2024-02-02 天津云圣智能科技有限责任公司 Mapping parameter optimization processing method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2021081774A1 (en) 2021-05-06

Similar Documents

Publication Publication Date Title
CN109887057B (en) Method and device for generating high-precision map
US10565732B2 (en) Sensor fusion using inertial and image sensors
EP3158293B1 (en) Sensor fusion using inertial and image sensors
EP3158417B1 (en) Sensor fusion using inertial and image sensors
Loianno et al. Cooperative localization and mapping of MAVs using RGB-D sensors
EP3158411B1 (en) Sensor fusion using inertial and image sensors
CN103914065B (en) The method and apparatus that flight state is revised in real time
TW201832185A (en) Camera auto-calibration with gyroscope
WO2019155335A1 (en) Unmanned aerial vehicle including an omnidirectional depth sensing and obstacle avoidance aerial system and method of operating same
WO2019104571A1 (en) Image processing method and device
CN112136137A (en) Parameter optimization method and device, control equipment and aircraft
WO2023138007A1 (en) High-reliability and high-precision navigation positioning method and system for gps-denied unmanned aerial vehicle
CN112789672A (en) Control and navigation system, attitude optimization, mapping and positioning technology
WO2020019175A1 (en) Image processing method and apparatus, and photographing device and unmanned aerial vehicle
CN112154480B (en) Positioning method and device for movable platform, movable platform and storage medium
CN110799922A (en) Shooting control method and unmanned aerial vehicle
US20210256732A1 (en) Image processing method and unmanned aerial vehicle
JP2019023865A (en) Method, system, and program for executing error recovery
WO2022126397A1 (en) Data fusion method and device for sensor, and storage medium
WO2021035746A1 (en) Image processing method and device, and movable platform
WO2021064982A1 (en) Information processing device and information processing method
Ma et al. KCF based 3D Object Tracking via RGB-D Camera of a Quadrotor
WO2022113482A1 (en) Information processing device, method, and program
WO2021217372A1 (en) Control method and device for movable platform
WO2020107480A1 (en) Image feature point evaluation method and mobile platform

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination