WO2021043213A1 - Calibration method, device, aerial photography device and storage medium - Google Patents

Calibration method, device, aerial photography device and storage medium

Info

Publication number
WO2021043213A1
WO2021043213A1 (PCT/CN2020/113256)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
angular velocity
sensor
acceleration
calibration
Prior art date
Application number
PCT/CN2020/113256
Other languages
English (en)
Chinese (zh)
Inventor
谢青青
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司 filed Critical 深圳市道通智能航空技术有限公司
Publication of WO2021043213A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30204: Marker
    • G06T2207/30208: Marker matrix

Definitions

  • This application relates to the field of machine vision technology, and in particular to a calibration method, device, aerial photography equipment and storage medium.
  • Smart devices increasingly include one or more cameras or other types of image capture devices to enable users to capture images.
  • a smartphone or aircraft includes a camera that can capture images in a variety of scenes.
  • Many camera systems can be calibrated during manufacturing.
  • Existing calibration methods often calibrate the time deviation and the sensor external parameters step by step; they lack the accuracy of joint estimation, ignore the underlying connection between the two parameters, and therefore achieve low calibration accuracy.
  • This application provides a calibration method, device, aerial photography equipment and storage medium, aiming to solve the technical problem of insufficient calibration accuracy in existing calibration schemes.
  • the present application provides a calibration method, which is applied to aerial photography equipment, the aerial photography equipment includes a camera and an inertial sensor, and the method includes:
  • transforming, according to the preset spatial external parameters of the camera and the inertial sensor, the camera angular velocity and the camera acceleration from the camera coordinate system of the camera to the sensor coordinate system of the inertial sensor, to obtain a first predicted angular velocity and a predicted acceleration;
  • the acquiring of the multiple camera poses of the camera in the world coordinate system includes:
  • the obtaining the camera angular velocity and camera acceleration of the camera at different moments according to the multiple camera poses includes:
  • the extracting the corner points of the calibration plate in the multiple images respectively includes:
  • each of the multiple images includes multiple quadrilaterals
  • the midpoint between the two adjacent corner points of two adjacent quadrilaterals whose diagonally opposite corners lie on the same straight line is extracted as the corner point.
  • the calculating the multiple camera poses of the camera in the world coordinate system according to the corner points of the calibration plate in each image of the multiple images respectively includes:
  • the camera calibration algorithm is used to calculate the multiple camera poses of the camera in the world coordinate system according to the corner points of the calibration board in each image.
  • the method further includes:
  • the initial estimation of the time deviation is used as an initial value of the time deviation when optimizing the time deviation.
  • the method further includes:
  • the initial rotation component is used as an initial value for optimizing the spatial external parameter.
  • the present invention also provides a calibration device, which is applied to aerial photography equipment, the aerial photography equipment includes a camera and an inertial sensor, and the calibration device includes:
  • the pose acquisition module is used to acquire multiple camera poses of the camera in the world coordinate system when the camera shoots multiple images of the calibration board during the movement;
  • a camera speed acquiring module, configured to acquire the camera angular velocity and camera acceleration of the camera at different moments according to the multiple camera poses;
  • a sensor speed acquisition module configured to acquire the sensor angular velocity and sensor acceleration measured by the inertial sensor during the movement of the camera
  • a prediction module, configured to transform the camera angular velocity and the camera acceleration from the camera coordinate system of the camera to the sensor coordinate system of the inertial sensor according to the preset spatial external parameters of the camera and the inertial sensor, to obtain a first predicted angular velocity and a predicted acceleration;
  • An error module configured to construct an acceleration error term according to the first predicted angular velocity, the predicted acceleration, the sensor angular velocity, and the sensor acceleration;
  • an optimization module, configured to optimize the spatial external parameters and the time deviation to obtain the spatial external parameters and the time deviation that minimize the acceleration error term, where the time deviation is the offset of the time defined by the inertial sensor relative to the time defined by the camera's shooting.
  • this application also provides an aerial photography equipment, the aerial photography equipment includes:
  • a processor, where the processor is communicatively connected with the camera and the inertial sensor, and the processor is configured to implement the calibration method described in the embodiments of the first aspect of the present application when executing the calibration program.
  • the present application also provides a storage medium, the storage medium being a computer-readable storage medium that stores a calibration program, where the calibration program, when executed by a processor, implements the calibration method described in the embodiments of the first aspect of this application.
  • The calibration method, device, aerial photography equipment and storage medium of the present invention jointly optimize the spatial external parameters and the time deviation for calibration, fully consider the underlying connection between the time deviation and the spatial external parameters, and effectively improve the calibration accuracy.
  • Fig. 1 is a flowchart of the calibration method provided by the first embodiment of the application.
  • Fig. 2 is a flowchart of step S11 in Fig. 1.
  • FIG. 3A is a schematic diagram of the image after binarization processing in this application.
  • FIG. 3B is a schematic diagram of the white pixels of the image after being expanded in this application.
  • FIG. 3C is a schematic diagram after extracting a quadrilateral from an image in this application.
  • Figure 3D is a schematic diagram of the position of the corner points in the image in this application.
  • Fig. 4 is a flowchart of step S111 in Fig. 2.
  • Fig. 5 is a flowchart of step S12 in Fig. 1.
  • FIG. 6 is a flowchart of the calibration method provided by the second embodiment of this application.
  • FIG. 7 is a schematic diagram of modules of the calibration device provided by the third embodiment of this application.
  • FIG. 8 is a schematic structural diagram of the aerial photography equipment provided by the fourth embodiment of this application.
  • a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units clearly listed; instead, it may include other steps or units that are not clearly listed or that are inherent to the process, method, product, or device.
  • FIG. 1 is a calibration method provided by the first embodiment of this application.
  • the calibration method can be performed by a calibration device, which can be implemented by hardware and/or software for accurate calibration.
  • The calibration method calibrates the camera and an inertial measurement unit (IMU, i.e., the inertial sensor) and is applied to aerial photography equipment.
  • the aerial photography equipment includes the camera and the inertial sensor, and the camera and the inertial sensor are fixedly connected.
  • the results of the calibration can be applied to the drone.
  • the calibration method includes:
  • S11 Acquire multiple camera poses of the camera in the world coordinate system when the camera shoots multiple images of the calibration plate during movement.
  • the calibration board is a black-and-white checkerboard-style calibration board.
  • the world coordinate system can be determined by the calibration board.
  • the world coordinate system includes x-axis, y-axis and z-axis.
  • the camera coordinate system can be determined by the camera, the sensor coordinate system can be determined by the inertial sensor, and the image coordinate system and the pixel coordinate system can be determined by the image taken by the camera.
  • the camera and the inertial sensor are moved at the same time. During the movement, the camera shoots the calibration board, and the camera's pose at different times and positions can be calculated through the camera calibration algorithm.
  • The camera pose includes a camera position and a camera attitude.
  • During the movement, the camera and the inertial sensor should exercise the degrees of freedom of rotation and translation as fully as possible.
  • The camera and the inertial sensor move at the same time, and the camera captures an image of the calibration board at a preset first interval.
  • The first interval is not limited. If the first interval is 0.1 s, the camera captures an image of the calibration board every 0.1 s, so multiple images are obtained after a period of time. Preferably, the number of images taken of the calibration board is greater than 10.
  • Step S11 may include:
  • Step S112 may include:
  • S1111 Binarize the multiple images respectively.
  • FIG. 3A is a schematic diagram after the binarization processing.
  • S1112 Perform pixel expansion on the white pixels in the multiple images after the binarization process.
  • FIG. 3B is a schematic diagram of white pixels after expansion.
  • Contour extraction can be performed, the convex hull of each contour can be calculated, and it can be checked whether the extracted polygon has exactly four vertices; if exactly four vertices are detected, the polygon is a quadrilateral.
  • Interfering quadrilaterals can then be deleted; for example, some interfering quadrilaterals can be removed using constraints such as aspect ratio, perimeter, and area.
  • Fig. 3C is a schematic diagram after the quadrilateral is extracted.
  • Fig. 3D is a schematic diagram of the position of the corner point.
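  • As an illustration of the extraction pipeline just described, the following is a minimal OpenCV sketch, not the patent's own implementation; the Otsu threshold, kernel size, polygon-approximation tolerance, and area filter are all assumptions:

```python
import cv2
import numpy as np

def extract_quadrilaterals(gray):
    """Sketch of steps S1111-S1112 plus quadrilateral extraction:
    binarize, dilate the white pixels, keep convex 4-vertex contours."""
    # S1111: binarization (Otsu thresholding is an assumption; the patent
    # does not specify the thresholding rule).
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # S1112: pixel expansion (dilation) of the white pixels, which shrinks
    # the black squares so that adjacent squares no longer touch.
    dilated = cv2.dilate(binary, np.ones((3, 3), np.uint8), iterations=1)
    # Contour extraction on the inverted image (the black squares become
    # white blobs), followed by polygon approximation.
    contours, _ = cv2.findContours(255 - dilated, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)
    quads = []
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.05 * cv2.arcLength(cnt, True), True)
        # A convex polygon with exactly four vertices is a quadrilateral;
        # an area constraint removes small interfering quadrilaterals
        # (aspect-ratio and perimeter filters could be added the same way).
        if len(approx) == 4 and cv2.isContourConvex(approx) \
                and cv2.contourArea(approx) > 100:
            quads.append(approx.reshape(4, 2))
    return quads
```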
  • S112 Calculate the multiple camera poses of the camera in the world coordinate system according to the corner points of the calibration plate in each of the multiple images.
  • step S112 includes:
  • the camera calibration algorithm is used to calculate the multiple camera poses of the camera in the world coordinate system according to the corner points of the calibration board in each image.
  • each image can determine a camera pose of the camera.
  • Using a camera calibration algorithm to calculate the camera pose is common prior art in this field, so this embodiment gives only a brief description.
  • A pose is a variable that describes the position and orientation of an object in three-dimensional space.
  • Position refers to the (x, y, z) coordinates of the object in a specified coordinate system.
  • The attitude consists of the rotation around the x-axis, the rotation around the y-axis, and the rotation around the z-axis.
  • In symbols, the pose can be represented as (tx, ty, tz, rx, ry, rz),
  • where tx, ty, tz represent the translation
  • and rx, ry, rz represent the rotation.
  • the calibration board is a known object in the world coordinate system.
  • the three-dimensional coordinates of each corner on the calibration board in the world coordinate system are also known.
  • the pixel coordinates of the corner points are known.
  • Given these coordinates and the calibration parameters of the camera itself, a point in the world coordinate system can be projected to a two-dimensional pixel point, which can be described by the following formula:
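  • The formula itself was lost in extraction; from the definitions of P1, P2, and P3 given below, it is presumably the standard pinhole projection chain, reconstructed here (the principal point u0, v0 and the depth scale s are implicit in the text):

```latex
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= \underbrace{\begin{bmatrix} \tfrac{1}{dx} & 0 & u_0 \\ 0 & \tfrac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix}}_{P_3}
  \underbrace{\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}}_{P_2}
  \underbrace{\begin{bmatrix} R & t \\ \mathbf{0}^\top & 1 \end{bmatrix}}_{P_1}
  \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
= P \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
```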
  • the matrix P1 is the rigid body transformation matrix that transforms the world coordinate system and the camera coordinate system.
  • the matrix can transform the point from the world coordinate system to the camera coordinate system.
  • The matrix P2 describes the transformation between the camera coordinate system and the image coordinate system, where f refers to the focal length of the camera; P2 transforms three-dimensional coordinates in the camera coordinate system into two-dimensional coordinates on the imaging plane.
  • The matrix P3 is the internal parameter matrix of the camera and describes the transformation between the image coordinate system and the pixel coordinate system;
  • it transforms the two-dimensional coordinates on the imaging plane into standard pixel coordinates, where u and v refer to the horizontal and vertical pixel axes of the image,
  • K refers to the scale factor between the focal length and the physical pixel size,
  • and dx and dy respectively represent the physical size of each pixel along the horizontal axis x and the vertical axis y.
  • the matrix P is a matrix representation that transforms a point on the world coordinate system to a pixel coordinate system.
  • The matrix P2 and the matrix P3 are determined by the camera's own parameters, and the matrix P1 is equivalent to the camera pose. Therefore, a series of corner projection equations can be listed to solve for the unknowns in the matrix P1, thereby determining the multiple camera poses of the camera in the world coordinate system defined by the calibration board.
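  • For illustration, a minimal sketch of this pose computation with OpenCV's PnP solver; the board geometry, intrinsic matrix, and synthetic test pose below are assumed placeholders, not values from the patent:

```python
import cv2
import numpy as np

# 3D corner coordinates on the calibration board: the board defines the
# world coordinate system, so the corners lie in its z = 0 plane.
square = 0.03  # assumed 30 mm corner spacing
object_points = np.array([[i * square, j * square, 0.0]
                          for j in range(6) for i in range(9)], np.float32)

# Assumed intrinsics (the product P3 * P2 in the notation above) and
# zero lens distortion; in practice these come from intrinsic calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Synthetic pixel coordinates stand in for the extracted corner points.
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([0.05, -0.03, 0.6])
image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true,
                                    K, dist)

# solvePnP recovers the rigid-body transform P1 (one camera pose per image)
# from the corner projection equations.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
```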
  • Step S12 (obtaining the camera angular velocity and camera acceleration of the camera at different moments according to the multiple camera poses) includes:
  • S121 Perform spline fitting on the poses of the multiple cameras to obtain a pose curve.
  • Each camera pose is equivalent to a point, and the pose curve can be obtained by spline fitting multiple camera poses.
  • B-spline fitting is performed on multiple poses.
  • Performing B-spline fitting on a series of points is a prior art and will not be described in detail in this application.
  • The reason for choosing B-spline fitting to express the camera pose curve is twofold: first, the B-spline curve has local support, so adjusting a single control parameter affects only part of the spline; second, the B-spline basis functions are polynomials, which can easily be differentiated and integrated, making it easy to evaluate the error terms needed for an accurate calibration.
  • Performing spline fitting to obtain a pose curve can transform a discrete problem into a continuous problem.
  • S122 Differentiate the pose curve to obtain the camera angular velocity and the camera acceleration of the camera at different moments.
  • the camera angular velocity and the camera acceleration can be continuous, that is, the camera angular velocity curve and the camera acceleration curve are formed.
  • The pose curve has six dimensions, comprising the positions along and the attitudes about the x-axis, y-axis, and z-axis. Differentiating the position components of the x-axis, y-axis, and z-axis twice yields the camera acceleration,
  • and differentiating the attitude components of the x-axis, y-axis, and z-axis yields the camera angular velocity.
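  • A minimal sketch of S121-S122 with SciPy (an illustrative library choice; the timestamps and placeholder poses are assumptions). Treating the derivative of the attitude components directly as the angular velocity is the small-angle simplification implied by the text; a full treatment would differentiate on SO(3):

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# t: timestamps of the images; poses: (N, 6) camera poses
# (tx, ty, tz, rx, ry, rz) obtained in step S11.
t = np.linspace(0.0, 2.0, 21)            # e.g. one image every 0.1 s
poses = np.random.default_rng(0).normal(scale=0.01, size=(21, 6))

# S121: B-spline fit per pose dimension (k=5 keeps the 2nd derivative smooth).
splines = [make_interp_spline(t, poses[:, d], k=5) for d in range(6)]

# S122: differentiate the pose curve on a dense grid.
t_dense = np.linspace(t[0], t[-1], 500)
# Camera acceleration: second derivative of the position components.
cam_acc = np.stack([splines[d].derivative(2)(t_dense) for d in range(3)], axis=1)
# Camera angular velocity: first derivative of the attitude components.
cam_gyro = np.stack([splines[d].derivative(1)(t_dense) for d in (3, 4, 5)], axis=1)
```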
  • The inertial sensor and the camera are fixed relative to each other: when the camera moves, the inertial sensor moves in synchronization with it. While the camera is moving, the inertial sensor is also operating, so the sensor angular velocity and sensor acceleration measured by the inertial sensor during the camera's movement can be obtained.
  • The sensor angular velocity is the angular velocity measured by the inertial sensor,
  • and the sensor acceleration is the acceleration measured by the inertial sensor.
  • the sensor angular velocity and the sensor acceleration can be acquired continuously, that is, the sensor angular velocity curve and the sensor acceleration curve can be acquired.
  • Both the sensor angular velocity and the sensor acceleration are vectors, having direction as well as magnitude.
  • A time deviation is defined in this application: the time deviation is the offset of the time defined by the inertial sensor relative to the time defined by the camera. This application subsequently optimizes the time deviation to obtain an accurate calibration.
  • the spatial external parameters of the camera and the inertial sensor are the transformation relationship from the camera coordinate system to the sensor coordinate system through a rigid body transformation.
  • The relative position of the camera and the inertial sensor is fixed and set as needed, so the rigid-body transformation from the camera coordinate system to the sensor coordinate system is nominally known; that is, the spatial external parameters of the camera and the inertial sensor are preset. Therefore, according to the preset spatial external parameters of the camera and the inertial sensor, the camera angular velocity can be transformed from the camera coordinate system of the camera to the sensor coordinate system of the inertial sensor to obtain the first predicted angular velocity.
  • Likewise, the spatial external parameters can transform the camera acceleration from the camera coordinate system to the sensor coordinate system to obtain the predicted acceleration.
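  • In symbols, with R denoting the rotation of the preset spatial external parameters, the transformation described here amounts to the following (a reconstruction, not the patent's explicit formula; the lever-arm terms arising from the camera-to-sensor translation, and gravity compensation for a real accelerometer, are omitted because the text describes the transform as a direct application of the external parameters):

```latex
\hat{\omega}_s = R\,\omega_c, \qquad \hat{a}_s = R\,a_c
```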
  • The camera angular velocities on the camera angular velocity curve can be transformed into the sensor coordinate system to obtain the first predicted angular velocity curve,
  • and the camera accelerations on the camera acceleration curve can be transformed into the sensor coordinate system to obtain the predicted acceleration curve.
  • Differentiating the pose curve to obtain the first predicted angular velocity and the predicted acceleration avoids integrating the measurements of the inertial sensor and improves the accuracy of the calibration.
  • Although the spatial external parameters are nominally known, the spatial external parameters of the camera and the inertial sensor may contain errors.
  • For example, the distance between the camera and the inertial sensor may be measured with some error, which introduces an error into the determined spatial external parameters.
  • Therefore, the spatial external parameters of the camera and the inertial sensor are subsequently optimized to obtain an accurate calibration.
  • S15 Construct an acceleration error term according to the first predicted angular velocity, the predicted acceleration, the sensor angular velocity, and the sensor acceleration.
  • The first predicted angular velocity, the predicted acceleration, the sensor angular velocity, and the sensor acceleration may be all of the obtained values, that is, the first predicted angular velocities on the first predicted angular velocity curve, the predicted accelerations on the predicted acceleration curve, the sensor angular velocities on the sensor angular velocity curve, and the sensor accelerations on the sensor acceleration curve.
  • The acceleration error term is the sum of the squared differences between the first predicted angular velocity and the sensor angular velocity and between the predicted acceleration and the sensor acceleration.
  • The first predicted angular velocity, the predicted acceleration, the sensor angular velocity, and the sensor acceleration are all vectors. Assuming that the first predicted angular velocity is a1, the predicted acceleration is a2, the sensor angular velocity is a3, and the sensor acceleration is a4, the acceleration error term can be expressed as:
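  • The expression itself was dropped in extraction; consistent with the definitions of a1 through a4 and the time deviation t_d, it is presumably the summed squared residual

```latex
E(\xi, t_d) = \sum_i \left\lVert a_1(t_i) - a_3(t_i + t_d) \right\rVert^2
            + \sum_i \left\lVert a_2(t_i) - a_4(t_i + t_d) \right\rVert^2
```

  • where ξ denotes the spatial external parameters entering a1 and a2.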
  • Changing the time deviation translates the sensor angular velocity curve as a whole along the time axis, and likewise translates the sensor acceleration curve as a whole along the time axis.
  • For example, suppose the inertial sensor considers that it acquired sensor angular velocity J and sensor acceleration K at absolute time T2, while sensor angular velocity J and sensor acceleration K were actually acquired at absolute time T2 + 0.03 s; if the acceleration error term is to be reduced, the time axis of the inertial sensor needs to be shifted forward by 0.03 s.
  • After the shift, the sensor angular velocity and sensor acceleration attributed by the inertial sensor to time T2 will change.
  • Because the exact difference from absolute time is unknown, the time deviation needs to be changed continuously, and the value of the acceleration error term is used to judge whether the changed time deviation is appropriate.
  • The spatial external parameters and the time deviation are the independent variables, and the acceleration error term is the dependent variable.
  • The acceleration error term changes as the spatial external parameters and the time deviation change.
  • Starting from the initial values, the gradient can be computed, the independent variables can be moved along the negative gradient direction, and the process repeated several times until the gradient is small enough.
  • The gradient descent method is an existing technique, so this application does not explain in detail how it is used to optimize the spatial external parameters and the time deviation; the optimization finally yields the spatial external parameters and the time deviation that minimize the acceleration error term.
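  • A schematic of this joint optimization (illustrative only: the patent specifies plain gradient descent but no code, and the quadratic error function below is a stand-in for the acceleration error term defined above):

```python
import numpy as np

def accel_error(params):
    """Stand-in for the acceleration error term E(xi, t_d); params packs the
    spatial external parameters and the time deviation. A real implementation
    would transform and time-shift the curves and sum squared residuals."""
    target = np.array([0.02, -0.01, 0.03, 0.1, -0.2, 0.15, 0.005])
    return float(np.sum((params - target) ** 2))

def numerical_gradient(f, x, eps=1e-6):
    # Central finite differences: one probe per independent variable.
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

# params = (rx, ry, rz, tx, ty, tz, time_deviation); the initial values come
# from the preset extrinsics and the initializations of the second embodiment.
params = np.zeros(7)
lr = 0.1
for _ in range(500):
    grad = numerical_gradient(accel_error, params)
    if np.linalg.norm(grad) < 1e-8:  # stop once the gradient is small enough
        break
    params -= lr * grad              # move along the negative gradient
```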
  • The calibration is completed once the appropriate spatial external parameters and time deviation are obtained.
  • The calibration method provided in this embodiment jointly optimizes the spatial external parameters and the time deviation for calibration, fully considers the underlying connection between the time deviation and the spatial external parameters, and effectively improves the accuracy of the calibration.
  • the second embodiment of the present application also provides a calibration method. Based on the foregoing embodiment, this embodiment provides a solution for providing initial values for external spatial parameters and time deviations.
  • the calibration method includes :
  • S21 Acquire multiple camera poses of the camera in the world coordinate system when the camera shoots multiple images of the calibration board during movement.
  • For example, the sensor angular velocity D originally obtained corresponds to time T1 + 0.05 s; after the time deviation is changed by 0.02 s, the sensor angular velocity D corresponds to time T1 + 0.03 s, and the sensor angular velocity at time T1 + 0.05 s is no longer D.
  • the preset time deviation range can be set as required.
  • S25 Calculate the cross-correlation coefficient between the modulus of the camera angular velocity and the modulus of the sensor angular velocity, and use the time deviation with the largest cross-correlation coefficient as the initial estimate of the time deviation.
  • The cross-correlation coefficient is a statistical index that reflects how closely variables are correlated; here, it reflects how closely the camera angular velocity and the sensor angular velocity are correlated.
  • Computing the cross-correlation coefficient between two curves is prior art, so only a brief description is given in this application.
  • The modulus of the camera angular velocity is denoted by ‖Xn‖,
  • the modulus of the sensor angular velocity is denoted by ‖Yn‖,
  • and m is the time deviation.
  • The range -3 ≤ m ≤ 3 is an assumed preset value.
  • the time deviation with the largest cross-correlation coefficient is used as the initial estimation of the time deviation.
  • the initial estimation of the time deviation is used as the initial value of the time deviation when optimizing the time deviation.
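  • A short NumPy sketch of this initialization (the sample rate, search range, and synthetic signals are assumptions for illustration):

```python
import numpy as np

# Modulus sequences of the camera and sensor angular velocities, resampled
# onto a common grid with spacing dt; here synthetic, with a known 0.03 s lag.
dt = 0.01
t = np.arange(0.0, 5.0, dt)
cam_norm = np.abs(np.sin(2 * np.pi * 0.7 * t))            # ||Xn||
imu_norm = np.abs(np.sin(2 * np.pi * 0.7 * (t - 0.03)))   # ||Yn||

# Zero-mean both sequences, then scan offsets m over a preset range.
x = cam_norm - cam_norm.mean()
y = imu_norm - imu_norm.mean()
max_shift = int(0.3 / dt)  # preset time-deviation range, assumed +/- 0.3 s
shifts = np.arange(-max_shift, max_shift + 1)
corr = [np.dot(x[max(0, -m):len(x) - max(0, m)],
               y[max(0, m):len(y) - max(0, -m)]) for m in shifts]

# The offset maximizing the correlation is the initial time-deviation estimate.
td_init = shifts[int(np.argmax(corr))] * dt
print(td_init)  # approximately 0.03
```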
  • the preset spatial external parameters of the camera and the inertial sensor are known.
  • the spatial external parameters include the external parameter rotation component and the external parameter translation component.
  • The external parameter rotation component is the transformation that rotates the camera coordinate system, through a rigid-body rotation, into the same orientation as the sensor coordinate system,
  • and the external parameter translation component is the transformation that translates the camera coordinate system, through a rigid-body translation, into the same position as the sensor coordinate system.
  • The external parameter rotation component and the external parameter translation component are both preset. Therefore, the camera angular velocity can be transformed from the camera coordinate system of the camera to the sensor coordinate system of the inertial sensor according to the external parameter rotation component, to obtain the second predicted angular velocity.
  • The angular velocity error term is the difference between the sensor angular velocity and the second predicted angular velocity.
  • When the external parameter rotation component changes, the angular velocity error term also changes. Therefore, the external parameter rotation component is the independent variable, the angular velocity error term is the dependent variable, and the angular velocity error term changes with the external parameter rotation component.
  • a nonlinear optimization method can be used for optimization.
  • the gradient descent method is used.
  • Starting from the initial value, the gradient can be computed, the independent variable can be moved along the negative gradient direction, and the process repeated several times until the gradient is small enough.
  • The gradient descent method is an existing technique, so this application does not explain in detail how it is used to optimize the external parameter rotation component; the optimization finally yields the external parameter rotation component that minimizes the angular velocity error term.
  • the initial rotation component is used as an initial value for optimizing the spatial external parameter.
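  • For illustration, the initial rotation component can also be obtained in closed form by aligning time-matched angular-velocity vectors with the Kabsch/Wahba SVD solution; this is a common substitute for the gradient-descent formulation described above, not the patent's own method:

```python
import numpy as np

def init_rotation(cam_gyro, imu_gyro):
    """Closed-form estimate of the rotation R with imu_gyro ~ R @ cam_gyro
    (Kabsch/Wahba). Inputs: (N, 3) angular-velocity samples already aligned
    in time using the initial time-deviation estimate."""
    H = cam_gyro.T @ imu_gyro          # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    # The sign fix keeps the result a proper rotation (determinant +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T

# Synthetic check: recover a known rotation from noisy matched samples.
rng = np.random.default_rng(1)
w_cam = rng.normal(size=(200, 3))
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
w_imu = w_cam @ R_true.T + 0.01 * rng.normal(size=(200, 3))
R0 = init_rotation(w_cam, w_imu)       # initial rotation component
```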
  • S210 Construct an acceleration error term according to the first predicted angular velocity, the predicted acceleration, the sensor angular velocity, and the sensor acceleration.
  • the initial estimation of the time deviation is used as an initial value for optimizing the time deviation
  • the initial rotation component is used as an initial value for optimizing the spatial external parameter
  • The calibration method provided in this embodiment determines initial values for the spatial external parameters and the time deviation respectively, which effectively speeds up the calibration.
  • the third embodiment of the present application provides a calibration device 30 for calibrating a camera and an inertial sensor.
  • the calibration device 30 can be applied to aerial photography equipment.
  • The aerial photography equipment includes a camera and an inertial sensor, and the camera and the inertial sensor are fixedly connected.
  • the calibration device 30 can implement the calibration method of the foregoing embodiment.
  • the calibration device 30 includes:
  • the pose acquisition module 31 is configured to acquire multiple camera poses of the camera in the world coordinate system when the camera shoots multiple images of the calibration board during movement;
  • the camera speed acquiring module 32 is configured to acquire the camera angular velocity and camera acceleration of the camera at different moments according to the multiple camera poses;
  • the sensor speed acquisition module 33 is configured to acquire the sensor angular velocity and the sensor acceleration measured by the inertial sensor during the movement of the camera;
  • the prediction module 34 is configured to transform the camera angular velocity and the camera acceleration from the camera coordinate system of the camera to the sensor coordinate system of the inertial sensor according to the preset spatial external parameters of the camera and the inertial sensor, to obtain the first predicted angular velocity and the predicted acceleration;
  • the error module 35 is configured to construct an acceleration error term according to the first predicted angular velocity, the predicted acceleration, the sensor angular velocity, and the sensor acceleration;
  • the optimization module 36 is configured to optimize the spatial external parameters and the time deviation to obtain the spatial external parameters and the time deviation that minimize the acceleration error term, where the time deviation is the offset of the time defined by the inertial sensor relative to the time defined by the camera's shooting.
  • the calibration device provided by the third embodiment of the present application can optimize the spatial external parameters and the time deviation for calibration, fully consider the time deviation and the potential connection of the spatial external parameters, and effectively improve the accuracy of the calibration.
  • the calibration device 30 further includes:
  • the change module is used to continuously change the time deviation within a preset time deviation range to obtain the corresponding sensor angular velocity
  • the first calculation module is configured to calculate the cross-correlation coefficient between the modulus of the camera angular velocity and the modulus of the sensor angular velocity, and to use the time deviation that maximizes the cross-correlation coefficient as the initial estimate of the time deviation, where the initial estimate of the time deviation is used as the initial value of the time deviation when optimizing the time deviation;
  • a transformation module configured to transform the camera angular velocity from the camera coordinate system of the camera to the sensor coordinate system of the inertial sensor according to the external parameter rotation component in the preset spatial external parameters to obtain a second predicted angular velocity
  • the second calculation module is configured to calculate the difference between the sensor angular velocity and the second predicted angular velocity to obtain the angular velocity error term;
  • the initial rotation module is configured to optimize the external parameter rotation component and to use the external parameter rotation component that minimizes the angular velocity error term as the initial rotation component, where the initial rotation component is used as the initial value when optimizing the spatial external parameters.
  • the pose acquisition module 31 includes:
  • An extracting unit for extracting the corner points of the calibration plate in the multiple images respectively
  • the pose determination unit is configured to calculate the multiple camera poses of the camera in the world coordinate system according to the corner points of the calibration plate in each of the multiple images.
  • the camera speed acquiring module 32 includes:
  • the camera speed acquisition unit is configured to differentiate the pose curve to obtain the camera angular velocity and the camera acceleration of the camera at different moments.
  • the extraction unit includes:
  • the binarization subunit is used to perform binarization processing on the multiple images respectively;
  • the expansion subunit is used to perform pixel expansion on the white pixels in the multiple images after the binarization process
  • the quadrilateral extraction subunit is configured to extract quadrilaterals in the multiple images after pixel expansion, each of the multiple images includes multiple quadrilaterals;
  • the corner point extraction subunit is configured to extract, from among the multiple quadrilaterals in each image, the midpoint between the two adjacent corner points of two adjacent quadrilaterals whose diagonally opposite corners lie on the same straight line, as the corner point.
  • The above-mentioned product can execute the method provided in any embodiment of the present application, and has the functional modules and beneficial effects corresponding to the executed method.
  • the fourth embodiment of the present application provides an aerial photography device 40, which can perform the calibration method described in the above embodiment.
  • the aerial photography equipment 40 includes:
  • Camera (not shown); inertial sensor (not shown); one or more processors 41 and memory 42.
  • processor 41 is taken as an example in the figure.
  • the processor 41 and the memory 42 may be connected by a bus or in other ways, and the connection by a bus is taken as an example in the figure.
  • the processor 41 is also in communication connection with the camera and the inertial sensor.
  • the memory 42 can be used to store non-volatile software programs and non-volatile computer-executable programs, such as program instructions corresponding to a calibration method in the above-mentioned embodiments of this application.
  • the processor 41 executes various functional applications and data processing of a calibration method by running the non-volatile software program instructions stored in the memory 42, that is, implements a calibration method in the foregoing method embodiment.
  • the memory 42 may include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function, and the like.
  • the memory 42 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the memory 42 may optionally include memories remotely provided with respect to the processor 41, and these remote memories may be connected to the processor 41 through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the program instructions are stored in the memory 42 and, when executed by the one or more processors 41, execute each step of a calibration method in any of the foregoing method embodiments.
  • the above-mentioned product can execute the method provided in the above-mentioned embodiment of the present application, and has the corresponding beneficial effect of the execution method.
  • For technical details not described in detail in this embodiment, please refer to the method provided in the foregoing embodiments of this application.
  • The embodiments of the present application also provide a non-volatile computer-readable storage medium that stores computer-executable instructions; when the computer-executable instructions are executed by one or more processors, for example, a processor 41, they can cause a computer to execute each step of the calibration method in any of the foregoing method embodiments.
  • the embodiments of the present application also provide a computer program product.
  • the computer program product includes a computer program stored on a non-volatile computer-readable storage medium.
  • The computer program includes program instructions that, when executed by one or more processors, for example, the processor 41 in the figure, can cause a computer to execute each step of the calibration method in any of the foregoing method embodiments.
  • each embodiment can be implemented by software plus a general hardware platform, and of course, it can also be implemented by hardware.
  • a person of ordinary skill in the art can understand that all or part of the processes in the methods of the foregoing embodiments can be implemented by computer programs instructing relevant hardware.
  • The present application also provides a storage medium, which includes a computer-readable storage medium; the above-mentioned calibration program may be stored in the computer-readable storage medium, and when the program is executed, it may include the processes of the embodiments of the above-mentioned calibration method.
  • the storage medium may include flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory, etc.), magnetic memory, magnetic disk, optical disk, and the like.
  • the memory 107 may be an internal storage unit of the aircraft 10 in some embodiments, for example, a hard disk of the aircraft 10.
  • The memory 107 may also be an external storage device of the aircraft 10, for example, a plug-in hard disk equipped on the aircraft 10, a smart media card (SMC), a Secure Digital (SD) card, a flash card, etc.
  • In summary, the calibration method, device, aerial photography equipment and storage medium of the present invention jointly optimize the spatial external parameters and the time deviation for calibration, fully consider the underlying connection between the time deviation and the spatial external parameters, and effectively improve the accuracy of the calibration.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a calibration method, a device, an aerial photography device, and a storage medium. The calibration method comprises the steps of: acquiring multiple camera poses of a camera in a world coordinate system while the camera shoots multiple images of a calibration board during movement (S11); acquiring camera angular velocities and camera accelerations of the camera at different moments on the basis of the multiple camera poses (S12); acquiring sensor angular velocities and sensor accelerations measured by an inertial sensor while the camera is moving (S13); transforming the camera angular velocities and camera accelerations from a camera coordinate system of the camera to a sensor coordinate system of the inertial sensor on the basis of preset spatial external parameters of the camera and the inertial sensor to produce a first predicted angular velocity and a predicted acceleration (S14); constructing an acceleration error term on the basis of the first predicted angular velocity, the predicted acceleration, the sensor angular velocities, and the sensor accelerations (S15); and optimizing the spatial external parameters and a time deviation to produce a spatial external parameter and a time deviation that minimize the acceleration error term (S16).
PCT/CN2020/113256 2019-09-06 2020-09-03 Calibration method, device, aerial photography device and storage medium WO2021043213A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910843584.1A CN110782496B (zh) 2019-09-06 2019-09-06 Calibration method, device, aerial photography equipment and storage medium
CN201910843584.1 2019-09-06

Publications (1)

Publication Number Publication Date
WO2021043213A1 (fr)

Family

ID=69384056

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/113256 WO2021043213A1 (fr) 2019-09-06 2020-09-03 Calibration method, device, aerial photography device and storage medium

Country Status (2)

Country Link
CN (1) CN110782496B (fr)
WO (1) WO2021043213A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115388914A (zh) * 2022-10-28 2022-11-25 福思(杭州)智能科技有限公司 Sensor parameter calibration method and apparatus, storage medium, and electronic apparatus

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110782496B (zh) * 2019-09-06 2022-09-09 深圳市道通智能航空技术股份有限公司 Calibration method, device, aerial photography equipment and storage medium
CN111351487A (zh) * 2020-02-20 2020-06-30 深圳前海达闼云端智能科技有限公司 Multi-sensor clock synchronization method, apparatus, and computing device
CN111551191B (zh) * 2020-04-28 2022-08-09 浙江商汤科技开发有限公司 Sensor external parameter calibration method and apparatus, electronic device, and storage medium
CN113701745B (zh) * 2020-05-21 2024-03-08 杭州海康威视数字技术股份有限公司 External parameter change detection method and apparatus, electronic device, and detection system
CN111951314B (zh) * 2020-08-21 2021-08-31 贝壳找房(北京)科技有限公司 Point cloud registration method and apparatus, computer-readable storage medium, and electronic device
CN112362084A (zh) * 2020-11-23 2021-02-12 北京三快在线科技有限公司 Data calibration method, apparatus, and data calibration system
CN112598749B (zh) * 2020-12-21 2024-02-27 西北工业大学 Multi-camera calibration method for large scenes without a common field of view
CN116558545A (zh) * 2022-01-29 2023-08-08 北京三快在线科技有限公司 Sensor data calibration method and apparatus
CN115235527B (zh) * 2022-07-20 2023-05-12 上海木蚁机器人科技有限公司 Sensor external parameter calibration method and apparatus, and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150201180A1 (en) * 2013-07-23 2015-07-16 The Regents Of The University Of California 3-d motion estimation and online temporal calibration for camera-imu systems
CN105606127A (zh) * 2016-01-11 2016-05-25 北京邮电大学 Method for calibrating the relative attitude of a binocular stereo camera and an inertial measurement unit
CN109074664A (zh) * 2017-10-26 2018-12-21 深圳市大疆创新科技有限公司 Attitude calibration method, device, and unmanned aerial vehicle
CN109685852A (zh) * 2018-11-22 2019-04-26 上海肇观电子科技有限公司 Calibration method, system, device, and storage medium for a camera and an inertial sensor
CN109949370A (zh) * 2019-03-15 2019-06-28 苏州天准科技股份有限公司 Automated method for IMU-camera joint calibration
CN110782496A (zh) * 2019-09-06 2020-02-11 深圳市道通智能航空技术有限公司 Calibration method, device, aerial photography equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107255476B (zh) * 2017-07-06 2020-04-21 青岛海通胜行智能科技有限公司 Indoor positioning method and apparatus based on inertial data and visual features

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150201180A1 (en) * 2013-07-23 2015-07-16 The Regents Of The University Of California 3-d motion estimation and online temporal calibration for camera-imu systems
CN105606127A (zh) * 2016-01-11 2016-05-25 北京邮电大学 Method for calibrating the relative attitude of a binocular stereo camera and an inertial measurement unit
CN109074664A (zh) * 2017-10-26 2018-12-21 深圳市大疆创新科技有限公司 Attitude calibration method, device, and unmanned aerial vehicle
CN109685852A (zh) * 2018-11-22 2019-04-26 上海肇观电子科技有限公司 Calibration method, system, device, and storage medium for a camera and an inertial sensor
CN109949370A (zh) * 2019-03-15 2019-06-28 苏州天准科技股份有限公司 Automated method for IMU-camera joint calibration
CN110782496A (zh) * 2019-09-06 2020-02-11 深圳市道通智能航空技术有限公司 Calibration method, device, aerial photography equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115388914A (zh) * 2022-10-28 2022-11-25 福思(杭州)智能科技有限公司 Sensor parameter calibration method and apparatus, storage medium, and electronic apparatus
CN115388914B (zh) * 2022-10-28 2023-02-03 福思(杭州)智能科技有限公司 Sensor parameter calibration method and apparatus, storage medium, and electronic apparatus

Also Published As

Publication number Publication date
CN110782496B (zh) 2022-09-09
CN110782496A (zh) 2020-02-11

Similar Documents

Publication Publication Date Title
WO2021043213A1 (fr) Calibration method, device, aerial photography device and storage medium
CN107747941B (zh) Binocular vision positioning method, apparatus and system
EP3028252B1 (fr) Rolling sequential bundle adjustment
CN110880189B (zh) Joint calibration method, joint calibration apparatus, and electronic device
WO2020237574A1 (fr) Method and apparatus for calibrating the internal parameters of a camera, method and apparatus for calibrating the relative attitude of a camera, unmanned aerial vehicle, and storage apparatus
KR101666959B1 (ko) Image processing apparatus with an automatic correction function for images acquired from a camera, and method therefor
WO2021003263A1 (fr) Image generation method and system
CN106447766B (zh) Scene reconstruction method and apparatus based on a monocular camera of a mobile device
CN110702111A (zh) Simultaneous localization and mapping (SLAM) using dual event cameras
WO2021004416A1 (fr) Method and apparatus for establishing a beacon map on the basis of visual beacons
CN111354042A (zh) Feature extraction method and apparatus for robot vision images, robot, and medium
CN110176032B (zh) Three-dimensional reconstruction method and apparatus
WO2019104571A1 (fr) Image processing method and device
US20120268567A1 (en) Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
JP2017112602A (ja) Image calibration, stitching, and depth reconstruction method for a panoramic fisheye camera, and system therefor
WO2022156755A1 (fr) Indoor positioning method and apparatus, device, and computer-readable storage medium
WO2021139176A1 (fr) Pedestrian trajectory tracking method and apparatus based on binocular camera calibration, computer device, and storage medium
JP2002027507A (ja) Camera calibration apparatus and method, and storage medium
CN111127524A (zh) Trajectory tracking and three-dimensional reconstruction method, system, and apparatus
TW201904643A (zh) Control device, flying object, and recording medium
US20200294269A1 (en) Calibrating cameras and computing point projections using non-central camera model involving axial viewpoint shift
WO2020063878A1 (fr) Data processing method and apparatus
CN112085790A (zh) Multi-camera visual SLAM method combining points and lines, device, and storage medium
TWI795885B (zh) Visual positioning method, device, and computer-readable storage medium
WO2020181409A1 (fr) Capture device parameter calibration method, apparatus, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20860678

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20860678

Country of ref document: EP

Kind code of ref document: A1