CN112771577A - Camera parameter determination method, device and readable storage medium

Camera parameter determination method, device and readable storage medium

Info

Publication number
CN112771577A
Authority
CN
China
Prior art keywords
camera
calibration parameters
calibration
parameters
historical
Prior art date
Legal status
Pending
Application number
CN202080005255.0A
Other languages
Chinese (zh)
Inventor
黄振昊
何纲
彭昭亮
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN112771577A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

A camera parameter determination method, a camera parameter determination apparatus, and a readable storage medium. The method comprises the following steps: acquiring multi-frame images collected by a camera on a movable platform, wherein the multi-frame images at least comprise two frames of images with partially overlapped images; calibrating according to the multi-frame image to obtain calibration parameters of the camera at the current moment; acquiring calibration parameters of a camera at historical time, wherein the calibration parameters at the historical time are calibration parameters of the camera used by the movable platform when executing historical job tasks; determining target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, wherein the target calibration parameters are used when the movable platform executes a new operation task; wherein the calibration parameters comprise one or more of the following parameters of the camera: internal parameters, distortion parameters and external parameters.

Description

Camera parameter determination method, device and readable storage medium
Technical Field
The present disclosure relates to the field of information processing, and in particular, to a method for determining camera parameters, a device for determining camera parameters, and a readable storage medium.
Background
In the image measurement process, in order to determine the mapping relationship between the three-dimensional position of the object and the corresponding point of the object in the image, a geometric model for camera imaging needs to be established, wherein a relatively common geometric model is a pinhole model which describes the relationship of projection imaging after light passes through a pinhole. The parameters of the geometric model include camera parameters, and in general, the camera parameters can be obtained only through a calibration process, and the stability of the camera parameters obtained through calibration influences the accuracy of image measurement.
Disclosure of Invention
The present disclosure provides a camera parameter determining method, including: acquiring multi-frame images collected by a camera on a movable platform, wherein the multi-frame images at least comprise two frames of images with partially overlapped images; calibrating according to the multi-frame image to obtain a calibration parameter of the camera at the current moment; acquiring calibration parameters of the camera at historical time, wherein the calibration parameters at the historical time are calibration parameters of the camera used by the movable platform when executing historical job tasks; determining target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, wherein the target calibration parameters are used by the movable platform when executing a new job task; wherein the calibration parameters include one or more of the following parameters of the camera: internal parameters, distortion parameters and external parameters.
The present disclosure also provides a camera parameter determination apparatus, including: a processor; a memory for storing one or more programs, wherein the one or more programs, when executed by the processor, cause the processor to: acquiring multi-frame images collected by a camera on a movable platform, wherein the multi-frame images at least comprise two frames of images with partially overlapped images; calibrating according to the multi-frame image to obtain a calibration parameter of the camera at the current moment; acquiring calibration parameters of the camera at historical time, wherein the calibration parameters at the historical time are calibration parameters of the camera used by the movable platform when executing historical job tasks; and determining target calibration parameters according to the calibration parameters of the historical moment and the calibration parameters of the current moment, wherein the target calibration parameters are used by the movable platform when executing a new job task.
The present disclosure also provides a readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to perform the camera parameter determination method as described above.
According to the embodiment of the disclosure, calibration parameters of the camera at the current moment are obtained through calibration of multi-frame images collected by the camera on the movable platform, and target calibration parameters of the camera used by the movable platform when a new job task is executed are determined according to the calibration parameters of the camera at the historical moment used by the movable platform when executing historical job tasks and the calibration parameters of the camera at the current moment obtained through current calibration. The target calibration parameters of the camera are determined according to the calibration parameters at the historical moment and the calibration parameters at the current moment obtained by current calibration, rather than being fixed and unchangeable parameters, so that the calibration parameters of the camera can be optimized iteratively in the using process, and the calibration parameters of the camera used by the movable platform in the actual operation process are more in line with the actual operation condition. Even if the lens of the camera is worn in the using process or is reassembled after being disassembled, the influence of camera parameter values changed due to wear and reassembly on the image measuring result can be reduced by referring to the calibration parameters at the historical moment.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 schematically shows a schematic diagram of a pinhole model of a camera according to an embodiment of the present disclosure.
Fig. 2 schematically shows a schematic diagram of a distortion model of a camera according to an embodiment of the present disclosure.
Fig. 3 schematically illustrates an application scenario in which the camera parameter determination method and apparatus may be applied according to an embodiment of the present disclosure.
Fig. 4 schematically shows a flow chart of a camera parameter determination method according to an embodiment of the present disclosure.
Fig. 5 schematically shows a flowchart for acquiring calibration parameters of a camera at a historical time according to an embodiment of the disclosure.
Fig. 6 schematically shows a block diagram of a camera parameter determination apparatus according to an embodiment of the present disclosure.
Detailed Description
The technical solution of the present disclosure will be clearly and completely described below with reference to the embodiments and the drawings in the embodiments. It is to be understood that the described embodiments are merely illustrative of some, and not restrictive, of the embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
The process of capturing an object by a camera to generate an image is essentially the conversion of a subject located in the world coordinate system to an image located in the pixel coordinate system. The world coordinate system is a three-dimensional rectangular coordinate system which can be used as a reference to describe the relative spatial positions of the camera and a shot object, and the pixel coordinate system is a two-dimensional rectangular coordinate system which reflects the arrangement condition of pixels in the CCD/CMOS chip of the camera, so that the imaging geometric model of the camera represents the conversion process from the world coordinate system to the pixel coordinate system.
The conversion from the world coordinate system (O-XwYwZw) to the pixel coordinate system (o-uv) generally requires first converting from the world coordinate system (O-XwYwZw) into the camera coordinate system (C-XYZ), and then converting from the camera coordinate system (C-XYZ) into the pixel coordinate system (o-uv). The camera coordinate system (C-XYZ) is also a three-dimensional rectangular coordinate system: its origin is at the optical center C of the camera, its X axis and Y axis are respectively parallel to two sides of the image plane, and its Z axis is the optical axis of the camera.
The conversion from the world coordinate system (O-XwYwZw) into the camera coordinate system (C-XYZ) typically consists of a rotation and a translation. The matrix R is the rotation matrix, the matrix T is the translation matrix, and R and T are the external parameters (the external parameter matrix) of the camera; they express the conversion of the world coordinate system (O-XwYwZw) in three-dimensional space into the camera coordinate system (C-XYZ).
The conversion relationship for converting the world coordinate system to the camera coordinate system is as follows:

$$\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where R is a 3 × 3 rotation matrix, t is a 3 × 1 translation vector, (X, Y, Z, 1)^T are the homogeneous coordinates in the camera coordinate system, and (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates in the world coordinate system.
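By way of illustration only (this sketch is not part of the original disclosure, and the rotation and translation values in it are assumed), the above conversion can be written in Python/NumPy as follows:

```python
import numpy as np

def world_to_camera(points_w, R, t):
    """Convert Nx3 world-coordinate points to camera coordinates: X_c = R @ X_w + t."""
    points_w = np.asarray(points_w, dtype=float)          # (N, 3)
    return (R @ points_w.T).T + t.reshape(1, 3)           # (N, 3)

# Example values (assumed for illustration only): a 30-degree rotation about Z
# and a translation of 0.5 m along X.
theta = np.deg2rad(30.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.5, 0.0, 0.0])

print(world_to_camera([[1.0, 2.0, 10.0]], R, t))
```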
A relatively common geometric model for converting from a camera coordinate system (C-XYZ) to a pixel coordinate system (o-uv) is a pinhole model, which describes the relationship of projection imaging after light passes through a pinhole.
However, in the pixel coordinate system (o-uv), the upper-left corner of the image is generally used as the origin o, the u-axis and v-axis are parallel to the two sides of the image plane, and the unit of the coordinate axes in the pixel coordinate system is the pixel. In order to establish the conversion relation, an image coordinate system (p-xy) can be established with the center of the image plane as the origin p and with the x-axis and y-axis respectively parallel to the u-axis and v-axis, where the coordinates of the origin p in the pixel coordinate system are (u_0, v_0). Of course, the present disclosure may also directly translate the coordinate origin o of the pixel coordinate system (o-uv) to the image center p without establishing the image coordinate system (p-xy). The origin p of the image coordinate system is the intersection of the optical axis of the camera with the image plane, i.e. the principal point of the camera, and is located at the center of the image. The pixel coordinate system and the image coordinate system are essentially related by a translation, so the conversion from the image coordinate system (p-xy) to the pixel coordinate system (o-uv) is as follows:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$

where dx and dy are the physical dimensions of a pixel in the x and y directions, respectively, and (u_0, v_0) are the principal point coordinates.
Next, the camera coordinate system (C-XYZ) may be converted into the image coordinate system (p-xy), and then the image coordinate system (p-xy) may be converted into the pixel coordinate system (o-uv).
The conversion from the camera coordinate system (C-XYZ) to the image coordinate system (p-xy) can be described using a pinhole model.
Fig. 1 schematically shows a schematic diagram of a pinhole model of a camera according to an embodiment of the present disclosure.
As shown in FIG. 1, the pinhole model of the camera reflects the transformation relationship between the camera coordinate system (C-XYZ) and the image coordinate system (p-xy). For example, any point M in the camera coordinate system (C-XYZ) corresponds to an image point m in the image coordinate system (p-xy); the line connecting M and the camera optical center C is CM, the intersection of CM with the image plane is the image point m, and m is the projection of the space point M onto the image plane. The conversion of the camera coordinate system (C-XYZ) into the image coordinate system (p-xy) corresponds to a perspective projection, which can be expressed in matrix form as:

$$s \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

where s is a non-zero scale factor, f is the effective focal length of the camera (the distance from the optical center to the image plane), (X, Y, Z, 1)^T are the homogeneous coordinates in the camera coordinate system, and (x, y, 1)^T are the homogeneous coordinates in the image coordinate system.
Based on the transformation relationship from the world coordinate system to the camera coordinate system, the transformation relationship from the camera coordinate system to the image coordinate system, and the transformation relationship from the image coordinate system to the pixel coordinate system, the transformation relationship from the world coordinate system to the pixel coordinate system can be obtained as follows:

$$z \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f/dx & 0 & u_0 \\ 0 & f/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = K \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where z is a non-zero scale factor. The matrix K is called the camera internal parameter matrix, namely the camera calibration matrix, and R and T are the external parameters of the camera. Thus, the camera internal parameters generally include the focal length f, the position (u_0, v_0) of the principal point in the pixel coordinate system, and the pixel dimensions dx, dy.
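A minimal sketch of the complete world-to-pixel projection described above is given below (distortion is ignored here; the focal length, pixel size and principal point values are assumed example numbers, not values from the disclosure):

```python
import numpy as np

def intrinsic_matrix(f, dx, dy, u0, v0):
    """Build the camera calibration matrix K from focal length, pixel size and principal point."""
    return np.array([[f / dx, 0.0,    u0],
                     [0.0,    f / dy, v0],
                     [0.0,    0.0,    1.0]])

def project(points_w, K, R, t):
    """Project Nx3 world points to pixels: z * [u, v, 1]^T = K [R | t] [Xw, Yw, Zw, 1]^T."""
    pts_cam = (R @ np.asarray(points_w, dtype=float).T).T + t   # world -> camera
    uvw = (K @ pts_cam.T).T                                     # camera -> homogeneous pixel
    return uvw[:, :2] / uvw[:, 2:3]                             # divide by the scale factor z

# Assumed example internal parameters: 8 mm focal length, 3.45 um pixels, principal point at image center.
K = intrinsic_matrix(f=8e-3, dx=3.45e-6, dy=3.45e-6, u0=2736.0, v0=1824.0)
R, t = np.eye(3), np.array([0.0, 0.0, 0.0])
print(project([[1.0, 0.5, 50.0]], K, R, t))
```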
However, due to the existence of the lens, the light may be distorted during the projection process, and a distortion model is generally used to describe this distortion. The distortion model is generally characterized by distortion parameters, which typically include the radial distortion parameters k1, k2, k3 and the tangential distortion parameters p1, p2. Radial distortion, which manifests as barrel distortion or pincushion distortion, generally results from light rays far from the center of the lens being deflected more strongly, while tangential distortion generally results from the lens not being perfectly parallel to the image plane.
Fig. 2 schematically shows a schematic diagram of a distortion model of a camera according to an embodiment of the present disclosure.
As shown in fig. 2, the theoretical position of the pixel point is on the arc, but the position of the pixel point is deviated radially and tangentially due to the deflection of the light at a position far from the center of the lens and the incomplete parallelism of the lens to the image plane.
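The following sketch illustrates how the radial distortion parameters k1, k2, k3 and the tangential distortion parameters p1, p2 are commonly applied to normalized image coordinates (a Brown-Conrady-style model consistent with this parameter set; the numerical coefficients are assumed for illustration only and are not from the disclosure):

```python
import numpy as np

def distort(x, y, k1, k2, k3, p1, p2):
    """Apply radial (k1, k2, k3) and tangential (p1, p2) distortion to normalized coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# Assumed example distortion coefficients.
print(distort(0.1, -0.05, k1=-0.12, k2=0.03, k3=0.0, p1=1e-4, p2=-2e-4))
```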
In general, the camera external parameters can be obtained during operation, while the camera internal parameters and distortion parameters are calibrated in advance; in the post-processing for mapping, the internal parameters and distortion parameters need to be input into the mapping software as initial values and are then iterated in the mapping algorithm.
In the related art, camera parameters are generally calibrated when the camera is shipped from the factory by using a checkerboard and a visual calibration algorithm (e.g., an OpenCV algorithm), and the calibrated camera parameters are then written into a storage area of the camera, or a test report and a certificate are provided.
Specifically, for example, OpenCV has a built-in camera internal parameter calibration algorithm, and the calibrated camera internal parameters and distortion parameters can be obtained by shooting a checkerboard at different angles and calling the calibration algorithm.
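A minimal sketch of such a checkerboard calibration using OpenCV is shown below (the board dimensions, square size and image folder are assumptions for illustration; it also assumes at least one checkerboard image is found):

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)            # inner corners per row/column (assumed board)
square = 0.025              # square size in meters (assumed)

# Object points of one board: (0,0,0), (1,0,0), ... scaled by the square size.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for path in glob.glob("checkerboard/*.jpg"):        # assumed image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Returns the internal parameter matrix and the distortion coefficients (k1, k2, p1, p2, k3).
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Internal parameters:\n", K, "\nDistortion:", dist.ravel())
```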
However, during use of the camera, the camera parameters may change due to wear, disassembly and reassembly of the lens, or other reasons, so that they differ considerably from the preset calibration values, which causes trouble for subsequent image construction processing and affects the accuracy of image measurement.
In the related art, the initial parameters obtained by factory calibration are used when the camera is used to perform a job task. For example, the camera internal parameters and distortion parameters are written into the camera, and fixed parameters are then written into a field of the exif part of each picture during shooting, or a calibration certificate is provided directly. Exif, i.e. the Exchangeable Image File Format, is designed specifically for photos from digital cameras; it can record the attribute information and shooting data of a digital photo, can be attached to files such as JPEG, TIFF and RIFF, and can carry the shooting information of the digital camera, an index map, or version information of the image processing software. If the lens of the camera is worn during use, or is reassembled after being disassembled, the camera parameters become inconsistent with the calibration values given at the factory, which causes trouble for subsequent image construction processing and the like. In other words, the camera internal parameters and distortion parameters have no self-iteration capability: if the lens/camera is disassembled midway or has been in service for too long, it can only be re-calibrated by returning it to the factory, and before such re-calibration the internal parameters and distortion parameters deviate considerably from their true values.
The embodiment of the disclosure provides a camera parameter determination method and device. The method comprises the following steps: acquiring multi-frame images collected by a camera on a movable platform, wherein the multi-frame images at least comprise two frames of images with partially overlapped images; calibrating according to the multi-frame image to obtain a calibration parameter of the camera at the current moment; acquiring calibration parameters of the camera at historical time, wherein the calibration parameters at the historical time are calibration parameters of the camera used by the movable platform when executing historical job tasks; and determining target calibration parameters according to the calibration parameters of the historical moment and the calibration parameters of the current moment, wherein the target calibration parameters are used by the movable platform when executing a new job task.
An application scenario of the camera parameter determination method and apparatus will be described below by taking a movable platform as an unmanned aerial vehicle as an example.
Fig. 3 schematically illustrates an application scenario in which the camera parameter determination method and apparatus may be applied according to an embodiment of the present disclosure. It should be noted that fig. 3 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 3, a user 301 may control a drone 302 to fly through a remote control terminal, and a camera on the drone 302 may capture multiple frames of images in a real environment. The multi-frame images collected by the camera at least comprise two frames of images of partially overlapped images. For example, two images collected by the camera at least contain the same big tree, or the same peak, and the like, and the overlapped images can be the same big tree, or the same peak.
According to the embodiment of the present disclosure, the unmanned aerial vehicle 302 may send the image collected by the camera to the remote control terminal for processing, or may locally process the image at the unmanned aerial vehicle 302, and the calibration parameters of the camera at the current moment may be obtained by processing the multi-frame image collected by the camera.
According to an embodiment of the present disclosure, the drone 302 may obtain calibration parameters of the historical time of the camera from the server 303, where the calibration parameters of the historical time are calibration parameters of the camera used by the drone 302 when executing the historical job task. Then, a target calibration parameter used by the drone 302 when executing a new job task is determined according to the calibration parameter at the historical time and the calibration parameter at the current time.
In an alternative embodiment, the control terminal may obtain the calibration parameters of the camera at the historical time from the server 303 in advance, and cache the calibration parameters locally at the control terminal. Then, the drone 302 obtains calibration parameters of the camera at the historical time from the control terminal.
In another alternative embodiment, drone 302 may also obtain calibration parameters of the camera at the historical time from a storage medium at the home end of drone 302.
As can be seen from this, the present disclosure may obtain the calibration parameters of the historical time of the camera in various ways, and may not be limited to obtaining the calibration parameters of the historical time of the camera from the server 303.
According to the embodiment of the disclosure, since the target calibration parameters of the camera are determined according to the calibration parameters at the historical time and the calibration parameters at the current time obtained by current calibration, rather than fixed and unchangeable parameters, the calibration parameters of the camera can be iteratively optimized in the using process, so that the calibration parameters of the camera used by the unmanned aerial vehicle 302 in the actual operation process better conform to the actual operation condition. Even if the lens of the camera is worn in the using process or is reassembled after being disassembled, the influence of the value of the camera parameter changed due to wear and reassembly on the image measuring result can be reduced by referring to the calibration parameter at the historical moment.
The method for determining camera parameters provided by the embodiment of the present disclosure is further described with reference to fig. 4.
Fig. 4 schematically shows a flow chart of a camera parameter determination method according to an embodiment of the present disclosure.
It should be noted that, unless an execution order between different operations is explicitly stated or is required by the technical implementation, the operations in the flowcharts of this disclosure need not be executed in the order shown, and multiple operations may be executed simultaneously.
As shown in fig. 4, the camera parameter determination method includes operations S410 to S440.
In operation S410, a plurality of images captured by a camera on a movable platform are acquired, wherein the plurality of images captured by the camera at least include two images with partially overlapped images.
According to the embodiments of the present disclosure, the position and the posture of the camera on the movable platform can be changed to obtain a plurality of frames of images. For example, the camera is controlled to shoot the same object from different angles at three different positions, and the three acquired images at least comprise pixel parts related to the object. Further, the positions of the object on the images acquired at different positions are different.
In operation S420, calibration parameters of the camera at the current time are obtained according to calibration of multiple frames of images.
According to the embodiment of the disclosure, the calibration parameters of the camera at the current moment can be obtained by calibration with the multi-frame images during aerial triangulation, and the calibration parameters can be one or more of the camera internal parameters, distortion parameters and external parameters. For example, when processing the images, the calibrated internal parameters and distortion parameters written into the images are read, and these calibrated internal parameters and distortion parameters are then treated as unknowns in the aerial triangulation process and subjected to nonlinear optimization, so that more accurate, optimized camera internal parameters and distortion parameters are obtained iteratively.
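As a simplified illustration of treating the calibrated internal parameters and distortion parameters as unknowns in a nonlinear optimization (a sketch only: a full aerial triangulation also optimizes the camera poses and the three-dimensional points, whereas here the pose is held fixed and the parameter set is assumed):

```python
import numpy as np
from scipy.optimize import least_squares

def reprojection_residuals(params, points_3d, observations, R, t):
    """Residuals between observed pixels and points projected with candidate parameters.

    params = (f, cx, cy, k1, k2, p1, p2); the pose (R, t) is held fixed in this sketch,
    and f is expressed in pixels (square pixels assumed).
    """
    f, cx, cy, k1, k2, p1, p2 = params
    pts_cam = (R @ points_3d.T).T + t
    x, y = pts_cam[:, 0] / pts_cam[:, 2], pts_cam[:, 1] / pts_cam[:, 2]
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2**2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    u, v = f * x_d + cx, f * y_d + cy
    return np.concatenate([u - observations[:, 0], v - observations[:, 1]])

# initial_params would be the calibrated values read from the image exif / lens cache.
# result = least_squares(reprojection_residuals, initial_params,
#                        args=(points_3d, observations, R, t))
# refined = result.x   # calibration parameters at the current moment
```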
According to the embodiment of the disclosure, a calibration algorithm can also be used to process the acquired multi-frame images, identify the calibration points in the images, and calibrate the calibration points according to their three-dimensional positions to obtain the calibration parameters of the camera at the current moment.

In operation S430, calibration parameters of a historical time of the camera are acquired, wherein the calibration parameters of the historical time are calibration parameters of the camera used by the movable platform when executing the historical job task.
According to the embodiment of the disclosure, the acquired calibration parameters at the historical time may include one or more calibration parameters, and in the case of acquiring the calibration parameters at the multiple historical times, the calibration parameters at the multiple historical times may be the calibration parameters respectively used by the movable platform when executing the multiple historical job tasks. In the case of acquiring the calibration parameter of one history time, the calibration parameter of the history time may be the calibration parameter of the history time corresponding to the history job whose time is closest to that of the current job.
According to the embodiment of the disclosure, the obtained calibration parameter of the historical time may also be a comprehensive value. For example, the acquired calibration parameter at the historical time is a weighted average of calibration parameters used by the movable platform when executing a plurality of historical job tasks.
In operation S440, target calibration parameters are determined according to the calibration parameters at the historical time and the calibration parameters at the current time, wherein the target calibration parameters are used by the movable platform when executing a new job task.
According to the embodiment of the disclosure, when the movable platform executes a new job task, the tasks such as drawing and the like can be performed by using the determined target calibration parameters. After a new job task is executed, i.e. at a future time, the currently determined target calibration parameters may be used as calibration parameters at the historical time, so that at the future time, the target calibration parameters at the future time may be re-determined.
According to the embodiment of the disclosure, the modes of determining the target calibration parameters according to the calibration parameters of the camera at the historical time and the calibration parameters of the camera at the current time include multiple modes. For example, the following embodiments may be included.
In an optional embodiment, a weighted average of the calibration parameters at the historical time and the calibration parameters at the current time may be calculated; and then determining the weighted average value as the target calibration parameter.
According to the embodiment of the disclosure, under the condition that the calibration parameters at the historical moments include a plurality of parameters, the weight of the calibration parameters at each historical moment can be determined according to the calibration time of the calibration parameters at each historical moment. For example, the integrated weighted average may be calculated by assigning the calibration parameters at the historical time closer to the current operation time with a higher weight and assigning the calibration parameters at the historical time farther from the current operation time with a lower weight. Different weights are set for calibration parameters at different historical moments to obtain a comprehensive weighted average value, so that the influence of too-long parameters on the accuracy of target calibration parameters is avoided.
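A minimal sketch of such a recency-weighted average is given below (the linearly increasing weights and the example parameter values are assumptions for illustration):

```python
import numpy as np

def weighted_calibration(history, current):
    """Combine historical calibration parameters (oldest first) with the current ones.

    Each entry is a vector such as (f, cx, cy, k1, k2, k3, p1, p2); more recent
    entries receive a larger weight.
    """
    values = np.vstack(history + [current])               # (n, num_params)
    weights = np.arange(1, len(values) + 1, dtype=float)  # 1, 2, ..., n (newest heaviest)
    weights /= weights.sum()
    return weights @ values

history = [np.array([3600.0, 2736.0, 1824.0, -0.11, 0.02, 0.0, 1e-4, -2e-4]),
           np.array([3602.0, 2735.5, 1824.3, -0.12, 0.02, 0.0, 1e-4, -2e-4])]
current = np.array([3605.0, 2735.0, 1824.5, -0.12, 0.03, 0.0, 1e-4, -2e-4])
print(weighted_calibration(history, current))
```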
In another alternative embodiment, a first deviation between the calibration parameter at the historical time and the calibration parameter at the current time may be calculated; comparing the first deviation with a first preset threshold; if the first deviation is smaller than a first preset threshold value, determining the calibration parameter at the current moment or the weighted average value of the calibration parameter at the historical moment and the calibration parameter at the current moment as a target calibration parameter; and if the first deviation is larger than or equal to a first preset threshold value, determining the calibration parameters at the historical moment as target calibration parameters.
According to the embodiment of the present disclosure, for example, the calibration parameter at the current time may be subtracted from the calibration parameter at the historical time to obtain a first deviation, and then the first deviation is compared with a first preset threshold, and if the first deviation is greater than or equal to the first preset threshold, it is determined that the optimization process or the initial camera parameter is wrong, and at this time, the calibration parameter at the historical time may be determined as the target calibration parameter. In addition, a prompt can be given to the user to prompt the user to perform image data processing again, perform image data acquisition again or perform factory return and re-calibration on the equipment. If the first deviation is smaller than the first preset threshold, the optimization process is considered to be normal, and the calibration parameter at the current moment or the weighted average of the calibration parameter at the historical moment and the calibration parameter at the current moment can be determined as the target calibration parameter.
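A minimal sketch of this deviation check is given below (the per-parameter absolute difference and the equal 0.5/0.5 averaging weights are assumptions for illustration):

```python
import numpy as np

def select_target(historical, current, threshold, weights=(0.5, 0.5)):
    """Return the target calibration parameters based on the first deviation check."""
    deviation = np.abs(np.asarray(current) - np.asarray(historical))
    if np.all(deviation < threshold):
        # Optimization looks normal: use a weighted average (or simply the current parameters).
        return weights[0] * np.asarray(historical) + weights[1] * np.asarray(current)
    # Deviation too large: fall back to the historical parameters and prompt the user.
    print("Warning: calibration deviates from history; consider re-collecting data "
          "or returning the device for re-calibration.")
    return np.asarray(historical)
```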
In another optional implementation, preset calibration parameters of the camera may also be obtained, and then the target calibration parameters are determined according to the preset calibration parameters, the calibration parameters at the historical time, and the calibration parameters at the current time.
According to the embodiment of the disclosure, the preset calibration parameter may be a calibration parameter set for the camera and received by the movable platform, or may be a factory calibration parameter of the camera, and the like.
According to the embodiment of the present disclosure, determining the target calibration parameter according to the preset calibration parameter, the calibration parameter at the historical time, and the calibration parameter at the current time includes: calculating a second deviation between the preset calibration parameter and the calibration parameter at the current moment; comparing the second deviation with a second preset threshold; if the second deviation is smaller than the second preset threshold, determining the calibration parameter at the current moment or the weighted average of the calibration parameter at the historical moment and the calibration parameter at the current moment as the target calibration parameter; and if the second deviation is greater than or equal to the second preset threshold, determining the calibration parameter at the historical moment or the preset calibration parameter as the target calibration parameter.
According to the embodiment of the present disclosure, for example, the preset calibration parameter may be subtracted from the calibration parameter at the current time to obtain a second deviation, the second deviation is then compared with a second preset threshold, and if the second deviation is greater than or equal to the second preset threshold, it is determined that the optimization process or the initial camera parameter is incorrect, and at this time, the calibration parameter at the historical time or the preset calibration parameter may be determined as the target calibration parameter. In addition, a prompt can be given to the user to prompt the user to perform image data processing again, perform image data acquisition again or perform factory return and re-calibration on the equipment. If the second deviation is smaller than the second preset threshold, the optimization process is considered to be normal, and the calibration parameter at the current moment or the weighted average of the calibration parameter at the historical moment and the calibration parameter at the current moment can be determined as the target calibration parameter.
According to the embodiment of the disclosure, when the estimated value is inconsistent with the initial value, the user can be prompted to avoid the image construction error caused by the large change of the camera parameter.
According to the embodiment of the present disclosure, the sizes of the first preset threshold and the second preset threshold may be adjusted according to actual experience or measurement results.
According to the embodiment of the disclosure, in the case that the first deviation is smaller than the first preset threshold and/or the second deviation is smaller than the second preset threshold, the precision of the calibration parameter at the current time may be considered to be higher, and therefore, the calibration parameter at the current time and/or the target calibration parameter may be saved and used as a reference when the next job task is executed.
According to the embodiment of the disclosure, after the target calibration parameters are determined, the target calibration parameters may be sent to the cloud server, so that the target calibration parameters corresponding to the camera are stored in the cloud server.
According to the embodiment of the disclosure, the calibration parameters at the current moment and/or the target calibration parameters can be uploaded to the cloud server, and the calibration parameters at the current moment and/or the target calibration parameters can be stored in the local terminal or the control terminal of the unmanned aerial vehicle.
According to the embodiment of the disclosure, if the calibration parameters at the current moment and/or the target calibration parameters are uploaded to the cloud server, the parameters corresponding to the lens ID number of the camera can be stored in the area corresponding to the cloud server.
According to the embodiment of the disclosure, calibration parameters respectively used by the movable platform when executing different historical job tasks are stored in the cloud server, and the calibration parameters comprise at least one of the following parameters of the camera: internal parameters, distortion parameters and external parameters.
Specifically, taking the example that the cloud server stores the internal parameters and the distortion parameters, the data format stored in the cloud server is shown in table 1.
TABLE 1
Factory calibration | (n-4)th time | (n-3)th time | (n-2)th time | (n-1)th time | nth time | Composite value
F0 | Fn-4 | Fn-3 | Fn-2 | Fn-1 | Fn | Favg
Cx0 | Cxn-4 | Cxn-3 | Cxn-2 | Cxn-1 | Cxn | Cxavg
Cy0 | Cyn-4 | Cyn-3 | Cyn-2 | Cyn-1 | Cyn | Cyavg
K10 | K1n-4 | K1n-3 | K1n-2 | K1n-1 | K1n | K1avg
K20 | K2n-4 | K2n-3 | K2n-2 | K2n-1 | K2n | K2avg
K30 | K3n-4 | K3n-3 | K3n-2 | K3n-1 | K3n | K3avg
P10 | P1n-4 | P1n-3 | P1n-2 | P1n-1 | P1n | P1avg
P20 | P2n-4 | P2n-3 | P2n-2 | P2n-1 | P2n | P2avg
where F denotes the focal length, Cx and Cy denote the position of the image principal point in the pixel coordinate system, k1, k2 and k3 denote the radial distortion parameters, and p1 and p2 denote the tangential distortion parameters.
According to the embodiment of the disclosure, calibration parameters used by the movable platform for executing N times of historical job tasks are stored in the server, wherein N is a preset fixed value, and when the server stores target calibration parameters corresponding to the camera, the server can delete the calibration parameters corresponding to the historical job task which is farthest away from the current job task.
According to the embodiment of the present disclosure, for example, the cloud server may keep only the information uploaded in the last 5 (or 10) jobs, that is, N is 5 or 10. The newly uploaded data are the camera internal parameters and distortion parameters optimized in the last valid job, and they replace the group of parameters farthest in time from that job; for example, all parameters corresponding to the (n-5)th time are deleted.
According to the embodiment of the present disclosure, the weighted average calculation may be performed based on the latest 5/10 groups of data, with higher weights assigned to data closer to the current job time and lower weights assigned to older data, so as to calculate the comprehensive weighted-average camera internal parameters (Favg, Cxavg, Cyavg) and distortion parameters (k1avg, k2avg, k3avg, p1avg, p2avg).
According to the embodiment of the disclosure, because the comprehensive weighted-average parameter value is obtained using only the most recent several groups of data up to the current one, the influence of parameters that are too old on the accuracy of the parameters can be avoided.
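A minimal sketch of how such a rolling history of the last N calibrations and its composite value might be maintained is given below (the data layout mirrors Table 1; the value of N, the key names and the example parameter values are assumptions):

```python
from collections import deque
import numpy as np

N = 5  # keep only the last N jobs, as described above

class CalibrationHistory:
    """Rolling store of per-lens calibration parameters plus a recency-weighted composite."""

    def __init__(self, factory_params):
        self.factory = np.asarray(factory_params, dtype=float)
        self.records = deque(maxlen=N)        # the oldest record is dropped automatically

    def add(self, params):
        self.records.append(np.asarray(params, dtype=float))

    def composite(self):
        if not self.records:
            return self.factory
        values = np.vstack(self.records)
        weights = np.arange(1, len(values) + 1, dtype=float)  # newer jobs weigh more
        weights /= weights.sum()
        return weights @ values

# Keyed by lens ID, e.g. store["Lens_ID_0001"].add([...]) after each valid job.
store = {"Lens_ID_0001": CalibrationHistory(
    [3600.0, 2736.0, 1824.0, -0.11, 0.02, 0.0, 1e-4, -2e-4])}
```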
According to the embodiment of the disclosure, calibration parameters of the camera at the current moment are obtained through calibration of multi-frame images collected by the camera on the movable platform, and target calibration parameters of the camera used by the movable platform when a new operation task is executed are determined according to the calibration parameters of the camera used by the movable platform when the historical operation task is executed and the calibration parameters of the camera at the current moment obtained through current calibration. The target calibration parameters of the camera are determined according to the calibration parameters at the historical moment and the calibration parameters at the current moment obtained by current calibration, but not fixed and unchangeable parameters, so that the calibration parameters of the camera can be optimized in an iterative mode in the using process, and the calibration parameters of the camera used by the movable platform in the actual operation process are more in line with the actual operation condition. Even if the lens of the camera is worn in the using process or is reassembled after being disassembled, the influence of the value of the camera parameter changed due to wear and reassembly on the image measuring result can be reduced by referring to the calibration parameter at the historical moment.
According to the embodiment of the disclosure, when the next image construction job is carried out, the user can be prompted whether to read the comprehensive weighted-average camera internal parameters and distortion parameters stored in the cloud server. If the user chooses to read them, the lens ID number "Lens_ID" is uploaded to the cloud server for searching, and the comprehensive weighted-average camera internal parameters (Favg, Cxavg, Cyavg) and distortion parameters (k1avg, k2avg, k3avg, p1avg, p2avg), and/or the camera internal parameters and distortion parameters used in each historical job, are used as the initial values of the image construction job and applied to the aerial triangulation process; the calibration parameters at the current moment are then obtained by calibration based on the collected images.
According to the embodiment of the disclosure, when the next image construction operation is performed, the movable platform can also directly request to acquire the comprehensive weighted average camera internal parameter and distortion parameter from the cloud server, and/or the camera internal parameter and distortion parameter used in each historical operation, without prompting a user.
According to the embodiment of the disclosure, if the user chooses not to read the comprehensive average internal parameters and distortion parameters from the cloud server and/or the camera internal parameters and distortion parameters used in each historical job, the original calibration values (F0, Cx0, Cy0) and (k10, k20, k30, p10, p20) stored in the lens can be used as the initial parameters.
According to the embodiment of the disclosure, when the lens of the camera leaves the factory, a unique lens ID number can be written into the lens cache; the keyword can be "Lens_ID", for example, and the ID number can be uniquely determined by information such as the material number, the ex-factory date and a special code. A "number of factory return and calibration times" field can also be reserved in the lens cache, with an initial value of 0.
According to the embodiment of the disclosure, the camera internal parameters and distortion parameters can be calibrated with a checkerboard at the factory to obtain the initially calibrated camera internal parameters (F0, Cx0, Cy0) and distortion parameters (k10, k20, k30, p10, p20). The initially calibrated camera internal parameters and distortion parameters are written into the lens cache, and when the camera is used to take photos in subsequent jobs, they can be written in order into the "undistorted parameter" field of the exif part of each image.
According to the embodiment of the disclosure, when the lens is returned to the factory for re-calibration, the initial camera internal parameters (F0, Cx0, Cy0) and distortion parameters (k10, k20, k30, p10, p20) can be reset, and the "number of factory return and calibration times" field value is increased by 1. In addition, after the lens is re-calibrated at the factory, the initial values in the parameter history table corresponding to the lens ID number can be reset in the cloud server, and the remaining values are all cleared.
The method of fig. 4 is further described with reference to fig. 5 in conjunction with specific embodiments.
Fig. 5 schematically shows a flowchart for acquiring calibration parameters of a camera at a historical time according to an embodiment of the disclosure.
As shown in fig. 5, acquiring calibration parameters of the historical time of the camera includes operations S510 to S530.
In operation S510, identification information of a camera is acquired.
According to an embodiment of the present disclosure, the unique ID of the camera lens may be recorded in an exif field of the image, and the keyword may be "Lens_ID", serving as unique identification information.
In operation S520, the identification information of the camera is transmitted to the server, so that the server acquires calibration parameters of the historical time corresponding to the camera according to the identification information of the camera.
According to the embodiment of the disclosure, calibration parameters respectively used by the camera during multiple historical job tasks can be stored in the server. As shown in Table 1, the parameters corresponding to the identification information of the camera are stored in the corresponding area of the server, and details are not repeated here.
In operation S530, calibration parameters of a historical time corresponding to a camera are received from a server.
According to the embodiment of the disclosure, the server may look up the table according to the identification information of the camera to obtain the calibration parameters of the historical time corresponding to the camera.
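A minimal sketch of such a lookup from the movable platform side is given below (the endpoint URL, the payload field names and the use of the requests library are assumptions for illustration; the disclosure does not specify a transport protocol):

```python
import requests

def fetch_historical_calibration(lens_id, server="https://example.com/calibration"):
    """Ask the server for the calibration parameters recorded for this lens ID."""
    resp = requests.get(server, params={"lens_id": lens_id}, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # Expected (assumed) payload: {"composite": [...], "history": [[...], ...]}
    return data.get("composite"), data.get("history", [])

# composite, history = fetch_historical_calibration("Lens_ID_0001")
```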
According to the embodiment of the disclosure, the calibration parameter at the historical time may be a comprehensive weighted average as shown in table 1, or may be data corresponding to the calibration parameter at multiple historical times.
According to the embodiment of the disclosure, the server can also obtain the calibration parameters of the camera at the current moment, and the target calibration parameters are determined at the server end according to the calibration parameters at the historical moment and the calibration parameters at the current moment.
According to the embodiment of the disclosure, the type of the work task executed by the unmanned aerial vehicle is not limited, and the work task may be a task related to the need of drawing. For example, the system can be a mapping task, a polling task or a tracking task, and the like.
According to the embodiment of the disclosure, since the calibration parameters of the camera at the current moment are calibrated according to the multi-frame images collected by the camera, the calibration parameters of the camera at the current moment are associated with the images. In the related art, when the unmanned aerial vehicle is used to execute the job tasks in different position areas, the used camera parameters are generally the same, but since the images in the different position areas are different, if the same camera parameters are used to execute the job tasks in the different position areas, the accuracy of image measurement is affected, and the mapping effect is poor.
According to the embodiment of the disclosure, the geographical position information of the camera during the current operation period can be acquired, then the identification information and the geographical position information of the camera are sent to the server, and the server can acquire the calibration parameters of the historical moment corresponding to the camera according to the identification information and the geographical position information of the camera.
According to the embodiment of the disclosure, calibration parameters used by the camera respectively in multiple historical tasks can be stored in the server, and each historical task has corresponding geographical position information.
For example, the data format stored in the server is shown in table 2.
TABLE 2
Factory calibration | (n-4)th time | (n-3)th time | (n-2)th time | (n-1)th time | nth time | Composite value
F0 | Fn-4 | Fn-3 | Fn-2 | Fn-1 | Fn | Favg
Cx0 | Cxn-4 | Cxn-3 | Cxn-2 | Cxn-1 | Cxn | Cxavg
Cy0 | Cyn-4 | Cyn-3 | Cyn-2 | Cyn-1 | Cyn | Cyavg
K10 | K1n-4 | K1n-3 | K1n-2 | K1n-1 | K1n | K1avg
K20 | K2n-4 | K2n-3 | K2n-2 | K2n-1 | K2n | K2avg
K30 | K3n-4 | K3n-3 | K3n-2 | K3n-1 | K3n | K3avg
P10 | P1n-4 | P1n-3 | P1n-2 | P1n-1 | P1n | P1avg
P20 | P2n-4 | P2n-3 | P2n-2 | P2n-1 | P2n | P2avg
L0 | Ln-4 | Ln-3 | Ln-2 | Ln-1 | Ln | -
where L0 denotes the geographical location information of the factory calibration, and Ln-4 to Ln respectively denote the geographical location information corresponding to each execution of a job task.
According to the embodiment of the disclosure, the server may search for the data corresponding to the camera according to the identification information of the camera, and then obtain, according to the geographical location information of the camera during the current job, the calibration parameters at the historical time corresponding to a geographical location that is the same as or close to that of the current job. For example, if the geographical location information that is the same as or closest to that of the current job is Ln-4, then the calibration parameters at the historical time corresponding to the (n-4)th job may be determined as the calibration parameters at the historical time corresponding to the camera.
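A minimal sketch of selecting the historical record whose recorded job location is closest to the current job location is given below (the equirectangular distance approximation and the record format are assumptions for illustration):

```python
import math

def nearest_history(records, current_location, max_km=None):
    """Pick the historical calibration whose job location is closest to the current one.

    records: list of (params, (lat, lon)); current_location: (lat, lon) in degrees.
    """
    def distance_km(a, b):
        # Equirectangular approximation; adequate for comparing nearby job sites.
        lat = math.radians((a[0] + b[0]) / 2.0)
        dx = math.radians(b[1] - a[1]) * math.cos(lat)
        dy = math.radians(b[0] - a[0])
        return 6371.0 * math.hypot(dx, dy)

    params, loc = min(records, key=lambda r: distance_km(r[1], current_location))
    if max_km is not None and distance_km(loc, current_location) > max_km:
        return None        # no sufficiently close historical job; fall back to the composite value
    return params
```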
According to the embodiment of the disclosure, the calibration parameters of the historical moment corresponding to the camera are acquired from the server according to the identification information and the geographic position information of the camera, and the target calibration parameters determined according to the calibration parameters of the historical moment and the calibration parameters of the current moment not only enable the calibration parameters of the camera to be iteratively optimized in the using process, but also adapt to the operation environment during the current operation, so that the calibration parameters better accord with the actual operation condition, the accuracy of image measurement is improved, and a better image construction effect is achieved.
Fig. 6 schematically shows a block diagram of a camera parameter determination apparatus according to an embodiment of the present disclosure.
As shown in fig. 6, the camera parameter determination apparatus 600 includes a processor 610 and a memory 620.
The memory 620 is used for storing one or more programs, which when executed by the processor, cause the processor to: acquiring multi-frame images collected by a camera on a movable platform, wherein the multi-frame images at least comprise two frames of images with partially overlapped images; calibrating according to the multi-frame image to obtain calibration parameters of the camera at the current moment; acquiring calibration parameters of a camera at historical time, wherein the calibration parameters at the historical time are calibration parameters of the camera used by the movable platform when executing historical job tasks; and determining target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, wherein the target calibration parameters are used for the movable platform to use when executing a new job task.
According to the embodiment of the disclosure, calibration parameters of the camera at the current moment are obtained through calibration of multi-frame images collected by the camera on the movable platform, and target calibration parameters of the camera used by the movable platform when a new operation task is executed are determined according to the calibration parameters of the camera used by the movable platform when the historical operation task is executed and the calibration parameters of the camera at the current moment obtained through current calibration. The target calibration parameters of the camera are determined according to the calibration parameters at the historical moment and the calibration parameters at the current moment obtained by current calibration, but not fixed and unchangeable parameters, so that the calibration parameters of the camera can be optimized in an iterative mode in the using process, and the calibration parameters of the camera used by the movable platform in the actual operation process are more in line with the actual operation condition. Even if the lens of the camera is worn in the using process or is reassembled after being disassembled, the influence of the value of the camera parameter changed due to wear and reassembly on the image measuring result can be reduced by referring to the calibration parameter at the historical moment.
According to an embodiment of the present disclosure, the acquiring, by the processor 610, calibration parameters of the camera at the historical time includes: acquiring identification information of a camera; sending the identification information of the camera to the server so that the server can obtain calibration parameters of historical time corresponding to the camera according to the identification information of the camera; and receiving calibration parameters of historical moments corresponding to the camera from the server.
According to the embodiment of the disclosure, calibration parameters respectively used by a movable platform when executing different historical job tasks are stored in a server; the parameters include at least one of the following parameters of the camera: internal parameters, distortion parameters and external parameters.
According to an embodiment of the disclosure, the processor 610 further performs the following operations: after determining the target calibration parameters, the target calibration parameters are sent to the server so as to store the target calibration parameters corresponding to the camera in the server.
According to the embodiment of the disclosure, calibration parameters used by the movable platform for executing N times of historical job tasks are stored in the server, wherein N is a preset fixed value, and when the server stores target calibration parameters corresponding to the camera, the server deletes the calibration parameters corresponding to the historical job task which is farthest away from the current job task in time.
According to an embodiment of the present disclosure, the acquiring, by the processor 610, of the calibration parameters of the camera at the historical time further includes: acquiring geographic position information of the camera during the current job; and the sending of the identification information of the camera to the server includes: sending both the identification information and the geographic position information of the camera to the server, so that the server obtains the calibration parameters at the historical time corresponding to the camera according to the identification information and the geographic position information.
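Where geographic position information is also sent, the lookup request simply carries the job-site coordinates in addition to the camera identification. The sketch below extends the previous one; the endpoint and query parameter names are again hypothetical.

```python
import requests


def fetch_history_with_location(camera_id, lat, lon,
                                server="https://calib.example.com",
                                timeout=5.0):
    """Same lookup as before, but the request also carries the job-site
    coordinates so the server can prefer calibrations from nearby jobs.
    Endpoint and query parameter names are hypothetical."""
    resp = requests.get(
        f"{server}/cameras/{camera_id}/calibrations/latest",
        params={"lat": lat, "lon": lon},
        timeout=timeout,
    )
    resp.raise_for_status()
    return resp.json()
```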
According to an embodiment of the present disclosure, the determining, by the processor 610, of the target calibration parameters according to the calibration parameters of the camera at the historical time and the calibration parameters at the current time includes: calculating a weighted average of the calibration parameters at the historical time and the calibration parameters at the current time; and determining the weighted average as the target calibration parameters.
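A sketch of this weighted-average fusion is given below; the equal 0.5/0.5 weights are only an example, since the disclosure does not fix the weighting.

```python
import numpy as np


def fuse_weighted(historical, current, w_hist=0.5, w_curr=0.5):
    """Element-wise weighted average of two calibration dictionaries.

    The equal weights are an example only; the disclosure leaves the
    weighting to the implementer.
    """
    fused = {}
    for key in current:
        h = np.asarray(historical[key], dtype=float)
        c = np.asarray(current[key], dtype=float)
        fused[key] = (w_hist * h + w_curr * c) / (w_hist + w_curr)
    return fused
```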
According to an embodiment of the present disclosure, the determining, by the processor 610, of the target calibration parameters according to the calibration parameters of the camera at the historical time and the calibration parameters at the current time includes: calculating a first deviation between the calibration parameters at the historical time and the calibration parameters at the current time; comparing the first deviation with a first preset threshold; if the first deviation is smaller than the first preset threshold, determining the calibration parameters at the current time, or the weighted average of the calibration parameters at the historical time and the calibration parameters at the current time, as the target calibration parameters; and if the first deviation is greater than or equal to the first preset threshold, determining the calibration parameters at the historical time as the target calibration parameters.
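The first-deviation check can be sketched as follows, reusing `fuse_weighted` from the previous sketch. The deviation metric (a relative L2 norm over all parameters) and the threshold are assumptions, since the disclosure does not specify how the deviation is measured.

```python
import numpy as np


def fuse_with_deviation_check(historical, current, threshold, use_average=True):
    """Accept the current (or averaged) calibration only when it stays close
    to the historical one; otherwise fall back to the historical values.

    The deviation metric (relative L2 norm over all parameters) and the
    threshold are assumptions made for this sketch.
    """
    h = np.concatenate([np.ravel(historical[k]) for k in sorted(historical)])
    c = np.concatenate([np.ravel(current[k]) for k in sorted(current)])
    first_deviation = np.linalg.norm(c - h) / (np.linalg.norm(h) + 1e-12)

    if first_deviation < threshold:
        return fuse_weighted(historical, current) if use_average else current
    return historical  # large jump: distrust the current calibration
```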
According to an embodiment of the present disclosure, the processor 610 further performs the following operation: acquiring preset calibration parameters of the camera. In this case, the determining of the target calibration parameters according to the calibration parameters of the camera at the historical time and the calibration parameters at the current time includes: determining the target calibration parameters according to the preset calibration parameters, the calibration parameters at the historical time, and the calibration parameters at the current time.
According to an embodiment of the present disclosure, the determining, by the processor 610, of the target calibration parameters according to the preset calibration parameters, the calibration parameters at the historical time, and the calibration parameters at the current time includes: calculating a second deviation between the preset calibration parameters and the calibration parameters at the current time; comparing the second deviation with a second preset threshold; if the second deviation is smaller than the second preset threshold, determining the calibration parameters at the current time, or the weighted average of the calibration parameters at the historical time and the calibration parameters at the current time, as the target calibration parameters; and if the second deviation is greater than or equal to the second preset threshold, determining the calibration parameters at the historical time or the preset calibration parameters as the target calibration parameters.
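The preset-parameter variant adds a second check against the factory calibration; again the metric and threshold are assumptions, and `fuse_weighted` is reused from the earlier sketch.

```python
import numpy as np


def fuse_with_preset(preset, historical, current,
                     second_threshold, use_average=True):
    """Also checks the current calibration against the preset (factory)
    parameters; metric and threshold are assumptions for this sketch."""
    p = np.concatenate([np.ravel(preset[k]) for k in sorted(preset)])
    c = np.concatenate([np.ravel(current[k]) for k in sorted(current)])
    second_deviation = np.linalg.norm(c - p) / (np.linalg.norm(p) + 1e-12)

    if second_deviation < second_threshold:
        return fuse_weighted(historical, current) if use_average else current
    # Too far from the preset values: keep the historical (or preset)
    # values, matching the final branch of the text.
    return historical
```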
In particular, the processor 610 may comprise, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), or the like. The processor 610 may also include onboard memory for caching purposes. The processor 610 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
The memory 620 may include a computer program 621, which computer program 621 may include code/computer-executable instructions that, when executed by the processor 610, cause the processor 610 to perform a method flow such as that described above in connection with fig. 4-5, and any variations thereof.
The computer program 621 may be configured with, for example, computer program code comprising computer program modules. In an example embodiment, the code in the computer program 621 may include one or more program modules, such as module 621A, module 621B, and so on. It should be noted that the division and the number of the modules are not fixed; those skilled in the art may use any suitable program modules or combinations of program modules according to the actual situation. When these program modules are executed by the processor 610, they enable the processor 610 to perform, for example, the method flows described above in connection with fig. 4-5 and any variations thereof.
There is also provided, in accordance with an embodiment of the present disclosure, a readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to perform the camera parameter determination method described above. The readable storage medium may have the same function as the above-described memory.
The readable storage medium may be included in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The readable storage medium carries one or more programs which, when executed, implement a method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the readable storage medium may be a non-volatile readable storage medium, which may include, for example but not limited to: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
According to an embodiment of the present disclosure, there is also provided a control system of a movable platform. The control system may comprise one or more movable platforms, each of which may include a processor as described above, and a control terminal for controlling the movable platforms.
According to an embodiment of the present disclosure, there is also provided a movable platform, a processor of which may execute the camera parameter determination method described above.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present disclosure, and not for limiting the same; while the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; features in embodiments of the disclosure may be combined arbitrarily, without conflict; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present disclosure.

Claims (21)

1. A method for determining camera parameters, comprising:
acquiring multi-frame images collected by a camera on a movable platform, wherein the multi-frame images at least comprise two frames of images with partially overlapped images;
obtaining calibration parameters of the camera at the current moment according to the calibration of the multi-frame images;
acquiring calibration parameters of the camera at historical time, wherein the calibration parameters at the historical time are calibration parameters of the camera used by the movable platform when executing historical job tasks; and
determining target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, wherein the target calibration parameters are used by the movable platform when executing a new job task;
wherein the calibration parameters include one or more of the following parameters of the camera: internal parameters, distortion parameters and external parameters.
2. The method of claim 1, wherein the obtaining calibration parameters for the historical time of the camera comprises:
acquiring identification information of the camera;
sending the identification information of the camera to a server so that the server can obtain calibration parameters of historical time corresponding to the camera according to the identification information of the camera; and
receiving, from the server, the calibration parameters of the historical time corresponding to the camera.
3. The method of claim 2, wherein calibration parameters used by the movable platform when performing different historical job tasks are stored in the server.
4. The method of claim 2 or 3, further comprising:
after determining the target calibration parameters, sending the target calibration parameters to the server so as to store the target calibration parameters corresponding to the camera in the server.
5. The method according to claim 4, wherein the server stores therein calibration parameters used by the movable platform for each of N historical job tasks, wherein N is a preset fixed value, and when the server stores target calibration parameters corresponding to the camera, the server deletes the calibration parameters corresponding to the historical job task that is farthest from the current job task.
6. The method of claim 2, wherein the obtaining calibration parameters for the historical time of the camera comprises:
acquiring geographic position information of the camera during the current operation;
the sending the identification information of the camera to a server so that the server obtains the calibration parameters of the historical moment corresponding to the camera according to the identification information of the camera comprises:
sending the identification information of the camera and the geographical position information to the server so that the server obtains the calibration parameters of the historical moment corresponding to the camera according to the identification information of the camera and the geographical position information.
7. The method of claim 1, wherein determining target calibration parameters from calibration parameters at the historical time and calibration parameters at the current time of the camera comprises:
calculating a weighted average value of the calibration parameters at the historical moment and the calibration parameters at the current moment; and
determining the weighted average value as the target calibration parameter.
8. The method of claim 1, wherein determining target calibration parameters from calibration parameters at the historical time and calibration parameters at the current time of the camera comprises:
calculating a first deviation between the calibration parameters at the historical moment and the calibration parameters at the current moment;
comparing the first deviation with a first preset threshold;
if the first deviation is smaller than the first preset threshold, determining the calibration parameter at the current moment, or determining a weighted average value of the calibration parameter at the historical moment and the calibration parameter at the current moment as the target calibration parameter; and
if the first deviation is greater than or equal to the first preset threshold, determining the calibration parameters at the historical moment as the target calibration parameters.
9. The method of claim 1, further comprising:
acquiring preset calibration parameters of the camera;
wherein the determining target calibration parameters according to the calibration parameters of the camera at the historical time and the calibration parameters of the camera at the current time comprises:
determining the target calibration parameters according to the preset calibration parameters, the calibration parameters at the historical moment and the calibration parameters at the current moment.
10. The method according to claim 9, wherein the determining the target calibration parameter according to the preset calibration parameter, the calibration parameter at the historical time, and the calibration parameter at the current time comprises:
calculating a second deviation between the preset calibration parameter and the calibration parameter at the current moment;
comparing the second deviation with a second preset threshold;
if the second deviation is smaller than the second preset threshold, determining the calibration parameter at the current moment, or determining a weighted average value of the calibration parameter at the historical moment and the calibration parameter at the current moment as the target calibration parameter; and
if the second deviation is greater than or equal to the second preset threshold, determining the calibration parameters at the historical moment or the preset calibration parameters as the target calibration parameters.
11. A camera parameter determination apparatus, comprising:
a processor;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the processor, cause the processor to:
acquiring multi-frame images collected by a camera on a movable platform, wherein the multi-frame images at least comprise two frames of images with partially overlapped images;
obtaining calibration parameters of the camera at the current moment according to the calibration of the multi-frame images;
acquiring calibration parameters of the camera at historical time, wherein the calibration parameters at the historical time are calibration parameters of the camera used by the movable platform when executing historical job tasks; and
determining target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, wherein the target calibration parameters are used by the movable platform when executing a new job task.
12. The apparatus of claim 11, wherein the processor obtaining calibration parameters for historical moments of the camera comprises:
acquiring identification information of the camera;
sending the identification information of the camera to a server so that the server can obtain calibration parameters of historical time corresponding to the camera according to the identification information of the camera; and
receiving, from the server, the calibration parameters of the historical time corresponding to the camera.
13. The apparatus according to claim 12, wherein calibration parameters used by the movable platform respectively when executing different historical job tasks are stored in the server;
the parameters include at least one of the following parameters of the camera: internal parameters, distortion parameters and external parameters.
14. The apparatus of claim 12 or 13, wherein the processor further performs the following:
after determining the target calibration parameters, sending the target calibration parameters to the server so as to store the target calibration parameters corresponding to the camera in the server.
15. The apparatus according to claim 14, wherein the server stores the calibration parameters used by the movable platform for each of N historical job tasks, where N is a preset fixed value, and when the server stores the target calibration parameters corresponding to the camera, the server deletes the calibration parameters corresponding to the historical job task that is farthest in time from the current job task.
16. The apparatus of claim 12, wherein the processor obtaining calibration parameters for historical moments of the camera comprises:
acquiring geographic position information of the camera during the current operation;
the sending the identification information of the camera to a server so that the server obtains the calibration parameters of the historical moment corresponding to the camera according to the identification information of the camera comprises:
sending the identification information of the camera and the geographical position information to the server so that the server obtains the calibration parameters of the historical moment corresponding to the camera according to the identification information of the camera and the geographical position information.
17. The apparatus of claim 11, wherein the processor determining target calibration parameters from calibration parameters at a historical time and calibration parameters at the current time of the camera comprises:
calculating a weighted average value of the calibration parameters at the historical moment and the calibration parameters at the current moment; and
determining the weighted average value as the target calibration parameter.
18. The apparatus of claim 11, wherein the processor determining target calibration parameters from calibration parameters at a historical time and calibration parameters at the current time of the camera comprises:
calculating a first deviation between the calibration parameters at the historical moment and the calibration parameters at the current moment;
comparing the first deviation with a first preset threshold;
if the first deviation is smaller than the first preset threshold, determining the calibration parameter at the current moment, or determining a weighted average value of the calibration parameter at the historical moment and the calibration parameter at the current moment as the target calibration parameter; and
if the first deviation is greater than or equal to the first preset threshold, determining the calibration parameters at the historical moment as the target calibration parameters.
19. The apparatus of claim 11, wherein the processor further performs the following:
acquiring preset calibration parameters of the camera;
wherein the determining target calibration parameters according to the calibration parameters of the camera at the historical time and the calibration parameters of the camera at the current time comprises:
determining the target calibration parameters according to the preset calibration parameters, the calibration parameters at the historical moment and the calibration parameters at the current moment.
20. The apparatus of claim 19, wherein the processor determining the target calibration parameter according to the preset calibration parameter, the historical time calibration parameter, and the current time calibration parameter comprises:
calculating a second deviation between the preset calibration parameter and the calibration parameter at the current moment;
comparing the second deviation with a second preset threshold;
if the second deviation is smaller than the second preset threshold, determining the calibration parameter at the current moment, or determining a weighted average value of the calibration parameter at the historical moment and the calibration parameter at the current moment as the target calibration parameter; and
if the second deviation is greater than or equal to the second preset threshold, determining the calibration parameters at the historical moment or the preset calibration parameters as the target calibration parameters.
21. A readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 10.
CN202080005255.0A 2020-05-28 2020-05-28 Camera parameter determination method, device and readable storage medium Pending CN112771577A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/092939 WO2021237574A1 (en) 2020-05-28 2020-05-28 Camera parameter determination method and apparatus, and readable storage medium

Publications (1)

Publication Number Publication Date
CN112771577A true CN112771577A (en) 2021-05-07

Family

ID=75699527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080005255.0A Pending CN112771577A (en) 2020-05-28 2020-05-28 Camera parameter determination method, device and readable storage medium

Country Status (2)

Country Link
CN (1) CN112771577A (en)
WO (1) WO2021237574A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114152208A (en) * 2021-11-24 2022-03-08 燕山大学 DIC technology-based 4D printing flexible skin deformation efficiency evaluation method
WO2023241372A1 (en) * 2022-06-13 2023-12-21 华为技术有限公司 Camera intrinsic parameter calibration method and related device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114897994A (en) * 2022-05-30 2022-08-12 平安普惠企业管理有限公司 Camera multi-lens dynamic calibration method, device, equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107730551A (en) * 2017-01-25 2018-02-23 问众智能信息科技(北京)有限公司 The method and apparatus that in-vehicle camera posture is estimated automatically
US20190028688A1 (en) * 2017-11-14 2019-01-24 Intel Corporation Dynamic calibration of multi-camera systems using multiple multi-view image frames
CN110473262A (en) * 2019-08-22 2019-11-19 北京双髻鲨科技有限公司 Outer ginseng scaling method, device, storage medium and the electronic equipment of more mesh cameras

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10432855B1 (en) * 2016-05-20 2019-10-01 Gopro, Inc. Systems and methods for determining key frame moments to construct spherical images
CN110809781B (en) * 2018-11-15 2024-02-27 深圳市大疆创新科技有限公司 Image processing method, control terminal and storage medium
CN109947886B (en) * 2019-03-19 2023-01-10 腾讯科技(深圳)有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN110766760B (en) * 2019-10-21 2022-08-02 北京百度网讯科技有限公司 Method, device, equipment and storage medium for camera calibration

Also Published As

Publication number Publication date
WO2021237574A1 (en) 2021-12-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination