WO2021237574A1 - Camera parameter determination method, device, and readable storage medium - Google Patents

Camera parameter determination method, device, and readable storage medium

Info

Publication number
WO2021237574A1
WO2021237574A1 (PCT/CN2020/092939)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
calibration parameter
calibration
parameters
moment
Prior art date
Application number
PCT/CN2020/092939
Other languages
English (en)
French (fr)
Inventor
黄振昊
何纲
彭昭亮
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/092939 priority Critical patent/WO2021237574A1/zh
Priority to CN202080005255.0A priority patent/CN112771577A/zh
Publication of WO2021237574A1 publication Critical patent/WO2021237574A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Definitions

  • the present disclosure relates to the field of information processing, and in particular to a method for determining camera parameters, a device for determining camera parameters, and a readable storage medium.
  • In image measurement, a geometric model of camera imaging must be established in order to determine the mapping between an object's three-dimensional position and the object's corresponding point in the image.
  • the most commonly used geometric model is the pinhole model, which describes the relationship between projection and imaging after light passes through a pinhole.
  • the parameters of the geometric model include camera parameters.
  • the camera parameters need to be obtained through the calibration process. The stability of the camera parameters obtained through the calibration affects the accuracy of the image measurement.
  • the present disclosure provides a method for determining camera parameters, including: acquiring multiple frames of images collected by a camera on a movable platform, where the multiple frames include at least two frames whose content partially overlaps; calibrating, from the multiple frames, the camera's calibration parameters at the current moment; acquiring the camera's calibration parameters at a historical moment, where the calibration parameters at the historical moment are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determining target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, where the target calibration parameters are used when the movable platform performs a new task; the calibration parameters include one or more of the following: internal parameters, distortion parameters, and external parameters.
  • the present disclosure also provides a device for determining camera parameters, including: a processor; and a memory storing one or more programs which, when executed by the processor, cause the processor to: acquire multiple frames of images collected by a camera on a movable platform, where the multiple frames include at least two frames whose content partially overlaps; calibrate, from the multiple frames, the camera's calibration parameters at the current moment; acquire the camera's calibration parameters at a historical moment, where the calibration parameters at the historical moment are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determine target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, where the target calibration parameters are used by the movable platform when performing a new operation task.
  • the present disclosure also provides a readable storage medium with executable instructions stored thereon.
  • the processor executes the method for determining camera parameters as described above.
  • the camera's calibration parameters at the current moment are obtained by calibrating against the multiple frames of images collected by the camera on the movable platform, and the camera's calibration parameters at a historical moment, used by the movable platform when performing historical tasks, are acquired.
  • the calibration parameters at the historical moment and the calibration parameters estimated at the current moment together determine the target calibration parameters of the camera used by the movable platform when performing a new task. Because the target calibration parameters are determined from both the historical and current calibrations, rather than being fixed, the camera's calibration parameters can be iteratively optimized during use, so that the calibration parameters used in the actual operation of the mobile platform better match the actual operating conditions. Even if the camera lens is worn during use, or reassembled after disassembly, referencing the calibration parameters at historical moments reduces the impact of parameter changes caused by wear and reassembly on the image measurement results.
  • Fig. 1 schematically shows a schematic diagram of a pinhole model of a camera according to an embodiment of the present disclosure.
  • Fig. 2 schematically shows a schematic diagram of a distortion model of a camera according to an embodiment of the present disclosure.
  • Fig. 3 schematically shows an application scenario in which a method and device for determining camera parameters can be applied according to an embodiment of the present disclosure.
  • Fig. 4 schematically shows a flowchart of a method for determining camera parameters according to an embodiment of the present disclosure.
  • Fig. 5 schematically shows a flow chart of acquiring calibration parameters of a camera at a historical moment according to an embodiment of the present disclosure.
  • Fig. 6 schematically shows a block diagram of an apparatus for determining camera parameters according to an embodiment of the present disclosure.
  • the process of generating an image from an object captured by a camera is essentially converting the object located in the world coordinate system into an image located in the pixel coordinate system.
  • the world coordinate system is a three-dimensional rectangular coordinate system in which the relative spatial positions of the camera and the object can be described.
  • the pixel coordinate system is a two-dimensional rectangular coordinate system that reflects the arrangement of pixels in the camera's CCD/CMOS chip. Therefore, the geometric model of camera imaging represents the conversion process from the world coordinate system to the pixel coordinate system.
  • the conversion from the world coordinate system (O-X_wY_wZ_w) to the pixel coordinate system (o-uv) generally proceeds in two steps: first from the world coordinate system (O-X_wY_wZ_w) to the camera coordinate system (C-XYZ), and then from the camera coordinate system (C-XYZ) to the pixel coordinate system (o-uv).
  • the camera coordinate system (C-XYZ) is also a three-dimensional rectangular coordinate system, the origin is at the optical center C of the camera, the X and Y axes are parallel to the two sides of the image plane, and the Z axis is the optical axis of the camera.
  • the conversion from the world coordinate system (O-X_wY_wZ_w) to the camera coordinate system (C-XYZ) generally consists of a rotation and a translation:

    (X, Y, Z, 1)^T = [R t; 0^T 1] (X_w, Y_w, Z_w, 1)^T

  • the matrix R is the rotation matrix (Rotation Matrix) and the matrix T is the translation matrix (Translation Matrix); R and T are the camera's external parameters (Extrinsic Matrix), which express the conversion from the world coordinate system (O-X_wY_wZ_w) to the camera coordinate system (C-XYZ).
  • R is the 3×3 rotation matrix, t is the 3×1 translation vector, (X, Y, Z, 1)^T are the homogeneous coordinates in the camera coordinate system, and (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates in the world coordinate system.
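As an illustration of this rigid-body conversion, the following pure-Python sketch applies a rotation R and translation t to a world point. The function name and the example R, t values are hypothetical, chosen only for demonstration:

```python
# World -> camera conversion: p_cam = R @ p_world + t, written out
# for plain Python lists (no external dependencies).
def world_to_camera(p_world, R, t):
    """Apply the extrinsic transform: rotation R (3x3), then translation t (3x1)."""
    return [
        sum(R[i][j] * p_world[j] for j in range(3)) + t[i]
        for i in range(3)
    ]

# Example: rotate 90 degrees about the Z axis, then shift by t.
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
t = [1.0, 0.0, 2.0]

p_cam = world_to_camera([1.0, 0.0, 0.0], R, t)
print(p_cam)  # [1.0, 1.0, 2.0]
```

The same operation is usually written with homogeneous coordinates and a single 4×4 matrix, as in the equation above; splitting it into R and t keeps the sketch short.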
  • the more commonly used geometric model for converting from the camera coordinate system (C-XYZ) to the pixel coordinate system (o-uv) is the pinhole model, which describes the relationship between projection and imaging after light passes through the pinhole.
  • the image plane center of the image can be used as the origin p
  • the x-axis and y-axis parallel to the u-axis and the v-axis can be used to establish the image coordinate system (p-xy).
  • in the pixel coordinate system, the origin p has coordinates (u_0, v_0).
  • alternatively, the present disclosure may skip establishing the image coordinate system (p-xy) and directly translate the coordinate origin o of the pixel coordinate system (o-uv) to the image center p.
  • the origin p of the image coordinate system is the intersection point of the optical axis of the camera and the image plane of the image, that is, the principal point of the camera, which is located at the center of the image.
  • the pixel coordinate system and the image coordinate system are essentially translational, and can be transformed by translation.
  • the image coordinate system (p-xy) is converted to the pixel coordinate system (o-uv) as follows: u = x/dx + u_0, v = y/dy + v_0
  • dx and dy are the physical sizes of a pixel along the x-axis and y-axis directions, respectively, and (u_0, v_0) are the coordinates of the principal point.
  • the conversion from the camera coordinate system (C-XYZ) to the image coordinate system (p-xy) can be described using a pinhole model.
  • Fig. 1 schematically shows a schematic diagram of a pinhole model of a camera according to an embodiment of the present disclosure.
  • the pinhole model of the camera reflects the transformation relationship between the camera coordinate system (C-XYZ) and the image coordinate system (p-xy).
  • any point M in the camera coordinate system (C-XYZ) corresponds to an image point m in the image coordinate system (p-xy): the line CM connecting M and the camera optical center C intersects the image plane at the image point m, so m is the projection of the spatial point M onto the image plane.
  • the conversion from the camera coordinate system (C-XYZ) to the image coordinate system (p-xy) corresponds to a perspective projection, represented by the following matrix equation:

    s (x, y, 1)^T = [f 0 0 0; 0 f 0 0; 0 0 1 0] (X, Y, Z, 1)^T

  • s is a non-zero scale factor, f is the effective focal length of the camera (the distance from the optical center to the image plane), (X, Y, Z, 1)^T are the homogeneous coordinates in the camera coordinate system, and (x, y, 1)^T are the homogeneous coordinates in the image coordinate system.
  • combining the transformation from the world coordinate system to the camera coordinate system, the transformation from the camera coordinate system to the image coordinate system, and the transformation from the image coordinate system to the pixel coordinate system yields the transformation from the world coordinate system to the pixel coordinate system.
  • the matrix K in this combined transformation is called the camera's internal parameters, i.e., the camera calibration matrix, and R and T are the camera's external parameters. The camera internal parameters therefore generally include the focal length f, the position (u_0, v_0) of the image principal point in the pixel coordinate system, and the pixel sizes dx and dy.
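The camera-to-pixel chain just described (perspective projection x = fX/Z, y = fY/Z, followed by u = x/dx + u_0, v = y/dy + v_0) can be sketched in pure Python. The focal length, pixel size, and principal-point values below are illustrative, not from the disclosure:

```python
# Pinhole projection from camera coordinates to pixel coordinates:
# perspective divide (x = f*X/Z, y = f*Y/Z), then the pixel-grid
# conversion u = x/dx + u0, v = y/dy + v0.
def camera_to_pixel(X, Y, Z, f, dx, dy, u0, v0):
    x = f * X / Z          # image-plane coordinate (physical units)
    y = f * Y / Z
    u = x / dx + u0        # pixel coordinate
    v = y / dy + v0
    return u, v

# Example: 4 mm focal length, 2 um square pixels, principal point (960, 540).
u, v = camera_to_pixel(X=0.1, Y=0.05, Z=10.0, f=0.004,
                       dx=2e-6, dy=2e-6, u0=960.0, v0=540.0)
print(u, v)  # approximately (980.0, 550.0)
```

Note that f/dx and f/dy together act as the focal length expressed in pixels, which is why intrinsic matrices are often written with fx and fy directly.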
  • the distortion model can be used to describe the distortion of the light during the projection process.
  • the distortion model can generally be characterized by distortion parameters.
  • Distortion parameters generally include radial distortion parameters k1, k2, k3, and tangential distortion parameters p1, p2.
  • Radial distortion manifests as barrel distortion or pincushion distortion; it is generally caused by light rays bending more the farther they pass from the center of the lens. Tangential distortion is generally caused by the lens not being perfectly parallel to the image plane.
  • Fig. 2 schematically shows a schematic diagram of a distortion model of a camera according to an embodiment of the present disclosure.
  • the theoretical position of the pixel is on an arc, but because the light is deflected away from the center of the lens and the lens is not completely parallel to the image plane, the position of the pixel is deviated radially and tangentially.
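The radial (k1, k2, k3) and tangential (p1, p2) distortion described above is commonly written in the Brown–Conrady form sketched below. This is a general illustration of that standard model, not necessarily the disclosure's exact formulation, and the coefficient values are made up:

```python
# Brown-Conrady style distortion applied to normalized image
# coordinates (x, y): radial terms k1, k2, k3 and tangential terms p1, p2.
def distort(x, y, k1, k2, k3, p1, p2):
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# With all coefficients zero the point is unchanged.
print(distort(0.5, -0.25, 0, 0, 0, 0, 0))   # (0.5, -0.25)
# A small positive k1 pushes points outward, growing with distance
# from the image center (pincushion-like behavior).
print(distort(0.5, -0.25, 0.1, 0, 0, 0, 0))
```

Calibration estimates these coefficients so that measured pixel positions can be undistorted before metric computations.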
  • the camera external parameters can be obtained during the operation.
  • the camera internal parameters and distortion parameters should generally be calibrated in advance.
  • the camera internal parameters and distortion parameters must be entered into the mapping software as initial values and iterated within the mapping algorithm.
  • a commonly used camera parameter calibration method is to calibrate the camera parameters at the factory using a checkerboard and a visual calibration algorithm (such as those provided by OpenCV), and then either write the calibrated camera parameters into the camera's storage area or provide test reports and certificates.
  • OpenCV has built-in calibration algorithms for camera internal parameters and distortion parameters; by shooting the checkerboard from different angles, the calibrated camera internal parameters and distortion parameters can be obtained.
  • during use, however, the camera parameters may change and come to differ from the preset calibration values, which causes problems for subsequent mapping processing and affects the accuracy of image measurement.
  • the exchangeable image file format (Exif) is designed specifically for digital camera photos; it records the attribute information and shooting data of a digital photo and can be attached to JPEG, TIFF, RIFF and other files to add shooting-related content, thumbnails, and version information of the image-processing software. If the camera lens is worn during use, or reassembled after disassembly, the camera's parameters no longer match the calibration values given by the factory, which causes problems for subsequent image processing.
  • conventionally, the camera's internal parameters and distortion parameters cannot be iterated on their own: if the lens or camera is disassembled, or has been in use for too long, generally the device can only be returned to the factory for recalibration, and until that recalibration the camera internal parameters and distortion parameters deviate significantly from their true values.
  • the embodiments of the present disclosure provide a method and device for determining camera parameters.
  • the method includes: acquiring multiple frames of images collected by a camera on a movable platform, wherein the multiple frames include at least two frames whose content partially overlaps; obtaining the camera's calibration parameters at the current moment by calibrating against the multiple frames; acquiring the camera's calibration parameters at a historical moment, where the calibration parameters at the historical moment are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determining target calibration parameters according to the calibration parameters at the historical moment and the calibration parameters at the current moment, where the target calibration parameters are used by the movable platform when performing a new operation task.
  • the following will take the mobile platform as an unmanned aerial vehicle as an example to describe the application scenarios of the camera parameter determination method and device.
  • FIG. 3 schematically shows an application scenario in which a method and device for determining camera parameters can be applied according to an embodiment of the present disclosure. It should be noted that FIG. 3 is only an example of a scenario to which the embodiments of the present disclosure can be applied, intended to help those skilled in the art understand the technical content of the present disclosure; it does not mean that the embodiments of the present disclosure cannot be used in other devices, systems, environments, or scenarios.
  • the user 301 can control the drone 302 to fly through the remote control terminal, and the camera on the drone 302 can collect multiple frames of images in the real environment.
  • the multiple frames of images collected by the camera include at least two frames whose content partially overlaps.
  • for example, two frames captured by the camera both contain the same tree, or the same mountain; the overlapping content may be that tree or mountain.
  • the drone 302 can send the images collected by the camera to the remote control terminal for processing, or process them locally. By processing the multiple frames of images collected by the camera, the camera's calibration parameters at the current moment can be obtained.
  • the drone 302 can obtain the camera's calibration parameters at historical moments from the server 303, where the calibration parameters at a historical moment are the calibration parameters of the camera used by the drone 302 when performing historical tasks. Then, according to the calibration parameters at the historical moment and the calibration parameters at the current moment, the target calibration parameters used by the UAV 302 when performing a new operation task are determined.
  • the control terminal may obtain the camera's calibration parameters at a historical moment from the server 303 in advance and cache them locally; the drone 302 then obtains those calibration parameters from the control terminal.
  • the drone 302 may also obtain the calibration parameters of the aforementioned camera at historical moments from the storage medium at the local end of the drone 302.
  • the present disclosure has various methods for obtaining the calibration parameters of the aforementioned camera at a historical time, and may not be limited to obtaining the calibration parameters of the aforementioned camera at a historical time from the server 303.
  • in this way, the camera's calibration parameters can be iteratively optimized during use, so that the calibration parameters used by the UAV 302 in actual operation better match the actual operating conditions. Even if the camera lens is worn during use, or reassembled after disassembly, referencing the calibration parameters at historical moments reduces the impact of parameter changes caused by wear and reassembly on the image measurement results.
  • Fig. 4 schematically shows a flowchart of a method for determining camera parameters according to an embodiment of the present disclosure.
  • the method for determining camera parameters includes operations S410 to S440.
  • a multi-frame image collected by a camera on a movable platform is acquired, where the multi-frame image collected by the camera includes at least two frames of images with partially overlapping images.
  • the position and posture of the camera on the movable platform can be changed to obtain multiple frames of images.
  • for example, the camera is controlled to shoot the same object from different angles at three different positions, and each of the three collected frames contains at least part of the object's pixels; further, the object's position in the image differs across the frames collected at different positions.
  • the calibration parameters of the camera at the current moment can be obtained according to the multi-frame image calibration during the aerial triangulation.
  • the calibration parameters can be one or more of the camera internal parameters, distortion parameters, and camera external parameters.
  • during aerial triangulation, the camera internal parameters and distortion parameters are treated as unknowns and nonlinear optimization is performed, so as to iterate toward more precisely optimized camera internal parameters and distortion parameters.
  • the acquired multi-frame images can also be processed by the calibration algorithm, the calibration points in the images are identified, and the calibration parameters of the camera at the current time can be obtained by calibrating in combination with the three-dimensional position of the calibration points.
  • the calibration parameters of the historical moments of the camera are acquired, where the calibration parameters of the historical moments are the calibration parameters of the camera used by the movable platform when performing historical tasks.
  • the acquired calibration parameters of historical moments may include one or more.
  • the calibration parameters of the multiple historical moments may be the calibration parameters used by the movable platform when performing multiple historical job tasks.
  • the calibration parameter of the historical moment may be the calibration parameter of the historical moment corresponding to the historical job that is closest to the time of the current job.
  • the acquired calibration parameter at a historical moment may also be a combined value; for example, the weighted average of the calibration parameters used by the movable platform when performing multiple historical tasks.
  • the target calibration parameter is determined according to the calibration parameter at the historical moment and the calibration parameter at the current moment, where the target calibration parameter is used by the movable platform when performing a new work task.
  • the determined target calibration parameters can be used to perform tasks such as map building. After the new task is executed, that is, at a future time, the currently determined target calibration parameter can be used as the calibration parameter at the historical time. Therefore, at the future time, the target calibration parameter at the future time can be re-determined.
  • determining the target calibration parameter according to the camera's calibration parameter at the historical moment and its calibration parameter at the current moment may include the following embodiments.
  • the weighted average of the calibration parameters at the historical moment and the calibration parameters at the current moment can be calculated; and then the weighted average is determined as the target calibration parameter.
  • the weight of the calibration parameters at each historical moment can be determined according to their calibration time. For example, calibration parameters from historical moments closer to the current working time receive higher weights, those from moments farther away receive lower weights, and a comprehensive weighted average is calculated. Assigning different weights to the calibration parameters at different historical moments prevents parameters that are too old from degrading the accuracy of the target calibration parameters.
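A minimal sketch of such a recency-weighted average, assuming a simple linear weighting scheme; the scheme and the sample focal-length values are illustrative, as the disclosure does not prescribe specific weights:

```python
# Recency-weighted average of one calibration parameter (e.g. focal
# length F in pixels) across historical jobs: newer calibrations get
# higher weight. The linear 1..N weighting here is an assumption.
def weighted_calibration(history):
    """history: list of parameter values, oldest first."""
    weights = range(1, len(history) + 1)   # 1 for oldest, N for newest
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, history)) / total

# Five historical focal-length calibrations, oldest to newest.
focal_history = [3500.0, 3502.0, 3501.0, 3505.0, 3506.0]
print(round(weighted_calibration(focal_history), 2))  # 3503.8
```

Any decaying scheme (e.g. exponential) works the same way; the key property is that recent calibrations dominate while stale ones fade.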
  • the first deviation between the calibration parameter at the historical moment and the calibration parameter at the current moment may first be calculated and compared with a first preset threshold. If the first deviation is less than the first preset threshold, the calibration parameter at the current moment, or the weighted average of the calibration parameter at the historical moment and the calibration parameter at the current moment, is determined as the target calibration parameter; if the first deviation is greater than or equal to the first preset threshold, the calibration parameter at the historical moment is determined as the target calibration parameter.
  • for example, the calibration parameter at the historical moment can be subtracted from the calibration parameter at the current moment to obtain the first deviation, which is then compared with the first preset threshold. If the first deviation is greater than or equal to the first preset threshold, the optimization process or the initial camera parameters are considered erroneous.
  • the calibration parameters at the historical time can be determined as the target calibration parameters.
  • the user can also be prompted to re-process the image data, re-acquire the image data, or return the equipment to the factory for re-calibration, etc. If the first deviation is less than the first preset threshold, it is considered that the optimization process is normal, and the calibration parameter at the current moment, or the weighted average of the calibration parameter at the historical moment and the calibration parameter at the current moment may be determined as the target calibration parameter.
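The first-deviation check above can be sketched as follows for a single scalar parameter; the threshold and parameter values are illustrative only:

```python
# Selecting the target calibration parameter: if the newly calibrated
# value deviates too far from the historical value, the optimization is
# treated as suspect and the historical value is kept instead.
def select_target(historical, current, threshold):
    first_deviation = abs(historical - current)
    if first_deviation < threshold:
        return current          # optimization looks normal
    return historical           # deviation too large: keep history

print(select_target(historical=3500.0, current=3502.0, threshold=10.0))  # 3502.0
print(select_target(historical=3500.0, current=3550.0, threshold=10.0))  # 3500.0
```

In the large-deviation branch a real system would also raise the user prompt described above (re-process or re-acquire the images, or return the device for recalibration).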
  • the camera's preset calibration parameters can also be acquired, and the target calibration parameters are then determined according to the preset calibration parameters, the calibration parameters at the historical moment, and the calibration parameters at the current moment.
  • the preset calibration parameter may be a calibration parameter set for the camera that the movable platform receives a user input, or may be a factory calibration parameter of the camera, and so on.
  • determining the target calibration parameter according to the preset calibration parameter, the calibration parameter at the historical moment, and the calibration parameter at the current moment includes: calculating a second deviation between the preset calibration parameter and the calibration parameter at the current moment; comparing the second deviation with a second preset threshold; if the second deviation is less than the second preset threshold, determining the calibration parameter at the current moment, or the weighted average of the calibration parameter at the historical moment and the calibration parameter at the current moment, as the target calibration parameter; and if the second deviation is greater than or equal to the second preset threshold, determining the calibration parameter at the historical moment or the preset calibration parameter as the target calibration parameter.
  • for example, the preset calibration parameter can be subtracted from the calibration parameter at the current moment to obtain the second deviation, which is then compared with the second preset threshold. If the second deviation is greater than or equal to the second preset threshold, the optimization process or the initial camera parameters are considered erroneous, and the calibration parameters at the historical moment or the preset calibration parameters can be determined as the target calibration parameters.
  • if the second deviation is less than the second preset threshold, the optimization process is considered normal, and the calibration parameter at the current moment, or the weighted average of the calibration parameter at the historical moment and the calibration parameter at the current moment, may be determined as the target calibration parameter.
  • when the estimated value is found to be inconsistent with the initial value, the user can be prompted, avoiding mapping errors caused by large changes in camera parameters.
  • the size of the first preset threshold and the second preset threshold may be adjusted according to actual experience or measurement results.
  • when the first deviation is less than the first preset threshold, and/or the second deviation is less than the second preset threshold, the accuracy of the calibration parameters at the current moment can be considered relatively high. Therefore, the calibration parameters at the current moment and/or the target calibration parameters are saved for reference when performing the next job task.
  • the target calibration parameter can be sent to the cloud server, so that the target calibration parameter corresponding to the camera is stored in the cloud server.
  • for example, the calibration parameters at the current moment and/or the target calibration parameters can be uploaded to the cloud server, or saved locally on the drone or in the control terminal.
  • the parameters can be stored in a corresponding area of the cloud server, indexed by the lens ID number of the camera.
  • the cloud server stores the calibration parameters respectively used by the mobile platform when executing different historical tasks.
  • the calibration parameters include at least one of the following parameters of the camera: internal parameters, distortion parameters, and external parameters.
  • the data format stored in the cloud server is shown in Table 1.
  • F represents the focal length; Cx and Cy represent the position of the image principal point in the pixel coordinate system; k1, k2, and k3 represent the radial distortion parameters; and p1 and p2 represent the tangential distortion parameters.
  • the server stores the calibration parameters used each time the mobile platform executed N historical job tasks, where N is a preset fixed value. When the server stores new target calibration parameters corresponding to the camera, it can delete the calibration parameters corresponding to the historical job task furthest in time from the current job task.
  • the cloud server may only retain the information uploaded for the last 5 times/10 times, that is, N is 5 times/10 times.
  • the newly uploaded data are the camera internal parameters and distortion parameters optimized in the last valid job, and they can replace the set of parameters furthest in time from the current valid job; for example, all the parameters corresponding to the (n-5)-th job are deleted.
  • a weighted average calculation can be performed based on the latest 5 times/10 times data, and the weights can be assigned according to the higher weight of the closer to the current test time and the lower weight of the longer time, and calculate Comprehensive weighted average camera internal parameters (F avg , Cx avg , Cy avg ) and distortion parameters (k1 avg , k2 avg , k3 avg , p1 avg , p2 avg ).
  • the comprehensive weighted average parameter value can be obtained, which can prevent the parameter that is too long from affecting the accuracy of the parameter.
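The recency-weighted averaging described above can be sketched as follows; this is a minimal illustration, and the linear 1..n weighting scheme and the dictionary layout are assumptions, since the text fixes only that more recent jobs receive higher weights:

```python
def weighted_average_params(history):
    """history: list of parameter dicts, oldest first, e.g.
    [{"F": 3950.0, "Cx": 2736.2, ...}, ...]. More recent sets get
    higher weights (1, 2, ..., n), then the sum is normalized."""
    n = len(history)
    weights = [i + 1 for i in range(n)]          # oldest -> 1, newest -> n
    total = sum(weights)
    keys = history[0].keys()
    return {k: sum(w * h[k] for w, h in zip(weights, history)) / total
            for k in keys}

# Three uploads; the newest focal-length estimate is weighted most heavily
recent = [{"F": 3950.0}, {"F": 3952.0}, {"F": 3954.0}]
avg = weighted_average_params(recent)
```

The same routine applies unchanged to the full parameter set (F, Cx, Cy, k1..k3, p1, p2) stored per job.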
• the calibration parameters of the camera at the current moment are obtained by calibration from the multi-frame images collected by the camera on the movable platform, and the calibration parameters of the camera at historical moments, used by the movable platform when performing historical tasks, are obtained.
• the target calibration parameters of the camera, used by the movable platform when performing a new task, are determined from the calibration parameters at the historical moments and the calibration parameters at the current moment obtained by the current calibration. Because the target calibration parameters are determined from both, rather than being fixed, the camera's calibration parameters can be iteratively optimized during use, so that the calibration parameters used in actual operation better match the actual operating conditions. Even if the camera lens wears during use, or is reassembled after disassembly, referencing the calibration parameters at historical moments reduces the impact on image measurement results of parameter changes caused by wear and reassembly.
• the user may be prompted whether to read the comprehensive weighted-average camera intrinsic and distortion parameters stored in the cloud server. If the user chooses to read them, the lens ID number "Lens_ID" is uploaded to the cloud server and searched; the comprehensive weighted-average camera intrinsic parameters (F_avg, Cx_avg, Cy_avg) and distortion parameters (k1_avg, k2_avg, k3_avg, p1_avg, p2_avg), and/or the intrinsic and distortion parameters used in each historical job, are used as the initial values for this mapping job and applied in the aerial triangulation process; the calibration parameters at the current moment are then obtained by calibration from the acquired images.
• when the next mapping job is performed, the movable platform can also directly request the comprehensive weighted-average camera intrinsic and distortion parameters, and/or the intrinsic and distortion parameters used in each historical job, from the cloud server without prompting the user.
• if the user chooses not to read them, the original calibration values stored in the lens, (F_0, Cx_0, Cy_0) and (k1_0, k2_0, k3_0, p1_0, p2_0), are used as the initial parameters for iteration.
• the unique lens ID number can be written into the lens cache.
• the keyword can be, for example, "Lens_ID", and the ID number can be uniquely identified by information such as the part number, factory date, and a special code.
• a "return-to-factory inspection count" field can be reserved in the lens cache, with an initial value of 0.
• a checkerboard can be used at the factory to calibrate the camera's intrinsic and distortion parameters, obtaining the initially calibrated camera intrinsic parameters (F_0, Cx_0, Cy_0) and distortion parameters (k1_0, k2_0, k3_0, p1_0, p2_0).
• the initially calibrated camera intrinsic and distortion parameters are written into the lens cache.
• the camera's initial intrinsic and distortion parameters can be written in sequence into the "undistort parameter" field of the exif part of the image.
• when the lens is returned to the factory for recalibration, the initial camera intrinsic and distortion parameters (F_0, Cx_0, Cy_0) and (k1_0, k2_0, k3_0, p1_0, p2_0) can be reset, and the value of the "return-to-factory inspection count" field incremented by 1.
• the initial values in the parameter history table corresponding to this lens ID number can be reset on the cloud server, and all remaining values cleared.
  • Fig. 5 schematically shows a flow chart of acquiring calibration parameters of a camera at a historical moment according to an embodiment of the present disclosure.
  • obtaining the calibration parameters of the camera at the historical moment includes operation S510 to operation S530.
  • the unique ID of the camera lens can be recorded in the exif field of the image, and the keyword can be "Lens_ID" as the unique identification information.
  • the identification information of the camera is sent to the server, so that the server obtains the calibration parameters of the historical moment corresponding to the camera according to the identification information of the camera.
  • the calibration parameters used by the camera in multiple historical tasks can be stored in the server.
  • the parameters corresponding to the identification information of the camera are stored in the area corresponding to the server, see Table 1, which will not be repeated here.
  • a calibration parameter corresponding to the camera at a historical time is received from the server.
  • the server can look up a table according to the identification information of the camera to obtain the calibration parameters of the historical moment corresponding to the camera.
  • the calibration parameter at a historical moment may be a comprehensive weighted average as shown in Table 1, or may be data corresponding to the calibration parameter at multiple historical moments.
• the server can also obtain the camera's calibration parameters at the current moment, and determine the target calibration parameters locally on the server according to the calibration parameters at the historical moments and at the current moment.
  • the type of work task performed by the drone is not limited, and the work task may be a task that involves mapping.
  • the work task may be a surveying task, a patrol task or a tracking task, and so on.
• since the calibration parameters of the camera at the current moment are obtained by calibration from the multiple frames of images collected by the camera, they are associated with the images themselves.
• in the related art, the camera parameters used when performing tasks in different location areas are generally the same; but because the images of different location areas differ, using the same camera parameters to perform tasks in different location areas affects the accuracy of image measurement, resulting in poor mapping results.
• the geographic location information of the camera during the current job can be obtained, and the identification information and geographic location information of the camera can then be sent to the server.
• the server can obtain the calibration parameters at historical moments corresponding to the camera according to the camera's identification information and geographic location information.
  • the server may store calibration parameters used by the camera in multiple historical tasks, and each historical task has corresponding geographic location information.
  • the data format stored in the server is shown in Table 2.
• L_0 represents the geographic location information at factory calibration
• L_{n-4} to L_n represent the geographic location information corresponding to each execution of a task.
• the server can find the data corresponding to the camera according to the camera's identification information, and then, according to the geographic location information of the camera during the current job, obtain the calibration parameters of the historical moment corresponding to a geographic location that is the same as or close to the current one. For example, if the geographic location information that is the same as or close to that of the current job is L_{n-4}, the calibration parameters of the historical moment corresponding to the (n-4)-th job can be determined as the calibration parameters of the historical moment corresponding to the camera.
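The server-side lookup by lens ID and job location described above can be sketched as follows; the record layout loosely mirrors Tables 1 and 2, and the Euclidean distance test and `max_dist` threshold are illustrative assumptions:

```python
def find_historical_params(db, lens_id, location, max_dist=1.0):
    """db maps a lens ID to a list of job records, each holding the
    calibration parameters and the (x, y) location of that job.
    Returns the parameters of the job closest to `location`, or None
    when the lens is unknown or no job is close enough."""
    records = db.get(lens_id)
    if not records:
        return None

    def dist(rec):
        lx, ly = rec["location"]
        return ((lx - location[0]) ** 2 + (ly - location[1]) ** 2) ** 0.5

    best = min(records, key=dist)
    return best["params"] if dist(best) <= max_dist else None

db = {"Lens_ID_001": [
    {"location": (10.0, 20.0), "params": {"F": 3950.0}},
    {"location": (50.0, 60.0), "params": {"F": 3948.0}},
]}
params = find_historical_params(db, "Lens_ID_001", (10.2, 20.1))
```

A real deployment would key on the drone's reported coordinates and fall back to the comprehensive weighted average when no nearby job exists.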
  • the calibration parameters of the historical moment corresponding to the camera are obtained from the server according to the identification information and geographic location information of the camera.
  • the calibration parameters of the camera can be optimized iteratively during use, and adapt to the operating environment during the current operation, making it more in line with the actual operating conditions, improving the accuracy of image measurement, and achieving a better mapping effect.
  • Fig. 6 schematically shows a block diagram of an apparatus for determining camera parameters according to an embodiment of the present disclosure.
  • the camera parameter determination device 600 includes a processor 610 and a memory 620.
• the memory 620 is configured to store one or more programs which, when executed by the processor, cause the processor to perform the following operations: acquire multiple frames of images collected by a camera on a movable platform, where the multiple frames include at least two frames with partially overlapping imagery; calibrate the camera's calibration parameters at the current moment according to the multiple frames; acquire the camera's calibration parameters at historical moments, where those parameters are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determine the target calibration parameters according to the calibration parameters at the historical moments and at the current moment, where the target calibration parameters are used by the movable platform when performing a new task.
• the calibration parameters of the camera at the current moment are obtained by calibration from the multiple frames of images collected by the camera on the movable platform, and the calibration parameters at historical moments, used by the movable platform when executing historical tasks, are obtained.
• the target calibration parameters of the camera, used by the movable platform when performing a new task, are determined from the calibration parameters at the historical moments and the calibration parameters at the current moment obtained by the current calibration. Because the target calibration parameters are determined from both, rather than being fixed, the camera's calibration parameters can be iteratively optimized during use, so that the parameters used in actual operation better match the actual operating conditions.
• the processor 610 acquiring the camera's calibration parameters at historical moments includes: acquiring the camera's identification information; sending the identification information to the server, so that the server obtains the calibration parameters of the historical moments corresponding to the camera according to the camera's identification information; and receiving the calibration parameters of the historical moments corresponding to the camera from the server.
• the server stores the calibration parameters respectively used by the movable platform when executing different historical tasks; the parameters include at least one of the following parameters of the camera: intrinsic parameters, distortion parameters, and extrinsic parameters.
• the processor 610 further performs the following operation: after determining the target calibration parameters, sending them to the server so that the target calibration parameters corresponding to the camera are stored in the server.
• the server stores the calibration parameters used each time the movable platform executed the last N historical tasks, where N is a preset fixed value. When the server stores the target calibration parameters corresponding to the camera, the server deletes the calibration parameters corresponding to the historical task furthest in time from the current task.
• the processor 610 acquiring the calibration parameters of the camera at historical moments includes: acquiring the geographic location information of the camera during the current job; wherein sending the camera's identification information to the server, so that the server obtains the calibration parameters of the historical moments corresponding to the camera according to the identification information, includes: sending the identification information and geographic location information of the camera to the server, so that the server obtains the calibration parameters of the historical moments corresponding to the camera according to the camera's identification information and geographic location information.
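The fixed-size history above (keep only the last N uploads, silently dropping the set furthest from the current job) can be sketched with a bounded deque; N and the record shape are illustrative assumptions:

```python
from collections import deque

class CalibrationHistory:
    """Keeps at most N calibration parameter sets per lens; appending
    beyond N drops the oldest entry, matching the rolling storage
    described in the text."""
    def __init__(self, n=5):
        self.n = n
        self.store = {}  # lens_id -> deque of parameter dicts

    def add(self, lens_id, params):
        self.store.setdefault(lens_id, deque(maxlen=self.n)).append(params)

    def latest(self, lens_id):
        d = self.store.get(lens_id)
        return list(d) if d else []

hist = CalibrationHistory(n=5)
for i in range(7):                      # upload 7 jobs; only the last 5 survive
    hist.add("Lens_ID_001", {"F": 3950.0 + i})
kept = hist.latest("Lens_ID_001")
```

`deque(maxlen=N)` gives the delete-oldest behavior for free, which is why it is a natural fit for this per-lens table.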
• the processor 610 determining the target calibration parameters according to the camera's calibration parameters at the historical moment and at the current moment includes: calculating a weighted average of the calibration parameters at the historical moment and at the current moment; and determining the weighted average as the target calibration parameters.
• the processor 610 determining the target calibration parameters according to the camera's calibration parameters at the historical moment and at the current moment includes: calculating a first deviation between the calibration parameters at the historical moment and at the current moment; comparing the first deviation with a first preset threshold; if the first deviation is less than the first preset threshold, determining the calibration parameters at the current moment, or a weighted average of the calibration parameters at the historical moment and at the current moment, as the target calibration parameters; and if the first deviation is greater than or equal to the first preset threshold, determining the calibration parameters at the historical moment as the target calibration parameters.
• the processor 610 further performs the following operation: acquiring preset calibration parameters of the camera; wherein determining the target calibration parameters according to the camera's calibration parameters at the historical moment and at the current moment includes: determining the target calibration parameters according to the preset calibration parameters, the calibration parameters at the historical moment, and the calibration parameters at the current moment.
• the processor 610 determining the target calibration parameters according to the preset calibration parameters, the calibration parameters at the historical moment, and the calibration parameters at the current moment includes: calculating a second deviation between the preset calibration parameters and the calibration parameters at the current moment; comparing the second deviation with a second preset threshold; if the second deviation is less than the second preset threshold, determining the calibration parameters at the current moment, or a weighted average of the calibration parameters at the historical moment and at the current moment, as the target calibration parameters; and if the second deviation is greater than or equal to the second preset threshold, determining the calibration parameters at the historical moment or the preset calibration parameters as the target calibration parameters.
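The two threshold comparisons described above can be sketched as a single decision routine; the concrete thresholds, the absolute difference on a single focal-length value, and the choice to blend accepted values with history are illustrative assumptions:

```python
def choose_target_F(current, historical, preset=None, t1=5.0, t2=5.0):
    """Returns the target focal length given the current estimate, the
    historical value, and optionally the factory preset. A large
    deviation suggests a faulty optimization or bad initial parameters,
    so the trusted older value is kept instead of the fresh estimate."""
    if preset is not None and abs(preset - current) >= t2:
        return preset                    # second deviation too large
    if abs(historical - current) >= t1:
        return historical                # first deviation too large
    return 0.5 * (historical + current)  # accept, blend with history

target = choose_target_F(current=3954.0, historical=3952.0, preset=3950.0)
```

In the rejected branches a real system would also prompt the user to re-process or re-collect the image data, as the text notes.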
  • the processor 610 may include, for example, a general-purpose microprocessor, an instruction set processor, and/or a related chipset and/or a special-purpose microprocessor (for example, an application specific integrated circuit (ASIC)), and so on.
  • the processor 610 may also include on-board memory for caching purposes.
  • the processor 610 may be a single processing unit or multiple processing units for executing different actions of a method flow according to an embodiment of the present disclosure.
  • the memory 620 may include a computer program 621, and the computer program 621 may include code/computer executable instructions, which when executed by the processor 610 causes the processor 610 to execute, for example, the method flow described above in conjunction with FIGS. 4 to 5 and any variations thereof .
  • the computer program 621 may be configured to have, for example, computer program code including computer program modules.
• the code in the computer program 621 may include one or more program modules, for example, module 621A, module 621B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use appropriate program modules or combinations of program modules according to the actual situation. When these combinations are executed by the processor 610, the processor 610 can perform, for example, the method flow described above in conjunction with FIGS. 4 to 5 and any variations thereof.
• a readable storage medium having executable instructions stored thereon; when the instructions are executed by a processor, the processor performs the camera parameter determination method described above.
  • the readable storage medium may have the same function as the above-mentioned memory.
  • the readable storage medium may be included in the device/device/system described in the foregoing embodiment; or it may exist alone without being assembled into the device/device/system.
  • the aforementioned readable storage medium carries one or more programs, and when the aforementioned one or more programs are executed, the method according to the embodiments of the present disclosure is implemented.
  • the readable storage medium may be a nonvolatile readable storage medium.
• it may include, but is not limited to: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
• a control system for a movable platform may include one or more movable platforms and a control terminal for controlling them, and the processor in each movable platform can execute the camera parameter determination method described above.
• a movable platform is also provided, and the processor of the movable platform can execute the camera parameter determination method described above.
• each block in the flowchart or block diagram may represent a module, program segment, or part of code, and the above-mentioned module, program segment, or part of code contains one or more executable instructions for realizing the specified logical function. It should also be noted that the functions marked in the blocks may occur in an order different from that marked in the drawings; for example, two blocks shown in succession can actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved.
  • each block in the block diagram or flowchart, and the combination of blocks in the block diagram or flowchart can be implemented by a dedicated hardware-based system that performs the specified function or operation, or can be implemented by It is realized by a combination of dedicated hardware and computer instructions.

Abstract

A camera parameter determination method, a camera parameter determination apparatus, and a readable storage medium. The method includes: acquiring multiple frames of images collected by a camera on a movable platform, where the multiple frames include at least two frames with partially overlapping imagery; obtaining the camera's calibration parameters at the current moment by calibration according to the multiple frames; acquiring the camera's calibration parameters at historical moments, where the calibration parameters at historical moments are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determining target calibration parameters according to the calibration parameters at the historical moments and at the current moment, where the target calibration parameters are used by the movable platform when performing a new task; the calibration parameters include one or more of the following parameters of the camera: intrinsic parameters, distortion parameters, and extrinsic parameters.

Description

Camera parameter determination method and apparatus, and readable storage medium

Technical Field
The present disclosure relates to the field of information processing, and in particular to a camera parameter determination method, a camera parameter determination apparatus, and a readable storage medium.
Background
In the image measurement process, in order to determine the mapping relationship between the three-dimensional position of an object and the corresponding point of the object in an image, a geometric model of camera imaging needs to be established. A commonly used geometric model is the pinhole model, which describes the projection relationship of light rays after they pass through a pinhole. The parameters of the geometric model include the camera parameters, which in general must be obtained through a calibration process; the stability of the calibrated camera parameters affects the accuracy of image measurement.
Summary of the Disclosure
The present disclosure provides a camera parameter determination method, including: acquiring multiple frames of images collected by a camera on a movable platform, where the multiple frames include at least two frames with partially overlapping imagery; obtaining the camera's calibration parameters at the current moment by calibration according to the multiple frames; acquiring the camera's calibration parameters at historical moments, where the calibration parameters at historical moments are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determining target calibration parameters according to the calibration parameters at the historical moments and at the current moment, where the target calibration parameters are used by the movable platform when performing a new task; the calibration parameters include one or more of the following parameters of the camera: intrinsic parameters, distortion parameters, and extrinsic parameters.
The present disclosure also provides a camera parameter determination apparatus, including: a processor; and a memory for storing one or more programs which, when executed by the processor, cause the processor to perform the following operations: acquiring multiple frames of images collected by a camera on a movable platform, where the multiple frames include at least two frames with partially overlapping imagery; obtaining the camera's calibration parameters at the current moment by calibration according to the multiple frames; acquiring the camera's calibration parameters at historical moments, where the calibration parameters at historical moments are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determining target calibration parameters according to the calibration parameters at the historical moments and at the current moment, where the target calibration parameters are used by the movable platform when performing a new task.
The present disclosure also provides a readable storage medium on which executable instructions are stored; when executed by a processor, the instructions cause the processor to perform the camera parameter determination method described above.
According to embodiments of the present disclosure, the camera's calibration parameters at the current moment are obtained by calibration from the multiple frames of images collected by the camera on the movable platform, and the target calibration parameters of the camera to be used by the movable platform when performing a new task are determined from the calibration parameters at historical moments, used by the movable platform when performing historical tasks, and the calibration parameters at the current moment obtained by the current calibration. Because the target calibration parameters are determined from both, rather than being fixed, the camera's calibration parameters can be iteratively optimized during use, so that the parameters used in actual operation better match the actual operating conditions. Even if the camera lens wears during use, or is reassembled after disassembly, referencing the calibration parameters at historical moments reduces the impact on image measurement results of parameter changes caused by wear and reassembly.
Brief Description of the Drawings
The drawings are provided for a further understanding of the present disclosure and constitute a part of the specification; together with the following detailed description, they serve to explain the present disclosure, but do not limit it. In the drawings:
FIG. 1 schematically shows the pinhole model of a camera according to an embodiment of the present disclosure.
FIG. 2 schematically shows the distortion model of a camera according to an embodiment of the present disclosure.
FIG. 3 schematically shows an application scenario in which the camera parameter determination method and apparatus according to an embodiment of the present disclosure can be applied.
FIG. 4 schematically shows a flowchart of a camera parameter determination method according to an embodiment of the present disclosure.
FIG. 5 schematically shows a flowchart of acquiring the calibration parameters of a camera at historical moments according to an embodiment of the present disclosure.
FIG. 6 schematically shows a block diagram of a camera parameter determination apparatus according to an embodiment of the present disclosure.
Detailed Description
The technical solutions of the present disclosure will now be described clearly and completely with reference to the embodiments and the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. Based on the embodiments of the present disclosure, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present disclosure. In the following detailed description, many specific details are set forth for ease of explanation in order to provide a comprehensive understanding of the embodiments; obviously, however, one or more embodiments can also be implemented without these specific details. In addition, descriptions of well-known structures and techniques are omitted below to avoid unnecessarily obscuring the concepts of the present disclosure.
Some block diagrams and/or flowcharts are shown in the drawings. It should be understood that some blocks of the block diagrams and/or flowcharts, or combinations thereof, can be implemented by computer program instructions. These instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, or other programmable data-processing apparatus, so that, when executed by the processor, they create means for implementing the functions/operations illustrated in the block diagrams and/or flowcharts. The techniques of the present disclosure can be implemented in hardware and/or software (including firmware, microcode, etc.); they can also take the form of a computer program product on a computer-readable storage medium storing instructions, for use by or in combination with an instruction execution system.
The process by which a camera photographs an object to generate an image is essentially the conversion of an object located in the world coordinate system into an image located in the pixel coordinate system. The world coordinate system is a three-dimensional rectangular coordinate system with respect to which the relative spatial positions of the camera and the object can be described; the pixel coordinate system is a two-dimensional rectangular coordinate system reflecting the arrangement of pixels on the camera's CCD/CMOS chip. The geometric model of camera imaging therefore characterizes the conversion from the world coordinate system to the pixel coordinate system.
The conversion from the world coordinate system (O-X_wY_wZ_w) to the pixel coordinate system (o-uv) generally proceeds first from the world coordinate system to the camera coordinate system (C-XYZ), and then from the camera coordinate system to the pixel coordinate system. The camera coordinate system (C-XYZ) is also a three-dimensional rectangular coordinate system: its origin is at the optical center C of the camera, its X and Y axes are parallel to the two sides of the image plane, and its Z axis is the optical axis of the camera.
The conversion from the world coordinate system (O-X_wY_wZ_w) to the camera coordinate system (C-XYZ) generally comprises a rotation and a translation. The matrix R is the rotation matrix and the matrix T is the translation matrix; R and T are the camera's extrinsic parameters (the extrinsic matrix), expressing the transformation in three-dimensional space from the world coordinate system (O-X_wY_wZ_w) to the camera coordinate system (C-XYZ).
The relationship converting the world coordinate system into the camera coordinate system is as follows:

$$\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^{T} & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where R is a 3×3 rotation matrix, t is a 3×1 translation vector, (X, Y, Z, 1)^T are the homogeneous coordinates in the camera coordinate system, and (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates in the world coordinate system.
For the conversion from the camera coordinate system (C-XYZ) to the pixel coordinate system (o-uv), a commonly used geometric model is the pinhole model, which describes the projection relationship of light rays after they pass through a pinhole.
In the pixel coordinate system (o-uv), however, the coordinate origin o is generally the top-left corner of the image, the u and v axes are parallel to the two sides of the image plane, and the unit of the coordinate axes is the pixel. To facilitate establishing the conversion, an image coordinate system (p-xy) can be established with the center of the image plane as the origin p and with x and y axes parallel to the u and v axes respectively; the coordinates of the origin p in the pixel coordinate system are (u_0, v_0). Of course, the present disclosure may also translate the coordinate origin o of the pixel coordinate system (o-uv) directly to the image center p without establishing the image coordinate system (p-xy). The origin p of the image coordinate system is the intersection of the camera's optical axis with the image plane, i.e. the camera's principal point, located at the center of the image. The pixel coordinate system and the image coordinate system are essentially related by a translation, so the conversion from the image coordinate system (p-xy) to the pixel coordinate system (o-uv) is as follows:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$

where dx and dy are the physical dimensions of a pixel in the x and y directions respectively, and (u_0, v_0) are the coordinates of the principal point.
Next, the camera coordinate system (C-XYZ) can first be converted into the image coordinate system (p-xy), and then, according to the conversion relationship between the image coordinate system (p-xy) and the pixel coordinate system (o-uv), into the pixel coordinate system (o-uv).
The conversion from the camera coordinate system (C-XYZ) to the image coordinate system (p-xy) can be described using the pinhole model.
FIG. 1 schematically shows the pinhole model of a camera according to an embodiment of the present disclosure.
As shown in FIG. 1, the pinhole model of the camera reflects the transformation between the camera coordinate system (C-XYZ) and the image coordinate system (p-xy). For example, an arbitrary point M in the camera coordinate system (C-XYZ) corresponds to an image point m in the image coordinate system (p-xy): the line connecting M to the camera's optical center C is CM, the intersection of CM with the image plane is the image point m, and m is the projection of the spatial point M onto the image plane. The conversion from the camera coordinate system (C-XYZ) to the image coordinate system (p-xy) thus corresponds to a perspective projection, expressed by the following matrix:

$$s \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

where s is a non-zero scale factor, f is the effective focal length of the camera (the distance from the optical center to the image plane), (X, Y, Z, 1)^T are the homogeneous coordinates in the camera coordinate system, and (x, y, 1)^T are the homogeneous coordinates in the image coordinate system.
Based on the above transformations from the world coordinate system to the camera coordinate system, from the camera coordinate system to the image coordinate system, and from the image coordinate system to the pixel coordinate system, the transformation from the world coordinate system to the pixel coordinate system is obtained as follows:

$$z \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f/dx & 0 & u_0 \\ 0 & f/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = K \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where z is a non-zero scale factor. The matrix K is called the camera intrinsic matrix (the camera calibration matrix), and R and T are the camera's extrinsic parameters. The camera intrinsic parameters therefore generally include the focal length f, the position (u_0, v_0) of the principal point in the pixel coordinate system, and the pixel dimensions dx and dy.
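The composed world-to-pixel mapping above can be sketched directly from the last equation; the numeric values below are purely illustrative:

```python
def project(point_w, R, t, f, dx, dy, u0, v0):
    """World -> pixel: z * [u, v, 1]^T = K [R | t] [Xw, Yw, Zw, 1]^T,
    with K = [[f/dx, 0, u0], [0, f/dy, v0], [0, 0, 1]]."""
    # Extrinsics: camera coordinates [X, Y, Z]^T = R * Pw + t
    X, Y, Z = [sum(R[i][j] * point_w[j] for j in range(3)) + t[i]
               for i in range(3)]
    # Intrinsics K plus the perspective divide by the scale factor z (= Z)
    return (f / dx) * X / Z + u0, (f / dy) * Y / Z + v0

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# A point on the optical axis projects onto the principal point (u0, v0)
uv = project((0.0, 0.0, 2.0), I3, (0.0, 0.0, 0.0),
             f=0.035, dx=1e-5, dy=1e-5, u0=2736.0, v0=1824.0)
```

Note that z equals the point's depth Z in the camera frame, which is why the perspective divide appears as a division by Z.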
However, due to the presence of the lens, light rays are distorted during projection; a distortion model is generally used to describe this distortion, and the distortion model is generally characterized by distortion parameters. The distortion parameters generally include the radial distortion parameters k1, k2, k3 and the tangential distortion parameters p1, p2. Radial distortion, also called barrel or pincushion distortion, generally arises because rays are bent more strongly far from the center of the lens; tangential distortion generally arises because the lens is not perfectly parallel to the image plane.
FIG. 2 schematically shows the distortion model of a camera according to an embodiment of the present disclosure.
As shown in FIG. 2, the theoretical position of a pixel lies on the arc, but because rays are bent far from the lens center and the lens is not perfectly parallel to the image plane, the pixel position deviates radially and tangentially.
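The text names the distortion parameters (k1, k2, k3, p1, p2) but not a formula; a common model consistent with these parameters is the Brown-Conrady model used by OpenCV-style calibrations, sketched here on normalized image coordinates as an assumption:

```python
def distort(x, y, k1, k2, k3, p1, p2):
    """Applies radial (k1, k2, k3) and tangential (p1, p2) distortion
    to normalized image coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d
```

With all five parameters at zero the mapping is the identity; undistortion inverts this mapping, usually by iteration, before the pinhole projection is applied.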
In general, the camera's extrinsic parameters can be obtained during operation, while the intrinsic and distortion parameters generally have to be calibrated in advance; in post-processing mapping, the intrinsic and distortion parameters must be entered into the mapping software as initial values and iterated in the mapping algorithm.
In the related art, the more common camera parameter calibration method is generally to calibrate the camera parameters at the factory using a checkerboard and a visual calibration algorithm (such as an OpenCV algorithm), and then to write the calibrated parameters into the camera's storage area or to provide an inspection report and certificate.
Specifically, for example, OpenCV has a built-in calibration algorithm for camera intrinsic and distortion parameters: by photographing a checkerboard from different angles and invoking the calibration algorithm, the calibrated intrinsic and distortion parameters can be obtained.
However, during use of the camera, the camera parameters may change due to wear, lens disassembly and reassembly, and the like, so that they differ considerably from the preset calibrated values; this troubles subsequent mapping processing and affects the accuracy of image measurement.
In the related art, when a camera is used to perform a task, the initial parameters obtained from factory calibration are used. For example, the intrinsic and distortion parameters are written into the camera and then, during shooting, the fixed parameters are written into a field of the exif part of each photo, or an inspection certificate is simply provided. Here exif (exchangeable image file format) is designed specifically for digital camera photos and can record a photo's attribute information and shooting data; it can be attached to JPEG, TIFF, RIFF and other files, adding content about the shooting information as well as an index image or the version information of image processing software. If the camera lens wears during use, or is reassembled after disassembly, so that the camera's parameters no longer match the factory calibration values, subsequent mapping processing is troubled. The intrinsic and distortion parameters thus have no capability of self-iteration: if the lens or camera is disassembled midway, or has been in use for too long, recalibration is generally possible only by returning the device to the factory, and before that recalibration the intrinsic and distortion parameters deviate considerably from their true values.
Embodiments of the present disclosure provide a camera parameter determination method and apparatus. The method includes: acquiring multiple frames of images collected by a camera on a movable platform, where the multiple frames include at least two frames with partially overlapping imagery; obtaining the camera's calibration parameters at the current moment by calibration according to the multiple frames; acquiring the camera's calibration parameters at historical moments, where the calibration parameters at historical moments are the calibration parameters of the camera used by the movable platform when performing historical tasks; and determining target calibration parameters according to the calibration parameters at the historical moments and at the current moment, where the target calibration parameters are used by the movable platform when performing a new task.
In the following, the application scenario of the camera parameter determination method and apparatus is described taking a drone as an example of the movable platform.
FIG. 3 schematically shows an application scenario in which the camera parameter determination method and apparatus according to an embodiment of the present disclosure can be applied. It should be noted that FIG. 3 is only an example of a scenario to which embodiments of the present disclosure can be applied, to help those skilled in the art understand the technical content of the present disclosure; it does not mean that the embodiments cannot be used in other devices, systems, environments, or scenarios.
As shown in FIG. 3, a user 301 can control the flight of a drone 302 through a remote-control terminal, and the camera on the drone 302 can collect multiple frames of images of the real environment, at least two of which contain partially overlapping imagery. For example, two frames collected by the camera both contain the same large tree, or the same mountain peak; the overlapping imagery can be that same tree or mountain peak.
According to embodiments of the present disclosure, the drone 302 can send the images collected by the camera to the remote-control terminal for processing, or process them locally on the drone 302; by processing the multiple frames collected by the camera, the camera's calibration parameters at the current moment can be obtained.
According to embodiments of the present disclosure, the drone 302 can obtain the camera's calibration parameters at historical moments from a server 303, where those parameters are the calibration parameters of the camera used by the drone 302 when performing historical tasks. Then, the target calibration parameters used by the drone 302 when performing a new task are determined according to the calibration parameters at the historical moments and at the current moment.
In an optional embodiment, the control terminal can obtain the camera's calibration parameters at historical moments from the server 303 in advance and cache them locally; the drone 302 then obtains them from the control terminal.
In another optional embodiment, the drone 302 can also obtain the camera's calibration parameters at historical moments from a storage medium on the drone 302 itself.
It can thus be seen that the present disclosure provides various ways of obtaining the camera's calibration parameters at historical moments, not limited to obtaining them from the server 303.
According to embodiments of the present disclosure, because the camera's target calibration parameters are determined from the calibration parameters at historical moments and the calibration parameters at the current moment obtained by the current calibration, rather than being fixed, the camera's calibration parameters can be iteratively optimized during use, so that the parameters used by the drone 302 in actual operation better match the actual operating conditions. Even if the camera lens wears during use, or is reassembled after disassembly, referencing the calibration parameters at historical moments reduces the impact on image measurement results of parameter changes caused by wear and reassembly.
With reference to FIG. 4, the camera parameter determination method provided by embodiments of the present disclosure is further described below.
FIG. 4 schematically shows a flowchart of a camera parameter determination method according to an embodiment of the present disclosure.
It should be noted that, unless it is explicitly stated that there is an order of execution between different operations, or such an order exists in the technical implementation, the operations shown in the flowcharts of the embodiments may be executed in any order, and multiple operations may also be executed simultaneously.
As shown in FIG. 4, the camera parameter determination method includes operations S410 to S440.
In operation S410, multiple frames of images collected by a camera on a movable platform are acquired, where the multiple frames include at least two frames with partially overlapping imagery.
According to embodiments of the present disclosure, the position and attitude of the camera on the movable platform can be changed to obtain the multiple frames. For example, the camera is controlled to photograph the same object from different angles at three different positions; each of the three collected frames contains at least the pixel portion of that object. Further, the positions of the object in the images collected at the different positions also differ.
In operation S420, the camera's calibration parameters at the current moment are obtained by calibration according to the multiple frames.
According to embodiments of the present disclosure, the camera's calibration parameters at the current moment can be obtained by calibration from the multiple frames during aerial triangulation, and the calibration parameters can be one or more of the camera's intrinsic parameters, distortion parameters, and extrinsic parameters. For example, in processing the images, the calibrated intrinsic and distortion parameters written into the images are invoked; then, during aerial triangulation, the intrinsic and distortion parameters are treated as unknowns in a nonlinear optimization, thereby iterating to more accurate, optimized intrinsic and distortion parameters.
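As a deliberately tiny stand-in for the nonlinear optimization just described (the disclosure refines all intrinsic and distortion parameters inside aerial triangulation; here only a single unknown focal length is refined by Gauss-Newton on synthetic data, and all names and numbers are illustrative):

```python
def refine_focal_length(points, observations, f0, iters=20):
    """Treats one intrinsic parameter (f) as an unknown and refines it
    by nonlinear least squares. Each 3-D point (X, Y, Z) is modeled as
    projecting to u = f * X / Z; `observations` are measured u values."""
    f = f0
    for _ in range(iters):
        num = den = 0.0
        for (X, Y, Z), u_obs in zip(points, observations):
            j = X / Z                    # Jacobian d(model)/df
            r = u_obs - f * j            # reprojection residual
            num += j * r
            den += j * j
        f += num / den                   # Gauss-Newton update
    return f

pts = [(1.0, 0.0, 2.0), (2.0, 0.0, 5.0), (0.5, 0.0, 1.0)]
f_true = 100.0
obs = [f_true * X / Z for (X, _, Z) in pts]   # synthetic measurements
f_est = refine_focal_length(pts, obs, f0=80.0)
```

The real optimization jointly adjusts poses, structure, intrinsics, and distortion over many images; this sketch only shows the residual-and-update pattern.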
According to embodiments of the present disclosure, the collected multiple frames can also be processed by a calibration algorithm to identify calibration points in the images, and the camera's calibration parameters at the current moment can be obtained by calibration in combination with the three-dimensional positions of the calibration points. In operation S430, the camera's calibration parameters at historical moments are acquired, where they are the calibration parameters of the camera used by the movable platform when performing historical tasks.
According to embodiments of the present disclosure, one or more sets of calibration parameters at historical moments may be acquired. When multiple sets are acquired, they may be the calibration parameters respectively used by the movable platform when performing multiple historical tasks. When a single set is acquired, it may be the calibration parameters of the historical job closest in time to the current job.
According to embodiments of the present disclosure, the acquired calibration parameters at historical moments may also be a comprehensive value, for example the weighted average of the calibration parameters used by the movable platform when performing multiple historical tasks.
In operation S440, the target calibration parameters are determined according to the calibration parameters at the historical moments and at the current moment, where the target calibration parameters are used by the movable platform when performing a new task.
According to embodiments of the present disclosure, when the movable platform performs a new task, the determined target calibration parameters can be used for tasks such as mapping. After the new task has been performed, i.e. at a future moment, the currently determined target calibration parameters can serve as calibration parameters at a historical moment; therefore, at the future moment, target calibration parameters for the future moment can be determined anew.
According to embodiments of the present disclosure, there are various ways of determining the target calibration parameters according to the camera's calibration parameters at historical moments and at the current moment, including, for example, the following embodiments.
In an optional embodiment, a weighted average of the calibration parameters at the historical moments and at the current moment can be calculated; the weighted average is then determined as the target calibration parameters.
According to embodiments of the present disclosure, when there are multiple sets of calibration parameters at historical moments, the weight of each set can be determined according to its calibration time. For example, the weights can be assigned so that calibration parameters of historical moments closer to the current job time receive higher weights and those further away receive lower weights, and a comprehensive weighted average is computed. Setting different weights for the calibration parameters of different historical moments and taking the comprehensive weighted average prevents parameters that are too old from affecting the accuracy of the target calibration parameters.
In another optional embodiment, a first deviation between the calibration parameters at the historical moment and at the current moment can first be calculated and compared with a first preset threshold. If the first deviation is less than the first preset threshold, the calibration parameters at the current moment, or the weighted average of the calibration parameters at the historical moment and at the current moment, are determined as the target calibration parameters; if the first deviation is greater than or equal to the first preset threshold, the calibration parameters at the historical moment are determined as the target calibration parameters.
According to embodiments of the present disclosure, for example, the calibration parameters at the current moment can be subtracted from those at the historical moment to obtain the first deviation, which is then compared with the first preset threshold. If the first deviation is greater than or equal to the first preset threshold, the optimization process or the initial camera parameters are considered erroneous, and the calibration parameters at the historical moment can be determined as the target calibration parameters; in addition, the user can be prompted to re-process the image data, re-collect image data, or return the device to the factory for recalibration. If the first deviation is less than the first preset threshold, the optimization process is considered normal, and the calibration parameters at the current moment, or the weighted average of the calibration parameters at the historical moment and at the current moment, can be determined as the target calibration parameters.
In another optional embodiment, preset calibration parameters of the camera can also be acquired, and the target calibration parameters determined according to the preset calibration parameters, the calibration parameters at the historical moments, and the calibration parameters at the current moment.
According to embodiments of the present disclosure, the preset calibration parameters may be calibration parameters set for the camera that the movable platform received from user input, or the camera's factory calibration parameters, and so on.
According to embodiments of the present disclosure, determining the target calibration parameters according to the preset calibration parameters, the calibration parameters at the historical moment, and the calibration parameters at the current moment includes: calculating a second deviation between the preset calibration parameters and the calibration parameters at the current moment; comparing the second deviation with a second preset threshold; if the second deviation is less than the second preset threshold, determining the calibration parameters at the current moment, or the weighted average of the calibration parameters at the historical moment and at the current moment, as the target calibration parameters; and if the second deviation is greater than or equal to the second preset threshold, determining the calibration parameters at the historical moment or the preset calibration parameters as the target calibration parameters.
According to embodiments of the present disclosure, for example, the calibration parameters at the current moment can be subtracted from the preset calibration parameters to obtain the second deviation, which is then compared with the second preset threshold. If the second deviation is greater than or equal to the second preset threshold, the optimization process or the initial camera parameters are considered erroneous, and the calibration parameters at the historical moment or the preset calibration parameters can be determined as the target calibration parameters; in addition, the user can be prompted to re-process the image data, re-collect image data, or return the device to the factory for recalibration. If the second deviation is less than the second preset threshold, the optimization process is considered normal, and the calibration parameters at the current moment, or the weighted average of the calibration parameters at the historical moment and at the current moment, can be determined as the target calibration parameters.
According to embodiments of the present disclosure, when the estimated value is found to be inconsistent with the initial value, the user can be prompted, so as to avoid mapping errors caused by large changes in the camera parameters.
According to embodiments of the present disclosure, the magnitudes of the first and second preset thresholds can be adjusted according to practical experience or measurement results.
According to embodiments of the present disclosure, when the first deviation is less than the first preset threshold and/or the second deviation is less than the second preset threshold, the calibration parameters at the current moment can be considered sufficiently accurate; therefore, the calibration parameters at the current moment and/or the target calibration parameters can be saved for reference when the next task is performed.
According to embodiments of the present disclosure, after the target calibration parameters are determined, they can be sent to a cloud server so that the target calibration parameters corresponding to the camera are stored in the cloud server.
According to embodiments of the present disclosure, the calibration parameters at the current moment and/or the target calibration parameters can be uploaded to the cloud server, or saved locally on the drone or on the control terminal.
According to embodiments of the present disclosure, if the calibration parameters at the current moment and/or the target calibration parameters are uploaded to the cloud server, the parameters corresponding to the camera's lens ID number can be stored in the corresponding area of the cloud server.
According to embodiments of the present disclosure, the cloud server stores the calibration parameters respectively used by the movable platform when executing different historical tasks, and the calibration parameters include at least one of the following parameters of the camera: intrinsic parameters, distortion parameters, and extrinsic parameters.
Specifically, taking a cloud server storing intrinsic and distortion parameters as an example, the data format stored in the cloud server is shown in Table 1.
Table 1
Factory calibration | (n-4)-th | (n-3)-th | (n-2)-th | (n-1)-th | n-th | Comprehensive value
F_0  | F_{n-4}  | F_{n-3}  | F_{n-2}  | F_{n-1}  | F_n  | F_avg
Cx_0 | Cx_{n-4} | Cx_{n-3} | Cx_{n-2} | Cx_{n-1} | Cx_n | Cx_avg
Cy_0 | Cy_{n-4} | Cy_{n-3} | Cy_{n-2} | Cy_{n-1} | Cy_n | Cy_avg
K1_0 | K1_{n-4} | K1_{n-3} | K1_{n-2} | K1_{n-1} | K1_n | K1_avg
K2_0 | K2_{n-4} | K2_{n-3} | K2_{n-2} | K2_{n-1} | K2_n | K2_avg
K3_0 | K3_{n-4} | K3_{n-3} | K3_{n-2} | K3_{n-1} | K3_n | K3_avg
P1_0 | P1_{n-4} | P1_{n-3} | P1_{n-2} | P1_{n-1} | P1_n | P1_avg
P2_0 | P2_{n-4} | P2_{n-3} | P2_{n-2} | P2_{n-1} | P2_n | P2_avg
where F denotes the focal length; Cx and Cy denote the position of the image principal point in the pixel coordinate system; k1, k2, and k3 denote the radial distortion parameters; and p1 and p2 denote the tangential distortion parameters.
According to embodiments of the present disclosure, the server stores the calibration parameters used each time the movable platform executed the last N historical tasks, where N is a preset fixed value; when the server stores the target calibration parameters corresponding to the camera, the server can delete the calibration parameters corresponding to the historical task furthest in time from the current task.
According to embodiments of the present disclosure, for example, the cloud server can retain only the information uploaded in the last 5 or 10 jobs, i.e. N is 5 or 10. The newly uploaded data are the intrinsic and distortion parameters optimized in the most recent valid job, and the set of parameters furthest in time from this valid job can be replaced, for example by deleting all parameters corresponding to the (n-5)-th job.
According to embodiments of the present disclosure, a weighted average can be computed over the latest 5 or 10 sets of data, assigning higher weights to data closer to the current test time and lower weights to older data, yielding the comprehensive weighted-average camera intrinsic parameters (F_avg, Cx_avg, Cy_avg) and distortion parameters (k1_avg, k2_avg, k3_avg, p1_avg, p2_avg).
According to embodiments of the present disclosure, by restricting the calculation to data from the last several jobs up to the present, the comprehensive weighted-average parameter values prevent parameters that are too old from affecting parameter accuracy.
According to embodiments of the present disclosure, the camera's calibration parameters at the current moment are obtained by calibration from the multiple frames collected by the camera on the movable platform, and the target calibration parameters of the camera to be used by the movable platform when performing a new task are determined from the calibration parameters at historical moments, used when performing historical tasks, and the calibration parameters at the current moment obtained by the current calibration. Because the target calibration parameters are determined from both, rather than being fixed, the camera's calibration parameters can be iteratively optimized during use, so that the parameters used in actual operation better match the actual operating conditions. Even if the camera lens wears during use, or is reassembled after disassembly, referencing the calibration parameters at historical moments reduces the impact on image measurement results of parameter changes caused by wear and reassembly.
According to embodiments of the present disclosure, when the next mapping job is performed, the user can be prompted whether to read the comprehensive weighted-average camera intrinsic and distortion parameters stored in the cloud server. If the user chooses to read them, the lens ID number "Lens_ID" is uploaded to the cloud server and searched; the comprehensive weighted-average intrinsic parameters (F_avg, Cx_avg, Cy_avg) and distortion parameters (k1_avg, k2_avg, k3_avg, p1_avg, p2_avg), and/or the intrinsic and distortion parameters used in each historical job, are used as the initial values for this mapping job and applied in the aerial triangulation process, after which the calibration parameters at the current moment are obtained by calibration from the collected images.
According to embodiments of the present disclosure, when the next mapping job is performed, the movable platform can also directly request the comprehensive weighted-average intrinsic and distortion parameters, and/or the intrinsic and distortion parameters used in each historical job, from the cloud server without prompting the user.
According to embodiments of the present disclosure, if the user chooses not to read the cloud server's comprehensive average intrinsic and distortion parameters, and/or the intrinsic and distortion parameters used in each historical job, the original calibration values stored in the lens, (F_0, Cx_0, Cy_0) and (k1_0, k2_0, k3_0, p1_0, p2_0), can be used as the initial parameters for iteration.
According to embodiments of the present disclosure, when the camera lens leaves the factory, a unique lens ID number can be written into the lens cache; the keyword can be, for example, "Lens_ID", and the ID number can be uniquely identified by information such as the part number, factory date, and a special code. A "return-to-factory inspection count" field can be reserved in the lens cache, initialized to 0.
According to embodiments of the present disclosure, the camera's intrinsic and distortion parameters can be calibrated at the factory using a checkerboard, obtaining the initially calibrated intrinsic parameters (F_0, Cx_0, Cy_0) and distortion parameters (k1_0, k2_0, k3_0, p1_0, p2_0). The initially calibrated intrinsic and distortion parameters are written into the lens cache, and when photos are taken with the camera in subsequent jobs, the camera's initial intrinsic and distortion parameters can be written in sequence into the "undistort parameter" field of the exif part of each image.
According to embodiments of the present disclosure, when the lens is returned to the factory for recalibration, the initial intrinsic and distortion parameters (F_0, Cx_0, Cy_0) and (k1_0, k2_0, k3_0, p1_0, p2_0) can be reset, and the value of the "return-to-factory inspection count" field incremented by 1. In addition, after the lens has been recalibrated at the factory, the initial values in the parameter history table corresponding to this lens ID number can be reset on the cloud server, and all remaining values cleared.
With reference to FIG. 5, the method shown in FIG. 4 is further described below in combination with specific embodiments.
FIG. 5 schematically shows a flowchart of acquiring the calibration parameters of a camera at historical moments according to an embodiment of the present disclosure.
As shown in FIG. 5, acquiring the camera's calibration parameters at historical moments includes operations S510 to S530.
In operation S510, the identification information of the camera is acquired.
According to embodiments of the present disclosure, the unique ID of the camera lens can be recorded in the exif field of the image, with the keyword "Lens_ID", as the unique identification information.
In operation S520, the identification information of the camera is sent to the server, so that the server obtains the calibration parameters of the historical moments corresponding to the camera according to the camera's identification information.
According to embodiments of the present disclosure, the server can store the calibration parameters respectively used by the camera in multiple historical tasks. The parameters corresponding to the camera's identification information are stored in the corresponding area of the server, as shown in Table 1, which is not repeated here.
In operation S530, the calibration parameters of the historical moments corresponding to the camera are received from the server.
According to embodiments of the present disclosure, the server can look up the table according to the camera's identification information to obtain the calibration parameters of the historical moments corresponding to the camera.
According to embodiments of the present disclosure, the calibration parameters at historical moments may be the comprehensive weighted average shown in Table 1, or the data corresponding to the calibration parameters at multiple historical moments.
According to embodiments of the present disclosure, the server can also obtain the camera's calibration parameters at the current moment and determine the target calibration parameters locally on the server according to the calibration parameters at the historical moments and at the current moment.
According to embodiments of the present disclosure, the type of task performed by the drone is not limited; the task may be any task involving mapping, for example a surveying task, a patrol task, a tracking task, and so on.
According to embodiments of the present disclosure, because the camera's calibration parameters at the current moment are obtained by calibration from the multiple frames collected by the camera, they are associated with the images themselves. In the related art, the camera parameters used when a drone performs tasks in different location areas are generally the same; but because images of different location areas differ, using the same camera parameters to perform tasks in different location areas affects the accuracy of image measurement and degrades the mapping results.
According to embodiments of the present disclosure, the geographic location information of the camera during the current job can be acquired, and the camera's identification information and geographic location information can then be sent to the server; the server can obtain the calibration parameters of the historical moments corresponding to the camera according to the camera's identification information and geographic location information.
According to embodiments of the present disclosure, the server can store the calibration parameters respectively used by the camera in multiple historical tasks, and each historical task has corresponding geographic location information.
例如,服务器中存储的数据格式如表2所示。
Table 2

                 Factory   Task n-4   Task n-3   Task n-2   Task n-1   Task n    Combined
Focal length     F_0       F_{n-4}    F_{n-3}    F_{n-2}    F_{n-1}    F_n       F_avg
Principal pt x   Cx_0      Cx_{n-4}   Cx_{n-3}   Cx_{n-2}   Cx_{n-1}   Cx_n      Cx_avg
Principal pt y   Cy_0      Cy_{n-4}   Cy_{n-3}   Cy_{n-2}   Cy_{n-1}   Cy_n      Cy_avg
Distortion k1    k1_0      k1_{n-4}   k1_{n-3}   k1_{n-2}   k1_{n-1}   k1_n      k1_avg
Distortion k2    k2_0      k2_{n-4}   k2_{n-3}   k2_{n-2}   k2_{n-1}   k2_n      k2_avg
Distortion k3    k3_0      k3_{n-4}   k3_{n-3}   k3_{n-2}   k3_{n-1}   k3_n      k3_avg
Distortion p1    p1_0      p1_{n-4}   p1_{n-3}   p1_{n-2}   p1_{n-1}   p1_n      p1_avg
Distortion p2    p2_0      p2_{n-4}   p2_{n-3}   p2_{n-2}   p2_{n-1}   p2_n      p2_avg
Location         L_0       L_{n-4}    L_{n-3}    L_{n-2}    L_{n-1}    L_n       -

Here, L_0 denotes the geographic location information at factory calibration, and L_{n-4} to L_n denote the geographic location information corresponding to each work task.
According to an embodiment of the present disclosure, the server may search for the data corresponding to the camera based on its identification information, and then, based on the camera's geographic location during the current task, obtain the historical calibration parameters corresponding to a geographic location that is the same as or close to the current one. For example, if the geographic location that is the same as or closest to the current one is L_{n-4}, the historical calibration parameters of task n-4 may be determined as the historical calibration parameters corresponding to the camera.
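A server-side lookup of this kind can be sketched as below. The record structure and the planar nearest-location rule are assumptions for illustration only; a real service might use great-circle distance and a proper datastore:

```python
import math

# Illustrative history table for one Lens_ID: each entry pairs calibration
# parameters with the (lat, lon) where that task was flown.
history = {
    "Lens_001": [
        {"params": {"F": 3666.7, "Cx": 2736.0, "Cy": 1824.0}, "loc": (22.54, 114.05)},
        {"params": {"F": 3668.2, "Cx": 2735.8, "Cy": 1824.1}, "loc": (39.90, 116.40)},
    ]
}

def nearest_history(lens_id, current_loc):
    """Return the historical calibration whose task location is closest
    to the current task location (planar approximation for illustration)."""
    entries = history[lens_id]
    return min(entries, key=lambda e: math.dist(e["loc"], current_loc))["params"]

# A task near the first recorded location picks up that task's parameters.
print(nearest_history("Lens_001", (22.60, 114.00))["F"])  # 3666.7
```

The `Lens_001` identifier and the coordinates are hypothetical example data, not values from the disclosure.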
Through the embodiments of the present disclosure, the historical calibration parameters corresponding to the camera are obtained from the server based on the camera's identification information and geographic location information. Target calibration parameters determined from these historical parameters and the current parameters not only allow the camera's calibration parameters to be iteratively optimized during use but also adapt to the working environment of the current task, better matching actual working conditions, improving the accuracy of image measurement, and achieving better mapping results.
FIG. 6 schematically shows a block diagram of a camera parameter determination device according to an embodiment of the present disclosure.
As shown in FIG. 6, the camera parameter determination device 600 includes a processor 610 and a memory 620.
The memory 620 stores one or more programs which, when executed by the processor, cause the processor to: obtain multiple frames of images captured by a camera on a movable platform, where the multiple frames include at least two frames with partially overlapping imagery; calibrate against the multiple frames to obtain the camera's calibration parameters at the current moment; obtain the camera's historical calibration parameters, where the historical calibration parameters are the camera's calibration parameters used by the movable platform when performing historical work tasks; and determine target calibration parameters from the historical and current calibration parameters, where the target calibration parameters are to be used by the movable platform when performing a new work task.
According to an embodiment of the present disclosure, the calibration parameters of the camera at the current moment are obtained by calibrating against multiple frames of images captured by the camera on the movable platform, and the target calibration parameters to be used by the movable platform when performing a new work task are determined from the historical calibration parameters used during historical work tasks together with the newly calibrated current parameters. Because the target calibration parameters are determined from both the historical and the current calibration parameters rather than being fixed values, the camera's calibration parameters can be iteratively optimized during use, so that the parameters the movable platform actually uses in operation better match real working conditions. Even if the camera lens wears during use, or is reassembled after disassembly, referring to the historical calibration parameters reduces the impact that wear- or reassembly-induced changes in the camera parameters have on image measurement results.
According to an embodiment of the present disclosure, the processor 610 obtaining the camera's historical calibration parameters includes: obtaining identification information of the camera; sending the camera's identification information to a server, so that the server obtains the historical calibration parameters corresponding to the camera based on the identification information; and receiving the historical calibration parameters corresponding to the camera from the server.
According to an embodiment of the present disclosure, the server stores the calibration parameters respectively used by the movable platform when performing different historical work tasks; the parameters include at least one of the camera's intrinsic parameters, distortion parameters, and extrinsic parameters.
According to an embodiment of the present disclosure, the processor 610 further performs the following operation: after determining the target calibration parameters, sending them to the server so that the target calibration parameters corresponding to the camera are stored in the server.
According to an embodiment of the present disclosure, the server stores the calibration parameters respectively used in each of N historical work tasks performed by the movable platform, where N is a preset fixed value; when storing the target calibration parameters corresponding to the camera, the server deletes the calibration parameters corresponding to the historical work task most distant in time from the current one.
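The fixed-length history with oldest-entry eviction can be modeled with a bounded deque. Here N = 5, matching the five historical tasks shown in Table 2; the actual value is a preset of the system:

```python
from collections import deque

N = 5  # preset fixed number of retained historical calibrations (illustrative)
calib_history = deque(maxlen=N)  # appending beyond N evicts the oldest entry

# Store 7 calibrations; only the 5 most recent survive.
for i in range(7):
    calib_history.append({"task": i, "F": 3666.0 + 0.1 * i})

print([c["task"] for c in calib_history])  # [2, 3, 4, 5, 6]
```

The deque drops the entry most distant in time automatically, which is exactly the server-side deletion rule described above.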
According to an embodiment of the present disclosure, the processor 610 obtaining the camera's historical calibration parameters includes: obtaining the camera's geographic location information during the current task; where sending the camera's identification information to the server so that the server obtains the corresponding historical calibration parameters includes: sending both the camera's identification information and the geographic location information to the server, so that the server obtains the historical calibration parameters corresponding to the camera based on both.
According to an embodiment of the present disclosure, the processor 610 determining the target calibration parameters from the camera's historical and current calibration parameters includes: computing a weighted average of the historical calibration parameters and the current calibration parameters; and determining the weighted average as the target calibration parameters.
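The weighted-average step can be sketched as follows. The exponential-decay weights are an illustrative choice only (the disclosure does not fix a weighting scheme); they give more recent calibrations more influence:

```python
def weighted_average(history, current, decay=0.5):
    """Weighted average of historical parameter dicts and the current one.

    history: list of dicts ordered oldest-to-newest; current: dict.
    The weight of an entry halves with each step back in time
    (illustrative scheme, not mandated by the disclosure).
    """
    entries = history + [current]
    weights = [decay ** (len(entries) - 1 - i) for i in range(len(entries))]
    total = sum(weights)
    return {k: sum(w * e[k] for w, e in zip(weights, entries)) / total
            for k in current}

hist = [{"F": 3666.0}, {"F": 3667.0}]
cur = {"F": 3668.0}
avg = weighted_average(hist, cur)
print(round(avg["F"], 3))  # 3667.429
```

Each parameter (F, Cx, Cy, k1, ..., p2) would be averaged independently in the same way.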
According to an embodiment of the present disclosure, the processor 610 determining the target calibration parameters from the camera's historical and current calibration parameters includes: computing a first deviation between the historical calibration parameters and the current calibration parameters; comparing the first deviation with a first preset threshold; if the first deviation is less than the first preset threshold, determining the current calibration parameters, or a weighted average of the historical and current calibration parameters, as the target calibration parameters; and if the first deviation is greater than or equal to the first preset threshold, determining the historical calibration parameters as the target calibration parameters.
According to an embodiment of the present disclosure, the processor 610 further performs the following operation: obtaining preset calibration parameters of the camera; where determining the target calibration parameters from the camera's historical and current calibration parameters includes: determining the target calibration parameters from the preset, historical, and current calibration parameters.
According to an embodiment of the present disclosure, the processor 610 determining the target calibration parameters from the preset, historical, and current calibration parameters includes: computing a second deviation between the preset calibration parameters and the current calibration parameters; comparing the second deviation with a second preset threshold; if the second deviation is less than the second preset threshold, determining the current calibration parameters, or a weighted average of the historical and current calibration parameters, as the target calibration parameters; and if the second deviation is greater than or equal to the second preset threshold, determining the historical or preset calibration parameters as the target calibration parameters.
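The two deviation checks above can be combined into one selection function. The deviation metric (here, absolute difference of focal length) and the threshold values are illustrative assumptions; the disclosure only specifies the comparison structure:

```python
def select_target(current, historical, preset=None,
                  first_threshold=5.0, second_threshold=10.0):
    """Choose the target calibration per the deviation rules described above.

    Deviation is measured here as |delta F| for illustration only; real
    deviations and thresholds are implementation choices.
    """
    if preset is not None:
        # Second deviation: preset vs. current parameters.
        if abs(preset["F"] - current["F"]) < second_threshold:
            return current  # (or a weighted average of historical and current)
        return historical   # (or the preset parameters, per the embodiment)
    # First deviation: historical vs. current parameters.
    if abs(historical["F"] - current["F"]) < first_threshold:
        return current      # (or a weighted average of historical and current)
    return historical

hist, cur = {"F": 3666.0}, {"F": 3669.0}
print(select_target(cur, hist)["F"])  # 3669.0 (deviation 3.0 < threshold 5.0)
```

A large deviation is treated as evidence that the fresh calibration is unreliable, so the system falls back to the historical (or preset) parameters.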
Specifically, the processor 610 may include, for example, a general-purpose microprocessor, an instruction-set processor and/or an associated chipset, and/or a special-purpose microprocessor (e.g., an application-specific integrated circuit (ASIC)). The processor 610 may also include onboard memory for caching purposes. It may be a single processing unit or multiple processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
The memory 620 may include a computer program 621, which may include code/computer-executable instructions that, when executed by the processor 610, cause the processor 610 to perform, for example, the method flows described above with reference to FIGS. 4-5 and any variants thereof.
The computer program 621 may be configured to have, for example, computer program code including computer program modules. For example, the code in the computer program 621 may include one or more program modules, e.g., module 621A, module 621B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations thereof according to the actual situation. When these combinations of program modules are executed by the processor 610, they enable the processor 610 to perform, for example, the method flows described above with reference to FIGS. 4-5 and any variants thereof.
According to an embodiment of the present disclosure, a readable storage medium is also provided, on which executable instructions are stored; when executed by a processor, the instructions cause the processor to perform the camera calibration method described above. The readable storage medium may have the same functions as the memory described above.
The readable storage medium may be included in the device/apparatus/system described in the above embodiments, or it may exist separately without being assembled into that device/apparatus/system. The readable storage medium carries one or more programs which, when executed, implement the method according to the embodiments of the present disclosure.
According to an embodiment of the present disclosure, the readable storage medium may be a non-volatile readable storage medium, including but not limited to: a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, a readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
According to an embodiment of the present disclosure, a control system for a movable platform is also provided. The control system may include one or more movable platforms and a control terminal for controlling the movable platforms; the processor in each movable platform can perform the camera calibration method described above.
According to an embodiment of the present disclosure, a movable platform is also provided, whose processor can perform the camera calibration method described above.
The flowcharts and block diagrams in the accompanying drawings illustrate possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, program segment, or portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that in some alternative implementations the functions noted in a block may occur out of the order shown in the figures: two successive blocks may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functionality involved. Each block in the block diagrams or flowcharts, and combinations of such blocks, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
Those skilled in the art will clearly understand that, for convenience and brevity of description, the division into the functional modules above is merely illustrative; in practical applications, the functions may be assigned to different functional modules as needed, i.e., the internal structure of the device may be divided into different functional modules to complete all or some of the functions described above. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
Finally, it should be noted that the above embodiments merely illustrate, rather than limit, the technical solutions of the present disclosure. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or replace some or all of the technical features with equivalents; where no conflict arises, the features of the embodiments of the present disclosure may be combined arbitrarily; and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present disclosure.

Claims (21)

  1. A camera parameter determination method, comprising:
    obtaining multiple frames of images captured by a camera on a movable platform, wherein the multiple frames of images include at least two frames with partially overlapping imagery;
    calibrating against the multiple frames of images to obtain calibration parameters of the camera at a current moment;
    obtaining historical calibration parameters of the camera, wherein the historical calibration parameters are the camera's calibration parameters used by the movable platform when performing a historical work task; and
    determining target calibration parameters from the historical calibration parameters and the current calibration parameters, wherein the target calibration parameters are for use by the movable platform when performing a new work task;
    wherein the calibration parameters comprise one or more of the camera's intrinsic parameters, distortion parameters, and extrinsic parameters.
  2. The method of claim 1, wherein obtaining the historical calibration parameters of the camera comprises:
    obtaining identification information of the camera;
    sending the camera's identification information to a server, so that the server obtains the historical calibration parameters corresponding to the camera based on the identification information; and
    receiving the historical calibration parameters corresponding to the camera from the server.
  3. The method of claim 2, wherein the server stores the calibration parameters respectively used by the movable platform when performing different historical work tasks.
  4. The method of claim 2 or 3, further comprising:
    after determining the target calibration parameters, sending the target calibration parameters to the server, so that the target calibration parameters corresponding to the camera are stored in the server.
  5. The method of claim 4, wherein the server stores the calibration parameters respectively used in each of N historical work tasks performed by the movable platform, where N is a preset fixed value; and when storing the target calibration parameters corresponding to the camera, the server deletes the calibration parameters corresponding to the historical work task most distant in time from the current work task.
  6. The method of claim 2, wherein obtaining the historical calibration parameters of the camera comprises:
    obtaining geographic location information of the camera during the current work task;
    wherein sending the camera's identification information to the server so that the server obtains the historical calibration parameters corresponding to the camera comprises:
    sending the camera's identification information and the geographic location information to the server, so that the server obtains the historical calibration parameters corresponding to the camera based on the identification information and the geographic location information.
  7. The method of claim 1, wherein determining the target calibration parameters from the camera's historical calibration parameters and the current calibration parameters comprises:
    computing a weighted average of the historical calibration parameters and the current calibration parameters; and
    determining the weighted average as the target calibration parameters.
  8. The method of claim 1, wherein determining the target calibration parameters from the camera's historical calibration parameters and the current calibration parameters comprises:
    computing a first deviation between the historical calibration parameters and the current calibration parameters;
    comparing the first deviation with a first preset threshold;
    if the first deviation is less than the first preset threshold, determining the current calibration parameters, or a weighted average of the historical and current calibration parameters, as the target calibration parameters; and
    if the first deviation is greater than or equal to the first preset threshold, determining the historical calibration parameters as the target calibration parameters.
  9. The method of claim 1, further comprising:
    obtaining preset calibration parameters of the camera;
    wherein determining the target calibration parameters from the camera's historical calibration parameters and the current calibration parameters comprises:
    determining the target calibration parameters from the preset calibration parameters, the historical calibration parameters, and the current calibration parameters.
  10. The method of claim 9, wherein determining the target calibration parameters from the preset, historical, and current calibration parameters comprises:
    computing a second deviation between the preset calibration parameters and the current calibration parameters;
    comparing the second deviation with a second preset threshold;
    if the second deviation is less than the second preset threshold, determining the current calibration parameters, or a weighted average of the historical and current calibration parameters, as the target calibration parameters; and
    if the second deviation is greater than or equal to the second preset threshold, determining the historical calibration parameters or the preset calibration parameters as the target calibration parameters.
  11. A camera parameter determination device, comprising:
    a processor; and
    a memory for storing one or more programs,
    wherein the one or more programs, when executed by the processor, cause the processor to:
    obtain multiple frames of images captured by a camera on a movable platform, wherein the multiple frames of images include at least two frames with partially overlapping imagery;
    calibrate against the multiple frames of images to obtain calibration parameters of the camera at a current moment;
    obtain historical calibration parameters of the camera, wherein the historical calibration parameters are the camera's calibration parameters used by the movable platform when performing a historical work task; and
    determine target calibration parameters from the historical calibration parameters and the current calibration parameters, wherein the target calibration parameters are for use by the movable platform when performing a new work task.
  12. The device of claim 11, wherein the processor obtaining the historical calibration parameters of the camera comprises:
    obtaining identification information of the camera;
    sending the camera's identification information to a server, so that the server obtains the historical calibration parameters corresponding to the camera based on the identification information; and
    receiving the historical calibration parameters corresponding to the camera from the server.
  13. The device of claim 12, wherein the server stores the calibration parameters respectively used by the movable platform when performing different historical work tasks;
    the parameters comprise at least one of the camera's intrinsic parameters, distortion parameters, and extrinsic parameters.
  14. The device of claim 12 or 13, wherein the processor further performs the following operation:
    after determining the target calibration parameters, sending the target calibration parameters to the server, so that the target calibration parameters corresponding to the camera are stored in the server.
  15. The device of claim 14, wherein the server stores the calibration parameters respectively used in each of N historical work tasks performed by the movable platform, where N is a preset fixed value; and when storing the target calibration parameters corresponding to the camera, the server deletes the calibration parameters corresponding to the historical work task most distant in time from the current work task.
  16. The device of claim 12, wherein the processor obtaining the historical calibration parameters of the camera comprises:
    obtaining geographic location information of the camera during the current work task;
    wherein sending the camera's identification information to the server so that the server obtains the historical calibration parameters corresponding to the camera comprises:
    sending the camera's identification information and the geographic location information to the server, so that the server obtains the historical calibration parameters corresponding to the camera based on the identification information and the geographic location information.
  17. The device of claim 11, wherein the processor determining the target calibration parameters from the camera's historical calibration parameters and the current calibration parameters comprises:
    computing a weighted average of the historical calibration parameters and the current calibration parameters; and
    determining the weighted average as the target calibration parameters.
  18. The device of claim 11, wherein the processor determining the target calibration parameters from the camera's historical calibration parameters and the current calibration parameters comprises:
    computing a first deviation between the historical calibration parameters and the current calibration parameters;
    comparing the first deviation with a first preset threshold;
    if the first deviation is less than the first preset threshold, determining the current calibration parameters, or a weighted average of the historical and current calibration parameters, as the target calibration parameters; and
    if the first deviation is greater than or equal to the first preset threshold, determining the historical calibration parameters as the target calibration parameters.
  19. The device of claim 11, wherein the processor further performs the following operation:
    obtaining preset calibration parameters of the camera;
    wherein determining the target calibration parameters from the camera's historical calibration parameters and the current calibration parameters comprises:
    determining the target calibration parameters from the preset calibration parameters, the historical calibration parameters, and the current calibration parameters.
  20. The device of claim 19, wherein the processor determining the target calibration parameters from the preset, historical, and current calibration parameters comprises:
    computing a second deviation between the preset calibration parameters and the current calibration parameters;
    comparing the second deviation with a second preset threshold;
    if the second deviation is less than the second preset threshold, determining the current calibration parameters, or a weighted average of the historical and current calibration parameters, as the target calibration parameters; and
    if the second deviation is greater than or equal to the second preset threshold, determining the historical calibration parameters or the preset calibration parameters as the target calibration parameters.
  21. A readable storage medium having executable instructions stored thereon, which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 10.
PCT/CN2020/092939 2020-05-28 2020-05-28 Camera parameter determination method, device and readable storage medium WO2021237574A1 (zh)


Publications (1)

Publication Number: WO2021237574A1 (zh), published 2021-12-02





