CN116993830A - Automatic calibration method for dynamic camera coordinate mapping - Google Patents

Automatic calibration method for dynamic camera coordinate mapping Download PDF

Info

Publication number
CN116993830A
CN116993830A CN202311036498.2A CN202311036498A CN116993830A CN 116993830 A CN116993830 A CN 116993830A CN 202311036498 A CN202311036498 A CN 202311036498A CN 116993830 A CN116993830 A CN 116993830A
Authority
CN
China
Prior art keywords
camera
picture
automatic calibration
coordinate mapping
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311036498.2A
Other languages
Chinese (zh)
Other versions
CN116993830B (en
Inventor
梁华
吕建明
李晓威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fu'an Digital Technology Co ltd
Original Assignee
Guangzhou Fu'an Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fu'an Digital Technology Co ltd filed Critical Guangzhou Fu'an Digital Technology Co ltd
Priority to CN202311036498.2A priority Critical patent/CN116993830B/en
Publication of CN116993830A publication Critical patent/CN116993830A/en
Application granted granted Critical
Publication of CN116993830B publication Critical patent/CN116993830B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)
  • Image Input (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an automatic calibration method for dynamic camera coordinate mapping, which comprises the following steps: s1, collecting data before and after the offset of a camera to obtain a picture pair set; s2, matching of characteristic points of the camera picture and the shifted picture is achieved based on a SIFT algorithm, and a characteristic point pair set corresponding to different pictures of the camera is obtained; s3, calculating to obtain a camera parameter set by utilizing the characteristic point pair sets of all the picture pairs; and S4, setting weights for each group of camera parameters to obtain camera parameters for correction, and realizing automatic calibration of dynamic camera coordinate mapping. The automatic calibration method for the dynamic camera coordinate mapping is adopted, manual intervention is not needed, automatic calibration of the dynamic camera coordinate mapping is realized through automatic optimization of camera parameters, and waste of human resources is reduced.

Description

Automatic calibration method for dynamic camera coordinate mapping
Technical Field
The invention relates to the technical field of camera monitoring, in particular to an automatic calibration method for dynamic camera coordinate mapping.
Background
The distance between mechanical gears of the driving motor and the precision of structural members can cause errors in the precision of preset positions, and the camera can accumulate errors when rotated for a long time, so that a picture in the current posture of the camera is offset to a certain extent from a picture in the original posture, and the position of an object in the picture is changed. After the camera picture is offset, the geographic position coordinate of the target object is not matched with the corresponding position in the video monitoring picture. At present, the correction of the camera picture is mainly finished by manual operation, and the manual operation is time-consuming and labor-consuming, so that a large amount of human resources are wasted, and the cost is increased.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide an automatic calibration method for the coordinate mapping of the dynamic camera, which does not need human intervention and realizes the automatic calibration of the coordinate mapping of the dynamic camera through the automatic optimization of camera parameters.
In order to achieve the above object, the present invention provides the following solutions: an automatic calibration method for dynamic camera coordinate mapping comprises the following steps:
s1, collecting data before and after the offset of a camera to obtain a picture pair set;
s2, matching of characteristic points of the camera picture and the shifted picture is achieved based on a SIFT algorithm, and a characteristic point pair set corresponding to different pictures of the camera is obtained;
s3, calculating to obtain a camera parameter set by utilizing the characteristic point pair sets of all the picture pairs;
and S4, setting weights for each group of camera parameters to obtain camera parameters for correction, and realizing automatic calibration of dynamic camera coordinate mapping.
Preferably, in the step S1, the following steps are included:
s11, before data of camera offset are collected, setting the zoom multiple of the camera to be Z, and obtaining pictures in different directions within 360 DEG range of the camera to obtain a picture set T 1 And then recording the gesture information of the camera corresponding to each picture to obtain a gesture information set:
M={(p 1 ,t 1 ,z 1 ),(p 2 ,t 2 ,z 2 ),(p 3 ,t 3 ,z 3 ),…,(p n ,t n ,z n )},n>0
wherein p is n To take the corresponding azimuth angle, t n To take the corresponding pitch angle, z of the nth picture n The scaling times corresponding to the nth picture are taken;
s12, collecting data of the offset of the camera, setting the zoom multiple of the camera to be Z, rotating the camera to the corresponding position in the M according to the attitude information set M, and obtaining a picture of a picture at the position to obtain a picture set T 2
S13, T is taken 1 And T 2 Pairing the pictures in the same posture to obtain a picture pair set:
T={(T 11 ,T 21 ),(T 12 ,T 22 ),(T 13 ,T 23 ),…,(T 1n ,T 2n )},n>0
wherein T is 1n For a set of pictures T 1 N-th picture, T 2n For a set of pictures T 2 The nth picture of (a).
Preferably, in the step S2, the pairing of the feature points of the camera frame and the shifted frame includes the following steps:
s21, establishing a multi-scale space of an image and a Gaussian pyramid image;
s22, subtracting 2 Gaussian images of adjacent scales to obtain a Gaussian difference multi-scale space, and obtaining a local extremum point;
s23, accurately positioning the obtained extreme points by a surface fitting method, and removing edge points and points with lower contrast in initial characteristic points by adopting a hessian matrix of the Gaussian difference image to obtain the characteristic points of the image.
S24, after the feature points of the image are obtained, matching 2 feature points by taking Euclidean distance as a similarity criterion of the multidimensional vector, and matching the feature points to obtain a feature point pair set:
P={[(x 11 ,y 11 ),(x 12 ,y 12 )],[(x 21 ,y 21 ),(x 22 ,y 22 )],…,[(x n1 ,y n1 ),(x n2 ,y n2 )]},n>0
wherein [ (x) n1 ,y n1 ),(x n2 ,y n2 )]Is the pixel coordinate pair of the nth feature point pair.
Preferably, feature point pairing is carried out on each group of pictures in the picture pair set T, so as to obtain feature point pair sets of all the picture pairs:
Ω={P 1 ,P 2 ,P 3 ,…,P m },m>0
wherein P is m And the feature point pair set is the m group of picture pairs.
Preferably, in the step S3, the camera parameters include an initial azimuth angle, an initial pitch angle, and an initial roll angle of the camera, and in order to implement correction of the camera, parameters after the camera is offset are calculated, including the following steps:
s31, before the camera is deviated, the conversion relation between the pixel coordinates and the longitude and latitude coordinates is as follows:
s32, after the camera is shifted, the conversion relation between the pixel coordinates and the longitude and latitude coordinates is as follows:
wherein H is (α,β,θ) The method is a conversion matrix related to an initial azimuth angle, an initial pitch angle and an initial roll angle, delta 1, delta 2 and delta 3 are the variation quantities of the initial azimuth angle, the initial pitch angle and the initial roll angle respectively, wherein (x, y) and (x ', y') are pixel coordinates of the same geographic position in two pictures before and after the offset of the camera, and (lon, lat) and (lon ', lat') are plane coordinates after the conversion of (x, y) and (x ', y').
Preferably, in the step S3, further includes:
s33, calculating an optimal solution of three variables (delta 1, delta 2 and delta 3) in the conversion relation between pixel coordinates after camera offset and longitude and latitude coordinates by adopting a genetic algorithm or other algorithms for searching the optimal solution, and calculating a camera parameter set according to a characteristic point pair set P, wherein the method comprises the following steps of:
s331, setting a threshold I, and taking Loss as a Loss function:
s332, if Loss is more than or equal to I, repeating the step S33 to obtain an optimal solution;
s333, if Loss is less than I, obtaining a corresponding camera parameter x= { a, b, c }, where a=α+Δ1, b=β+Δ2, and c=θ+Δ3;
s34, performing step S33 on the feature point pair set omega of all the picture pairs to obtain a camera parameter set K= { X 1 ,X 2 ,X 3 ,…,X m M > 0, loss function value set l= { Loss 1 ,Loss 2 ,…,Loss m },m>0;
Wherein X is m Camera parameters for mth group of picture pairs, loss m Is the loss function value for the mth group of picture pairs.
Preferably, in the step S4, in order to reduce the influence of strong wind and strong fog and special weather, a weight W is set for each optimized set of camera parameters m
Wherein i is the total number of loss function value sets;
according to the weight of each group of camera parameters, the camera parameters X' = (A, B, C) for correcting are calculated as follows:
wherein A, B and C are respectively the initial azimuth angle and the initial depression angle of the camera after correctionElevation angle and initial roll angle, (a) m ,b m ,c m ) And the camera parameters corresponding to the m-th group of picture pairs.
Preferably, after the correction of the camera, the conversion relationship between the pixel coordinates and the longitude and latitude coordinates is as follows:
and (3) performing the operations from the step S1 to the step S4 at intervals, and thus realizing the automatic calibration of the dynamic camera coordinate mapping.
Therefore, the automatic calibration method for the dynamic camera coordinate mapping is adopted, human intervention is not needed, and the automatic calibration of the dynamic camera coordinate mapping is realized through the automatic optimization of camera parameters.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of an automatic calibration method for dynamic camera coordinate mapping according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention aims to provide an automatic calibration method for dynamic camera coordinate mapping, which does not need human intervention and realizes the automatic calibration of the dynamic camera coordinate mapping through the automatic optimization of camera parameters.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
Fig. 1 is a diagram of an automatic calibration method for dynamic camera coordinate mapping according to the present invention, as shown in fig. 1, the present invention provides an automatic calibration method for dynamic camera coordinate mapping, comprising the following steps:
s1, collecting data before and after the offset of a camera to obtain a picture pair set;
in step S1, the method specifically includes the following steps:
s11, before data of camera offset are collected, setting the zoom multiple of the camera to be Z, and obtaining pictures in different directions within 360 DEG range of the camera to obtain a picture set T 1 And then recording the gesture information of the camera corresponding to each picture to obtain a gesture information set:
M={(p 1 ,t 1 ,z 1 ),(p 2 ,t 2 ,z 2 ),(p 3 ,t 3 ,z 3 ),…,(p n ,t n ,z n )},n>0
wherein p is n To take the corresponding azimuth angle, t n To take the corresponding pitch angle, z of the nth picture n The scaling times corresponding to the nth picture are taken;
s12, collecting data of the offset of the camera, setting the zoom multiple of the camera to be Z, rotating the camera to the corresponding position in the M according to the attitude information set M, and obtaining a picture of a picture at the position to obtain a picture set T 2
S13, T is taken 1 And T 2 Pairing the pictures in the same posture to obtain a picture pair set:
T={(T 11 ,T 21 ),(T 12 ,T 22 ),(T 13 ,T 23 ),…,(T 1n ,T 2n )},n>0
wherein T is 1n For picture collectionT 1 N-th picture, T 2n For a set of pictures T 2 The nth picture of (a).
S2, matching of characteristic points of the camera picture and the shifted picture is achieved based on a SIFT algorithm, and a characteristic point pair set corresponding to different pictures of the camera is obtained;
in step S2, the pairing of the feature points of the camera frame and the shifted frame includes the steps of:
s21, establishing a multi-scale space of an image and a Gaussian pyramid image;
s22, subtracting 2 Gaussian images of adjacent scales to obtain a Gaussian difference multi-scale space, and obtaining a local extremum point;
s23, accurately positioning the obtained extreme points by a surface fitting method, and removing edge points and points with lower contrast in initial characteristic points by adopting a hessian matrix of a Gaussian differential image to obtain the characteristic points of the image;
s24, after the feature points of the image are obtained, matching 2 feature points by taking Euclidean distance as a similarity criterion of the multidimensional vector, and matching the feature points to obtain a feature point pair set:
P={[(x 11 ,y 11 ),(x 12 ,y 12 )],[(x 21 ,y 21 ),(x 22 ,y 22 )],…,[(x n1 ,y n1 ),(x n2 ,y n2 )]},n>0
wherein [ (x) n1 ,y n1 ),(x n2 ,y n2 )]A pixel coordinate pair which is the nth feature point pair;
pairing the characteristic points of each group of pictures in the picture pair set T to obtain a characteristic point pair set of all the picture pairs:
Ω={P 1 ,P 2 ,P 3 ,…,P m },m>0
wherein P is m And the feature point pair set is the m group of picture pairs.
S3, calculating to obtain a camera parameter set by utilizing the characteristic point pair sets of all the picture pairs;
in step S3, the camera parameters include an initial azimuth angle, an initial pitch angle, and an initial roll angle of the camera, and in order to implement correction of the camera, parameters after the camera is offset are calculated, including the following steps:
s31, before the camera is deviated, the conversion relation between the pixel coordinates and the longitude and latitude coordinates is as follows:
s32, after the camera is shifted, the conversion relation between the pixel coordinates and the longitude and latitude coordinates is as follows:
wherein H is (α,β,θ) The method is a conversion matrix related to an initial azimuth angle, an initial pitch angle and an initial roll angle, delta 1, delta 2 and delta 3 are the variation quantities of the initial azimuth angle, the initial pitch angle and the initial roll angle respectively, wherein (x, y) and (x ', y') are pixel coordinates of the same geographic position in two pictures before and after the offset of the camera, and (lon, lat) and (lon ', lat') are plane coordinates after the conversion of (x, y) and (x ', y').
In addition, the step S3 further includes:
s33, calculating an optimal solution of three variables (delta 1, delta 2 and delta 3) in the conversion relation between pixel coordinates after camera offset and longitude and latitude coordinates by adopting a genetic algorithm or other algorithms for searching the optimal solution, and calculating a camera parameter set according to a characteristic point pair set P, wherein the method comprises the following steps of:
s331, setting a threshold I, and taking Loss as a Loss function:
s332, if Loss is more than or equal to I, repeating the step S33 to obtain an optimal solution;
s333, if Loss is less than I, obtaining a corresponding camera parameter x= { a, b, c }, where a=α+Δ1, b=β+Δ2, and c=θ+Δ3;
s34, performing step S33 on the feature point pair set omega of all the picture pairs to obtain a camera parameter set K= { X 1 ,X 2 ,X 3 ,…,X m M > 0, loss function value set l= { Loss 1 ,Loss 2 ,…,Loss m },m>0;
Wherein X is m Camera parameters for mth group of picture pairs, loss m Is the loss function value for the mth group of picture pairs.
S4, setting weights for each group of camera parameters to obtain camera parameters for correction, and realizing automatic calibration of dynamic camera coordinate mapping;
in step S4, a weight W is set for each optimized set of camera parameters in order to reduce the influence of heavy wind and heavy fog m
Wherein i is the total number of loss function value sets;
according to the weight of each group of camera parameters, the camera parameters X' = (A, B, C) for correcting are calculated as follows:
wherein A, B, C are respectively the initial azimuth angle, the initial pitch angle and the initial roll angle after the correction of the camera, (a) m ,b m ,c m ) The camera parameters corresponding to the m group of picture pairs;
after the correction of the camera, the conversion relation between the pixel coordinates and the longitude and latitude coordinates is as follows:
and (3) performing the operations from the step S1 to the step S4 at intervals, and thus realizing the automatic calibration of the dynamic camera coordinate mapping.
Therefore, the automatic calibration method for the dynamic camera coordinate mapping is adopted, human intervention is not needed, and the automatic calibration of the dynamic camera coordinate mapping is realized through the automatic optimization of camera parameters.
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other. For the system disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
The principles and embodiments of the present invention have been described herein with reference to specific examples, the description of which is intended only to assist in understanding the methods of the present invention and the core ideas thereof; also, it is within the scope of the present invention to be modified by those of ordinary skill in the art in light of the present teachings. In view of the foregoing, this description should not be construed as limiting the invention.

Claims (8)

1. An automatic calibration method for dynamic camera coordinate mapping is characterized by comprising the following steps:
s1, collecting data before and after the offset of a camera to obtain a picture pair set;
s2, matching of characteristic points of the camera picture and the shifted picture is achieved based on a SIFT algorithm, and a characteristic point pair set corresponding to different pictures of the camera is obtained;
s3, calculating to obtain a camera parameter set by utilizing the characteristic point pair sets of all the picture pairs;
and S4, setting weights for each group of camera parameters to obtain camera parameters for correction, and realizing automatic calibration of dynamic camera coordinate mapping.
2. The method according to claim 1, wherein in the step S1, the method comprises the steps of:
s11, before data of camera offset are collected, setting the zoom multiple of the camera to be Z, and obtaining pictures in different directions within 360 DEG range of the camera to obtain a picture set T 1 And then recording the gesture information of the camera corresponding to each picture to obtain a gesture information set:
M={(p 1 ,t 1 ,z 1 ),(p 2 ,t 2 ,z 2 ),(p 3 ,t 3 ,z 3 ),…,(p n ,t n ,z n )},n>0
wherein p is n To take the corresponding azimuth angle, t n To take the corresponding pitch angle, z of the nth picture n The scaling times corresponding to the nth picture are taken;
s12, collecting data of the offset of the camera, setting the zoom multiple of the camera to be Z, rotating the camera to the corresponding position in the M according to the attitude information set M, and obtaining a picture of a picture at the position to obtain a picture set T 2
S13, T is taken 1 And T 2 Pairing the pictures in the same posture to obtain a picture pair set:
T={(T 11 ,T 21 ),(T 12 ,T 22 ),(T 13 ,T 23 ),…,(T 1n ,T 2n )},n>0
wherein T is 1n For a set of pictures T 1 N-th picture, T 2n For a set of pictures T 2 The nth picture of (a).
3. The automatic calibration method of dynamic camera coordinate mapping according to claim 1, wherein in the step S2, pairing of the feature points of the camera frame and the offset frame includes the steps of:
s21, establishing a multi-scale space of an image and a Gaussian pyramid image;
s22, subtracting 2 Gaussian images of adjacent scales to obtain a Gaussian difference multi-scale space, and obtaining a local extremum point;
s23, accurately positioning the obtained extreme points by a surface fitting method, and removing edge points and points with lower contrast in initial characteristic points by adopting a hessian matrix of the Gaussian difference image to obtain the characteristic points of the image.
S24, after the feature points of the image are obtained, matching 2 feature points by taking Euclidean distance as a similarity criterion of the multidimensional vector, and matching the feature points to obtain a feature point pair set:
P={[(x 11 ,y 11 ),(x 12 ,y 12 )],[(x 21 ,y 21 ),(x 22 ,y 22 )],…,[(x n1 ,y n1 ),(x n2 ,y n2 )]},n>0
wherein [ (x) n1 ,y n1 ),(x n2 ,y n2 )]Is the pixel coordinate pair of the nth feature point pair.
4. The automatic calibration method for dynamic camera coordinate mapping according to claim 3, wherein feature point pairing is performed on each group of pictures in the picture pair set T to obtain a feature point pair set of all picture pairs:
Ω={P 1 ,P 2 ,P 3 ,…,P m },m>0
wherein P is m And the feature point pair set is the m group of picture pairs.
5. The automatic calibration method of dynamic camera coordinate mapping according to claim 1, wherein in the step S3, the camera parameters include an initial azimuth angle, an initial pitch angle, and an initial roll angle of the camera, and in order to implement correction of the camera, parameters after the camera is offset are calculated, including the following steps:
s31, before the camera is deviated, the conversion relation between the pixel coordinates and the longitude and latitude coordinates is as follows:
s32, after the camera is shifted, the conversion relation between the pixel coordinates and the longitude and latitude coordinates is as follows:
wherein H is (α,β,θ) The method is a conversion matrix related to an initial azimuth angle, an initial pitch angle and an initial roll angle, delta 1, delta 2 and delta 3 are the variation quantities of the initial azimuth angle, the initial pitch angle and the initial roll angle respectively, wherein (x, y) and (x ', y') are pixel coordinates of the same geographic position in two pictures before and after the offset of the camera, and (lon, lat) and (lon ', lat') are plane coordinates after the conversion of (x, y) and (x ', y').
6. The method according to claim 5, wherein in step S3, further comprising:
s33, calculating an optimal solution of three variables (delta 1, delta 2 and delta 3) in the conversion relation between pixel coordinates after camera offset and longitude and latitude coordinates by adopting a genetic algorithm or other algorithms for searching the optimal solution, and calculating a camera parameter set according to a characteristic point pair set P, wherein the method comprises the following steps of:
s331, setting a threshold I, and taking Loss as a Loss function:
s332, if Loss is more than or equal to I, repeating the step S33 to obtain an optimal solution;
s333, if Loss is less than I, obtaining a corresponding camera parameter x= { a, b, c }, where a=α+Δ1, b=β+Δ2, and c=θ+Δ3;
s34, performing step S33 on the feature point pair set omega of all the picture pairs to obtain a camera parameter set K= { X 1 ,X 2 ,X 3 ,…,X m M > 0, loss function value set l= { Loss 1 ,Loss 2 ,…,Loss m },m>0;
Wherein X is m Camera parameters for mth group of picture pairs, loss m Is the loss function value for the mth group of picture pairs.
7. The method according to claim 1, wherein in step S4, weights W are set for each optimized set of camera parameters in order to reduce the influence of heavy wind and heavy fog on special weather m
Wherein i is the total number of loss function value sets;
according to the weight of each group of camera parameters, the camera parameters X' = (A, B, C) for correcting are calculated as follows:
wherein A, B, C are respectively the initial azimuth angle, the initial pitch angle and the initial roll angle after the correction of the camera, (a) m ,b m ,c m ) And the camera parameters corresponding to the m-th group of picture pairs.
8. The automatic calibration method for dynamic camera coordinate mapping according to claim 7, wherein after the correction of the camera, the conversion relationship between the pixel coordinates and the longitude and latitude coordinates is:
and (3) performing the operations from the step S1 to the step S4 at intervals, and thus realizing the automatic calibration of the dynamic camera coordinate mapping.
CN202311036498.2A 2023-08-17 2023-08-17 Automatic calibration method for dynamic camera coordinate mapping Active CN116993830B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311036498.2A CN116993830B (en) 2023-08-17 2023-08-17 Automatic calibration method for dynamic camera coordinate mapping

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311036498.2A CN116993830B (en) 2023-08-17 2023-08-17 Automatic calibration method for dynamic camera coordinate mapping

Publications (2)

Publication Number Publication Date
CN116993830A true CN116993830A (en) 2023-11-03
CN116993830B CN116993830B (en) 2024-09-27

Family

ID=88526579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311036498.2A Active CN116993830B (en) 2023-08-17 2023-08-17 Automatic calibration method for dynamic camera coordinate mapping

Country Status (1)

Country Link
CN (1) CN116993830B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140104376A1 (en) * 2012-10-17 2014-04-17 Vivotek Inc. Linking-up photographing system and control method for linked-up cameras thereof
JP2016213535A (en) * 2015-04-30 2016-12-15 株式会社東芝 Camera calibration device, method and program
CN107333130A (en) * 2017-08-24 2017-11-07 歌尔股份有限公司 Assemble multi-cam module testing method and system
US20180322658A1 (en) * 2017-02-27 2018-11-08 Anhui Huami Information Technology Co.,Ltd. Camera Calibration
CN110567469A (en) * 2018-06-05 2019-12-13 北京市商汤科技开发有限公司 Visual positioning method and device, electronic equipment and system
CN111833394A (en) * 2020-07-27 2020-10-27 深圳惠牛科技有限公司 Camera calibration method and measuring method based on binocular measuring device
CN112991453A (en) * 2019-12-17 2021-06-18 杭州海康机器人技术有限公司 Calibration parameter calibration method and device for binocular camera and electronic equipment
CN113012047A (en) * 2021-03-26 2021-06-22 广州市赋安电子科技有限公司 Dynamic camera coordinate mapping establishing method and device and readable storage medium
CN113536655A (en) * 2021-04-07 2021-10-22 北京聚树核科技有限公司 Artificial intelligent deviation rectifying method and device for heliostat, electronic equipment and storage medium
CN114758011A (en) * 2022-04-13 2022-07-15 南京航空航天大学 Zoom camera online calibration method fusing offline calibration results

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140104376A1 (en) * 2012-10-17 2014-04-17 Vivotek Inc. Linking-up photographing system and control method for linked-up cameras thereof
JP2016213535A (en) * 2015-04-30 2016-12-15 株式会社東芝 Camera calibration device, method and program
US20180322658A1 (en) * 2017-02-27 2018-11-08 Anhui Huami Information Technology Co.,Ltd. Camera Calibration
CN107333130A (en) * 2017-08-24 2017-11-07 歌尔股份有限公司 Assemble multi-cam module testing method and system
CN110567469A (en) * 2018-06-05 2019-12-13 北京市商汤科技开发有限公司 Visual positioning method and device, electronic equipment and system
CN112991453A (en) * 2019-12-17 2021-06-18 杭州海康机器人技术有限公司 Calibration parameter calibration method and device for binocular camera and electronic equipment
CN111833394A (en) * 2020-07-27 2020-10-27 深圳惠牛科技有限公司 Camera calibration method and measuring method based on binocular measuring device
CN113012047A (en) * 2021-03-26 2021-06-22 广州市赋安电子科技有限公司 Dynamic camera coordinate mapping establishing method and device and readable storage medium
CN113536655A (en) * 2021-04-07 2021-10-22 北京聚树核科技有限公司 Artificial intelligent deviation rectifying method and device for heliostat, electronic equipment and storage medium
CN114758011A (en) * 2022-04-13 2022-07-15 南京航空航天大学 Zoom camera online calibration method fusing offline calibration results

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ZHANHUA WANG ET AL.: "A real-time correction and stitching algorithm for underwater fisheye images", 《SIGNAL, IMAGE AND VIDEO PROCESSING》, 22 February 2022 (2022-02-22) *
刘涌;黄丁发;刘志勤;贾渊;: "基于仿射变换和透视投影的摄像机镜头畸变校正方法", 西南科技大学学报, no. 03, 15 September 2010 (2010-09-15) *
王?;付强;姚江云;: "基于FPGA的全景相机系统的软件设计", 科技创新导报, no. 12, 21 April 2020 (2020-04-21) *
郭;王波;: "交叉路口摄像头视觉目标标定及误差校正仿真", 计算机仿真, no. 11, 15 November 2018 (2018-11-15) *

Also Published As

Publication number Publication date
CN116993830B (en) 2024-09-27

Similar Documents

Publication Publication Date Title
CN110211043B (en) Registration method based on grid optimization for panoramic image stitching
CN111076733B (en) Robot indoor map building method and system based on vision and laser slam
CN104484648B (en) Robot variable visual angle obstacle detection method based on outline identification
CN106529587B (en) Vision course recognition methods based on object detection
CN103198487B (en) A kind of automatic marking method for video monitoring system
CN102156970B (en) Fisheye image correction method based on distorted straight slope calculation
CN111583110A (en) Splicing method of aerial images
CN113222820B (en) Pose information-assisted aerial remote sensing image stitching method
CN113159466B (en) Short-time photovoltaic power generation prediction system and method
CN109801220B (en) Method for solving mapping parameters in vehicle-mounted video splicing on line
CN111899164B (en) Image splicing method for multi-focal-segment scene
CN106373088A (en) Quick mosaic method for aviation images with high tilt rate and low overlapping rate
CN107192375B (en) A kind of unmanned plane multiple image adaptive location bearing calibration based on posture of taking photo by plane
CN112561807B (en) End-to-end radial distortion correction method based on convolutional neural network
CN111768332A (en) Splicing method of vehicle-mounted all-around real-time 3D panoramic image and image acquisition device
CN109376641A (en) A kind of moving vehicle detection method based on unmanned plane video
CN113295171B (en) Monocular vision-based attitude estimation method for rotating rigid body spacecraft
CN111553945A (en) Vehicle positioning method
CN110223233B (en) Unmanned aerial vehicle aerial photography image building method based on image splicing
CN110660099A (en) Rational function model fitting method for remote sensing image processing based on neural network
CN116993830B (en) Automatic calibration method for dynamic camera coordinate mapping
CN108109118B (en) Aerial image geometric correction method without control points
CN103473782B (en) Least square matching method based on object-space vertical double-surface element
CN114581515B (en) Multi-camera calibration parameter optimization method based on optimal path conversion
CN112016568A (en) Method and device for tracking image feature points of target object

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant