CN115239816A - Camera calibration method, system, electronic device and storage medium - Google Patents
- Publication number
- CN115239816A (application number CN202210588999.0A)
- Authority
- CN
- China
- Prior art keywords
- camera
- calibration data
- reference calibration
- determining
- position information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (under G06T7/00—Image analysis)
- G06T5/80—Geometric correction (under G06T5/00—Image enhancement or restoration)
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light (under G06T7/50—Depth or shape recovery)
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods (under G06T7/70—Determining position or orientation of objects or cameras)
- G06T2207/30244—Camera pose (under G06T2207/30—Subject of image; Context of image processing, G06T2207/00—Indexing scheme for image analysis or image enhancement)
Abstract
The invention discloses a camera calibration method, a camera calibration system, an electronic device and a storage medium. The method comprises the following steps: controlling a camera to acquire image data of an identification feature, and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera; determining second position information of the identification feature in a target coordinate system according to the first position information; determining second external reference calibration data according to the first position information and the second position information; and determining second internal reference calibration data according to the first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and the second external reference calibration data, and determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data. In this way, originally complex high-precision calibration can be completed quickly, accurately and simply, and calibration efficiency is improved.
Description
Technical Field
The present invention relates to the field of mechanical automation technologies, and in particular, to a camera calibration method, a camera calibration system, an electronic device, and a computer-readable storage medium.
Background
In image measurement processes and machine vision applications, the correspondence between the three-dimensional spatial position of a point on the surface of a space object and the corresponding point in the image needs to be determined. This correspondence can be expressed by a camera imaging model, and the process of solving the parameters of the imaging model is called camera calibration.
In practice, the inventor of the present application finds that research on current calibration technology focuses mainly on calibration in indoor working environments, while for outdoor long-distance and large-target measurement scenes two methods predominate. In the first, a large calibration plate is manufactured to suit the actual environment, and the camera calibration is solved by acquiring dozens of pictures of the calibration plate in different positions and postures. This method places high demands on the calibration plate and the calibration operators: a large calibration plate is difficult to manufacture, inconvenient to carry or move during calibration, and easily deformed, which introduces errors. In the second, calibration is completed in a laboratory environment at the distance and angle of the engineering application environment, and the result is then applied directly in the engineering field; the calibration precision obtained in this way is often insufficient for practical application requirements. In addition, some target environments cannot accommodate a calibration plate at all, so calibration cannot be carried out.
Disclosure of Invention
The invention mainly solves the technical problem of providing a camera calibration method, a camera calibration system, an electronic device and a storage medium, which can complete originally complex high-precision calibration quickly, accurately and simply, thereby improving calibration efficiency.
In order to solve the technical problems, the invention adopts a technical scheme that: a camera calibration method is provided, which comprises the following steps:
controlling a camera to acquire image data of an identification feature, and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera;
determining second position information of the identification feature in a target coordinate system according to the first position information;
determining second external reference calibration data according to the first position information and the second position information;
and determining second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and the second external reference calibration data, and determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data.
Optionally, the controlling the camera to obtain image data of an identification feature, and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera includes:
controlling a camera to acquire image data of the identification features;
confirming pixel coordinate information corresponding to the identification feature according to the image data;
determining standardized coordinates of the identification feature in the camera coordinate system according to the first internal reference calibration data preset by the camera and the pixel coordinate information;
obtaining distance information d_i between the identification feature and the camera;
determining first position information (x_ci, y_ci, z_ci) of the identification feature in the camera coordinate system according to the standardized coordinates and the distance information.
Optionally, the controlling the camera to acquire image data of the identification feature includes:
controlling a laser range finder to emit laser to fall on a target plane to form a laser spot;
obtaining ranging information of a laser range finder and controlling a camera to obtain image data of the laser point, wherein the laser point is the identification feature;
the obtaining of the distance information between the identification feature and the camera includes:
and determining the distance information between the identification feature and the camera according to the distance measuring information and the position relation between the laser range finder and the camera.
Optionally, after determining the standardized coordinates of the identification feature in the camera coordinate system, the method further includes:
performing distortion correction on the standardized coordinates to obtain corrected normalized coordinates (x_cn, y_cn, 1).
Optionally, the determining, according to the first position information, second position information of the identification feature in a target coordinate system includes:
determining Euclidean distances between the identification features according to the first position information;
determining second position information (0, 0, 0), (x_2, 0, 0), (x_3, y_3, 0) of the identification features in a target coordinate system according to the Euclidean distances; wherein the target coordinate system is established based on the identification features;
the determining second external reference calibration data according to the first position information and the second position information includes:
and acquiring a conversion relation between a camera coordinate system and a target coordinate system, and determining second external reference calibration data according to the first position information, the second position information and the conversion relation.
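The two claim steps above can be sketched as follows. This is an illustrative sketch only, not the patent's reference implementation: the choice of three identification points P, Q, S, the helper names, and all numeric values are assumptions. The target coordinate system is built from the pairwise Euclidean distances of three non-collinear points, and the conversion relation between the camera and target coordinate systems is then recovered as a least-squares rigid transform via SVD:

```python
import numpy as np

def target_coords_from_distances(p_cam):
    """Place three non-collinear identification points, given in camera
    coordinates (one per row), into a target plane coordinate system:
    P -> (0, 0, 0), Q -> (x2, 0, 0), S -> (x3, y3, 0)."""
    d_pq = np.linalg.norm(p_cam[1] - p_cam[0])
    d_ps = np.linalg.norm(p_cam[2] - p_cam[0])
    d_qs = np.linalg.norm(p_cam[2] - p_cam[1])
    x2 = d_pq
    x3 = (d_ps**2 - d_qs**2 + x2**2) / (2 * x2)   # law of cosines in the plane
    y3 = np.sqrt(max(d_ps**2 - x3**2, 0.0))
    return np.array([[0.0, 0.0, 0.0], [x2, 0.0, 0.0], [x3, y3, 0.0]])

def rigid_fit(src, dst):
    """Least-squares rigid transform with dst ~ R @ src + t (Kabsch/SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # reject reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Example: the same three points seen in an arbitrary camera frame.
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 3.0])
pts_target = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.5, 1.5, 0.0]])
pts_cam = pts_target @ R_true.T + t_true
R, t = rigid_fit(pts_target, pts_cam)   # camera <-> target conversion relation
```

The recovered (R, t) plays the role of the second external reference calibration data for these points.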
Optionally, the determining second internal reference calibration data according to the first internal reference calibration data preset by the camera, the first external reference calibration data preset by the camera, and the second external reference calibration data includes:
acquiring first object distance information d_o^1 according to the first external reference calibration data preset by the camera, d_o^1 = T_3^1, where T_3^1 is the translation component T_3 in the first external reference calibration data; and acquiring first image distance information d_i^1 according to the first internal reference calibration data preset by the camera, d_i^1 = α^1·dx = β^1·dy;
determining focal length information based on the first object distance information and the first image distance information according to the Gaussian lens equation:
1/F = 1/d_o^1 + 1/d_i^1
wherein dx, dy are respectively the pixel sizes of a unit pixel in the horizontal and vertical directions; f_x = d_i/dx and f_y = d_i/dy are respectively the focal length information in the horizontal and vertical directions;
obtaining second object distance information d_o* according to the second external reference calibration data, d_o* = T_3*;
determining second internal reference calibration data based on the first internal reference calibration data preset by the camera, the focal length information and the second object distance information:
1/F = 1/d_o* + 1/d_i*, α* = d_i*/dx, β* = d_i*/dy.
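A minimal numerical sketch of this object-distance-driven internal reference update, assuming the Gaussian (thin-lens) relation 1/F = 1/d_o + 1/d_i with a fixed physical focal length F; the lens focal length, pixel size and distances below are illustrative values, not data from the patent:

```python
def updated_alpha(alpha1, dx, d_o1, d_o_star):
    """Update the pixel focal length alpha when the working object distance
    changes from d_o1 (reference calibration) to d_o_star (field value),
    assuming the Gaussian lens equation 1/F = 1/d_o + 1/d_i and d_i = alpha*dx."""
    d_i1 = alpha1 * dx                         # first image distance [mm]
    F = d_o1 * d_i1 / (d_o1 + d_i1)            # physical focal length [mm]
    d_i_star = F * d_o_star / (d_o_star - F)   # second image distance [mm]
    return d_i_star / dx                       # alpha* [pixel]

# Illustration: a 50 mm lens, 5 um pixels, reference calibration at 10 m,
# field object distance 20 m.
F_mm, dx_mm = 50.0, 0.005
alpha1 = (F_mm * 10000.0 / (10000.0 - F_mm)) / dx_mm
alpha_star = updated_alpha(alpha1, dx_mm, 10000.0, 20000.0)
```

The same update applies to β with dy in place of dx.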
optionally, the determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data includes:
the second external reference calibration data and the second internal reference calibration data are respectively used as new first external reference calibration data and new first internal reference calibration data, and after data updating, new second external reference calibration data and new second internal reference calibration data are obtained through calculation;
and repeating the steps to perform iterative optimization until an iteration termination condition is met, and determining final second external reference calibration data and final second internal reference calibration data as the calibration result of the camera.
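The alternation between external and internal reference updates described above can be sketched as a generic fixed-point iteration. Here `solve_external` and `update_internal` are caller-supplied placeholders standing in for the claim steps (they are not real library calls), and the scalar toy functions exist only to demonstrate the termination behaviour:

```python
def iterate_calibration(solve_external, update_internal, K1, E1,
                        max_iter=50, tol=1e-9):
    """Alternate the two claim steps until the calibration data stop changing.
    solve_external(K) -> E stands for 'compute second external reference data',
    update_internal(K, E) -> K for 'update internal reference data'."""
    K, E = K1, E1
    for _ in range(max_iter):
        E_new = solve_external(K)
        K_new = update_internal(K, E_new)
        if abs(K_new - K) < tol and abs(E_new - E) < tol:
            return K_new, E_new            # iteration termination condition met
        K, E = K_new, E_new
    return K, E

# Toy stand-ins with a known fixed point (K, E) = (2, 2):
K_fin, E_fin = iterate_calibration(lambda K: 0.5 * K + 1.0,
                                   lambda K, E: 0.5 * E + 1.0,
                                   K1=0.0, E1=0.0)
```

In the patent's setting K and E would be the internal and external calibration data rather than scalars, but the control flow is the same.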
In order to solve the technical problem, the invention adopts another technical scheme that: providing a camera calibration system, the system comprising:
the first acquisition module is used for controlling a camera to acquire image data of the identification feature and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera;
the second acquisition module is used for determining second position information of the identification feature in a target coordinate system according to the first position information;
the external parameter determining module is used for determining second external parameter calibration data according to the first position information and the second position information;
and the calibration module is used for determining second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and the second external reference calibration data, and determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data.
In order to solve the technical problem, the invention adopts another technical scheme that: there is provided an electronic device, the device comprising: a memory and a processor coupled to the memory; the processor calls the program data stored in the memory to execute the flow steps of the camera calibration method.
In order to solve the technical problem, the invention adopts another technical scheme that: there is provided a computer readable storage medium having stored therein program instructions which are executed to implement the process steps of the camera calibration method as described above.
Different from the prior art, the first internal reference calibration data and the first external reference calibration data are determined through pre-calibration, or the reference data of a factory calibration are obtained directly. When field engineering calibration is carried out, the camera is controlled to obtain one or more images containing the identification features, and the second external reference calibration data and the second internal reference calibration data are calculated from these images together with the first internal reference calibration data and the first external reference calibration data, so that the calibration result of the camera is determined quickly. This greatly reduces the operational and time complexity of calibration in the engineering field and improves field calibration efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without creative effort. Wherein:
fig. 1 is a schematic flowchart of an embodiment of a camera calibration method provided in the present invention;
FIG. 2 is a flowchart illustrating an embodiment of step S101 according to the present invention;
FIG. 3 is a flowchart illustrating an embodiment of step S21 according to the present invention;
FIG. 4 is a schematic flowchart of an embodiment of the present invention after step S23;
FIG. 5 is a flowchart illustrating an embodiment of step S102 according to the present invention;
FIG. 6 is a schematic diagram of the coordinates of the identification features P, Q and S in the target plane according to the invention;
FIG. 7 is a flowchart illustrating an embodiment of step S104 according to the present invention;
FIG. 8 is a schematic flowchart illustrating another embodiment of step S104 according to the present invention;
FIG. 9 is a schematic structural diagram of an embodiment of a camera calibration system according to the present invention;
FIG. 10 is a schematic structural diagram of an embodiment of an electronic device in the invention;
FIG. 11 is a schematic structural diagram of an embodiment of a computer-readable storage medium in the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
Reference in the application to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The steps in the embodiments of the present application are not necessarily processed according to the described step sequence, and may be optionally rearranged in a random manner, or steps in the embodiments may be deleted, or steps in the embodiments may be added according to requirements.
The term "and/or" in embodiments of the present application refers to any and all possible combinations including one or more of the associated listed items. It is also to be noted that: when used in this specification, the terms "comprises/comprising" specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The terms "first", "second", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In addition, although the terms "first", "second", etc. are used several times in this application to describe various data (or various elements or various applications or various instructions or various operations), etc., these data (or elements or applications or instructions or operations) should not be limited by these terms. These terms are only used to distinguish one data (or element or application or instruction or operation) from another data (or element or application or instruction or operation). For example, the first position information may be referred to as second position information, and the second position information may also be referred to as first position information, only the ranges of which are included are different, without departing from the scope of the present application, and the first position information and the second position information are each a set of various position and orientation information, only that they are not the same set of position and orientation information.
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating a camera calibration method according to an embodiment of the invention. It should be noted that the method of the present invention is not limited to the flow sequence shown in fig. 1 if the results are substantially the same. As shown in fig. 1, the method comprises the steps of:
s101, controlling the camera to acquire image data of the identification features, and determining first position information of the identification features in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera.
Before the camera is controlled to acquire image data of the identification features, the camera is set up and the identification on the measured target plane is selected according to actual engineering requirements. That is, the identification is preset at a suitable position on the measured target according to the object distance of the measured target, field environment limitations, target position and the like. The camera is then set up and its posture adjusted so that the preset identification is located within the field of view of the camera, preferably at the center of the field of view, and the lens aperture and lens focusing ring are adjusted so that the preset identification has suitable brightness and is in a clearly focused state.
Optionally, the method of selecting the identifier includes selecting a plurality of identifier features on the target itself, placing a conventional calibration object, a speckle pattern, and the like, or laser dotting may be performed in a suitable manner according to different application scenarios.
Optionally, the camera here includes a monocular camera, a binocular camera, a depth camera, and the like.
Alternatively, in the process of adjusting the lens focusing ring, the focal length F of the lens does not change when a fixed-focus lens is used; focusing simply adjusts the distance from the optical center of the lens to the image plane of the camera, i.e. the light-sensing device.
In some embodiments, adjusting the pose of the camera comprises adjusting a pitch angle, a yaw angle, a roll angle of the camera; for binocular cameras, the included angle between the cameras can be adjusted, so that the target is located in the visual fields of the left camera and the right camera at the same time.
In an embodiment of the application, as shown in fig. 2, the step S101 specifically includes:
s21, controlling a camera to acquire image data of the identification features;
here, the image data is generally a two-dimensional image.
Substep S22, confirming the pixel coordinate information corresponding to the identification features according to the image data;
Specifically, a suitable recognition algorithm is selected according to the characteristics of the identification feature to confirm the corresponding pixel coordinate information. For example, if the identification feature is circular, the circle center can be fitted by using a circle-center detection algorithm, and the obtained circle-center pixel coordinates are the pixel coordinate information corresponding to the identification feature.
Alternatively, for visualized image data, the identification features may be manually selected to determine their corresponding pixel coordinate information.
Optionally, the image data may be preprocessed, for example by image denoising, image enhancement, image filtering or rotation transformation, to facilitate subsequent image processing.
Substep S23, determining the standardized coordinates of the identification feature in the camera coordinate system according to the first internal reference calibration data preset by the camera and the pixel coordinate information;
Specifically, the camera is preset with first internal reference calibration data. This known calibration data may be the factory calibration result of the camera, or other indoor or outdoor calibration results.
For a particular camera, the conversion formula between the pixel coordinates (u, v) of an imaging point in the pixel coordinate system and the normalized coordinates (x_cn, y_cn, 1) of the point in the image coordinate system is as follows:
u = α·x_cn + γ·y_cn + u_0, v = β·y_cn + v_0
The conversion formula between the normalized coordinates (x_cn, y_cn, 1) of the point in the camera coordinate system and the three-dimensional coordinates (x_c, y_c, z_c) of the point in the camera coordinate system is as follows:
x_cn = x_c/z_c, y_cn = y_c/z_c
the pixel coordinate system O-UV takes the upper left corner point of the image matrix as an original point, u and v axes are respectively parallel to the imaging target surface, namely two sides of the image plane, and the coordinate unit is a pixel (pixel).
The camera coordinate system O-XcYcZc takes the optical center of the camera lens as an origin, the Xc axis and the Yc axis are respectively parallel to the x axis and the y axis of the image coordinate system, the Zc axis is coincident with the optical axis of the camera, and the coordinate unit is millimeter (mm) or other.
Wherein u_0, v_0 represent the pixel coordinates of the optical center of the camera lens in the image plane, i.e. the principal point coordinates; γ is a tilt coefficient, representing the non-perpendicularity of the u-axis and v-axis in the pixel coordinate system, and is typically 0 for a standard camera; α = f/dx and β = f/dy; f is the distance from the optical center of the camera to the imaging target surface, which can be understood as the image distance; dx, dy are the pixel sizes, i.e. the physical size of a unit pixel in the horizontal and vertical directions, in mm/pixel.
Specifically, the first internal reference calibration data is expressed as the matrix K^1 = [[α^1, γ^1, u_0^1], [0, β^1, v_0^1], [0, 0, 1]], and the pixel coordinate information is (u_d, v_d). According to the above formula, the standardized coordinates (x_dn, y_dn, 1) in the camera coordinate system can be obtained, where (u_d, v_d) represents the identification-feature pixel coordinates containing lens distortion, and (x_dn, y_dn, 1) represents the standardized coordinates of the identification feature, containing lens distortion, in the camera coordinate system.
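The pixel-to-normalized-coordinate conversion can be sketched directly by inverting the intrinsic matrix; the intrinsic values below are illustrative, not calibration data from the patent:

```python
import numpy as np

# First internal reference calibration data K^1 (illustrative values).
K1 = np.array([[1000.0,    0.0, 640.0],
               [   0.0, 1000.0, 360.0],
               [   0.0,    0.0,   1.0]])

def pixel_to_normalized(K, uv):
    """Invert u = alpha*x_cn + gamma*y_cn + u0, v = beta*y_cn + v0:
    solve K @ (x_cn, y_cn, 1) = (u, v, 1) for the normalized coordinates."""
    return np.linalg.solve(K, np.array([uv[0], uv[1], 1.0]))

xn = pixel_to_normalized(K1, (740.0, 460.0))
# x_cn = (740 - 640) / 1000 = 0.1, y_cn = (460 - 360) / 1000 = 0.1
```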
Substep S24, obtaining distance information between the identification feature and the camera;
Specifically, the distance information d_i here may be measured by a ranging tool, such as radar ranging, laser ranging, tape-measure ranging, and the like.
And a substep S25 of determining first position information of the identification feature in the camera coordinate system according to the standardized coordinates and the distance information.
According to the euclidean distance formula, when the distance from the real point to the camera is d, the following relationship exists:
(x_c − 0)² + (y_c − 0)² + (z_c − 0)² = d²
Through derivation, it can be obtained that
z_c = d / √(x_cn² + y_cn² + 1)
Therefore, based on the normalized coordinates (x_cn, y_cn, 1) and the distance information d_i, z_ci can be obtained, and further the coordinates (x_ci, y_ci, z_ci) of the identification feature in the camera coordinate system, with x_ci = x_cn·z_ci and y_ci = y_cn·z_ci.
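The depth recovery above can be sketched in a few lines (the numeric inputs are illustrative):

```python
import math

def camera_point_from_distance(x_cn, y_cn, d):
    """Recover (x_c, y_c, z_c) from normalized coordinates and the measured
    feature-to-camera distance d, using x_c = x_cn*z_c, y_c = y_cn*z_c and
    x_c^2 + y_c^2 + z_c^2 = d^2, i.e. z_c = d / sqrt(x_cn^2 + y_cn^2 + 1)."""
    z_c = d / math.sqrt(x_cn**2 + y_cn**2 + 1.0)
    return (x_cn * z_c, y_cn * z_c, z_c)

p = camera_point_from_distance(0.1, 0.1, 5.0)
```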
In an embodiment of the application, as shown in fig. 3, the step S21 specifically includes:
s31, controlling a laser range finder to emit laser to fall on a target plane to form a laser point;
the target plane is an actual plane or an approximate plane where the monitored target is located, which is determined according to the visual measurement requirement.
Optionally, the laser range finder may be built into the camera system or be an external laser range finder; the two usage scenarios differ. A built-in laser range finder is mainly applied in short-distance scenes with weak ambient light, while an external laser range finder has a large effective ranging distance, up to about 100 meters, and can be applied to scenes with long distances, large targets and strong ambient light. In addition, a range finder with higher power or a higher range can effectively measure distances of more than 1-2 kilometers.
S32, obtaining ranging information of the laser range finder and controlling a camera to obtain image data of a laser point, wherein the laser point is an identification feature;
specifically, the angle of the laser range finder can be finely adjusted, a plurality of different laser points are projected on the target plane, and each laser point is projected to obtain the range finding information of the laser range finder and control the camera to shoot the image.
Optionally, for a binocular camera, the laser range finders respectively attached to the two cameras need to be operated, so that two laser points projected by the two laser range finders coincide, and distance measurement and shooting can be performed. For example, the laser range finder attached to the left camera may be controlled to project a laser spot onto a target plane in the public view, the laser range finder attached to the right camera may be controlled to project a laser spot onto the same point so that the two laser spots coincide with each other, and then respective captured images of the two cameras and respective range finding information of the two laser range finders may be obtained.
Alternatively, it is generally desirable to project at least three non-collinear laser spots as the identifying feature.
Optionally, the obtained image data is processed to obtain the laser optical center. The laser optical center may be selected manually on the magnified image in a visual interface, with the laser spot center generally used as the laser center. Alternatively, the laser area can be located automatically by setting a gray-value threshold to determine the laser optical center, or the laser optical center can be identified by an existing recognition algorithm, which is not described here again.
The step S24 specifically includes determining distance information between the identification feature and the camera according to the distance measurement information and the position relationship between the laser range finder and the camera.
Optionally, for a built-in laser range finder, the laser exit point and the optical center of the camera are generally arranged on roughly the same vertical line and very close together, so the distance between them is negligible relative to the distance between the marker point and the camera. The ranging information of the laser range finder is therefore substantially equal to the distance between the laser point and the camera, that is, the distance information between the identification feature and the camera.
Optionally, for an independent external laser range finder, if the position relationship between the laser range finder and the camera is known, the distance information between the identification feature and the camera may also be calculated according to the range information.
Further, for a system in which the positional relationship between the camera and the laser range finder is fixed, the positional relationship between the laser range finder and the camera may be calibrated in advance.
In an embodiment of the present application, as shown in fig. 4, after the step S23, the method further includes performing distortion correction on the standardized coordinates to obtain corrected standardized coordinates;
specifically, the distortion solution is carried out using the distortion coefficients obtained from the known calibration result, yielding the corrected standardized coordinates (x_cn, y_cn, 1) of the identification feature.
Specifically, distortion is the image deformation that a camera inevitably produces because of its own imaging characteristics. Intrinsic parameters describe the camera's internal characteristics, including the image center, the focal length, and so on. Distortion can be divided into two types: tangential distortion and radial distortion. Radial distortion arises because rays bend more far from the center of the lens than near it, and includes both barrel and pincushion distortion. Tangential distortion arises from the lens not being perfectly parallel to the image plane. The distortion coefficients are not affected by the camera's shooting resolution or other such factors; they are an inherent property of the camera itself.
Alternatively, a common model of radial distortion is represented as follows, where k_1, k_2, k_3 are the radial distortion coefficients:
x_d = x·(1 + k_1·r^2 + k_2·r^4 + k_3·r^6), y_d = y·(1 + k_1·r^2 + k_2·r^4 + k_3·r^6), with r^2 = x^2 + y^2.
A common model for tangential distortion is represented as follows, the tangential distortion coefficients being p_1, p_2:
x_d = x + 2·p_1·x·y + p_2·(r^2 + 2x^2), y_d = y + p_1·(r^2 + 2y^2) + 2·p_2·x·y.
Optionally, the distortion correction solution may also be performed according to other distortion models, such as a division model, a Brown distortion model, and a higher-order polynomial model, which is not described herein again.
Alternatively, for a lens with little distortion, the distortion may be ignored.
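The radial and tangential models can be applied and inverted numerically. The sketch below keeps the common k_1/k_2 radial and p_1/p_2 tangential terms (dropping k_3 for brevity) and uses a fixed-point loop as one possible inverse solution; the patent does not prescribe a particular inversion method, so this is an illustrative choice:

```python
def distort(x, y, k1, k2, p1, p2):
    """Forward radial + tangential distortion on normalized coordinates."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

def undistort(xd, yd, k1, k2, p1, p2, iters=20):
    """Inverse solution by fixed-point iteration: start from the distorted
    point and repeatedly subtract the estimated distortion residual."""
    x, y = xd, yd
    for _ in range(iters):
        ex, ey = distort(x, y, k1, k2, p1, p2)
        x, y = x + (xd - ex), y + (yd - ey)
    return x, y
```

For the mild distortion typical of calibrated lenses the residual shrinks quickly, so a couple of dozen iterations recover the undistorted coordinates to machine precision.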
And S102, determining second position information of the identification features in the target coordinate system according to the first position information.
The target coordinate system is a three-dimensional coordinate system established in a target plane, and is generally determined based on the identification features.
In an embodiment of the present application, as shown in fig. 5, the step S102 further includes,
substep S51, determining Euclidean distance between the identification features according to the first position information;
specifically, based on the principle of distance invariance, once the coordinates (x_ci, y_ci, z_ci) of the identification features in the camera coordinate system are known, the Euclidean distance between the identification features can be determined.
Specifically, assume there are three non-collinear identification features, such as identification points P, Q, S, whose coordinates in the camera coordinate system are (x_c1, y_c1, z_c1), (x_c2, y_c2, z_c2), (x_c3, y_c3, z_c3) respectively. The Euclidean distance between P and Q is then d_12, the Euclidean distance between P and S is d_13, and the Euclidean distance between Q and S is d_23.
The substep S52, determining second position information of the identification features under a target coordinate system according to Euclidean distances among the identification features; wherein the target coordinate system is established based on the identification features;
specifically, a target coordinate system can be established based on the three identification features, the plane in which they lie standing in for the target plane and representing the O-XY plane of the target coordinate system. Suppose P determines the origin of the target coordinate system, the PQ line determines the X-axis direction, the perpendicular to the X-axis through S determines the Y-axis direction, and the Z-axis is perpendicular to the O-XY plane, the target coordinate system being established according to the right-hand rule. The coordinates of P, Q, S in the target coordinate system can then be set to (0, 0, 0), (x_2, 0, 0), (x_3, y_3, 0), with corresponding pixel coordinates for P, Q and S respectively.
Based on the invariance of the distance in the transformation process of the coordinate system, namely, the Euclidean distance obtained by coordinate calculation of P and Q in the camera coordinate system is equal to the Euclidean distance obtained by coordinate calculation of P and Q in the target coordinate system, the following steps are provided:
x_2 = |Q − P| = d_12
As is evident from FIG. 6, x_2 equals the Euclidean distance d_12 between P and Q, i.e., the modulus |Q − P| of the vector PQ; x_3 equals the projection of the vector PS onto the direction of PQ, i.e., the inner product of PS with the unit vector along PQ; with P, Q, S arranged counterclockwise as in the figure, y_3 equals the projection of PS onto the perpendicular to PQ, i.e., the magnitude of the outer product of PS with the unit vector along PQ.
In this way, the coordinates of each point in the target coordinate system can be obtained merely from the definition of the target world coordinate system and simple geometric operations: the coordinates of P, Q and S in the target coordinate system are computed quickly, and the amount of calculation is reduced.
Optionally, the target coordinate system may also be established according to actual requirements, in which case the coordinates of P, Q and S in the target coordinate system are obtained by other geometric or analytic methods, which are not described here.
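The construction of the second position information from the first position information can be sketched as follows; this is a minimal illustration of the geometry described above, with helper names of our choosing:

```python
import math

def sub(a, b): return tuple(ai - bi for ai, bi in zip(a, b))
def dot(a, b): return sum(ai * bi for ai, bi in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))

def target_coordinates(P, Q, S):
    """Coordinates of P, Q, S in the target frame: P is the origin,
    PQ fixes the X axis, S lies in the O-XY plane with y_3 > 0."""
    pq, ps = sub(Q, P), sub(S, P)
    x2 = norm(pq)                          # |PQ| = d_12
    ex = tuple(c / x2 for c in pq)         # unit vector along the X axis
    x3 = dot(ps, ex)                       # projection of PS on the X axis
    perp = sub(ps, tuple(x3 * c for c in ex))
    y3 = norm(perp)                        # component of PS perpendicular to PQ
    return (0.0, 0.0, 0.0), (x2, 0.0, 0.0), (x3, y3, 0.0)
```

For example, marker points at (0, 0, 5), (3, 0, 5), (1, 2, 5) in the camera frame map to (0, 0, 0), (3, 0, 0), (1, 2, 0) in the target frame, matching the distance-invariance derivation.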
S103, determining second external reference calibration data according to the first position information and the second position information.
Further, a conversion relation between the camera coordinate system and the target coordinate system is obtained, and second external reference calibration data is determined according to the first position information, the second position information and the conversion relation.
In particular, for a given camera, the conversion between the three-dimensional coordinates (x_c, y_c, z_c) of a real point in the camera coordinate system and its three-dimensional coordinates (x, y, z) in the target coordinate system is:
(x_c, y_c, z_c)^T = R · (x, y, z)^T + T
where [R | T] is the external reference (extrinsic) matrix, R is the rotation matrix, a 3×3 orthonormal matrix, and T is the translation vector, i.e., the coordinates of the origin of the target coordinate system expressed in the camera coordinate system;
specifically, taking P, Q, S as an example, from the first position information (x_c1, y_c1, z_c1), (x_c2, y_c2, z_c2), (x_c3, y_c3, z_c3), the second position information (0, 0, 0), (x_2, 0, 0), (x_3, y_3, 0), the physical meaning of the identification features in the target coordinate system and the definition of the rotation matrix R, the following holds:
where the vectors r_1, r_2, r_3 (the columns of R) satisfy ||r_1|| = 1, ||r_2|| = 1 and r_1 · r_2 = 0, which can be used to verify the above calculation result.
Substituting the first position information (x_c1, y_c1, z_c1), (x_c2, y_c2, z_c2), (x_c3, y_c3, z_c3) and the second position information (0, 0, 0), (x_2, 0, 0), (x_3, y_3, 0) into the formula above yields the second external reference calibration data, expressed as the matrix [R | T].
With this method, the value of each component of the external reference calibration matrix can be obtained quickly through simple geometric calculation, without solving a complex multivariate equation system, thereby reducing the amount of computation.
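The extrinsic computation above can be sketched as follows, under the reading that the columns of R are the target-frame axes expressed in camera coordinates and T is the camera-frame position of the target origin; the function names are illustrative:

```python
import math

def extrinsics(Pc, Qc, Sc):
    """R (columns r_1, r_2, r_3) and T such that X_cam = R @ X_target + T,
    for the target frame built on the three marker points P, Q, S."""
    def sub(a, b): return [ai - bi for ai, bi in zip(a, b)]
    def dot(a, b): return sum(ai * bi for ai, bi in zip(a, b))
    def unit(a):
        n = math.sqrt(dot(a, a))
        return [c / n for c in a]
    r1 = unit(sub(Qc, Pc))                     # X axis: along PQ
    ps = sub(Sc, Pc)
    t = dot(ps, r1)
    r2 = unit(sub(ps, [t * c for c in r1]))    # Y axis: Gram-Schmidt on PS
    r3 = [r1[1] * r2[2] - r1[2] * r2[1],       # Z axis = r1 x r2
          r1[2] * r2[0] - r1[0] * r2[2],
          r1[0] * r2[1] - r1[1] * r2[0]]
    R = [[r1[i], r2[i], r3[i]] for i in range(3)]  # columns r1, r2, r3
    T = list(Pc)       # target origin expressed in camera coordinates
    return R, T
```

By construction ||r_1|| = ||r_2|| = 1 and r_1 · r_2 = 0, which is exactly the verification criterion stated above.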
S104, second internal reference calibration data are determined according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and second external reference calibration data, and a calibration result of the camera is determined based on the second external reference calibration data and the second internal reference calibration data.
In an embodiment of the present application, as shown in fig. 7, the step S104 further includes,
Substep S71, obtaining first object distance information d_o^1 according to the first external reference calibration data preset by the camera, d_o^1 = T_3^1, where T_3^1 is the component T_3 of the first external reference calibration data; and acquiring first image distance information d_i^1 according to the first internal reference calibration data preset by the camera, d_i^1 = α_1·dx = β_1·dy;
where the first external reference calibration data is expressed as the matrix [R^1 | T^1] and the first internal reference calibration data as the matrix [[α_1, γ, u_0], [0, β_1, v_0], [0, 0, 1]]; dx and dy are the pixel sizes of a unit pixel in the horizontal and vertical directions respectively, the horizontal and vertical directions referring to the x-axis and y-axis directions of the image coordinate system O-X_cn Y_cn.
In particular, object distance in physics refers to the distance from an object to the optical center of the lens; here the z translation component T_3 of the external reference matrix approximately replaces the object distance information, giving d_o^1 = T_3^1.
In particular, image distance is the distance between the image of the object and the optical center of the lens. By definition α_1 = f/dx and β_1 = f/dy, where f is the distance from the optical center of the camera to the imaging target surface; f approximately replaces the image distance, giving d_i^1 = α_1·dx = β_1·dy.
Substep S72, determining the focal length information based on the first object distance information and the first image distance information:
1/f_x = 1/d_o^1 + 1/(α_1·dx), 1/f_y = 1/d_o^1 + 1/(β_1·dy),
where f_x, f_y are the focal length information in the horizontal and vertical directions respectively, the horizontal and vertical directions referring to the x-axis and y-axis directions of the image coordinate system O-X_cn Y_cn.
In particular, according to the Gaussian imaging formula 1/f = 1/d_o + 1/d_i, substituting the relevant data yields f_x and f_y.
Substep S73, obtaining second object distance information d_o* according to the second external reference calibration data, d_o* = T_3*;
Substep S74, determining the second internal reference calibration data based on the first internal reference calibration data preset by the camera, the focal length information, and the second object distance information:
specifically, the second internal reference calibration data is expressed as the matrix [[α*, γ, u_0], [0, β*, v_0], [0, 0, 1]], where u_0, v_0 and γ are related only to the camera's own properties, and their values are essentially unchanged.
In particular, for a fixed-focus lens, f_x and f_y likewise remain unchanged; according to the Gaussian formula, the second image distance satisfies 1/f_x = 1/d_o* + 1/(α*·dx) and 1/f_y = 1/d_o* + 1/(β*·dy).
here, dx and dy are intrinsic properties of the camera, and their values are basically unchanged and can be derived
Specifically, if the obtained second external reference calibration data and second internal reference calibration data are verified to meet the accuracy requirement of the target scene, they can be used directly as the calibration result of the camera in this scene.
In an embodiment of the present application, as shown in fig. 8, the step S104 further includes,
substep S81, taking the second external reference calibration data and the second internal reference calibration data as the new first external reference calibration data and new first internal reference calibration data respectively, and, after updating the data, calculating new second external reference calibration data and new second internal reference calibration data;
specifically, if the obtained second external reference calibration data and second internal reference calibration data are found not to meet the accuracy requirement of the target scene, the data are used as the preset values, i.e., the initial values of the iteration, and further optimized according to the method of the present application.
And S82, repeating the steps to perform iterative optimization until an iteration termination condition is met, and determining final second external reference calibration data and final second internal reference calibration data as the calibration result of the camera.
Specifically, the iteration termination condition may be that a preset number of iterations n has been reached, the value obtained at that point being taken as the final calibration result; n may be set to, for example, 50, or to another positive integer according to the actual situation. Alternatively, the condition may be that the value of the iteration error function falls below ε, where ε is a value close to 0 chosen according to the actual situation; the error function may be chosen from, but is not limited to, the classical least-squares error. Either condition indicates that the iteration has stabilized.
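The iteration scheme can be sketched as a generic fixed-point loop. Here `refine` stands in for one full round of re-estimating the calibration data from new images, and the scalar toy refine step at the end (a Newton iteration for √2) is purely illustrative; real calibration data would need a vector-valued error function:

```python
def iterate_calibration(calib, refine, max_iter=50, eps=1e-8):
    """Repeat the refine step until the change between successive results
    falls below eps, or until max_iter iterations have run."""
    for _ in range(max_iter):
        new = refine(calib)
        if abs(new - calib) < eps:   # iteration has stabilized
            return new
        calib = new
    return calib                      # max_iter reached: take current value

# toy stand-in for one refinement round: Newton's method for sqrt(2)
root = iterate_calibration(1.0, lambda x: 0.5 * (x + 2.0 / x))
print(root)
```

Both termination conditions of the patent appear: the `eps` test on the error, and the `max_iter` cap (default 50, matching the example value above).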
In practical application, during on-site engineering calibration the method controls the camera to obtain one or more images containing the identification features, and computes the second external reference calibration data and second internal reference calibration data from these images, the first internal reference calibration data and the first external reference calibration data. The calibration result of the camera is thus determined quickly, greatly reducing the operational and time complexity of on-site engineering calibration and improving on-site calibration efficiency. Forms of practical application include:
1. camera calibration in combination with target self-identification features
Specifically, it is generally required that one or more objects having specific textures or shapes exist in the target scene, and the specific textures or shapes can be identified as identification features and have a certain position relationship with each other.
2. Camera calibration combined with laser range finder
Specifically, a laser point is projected on a target plane by using a laser range finder, and the laser point is used as an identification feature for identification.
3. Calibration of camera in combination with specific calibration plate/speckle pattern or the like
Specifically, a specific calibration board/speckle pattern is arranged in the target scene, and the specific calibration board/speckle pattern is used as an identification feature for identification.
The above application forms can be selected in combination with specific scenes.
Referring to fig. 9, fig. 9 is a schematic structural diagram of an embodiment of a camera calibration system according to the invention. In this embodiment, the camera calibration system 200 includes a first obtaining module 201, a second obtaining module 202, an external parameter determining module 203, and a calibration module 204.
The first obtaining module 201 is configured to control the camera to obtain image data of the identification feature, and determine first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera;
the second obtaining module 202 is configured to determine second position information of the identifier feature in the target coordinate system according to the first position information;
the external parameter determining module 203 is configured to determine second external parameter calibration data according to the first position information and the second position information;
the calibration module 204 is configured to determine second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera, and second external reference calibration data, and determine a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data.
Optionally, the first obtaining module 201 is specifically configured to: control the camera to obtain image data of the identification feature; confirm the pixel coordinate information corresponding to the identification feature according to the image data, and determine the standardized coordinates of the identification feature in the camera according to the first internal reference calibration data preset by the camera and the pixel coordinate information; and obtain the distance information between the identification feature and the camera, and determine the first position information of the identification feature in the camera coordinate system according to the standardized coordinates and the distance information.
Optionally, the first obtaining module 201 is specifically configured to control the laser range finder to emit laser to fall on a target plane to form a laser point, obtain range finding information of the laser range finder, and control the camera to obtain image data of the laser point, where the laser point is an identification feature; and determining the distance information between the identification feature and the camera according to the distance measuring information and the position relation between the laser range finder and the camera.
Optionally, the first obtaining module 201, after determining the standardized coordinates of the identification feature in the camera, further includes: and carrying out lens distortion correction on the standardized coordinates to obtain the corrected standardized coordinates.
Optionally, the second obtaining module 202 is specifically configured to determine an euclidean distance between the identification features according to the first position information, and determine second position information of the identification features in the target coordinate system according to the euclidean distance; wherein the target coordinate system is established based on the identification feature.
Optionally, the external reference determining module 203 is specifically configured to obtain a conversion relationship between the camera coordinate system and the target coordinate system, and determine the second external reference calibration data according to the first position information, the second position information, and the conversion relationship.
Optionally, the calibration module 204 is specifically configured to determine second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera, and second external reference calibration data, and includes:
acquiring first object distance information d_o^1 according to the first external reference calibration data preset by the camera, d_o^1 = T_3^1, where T_3^1 is the component T_3 of the first external reference calibration data; and acquiring first image distance information d_i^1 according to the first internal reference calibration data preset by the camera, d_i^1 = α_1·dx = β_1·dy;
determining the focal length information based on the first object distance information and the first image distance information: 1/f_x = 1/d_o^1 + 1/(α_1·dx), 1/f_y = 1/d_o^1 + 1/(β_1·dy);
where dx and dy are the pixel sizes of a unit pixel in the horizontal and vertical directions respectively, and f_x, f_y are the focal length information in the horizontal and vertical directions respectively;
and obtaining second object distance information d_o* according to the second external reference calibration data, d_o* = T_3*;
determining the second internal reference calibration data based on the first internal reference calibration data preset by the camera, the focal length information and the second object distance information:
optionally, the calibration module 204 is specifically configured to use the second external reference calibration data and the second internal reference calibration data as new first external reference calibration data and new first internal reference calibration data, respectively, and obtain new second external reference calibration data and new second internal reference calibration data through calculation after updating data;
and repeating the steps to perform iterative optimization until an iteration termination condition is met, and determining final second external reference calibration data and final second internal reference calibration data as calibration results of the camera.
Other steps are the same as those in the camera calibration method, and are not described herein again.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the invention. The electronic device can execute the steps executed by the camera calibration system in the above method. For related details, please refer to the detailed description of the method above, which will not be repeated here.
The electronic device 300 comprises a memory 301 and a processor 302 connected to said memory 301.
The memory 301 is used for storing an operating system, instructions executed by the processor 302, received messages, and the like.
The processor 302 executes the instructions as follows: controlling a camera to acquire image data of the identification feature, and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera; determining second position information of the identification features under the target coordinate system according to the first position information; determining second external reference calibration data according to the first position information and the second position information; and determining second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and second external reference calibration data, and determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data.
Optionally, the processor 302 executes the instructions as: controlling a camera to acquire image data of the identification feature, and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera, wherein the method comprises the following steps: controlling a camera to acquire image data of the identification features; confirming pixel coordinate information corresponding to the identification features according to the image data; determining a standard coordinate of the identification feature in the camera according to first internal reference calibration data and pixel coordinate information preset by the camera; acquiring distance information of the identification features and the camera; determining first position information of the identification features in a camera coordinate system according to the standardized coordinates and the distance information;
optionally, the processor 302 executes instructions to: controlling a camera to acquire image data identifying a feature, comprising: controlling a laser range finder to emit laser to fall on a target plane to form a laser spot; obtaining ranging information of a laser range finder and controlling a camera to obtain image data of a laser spot, wherein the laser spot is an identification feature; obtaining distance information of the identification feature and the camera, including: and determining the distance information between the identification feature and the camera according to the distance measuring information and the position relation between the laser distance measuring instrument and the camera.
Optionally, the processor 302 executes the instructions as: after determining the standardized coordinates of the identification feature in the camera, further comprising: and carrying out distortion correction on the standardized coordinates to obtain the corrected standardized coordinates.
Optionally, the processor 302 executes instructions to: determining second position information of the identification feature in the target coordinate system according to the first position information, wherein the second position information comprises: determining Euclidean distances between the identification features according to the first position information; determining second position information of the identification features in the target coordinate system according to the Euclidean distance; wherein the target coordinate system is established based on the identification features;
and determining second external reference calibration data according to the first position information and the second position information, wherein the second external reference calibration data comprises: and acquiring a conversion relation between the camera coordinate system and the target coordinate system, and determining second external reference calibration data according to the first position information, the second position information and the conversion relation.
Optionally, the processor 302 executes the instructions as: determining second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and second external reference calibration data, including: acquiring first object distance information d_o^1 according to the first external reference calibration data preset by the camera, d_o^1 = T_3^1, where T_3^1 is the component T_3 of the first external reference calibration data; and acquiring first image distance information d_i^1 according to the first internal reference calibration data preset by the camera, d_i^1 = α_1·dx = β_1·dy;
determining the focal length information based on the first object distance information and the first image distance information: 1/f_x = 1/d_o^1 + 1/(α_1·dx), 1/f_y = 1/d_o^1 + 1/(β_1·dy);
where dx and dy are the pixel sizes of a unit pixel in the horizontal and vertical directions respectively, and f_x, f_y are the focal length information in the horizontal and vertical directions respectively;
obtaining second object distance information d_o* according to the second external reference calibration data, d_o* = T_3*;
determining the second internal reference calibration data based on the first internal reference calibration data preset by the camera, the focal length information and the second object distance information:
optionally, the processor 302 executes instructions to: determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data, including: taking the second external reference calibration data and the second internal reference calibration data as new first external reference calibration data and new first internal reference calibration data, and calculating to obtain new second external reference calibration data and new second internal reference calibration data after updating the data;
and repeating the steps to carry out iterative optimization until an iteration termination condition is met, and determining final second external reference calibration data and final second internal reference calibration data as the calibration result of the camera.
Referring to fig. 11, fig. 11 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention. The computer-readable storage medium can execute the steps executed by the camera calibration system in the above method. For related details, please refer to the detailed description of the method above, which will not be repeated here.
The computer readable storage medium 400 has stored therein program instructions 401, which program instructions 401 are executed to implement the steps performed by the camera calibration system in the above-described method.
Different from the prior art, in the present technical solution the first internal reference calibration data and the first external reference calibration data are determined by pre-calibration, or the reference data of factory calibration is obtained directly. During on-site engineering calibration, the camera is controlled to obtain one or more images of the identification features, and the second external reference calibration data and second internal reference calibration data are computed from these images, the first internal reference calibration data and the first external reference calibration data, so that the calibration result of the camera is determined quickly, the operational and time complexity of on-site engineering calibration is greatly reduced, and on-site calibration efficiency is improved.
In the several embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described system embodiments are merely illustrative, and for example, the division of the modules or units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description presents only embodiments of the present invention and is not intended to limit its scope; all equivalent structural and process modifications made using the present specification and drawings, whether applied directly or indirectly in other related technical fields, fall within the scope of the present invention.
Claims (10)
1. A camera calibration method is characterized by comprising the following steps:
controlling a camera to acquire image data of an identification feature, and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera;
determining second position information of the identification feature in a target coordinate system according to the first position information;
determining second external reference calibration data according to the first position information and the second position information;
and determining second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and the second external reference calibration data, and determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data.
2. The method of claim 1,
the controlling a camera to acquire image data of an identification feature and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera comprises:
controlling a camera to acquire image data of the identification features;
confirming pixel coordinate information corresponding to the identification features according to the image data;
determining standardized coordinates of the identification feature in the camera according to first internal reference calibration data preset by the camera and the pixel coordinate information;
acquiring distance information between the identification feature and the camera;
and determining first position information of the identification feature in a camera coordinate system according to the standardized coordinates and the distance information.
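The back-projection described in claim 2 can be sketched as follows. This is a minimal pinhole-model sketch, not the patent's implementation: the function and parameter names are illustrative, and the measured distance is assumed to be depth along the optical axis rather than Euclidean range (if it is a range, the normalized ray must be rescaled to unit length first).

```python
import numpy as np

def pixel_to_camera_point(u, v, fx, fy, cx, cy, distance):
    """Back-project a pixel onto a 3D point in the camera frame.

    (u, v): pixel coordinates of the identified feature.
    (fx, fy, cx, cy): pinhole intrinsics (first internal reference data).
    distance: measured depth of the feature along the optical axis (z).
    """
    # Normalized image coordinates ("standardized coordinates" in claim 2).
    x_n = (u - cx) / fx
    y_n = (v - cy) / fy
    # Scale the normalized ray by the measured distance to get a 3D point.
    return np.array([x_n * distance, y_n * distance, distance])
```

A feature at the principal point maps onto the optical axis at the measured depth.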
3. The method of claim 2,
the controlling the camera to acquire image data identifying the feature, comprising:
controlling a laser range finder to emit a laser beam onto a target plane to form a laser spot;
acquiring ranging information of the laser range finder and controlling a camera to acquire image data of the laser spot, wherein the laser spot is the identification feature;
the acquiring distance information between the identification feature and the camera includes:
and determining the distance information between the identification feature and the camera according to the ranging information and the positional relationship between the laser range finder and the camera.
4. The method of claim 2,
after the determining the standardized coordinates of the identification feature in the camera, the method further comprises:
and carrying out distortion correction on the standardized coordinates to obtain the corrected standardized coordinates.
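The distortion correction of claim 4 is not spelled out in the claim; a common approach, shown here only as an assumed sketch with a two-term radial model (the patent may use a different distortion model), is a fixed-point iteration that inverts the forward distortion on normalized coordinates:

```python
def undistort_normalized(x_d, y_d, k1, k2, iterations=10):
    """Iteratively invert a two-term radial distortion model.

    (x_d, y_d): distorted normalized coordinates.
    k1, k2: radial distortion coefficients.
    Fixed-point iteration: start from the distorted point and repeatedly
    divide by the distortion factor evaluated at the current estimate.
    """
    x, y = x_d, y_d
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = x_d / factor, y_d / factor
    return x, y
```

For the small distortions typical of calibrated lenses the iteration converges in a few steps.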
5. The method of claim 1,
the determining second position information of the identification feature in a target coordinate system according to the first position information includes:
determining Euclidean distances between the identification features according to the first position information;
determining second position information of the identification feature in a target coordinate system according to the Euclidean distances; wherein the target coordinate system is established based on the identification features;
the determining second external reference calibration data according to the first position information and the second position information includes:
and acquiring a conversion relation between a camera coordinate system and a target coordinate system, and determining second external reference calibration data according to the first position information, the second position information and the conversion relation.
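One standard way to obtain the camera-to-target conversion of claim 5 from matched point sets is the Kabsch/SVD rigid-transform estimate. The patent does not name its solver, so this is an assumed sketch with illustrative names:

```python
import numpy as np

def rigid_transform(points_cam, points_target):
    """Estimate R, t such that points_target ~ R @ points_cam + t.

    points_cam: (N, 3) feature positions in the camera frame (first position info).
    points_target: (N, 3) the same features in the target frame (second position info).
    Classic Kabsch algorithm via SVD of the cross-covariance matrix.
    """
    mu_c = points_cam.mean(axis=0)
    mu_t = points_target.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (points_cam - mu_c).T @ (points_target - mu_t)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_t - R @ mu_c
    return R, t
```

At least three non-collinear correspondences are needed for a unique rotation.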
6. The method of claim 1,
the determining second internal reference calibration data according to the first internal reference calibration data preset by the camera, the first external reference calibration data preset by the camera and the second external reference calibration data comprises:
obtaining first object distance information d_o^1 according to the first external reference calibration data preset by the camera, d_o^1 = T_3^1, wherein T_3^1 is the component T_3 of the first external reference calibration data; and obtaining first image distance information d_i^1 according to the first internal reference calibration data preset by the camera, d_i^1 = α^1·dx = β^1·dy;
determining focal length information based on the first object distance information and the first image distance information,
wherein dx and dy are the pixel sizes of a unit pixel in the horizontal direction and the vertical direction, respectively, and f_x and f_y are the focal length information in the horizontal direction and the vertical direction, respectively;
obtaining second object distance information d_o^* according to the second external reference calibration data, d_o^* = T_3^*;
and determining second internal reference calibration data based on the first internal reference calibration data preset by the camera, the focal length information and the second object distance information.
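The exact formulas of claim 6 are rendered as images in the source and are not reproduced above. Under the common assumption of the Gaussian thin-lens equation 1/f = 1/d_o + 1/d_i, which the claim's object-distance/image-distance framing suggests, the intrinsics update could be sketched as follows; the function, its parameters, and the lens model itself are assumptions, not the patent's formulas:

```python
def update_intrinsics(d_o1, alpha1, dx, d_o2):
    """Re-scale the pixel focal length when the object distance changes.

    d_o1: first object distance (T_3 of the preset extrinsics).
    alpha1: first focal length in pixels (from preset intrinsics).
    dx: pixel pitch; d_o2: second object distance (from new extrinsics).
    Assumes the Gaussian thin-lens relation 1/f = 1/d_o + 1/d_i.
    """
    d_i1 = alpha1 * dx                 # first image distance (metric)
    f = d_o1 * d_i1 / (d_o1 + d_i1)   # physical focal length
    d_i2 = f * d_o2 / (d_o2 - f)      # second image distance at the new object distance
    return d_i2 / dx                  # second focal length in pixels
```

With an unchanged object distance the update is the identity; a larger object distance moves the image plane toward f, shrinking the pixel focal length.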
7. The method of claim 1,
the determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data comprises:
taking the second external reference calibration data and the second internal reference calibration data as new first external reference calibration data and new first internal reference calibration data, respectively, and calculating new second external reference calibration data and new second internal reference calibration data from the updated data;
and repeating the steps to carry out iterative optimization until an iteration termination condition is met, and determining final second external reference calibration data and final second internal reference calibration data as the calibration result of the camera.
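The iterative optimization of claim 7 is a fixed-point loop over the calibration data. A generic sketch, where the step function standing in for one pass of claims 1–6 and the scalar "parameters" are placeholders:

```python
def calibrate_iteratively(step, intr0, extr0, tol=1e-8, max_iter=50):
    """Fixed-point iteration over (intrinsics, extrinsics) as in claim 7.

    step(intr, extr) -> (new_intr, new_extr): one pass of the update
    (a user-supplied callable here); iterate until the change is below tol
    or max_iter passes have run (the "iteration termination condition").
    """
    intr, extr = intr0, extr0
    for _ in range(max_iter):
        new_intr, new_extr = step(intr, extr)
        if abs(new_intr - intr) < tol and abs(new_extr - extr) < tol:
            return new_intr, new_extr
        intr, extr = new_intr, new_extr
    return intr, extr
```

Any contractive update converges to its fixed point under this loop.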
8. A camera calibration system, the system comprising:
the first acquisition module is used for controlling a camera to acquire image data of the identification feature and determining first position information of the identification feature in a camera coordinate system according to the image data and first internal reference calibration data preset by the camera;
the second acquisition module is used for determining second position information of the identification feature in a target coordinate system according to the first position information;
the external reference determining module is used for determining second external reference calibration data according to the first position information and the second position information;
and the calibration module is used for determining second internal reference calibration data according to first internal reference calibration data preset by the camera, first external reference calibration data preset by the camera and the second external reference calibration data, and determining a calibration result of the camera based on the second external reference calibration data and the second internal reference calibration data.
9. An electronic device, characterized in that the device comprises:
a memory and a processor coupled to the memory; wherein the memory stores program data, and the processor retrieves the program data stored in the memory to execute the camera calibration method according to any one of claims 1-7.
10. A computer readable storage medium having stored therein program instructions, the program instructions being executable to implement the camera calibration method as claimed in any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210588999.0A CN115239816A (en) | 2022-05-26 | 2022-05-26 | Camera calibration method, system, electronic device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115239816A true CN115239816A (en) | 2022-10-25 |
Family
ID=83667894
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210588999.0A Pending CN115239816A (en) | 2022-05-26 | 2022-05-26 | Camera calibration method, system, electronic device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115239816A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118011421A (en) * | 2024-04-10 | 2024-05-10 | 中国科学院西安光学精密机械研究所 | Theodolite image automatic focusing method and system based on laser radar depth estimation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||