CN112802120A - Camera external parameter calibration method based on non-uniform segmentation accumulation table and orthogonal blanking points - Google Patents
- Publication number
- CN112802120A CN112802120A CN202110040970.4A CN202110040970A CN112802120A CN 112802120 A CN112802120 A CN 112802120A CN 202110040970 A CN202110040970 A CN 202110040970A CN 112802120 A CN112802120 A CN 112802120A
- Authority
- CN
- China
- Prior art keywords
- line segments
- coordinate system
- rotation matrix
- theta
- accumulation table
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a camera external parameter calibration method based on a non-uniform segmentation accumulation table and orthogonal vanishing points. A spherical coordinate system is established, and the intersection points of all detected line segments are projected onto a unit sphere in that coordinate system. The azimuth angle is divided uniformly by angle, while the zenith angle is divided partly by angle and partly by equal accumulation-cell area, yielding a non-uniform segmentation accumulation table that is updated with the projections of the line-segment intersection points. Several groups of line segments are then randomly sampled, each group containing two line segments. Depending on whether the two segments correspond to the same vanishing point or to different ones, i.e. whether they are parallel or orthogonal in space, rotation matrix hypotheses are constructed in the spherical coordinate system. The accumulated value of each rotation matrix hypothesis is looked up in the non-uniform segmentation accumulation table, and the rotation matrix with the largest accumulated value is output as the final result. The method detects vanishing points accurately, obtains the rotation matrix of the camera relative to the world coordinate system, and greatly improves the accuracy of camera external parameter calibration.
Description
Technical Field
The invention relates to camera external parameter calibration in the field of calibration, and in particular to a camera external parameter calibration method based on a non-uniform segmentation accumulation table and orthogonal vanishing points.
Background
Computer vision techniques are widely applied in practical scenarios such as detection and tracking, three-dimensional reconstruction, and velocity estimation. Camera external parameter calibration is a key link in computer vision applications: it determines the rotation and translation between the camera coordinate system and the world coordinate system, and the calibration result directly affects the accuracy of subsequent steps such as detection and tracking. Designing an accurate camera external parameter calibration method is therefore very important in practice.
In real scenes there are many mutually parallel straight lines; under perspective projection they intersect at a point on the image, called a vanishing point. Vanishing-point detection plays an important role in camera external parameter calibration. One common approach computes the vanishing-point positions from the line-segment information in the image to obtain the rotation of the camera relative to the scene; however, the rotation matrix obtained this way is not constrained to be orthogonal, and its orthogonality cannot be guaranteed even when the camera intrinsics are known. Another approach generates many rotation-matrix hypotheses in space, projects them onto the image to obtain groups of orthogonal vanishing points, and optimizes over the line-segment information to obtain the final result.
Disclosure of Invention
The invention aims to provide a camera external parameter calibration method based on a non-uniform segmentation accumulation table and orthogonal vanishing points that overcomes the problems of existing external parameter calibration: the method is simple in principle, efficient, guarantees the orthogonality of the rotation matrix, and can be widely applied in scenarios that require camera external parameter calibration.
To achieve this, the technical scheme of the invention is as follows: a camera external parameter calibration method based on a non-uniform segmentation accumulation table and orthogonal vanishing points, comprising the following steps:
Step S1, establishing a spherical coordinate system with the camera optical center as the origin, and projecting the intersection points of the line segments on the image onto the unit sphere in this coordinate system to obtain coordinates (θ, φ), where θ is the azimuth angle measured from the positive x-axis and φ is the zenith angle measured from the positive z-axis;
Step S2, segmenting the unit sphere non-uniformly and constructing a non-uniform segmentation accumulation table T(i, j) on the spherical surface, used to accumulate the projections of the line-segment intersection points in the image onto the sphere, where index i corresponds to the azimuth angle θ and index j to the zenith angle φ;
Step S3, randomly sampling S groups of line segments on the image, each group containing two line segments, and constructing a number of rotation matrix hypotheses R = [r1 r2 r3] in the spherical coordinate system according to the parallelism or orthogonality of the line segments associated with the vanishing points in the image;
Step S4, looking up in the non-uniform segmentation accumulation table the accumulated value corresponding to each rotation matrix hypothesis;
Step S5, outputting the rotation matrix hypothesis with the largest accumulated value as the final result.
In an embodiment of the present invention, the step S1 specifically includes the following steps:
Step S11, detecting N line segments in the image and computing the intersection points (u, v) of all pairs of line segments;
Step S12, converting each point (u, v) on the image plane into the camera coordinate system using the camera intrinsic parameters, obtaining P_C = (X, Y, Z);
Step S13, using the conversion formula from the camera coordinate system to the spherical coordinate system, projecting P_C onto the unit sphere to obtain its spherical coordinates (θ, φ), where θ = atan2(Y, X) and φ = arccos(Z / ‖P_C‖), with value ranges θ ∈ [0, 2π) and φ ∈ [0, π/2).
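As a concrete illustration of steps S11–S13, the back-projection of an image point onto the unit sphere can be sketched as follows; the intrinsic matrix K and the function name are assumptions for illustration, not values from the patent:

```python
# Sketch of steps S12-S13: back-project a pixel (u, v) through assumed
# camera intrinsics K onto the unit sphere and read off (theta, phi).
import numpy as np

K = np.array([[800.0,   0.0, 320.0],   # hypothetical focal lengths and
              [  0.0, 800.0, 240.0],   # principal point, for illustration
              [  0.0,   0.0,   1.0]])

def pixel_to_sphere(u, v, K):
    """Return (theta, phi) of the camera-frame ray through pixel (u, v)."""
    P_c = np.linalg.inv(K) @ np.array([u, v, 1.0])    # P_C = (X, Y, Z)
    P_c /= np.linalg.norm(P_c)                        # point on the unit sphere
    theta = np.arctan2(P_c[1], P_c[0]) % (2 * np.pi)  # azimuth in [0, 2*pi)
    phi = np.arccos(np.clip(P_c[2], -1.0, 1.0))       # zenith from the +z axis
    return theta, phi

# The principal point back-projects to the optical axis (zenith angle 0).
theta0, phi0 = pixel_to_sphere(320.0, 240.0, K)
```

Because Z = 1 > 0 for every back-projected image point, the zenith angle indeed stays below π/2, matching the range stated in step S13.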
in an embodiment of the present invention, the step S2 specifically includes the following steps:
Step S21, dividing the azimuth angle θ into N_θ = 360 equal parts at 1° precision, so that the i-th cell covers azimuth angles θ ∈ [i°, (i+1)°), i = 0, …, 359;
Step S22, dividing the zenith angles below 60° into N_φ = 60 equal parts at 1° precision, so that the j-th cell covers zenith angles φ ∈ [j°, (j+1)°), j = 0, …, 59;
Step S23, dividing the remaining zenith angles, from 60° up to 90°, into N′_φ = 200 parts while ensuring that every accumulation cell has the same area on the sphere, i.e. spacing the cell boundaries uniformly in cos φ;
Step S25, accumulating the spherical coordinates (θ, φ) obtained in step S1 into the non-uniform segmentation accumulation table and updating it, where the update weight is determined by l_m and l_n, the two line segments producing the intersection point in the image, and by α, the angle between them.
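A minimal sketch of the table of steps S21–S25, under the assumption that zenith angles below 60° use 1° bins and the band from 60° to 90° is split into 200 equal-area cells (uniform in cos φ); the unit update weight is a placeholder, since the patent's weighting formula in terms of l_m, l_n and α is not reproduced here:

```python
# Sketch of the non-uniform segmentation accumulation table (steps S21-S25).
# Assumption: zenith in [0, 60) deg uses 1-degree bins; [60, 90] deg is split
# into 200 cells of equal spherical area, i.e. uniform in cos(phi) between
# cos(60 deg) = 0.5 and cos(90 deg) = 0.
import numpy as np

N_THETA = 360        # azimuth bins, 1 degree each
N_PHI_ANGLE = 60     # zenith bins below 60 degrees, 1 degree each
N_PHI_AREA = 200     # equal-area zenith bins between 60 and 90 degrees

def cell_of(theta, phi):
    """Map spherical coordinates (radians) to indices (i, j) of the table."""
    i = int(np.degrees(theta)) % N_THETA
    if np.degrees(phi) < N_PHI_ANGLE:
        j = int(np.degrees(phi))                  # uniform-in-angle region
    else:
        frac = (0.5 - np.cos(phi)) / 0.5          # 0 at 60 deg, 1 at 90 deg
        j = N_PHI_ANGLE + min(int(frac * N_PHI_AREA), N_PHI_AREA - 1)
    return i, j

T = np.zeros((N_THETA, N_PHI_ANGLE + N_PHI_AREA))

def accumulate(theta, phi, weight=1.0):
    """Step S25: add a (weighted) vote for one projected intersection point."""
    T[cell_of(theta, phi)] += weight

accumulate(np.radians(10.5), np.radians(45.5))  # lands in the uniform region
accumulate(np.radians(10.5), np.radians(75.3))  # lands in the equal-area region
```

The equal-area choice makes cells angularly narrowest near φ = 90°, where vanishing points of horizontal scene lines concentrate, which is the point of the non-uniform split.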
In an embodiment of the present invention, the step S3 specifically includes the following steps:
Step S31, sampling the N line segments detected in step S1 to obtain S groups of line segments, each group containing two line segments;
Step S32, assuming the two line segments correspond to the same vanishing point, i.e. are parallel in space: computing the intersection point of the two segments to obtain a first vanishing point v1, which corresponds to the column r1 of the rotation matrix in space; the plane whose normal vector is r1 intersects the unit sphere in a circle; dividing the azimuth angle θ on this circle at 1° precision, obtaining the zenith angle φ corresponding to each azimuth, and computing r2 from the coordinates (θ, φ); finally, computing r3 as the cross product of r1 and r2;
Step S33, assuming the two line segments correspond to different vanishing points, i.e. are orthogonal in space, corresponding to the columns r1 and r2 of the rotation matrix: constructing for each line segment the plane through the camera optical center containing it, with plane normal vectors n1 and n2; each plane intersects the unit sphere in a circle; on the circle whose normal vector is n1, dividing the azimuth angle θ1 at 1° precision, obtaining the zenith angle φ1 corresponding to each azimuth, and computing r1 from the coordinates (θ1, φ1); if r1 × n2 ≠ 0, then r2 is the cross product of r1 and n2; if r1 × n2 = 0, then r2 = n1; finally, computing r3 as the cross product of r1 and r2;
Step S34, repeating steps S32 and S33 for all S groups and, taking the symmetry of the rotation matrix into account, constructing the complete set of rotation matrix hypotheses.
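The parallel-pair case of step S32 can be sketched as follows, under assumed normalized intrinsics (K = I); the function name and the Rodrigues-style sweep of r2 around the great circle orthogonal to r1 are an illustrative construction, not code from the patent:

```python
# Sketch of step S32: one rotation-matrix hypothesis from two image segments
# assumed parallel in space. Each segment is ((u0, v0), (u1, v1)); theta
# picks one point of the circle of directions orthogonal to r1 (the patent
# sweeps this circle at 1-degree steps).
import numpy as np

def hypothesis_from_parallel(seg_a, seg_b, K, theta):
    Kinv = np.linalg.inv(K)

    def line_of(seg):
        # Homogeneous image line through the two endpoints of a segment.
        p0 = np.array([*seg[0], 1.0])
        p1 = np.array([*seg[1], 1.0])
        return np.cross(p0, p1)

    # Vanishing point = intersection of the two image lines; back-project it
    # to a unit direction in the camera frame -> first column r1.
    v1 = np.cross(line_of(seg_a), line_of(seg_b))
    r1 = Kinv @ v1
    r1 /= np.linalg.norm(r1)

    # r2 lies on the great circle with normal r1: take a unit vector
    # orthogonal to r1 and rotate it about r1 by theta.
    base = np.cross(r1, [0.0, 0.0, 1.0])
    if np.linalg.norm(base) < 1e-9:          # r1 parallel to z: pick another axis
        base = np.cross(r1, [1.0, 0.0, 0.0])
    base /= np.linalg.norm(base)
    r2 = np.cos(theta) * base + np.sin(theta) * np.cross(r1, base)
    r3 = np.cross(r1, r2)                    # completes the right-handed triad
    return np.column_stack([r1, r2, r3])

K = np.eye(3)  # assumed normalized intrinsics for this sketch
R = hypothesis_from_parallel(((0, 0), (1, 0)), ((0, 1), (1, 1)), K, 0.3)
```

By construction every hypothesis returned this way is a proper rotation (orthonormal columns, determinant +1), which is exactly the orthogonality guarantee the method claims over vanishing-point fitting without constraints.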
In an embodiment of the present invention, the step S4 specifically includes the following steps:
Step S41, for each rotation matrix hypothesis, computing the cells T(i1, j1), T(i2, j2) and T(i3, j3) of the non-uniform segmentation accumulation table that correspond to r1, r2 and r3;
Step S42, taking the accumulated value of each rotation matrix hypothesis as T(i1, j1) + T(i2, j2) + T(i3, j3).
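Steps S41–S42 reduce to a lookup-and-sum; the sketch below uses a simplified uniform 1° table in both angles (not the patent's non-uniform zenith split) and folds antipodal directions together, both of which are assumptions made for brevity:

```python
# Sketch of steps S41-S42: score each rotation hypothesis by summing the
# accumulator cells hit by its three column directions r1, r2, r3.
import numpy as np

def direction_to_bin(r, n_theta=360):
    """Map a unit direction to a cell of a uniform 1-degree table."""
    theta = np.arctan2(r[1], r[0]) % (2 * np.pi)
    phi = np.arccos(np.clip(abs(r[2]), 0.0, 1.0))  # fold antipodal directions
    return int(np.degrees(theta)) % n_theta, int(np.degrees(phi))

def score(R, table):
    """Step S42: accumulated value T(i1,j1) + T(i2,j2) + T(i3,j3)."""
    return sum(table[direction_to_bin(R[:, k])] for k in range(3))

table = np.zeros((360, 91))
table[direction_to_bin(np.array([1.0, 0.0, 0.0]))] += 5.0
table[direction_to_bin(np.array([0.0, 1.0, 0.0]))] += 3.0
table[direction_to_bin(np.array([0.0, 0.0, 1.0]))] += 2.0

best = score(np.eye(3), table)  # the identity hypothesis hits all three cells
```

Step S5 is then a single argmax over the scores of all hypotheses generated in step S3.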
Compared with the prior art, the invention has the following beneficial effects: the method detects vanishing points accurately, obtains the rotation matrix of the camera relative to the world coordinate system, and greatly improves the accuracy of camera external parameter calibration; it is simple in principle and efficient, guarantees the orthogonality of the rotation matrix, and can be widely applied in scenarios that require camera external parameter calibration.
Drawings
Fig. 1 is a block diagram of the structure of the embodiment of the present invention.
Fig. 2 is a schematic diagram of the point on the image in step S1 projected to the spherical coordinate system in the embodiment of the present invention.
FIG. 3 is a schematic diagram of the step S22 of dividing the zenith angle φ according to angle in this embodiment of the present invention.
FIG. 4 is a diagram illustrating the step S23 of dividing the zenith angle phi according to the area of the accumulated cells according to an embodiment of the present invention.
FIG. 5 is a schematic diagram of step S32, constructing a rotation matrix hypothesis from parallelism, in an embodiment of the present invention.
FIG. 6 is a schematic diagram of step S33, constructing a rotation matrix hypothesis from orthogonality, in an embodiment of the present invention.
Detailed Description
The technical scheme of the invention is specifically explained below with reference to the accompanying drawings.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As shown in fig. 1, this embodiment provides a camera external parameter calibration method based on a non-uniform segmentation accumulation table and orthogonal vanishing points, comprising the following steps:
Step S1, establishing a spherical coordinate system with the camera optical center as the origin, and projecting the intersection points of the line segments on the image onto the unit sphere in this coordinate system to obtain coordinates (θ, φ), where θ is the azimuth angle measured from the positive x-axis and φ is the zenith angle measured from the positive z-axis;
Step S2, segmenting the unit sphere non-uniformly and constructing a non-uniform segmentation accumulation table T(i, j) on the spherical surface, used to accumulate the projections of the line-segment intersection points in the image onto the sphere, where index i corresponds to the azimuth angle θ and index j to the zenith angle φ;
Step S3, randomly sampling S groups of line segments on the image, each group containing two line segments, and constructing a number of rotation matrix hypotheses R = [r1 r2 r3] in the spherical coordinate system according to the parallelism or orthogonality of the line segments associated with the vanishing points in the image;
Step S4, looking up in the non-uniform segmentation accumulation table the accumulated value corresponding to each rotation matrix hypothesis;
Step S5, outputting the rotation matrix hypothesis with the largest accumulated value as the final result.
As shown in fig. 2, in this embodiment, the step S1 specifically includes the following steps:
Step S11, detecting N line segments in the image and computing the intersection points (u, v) of all pairs of line segments;
Step S12, converting each point (u, v) on the image plane into the camera coordinate system using the camera intrinsic parameters, obtaining P_C = (X, Y, Z);
Step S13, using the conversion formula from the camera coordinate system to the spherical coordinate system, projecting P_C onto the unit sphere to obtain its spherical coordinates (θ, φ), where θ = atan2(Y, X) and φ = arccos(Z / ‖P_C‖), with value ranges θ ∈ [0, 2π) and φ ∈ [0, π/2), as shown in fig. 2.
in this embodiment, the step S2 specifically includes the following steps:
Step S21, dividing the azimuth angle θ into N_θ = 360 equal parts at 1° precision, so that the i-th cell covers azimuth angles θ ∈ [i°, (i+1)°), i = 0, …, 359;
Step S22, dividing the zenith angles below 60° into N_φ = 60 equal parts at 1° precision, so that the j-th cell covers zenith angles φ ∈ [j°, (j+1)°), j = 0, …, 59, as shown in fig. 3;
Step S23, dividing the remaining zenith angles, from 60° up to 90°, into N′_φ = 200 parts while ensuring that every accumulation cell has the same area on the sphere, i.e. spacing the cell boundaries uniformly in cos φ, as shown in fig. 4;
Step S25, accumulating the spherical coordinates (θ, φ) obtained in step S1 into the non-uniform segmentation accumulation table and updating it, where the update weight is determined by l_m and l_n, the two line segments producing the intersection point in the image, and by α, the angle between them.
In this embodiment, the step S3 specifically includes the following steps:
Step S31, sampling the N line segments detected in step S1 to obtain S groups of line segments, each group containing two line segments;
Step S32, assuming the two line segments correspond to the same vanishing point, i.e. are parallel in space: computing the intersection point of the two segments to obtain a first vanishing point v1, which corresponds to the column r1 of the rotation matrix in space; the plane whose normal vector is r1 intersects the unit sphere in a circle; dividing the azimuth angle θ on this circle at 1° precision, obtaining the zenith angle φ corresponding to each azimuth, and computing r2 from the coordinates (θ, φ); finally, computing r3 as the cross product of r1 and r2, as shown in fig. 5;
Step S33, assuming the two line segments correspond to different vanishing points, i.e. are orthogonal in space, corresponding to the columns r1 and r2 of the rotation matrix: constructing for each line segment the plane through the camera optical center containing it, with plane normal vectors n1 and n2; each plane intersects the unit sphere in a circle; on the circle whose normal vector is n1, dividing the azimuth angle θ1 at 1° precision, obtaining the zenith angle φ1 corresponding to each azimuth, and computing r1 from the coordinates (θ1, φ1); if r1 × n2 ≠ 0, then r2 is the cross product of r1 and n2; if r1 × n2 = 0, then r2 = n1; finally, computing r3 as the cross product of r1 and r2, as shown in fig. 6;
Step S34, repeating steps S32 and S33 for all S groups and, taking the symmetry of the rotation matrix into account, constructing the complete set of rotation matrix hypotheses.
In an embodiment of the present invention, the step S4 specifically includes the following steps:
Step S41, for each rotation matrix hypothesis, computing the cells T(i1, j1), T(i2, j2) and T(i3, j3) of the non-uniform segmentation accumulation table that correspond to r1, r2 and r3;
Step S42, taking the accumulated value of each rotation matrix hypothesis as T(i1, j1) + T(i2, j2) + T(i3, j3).
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing is directed to preferred embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow. However, any simple modification, equivalent change and modification of the above embodiments according to the technical essence of the present invention are within the protection scope of the technical solution of the present invention.
Claims (5)
1. A camera external parameter calibration method based on a non-uniform segmentation accumulation table and orthogonal vanishing points, characterized by comprising the following steps:
Step S1, establishing a spherical coordinate system with the camera optical center as the origin, and projecting the intersection points of the line segments on the image onto the unit sphere in this coordinate system to obtain coordinates (θ, φ), where θ is the azimuth angle measured from the positive x-axis and φ is the zenith angle measured from the positive z-axis;
Step S2, segmenting the unit sphere non-uniformly and constructing a non-uniform segmentation accumulation table T(i, j) on the spherical surface, used to accumulate the projections of the line-segment intersection points in the image onto the sphere, where index i corresponds to the azimuth angle θ and index j to the zenith angle φ;
Step S3, randomly sampling S groups of line segments on the image, each group containing two line segments, and constructing a number of rotation matrix hypotheses R = [r1 r2 r3] in the spherical coordinate system according to the parallelism or orthogonality of the line segments associated with the vanishing points in the image;
Step S4, looking up in the non-uniform segmentation accumulation table the accumulated value corresponding to each rotation matrix hypothesis;
Step S5, outputting the rotation matrix hypothesis with the largest accumulated value as the final result.
2. The camera external parameter calibration method based on the non-uniform segmentation accumulation table and orthogonal vanishing points according to claim 1, wherein step S1 specifically comprises the following steps:
Step S11, detecting N line segments in the image and computing the intersection points (u, v) of all pairs of line segments;
Step S12, converting each point (u, v) on the image plane into the camera coordinate system using the camera intrinsic parameters, obtaining P_C = (X, Y, Z);
Step S13, using the conversion formula from the camera coordinate system to the spherical coordinate system, projecting P_C onto the unit sphere to obtain its spherical coordinates (θ, φ), where θ = atan2(Y, X) and φ = arccos(Z / ‖P_C‖), with value ranges θ ∈ [0, 2π) and φ ∈ [0, π/2).
3. The camera external parameter calibration method based on the non-uniform segmentation accumulation table and orthogonal vanishing points according to claim 1, wherein step S2 specifically comprises the following steps:
Step S21, dividing the azimuth angle θ into N_θ = 360 equal parts at 1° precision, so that the i-th cell covers azimuth angles θ ∈ [i°, (i+1)°), i = 0, …, 359;
Step S22, dividing the zenith angles below 60° into N_φ = 60 equal parts at 1° precision, so that the j-th cell covers zenith angles φ ∈ [j°, (j+1)°), j = 0, …, 59;
Step S23, dividing the remaining zenith angles, from 60° up to 90°, into N′_φ = 200 parts while ensuring that every accumulation cell has the same area on the sphere, i.e. spacing the cell boundaries uniformly in cos φ;
Step S25, accumulating the spherical coordinates (θ, φ) obtained in step S1 into the non-uniform segmentation accumulation table and updating it, where the update weight is determined by l_m and l_n, the two line segments producing the intersection point in the image, and by α, the angle between them.
4. The camera external parameter calibration method based on the non-uniform segmentation accumulation table and orthogonal vanishing points according to claim 1, wherein step S3 specifically comprises the following steps:
Step S31, sampling the N line segments detected in step S1 to obtain S groups of line segments, each group containing two line segments;
Step S32, assuming the two line segments correspond to the same vanishing point, i.e. are parallel in space: computing the intersection point of the two segments to obtain a first vanishing point v1, which corresponds to the column r1 of the rotation matrix in space; the plane whose normal vector is r1 intersects the unit sphere in a circle; dividing the azimuth angle θ on this circle at 1° precision, obtaining the zenith angle φ corresponding to each azimuth, and computing r2 from the coordinates (θ, φ); finally, computing r3 as the cross product of r1 and r2;
Step S33, assuming the two line segments correspond to different vanishing points, i.e. are orthogonal in space, corresponding to the columns r1 and r2 of the rotation matrix: constructing for each line segment the plane through the camera optical center containing it, with plane normal vectors n1 and n2; each plane intersects the unit sphere in a circle; on the circle whose normal vector is n1, dividing the azimuth angle θ1 at 1° precision, obtaining the zenith angle φ1 corresponding to each azimuth, and computing r1 from the coordinates (θ1, φ1); if r1 × n2 ≠ 0, then r2 is the cross product of r1 and n2; if r1 × n2 = 0, then r2 = n1; finally, computing r3 as the cross product of r1 and r2.
5. The camera external parameter calibration method based on the non-uniform segmentation accumulation table and orthogonal vanishing points according to claim 1, wherein step S4 specifically comprises the following steps:
Step S41, for each rotation matrix hypothesis, computing the cells T(i1, j1), T(i2, j2) and T(i3, j3) of the non-uniform segmentation accumulation table that correspond to r1, r2 and r3;
Step S42, taking the accumulated value of each rotation matrix hypothesis as T(i1, j1) + T(i2, j2) + T(i3, j3).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110040970.4A CN112802120B (en) | 2021-01-13 | 2021-01-13 | Camera external parameter calibration method based on non-uniform segmentation accumulation table and orthogonal blanking points |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110040970.4A CN112802120B (en) | 2021-01-13 | 2021-01-13 | Camera external parameter calibration method based on non-uniform segmentation accumulation table and orthogonal blanking points |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112802120A true CN112802120A (en) | 2021-05-14 |
CN112802120B CN112802120B (en) | 2024-02-27 |
Family
ID=75810368
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110040970.4A Active CN112802120B (en) | 2021-01-13 | 2021-01-13 | Camera external parameter calibration method based on non-uniform segmentation accumulation table and orthogonal blanking points |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112802120B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104392450A (en) * | 2014-11-27 | 2015-03-04 | 苏州科达科技股份有限公司 | Method for determining focal length and rotary angles of camera, camera calibration method and camera calibration system |
CN104766292A (en) * | 2014-01-02 | 2015-07-08 | 株式会社理光 | Method and system for calibrating multiple stereo cameras |
CN106327532A (en) * | 2016-08-31 | 2017-01-11 | 北京天睿空间科技股份有限公司 | Three-dimensional registering method for single image |
CN109064516A (en) * | 2018-06-28 | 2018-12-21 | 北京航空航天大学 | A kind of Camera Self-Calibration method based on absolute conic picture |
CN109974618A (en) * | 2019-04-02 | 2019-07-05 | 青岛鑫慧铭视觉科技有限公司 | The overall calibration method of multisensor vision measurement system |
CN110555886A (en) * | 2018-05-31 | 2019-12-10 | 杭州海康威视数字技术股份有限公司 | Vehicle-mounted camera external parameter calibration method and device, electronic equipment and storage medium |
CN110782524A (en) * | 2019-10-25 | 2020-02-11 | 重庆邮电大学 | Indoor three-dimensional reconstruction method based on panoramic image |
CN110807815A (en) * | 2019-10-30 | 2020-02-18 | 扬州大学 | Rapid underwater calibration method based on vanishing points corresponding to two groups of mutually orthogonal parallel lines |
2021
- 2021-01-13 CN application CN202110040970.4A filed, later granted as patent CN112802120B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN112802120B (en) | 2024-02-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106204574B (en) | Camera pose self-calibrating method based on objective plane motion feature | |
CN109165680B (en) | Single-target object dictionary model improvement method in indoor scene based on visual SLAM | |
WO2021082571A1 (en) | Robot tracking method, device and equipment and computer readable storage medium | |
CN110686677A (en) | Global positioning method based on geometric information | |
KR20140014298A (en) | Planar mapping and tracking for mobile devices | |
CN107358633A (en) | Join scaling method inside and outside a kind of polyphaser based on 3 points of demarcation things | |
Melo et al. | Unsupervised intrinsic calibration from a single frame using a "plumb-line" approach | |
CN111750820A (en) | Image positioning method and system | |
KR20170131500A (en) | 3D modeling method and apparatus | |
CN111754579A (en) | Method and device for determining external parameters of multi-view camera | |
CN113888639B (en) | Visual odometer positioning method and system based on event camera and depth camera | |
CN112198878B (en) | Instant map construction method and device, robot and storage medium | |
US11504608B2 (en) | 6DoF inside-out tracking game controller | |
CN112767546B (en) | Binocular image-based visual map generation method for mobile robot | |
Zhu et al. | Robust plane-based calibration of multiple non-overlapping cameras | |
CN111047652B (en) | Rapid multi-TOF camera external parameter calibration method and device | |
Zhang et al. | A visual-inertial dynamic object tracking SLAM tightly coupled system | |
Gao et al. | Marker tracking for video-based augmented reality | |
CN107480710B (en) | Feature point matching result processing method and device | |
CN112802120B (en) | Camera external parameter calibration method based on non-uniform segmentation accumulation table and orthogonal blanking points | |
Jaramillo et al. | 6-DoF pose localization in 3D point-cloud dense maps using a monocular camera | |
CN114399547B (en) | Monocular SLAM robust initialization method based on multiframe | |
CN113920196A (en) | Visual positioning method and device and computer equipment | |
CN115205419A (en) | Instant positioning and map construction method and device, electronic equipment and readable storage medium | |
Ming et al. | A real-time monocular visual SLAM based on the bundle adjustment with adaptive robust kernel |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||