CN111462251B - Camera calibration method and terminal - Google Patents
- Publication number
- CN111462251B (application CN202010263368.2A)
- Authority
- CN
- China
- Prior art keywords
- coordinates
- camera
- matrix
- image
- transformation matrix
- Prior art date
- Legal status
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a camera calibration method and a terminal. The method comprises the following steps: acquiring image information of a camera; establishing an image coordinate system according to the image information; acquiring the image coordinates of a principal point and of three vanishing points in the image coordinate system; calculating a camera matrix from the image coordinates of the principal point and the three vanishing points; acquiring the image coordinates of three known points and the corresponding UTM coordinates; calculating a first transformation matrix from the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix; and obtaining a second transformation matrix from the camera matrix and the first transformation matrix. The invention enables automatic conversion between image coordinates and UTM coordinates and realizes linkage positioning of the same object by a plurality of cameras.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a camera calibration method and a terminal.
Background
As the cost of cameras and processors continues to drop, vision-based sensing is becoming an alternative to traditional sensors for collecting traffic data. Many research and commercial systems derive a range of information of interest through video analysis, such as road occupancy, vehicle speed, vehicle type, and event detection.
When a plurality of cameras are adopted for data acquisition, the following problems exist: the selection of the origin of the world coordinate system established when calibrating different cameras is inconvenient, and the resulting world coordinate systems are not unique, which greatly limits their usefulness; and when a plurality of cameras position an object jointly according to world coordinates, multi-view projective geometry must be considered, making the model complex.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a camera calibration method and a terminal that can realize linkage positioning of a plurality of cameras.
In order to solve the above technical problem, the invention adopts the following technical solution:
a camera calibration method comprises the following steps:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
respectively acquiring image coordinates of a principal point and three vanishing points in the image coordinate system;
calculating to obtain a camera matrix according to the image coordinates of the principal point and the three vanishing points;
respectively acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix to obtain a first transformation matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
Another technical solution adopted by the invention is as follows:
a camera calibration terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
respectively acquiring image coordinates of a principal point and three vanishing points in the image coordinate system;
calculating to obtain a camera matrix according to the image coordinates of the principal point and the three vanishing points;
respectively acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix to obtain a first transformation matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
The invention has the beneficial effects that: coordinate conversion between the image coordinate system and the UTM coordinate system can be automatically carried out through the second transformation matrix; even if the coordinates of the same object in different camera images are different, due to the uniqueness of the UTM coordinate system, the multiple cameras can realize linkage positioning of the same object without considering multi-view projection geometry.
Drawings
Fig. 1 is a flowchart of a camera calibration method according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of a camera calibration terminal according to a second embodiment of the present invention.
Description of reference numerals:
100. camera calibration terminal; 1. memory; 2. processor.
Detailed Description
In order to explain technical contents, achieved objects, and effects of the present invention in detail, the following description is made with reference to the accompanying drawings in combination with the embodiments.
The key concept of the invention is as follows: a transformation matrix between the image coordinate system and the UTM coordinate system is calculated, so that multiple cameras can perform linkage positioning of the same object.
Referring to fig. 1, a camera calibration method includes:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
respectively acquiring image coordinates of a principal point and three vanishing points in the image coordinate system;
calculating to obtain a camera matrix according to the image coordinates of the principal point and the three vanishing points;
respectively acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix to obtain a first transformation matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
From the above description, the beneficial effects of the present invention are: coordinate conversion between the image coordinate system and the UTM coordinate system can be automatically carried out through the second transformation matrix; even if the coordinates of the same object in different camera images are different, due to the uniqueness of the UTM coordinate system, the multiple cameras can realize linkage positioning of the same object without considering multi-view projection geometry.
Further, before the obtaining the image coordinates of the principal point and the three vanishing points in the image coordinate system, respectively, the method further includes:
respectively acquiring three pairs of vanishing point directions according to the image information;
and calculating to obtain three vanishing points according to the directions of the three pairs of vanishing points.
Further, the obtaining of the second transformation matrix according to the camera matrix and the first transformation matrix specifically includes: right-multiplying the first transformation matrix by the camera matrix to obtain the second transformation matrix.
As shown in fig. 2, another technical solution related to the present invention is:
a camera calibration terminal 100 comprising a memory 1, a processor 2 and a computer program stored on said memory 1 and executable on the processor 2, said processor 2 realizing the following steps when executing said computer program:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
respectively acquiring image coordinates of a principal point and three vanishing points in the image coordinate system;
calculating to obtain a camera matrix according to the image coordinates of the principal point and the three vanishing points;
respectively acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix to obtain a first transformation matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
Further, the processor 2, when executing the computer program, implements the following steps:
before the obtaining of the image coordinates of the principal point and the three vanishing points in the image coordinate system, respectively, the method further comprises:
respectively acquiring three pairs of vanishing point directions according to the image information;
and calculating to obtain three vanishing points according to the directions of the three pairs of vanishing points.
Further, the obtaining of the second transformation matrix according to the camera matrix and the first transformation matrix specifically includes: right-multiplying the first transformation matrix by the camera matrix to obtain the second transformation matrix.
Example one
Referring to fig. 1, an embodiment of the present invention is a camera calibration method, including the following steps:
and S1, acquiring image information of the camera.
In the present embodiment, it is assumed that the image taken by the camera is free from radial distortion and the camera is looking straight forward and downward.
S2: establishing an image coordinate system according to the image information.
The origin of coordinates of the image coordinate system can be set at the upper left corner of the picture, and can be a left-hand rectangular coordinate system or a right-hand rectangular coordinate system.
S3: respectively acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system.
The principal point can be freely selected, and a point at the center of the road or a point not at the center of the road can be selected as the principal point.
In this embodiment, step S3 is preceded by:
S301: respectively acquiring three pairs of vanishing point directions according to the image information;
and S302, calculating to obtain three vanishing points according to the directions of the three pairs of vanishing points.
The three pairs of directions correspond respectively to the transverse direction of the road, the longitudinal direction of the road and the direction of gravity in the image information, and each vanishing point is the intersection point of the two directions in the corresponding pair. Suppose that the principal point is c and the three vanishing points are x1, x2 and x3.
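For illustration, a vanishing point can be computed as the image intersection of the two lines in a pair. Below is a minimal sketch in Python (the function names and the pixel coordinates are the editor's illustrative choices, not values from the patent), using homogeneous coordinates, where both the line through two points and the intersection of two lines are cross products:

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points (u, v)."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(line_a, line_b):
    """Intersection of two homogeneous lines, returned as (u, v)."""
    x = np.cross(line_a, line_b)
    return x[:2] / x[2]   # assumes the two lines are not parallel in the image

# Example: two lane markings that are parallel on the road plane give the
# longitudinal vanishing point (coordinates are illustrative only).
x1 = vanishing_point(line_through((100, 700), (400, 400)),
                     line_through((900, 700), (600, 400)))
print(x1)
```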
S4: calculating the camera matrix according to the image coordinates of the principal point and the three vanishing points.
The camera matrix is T = (KR)^-1 = R^-1 K^-1, where K is the camera intrinsic matrix and R is the rotation matrix.
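As a hedged illustration of how K and R might be recovered from the principal point and three mutually orthogonal vanishing points (one common vanishing-point calibration approach; the patent does not spell out this computation, so the sketch below is an assumption): the focal length follows from the orthogonality of one pair of vanishing directions, and the normalized back-projected vanishing points give the columns of R.

```python
import numpy as np

def camera_matrix(c, x1, x2, x3):
    """Sketch: build T = (K R)^-1 from the principal point c and three
    mutually orthogonal vanishing points x1, x2, x3 (image coordinates)."""
    c, x1, x2, x3 = (np.asarray(p, float) for p in (c, x1, x2, x3))
    # Orthogonality of two vanishing directions gives (x1-c).(x2-c) + f^2 = 0,
    # so the dot product must be negative for a valid configuration.
    f = np.sqrt(-np.dot(x1 - c, x2 - c))
    K = np.array([[f, 0.0, c[0]],
                  [0.0, f, c[1]],
                  [0.0, 0.0, 1.0]])
    Kinv = np.linalg.inv(K)
    # Columns of R: normalized back-projected vanishing directions.
    R = np.column_stack([Kinv @ np.append(x, 1.0) for x in (x1, x2, x3)])
    R /= np.linalg.norm(R, axis=0)
    return np.linalg.inv(K @ R)   # T = (K R)^-1 = R^-1 K^-1
```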
S5: respectively acquiring the image coordinates of three known points and the corresponding UTM coordinates.
The three known points are not collinear, and the corresponding UTM coordinates can be obtained through manual calculation.
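If the known points are surveyed as latitude/longitude, their UTM coordinates can be computed with a map-projection library. The following is merely one possible way (the pyproj library and the EPSG zone code shown are assumptions of this example, not specified by the patent):

```python
from pyproj import Transformer

# WGS84 longitude/latitude -> UTM easting/northing.
# EPSG:32650 (UTM zone 50N) is only an illustrative zone choice.
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32650", always_xy=True)
E, N = to_utm.transform(118.10, 24.48)   # (lon, lat) of a surveyed known point
print(E, N)
```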
S6: calculating the first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix.
In particular, each parameter in the first transformation matrix B is calculated according to the formulas E = B11·x + B12·y + B13 and N = B21·x + B22·y + B23, where (E, N) represents the UTM coordinates of a known point and (x, y) represents the coordinates of the image coordinates of the known point after transformation by the camera matrix T. Since there are three known points, six equations can be obtained, from which the six parameters B11, B12, B13, B21, B22 and B23 of the first transformation matrix B are solved.
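A minimal sketch of this parameter solve (variable and function names are illustrative): each known point contributes one equation in (B11, B12, B13) and one in (B21, B22, B23), so with three non-collinear points the two rows of B come from two 3x3 linear systems.

```python
import numpy as np

def first_transformation_matrix(xy, EN):
    """xy: 3x2 array of camera-matrix-transformed coordinates (x, y) of the known points;
    EN: 3x2 array of the corresponding UTM coordinates (E, N).
    Returns the 2x3 matrix B with E = B11*x + B12*y + B13 and N = B21*x + B22*y + B23."""
    A = np.hstack([np.asarray(xy, float), np.ones((3, 1))])  # rows [x, y, 1]
    EN = np.asarray(EN, float)
    row_E = np.linalg.solve(A, EN[:, 0])   # B11, B12, B13
    row_N = np.linalg.solve(A, EN[:, 1])   # B21, B22, B23
    return np.vstack([row_E, row_N])
```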
S7: obtaining the second transformation matrix according to the camera matrix and the first transformation matrix.
In this embodiment, step S7 specifically includes: right-multiplying the first transformation matrix by the camera matrix to obtain the second transformation matrix, i.e. the second transformation matrix S = B × T.
In this embodiment, the UTM coordinates of any point may be obtained by transforming its image coordinates with the second transformation matrix, or the image coordinates of any point may be obtained by inverse-transforming its known UTM coordinates with the second transformation matrix. When a plurality of cameras are deployed, although the image coordinates in each camera are different, the UTM coordinates of the same target are the same, so the position of the same target in the different image coordinate systems can be found through its UTM coordinates. In addition, the positional relationship between different targets, the size of a target, its movement speed, and the like can also be obtained.
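A short sketch of the resulting coordinate conversion (function names are illustrative; T and B denote the camera matrix and first transformation matrix computed above; each camera keeps its own pair while all cameras share the single UTM frame):

```python
import numpy as np

def pixel_to_utm(uv, T, B):
    """Map an image pixel (u, v) to UTM (E, N): transform by the camera
    matrix T, dehomogenize, then apply the 2x3 affine matrix B
    (the composition the patent denotes S = B x T)."""
    p = T @ np.array([uv[0], uv[1], 1.0])
    x, y = p[:2] / p[2]
    E, N = B @ np.array([x, y, 1.0])
    return E, N

def utm_to_pixel(EN, T, B):
    """Inverse mapping: UTM (E, N) back to a pixel (u, v)."""
    B3 = np.vstack([B, [0.0, 0.0, 1.0]])   # promote B to 3x3
    q = np.linalg.inv(T) @ np.linalg.inv(B3) @ np.array([EN[0], EN[1], 1.0])
    return q[:2] / q[2]

# Linkage positioning: the same target seen by two cameras maps to (nearly)
# the same UTM point, even though its pixel coordinates differ per camera.
# E1, N1 = pixel_to_utm(uv_cam1, T_cam1, B_cam1)
# E2, N2 = pixel_to_utm(uv_cam2, T_cam2, B_cam2)
```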
Example two
Referring to fig. 2, the second embodiment of the present invention is:
a camera calibration terminal 100, corresponding to the method of the first embodiment, includes a memory 1, a processor 2, and a computer program stored in the memory 1 and executable on the processor 2, where the processor 2 executes the computer program to implement the following steps:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
respectively acquiring image coordinates of a principal point and three vanishing points in the image coordinate system;
calculating to obtain a camera matrix according to the image coordinates of the principal point and the three vanishing points;
respectively acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix to obtain a first transformation matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
Further, the processor 2, when executing the computer program, implements the following steps:
before the obtaining of the image coordinates of the principal point and the three vanishing points in the image coordinate system, respectively, the method further comprises:
respectively acquiring three pairs of vanishing point directions according to the image information;
and calculating to obtain three vanishing points according to the directions of the three pairs of vanishing points.
Further, the obtaining of the second transformation matrix according to the camera matrix and the first transformation matrix specifically includes: right-multiplying the first transformation matrix by the camera matrix to obtain the second transformation matrix.
In summary, the camera calibration method and the terminal provided by the invention can realize automatic conversion between image coordinates and UTM coordinates, and can realize linkage positioning of multiple cameras.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all equivalent changes made by using the contents of the present specification and the drawings, or applied directly or indirectly to the related technical fields, are included in the scope of the present invention.
Claims (6)
1. A camera calibration method, comprising:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
respectively acquiring image coordinates of a principal point and three vanishing points in the image coordinate system;
calculating to obtain a camera matrix according to the image coordinates of the principal point and the three vanishing points;
respectively acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix to obtain a first transformation matrix;
obtaining a second transformation matrix according to the camera matrix and the first transformation matrix;
the step of obtaining a first transformation matrix by calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix comprises:
calculating, according to the formulas E = B11·x + B12·y + B13 and N = B21·x + B22·y + B23, each parameter B11, B12, B13, B21, B22 and B23 in the first transformation matrix B, wherein (E, N) represents the UTM coordinates corresponding to a known point, (x, y) represents the coordinates of the image coordinates of the known point after transformation by the camera matrix T, there are three known points, and six equations are obtained according to the formulas.
2. The camera calibration method according to claim 1, wherein before acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system, respectively, the method further comprises:
respectively acquiring three pairs of vanishing point directions according to the image information;
and calculating to obtain three vanishing points according to the directions of the three pairs of vanishing points.
3. The camera calibration method according to claim 1, wherein the obtaining of the second transformation matrix from the camera matrix and the first transformation matrix specifically comprises: right-multiplying the first transformation matrix by the camera matrix to obtain the second transformation matrix.
4. A camera calibration terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
respectively acquiring image coordinates of a principal point and three vanishing points in the image coordinate system;
calculating to obtain a camera matrix according to the image coordinates of the principal point and the three vanishing points;
respectively acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix to obtain a first transformation matrix;
obtaining a second transformation matrix according to the camera matrix and the first transformation matrix;
the step of obtaining a first transformation matrix by calculating according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix comprises:
calculating, according to the formulas E = B11·x + B12·y + B13 and N = B21·x + B22·y + B23, each parameter B11, B12, B13, B21, B22 and B23 in the first transformation matrix B, wherein (E, N) represents the UTM coordinates corresponding to a known point, (x, y) represents the coordinates of the image coordinates of the known point after transformation by the camera matrix T, there are three known points, and six equations are obtained according to the formulas.
5. The camera calibration terminal of claim 4, wherein the processor, when executing the computer program, performs the steps of:
before the obtaining of the image coordinates of the principal point and the three vanishing points in the image coordinate system, respectively, the method further comprises:
respectively acquiring three pairs of vanishing point directions according to the image information;
and calculating to obtain three vanishing points according to the directions of the three pairs of vanishing points.
6. The camera calibration terminal according to claim 4, wherein the obtaining of the second transformation matrix according to the camera matrix and the first transformation matrix specifically comprises: right-multiplying the first transformation matrix by the camera matrix to obtain the second transformation matrix.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010263368.2A CN111462251B (en) | 2020-04-07 | 2020-04-07 | Camera calibration method and terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111462251A (en) | 2020-07-28 |
CN111462251B (en) | 2021-05-11 |
Family
ID=71681639
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010263368.2A | Camera calibration method and terminal | 2020-04-07 | 2020-04-07 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111462251B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104574406A (en) * | 2015-01-16 | 2015-04-29 | 大连理工大学 | Joint calibration method between 360-degree panorama laser and multiple visual systems |
CN108447100A (en) * | 2018-04-26 | 2018-08-24 | 王涛 | A kind of eccentric vector sum Collimation axis eccentricity angle scaling method of airborne TLS CCD camera |
CN109146958A (en) * | 2018-08-15 | 2019-01-04 | 北京领骏科技有限公司 | A kind of traffic sign method for measuring spatial location based on two dimensional image |
CN110033492A (en) * | 2019-04-17 | 2019-07-19 | 深圳金三立视频科技股份有限公司 | Camera marking method and terminal |
CN110378965A (en) * | 2019-05-21 | 2019-10-25 | 北京百度网讯科技有限公司 | Determine the method, apparatus, equipment and storage medium of coordinate system conversion parameter |
CN110766760A (en) * | 2019-10-21 | 2020-02-07 | 北京百度网讯科技有限公司 | Method, device, equipment and storage medium for camera calibration |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9558559B2 (en) * | 2013-04-05 | 2017-01-31 | Nokia Technologies Oy | Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system |
Also Published As
Publication number | Publication date |
---|---|
CN111462251A (en) | 2020-07-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||