CN111462251A - Camera calibration method and terminal - Google Patents

Camera calibration method and terminal

Info

Publication number
CN111462251A
CN111462251A
Authority
CN
China
Prior art keywords
camera
coordinates
image
matrix
transformation matrix
Prior art date
Legal status
Granted
Application number
CN202010263368.2A
Other languages
Chinese (zh)
Other versions
CN111462251B (en)
Inventor
亢晓斌
张宇
刘东剑
Current Assignee
Santachi Video Technology Shenzhen Co ltd
Original Assignee
Santachi Video Technology Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Santachi Video Technology Shenzhen Co ltd filed Critical Santachi Video Technology Shenzhen Co ltd
Priority to CN202010263368.2A priority Critical patent/CN111462251B/en
Publication of CN111462251A publication Critical patent/CN111462251A/en
Application granted granted Critical
Publication of CN111462251B publication Critical patent/CN111462251B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention discloses a camera calibration method and a terminal. The method comprises the following steps: acquiring image information of a camera; establishing an image coordinate system according to the image information; acquiring the image coordinates of a principal point and of three vanishing points in the image coordinate system; calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points; acquiring the image coordinates of three known points and their corresponding UTM coordinates; calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix; and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix. The invention realizes automatic conversion between image coordinates and UTM coordinates and enables linked positioning across multiple cameras.

Description

Camera calibration method and terminal
Technical Field
The invention relates to the technical field of image processing, in particular to a camera calibration method and a terminal.
Background
As the cost of cameras and processors continues to drop, vision-based sensing is becoming an alternative to traditional sensors for collecting traffic data. Many research and commercial systems derive a range of information of interest from video analysis, such as road occupancy, vehicle speed, vehicle type, and event detection.
When multiple cameras are used for data acquisition, two problems arise. First, each camera's calibration chooses its own world-coordinate origin, so the resulting world coordinate systems are not unique and are of limited practical use. Second, when multiple cameras jointly position an object from world coordinates, multi-view projective geometry must be considered, which makes the model complex.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a camera calibration method and a terminal that enable linked positioning across multiple cameras.
To solve the above technical problem, the invention adopts the following technical solution:
A camera calibration method comprises the following steps:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
acquiring the image coordinates of a principal point and of three vanishing points in the image coordinate system;
calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points;
acquiring the image coordinates of three known points and their corresponding UTM coordinates;
calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
Another technical solution adopted by the invention is:
A camera calibration terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
acquiring the image coordinates of a principal point and of three vanishing points in the image coordinate system;
calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points;
acquiring the image coordinates of three known points and their corresponding UTM coordinates;
calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
The invention has the following beneficial effects: the second transformation matrix automatically converts coordinates between the image coordinate system and the UTM coordinate system. Even though the same object has different coordinates in different camera images, the uniqueness of the UTM coordinate system lets multiple cameras jointly position that object without considering multi-view projective geometry.
Drawings
Fig. 1 is a flowchart of a camera calibration method according to a first embodiment of the present invention;
Fig. 2 is a schematic diagram of a camera calibration terminal according to a second embodiment of the present invention.
Description of reference numerals:
100: camera calibration terminal; 1: memory; 2: processor.
Detailed Description
To explain the technical content, objects, and effects of the present invention in detail, the following description refers to the accompanying drawings in combination with the embodiments.
The key concept of the invention is as follows: a transformation matrix between the image coordinate system and the UTM coordinate system is calculated, so that multiple cameras can jointly position the same object.
Referring to Fig. 1, a camera calibration method includes:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
acquiring the image coordinates of a principal point and of three vanishing points in the image coordinate system;
calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points;
acquiring the image coordinates of three known points and their corresponding UTM coordinates;
calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
From the above description, the beneficial effects of the present invention are: the second transformation matrix automatically converts coordinates between the image coordinate system and the UTM coordinate system. Even though the same object has different coordinates in different camera images, the uniqueness of the UTM coordinate system lets multiple cameras jointly position that object without considering multi-view projective geometry.
Further, before acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system, the method further comprises:
acquiring three pairs of vanishing directions from the image information;
and calculating the three vanishing points from the three pairs of vanishing directions.
Further, obtaining the second transformation matrix from the camera matrix and the first transformation matrix specifically comprises: left-multiplying the camera matrix by the first transformation matrix to obtain the second transformation matrix.
As shown in Fig. 2, another technical solution of the present invention is:
A camera calibration terminal 100, comprising a memory 1, a processor 2, and a computer program stored in the memory 1 and executable on the processor 2, the processor 2 implementing the following steps when executing the computer program:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
acquiring the image coordinates of a principal point and of three vanishing points in the image coordinate system;
calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points;
acquiring the image coordinates of three known points and their corresponding UTM coordinates;
calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
Further, the processor 2, when executing the computer program, implements the following steps before acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system:
acquiring three pairs of vanishing directions from the image information;
and calculating the three vanishing points from the three pairs of vanishing directions.
Further, obtaining the second transformation matrix from the camera matrix and the first transformation matrix specifically comprises: left-multiplying the camera matrix by the first transformation matrix to obtain the second transformation matrix.
Embodiment 1
Referring to Fig. 1, the first embodiment of the present invention is a camera calibration method comprising the following steps:
and S1, acquiring image information of the camera.
In this embodiment, it is assumed that the image captured by the camera is free of radial distortion and that the camera looks forward and downward.
S2, establishing an image coordinate system according to the image information.
The coordinate origin of the image coordinate system may be set at the upper-left corner of the picture, and the system may be a left-handed or a right-handed rectangular coordinate system.
S3, acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system.
The principal point can be chosen freely; for example, a point at the center of the road, or a point off the center, may serve as the principal point.
In this embodiment, step S3 is preceded by:
S301, acquiring three pairs of vanishing directions from the image information;
S302, calculating the three vanishing points from the three pairs of vanishing directions.
The three pairs of vanishing directions correspond to the transverse direction of the road, the longitudinal direction of the road, and the direction of gravity in the image information; each vanishing point is the intersection of one pair of directions. Suppose the principal point is c and the three vanishing points are x1, x2 and x3.
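In practice, a pair of vanishing directions can be marked as two image line segments that are parallel in the world; the vanishing point is then their image-space intersection. The patent gives no code, so the following is a minimal sketch (function names are illustrative) that computes this intersection with homogeneous line coordinates:

```python
def line_through(p, q):
    """Homogeneous line through image points p and q: the cross product of
    (px, py, 1) and (qx, qy, 1)."""
    (x1, y1), (x2, y2) = p, q
    return (y1 - y2, x2 - x1, x1 * y2 - x2 * y1)

def intersect(l1, l2):
    """Intersection of two homogeneous lines, dehomogenized; assumes the
    vanishing point is finite (w != 0)."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    x, y, w = b1 * c2 - b2 * c1, a2 * c1 - a1 * c2, a1 * b2 - a2 * b1
    return (x / w, y / w)

def vanishing_point(seg_a, seg_b):
    """Vanishing point of one pair of world-parallel segments, each given
    as a pair of image points ((p1, p2), (q1, q2))."""
    return intersect(line_through(*seg_a), line_through(*seg_b))
```

For example, the segments (0, 0)-(1, 1) and (0, 2)-(2, 3) converge at the image point (4, 4).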
S4, calculating the camera matrix from the image coordinates of the principal point and the three vanishing points.
The camera matrix is T = (KR)^-1 = R^-1 K^-1, where K is the camera's intrinsic matrix and R its rotation matrix.
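The equation images in the original are not reproduced here. A standard construction behind steps S3-S4 (one common way to obtain K and R from a principal point and three mutually orthogonal vanishing points; an assumption, not necessarily the patent's exact formulas) recovers the focal length from one orthogonal pair via f^2 = -(x1 - c)·(x2 - c), takes the columns of R as the normalized back-projected vanishing directions K^-1·(xi, 1), and forms T = (KR)^-1:

```python
import math

def camera_matrix(c, v1, v2, v3):
    """Sketch of steps S3-S4 under common assumptions (square pixels, zero
    skew, three mutually orthogonal vanishing directions): recover f from
    one orthogonal pair, build R from back-projected vanishing directions,
    and return T = (K R)^-1 = R^-1 K^-1 as a row-major 3x3 list.
    Illustrative only; not necessarily the patent's exact formulas."""
    cx, cy = c
    # f^2 = -(v1 - c) . (v2 - c) for an orthogonal pair of directions
    dot = (v1[0] - cx) * (v2[0] - cx) + (v1[1] - cy) * (v2[1] - cy)
    f = math.sqrt(-dot)  # requires dot < 0 (valid viewing geometry)
    # columns of R: normalized K^-1 * (vi, 1) for each vanishing point
    r_cols = []
    for vx, vy in (v1, v2, v3):
        d = ((vx - cx) / f, (vy - cy) / f, 1.0)
        n = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2)
        r_cols.append((d[0] / n, d[1] / n, d[2] / n))
    # R^-1 = R^T for a rotation; K^-1 in closed form for
    # K = [[f, 0, cx], [0, f, cy], [0, 0, 1]]
    k_inv = [[1 / f, 0.0, -cx / f], [0.0, 1 / f, -cy / f], [0.0, 0.0, 1.0]]
    r_t = [list(col) for col in r_cols]  # rows of R^T are the columns of R
    return [[sum(r_t[i][k] * k_inv[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]
```

By construction, applying T to a vanishing point (in homogeneous form) yields a vector proportional to the corresponding world axis direction.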
S5, acquiring the image coordinates of three known points and their corresponding UTM coordinates.
The three known points must not be collinear; their corresponding UTM coordinates can be obtained by manual measurement.
S6, calculating the first transformation matrix from the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix.
Specifically, the parameters of the first transformation matrix B are calculated from the formulas

E = B11·x + B12·y + B13
N = B21·x + B22·y + B23

where (E, N) are the UTM coordinates of a known point and (x, y) are that point's image coordinates after transformation by the camera matrix T. Since there are three known points, six equations are obtained, from which the six parameters B11, B12, B13, B21, B22 and B23 of the first transformation matrix B are solved.
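Because E depends only on B11, B12, B13 and N only on B21, B22, B23, the six equations split into two independent 3x3 linear systems. A pure-Python sketch using Cramer's rule (function names are illustrative, not from the patent):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as rows."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def solve_affine(xy, EN):
    """Fit E = B11*x + B12*y + B13 and N = B21*x + B22*y + B23 to three
    non-collinear camera-transformed points xy and their UTM coordinates EN,
    solving each 3x3 system by Cramer's rule. Returns [[B11, B12, B13],
    [B21, B22, B23]]."""
    A = [[x, y, 1.0] for x, y in xy]
    dA = det3(A)  # nonzero iff the three points are not collinear
    B = []
    for col in range(2):            # col 0 -> E row, col 1 -> N row
        rhs = [p[col] for p in EN]
        row = []
        for j in range(3):          # replace column j with the RHS
            Aj = [r[:] for r in A]
            for i in range(3):
                Aj[i][j] = rhs[i]
            row.append(det3(Aj) / dA)
        B.append(row)
    return B
```

For instance, the correspondences (0,0)->(10,20), (1,0)->(12,20), (0,1)->(10,23) yield E = 2x + 10 and N = 3y + 20.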
S7, obtaining the second transformation matrix from the camera matrix and the first transformation matrix.
In this embodiment, step S7 specifically comprises: left-multiplying the camera matrix by the first transformation matrix to obtain the second transformation matrix, i.e. S = B × T.
In this embodiment, the UTM coordinates of any point can be obtained by transforming its image coordinates with the second transformation matrix, and conversely the image coordinates of a point with known UTM coordinates can be obtained by the inverse transformation. When multiple cameras are deployed, the image coordinates of the same target differ from camera to camera, but its UTM coordinates are identical, so the target's position in each camera's image can be located through its UTM coordinates. The positional relationship between targets, a target's size, its movement speed, and so on can also be obtained.
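If B is extended to 3x3 with the row (0, 0, 1), the second transformation matrix S = B·T is a single 3x3 map from homogeneous image coordinates to UTM. The sketch below (matrix values are purely illustrative, not from the patent) shows how two cameras that see the same target at different pixels agree on its UTM position:

```python
def mat_mul(a, b):
    """3x3 row-major matrix product; with B extended by the row (0, 0, 1),
    the second transformation matrix is S = mat_mul(B_ext, T)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def image_to_utm(s, px, py):
    """Map pixel (px, py) to UTM (E, N) by applying S to the homogeneous
    image point and dehomogenizing."""
    e, n, w = (sum(s[i][k] * v for k, v in enumerate((px, py, 1.0)))
               for i in range(3))
    return (e / w, n / w)

# Two cameras with different (illustrative) S matrices observe the same
# target at different pixel coordinates but agree on its UTM position:
S1 = [[1.0, 0.0, 500000.0], [0.0, 1.0, 4000000.0], [0.0, 0.0, 1.0]]
S2 = [[2.0, 0.0, 500000.0], [0.0, 2.0, 4000000.0], [0.0, 0.0, 1.0]]
```

Here pixel (120, 80) in camera 1 and pixel (60, 40) in camera 2 both map to the same UTM point, which is the linked-positioning property the patent describes.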
Embodiment 2
Referring to Fig. 2, the second embodiment of the present invention is:
A camera calibration terminal 100, corresponding to the method of the first embodiment, comprising a memory 1, a processor 2, and a computer program stored in the memory 1 and executable on the processor 2, the processor 2 implementing the following steps when executing the computer program:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
acquiring the image coordinates of a principal point and of three vanishing points in the image coordinate system;
calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points;
acquiring the image coordinates of three known points and their corresponding UTM coordinates;
calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
Further, the processor 2, when executing the computer program, implements the following steps before acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system:
acquiring three pairs of vanishing directions from the image information;
and calculating the three vanishing points from the three pairs of vanishing directions.
Further, obtaining the second transformation matrix from the camera matrix and the first transformation matrix specifically comprises: left-multiplying the camera matrix by the first transformation matrix to obtain the second transformation matrix.
In summary, the camera calibration method and terminal provided by the invention realize automatic conversion between image coordinates and UTM coordinates, and enable linked positioning across multiple cameras.
The above description presents only embodiments of the present invention and does not limit the scope of the invention; all equivalent changes made using the contents of the specification and drawings, whether applied directly or indirectly in related technical fields, fall within the scope of the invention.

Claims (6)

1. A camera calibration method, comprising:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
acquiring image coordinates of a principal point and of three vanishing points in the image coordinate system;
calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points;
acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
2. The camera calibration method according to claim 1, wherein before acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system, the method further comprises:
acquiring three pairs of vanishing directions according to the image information;
and calculating the three vanishing points according to the three pairs of vanishing directions.
3. The camera calibration method according to claim 1, wherein obtaining the second transformation matrix according to the camera matrix and the first transformation matrix specifically comprises: left-multiplying the camera matrix by the first transformation matrix to obtain the second transformation matrix.
4. A camera calibration terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the following steps:
acquiring image information of a camera;
establishing an image coordinate system according to the image information;
acquiring image coordinates of a principal point and of three vanishing points in the image coordinate system;
calculating a camera matrix according to the image coordinates of the principal point and the three vanishing points;
acquiring image coordinates of three known points and corresponding UTM coordinates;
calculating a first transformation matrix according to the image coordinates of the three known points, the corresponding UTM coordinates and the camera matrix;
and obtaining a second transformation matrix according to the camera matrix and the first transformation matrix.
5. The camera calibration terminal according to claim 4, wherein the processor, when executing the computer program, further implements the following steps before acquiring the image coordinates of the principal point and the three vanishing points in the image coordinate system:
acquiring three pairs of vanishing directions according to the image information;
and calculating the three vanishing points according to the three pairs of vanishing directions.
6. The camera calibration terminal according to claim 4, wherein obtaining the second transformation matrix according to the camera matrix and the first transformation matrix specifically comprises: left-multiplying the camera matrix by the first transformation matrix to obtain the second transformation matrix.
CN202010263368.2A 2020-04-07 2020-04-07 Camera calibration method and terminal Active CN111462251B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010263368.2A CN111462251B (en) 2020-04-07 2020-04-07 Camera calibration method and terminal


Publications (2)

Publication Number Publication Date
CN111462251A 2020-07-28
CN111462251B CN111462251B (en) 2021-05-11

Family

ID=71681639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010263368.2A Active CN111462251B (en) 2020-04-07 2020-04-07 Camera calibration method and terminal

Country Status (1)

Country Link
CN (1) CN111462251B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140300637A1 (en) * 2013-04-05 2014-10-09 Nokia Corporation Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
CN104574406A (en) * 2015-01-16 2015-04-29 大连理工大学 Joint calibration method between 360-degree panorama laser and multiple visual systems
CN108447100A (en) * 2018-04-26 2018-08-24 王涛 A kind of eccentric vector sum Collimation axis eccentricity angle scaling method of airborne TLS CCD camera
CN109146958A (en) * 2018-08-15 2019-01-04 北京领骏科技有限公司 A kind of traffic sign method for measuring spatial location based on two dimensional image
CN110033492A (en) * 2019-04-17 2019-07-19 深圳金三立视频科技股份有限公司 Camera marking method and terminal
CN110378965A (en) * 2019-05-21 2019-10-25 北京百度网讯科技有限公司 Determine the method, apparatus, equipment and storage medium of coordinate system conversion parameter
CN110766760A (en) * 2019-10-21 2020-02-07 北京百度网讯科技有限公司 Method, device, equipment and storage medium for camera calibration




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant