CN112102413A - Virtual lane line-based automatic calibration method for vehicle-mounted camera - Google Patents

Virtual lane line-based automatic calibration method for vehicle-mounted camera

Info

Publication number
CN112102413A
CN112102413A (application CN202010713419.7A)
Authority
CN
China
Prior art keywords
camera
coordinate system
coordinates
vehicle
matrix
Prior art date
Legal status
Granted
Application number
CN202010713419.7A
Other languages
Chinese (zh)
Other versions
CN112102413B (en)
Inventor
陈俊龙 (Chen Junlong)
魏宇豪 (Wei Yuhao)
曾科 (Zeng Ke)
Current Assignee
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN202010713419.7A priority Critical patent/CN112102413B/en
Publication of CN112102413A publication Critical patent/CN112102413A/en
Application granted granted Critical
Publication of CN112102413B publication Critical patent/CN112102413B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a virtual lane line-based automatic calibration method for a vehicle-mounted camera, comprising the following steps. A world coordinate system is established at the point where a vertical line through the center of the vehicle's rear axle meets the ground, with the Z-axis pointing straight ahead of the vehicle, the X-axis to the right of the direction of travel, and the Y-axis vertically downward; a camera coordinate system is also established. A single picture is taken with the camera from the center of the lane ahead of the vehicle and the lane width is measured; in the top view of the world coordinate system, a rectangular frame formed by the virtual lane lines on both sides is selected as the calibration figure; the relation between the four characteristic points of the rectangle and the lane width is obtained from the properties of the rectangle, and a rotation-matrix equation based on the camera coordinate system is obtained from the properties of orthogonal matrices and the coordinate transformation between the camera and world coordinate systems. The camera coordinates are converted into pixel coordinates using the camera intrinsics, the pixel coordinates of the four characteristic points are acquired from the image, and the parameters ψ, θ, φ and h of the rotation matrix and translation matrix (the camera extrinsics) are then obtained.

Description

Virtual lane line-based automatic calibration method for vehicle-mounted camera
Technical Field
The invention belongs to the field of traffic, and particularly relates to a vehicle-mounted camera automatic calibration method based on virtual lane lines.
Background
To date, automatic calibration algorithms in the traffic field (covering vehicle-mounted cameras, traffic monitoring cameras, and the like) can be roughly divided, by the marker used, into algorithms based on static targets such as lane lines and algorithms based on moving targets such as vehicles and pedestrians. Compared with static-target calibration, moving-target algorithms are much more complex: they require vehicles or pedestrians to appear in the picture, and they must analyze a video sequence to obtain motion trajectories and hence vanishing points; some even impose requirements on the direction and speed of motion, so they are better suited to stationary traffic monitoring cameras. For a vehicle-mounted camera, many vehicles may appear in the scene, but the complex relative motion between vehicles makes it difficult to find a suitable target for trajectory analysis; lane lines, as stationary objects, are better suited as markers for automatic calibration of a vehicle-mounted camera.
During camera imaging, a point in the three-dimensional world is mapped to a pixel in a two-dimensional image; a geometric model can be established to describe this process, and the camera parameters are the parameters of that model. The camera intrinsics include the focal length, optical-center position, distortion coefficients, and so on; the camera extrinsics comprise a rotation matrix and a translation matrix. The purpose of camera calibration is to obtain these parameters, and the calibration accuracy directly affects the visual perception and localization of an autonomous vehicle. Traditional calibration methods determine the camera parameters from specific points on a calibration board, so they are only suitable for static conditions and are generally used to calibrate the intrinsics. While the vehicle is driving, the extrinsics of the vehicle-mounted camera may change due to factors such as road bumps and vehicle-body vibration (the intrinsics do not change), and must be recalibrated. Lane lines are generally present in driving scenes, and their properties, such as parallelism and known lane width, can be used to calibrate the camera extrinsics automatically.
Disclosure of Invention
The invention aims to provide a vehicle-mounted camera automatic calibration method based on a virtual lane line aiming at the defects of the prior art.
The invention is realized by adopting the following technical scheme:
a vehicle-mounted camera automatic calibration method based on virtual lane lines comprises the following steps:
1) a world coordinate system is established at the intersection point of the center of a rear axle of the vehicle and the ground vertically downwards, the Z axis is arranged right in front of the vehicle, the X axis is arranged on the right side of the advancing direction, and the Y axis is arranged vertically downwards; establishing a camera coordinate system, wherein the coordinate of the origin of the camera coordinate system in a world coordinate system is (d, h, l);
2) taking a single picture at the center of a lane in front of a vehicle by using a camera, measuring the lane width, selecting a rectangular frame formed by virtual lane lines on two sides as a calibration graph under the overlooking view angle of a world coordinate system, obtaining the relation between four characteristic points of the rectangle and the lane width according to the rectangular property, and obtaining a rotation matrix equation based on the camera coordinate system according to the orthogonal matrix property and the coordinate transformation relation between the camera coordinate system and the world coordinate system;
3) the camera intrinsics do not change while the vehicle is driving; the camera coordinates are converted into pixel coordinates using the camera intrinsics, the pixel coordinates of the four characteristic points are then acquired from the image, and the four parameters ψ, θ, φ and h of the rotation matrix and translation matrix (the camera extrinsics) are obtained.
The further improvement of the invention is that the specific implementation method of the step 2) is as follows:
101) the transformation model introduced between the world coordinate system W and the camera coordinate system C is as follows
Pc=R·Pw+T
Wherein R represents a rotation matrix and T represents a translation matrix;
since the rotation matrix R is an orthogonal matrix, the formula is rewritten as follows according to the properties of orthogonal matrices:
        Pw = R^(-1)·Pc − R^(-1)·T = R^T·Pc − R^T·T
in the formula, the practical meaning of −R^T·T is the coordinates of the origin of the camera coordinate system in the world coordinate system;
for ease of understanding and calculation, r_mn is used here to represent the elements of the rotation matrix R, and the formula is rewritten in matrix form as follows:
        [Xw]   [r11 r21 r31][Xc]   [d]
        [Yw] = [r12 r22 r32][Yc] + [h]        (1)
        [Zw]   [r13 r23 r33][Zc]   [l]
102) let the four vertices of the rectangle be A, B, C and D, with A and C on one virtual lane line and B and D on the other, distributed along the direction of travel; the lane width is width; the following formula is obtained from the properties of the rectangle:
        XB − XA = XD − XC = width,   ZA = ZB,   ZC = ZD,   YA = YB = YC = YD = 0        (2)
103) since the world coordinates are unknown, substituting equation (1) into equation (2) converts the world coordinates into camera coordinates, as follows:
        [equation (3), rendered as an image in the original]
at this point the equation no longer contains the world coordinates of each point, only their camera coordinates; since the camera intrinsics do not change while the vehicle is driving and are therefore known, they are used to convert the camera coordinates into pixel coordinates.
The further improvement of the invention is that the specific implementation method of the step 3) is as follows:
201) introducing a transformation model between a classical pixel coordinate system and a world coordinate system, wherein the transformation model comprises the following formula:
             [u]   [fx  0  u0  0]  [R    T]  [Xw]
        Zc · [v] = [ 0  fy  v0 0] ·[       ]·[Yw]
             [1]   [ 0  0   1  0]  [0^T  1]  [Zw]
                                             [ 1]
in the formula, fx = f/dx and fy = f/dy are called the normalized focal lengths of the x-axis and y-axis; dx and dy respectively denote the physical size of one pixel in the x and y directions; f is the camera focal length; (u0, v0) are the coordinates of the origin of the image coordinate system in the pixel coordinate system; R denotes the camera rotation matrix and T the camera translation matrix;
202) combining a conversion model between the world coordinate system and the camera coordinate system with a conversion model between the pixel coordinate system and the world coordinate system to obtain a conversion model between the camera coordinate system and the pixel coordinate system, which is as follows:
             [u]   [fx  0  u0][Xc]
        Zc · [v] = [ 0  fy v0][Yc]
             [1]   [ 0  0   1][Zc]
expansion gives the following formula:
        u = fx·Xc/Zc + u0,   v = fy·Yc/Zc + v0        ②
in the formula, let
        m = (u − u0)/fx = Xc/Zc,   n = (v − v0)/fy = Yc/Zc
fx, fy, u0 and v0 are all known parameters;
substituting the above into the last equation of formula ① yields the following equation:
        [equation ③, rendered as an image in the original]
203) solving an external parameter matrix R, T;
substituting formulas ② and ③ into formula ① eliminates the camera coordinates of each point, leaving only parameters related to the pixel coordinates, which can be obtained directly from the image; the equation simplifies to the following form:
        [equation, rendered as an image in the original]
the formula contains only the four unknowns ψ, θ, φ and h, so simultaneous solution yields:
        [equation, rendered as an image in the original]
in the formula, F_AC = (m_C − m_A) + tanφ·m_A·m_C·(n_A − n_C);  G_AC = sinφ·(m_C − m_A) + cosφ·m_A·m_C·(n_A − n_C);  F_BD = (m_D − m_B) + tanφ·m_B·m_D·(n_B − n_D);  G_BD = sinφ·(m_D − m_B) + cosφ·m_B·m_D·(n_B − n_D)
The four parameters ψ, θ, φ and h of the camera-extrinsic rotation matrix R and translation matrix T are thus solved.
The invention has at least the following beneficial technical effects:
the invention provides an automatic calibration method of a vehicle-mounted camera based on virtual lane lines, which is characterized in that a rectangle formed by the virtual lane lines on two sides in an image obtained by the vehicle-mounted camera in real time is used as a calibration object, and the calibration work of the camera can be completed at one time by automatically calibrating external parameters of the camera by utilizing the characteristics of parallelism of the lane lines, known lane width and the like in a common driving scene. The calibration method provided by the invention can realize real-time automatic calibration aiming at the camera external parameter changes caused by road jolt, vehicle body vibration and the like in the vehicle driving process, and has the advantages of simple operation, convenient measurement, good real-time performance and the like.
Furthermore, because the introduced unknown variable only has the lane width, and the coordinates of the four points of the virtual lane rectangle are converted into the coordinates under the camera coordinate system through the conversion relation between the camera coordinate system and the world coordinate system, the invention has the advantages of less selected calibration parameters and convenient measurement. In claim 3, the coordinates of four points obtained in the camera coordinate system are converted into the pixel coordinate system by the internal reference of the camera, the coordinates of the four points are converted into the pixel coordinates in the pixel coordinate system in the image directly obtained from the image, and the equations of four parameters ψ, θ, φ and h with respect to the camera external reference rotation matrix R and the translation matrix T are converted into equations with respect to only the lane width, and the external reference of the camera is obtained by simultaneous solution. Therefore, the virtual lane line-based vehicle-mounted camera automatic calibration method provided by the invention is simple to operate, uses few calibration parameters, is convenient and fast to measure, and has excellent universality and good real-time property.
Drawings
FIG. 1 is a schematic of a world coordinate system to a camera coordinate system.
FIG. 2 is a schematic diagram of a world coordinate system with a point rotated by angle ψ about the X-axis.
Fig. 3 is a schematic diagram of a camera coordinate system to an image coordinate system.
FIG. 4 is a diagram of an image coordinate system to a pixel coordinate system.
Fig. 5 is a schematic diagram of a positional relationship between the vehicle and the camera, in which fig. 5(a) is a front view, fig. 5(b) is a side view, and fig. 5(c) is a top view.
FIG. 6 is a schematic diagram of a rectangle formed by two side dashed lane lines.
Fig. 7 is an image of an actual road scene calibrated by Opencv.
FIG. 8 is an image calibrated by the method of the present invention.
Detailed Description
The invention is further described below with reference to the following figures and examples.
Basic theory of camera calibration
The process of capturing an image by the camera is an optical imaging process. The process involves the following four coordinate systems:
Pixel coordinate system: denoted (u, v); origin at the upper-left corner of the image, u-axis horizontal to the right, v-axis vertical downward; unit: pixels.
Image coordinate system: denoted (x, y); origin at the image center, x-axis horizontal to the right, y-axis vertical downward; physical units.
Camera coordinate system: denoted (Xc, Yc, Zc); origin at the optical center of the lens, X and Y axes parallel to the sides of the image plane, Z axis along the lens optical axis, perpendicular to the image plane; physical units.
World coordinate system: denoted (Xw, Yw, Zw); its position is not fixed and is defined by the user; physical units.
World to camera coordinate system
The transformation process from the world coordinate system to the camera coordinate system belongs to rigid body transformation, namely, an object does not deform in the transformation process, and only rotation operation and translation operation are required to be carried out on the coordinate system. The relationship between the world coordinate system and the camera coordinate system is shown in fig. 1, where R represents the rotation matrix and T represents the translation matrix.
Assume a point P whose coordinates are Pw(Xw, Yw, Zw) in the world coordinate system and Pc(Xc, Yc, Zc) in the camera coordinate system. Then Pc and Pw satisfy the following relationship:
Pc=R·Pw+T (5-1)
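This rigid-body transform and its inversion via the orthogonality of R (used throughout the derivation below) can be sketched in a few lines of numpy; the rotation angle and translation here are arbitrary illustrative values, not values from the patent:

```python
import numpy as np

# Arbitrary example extrinsics: rotate 10 degrees about X, translate by T.
psi = np.radians(10.0)
R = np.array([[1, 0, 0],
              [0, np.cos(psi), np.sin(psi)],
              [0, -np.sin(psi), np.cos(psi)]])
T = np.array([0.0, -1.5, -2.0])

Pw = np.array([1.0, 0.0, 10.0])   # a point on the ground plane, world frame
Pc = R @ Pw + T                   # world -> camera: Pc = R*Pw + T  (5-1)

# Because R is orthogonal, inverting needs no matrix inversion:
Pw_back = R.T @ Pc - R.T @ T      # Pw = R^T*Pc - R^T*T
assert np.allclose(Pw_back, Pw)

# -R^T*T is the camera origin expressed in world coordinates:
cam_origin_world = -R.T @ T
assert np.allclose(R @ cam_origin_world + T, np.zeros(3))
```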
since the camera coordinate system can be derived from the world coordinate system by rotational translation, the present invention first rotates point P by an angle ψ about the X-axis, as shown in fig. 2:
from the relationship between the two coordinate systems in fig. 2, the matrix form of the world coordinate system rotated by ψ about the X-axis can be obtained as shown in equation (5-2):
        [X′]   [1    0      0  ][X]
        [Y′] = [0   cosψ  sinψ ][Y]        (5-2)
        [Z′]   [0  −sinψ  cosψ ][Z]
in the same way, the coordinate change relationship after rotating the angle theta around the Y axis and the angle phi around the Z axis is shown as the formula (5-3).
        [X′]   [cosθ  0  −sinθ][X]        [X′]   [ cosφ  sinφ  0][X]
        [Y′] = [ 0    1    0  ][Y],       [Y′] = [−sinφ  cosφ  0][Y]        (5-3)
        [Z′]   [sinθ  0   cosθ][Z]        [Z′]   [  0     0    1][Z]
The rotation matrix R is then:
        [equation (5-4): the expanded composite rotation matrix R, rendered as an image in the original]
the relationship between the camera coordinate system and the world coordinate system can be obtained, and because the elements in the rotation matrix R are long, for the convenience of understanding and expression, R and the subscript are collectively expressed as shown in formula (5-5):
Figure BDA0002597368750000083
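Since the expanded form of R is only abbreviated to r_mn in the text, a quick numeric check that a matrix composed of the three elementary rotations is indeed orthogonal (which is what licenses R^(-1) = R^T later) may help; the composition order below is an assumption, as the patent shows the expanded matrix (5-4) only as an image:

```python
import numpy as np

def rot_x(psi):
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

def rot_y(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

def rot_z(phi):
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])

psi, theta, phi = np.radians([2.0, -1.0, 0.5])
# Composition order is an assumption here, not taken from the patent.
R = rot_x(psi) @ rot_y(theta) @ rot_z(phi)

# R is orthogonal: R^T R = I and det(R) = 1, so R^{-1} = R^T.
assert np.allclose(R.T @ R, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)
```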
camera coordinate system to image coordinate system
This process is a process of converting from a three-dimensional coordinate system to a two-dimensional planar coordinate system, and the two coordinate systems are in a perspective projection relationship and conform to the triangle similarity theorem. The relationship between the two coordinate systems is shown in fig. 3, where f is the camera focal length.
As can be seen from the figure above, PO_c is the line connecting the point Pc(Xc, Yc, Zc) and the optical center O_c, and its intersection with the imaging plane is the projection point p(x, y) of the spatial point Pc(Xc, Yc, Zc) on the imaging plane. Two pairs of similar triangles are thus obtained, ΔABO_c ∼ ΔoCO_c and ΔPBO_c ∼ ΔpCO_c, whose similarity relationships give formula (5-6):
        x = f·Xc/Zc,   y = f·Yc/Zc        (5-6)
rewriting the above formula in matrix form:
             [x]   [f  0  0  0][Xc]
        Zc · [y] = [0  f  0  0][Yc]        (5-7)
             [1]   [0  0  1  0][Zc]
                               [ 1]
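The similar-triangle projection can be exercised numerically; the focal length and point below are arbitrary example values:

```python
import numpy as np

f = 0.004                          # example focal length (4 mm), an assumed value
Pc = np.array([0.5, -0.2, 10.0])   # a point in the camera frame (Zc = 10 m)

# Similar triangles (5-6): x = f*Xc/Zc, y = f*Yc/Zc
x = f * Pc[0] / Pc[2]
y = f * Pc[1] / Pc[2]

# The homogeneous matrix form (5-7) gives the same image-plane point:
P = np.array([[f, 0, 0, 0],
              [0, f, 0, 0],
              [0, 0, 1, 0]])
xy1 = P @ np.append(Pc, 1.0) / Pc[2]
assert np.allclose(xy1, [x, y, 1.0])
```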
image coordinate system to pixel coordinate system
No rotation is involved in this conversion, but the origins of the two coordinate systems do not coincide and their units differ, so the conversion is realized through scaling and translation. The relationship between the two coordinate systems is shown in Fig. 4, where (u0, v0) denotes the coordinates of the origin of the image coordinate system in the pixel coordinate system, and p(x, y) is the projection of the spatial point Pc(Xc, Yc, Zc) on the imaging plane.
The relationship between the two coordinate systems can therefore be represented by:
        u = x/dx + u0,   v = y/dy + v0        (5-8)
in the formula, dx and dy respectively represent the physical size of one pixel in the x and y directions. The above formula is then expressed in homogeneous coordinates and matrix form as follows:
        [u]   [1/dx   0    u0][x]
        [v] = [  0   1/dy  v0][y]        (5-9)
        [1]   [  0    0     1][1]
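A small numeric sketch of this scaling-plus-translation step, with an assumed pixel size and principal point:

```python
import numpy as np

# Example values (assumptions): 2 um square pixels, principal point (960, 540).
dx = dy = 2e-6
u0, v0 = 960.0, 540.0

x, y = 2.0e-4, -1.0e-4   # a point on the image plane, physical units (m)

# Scaling + translation (5-8): u = x/dx + u0, v = y/dy + v0
u = x / dx + u0
v = y / dy + v0

# The homogeneous form (5-9) gives the same pixel coordinates:
K2 = np.array([[1 / dx, 0, u0],
               [0, 1 / dy, v0],
               [0, 0, 1]])
uv1 = K2 @ np.array([x, y, 1.0])
assert np.allclose(uv1, [u, v, 1.0])
assert np.isclose(u, 1060.0) and np.isclose(v, 490.0)
```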
Up to this point, the matrix relationships between the four coordinate systems have been obtained. Combining (5-5), (5-7) and (5-9) finally gives the coordinate transformation between the pixel coordinate system and the world coordinate system, whose matrix form is shown in formula (5-10):
             [u]   [fx  0  u0  0]  [R    T]  [Xw]
        Zc · [v] = [ 0  fy  v0 0] ·[       ]·[Yw]        (5-10)
             [1]   [ 0  0   1  0]  [0^T  1]  [Zw]
                                             [ 1]
in the formula, fx = f/dx and fy = f/dy, called the normalized focal lengths of the x-axis and y-axis, respectively.
In formula (5-10), the first matrix after the second equals sign is the camera intrinsic matrix and the second is the camera extrinsic matrix. The camera intrinsics thus mainly comprise the four parameters fx, fy, u0 and v0 plus the distortion coefficients, and reflect the relationship between the camera coordinate system and the pixel coordinate system; the camera extrinsics are six parameters, namely ψ, θ, φ and the three elements of the translation matrix T, and reflect the relationship between the world coordinate system and the camera coordinate system.
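Chaining the extrinsic and intrinsic transforms of (5-10) gives a compact world-to-pixel projection; all parameter values below are assumed for illustration only:

```python
import numpy as np

# Assumed example intrinsics and extrinsics (not the patent's values).
fx, fy, u0, v0 = 1000.0, 1000.0, 960.0, 540.0
K = np.array([[fx, 0, u0], [0, fy, v0], [0, 0, 1]])

psi = np.radians(5.0)           # small pitch about X; theta = phi = 0 for simplicity
R = np.array([[1, 0, 0],
              [0, np.cos(psi), np.sin(psi)],
              [0, -np.sin(psi), np.cos(psi)]])
T = np.array([0.0, -1.3, 0.0])  # translation chosen arbitrarily

def world_to_pixel(Pw):
    """Apply (5-10): extrinsics first, then intrinsics, then divide by Zc."""
    Pc = R @ Pw + T
    uvw = K @ Pc
    return uvw[:2] / uvw[2]

uv = world_to_pixel(np.array([0.0, 0.0, 20.0]))
# The point lies on the camera's vertical center line, so u equals u0.
assert np.isclose(uv[0], u0)
```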
Virtual lane line-based automatic calibration method for vehicle-mounted camera
As shown in fig. 5, a world coordinate system and a camera coordinate system are established. The default camera coordinate system is along the optical axis as the Z-axis, to the right as the X-axis, and vertically down as the Y-axis. The origin of the world coordinate system is vertically downward at the center of a rear axle of the vehicle and intersects with the ground, the Z axis is arranged right in front of the vehicle, the X axis is arranged on the right side of the advancing direction, the Y axis is vertically downward, and the coordinates of the origin of the camera coordinate system in the world coordinate system are (d, h, l).
The relative position of the camera and the vehicle changes with vehicle vibration during driving. In general, the three rotation angles and the camera height among the extrinsics change noticeably, while d and l remain essentially constant, so the automatic calibration algorithm of the invention mainly computes four parameters: ψ, θ, φ and h. Assuming that the road surface is flat and the direction of travel of the vehicle is parallel to the lane lines, the invention selects a rectangular frame formed by the two virtual lane lines as the calibration figure, as shown in fig. 6:
in the top view of the world coordinate system, the invention considers that four points of ABCD form a rectangle, and the formula (5-11) can be obtained according to the property of the rectangle:
        XB − XA = XD − XC = width,   ZA = ZB,   ZC = ZD,   YA = YB = YC = YD = 0        (5-11)
in the formula, width represents the lane width.
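Under the stated assumptions (flat road, travel direction parallel to the lane lines), the four calibration points can be laid out and the rectangle constraints checked numerically; the lane width and point spacing below are arbitrary example values, not the patent's:

```python
import numpy as np

width = 3.5  # example lane width in metres (an assumed value)

# Four ground-plane points A, B, C, D in the world frame (Y = 0 on the road),
# laid out as in Fig. 6: A/C on the left lane line, B/D on the right.
A = np.array([-width / 2, 0.0, 10.0])
B = np.array([ width / 2, 0.0, 10.0])
C = np.array([-width / 2, 0.0, 16.0])
D = np.array([ width / 2, 0.0, 16.0])

# Properties one would expect of such a ground-plane rectangle:
assert np.isclose(B[0] - A[0], width) and np.isclose(D[0] - C[0], width)
assert np.isclose(A[2], B[2]) and np.isclose(C[2], D[2])
assert all(np.isclose(p[1], 0.0) for p in (A, B, C, D))
```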
In the formula (5-1), since the rotation matrix R is an orthogonal matrix, the formula can be rewritten as follows according to the property of the orthogonal matrix:
Pw=R-1Pc-R-1T=RTPc-RTT (5-12)
in the formula, -RTThe practical meaning of T is the coordinates of the origin of the camera coordinate system in the world coordinate system.
The equations (5-12) are rewritten to a matrix form as shown in equations (5-13), and r is used here for easy understanding and calculationmnRepresenting the elements in the rotation matrix R.
Figure BDA0002597368750000111
Since world coordinates are unknown, the present invention substitutes equation (5-13) into equation (5-11), which converts the world coordinates to camera coordinates, as shown in equation (5-14).
        [equation (5-14), rendered as an image in the original]
At this point, the equation no longer contains the world coordinates of each point, only their camera coordinates. As mentioned above, the camera intrinsics do not change while the vehicle is driving, so they are known here and can be used to convert the camera coordinates into pixel coordinates.
From formula (5-10), the relationship between the camera coordinate system and the pixel coordinate system is as follows:
             [u]   [fx  0  u0][Xc]
        Zc · [v] = [ 0  fy v0][Yc]        (5-15)
             [1]   [ 0  0   1][Zc]
Expanding formula (5-15) yields:
        u = fx·Xc/Zc + u0,   v = fy·Yc/Zc + v0        (5-16)
in the formula, let
        m = (u − u0)/fx = Xc/Zc,   n = (v − v0)/fy = Yc/Zc
fx, fy, u0 and v0 are all known parameters, so m and n can be calculated once the pixel coordinates of each point are acquired from the image.
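The normalized coordinates m and n can be recovered from pixel coordinates and intrinsics alone, as this round-trip sketch shows (all intrinsic values are assumed examples):

```python
import numpy as np

fx, fy, u0, v0 = 1000.0, 1000.0, 960.0, 540.0  # assumed intrinsics

def normalized_coords(u, v):
    """m = (u - u0)/fx and n = (v - v0)/fy equal the ray ratios Xc/Zc and Yc/Zc."""
    return (u - u0) / fx, (v - v0) / fy

# Round trip: project a camera-frame point with (5-16), then recover m and n.
Xc, Yc, Zc = 1.0, 0.5, 10.0
u = fx * Xc / Zc + u0
v = fy * Yc / Zc + v0
m, n = normalized_coords(u, v)
assert np.isclose(m, Xc / Zc) and np.isclose(n, Yc / Zc)
```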
Substituting formula (5-16) into the last equation of formula (5-14) yields:
        [equation (5-17), rendered as an image in the original]
Substituting formulas (5-16) and (5-17) into formula (5-14) eliminates the camera coordinates of each point, leaving only parameters related to the pixel coordinates, which can be obtained directly from the image. The equation simplifies to the following form:
        [equation (5-18), rendered as an image in the original]
Formula (5-18) contains only the four unknowns ψ, θ, φ and h, and hence can be solved simultaneously:
        [equation (5-19), rendered as an image in the original]
in the formula, F_AC = (m_C − m_A) + tanφ·m_A·m_C·(n_A − n_C);  G_AC = sinφ·(m_C − m_A) + cosφ·m_A·m_C·(n_A − n_C);
F_BD = (m_D − m_B) + tanφ·m_B·m_D·(n_B − n_D);  G_BD = sinφ·(m_D − m_B) + cosφ·m_B·m_D·(n_B − n_D).
Fig. 7 shows an actual road-scene image calibrated with OpenCV (an open-source computer vision and machine learning library released under the BSD license): after the vehicle was parked, the world coordinates of 8 points on the lane lines were measured, and the OpenCV built-in function solvePnP obtained the extrinsics from the world coordinates and the corresponding image coordinates. Fig. 8 shows calibration using the algorithm of the invention, in which the four vertices of a rectangle formed by dashed lane lines are selected. Table 1 compares the calibration results and errors of the proposed virtual lane line-based automatic calibration method and the OpenCV method in an actual road scene.
        [Table 1, rendered as an image in the original]

Claims (3)

1. A vehicle-mounted camera automatic calibration method based on virtual lane lines is characterized by comprising the following steps:
1) a world coordinate system is established at the intersection point of the center of a rear axle of the vehicle and the ground vertically downwards, the Z axis is arranged right in front of the vehicle, the X axis is arranged on the right side of the advancing direction, and the Y axis is arranged vertically downwards; establishing a camera coordinate system, wherein the coordinate of the origin of the camera coordinate system in a world coordinate system is (d, h, l);
2) taking a single picture at the center of a lane in front of a vehicle by using a camera, measuring the lane width, selecting a rectangular frame formed by virtual lane lines on two sides as a calibration graph under the overlooking view angle of a world coordinate system, obtaining the relation between four characteristic points of the rectangle and the lane width according to the rectangular property, and obtaining a rotation matrix equation based on the camera coordinate system according to the orthogonal matrix property and the coordinate transformation relation between the camera coordinate system and the world coordinate system;
3) the camera intrinsics do not change while the vehicle is driving; the camera coordinates are converted into pixel coordinates using the camera intrinsics, the pixel coordinates of the four characteristic points are then acquired from the image, and the four parameters ψ, θ, φ and h of the rotation matrix and translation matrix (the camera extrinsics) are obtained.
2. The method for automatically calibrating the vehicle-mounted camera based on the virtual lane line according to claim 1, wherein the method for specifically implementing the step 2) is as follows:
101) the transformation model introduced between the world coordinate system W and the camera coordinate system C is as follows
Pc=R·Pw+T
Wherein R represents a rotation matrix and T represents a translation matrix;
since the rotation matrix R is an orthogonal matrix, the formula is rewritten as follows according to the properties of orthogonal matrices:
        Pw = R^(-1)·Pc − R^(-1)·T = R^T·Pc − R^T·T
in the formula, the practical meaning of −R^T·T is the coordinates of the origin of the camera coordinate system in the world coordinate system;
for ease of understanding and calculation, r_mn is used here to represent the elements of the rotation matrix R, and the formula is rewritten in matrix form as follows:
        [Xw]   [r11 r21 r31][Xc]   [d]
        [Yw] = [r12 r22 r32][Yc] + [h]        (1)
        [Zw]   [r13 r23 r33][Zc]   [l]
102) let the four vertices of the rectangle be A, B, C and D, with A and C on one virtual lane line and B and D on the other, distributed along the direction of travel; the lane width is width; the following formula is obtained from the properties of the rectangle:
        XB − XA = XD − XC = width,   ZA = ZB,   ZC = ZD,   YA = YB = YC = YD = 0        (2)
103) since the world coordinates are unknown, substituting equation (1) into equation (2) converts the world coordinates into camera coordinates, as follows:
        [equation (3), rendered as an image in the original]
at this point the equation no longer contains the world coordinates of each point, only their camera coordinates; since the camera intrinsics do not change while the vehicle is driving and are therefore known, they are used to convert the camera coordinates into pixel coordinates.
3. The method for automatically calibrating the vehicle-mounted camera based on the virtual lane line according to claim 1, wherein the specific implementation method of the step 3) is as follows:
201) introducing a transformation model between a classical pixel coordinate system and a world coordinate system, wherein the transformation model comprises the following formula:
             [u]   [fx  0  u0  0]  [R    T]  [Xw]
        Zc · [v] = [ 0  fy  v0 0] ·[       ]·[Yw]
             [1]   [ 0  0   1  0]  [0^T  1]  [Zw]
                                             [ 1]
in the formula, fx = f/dx and fy = f/dy are called the normalized focal lengths of the x-axis and y-axis; dx and dy respectively denote the physical size of one pixel in the x and y directions; f is the camera focal length; (u0, v0) are the coordinates of the origin of the image coordinate system in the pixel coordinate system; R denotes the camera rotation matrix and T the camera translation matrix;
202) combining a conversion model between the world coordinate system and the camera coordinate system with a conversion model between the pixel coordinate system and the world coordinate system to obtain a conversion model between the camera coordinate system and the pixel coordinate system, which is as follows:
             [u]   [fx  0  u0][Xc]
        Zc · [v] = [ 0  fy v0][Yc]
             [1]   [ 0  0   1][Zc]
expansion gives the following formula:
        u = fx·Xc/Zc + u0,   v = fy·Yc/Zc + v0        ②
in the formula, let
        m = (u − u0)/fx = Xc/Zc,   n = (v − v0)/fy = Yc/Zc
fx, fy, u0 and v0 are all known parameters;
substituting the above into the last equation of formula ① yields the following equation:
        [equation ③, rendered as an image in the original]
203) solving an external parameter matrix R, T;
substituting the formulas II and III into the formula I to eliminate the camera coordinates of each point, only leaving parameters related to the pixel coordinates, and directly obtaining the pixel coordinates from the image; the equation is simplified to the following equation:
Figure FDA0002597368740000035
the formula only includes psi, theta, phi and h four unknowns, so simultaneous solution can obtain the following formula:
Figure FDA0002597368740000041
in the formula, F_AC = (m_C − m_A) + tan φ · m_A m_C (n_A − n_C); G_AC = sin φ · (m_C − m_A) + cos φ · m_A m_C (n_A − n_C); F_BD = (m_D − m_B) + tan φ · m_B m_D (n_B − n_D); G_BD = sin φ · (m_D − m_B) + cos φ · m_B m_D (n_B − n_D)
The four parameters ψ, θ, φ and h of the camera extrinsic rotation matrix R and translation matrix T are thus solved.
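The intermediate quantities F and G above can be computed directly from the image coordinates of the four points. A hedged sketch: the claim's images do not define m_i and n_i, so the point coordinates and the angle φ below are invented purely to exercise the stated formulas:

```python
import math

def F_G(m1, n1, m2, n2, phi):
    """F and G for one point pair, following the claim's definitions, e.g.
    F_AC = (mC - mA) + tan(phi) * mA * mC * (nA - nC) and
    G_AC = sin(phi) * (mC - mA) + cos(phi) * mA * mC * (nA - nC)."""
    F = (m2 - m1) + math.tan(phi) * m1 * m2 * (n1 - n2)
    G = math.sin(phi) * (m2 - m1) + math.cos(phi) * m1 * m2 * (n1 - n2)
    return F, G

# Invented coordinates for the pair (A, C); with phi = 0 the formulas reduce
# to F_AC = mC - mA and G_AC = mA * mC * (nA - nC)
F_AC, G_AC = F_G(0.1, 0.2, 0.3, 0.5, 0.0)
print(F_AC, G_AC)
```

The same helper evaluated on the pair (B, D) gives F_BD and G_BD, which together with F_AC and G_AC feed the simultaneous solution for ψ, θ, φ and h.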
CN202010713419.7A 2020-07-22 2020-07-22 Virtual lane line-based automatic calibration method for vehicle-mounted camera Active CN112102413B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010713419.7A CN112102413B (en) 2020-07-22 2020-07-22 Virtual lane line-based automatic calibration method for vehicle-mounted camera

Publications (2)

Publication Number Publication Date
CN112102413A true CN112102413A (en) 2020-12-18
CN112102413B CN112102413B (en) 2022-12-09

Family

ID=73749988

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010713419.7A Active CN112102413B (en) 2020-07-22 2020-07-22 Virtual lane line-based automatic calibration method for vehicle-mounted camera

Country Status (1)

Country Link
CN (1) CN112102413B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150332446A1 (en) * 2014-05-16 2015-11-19 GM Global Technology Operations LLC Surround-view camera system (vpm) and vehicle dynamic
CN108898638A (en) * 2018-06-27 2018-11-27 江苏大学 A kind of on-line automatic scaling method of vehicle-mounted camera
CN110008893A (en) * 2019-03-29 2019-07-12 武汉理工大学 A kind of automobile driving running deviation automatic testing method based on vehicle-mounted imaging sensor

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KUNFENG WANG et al.: "Research on Lane-Marking Line Based Camera Calibration", 2007 IEEE INTERNATIONAL CONFERENCE ON VEHICULAR ELECTRONICS AND SAFETY *
M.B. DE PAULA et al.: "Automatic on-the-fly extrinsic camera calibration of onboard vehicular cameras", EXPERT SYSTEMS WITH APPLICATIONS *
WU Huayue et al.: "Multi-interference lane line detection based on IPM and edge image filtering", China Journal of Highway and Transport *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112785653A (en) * 2020-12-30 2021-05-11 中山联合汽车技术有限公司 Vehicle-mounted camera attitude angle calibration method
CN112785653B (en) * 2020-12-30 2024-06-21 中山联合汽车技术有限公司 Vehicle-mounted camera attitude angle calibration method
US12020456B2 (en) 2021-02-07 2024-06-25 Black Sesame Technologies Inc. External parameter calibration method, device and system for image acquisition apparatus
CN112927303B (en) * 2021-02-22 2023-01-24 中国重汽集团济南动力有限公司 Lane line-based automatic driving vehicle-mounted camera pose estimation method and system
CN112927303A (en) * 2021-02-22 2021-06-08 中国重汽集团济南动力有限公司 Lane line-based automatic driving vehicle-mounted camera pose estimation method and system
CN112927309A (en) * 2021-03-26 2021-06-08 苏州欧菲光科技有限公司 Vehicle-mounted camera calibration method and device, vehicle-mounted camera and storage medium
CN112927309B (en) * 2021-03-26 2024-04-09 苏州欧菲光科技有限公司 Vehicle-mounted camera calibration method and device, vehicle-mounted camera and storage medium
CN113223095A (en) * 2021-05-25 2021-08-06 中国人民解放军63660部队 Internal and external parameter calibration method based on known camera position
CN113223095B (en) * 2021-05-25 2022-06-17 中国人民解放军63660部队 Internal and external parameter calibration method based on known camera position
CN114463439B (en) * 2022-01-18 2023-04-11 襄阳达安汽车检测中心有限公司 Vehicle-mounted camera correction method and device based on image calibration technology
CN114463439A (en) * 2022-01-18 2022-05-10 襄阳达安汽车检测中心有限公司 Vehicle-mounted camera correction method and device based on image calibration technology
CN114359412B (en) * 2022-03-08 2022-05-27 盈嘉互联(北京)科技有限公司 Automatic calibration method and system for external parameters of camera facing to building digital twins
CN114359412A (en) * 2022-03-08 2022-04-15 盈嘉互联(北京)科技有限公司 Automatic calibration method and system for external parameters of camera facing to building digital twins
CN115024740A (en) * 2022-08-11 2022-09-09 晓智未来(成都)科技有限公司 Virtual radiation field display method for common X-ray photography
CN117611438A (en) * 2023-12-06 2024-02-27 浙江省交通投资集团有限公司智慧交通研究分公司 Monocular image-based reconstruction method from 2D lane line to 3D lane line

Also Published As

Publication number Publication date
CN112102413B (en) 2022-12-09

Similar Documents

Publication Publication Date Title
CN112102413B (en) Virtual lane line-based automatic calibration method for vehicle-mounted camera
CN110148169B (en) Vehicle target three-dimensional information acquisition method based on PTZ (pan/tilt/zoom) pan-tilt camera
CN107133988B (en) Calibration method and calibration system for camera in vehicle-mounted panoramic looking-around system
US7697126B2 (en) Three dimensional spatial imaging system and method
JP4555876B2 (en) Car camera calibration method
JP5739584B2 (en) 3D image synthesizing apparatus and method for visualizing vehicle periphery
JP5455124B2 (en) Camera posture parameter estimation device
CN110842940A (en) Building surveying robot multi-sensor fusion three-dimensional modeling method and system
CN110728715A (en) Camera angle self-adaptive adjusting method of intelligent inspection robot
WO2015127847A1 (en) Super resolution processing method for depth image
US20140104424A1 (en) Imaging surface modeling for camera modeling and virtual view synthesis
US20230351625A1 (en) A method for measuring the topography of an environment
CN113362228A (en) Method and system for splicing panoramic images based on improved distortion correction and mark splicing
Nagy et al. Online targetless end-to-end camera-LiDAR self-calibration
CN206460515U (en) A kind of multichannel fisheye camera caliberating device based on stereo calibration target
CN113205603A (en) Three-dimensional point cloud splicing reconstruction method based on rotating platform
CN112927133A (en) Image space projection splicing method based on integrated calibration parameters
CN112254680B (en) Multi freedom's intelligent vision 3D information acquisition equipment
CN115239922A (en) AR-HUD three-dimensional coordinate reconstruction method based on binocular camera
CN112802109B (en) Method for generating aerial view panorama of automobile
WO2022078437A1 (en) Three-dimensional processing apparatus and method between moving objects
CN112257535B (en) Three-dimensional matching equipment and method for avoiding object
CN112304250B (en) Three-dimensional matching equipment and method between moving objects
CN112254678B (en) Indoor 3D information acquisition equipment and method
CN114359365A (en) Convergent binocular vision measuring method with high resolution

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant