CN112529966A - On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof


Info

Publication number
CN112529966A
CN112529966A
Authority
CN
China
Prior art keywords
image
vehicle
lane line
line
external parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011501750.9A
Other languages
Chinese (zh)
Other versions
CN112529966B (en)
Inventor
彭莎
苏文凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Haowei Technology Wuhan Co ltd
Original Assignee
Haowei Technology Wuhan Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haowei Technology Wuhan Co ltd
Priority to CN202011501750.9A
Publication of CN112529966A
Application granted
Publication of CN112529966B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T3/08
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking

Abstract

The invention provides a vehicle-mounted all-round looking system and an online calibration method. The online calibration method is used for calibrating the external parameters of the camera devices in the vehicle-mounted all-round system online, and comprises the following steps: acquiring a group of synchronous 2D images shot by the camera devices while the vehicle is running, wherein each 2D image comprises an image of a lane line; performing projection transformation processing on the 2D images to obtain corresponding 3D images; detecting information of the lane line in the 3D images and extracting features of the lane line from the 3D images; calculating the correction amount of the external parameters according to the extracted features of the lane line; and updating the external parameters according to the correction amount. By this method, the vehicle-mounted all-round looking system can perform online self-calibration, and the external parameter error is reduced. The vehicle-mounted all-round looking system comprises an online calibration device and a plurality of camera devices; online self-calibration can be carried out by means of the online calibration device, so that the error of the external parameters of the camera devices is small and the image precision of the system is high.

Description

On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof
Technical Field
The invention relates to the technical field of automobile vision systems, in particular to an on-line calibration method of a vehicle-mounted looking-around system and the vehicle-mounted looking-around system.
Background
Intellectualization is one of the important trends in the development of the automobile industry, and vision systems are increasingly widely applied in the field of automobile active safety. A 360-degree vehicle-mounted all-round system is one of the advanced driver-assistance safety systems; it can show the driver the conditions around the automobile under low-speed working conditions, provides visual assistance for low-speed operations (such as parking), and is standard equipment on many mass-produced vehicle models.
At present, a vehicle-mounted all-round system usually adopts an offline calibration method to calibrate the parameters of the camera devices configured in the system. The parameters comprise internal parameters and external parameters: the internal parameters refer to device parameters such as focal length, optical center and lens distortion, while the external parameters refer to the rotation matrix and translation matrix from the camera coordinate system to the world coordinate system, and the calibration of the external parameters is the key technical point.
When the external parameter matrix is calibrated offline at present, the four cameras of the vehicle-mounted all-round system shoot the same scene (the vehicle-mounted all-round system can be arranged on an experimental trolley for simulated shooting) to obtain four synchronous 2D (i.e., two-dimensional) images; the external parameter matrix is then calibrated, which specifically comprises the following steps: 1) defining a world coordinate system, wherein the XOY plane is the ground plane and the origin of coordinates is the projection of the center point of the vehicle body on the ground plane; 2) measuring the physical distance (in centimeters) between each corner point of the calibration plate and the origin of the world coordinate system; 3) detecting the positions (in pixels) of all corner points of the calibration plate on the undistorted 2D image; 4) calculating the projection transformation matrix that transforms the 2D image to the 3D (i.e., three-dimensional) image, namely the external parameter matrix, using the correspondence between the physical distances of the calibration-plate corner points from the origin of the world coordinate system and their positions on the undistorted 2D image. Because the four cameras of the vehicle-mounted all-round system are transformed with respect to the origin of the same world coordinate system, in theory the four transformed 3D images can be completely overlapped after calibration, i.e., stitched without misalignment, so that a panoramic bird's-eye view of the vehicle's surroundings can be obtained.
However, when the physical distance between a corner point of the calibration plate and the origin of the world coordinate system is measured manually, a large manual measurement error exists; and since the experimental trolley used for offline calibration is small, a large detection error also exists when detecting the corner-point positions of the calibration plate on the undistorted 2D image. Therefore, when the vehicle-mounted all-round system is calibrated by an offline calibration method, the obtained external parameter error is large. In addition, in reality, the external parameters of the all-round cameras installed around the vehicle body change frequently due to long-term load or collisions of the automobile, which affects the image precision of the vehicle-mounted all-round system, so offline calibration alone cannot meet practical application requirements.
Chinese patent CN106875448A discloses a self-calibration method for the external parameters of a vehicle-mounted monocular camera. That scheme uses vanishing-point information, so it is only suitable for a front-facing or rear-facing camera; it also uses vertical edge information, so specific scenes such as buildings are needed; in addition, it uses optical-flow information, so the algorithm precision depends on feature-point detection and matching. These constraints make that self-calibration method unsuitable for a vehicle-mounted all-round system. In addition, Chinese patent CN107145828A discloses a vehicle panoramic image processing method and apparatus in which a laser device is required to measure the physical position of the lane line in order to estimate the translational compensation amount of the vehicle body, which places high requirements on the configuration of the vehicle-mounted panoramic system.
Therefore, how to realize online calibration of the parameters of the vehicle-mounted panoramic system with few constraints, so that it can be easily popularized and applied, is still a problem to be solved in this field.
Disclosure of Invention
The invention provides an online calibration method of a vehicle-mounted looking-around system, which calibrates the external parameters of the camera devices in the vehicle-mounted looking-around system online, reduces the error of the external parameters, helps to improve the real-time image precision of the vehicle-mounted looking-around system, requires little detection information for online calibration, and is easy to popularize and apply. The invention further provides a vehicle-mounted all-around system which can perform online self-calibration of the external parameters.
The invention provides an online calibration method of a vehicle-mounted all-round system, which is used for online calibrating external parameters of a camera device in the vehicle-mounted all-round system; the external parameters have initial values and are used for setting a camera device in the vehicle-mounted all-around system; the camera device is set on the vehicle according to the initial value of the external parameter before updating the external parameter; the on-line calibration method of the vehicle-mounted all-around system comprises the following steps:
acquiring a group of synchronous 2D images shot by the camera device when a vehicle runs, wherein each 2D image comprises an image of a lane line;
performing projection transformation processing on the 2D image to obtain a corresponding 3D image;
detecting information of the lane line in the 3D image, and extracting features of the lane line from the 3D image;
calculating the correction quantity of the external parameter according to the extracted features of the lane line;
and updating the external parameter according to the correction quantity.
Optionally, the external parameters include a rotation matrix and a translation matrix of a world coordinate system transformed from a camera coordinate system where the camera device is located.
Optionally, the vehicle-mounted looking-around system further includes: the internal parameters of the camera device are set; in the on-line calibration method of the vehicle-mounted all-around system, when the 2D image is subjected to projection transformation processing to obtain a corresponding 3D image, at least part of pixel points in the 2D image are projected from a camera coordinate system to a world coordinate system by using a matrix of the external parameters and the internal parameters.
Optionally, the online calibration method of the vehicle-mounted looking-around system further includes:
when the correction quantity adopted for updating the external parameter is larger than or equal to a first threshold and the iteration number of the correction quantity is smaller than a second threshold, performing projection transformation processing on the 2D image again by using the updated external parameter to obtain an updated 3D image so as to obtain the iterated correction quantity, and updating the external parameter again by using the iterated correction quantity until the correction quantity is smaller than the first threshold or the iteration number of the correction quantity is larger than the second threshold.
Optionally, before performing projection transformation processing on the 2D image to obtain a corresponding 3D image, defining a rendering range of the 3D image in a world coordinate system; and when the 2D image is subjected to projection transformation processing, only pixel points corresponding to the rendering range in the 2D image are subjected to projection transformation to the world coordinate system.
Optionally, the information of the lane line is detected from the 3D image by using a line detection method.
Optionally, the correction amounts include correction amounts of a yaw angle, a roll angle, a pitch angle, an up-down offset, and a left-right offset of the image pickup device.
Optionally, the characteristics of the lane line include an inclination and a width of the lane line.
The invention provides a vehicle-mounted all-round looking system, which comprises an online calibration device and a plurality of camera devices, wherein the online calibration device is used for online calibrating external parameters of the camera devices in the vehicle-mounted all-round looking system; wherein the external parameter has an initial value, the plurality of camera devices are set on the vehicle according to the initial value of the external parameter before updating the external parameter, and the on-line calibration device of the vehicle-mounted looking-around system comprises:
the image extraction module is used for acquiring a group of synchronous 2D images shot by the camera device when the vehicle runs, and each 2D image comprises an image of a lane line;
the image transformation module is used for performing projection transformation processing on the 2D image extracted by the image extraction module to obtain a corresponding 3D image;
the lane line detection module is used for detecting the 3D image obtained by the image transformation module to obtain the information of the lane line;
the characteristic extraction module is used for extracting the characteristics of the lane line from the information of the lane line obtained by the lane line detection module;
the correction amount calculation module is used for calculating the correction amount of the external parameter according to the lane line characteristics extracted by the characteristic extraction module; and
and the parameter updating module is used for updating the external parameters according to the correction quantity output by the correction quantity calculating module.
Optionally, the on-line calibration device of the vehicle-mounted looking-around system further includes:
and the correction quantity judging module is used for carrying out projection transformation processing on the 2D image again by using the updated external parameter and obtaining an updated 3D image when the correction quantity adopted for updating the external parameter is greater than or equal to a first threshold and the iteration number of the correction quantity is less than a second threshold, so as to obtain the iterated correction quantity, and updating the external parameter again by using the iterated correction quantity until the correction quantity is less than the first threshold or the iteration number of the correction quantity is greater than the second threshold.
The online calibration method of the vehicle-mounted panoramic system provided by the invention can update the external parameters while the vehicle is running. Compared with offline calibration, it updates in time the external parameters that change during driving and adjusts the position of the camera devices in time with the updated external parameters, i.e., it calibrates the external parameters, so that the obtained vehicle panoramic image has higher precision, which helps to improve the safety of the vehicle. In addition, the online calibration method places few constraints on the vehicle-mounted panoramic system: it only uses the information of lane lines, has no specific scene limitation and does not need laser equipment, so it has a wide application range and is easy to popularize and apply.
The vehicle-mounted all-round looking system of the invention comprises an online calibration device and a plurality of camera devices. The system can complete the calibration of the external parameters with the online calibration device while the vehicle is running; compared with offline calibration, it updates in time the external parameters that change during driving and adjusts the position of the camera devices in time with the updated external parameters, i.e., it realizes the calibration of the external parameters. The vehicle-mounted looking-around system therefore has a self-calibration function, can reduce the external parameter error of the camera devices, and helps to improve both the image precision of the system and the safety of the vehicle.
Drawings
Fig. 1 is a flowchart of an online calibration method of a vehicle-mounted panoramic system according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of straight lines obtained by line detection according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an online calibration device in a vehicle-mounted panoramic view system according to an embodiment of the present invention.
Detailed Description
The present invention provides an on-line calibration method for a vehicle-mounted looking-around system and a vehicle-mounted looking-around system thereof, which are further described in detail with reference to the accompanying drawings and specific embodiments. The advantages and features of the present invention will become more apparent from the following description. It is to be noted that the drawings are in a very simplified form and not to precise scale, and are provided merely for the purpose of conveniently and clearly describing the embodiments of the present invention.
Fig. 1 is a flowchart of an online calibration method of a vehicle-mounted panoramic system according to an embodiment of the present invention. As shown in fig. 1, the online calibration method of the vehicle-mounted looking-around system provided in this embodiment is used for online calibrating external parameters of a camera device in the vehicle-mounted looking-around system; the external parameters have initial values and are used for setting a camera device in the vehicle-mounted all-around system; the camera device is set on the vehicle according to the initial value of the external parameter before updating the external parameter; the on-line calibration method of the vehicle-mounted all-around system comprises the following steps:
step S1, acquiring a group of synchronous 2D images shot by the camera device when the vehicle runs, wherein each 2D image comprises an image of a lane line;
step S2, performing projection transformation processing on the 2D image to obtain a corresponding 3D image;
step S3 of detecting information of the lane line in the 3D image and extracting features of the lane line from the 3D image;
step S4, calculating the correction quantity of the external parameter according to the extracted features of the lane line; and
in step S5, the external parameter is updated according to the correction amount.
In the above vehicle-mounted all-round system, the camera devices are arranged at different positions of the vehicle so as to obtain multiple synchronous 2D images (i.e., a group of synchronous 2D images) while the vehicle is running. Specifically, each camera device may be a visible-light camera, and may be a fisheye camera, so that wide-range monitoring without blind angles can be realized. In this embodiment, the vehicle-mounted all-round system includes, for example, four camera devices installed around the vehicle body: a front camera and a rear camera are respectively arranged at the head and the tail of the vehicle, and a left camera and a right camera are arranged at the door-handle positions on the two sides. The synchronous multi-channel 2D images obtained by the four cameras are four synchronous frames. The present invention is not limited thereto; in another embodiment, the vehicle-mounted surround view system may include more than four cameras, which are respectively arranged at different positions on a vehicle (a real vehicle or an experimental trolley), each obtain a 2D image, and transmit the 2D images to the vehicle-mounted surround view system as multiple synchronous frames.
The on-line calibration method of the present invention is specifically described below by taking a vehicle-mounted panoramic system having four cameras as an example.
Firstly, step S1 is executed to obtain the original internal parameters and external parameters of the four camera devices in the vehicle-mounted all-round system; while the vehicle is running, the four camera devices shoot the scene around the vehicle at the same time to obtain four synchronous frames, which include images of the lane lines beside the vehicle. In practical applications, the lane lines shot by the camera devices are straight lines parallel to the vehicle traveling direction, the lane lines have the same width, and the two lane lines beside the vehicle are parallel.
Then, step S2 is executed to perform projective transformation processing on the 2D image to obtain a corresponding 3D image. In this embodiment, when the 2D image is subjected to projection transformation to obtain a corresponding 3D image, at least some of the pixels in the 2D image may be projected from the camera coordinate system to the world coordinate system by using external parameters of the camera device and internal parameters of the camera device.
In this embodiment, the internal parameters of the image capturing apparatus refer to device parameters such as focal length, optical center, and lens distortion, and are represented by a matrix A. For example, the internal parameter matrix A of the image pickup apparatus may be represented by the focal lengths f_x and f_y expressed in pixel units and the principal point (c_x, c_y) at the center of the image, as shown in formula (1):

A = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}    (1)
the external parameters refer to a rotation matrix and a translation matrix which are transformed from a camera coordinate system to a world coordinate system, and initial values of the external parameters can be obtained according to an off-line calibration method. The extrinsic parameters may be represented by a matrix B. For example, the extrinsic parameter matrix B may be composed of a 3 × 3 rotation matrix R and a 3 × 1 translation vector t, as shown in equation (2):
B=[R|t] (2)
As an example, let T0 be a point in the 2D image with coordinates (u, v), and let T be the projection corresponding point of T0 in the world coordinate system, with coordinates (X_w, Y_w, Z_w). T0 can be transformed to T by the projection transformation of formula (3), where formula (3) is expressed as:

s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A \, [R \mid t] \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}    (3)

In formula (3), s is an arbitrary scale factor of the projective transformation, A is the internal parameter matrix, R is the rotation matrix in the external parameter matrix, and t is the translation vector in the external parameter matrix. Here, when the distortion of the camera lens is considered, the external parameter matrix includes a rotation matrix R containing a distortion factor and a translation vector t.
In this embodiment, after the step S1 is executed and before the step S2 is executed, a rendering range of the 3D image in the world coordinate system may be further defined. For example, the center of the vehicle may be used as the origin O of a world coordinate system including X, Y and Z axes that are perpendicular to each other, and the ground plane may be used as the XOY plane of the world coordinate system. In the simulation experiment, since the used test car is small, the rendering range in the X-axis direction may be 0cm to 50cm, the rendering range in the Y-axis direction may be-50 cm to 50cm, and rendering is not performed in the Z-axis direction (that is, Z is 0). In an actual situation, the range of rendering the 3D image may be determined according to the actual situation of the vehicle. In a preferable embodiment, in step S2, only the pixel points corresponding to the rendering range in the 2D image may be transformed into the world coordinate system by projection, so as to obtain a 3D image, and compared with transforming all the pixel points in the 2D image into a 3D image by projection, the amount of data transformation by projection may be reduced, and the speed of transforming the 2D image into the 3D image by projection may be increased.
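As an illustration of this projection step, the following is a minimal Python/NumPy sketch (not part of the patent text) of how formula (3) can be used to render the 3D (bird's-eye) view only within the rendering range: every ground point (X_w, Y_w, 0) of the range is projected into the undistorted 2D image with the intrinsic matrix A and the extrinsics R, t, and the corresponding pixel is sampled. The function name, the 1 cm grid resolution and the handling of points behind the camera are illustrative assumptions, and lens distortion is assumed to have been removed beforehand.

```python
import numpy as np

def render_bev(img_2d, A, R, t, x_range=(0.0, 50.0), y_range=(-50.0, 50.0),
               cell_cm=1.0):
    # Sketch of step S2: project each ground point (Z_w = 0) of the rendering
    # range into the 2D image via formula (3),
    # s*[u, v, 1]^T = A [R|t] [X_w, Y_w, Z_w, 1]^T, and sample the pixel.
    # Only points inside the rendering range are transformed, as described above.
    h = int((y_range[1] - y_range[0]) / cell_cm)
    w = int((x_range[1] - x_range[0]) / cell_cm)
    bev = np.zeros((h, w, 3), dtype=img_2d.dtype)
    P = A @ np.hstack([R, t.reshape(3, 1)])          # 3x4 projection matrix A[R|t]
    for i in range(h):
        for j in range(w):
            Xw = x_range[0] + j * cell_cm            # world X in cm
            Yw = y_range[0] + i * cell_cm            # world Y in cm
            uvw = P @ np.array([Xw, Yw, 0.0, 1.0])   # Z_w = 0 (ground plane)
            if uvw[2] <= 0:                          # point not in front of the camera
                continue
            u = int(round(uvw[0] / uvw[2]))
            v = int(round(uvw[1] / uvw[2]))
            if 0 <= v < img_2d.shape[0] and 0 <= u < img_2d.shape[1]:
                bev[i, j] = img_2d[v, u]             # nearest-neighbour sampling
    return bev
```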
Then, step S3 is performed to detect information of a lane line in the 3D image and extract features of the lane line from the 3D image. The image pickup device can shoot the lane line when the vehicle runs, so that the images of the lane line exist in the multi-path synchronous 2D images. Through the conversion of the 2D image into the 3D image in step S2, the lane line image in the 2D image is also converted into the 3D image, and thus the 3D image has information of the lane line.
In this embodiment, a straight line detection method may be adopted to detect the information of the lane line in the 3D image, and the detected straight line is used to represent the lane line. In actual practice, the lane lines may be characterized using the straight lines on which the detected line segments lie.
Specifically, the lane line may be detected by using the LSD (Line Segment Detector) line detection algorithm or the Hough line detection method.
As an example, the LSD line detection algorithm first calculates the gradient magnitude and direction of all points in the image, then takes adjacent points whose gradient directions change little as a connected region, then judges, according to the rectangularity of each region, whether the region needs to be split so as to form several regions of higher rectangularity, and finally refines and screens all the generated regions, retaining the regions that satisfy the conditions as the final line detection result. The advantages of this algorithm are a high detection speed, no need for parameter tuning, and the use of an error control method to improve the accuracy of line detection.
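For illustration only, an OpenCV-based sketch of using an LSD detector is shown below; note that createLineSegmentDetector is disabled in some OpenCV builds for licensing reasons, so its availability (and the choice of the standard refinement mode) is an assumption of this sketch rather than a requirement of the patent.

```python
import cv2

def detect_with_lsd(bev_gray):
    # LSD groups pixels with similar gradient direction into line-support
    # regions and validates them with an error-control (a-contrario) test.
    lsd = cv2.createLineSegmentDetector(cv2.LSD_REFINE_STD)
    lines, _, _, _ = lsd.detect(bev_gray)            # each entry: [[x1, y1, x2, y2]]
    return [] if lines is None else [tuple(l[0]) for l in lines]
```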
The Hough line detection method may include the following steps: 1) converting the color image into a grayscale image; 2) denoising the grayscale image; 3) extracting edges; 4) binarization, i.e., judging whether the extracted points are edge points; 5) mapping the edge points into Hough space; 6) taking local maxima and setting a threshold to filter out interfering lines; 7) drawing the straight lines and calibrating the corner points. The advantages of Hough line detection are its strong anti-interference capability: it is insensitive to incomplete parts of a line in the image, to noise and to other coexisting non-linear structures, and it tolerates gaps in the feature boundary description.
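Purely as an illustration (again not part of the patent text), a rough OpenCV-based sketch of the Hough-based detection described above might look as follows; the parameter values (Canny thresholds, Hough threshold, minimum segment length, maximum gap) are assumptions chosen for a bird's-eye image and would need tuning in practice.

```python
import cv2
import numpy as np

def detect_line_segments(bev_img):
    # Steps 1)-6) above: grayscale, denoise, edge extraction/binarization,
    # probabilistic Hough transform with a threshold to suppress interference.
    gray = cv2.cvtColor(bev_img, cv2.COLOR_BGR2GRAY)          # 1) grayscale
    blur = cv2.GaussianBlur(gray, (5, 5), 0)                   # 2) denoise
    edges = cv2.Canny(blur, 50, 150)                           # 3)-4) binary edge map
    segs = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,    # 5)-6) Hough space,
                           threshold=40, minLineLength=30,     #       thresholded maxima
                           maxLineGap=10)
    return [tuple(s[0]) for s in segs] if segs is not None else []
```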
In this embodiment, a straight line is obtained by calculation using the two end points of a detected line segment. FIG. 2 is a schematic diagram of straight lines obtained by line detection according to an embodiment of the present invention. As an example, as shown in fig. 2, a straight line line1 and a straight line line2 are detected (each straight line is represented by a line segment in the drawing). Let the two end points of one detected line segment be p1 (x_1, y_1) and p2 (x_2, y_2); the equation of the straight line line1 through p1 and p2 can be expressed as Ax + By + C = 0, where A, B and C are parameters of the straight line and can be calculated from the coordinates of the two end points. The slope of line1 can be calculated using the coordinates of the end points p1 and p2, i.e., the slope k = (y_1 - y_2)/(x_1 - x_2), and the inclination angle corresponding to the slope k (i.e., the angle between the straight line and an image coordinate axis such as the X axis) can then be obtained using the arctan function. The two end points p3 (x_3, y_3) and p4 (x_4, y_4) of another detected line segment determine the straight line line2 on which p3 and p4 lie. The distance from p3 to the straight line line1 can be expressed as

d = \frac{|A x_3 + B y_3 + C|}{\sqrt{A^2 + B^2}}

The distance between two straight lines can be represented by the distance from a point on one straight line to the other straight line; that is, the distance between the two line segments can be represented by the distance from an end point of one line segment to the straight line on which the other line segment lies. The lengths of the two straight lines (actually the lengths of the two line segments) can be represented by the distance between the two end points of each line segment. Since the lane line is on the ground, the Z-axis coordinate values of the end points of the line segments representing the lane line are all 0.
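For illustration only, the inclination-angle and point-to-line-distance computations described above can be written as two small Python helpers; the function names are not from the patent, and atan2 is used instead of a plain arctan so that vertical boundary lines (x_1 = x_2) are handled as well.

```python
import math

def tilt_angle_deg(x1, y1, x2, y2):
    # Inclination angle of the straight line through p1 = (x1, y1) and
    # p2 = (x2, y2) with respect to the image X axis, mapped to [0, 180);
    # values near 90 degrees match the (90 +/- n)-degree filter described next.
    return math.degrees(math.atan2(y1 - y2, x1 - x2)) % 180.0

def point_to_line_distance(px, py, x1, y1, x2, y2):
    # Distance from (px, py) to the line Ax + By + C = 0 through (x1, y1)
    # and (x2, y2):  d = |A*px + B*py + C| / sqrt(A^2 + B^2).
    A, B = y2 - y1, x1 - x2
    C = -(A * x1 + B * y1)
    return abs(A * px + B * py + C) / math.hypot(A, B)
```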
After line detection is finished, the detected straight lines can be filtered: straight lines whose inclination angle is (90 ± n) degrees (for example, n = 30), whose mutual distance is within a first specified range and whose length difference is within a second specified range are retained; the first specified range and the second specified range can be set according to actual conditions. Collinear straight lines with the same direction are then merged, and the longest paired straight lines are selected to represent the lane line. It should be understood that the straight lines on which the boundaries of a lane line lie are the longest, and the paired straight lines representing the lane line are the boundary straight lines of the lane line, i.e., the straight lines marking the boundary positions of the lane line.
In this embodiment, because the front camera and the rear camera can each capture the two lane lines on both sides of the vehicle body, each of them corresponds to four straight lines after lane line detection and straight line selection. The left camera and the right camera each capture only the one lane line on one side of the vehicle body, and therefore each corresponds to two straight lines after lane line detection and straight line selection.
After detecting the information of the lane line in the 3D image, the features of the lane line need to be extracted from the 3D image. Specifically, the features of the lane line may be calculated using the feature points (i.e., the end points of the line segments) of the detected straight lines (actually line segments). The features of the lane line may include the inclination angle and the width of the lane line. The inclination angle of the lane line may be calculated from the inclination angles of the straight lines representing the lane line; for example, the inclination angle of the lane line is the average of the inclination angles of the straight lines representing the lane line. The width of the lane line may be obtained by calculating the distance between the two straight lines representing it; for example, referring to fig. 2, the lane line is represented by the line segments line1 and line2, and the width of the lane line may be the distance from the upper end point (p0) of line1 to line2, the distance from the lower end point (p4) of line2 to line1, or the average of these two distances.
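Continuing the illustrative sketch (and reusing the two helpers above), the inclination angle and width of one lane line could be derived from its pair of boundary segments roughly as follows; averaging the two end-point distances is only one of the options mentioned above.

```python
def lane_line_features(seg1, seg2):
    # seg1, seg2: the paired boundary segments (x1, y1, x2, y2) of one lane line.
    # Inclination angle: mean of the inclination angles of the two boundary lines.
    tilt = 0.5 * (tilt_angle_deg(*seg1) + tilt_angle_deg(*seg2))
    # Width: mean of the distance from an end point of seg1 to the line of seg2
    # and the distance from an end point of seg2 to the line of seg1.
    width = 0.5 * (point_to_line_distance(seg1[0], seg1[1], *seg2)
                   + point_to_line_distance(seg2[0], seg2[1], *seg1))
    return tilt, width
```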
Then, step S4 is executed to calculate a correction amount of the external parameter based on the extracted feature of the lane line. Here, the correction amount of the external parameter may be calculated using the width and the inclination angle of the lane line. The external parameter correction amount may include correction amounts of a Yaw angle (Yaw), a Roll angle (Roll), a Pitch angle (Pitch), an up-down offset (settling amount), and a left-right offset of an imaging device of the vehicle-mounted all-round system. Each camera (camera) of the vehicle-mounted all-round viewing system has a corresponding lane line in the 3D image (for example, the lane line corresponding to the front camera is two lane lines captured by the front camera, and the lane line corresponding to the left camera is the left lane line captured by the left camera). The correction amount of the external parameter of each image pickup device may be calculated by using the characteristics of the lane corresponding to the image pickup device, or calculated by using the relationship between the characteristics of the lane corresponding to the image pickup device and the characteristics of the lane corresponding to the other image pickup devices.
Specifically, the correction amount of the yaw angle may be calculated from the inclination angle of the lane line. The correction amount of the yaw angle β of a camera device can be obtained by calculating the difference between the mean value of the inclination angles of the lane lines corresponding to that camera device and 90 degrees; for example, the correction amount of the yaw angle β can be obtained by multiplying this difference by an update rate. When the camera device corresponds to only one lane line, the mean value is the inclination angle of that lane line itself.
The correction amount of the roll angle θ can be obtained by calculation from the width of the lane line. It may be calculated using the difference between the widths of the left and right lane lines, or using the difference between the distances from the end points of the two line segments representing a lane line to the opposite line segment (i.e., not the line segment on which the end point lies). For example, the product of the update rate and the difference between the widths of the left and right lane lines, or the product of the update rate and the difference between the distances from the end points of the two line segments representing a lane line to the opposite line segment, is the correction amount of the roll angle θ.
The correction amount of the pitch angle α may be calculated by using the degree of the trapezoid of the lane line. That is, the correction amount of the pitch angle α of a camera device can be obtained by calculating the degree of the trapezoid of the lane line corresponding to the camera device, for example, the product of the degree of the trapezoid of the corresponding lane line and the update rate is the correction amount of the pitch angle α. The degree of the trapezoid of a lane line may be a difference between the width of a lane line and the average of the widths of all lane lines in the 3D image.
The correction amount of the settlement amount (up-down offset) corresponds to the movement of the camera device in the vehicle height direction. It may be obtained by calculation using the width of the lane line: the settlement amount of a camera device can be calculated from the difference between the width of the lane line corresponding to that camera device and the average width of all the lane lines in the 3D image; for example, the correction amount of the settlement amount is the product of this width difference and the update rate.
The amount of correction of the left-right offset is the amount of movement of the imaging device in the lateral direction of the vehicle (for example, the X-axis direction of the world coordinate system). The correction amount of the left and right offset amounts may be obtained by calculation using a spatial position relationship of the lane lines. The position of the lane line may be calculated using coordinates (e.g., coordinate values corresponding to the X axis) of end points of the line segment representing the lane line. When a camera device corresponds to two lane lines on the left and right, the correction amount of the left and right offset amount of the camera device is obtained by calculating the difference between the position of the lane line corresponding to the camera device and the positions of all the lane lines in the 3D image, for example, the correction amount of the left and right offset amount of the camera device is the product of the difference between the positions and the update rate; when a camera device corresponds to a lane line (such as a left lane line or a right lane line), the correction amount of the left-right offset amount of the camera device is obtained by calculating the difference between the position of the lane line corresponding to the camera device and the positions of all the lane lines on the same side of the corresponding lane line in the 3D image, for example, the correction amount of the left-right offset amount of the camera device is the product of the difference and the update rate.
The "update rate" mentioned above may be a positive number smaller than 1. The smaller the update rate used to obtain the correction amount of the external parameter, the slower the update and the more correction iterations are subsequently needed; however, an update rate that is too large may cause the iterations to fail to converge. In this embodiment, the update rates used to obtain the corrections of the yaw angle, roll angle, pitch angle, up-down offset (settlement amount) and left-right offset may have the same value. In another embodiment, these update rates may have different values.
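As a hedged sketch of step S4 (the exact sign conventions and the update-rate value of 0.1 are not fixed by the text and are assumptions here), three of the corrections described above could be computed as:

```python
def yaw_correction(lane_tilts_deg, update_rate=0.1):
    # Yaw: (mean inclination angle of the camera's lane lines - 90 degrees) * update rate.
    return (sum(lane_tilts_deg) / len(lane_tilts_deg) - 90.0) * update_rate

def roll_correction(left_width, right_width, update_rate=0.1):
    # Roll: difference between the widths of the left and right lane lines * update rate.
    return (left_width - right_width) * update_rate

def settle_correction(cam_lane_width, all_lane_widths, update_rate=0.1):
    # Settlement (up-down offset): difference between the width of the camera's
    # lane line and the average width of all lane lines in the 3D image * update rate.
    avg = sum(all_lane_widths) / len(all_lane_widths)
    return (cam_lane_width - avg) * update_rate
```

The pitch and left-right-offset corrections follow the same pattern, using the trapezoid degree of the lane line and the lane-line positions described above.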
After the correction amounts are obtained, the external parameters are updated according to them. Specifically, a sub-correction matrix corresponding to each of the corrections of the yaw angle, roll angle, pitch angle, settlement amount and left-right offset may be calculated, and the sub-correction matrices are multiplied together to obtain a corrected rotation-translation matrix, which is then used to correct the external parameter matrix. For example, the corrected rotation-translation matrix may be left-multiplied with the external parameter matrix to obtain the updated external parameter matrix. The sub-correction matrices, the corrected rotation-translation matrix and the external parameter matrix have the same structure, for example, all are 4 × 4 matrices.
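One possible (illustrative) way to compose the sub-correction matrices into a 4 × 4 rotation-translation correction and apply it to the external parameter matrix B is sketched below; the assignment of yaw/roll/pitch to the Z/Y/X axes and the multiplication order are assumptions of this sketch, since the text does not spell them out.

```python
import numpy as np

def rot_x(a):                       # sub-correction: rotation about X (pitch, assumed)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0, 0, 0],
                     [0, c, -s, 0],
                     [0, s,  c, 0],
                     [0, 0,  0, 1.0]])

def rot_y(a):                       # sub-correction: rotation about Y (roll, assumed)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s, 0],
                     [0, 1.0, 0, 0],
                     [-s, 0, c, 0],
                     [0, 0, 0, 1.0]])

def rot_z(a):                       # sub-correction: rotation about Z (yaw, assumed)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1.0, 0],
                     [0,  0, 0, 1.0]])

def translate(dx=0.0, dy=0.0, dz=0.0):   # sub-correction: left-right offset / settlement
    m = np.eye(4)
    m[0:3, 3] = [dx, dy, dz]
    return m

def update_extrinsic(B, d_yaw, d_roll, d_pitch, d_lateral, d_settle):
    # Multiply the sub-correction matrices into one corrected rotation-translation
    # matrix and left-multiply it with the 4x4 external parameter matrix B.
    delta = (rot_z(np.radians(d_yaw)) @ rot_y(np.radians(d_roll))
             @ rot_x(np.radians(d_pitch)) @ translate(dx=d_lateral, dz=d_settle))
    return delta @ B
```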
In this embodiment, as shown in fig. 1, the on-line calibration method for the vehicle-mounted panoramic system may further include step S6, determining the correction amount, performing projection transformation on the 2D image again by using the updated external parameter to obtain an updated 3D image when the correction amount used for updating the external parameter is greater than or equal to a first threshold (e) and the iteration number (n) of the correction amount is less than a second threshold, so as to obtain the iterative correction amount, and updating the external parameter again by using the iterative correction amount (i.e., repeating steps S2 to S5); and when the correction quantity adopted for updating the external parameter is smaller than the first threshold value or the iteration number of the correction quantity is larger than the second threshold value, stopping updating the external parameter. The first threshold value and the second threshold value may be set as needed. The step of judging the correction amount is added, and whether to continue recalculating the correction amount and updating the external parameter is determined according to the judgment result, so that the error of the external parameter can be further reduced.
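The iteration logic of step S6 can be summarised by the following illustrative skeleton, where correct_step and apply_step are hypothetical placeholders for steps S2 to S4 (re-projection with the current external parameters, lane-line detection, feature extraction, correction calculation) and step S5 respectively, and max_iterations plays the role of the second threshold; the default values are examples only.

```python
def calibrate_online(correct_step, apply_step, B, first_threshold=1e-3,
                     max_iterations=10):
    # correct_step(B) -> list of correction amounts (steps S2-S4);
    # apply_step(B, delta) -> updated extrinsic matrix (step S5).
    # Stop when every correction is below the first threshold or the
    # iteration count reaches the second threshold (max_iterations).
    n = 0
    while n < max_iterations:
        delta = correct_step(B)
        if max(abs(d) for d in delta) < first_threshold:
            break
        B = apply_step(B, delta)
        n += 1
    return B
```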
The online calibration method of the vehicle-mounted panoramic system can update the external parameters of the camera devices while the vehicle is driving and adjust the orientation of the camera devices in time with the updated external parameters, thereby realizing online self-calibration of the external parameters. This helps reduce the external parameter errors of the camera devices of the vehicle-mounted panoramic system, for example eliminating the external parameter errors in five degrees of freedom (the yaw angle, roll angle, pitch angle, up-down offset and left-right offset of the camera devices), which improves the image precision of the vehicle-mounted panoramic system and the safety of the vehicle. In addition, the online calibration method places few constraints on the vehicle-mounted panoramic system: it only uses the information of the lane lines, has no specific scene limitation and does not need laser equipment, so it has a wide application range and is easy to popularize and apply.
The above-mentioned online calibration method for the vehicle-mounted panoramic system may be executed by a computer processor, for example, a computer readable storage medium (such as an optical disc or a memory in the computer system) has computer instructions stored thereon, and when the computer instructions are executed by the processor, the online calibration method for the vehicle-mounted panoramic system according to the above-mentioned embodiment can be executed. The computer readable storage medium may be disposed in the vehicle-mounted looking-around system, or may be disposed independently of the vehicle-mounted looking-around system, for example, may be disposed in a control system of a vehicle in which the vehicle-mounted looking-around system is disposed. The vehicle provided with the vehicle-mounted all-around system can be a fuel automobile, a hybrid electric vehicle or a pure electric vehicle.
The embodiment also provides a vehicle-mounted all-round looking system. Fig. 3 is a schematic diagram of a vehicle-mounted around-the-vehicle system according to an embodiment of the invention. The vehicle-mounted all-round looking system comprises an online calibration device 2 and a plurality of camera devices 1, wherein the online calibration device 2 is used for online calibrating external parameters of the camera devices 1 in the vehicle-mounted all-round looking system; wherein the external parameter has an initial value, and the plurality of image pickup devices 1 are set on the vehicle according to the initial value of the external parameter before updating the external parameter.
Specifically, as shown in fig. 3, the online calibration device includes an image extraction module 21, an image transformation module 22, a lane line detection module 23, a feature extraction module 24, a correction amount calculation module 25, and a parameter update module 26. The image extraction module 21 is configured to obtain a set of synchronous 2D images captured by the camera device while the vehicle is running, where each 2D image includes an image of a lane line. The image transformation module 22 is configured to perform projection transformation processing on the 2D image extracted by the image extraction module 21 to obtain a corresponding 3D image. The lane line detection module 23 is configured to detect the 3D image obtained by the image conversion module 22 to obtain information of a lane line. The feature extraction module 24 is configured to extract features of the lane line from the lane line information obtained by the lane line detection module 23. The correction amount calculating module 25 is configured to calculate a correction amount of the external parameter according to the lane line feature extracted by the feature extracting module 24. The parameter updating module 26 is configured to update the external parameter according to the correction amount output by the correction amount calculating module 25.
Specifically, the imaging apparatus 1 has initial internal parameters and external parameters. The internal parameters refer to device parameters such as focal length, optical center and lens distortion. The external parameters may include a rotation matrix and a translation matrix of a coordinate system of a camera in which the camera device is located, which is transformed into a world coordinate system.
When the image transformation module 22 performs projection transformation processing on the 2D image to obtain a corresponding 3D image, at least a part of pixel points in the 2D image may be projected from a camera coordinate system to a world coordinate system by using the matrix of the external parameter and the internal parameter. The image transformation module 22 may be further configured to limit a rendering range of the 3D image in a world coordinate system, and when performing projection transformation processing on the 2D image, only a pixel point in the 2D image corresponding to the rendering range may be transformed to the world coordinate system by projection, so as to reduce a workload of the image transformation module 22 and improve a work efficiency thereof.
The lane line detection module 23 may detect the lane line information from the 3D image by using a line detection method. The line detection method may be the LSD (Line Segment Detector) line detection algorithm or the Hough line detection method. The lane line detection module 23 may also be configured to filter the extracted straight lines, retaining straight lines whose inclination angle is (90 ± n) degrees (for example, n = 30), whose mutual distance is within a first specified range and whose length difference is within a second specified range, then merge collinear straight lines with the same direction, and select the longest paired straight lines to represent the lane line.
The feature extraction module 24 may calculate and obtain the features of the lane line by using the feature points (i.e., the end points of the line segment) of the straight line (actually, the line segment) detected by the lane line detection module 23. The lane line features extracted by the feature extraction module 24 may include the inclination and width of the lane line.
The correction amount of the external parameter output by the correction amount calculation module 25 may include parameters such as correction amounts of a yaw angle, a roll angle, a pitch angle, an up-down offset amount, and a left-right offset amount of a camera device of the vehicle-mounted all-round system. The parameter updating module 26 may be further configured to calculate a modified rotation-translation matrix corresponding to the correction amounts of the yaw angle, the roll angle, the pitch angle, the vertical offset, and the horizontal-translation amount, and modify the external parameter matrix according to the modified rotation-translation matrix.
In this embodiment, as shown in fig. 3, the online calibration apparatus may further include a correction amount determination module 27. The correction amount determining module 27 is configured to, when the correction amount used for updating the external parameter is greater than or equal to a first threshold and the iteration number of the correction amount is smaller than a second threshold, perform projection transformation on the 2D image again by using the updated external parameter to obtain an updated 3D image, so as to obtain the iterative correction amount, and update the external parameter again by using the iterative correction amount until the correction amount is smaller than the first threshold or the iteration number of the correction amount is greater than the second threshold. The correction amount determination module 27 may be added to calculate the correction amount a plurality of times and correct the external parameter a plurality of times, so as to further reduce the error of the external parameter.
In this embodiment, when the correction amount determining module 27 determines that the updating of the external parameter is stopped (that is, when the correction amount is smaller than the first threshold or the number of iterations of the correction amount is greater than the second threshold), the parameter updating module 26 may adjust the orientation of the imaging apparatus according to the updated external parameter.
The vehicle-mounted all-round looking system of the embodiment comprises an online calibration device 2 and a plurality of camera devices 1, wherein the online calibration device 2 comprises an image extraction module 21, an image transformation module 22, a lane line detection module 23, a feature extraction module 24, a correction amount calculation module 25 and a parameter updating module 26. The process of calibrating the external parameters by using the online calibration device can be completed in the vehicle running process, the external parameters (particularly the external parameters when the external parameters change in the vehicle running process) are updated in time relative to offline calibration, and the orientation of the camera device is adjusted in time by using the updated external parameters, so that the external parameters are calibrated, the vehicle-mounted looking-around system has a self-calibration function, the error of the external parameters of the camera device can be reduced, the image precision of the vehicle-mounted looking-around system is improved, and the safety of the vehicle is improved.
It is understood that the in-vehicle surround view system of the embodiment of the present invention may include a plurality of computers, hardware, devices, etc. interconnected by a communication unit such as a network, or include a single computer, hardware, device, etc. having a process of implementing the present invention. The computer may include, among other things, a Central Processing Unit (CPU), memory, and input and output components such as a keyboard, mouse, touch screen, display, and the like. The modules and units (image extraction module, image transformation module, lane line detection module, feature extraction module, correction amount calculation module, correction amount judgment module, and parameter update module) in the online calibration device may be combined and implemented in one module, or any one of the modules may be split into a plurality of sub-units, or at least part of the functions of one or more of the units may be combined with at least part of the functions of other units and implemented in one module. According to an embodiment of the present invention, at least one of the modules in the online calibration apparatus may be implemented at least partially as a hardware circuit, or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging circuits, or at least one of the modules in the online calibration apparatus may be implemented at least partially as a program code module, which when executed by a computer controlling the online calibration apparatus, may perform the functions of the corresponding module.
It should be noted that the present specification is described in a progressive manner, and the following description focuses on differences from the preceding description, and the same and similar parts may be referred to each other.
The above description is only for the purpose of describing the preferred embodiments of the present invention and is not intended to limit the scope of the claims of the present invention, and any person skilled in the art can make possible the variations and modifications of the technical solutions of the present invention using the methods and technical contents disclosed above without departing from the spirit and scope of the present invention, and therefore, any simple modification, equivalent change and modification made to the above embodiments according to the technical essence of the present invention belong to the protection scope of the technical solutions of the present invention.

Claims (10)

1. The on-line calibration method of the vehicle-mounted all-round looking system is characterized by being used for carrying out on-line calibration on external parameters of a camera device in the vehicle-mounted all-round looking system; the external parameters have initial values and are used for setting a camera device in the vehicle-mounted all-around system; the camera device is set on the vehicle according to the initial value of the external parameter before updating the external parameter; the on-line calibration method of the vehicle-mounted all-around system comprises the following steps:
acquiring a group of synchronous 2D images shot by the camera device when a vehicle runs, wherein each 2D image comprises an image of a lane line;
performing projection transformation processing on the 2D image to obtain a corresponding 3D image;
detecting information of the lane line in the 3D image, and extracting features of the lane line from the 3D image;
calculating the correction quantity of the external parameter according to the extracted features of the lane line;
and updating the external parameter according to the correction quantity.
2. The on-line calibration method for the vehicle-mounted panoramic system according to claim 1, wherein the external parameters include a rotation matrix and a translation matrix of a camera coordinate system where the camera device is located transformed into a world coordinate system.
3. The on-line calibration method for the vehicle-mounted looking-around system according to claim 1, wherein the vehicle-mounted looking-around system further comprises: the internal parameters of the camera device are set; in the on-line calibration method of the vehicle-mounted all-around system, when the 2D image is subjected to projection transformation processing to obtain a corresponding 3D image, at least part of pixel points in the 2D image are projected from a camera coordinate system to a world coordinate system by using a matrix of the external parameters and the internal parameters.
4. The on-line calibration method of the vehicle-mounted looking-around system as claimed in claim 3, further comprising:
when the correction quantity adopted for updating the external parameter is larger than or equal to a first threshold and the iteration number of the correction quantity is smaller than a second threshold, performing projection transformation processing on the 2D image again by using the updated external parameter to obtain an updated 3D image so as to obtain the iterated correction quantity, and updating the external parameter again by using the iterated correction quantity until the correction quantity is smaller than the first threshold or the iteration number of the correction quantity is larger than the second threshold.
5. The on-line calibration method of the vehicle-mounted looking-around system according to claim 3, wherein before the 2D image is subjected to projection transformation processing to obtain a corresponding 3D image, a rendering range of the 3D image in a world coordinate system is limited; and when the 2D image is subjected to projection transformation processing, only pixel points corresponding to the rendering range in the 2D image are subjected to projection transformation to the world coordinate system.
6. The on-line calibration method of the vehicle-mounted looking-around system according to claim 1, wherein the information of the lane line is detected in the 3D image by straight-line detection.
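
A straight-line detection step of the kind named in claim 6 is commonly done with a Hough transform; the snippet below runs OpenCV's probabilistic Hough transform on a synthetic bird's-eye image containing one painted boundary. All parameter values are illustrative, not taken from the patent.

import cv2
import numpy as np

bev = np.zeros((400, 300), dtype=np.uint8)            # synthetic bird's-eye image
cv2.line(bev, (140, 0), (150, 399), 255, 5)           # one painted lane boundary
edges = cv2.Canny(bev, 50, 150)
segments = cv2.HoughLinesP(edges, 1, np.pi / 180, 50,
                           minLineLength=100, maxLineGap=20)
if segments is not None:
    for x1, y1, x2, y2 in segments[:, 0]:
        print("lane-line segment:", (x1, y1), "->", (x2, y2))
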
7. The on-line calibration method of the vehicle-mounted looking-around system according to claim 1, wherein the correction amount comprises corrections of the yaw angle, roll angle, pitch angle, up-down offset and left-right offset of the camera device.
8. The on-line calibration method of the vehicle-mounted looking-around system according to any one of claims 1 to 7, wherein the features of the lane line comprise the inclination and the width of the lane line.
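
As an illustration of claim 8, and of how such features can drive the corrections of claim 7: when the external parameters are correct, the two boundaries of a lane in the bird's-eye view run parallel to the direction of travel and their spacing matches the physical lane width, so a residual inclination or width error hints at an angular or offset error. A minimal sketch with assumed segment coordinates in metres:

import numpy as np

def lane_features(left, right):
    # left/right: (x1, y1, x2, y2) boundary segments on the ground plane, in metres.
    def inclination(seg):
        x1, y1, x2, y2 = seg
        return np.degrees(np.arctan2(x2 - x1, y2 - y1))   # 0 deg = parallel to travel
    incl_left, incl_right = inclination(left), inclination(right)
    width = abs((right[0] + right[2]) / 2 - (left[0] + left[2]) / 2)
    return incl_left, incl_right, width

# Toy usage: boundaries that should be vertical and about 3.75 m apart (assumed values).
print(lane_features((-1.8, 0.0, -1.7, 10.0), (1.8, 0.0, 1.9, 10.0)))
# Non-zero inclinations and a width error would feed the yaw/roll/pitch/offset corrections.
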
9. A vehicle-mounted looking-around system, characterized by comprising an on-line calibration device and a plurality of camera devices, the on-line calibration device being used to calibrate on line the external parameters of the camera devices in the vehicle-mounted looking-around system; wherein the external parameters have initial values, the plurality of camera devices are mounted on the vehicle according to the initial values of the external parameters before the external parameters are updated, and the on-line calibration device of the vehicle-mounted looking-around system comprises:
an image extraction module, configured to acquire a group of synchronized 2D images captured by the camera devices while the vehicle is travelling, wherein each 2D image contains an image of a lane line;
an image transformation module, configured to perform projection transformation on the 2D images extracted by the image extraction module to obtain corresponding 3D images;
a lane line detection module, configured to detect the 3D images obtained by the image transformation module to obtain information of the lane line;
a feature extraction module, configured to extract features of the lane line from the information of the lane line obtained by the lane line detection module;
a correction amount calculation module, configured to calculate a correction amount of the external parameters according to the lane line features extracted by the feature extraction module; and
a parameter updating module, configured to update the external parameters according to the correction amount output by the correction amount calculation module.
10. The vehicle-mounted looking-around system of claim 9, wherein the on-line calibration device of the vehicle-mounted looking-around system further comprises:
a correction amount judging module, configured to, when the correction amount used to update the external parameters is greater than or equal to a first threshold and the number of iterations of the correction amount is less than a second threshold, perform the projection transformation on the 2D image again using the updated external parameters to obtain an updated 3D image and thereby an iterated correction amount, and to update the external parameters again with the iterated correction amount, until the correction amount is less than the first threshold or the number of iterations of the correction amount exceeds the second threshold.
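
Read as software, the device of claims 9 and 10 can be pictured as a small composition of callables. The sketch below is a hypothetical decomposition, not the patent's code: each field stands in for one claimed module and run acts as the correction-amount judging loop.

from dataclasses import dataclass
from typing import Callable, List
import numpy as np

@dataclass
class OnlineCalibrationDevice:
    extract_images: Callable[[], List[np.ndarray]]               # image extraction module
    transform: Callable[[np.ndarray, np.ndarray], np.ndarray]    # image transformation module
    detect_lanes: Callable[[np.ndarray], list]                   # lane line detection module
    extract_features: Callable[[list], np.ndarray]               # feature extraction module
    compute_correction: Callable[[list], np.ndarray]             # correction amount calculation module
    first_threshold: float = 1e-3
    second_threshold: int = 10

    def run(self, extrinsics: np.ndarray) -> np.ndarray:
        # Correction amount judging module: iterate until the correction is small
        # enough or the iteration count reaches the second threshold (claim 10).
        for _ in range(self.second_threshold):
            frames = self.extract_images()
            feats = [self.extract_features(
                         self.detect_lanes(self.transform(f, extrinsics)))
                     for f in frames]
            delta = self.compute_correction(feats)
            extrinsics = extrinsics + delta        # parameter updating module
            if np.linalg.norm(delta) < self.first_threshold:
                break
        return extrinsics
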
CN202011501750.9A 2020-12-17 2020-12-17 On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof Active CN112529966B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011501750.9A CN112529966B (en) 2020-12-17 2020-12-17 On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof

Publications (2)

Publication Number Publication Date
CN112529966A (en) 2021-03-19
CN112529966B CN112529966B (en) 2023-09-15

Family

ID=75001346

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011501750.9A Active CN112529966B (en) 2020-12-17 2020-12-17 On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof

Country Status (1)

Country Link
CN (1) CN112529966B (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011118889A (en) * 2009-11-04 2011-06-16 Valeo Schalter & Sensoren Gmbh Video image-based road feature analysis, lane detection, and lane departure prevention method and device
JP2013222302A (en) * 2012-04-16 2013-10-28 Alpine Electronics Inc Mounting angle correction device for in-vehicle camera and mounting angle correction method
CN104851076A (en) * 2015-05-27 2015-08-19 武汉理工大学 Panoramic 360-degree-view parking auxiliary system for commercial vehicle and pick-up head installation method
CN105608693A (en) * 2015-12-18 2016-05-25 上海欧菲智能车联科技有限公司 Vehicle-mounted panoramic around view calibration system and method
WO2017122552A1 (en) * 2016-01-15 2017-07-20 ソニー株式会社 Image processing device and method, program, and image processing system
CN106127787A (en) * 2016-07-01 2016-11-16 北京美讯美通信息科技有限公司 A kind of camera calibration method based on Inverse projection
WO2018016599A1 (en) * 2016-07-21 2018-01-25 いすゞ自動車株式会社 Image processing device and image processing method
JP2018077713A (en) * 2016-11-10 2018-05-17 スズキ株式会社 Lane marking detection system
US20180346019A1 (en) * 2017-06-06 2018-12-06 Toyota Jidosha Kabushiki Kaisha Steering assist device
KR20190085718A (en) * 2018-01-11 2019-07-19 만도헬라일렉트로닉스(주) Lane departure warning system
CN109859278A (en) * 2019-01-24 2019-06-07 惠州市德赛西威汽车电子股份有限公司 The scaling method and calibration system joined outside in-vehicle camera system camera
CN110211176A (en) * 2019-05-31 2019-09-06 驭势科技(北京)有限公司 A kind of Camera extrinsic number correction System and method for
CN110264525A (en) * 2019-06-13 2019-09-20 惠州市德赛西威智能交通技术研究院有限公司 A kind of camera calibration method based on lane line and target vehicle
CN111243034A (en) * 2020-01-17 2020-06-05 广州市晶华精密光学股份有限公司 Panoramic auxiliary parking calibration method, device, equipment and storage medium
CN111815713A (en) * 2020-05-29 2020-10-23 安徽酷哇机器人有限公司 Method and system for automatically calibrating external parameters of camera

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JEONG-KYUN LEE et al.: "Online Extrinsic Camera Calibration for Temporally Consistent IPM Using Lane Boundary Observations with a Lane Width Prior", Computer Vision and Pattern Recognition 2020 *
宫金良 et al.: "Comprehensive Understanding of Lane Line Information and a Departure Warning Algorithm", Laser Journal (《激光杂志》), vol. 41, no. 2

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114202588A (en) * 2021-12-09 2022-03-18 纵目科技(上海)股份有限公司 Method and device for quickly and automatically calibrating vehicle-mounted panoramic camera
CN114202588B (en) * 2021-12-09 2022-09-23 纵目科技(上海)股份有限公司 Method and device for quickly and automatically calibrating vehicle-mounted panoramic camera
CN114418862A (en) * 2022-03-31 2022-04-29 苏州挚途科技有限公司 Method, device and system for splicing side images
CN115273547A (en) * 2022-07-26 2022-11-01 上海工物高技术产业发展有限公司 Road anti-collision early warning system
CN115273547B (en) * 2022-07-26 2023-07-21 上海工物高技术产业发展有限公司 Road anticollision early warning system
CN115082573A (en) * 2022-08-19 2022-09-20 小米汽车科技有限公司 Parameter calibration method and device, vehicle and storage medium
CN117218205A (en) * 2023-09-13 2023-12-12 北京斯年智驾科技有限公司 Camera external parameter correction method and system
CN117173257A (en) * 2023-11-02 2023-12-05 安徽蔚来智驾科技有限公司 3D target detection and calibration parameter enhancement method, electronic equipment and medium

Also Published As

Publication number Publication date
CN112529966B (en) 2023-09-15

Similar Documents

Publication Publication Date Title
CN112529966A (en) On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof
CN111242031B (en) Lane line detection method based on high-precision map
US10097812B2 (en) Stereo auto-calibration from structure-from-motion
CN107004277B (en) Online calibration of motor vehicle camera system
CN110910453B (en) Vehicle pose estimation method and system based on non-overlapping view field multi-camera system
KR101592740B1 (en) Apparatus and method for correcting image distortion of wide angle camera for vehicle
WO2010113673A1 (en) Calibration device, method, and program for onboard camera
CN109155066B (en) Method for motion estimation between two images of an environmental region of a motor vehicle, computing device, driver assistance system and motor vehicle
CN111862234B (en) Binocular camera self-calibration method and system
CN111559314B (en) Depth and image information fused 3D enhanced panoramic looking-around system and implementation method
CN110307791B (en) Vehicle length and speed calculation method based on three-dimensional vehicle boundary frame
CN112902874B (en) Image acquisition device and method, image processing method and device and image processing system
CN112614192B (en) On-line calibration method of vehicle-mounted camera and vehicle-mounted information entertainment system
CN105551020A (en) Method and device for detecting dimensions of target object
CN112037159A (en) Cross-camera road space fusion and vehicle target detection tracking method and system
CN113362228A (en) Method and system for splicing panoramic images based on improved distortion correction and mark splicing
CN111862236B (en) Self-calibration method and system for fixed-focus binocular camera
CN114283201A (en) Camera calibration method and device and road side equipment
CN115239922A (en) AR-HUD three-dimensional coordinate reconstruction method based on binocular camera
CN115511974A (en) Rapid external reference calibration method for vehicle-mounted binocular camera
CN113296516B (en) Robot control method for automatically lifting automobile
CN113345032B (en) Initialization map building method and system based on wide-angle camera large distortion map
CN113240597B (en) Three-dimensional software image stabilizing method based on visual inertial information fusion
CN110796604A (en) Image correction method and device
JP6768554B2 (en) Calibration device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant