CN112529966B - On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof - Google Patents


Info

Publication number: CN112529966B (granted; published as CN112529966A)
Application number: CN202011501750.9A
Authority: CN (China)
Prior art keywords: image, vehicle, line, lane, around system
Legal status: Active
Language: Chinese (zh)
Inventors: 彭莎, 苏文凯
Applicant and assignee: Haowei Technology Wuhan Co ltd

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 3/08
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256: Lane; Road marking

Abstract

The invention provides a vehicle-mounted looking-around system and an online calibration method. The online calibration method performs online calibration of the external parameters of the camera devices in the vehicle-mounted looking-around system and comprises the following steps: acquiring a group of synchronous 2D images shot by the camera devices while the vehicle runs, wherein each 2D image contains an image of a lane line; performing projection transformation on the 2D images to obtain corresponding 3D images; detecting the information of the lane lines in the 3D images and extracting the features of the lane lines from the 3D images; calculating the correction amount of the external parameters from the extracted lane-line features; and updating the external parameters according to the correction amount. The method enables online self-calibration of the vehicle-mounted looking-around system and reduces the error of the external parameters. The vehicle-mounted looking-around system comprises an online calibration device and a plurality of camera devices; the online calibration device performs online self-calibration, so that the external-parameter error of the camera devices is smaller and the image precision of the system is higher.

Description

On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof
Technical Field
The invention relates to the technical field of automobile vision systems, in particular to an on-line calibration method of a vehicle-mounted looking-around system and the vehicle-mounted looking-around system.
Background
Intellectualization is one of the important trends in the development of the automobile industry, and vision systems are increasingly widely applied in the field of active safety of automobiles. The 360-degree vehicle-mounted looking-around system is one of the advanced automobile auxiliary safety systems, and the system can provide surrounding conditions of a vehicle for a driver under a low-speed working condition, and provide visual assistance for low-speed operation (such as parking and the like) of the driver, so that the system has become a standard configuration of a large number of production automobile types.
At present, an off-line calibration method is generally adopted in a vehicle-mounted looking-around system to calibrate parameters adopted by a system configuration camera device. The parameters comprise internal parameters and external parameters, wherein the internal parameters refer to equipment parameters such as focal length, optical center, lens distortion and the like, the external parameters refer to a rotation matrix and a translation matrix from a camera coordinate system to a world coordinate system, and the calibration of the external parameters is a technical key point.
At present, when an external parameter matrix (external parameter matrix for short) is calibrated offline, the four cameras of the vehicle-mounted looking-around system first shoot the same scene (the vehicle-mounted looking-around system can be arranged on an experimental trolley for simulated shooting), so that four synchronous 2D (i.e. two-dimensional) images are obtained; the external parameter matrix is then calibrated as follows: 1) define a world coordinate system, in which the XOY plane is the ground plane and the coordinate origin is the projection of the vehicle body center point onto the ground plane; 2) measure the physical distances (in centimeters) between each corner point of the calibration plate and the origin of the world coordinate system; 3) detect the positions (in pixels) of all the corner points of the calibration plate on the undistorted 2D image; 4) using the correspondence between the physical distance of each corner point of the calibration plate from the origin of the world coordinate system and its position on the undistorted 2D image, calculate the projection transformation matrix, i.e. the external parameter matrix, that transforms the 2D image into a 3D (i.e. three-dimensional) image. Because the four cameras of the vehicle-mounted looking-around system transform into the same world coordinate system with the same origin, in theory the four transformed 3D images can be completely overlapped after calibration, i.e. spliced without dislocation, so that a panoramic aerial view of the vehicle's surroundings can be obtained.
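Step 4) can be illustrated with a small direct-linear-transform (DLT) sketch. The corner coordinates, helper names and the use of plain Gaussian elimination are illustrative assumptions, not the patent's implementation:

```python
def solve(aug):
    # Gauss-Jordan elimination with partial pivoting on an n x (n+1) augmented system
    n = len(aug)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(n):
            if r != col:
                f = aug[r][col] / aug[col][col]
                aug[r] = [a - f * b for a, b in zip(aug[r], aug[col])]
    return [aug[i][n] / aug[i][i] for i in range(n)]

def homography(world, pixel):
    # DLT: two equations per corner correspondence, with h33 fixed to 1
    # (eight unknowns, so four correspondences suffice)
    rows = []
    for (X, Y), (u, v) in zip(world, pixel):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, v])
    h = solve(rows)
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def project(H, X, Y):
    # map a ground-plane point through the estimated projection matrix
    w = H[2][0] * X + H[2][1] * Y + H[2][2]
    return ((H[0][0] * X + H[0][1] * Y + H[0][2]) / w,
            (H[1][0] * X + H[1][1] * Y + H[1][2]) / w)

world = [(0, 0), (1, 0), (0, 1), (1, 1)]           # board corners (ground plane)
pixel = [(10, 20), (12, 20), (10, 23), (12, 23)]   # synthetic detected positions
H = homography(world, pixel)
u, v = project(H, 2, 5)
```

With these synthetic correspondences the recovered matrix maps any further ground-plane point consistently, which is exactly the property the offline method relies on when stitching the four views.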
However, when the physical distance between a corner point of the calibration plate and the origin of the world coordinate system is measured manually, a large manual measurement error exists; moreover, in off-line calibration the experimental trolley is short, so a large detection error exists when detecting the corner-point positions of the calibration plate on the undistorted 2D image. Therefore, when the vehicle-mounted looking-around system is calibrated with the off-line calibration method, the obtained external parameters have a large error. In addition, in practice, long-term load bearing or bumping of the car causes the external parameters of the looking-around system installed around the vehicle body to change over time, affecting the image precision of the vehicle-mounted looking-around system; off-line calibration alone cannot satisfy the demands of practical application.
Chinese patent (CN 106875448A) discloses a method for self-calibrating external parameters of a vehicle-mounted monocular camera. In the technical scheme, the information of vanishing points is used, so that the method is only suitable for front or rear cameras; the technical scheme uses vertical edge information at the same time, so that a specific scene such as a building is required; in addition, the technical scheme also uses optical flow information, so that the algorithm precision depends on detection and matching of feature points, and the constraint causes that the self-calibration method is not suitable for an on-vehicle looking-around system. In addition, chinese patent (CN 107145828A) discloses a vehicle panoramic image processing method and apparatus, in which a laser device is required to measure the physical position of a lane line to estimate the translational compensation amount of a vehicle body, and the configuration requirement on a vehicle-mounted looking-around system is high.
Therefore, how to realize on-line calibration of the system parameters of the vehicle-mounted looking-around system, with few constraints so as to facilitate popularization and application, remains a problem to be solved in the field.
Disclosure of Invention
The invention provides an online calibration method of a vehicle-mounted looking-around system, which is used for carrying out online calibration on external parameters of a camera device in the vehicle-mounted looking-around system, reducing the error of the external parameters, being beneficial to improving the real-time image precision of the vehicle-mounted looking-around system, and having less detection information required by the online calibration, thereby being convenient for popularization and application. The invention further provides a vehicle-mounted looking-around system which can perform online self-calibration on external parameters.
The invention provides an on-line calibration method of a vehicle-mounted looking-around system, which is used for on-line calibration of external parameters of a camera device in the vehicle-mounted looking-around system; the external parameters have initial values and are used for setting an imaging device in the vehicle-mounted looking-around system; the camera device is arranged on the vehicle according to the initial value of the external parameter before updating the external parameter; the on-line calibration method of the vehicle-mounted looking-around system comprises the following steps:
acquiring a group of synchronous 2D images shot by the camera device when a vehicle runs, wherein each 2D image comprises an image of a lane line;
Performing projection transformation processing on the 2D image to obtain a corresponding 3D image;
detecting information of the lane lines in the 3D image, and extracting characteristics of the lane lines from the 3D image;
calculating to obtain the correction quantity of the external parameter according to the extracted characteristics of the lane lines;
and updating the external parameter according to the correction amount.
Optionally, the external parameters include a rotation matrix and a translation matrix of a camera coordinate system where the image capturing device is located, which are transformed to a world coordinate system.
Optionally, the vehicle-mounted looking-around system further includes internal parameters for setting the camera device. In the on-line calibration method of the vehicle-mounted looking-around system, when the 2D image is subjected to projection transformation processing to obtain the corresponding 3D image, at least part of the pixel points in the 2D image are projected from the camera coordinate system to the world coordinate system using the matrices of the external parameters and the internal parameters.
Optionally, the on-line calibration method of the vehicle-mounted looking-around system further comprises the following steps:
when the correction amount adopted for updating the external parameter is larger than or equal to a first threshold value and the iteration number of the correction amount is smaller than a second threshold value, performing projective transformation processing on the 2D image again by using the updated external parameter and obtaining an updated 3D image, further obtaining the correction amount after iteration, and updating the external parameter again by using the correction amount after iteration until the correction amount is smaller than the first threshold value or the iteration number of the correction amount is larger than the second threshold value.
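The iteration rule above can be sketched as a scalar toy model; `compute_correction`, the threshold values and the halving rule below are invented placeholders standing in for steps S2 to S4, not the patent's matrix-valued update:

```python
def iterate_calibration(extrinsic, compute_correction,
                        first_threshold=0.01, second_threshold=10):
    """Repeat the update until the correction is smaller than the first
    threshold or the iteration count exceeds the second threshold."""
    iterations = 0
    while True:
        correction = compute_correction(extrinsic)
        extrinsic += correction          # update the external parameter
        iterations += 1
        if abs(correction) < first_threshold or iterations > second_threshold:
            return extrinsic, iterations

# toy correction rule that halves the residual on every pass
final, n = iterate_calibration(1.0, lambda e: -e / 2)
```

With the halving rule the loop terminates via the first threshold after seven passes, illustrating how the two thresholds bound both accuracy and runtime.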
Optionally, before performing projective transformation processing on the 2D image to obtain a corresponding 3D image, defining a rendering range of the 3D image in a world coordinate system; when the projection conversion processing is performed on the 2D image, only the pixel point projection corresponding to the rendering range in the 2D image is converted into the world coordinate system.
Optionally, a straight line detection method is adopted to detect the information of the lane line from the 3D image.
Optionally, the correction amount includes correction amounts of yaw angle, roll angle, pitch angle, up-down offset amount, and left-right offset amount of the image pickup device.
Optionally, the characteristics of the lane line include an inclination and a width of the lane line.
The invention further provides a vehicle-mounted looking-around system, which comprises an online calibration device and a plurality of camera devices, wherein the online calibration device is used for carrying out online calibration on external parameters of the camera devices in the vehicle-mounted looking-around system; the on-line calibration device of the vehicle-mounted look-around system comprises:
The image extraction module is used for acquiring a group of synchronous 2D images shot by the camera device when the vehicle runs, and each 2D image comprises an image of a lane line;
the image conversion module is used for carrying out projection conversion processing on the 2D image extracted by the image extraction module to obtain a corresponding 3D image;
the lane line detection module is used for detecting the 3D image obtained by the image transformation module to obtain lane line information;
the feature extraction module is used for extracting the features of the lane lines from the lane line information obtained by the lane line detection module;
the correction amount calculation module is used for calculating the correction amount of the external parameter according to the lane line characteristics extracted by the characteristic extraction module; and
and the parameter updating module is used for updating the external parameters according to the correction quantity output by the correction quantity calculating module.
Optionally, the on-line calibration device of the vehicle-mounted looking-around system further includes:
and the correction quantity judging module is used for carrying out projective transformation processing on the 2D image again by using the updated external parameter and obtaining an updated 3D image when the correction quantity adopted for updating the external parameter is larger than or equal to a first threshold value and the iteration number of the correction quantity is smaller than a second threshold value, so as to obtain the correction quantity after iteration, and updating the external parameter again by using the correction quantity after iteration until the correction quantity is smaller than the first threshold value or the iteration number of the correction quantity is larger than the second threshold value.
The on-line calibration method of the vehicle-mounted looking-around system provided by the invention can update the external parameters while the vehicle is running; compared with off-line calibration, it updates in time the external parameters that change during driving and uses the updated external parameters to adjust the orientation of the camera devices in time, thereby realizing calibration of the external parameters, giving the obtained panoramic image of the vehicle higher precision and helping to improve vehicle safety. In addition, the on-line calibration method imposes few constraints on the vehicle-mounted looking-around system: it uses only lane-line information, requires no specific scene and no laser equipment, and therefore has a wide application range and is convenient to popularize and apply.
The vehicle-mounted looking-around system comprises an on-line calibration device and a plurality of camera devices. The vehicle-mounted looking-around system can complete the process of calibrating the external parameters by using the online calibrating device in the vehicle running process, and timely updates the external parameters changed in the vehicle running process relative to the offline calibration, and timely adjusts the azimuth of the image pickup device by using the updated external parameters, so that the calibration of the external parameters is realized, the vehicle-mounted looking-around system has a self-calibrating function, the external parameter error of the image pickup device can be reduced, the image precision of the vehicle-mounted looking-around system is improved, and the safety of the vehicle is improved.
Drawings
Fig. 1 is a flowchart of an on-line calibration method of an on-board looking-around system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of straight lines obtained by straight line detection in an embodiment of the present invention.
FIG. 3 is a schematic diagram of an on-line calibration device in an on-board looking-around system according to an embodiment of the invention.
Detailed Description
The on-line calibration method of the vehicle-mounted looking-around system and the vehicle-mounted looking-around system thereof provided by the invention are further described in detail below with reference to the accompanying drawings and specific embodiments. The advantages and features of the present invention will become more apparent from the following description. It should be noted that the drawings are in a very simplified form and are all to a non-precise scale, merely for convenience and clarity in aiding in the description of embodiments of the invention.
Fig. 1 is a flowchart of an on-line calibration method of an on-board looking-around system according to an embodiment of the present invention. As shown in fig. 1, the on-line calibration method of the vehicle-mounted looking-around system provided in this embodiment is used for on-line calibration of external parameters of a camera device in the vehicle-mounted looking-around system; the external parameters have initial values and are used for setting an imaging device in the vehicle-mounted looking-around system; the camera device is arranged on the vehicle according to the initial value of the external parameter before updating the external parameter; the on-line calibration method of the vehicle-mounted looking-around system comprises the following steps:
Step S1, acquiring a group of synchronous 2D images shot by the camera device when a vehicle runs, wherein each 2D image comprises an image of a lane line;
s2, performing projection transformation processing on the 2D image to obtain a corresponding 3D image;
step S3, detecting information of the lane lines in the 3D image, and extracting characteristics of the lane lines from the 3D image;
s4, calculating to obtain the correction quantity of the external parameter according to the extracted characteristics of the lane lines; and
and S5, updating the external parameters according to the correction quantity.
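The control flow of steps S1 to S5 might be organized as below; every stage function, camera name and numeric value is a hypothetical stub used only to exercise the flow, not part of the patent:

```python
def online_calibration_step(frames, extrinsics, project, detect_lanes,
                            extract_features, compute_correction):
    """One pass of steps S1-S5 over a set of synchronized camera frames."""
    updated = {}
    for cam_id, image_2d in frames.items():                  # S1: synchronized 2D frames
        image_3d = project(image_2d, extrinsics[cam_id])     # S2: projection to 3D
        lanes = detect_lanes(image_3d)                       # S3: lane-line detection
        features = extract_features(lanes)                   # S3: lane-line features
        correction = compute_correction(features)            # S4: correction amount
        updated[cam_id] = extrinsics[cam_id] + correction    # S5: parameter update
    return updated

# trivially-stubbed stages, just to exercise the control flow
result = online_calibration_step(
    frames={"front": 5.0, "rear": 7.0},
    extrinsics={"front": 1.0, "rear": 2.0},
    project=lambda img, ext: img + ext,
    detect_lanes=lambda img3d: img3d,
    extract_features=lambda lanes: lanes,
    compute_correction=lambda feats: 0.5,
)
```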
In the above vehicle-mounted looking-around system, the image capturing devices are arranged in different directions of the vehicle to obtain multiple paths of synchronous 2D images (i.e. a group of synchronous 2D images) while the vehicle runs. Specifically, the image capturing devices may be visible-light cameras, for example fisheye cameras, so that wide-range monitoring without dead angles can be realized. In this embodiment, the vehicle-mounted looking-around system includes, for example, four cameras installed around the vehicle body: a front camera and a rear camera are respectively disposed at the front and rear of the vehicle, and a left camera and a right camera are disposed at the door-handle positions on the two sides. The synchronous multi-path 2D images obtained with the four image pickup devices are four paths of synchronous frames. The invention is not limited thereto; in other embodiments, the vehicle-mounted looking-around system may include four or more photographing devices set at different orientations on a vehicle (a real vehicle or a laboratory vehicle), each obtaining 2D images and transmitting them to the vehicle-mounted looking-around system as multiple synchronous frames.
The on-line calibration method of the present invention will be specifically described below by taking an on-vehicle looking around system having four image pickup devices as an example.
Firstly, executing step S1 to obtain original internal parameters and external parameters of four camera devices in the vehicle-mounted looking-around system; when the vehicle runs, four image pick-up devices shoot scenes around the vehicle at the same time, and four paths of synchronous frames are obtained, wherein the synchronous frames comprise images of lane lines positioned at the side edges of the vehicle. In practical application, the lane lines shot by the imaging device are straight lines parallel to the running direction of the vehicle, the width of the lane lines is equal, and the two lane lines on the side edge of the vehicle are parallel.
And then, executing step S2, and performing projection transformation processing on the 2D image to obtain a corresponding 3D image. In this embodiment, when the corresponding 3D image is obtained by performing the projective transformation processing on the 2D image, at least part of the pixels in the 2D image may be projected from the camera coordinate system to the world coordinate system by setting the external parameters of the image capturing device and the internal parameters of the image capturing device.
In this embodiment, the internal parameters of the image capturing apparatus refer to the focal length, optical center and lens distortion, and are represented by a matrix A. For example, the internal parameter matrix A of the image pickup apparatus may be expressed through the focal lengths f_x and f_y in pixel units and the principal point (c_x, c_y) at the image center, as shown in formula (1):

A = | f_x  0    c_x |
    | 0    f_y  c_y |      (1)
    | 0    0    1   |
the external parameters refer to a rotation matrix and a translation matrix of the camera coordinate system transformed to the world coordinate system, and initial values of the external parameters can be obtained according to an off-line calibration method. The external parameters may be represented by a matrix B. For example, the external parameter matrix B may be composed of a 3×3 rotation matrix R and a 3×1 translation vector t, as shown in equation (2):
B=[R|t] (2)
as an example, point T in a 2D image 0 The coordinates of (a) are set as (u, v), T is T 0 The coordinates of the point T corresponding to the projection in the world coordinate system are (X w ,Y w ,Z w ) Wherein T is to 0 The projective transformation into T can be transformed by the equation (3), expressed as:
in the formula (3), s is any scaling ratio of projective transformation, A is an internal parameter matrix, R is a rotation matrix in an external parameter matrix, and t is a translation vector in the external parameter matrix. Here, the external parameter matrix includes a rotation matrix R including distortion factors when distortion of the lens of the image pickup apparatus is considered, and a translation vector t.
In this embodiment, after executing step S1 and before executing step S2, the rendering range of the 3D image in the world coordinate system may further be defined. For example, the center of the vehicle may be taken as the origin O of the world coordinate system, which includes mutually perpendicular X, Y and Z axes, and the ground plane may be taken as the XOY plane of the world coordinate system. In the simulation experiment the experimental vehicle is small; the rendering range in the X-axis direction can be 0 cm to 50 cm, the rendering range in the Y-axis direction can be -50 cm to 50 cm, and no rendering is performed in the Z-axis direction (i.e. Z = 0). In a practical situation, the rendering range of the 3D image may be determined according to the actual situation of the vehicle. In a preferred embodiment, in step S2, only the pixel points corresponding to the rendering range in the 2D image may be projected into the world coordinate system to obtain the 3D image; compared with projecting all the pixel points of the 2D image into the 3D image, this reduces the amount of data in the projection transformation and increases the speed of converting the 2D image into the 3D image.
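The rendering-range optimization could look like this sketch, which generates only the ground-plane points inside the stated X/Y box; the grid step and helper name are assumptions:

```python
def render_grid(x_range, y_range, step):
    # only ground-plane points inside the defined rendering range are produced
    xs = [x_range[0] + i * step for i in range(int((x_range[1] - x_range[0]) / step) + 1)]
    ys = [y_range[0] + i * step for i in range(int((y_range[1] - y_range[0]) / step) + 1)]
    return [(x, y, 0.0) for x in xs for y in ys]   # Z = 0: no rendering along Z

# the experiment's 0-50 cm by -50-50 cm box, sampled every 10 cm
grid = render_grid((0, 50), (-50, 50), 10)
```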
Then, step S3 is performed to detect information of a lane line in the 3D image, and extract features of the lane line from the 3D image. The image capturing device can capture the lane lines when the vehicle is running, so that the lane line images exist in the multiple paths of synchronous 2D images. Through the conversion from the 2D image to the 3D image in step S2, the lane line image in the 2D image is also converted into the 3D image, and thus the 3D image has the information of the lane line.
In this embodiment, a method of detecting a straight line may be used to detect information of a lane line in the 3D image, and the lane line may be represented by the straight line obtained by the detection. In practice, the straight line in which the detected line segment is located may be used to characterize the lane line.
Specifically, the lane line detection can adopt a LSD (Line Segment Detector) straight line detection algorithm or a Hough straight line detection (Hough Line Detection) method.
As an example, the LSD straight line detection algorithm first calculates the gradient magnitude and direction of every point in the image; points whose gradient direction changes little are then grouped with their neighbours into connected regions; each region is then, according to its degree of rectangularity, either kept or split by rule into several regions of higher rectangularity; finally, all the generated regions are refined and screened, and the regions that meet the conditions are retained as the final straight line detection result. The algorithm has the advantages of high detection speed and no need for parameter tuning, and it uses an error control method to improve the accuracy of straight line detection.
The hough straight line detection method may include the steps of: 1) Converting the color image into a gray scale image; 2) Denoising the gray scale image; 3) Extracting edges; 4) Binarization, judging whether the extracted points are edge points or not; 5) Mapping the edge points to a Hough space; 6) Taking a local maximum value, and setting a threshold value to filter out an interference straight line; 7) Drawing a straight line and calibrating the corner points. The Hough straight line detection has the advantages of strong anti-interference capability, insensitivity to incomplete parts of straight lines in images, noise and other coexisting nonlinear structures, tolerance to gaps in feature boundary description and relative insensitivity to image noise.
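Steps 5) and 6) of the Hough method can be demonstrated compactly: collinear points vote for the same (theta, rho) cell, and the accumulator maximum recovers the line. The bin sizes and point coordinates below are arbitrary choices for illustration:

```python
import math
from collections import Counter

def hough_peak(points, theta_steps=180, rho_quant=1.0):
    # step 5): every edge point votes for all (theta, rho) cells it lies on
    acc = Counter()
    for x, y in points:
        for i in range(theta_steps):
            theta = math.pi * i / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(i, round(rho / rho_quant))] += 1
    # step 6): the maximum of the accumulator identifies the dominant line
    (theta_idx, rho_bin), votes = acc.most_common(1)[0]
    return math.pi * theta_idx / theta_steps, rho_bin * rho_quant, votes

# five collinear edge points on the vertical line x = 3 (theta = 0, rho = 3)
points = [(3, y) for y in range(0, 200, 40)]
theta, rho, votes = hough_peak(points)
```

Because the five votes land in one cell while noise scatters across many, a threshold on the vote count (step 6) filters out interference lines.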
In this embodiment, a straight line is calculated from the two end points of a detected line segment. Fig. 2 is a schematic diagram of straight lines obtained by straight line detection in an embodiment of the present invention. As an example, as shown in fig. 2, a straight line line1 and a straight line line2 are detected (the straight lines are indicated by line segments in the figure). Let the two end points of a detected line segment be p1(x_1, y_1) and p2(x_2, y_2). The equation of the straight line line1 on which p1 and p2 lie can be expressed as Ax + By + C = 0, where A, B and C are parameters of the line obtained by calculation from the coordinates of the two end points. The slope of line1 can be calculated from the end points p1 and p2 as k = (y_1 - y_2)/(x_1 - x_2), and the inclination angle corresponding to the slope k (i.e. the angle between the straight line and a coordinate axis of the image, e.g. the X axis) can be obtained with the arctangent function. Likewise, from two end points p3(x_3, y_3) and p4(x_4, y_4) the straight line line2 on which p3 and p4 lie is obtained, and the distance from p3 to line1 can be expressed as d = |A*x_3 + B*y_3 + C| / sqrt(A^2 + B^2). The distance between two straight lines can be represented by the distance from a point on one line to the other line, that is, by the distance from an end point of one line segment to the other line segment; the length of a straight line (actually the length of the corresponding line segment) can be expressed by the distance between its two end points. Since the lane is on the ground, the Z-axis coordinate values in the end-point coordinates of the line segments characterizing the lane lines are all 0.
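The quantities in this paragraph (the line parameters A, B, C, the inclination angle via the arctangent, and the point-to-line distance) can be computed as follows; the coordinates are toy values:

```python
import math

def line_through(p1, p2):
    # parameters A, B, C of the line Ax + By + C = 0 through p1 and p2
    (x1, y1), (x2, y2) = p1, p2
    return y1 - y2, x2 - x1, x1 * y2 - x2 * y1

def inclination_deg(p1, p2):
    # angle between the line and the image X axis, via the arctangent
    return math.degrees(math.atan2(p1[1] - p2[1], p1[0] - p2[0])) % 180.0

def point_line_distance(p, line):
    # |A*x + B*y + C| / sqrt(A^2 + B^2)
    A, B, C = line
    return abs(A * p[0] + B * p[1] + C) / math.hypot(A, B)

line1 = line_through((0.0, 0.0), (0.0, 10.0))   # a vertical boundary segment
d = point_line_distance((3.5, 5.0), line1)      # distance from a point to line1
```

Using atan2 instead of a raw slope avoids the division-by-zero case of a vertical lane line, which is the common orientation here.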
After the straight line detection is finished, the detected straight lines may be filtered: the lines whose inclination angle lies within (90±n) degrees (for example, n is 30), whose mutual distance lies within a first specified range and whose length difference lies within a second specified range are retained; the first specified range and the second specified range may be set according to the practical situation. Then the collinear straight lines with the same direction are merged, and the longest paired straight lines are selected to represent the lane lines. It should be understood that the straight lines on which the boundaries of a lane line lie are the longest, and the paired straight lines representing the lane line are its boundary lines; that is, the lane line is represented by the straight lines at its boundary positions.
In this embodiment, since the front camera and the rear camera can each capture the two lane lines on the two sides of the vehicle body, four straight lines are obtained for each of them after the lane lines are detected and the straight lines are selected. The left camera and the right camera each capture only the one lane line on their side of the vehicle body, so two straight lines are obtained for each of them after detection and selection.
After the information of the lane line is detected in the 3D image, the features of the lane line need to be extracted from the 3D image. A feature of the lane line can be calculated from the feature points (i.e. the end points of the line segments) of the detected straight lines (actually line segments). The features of the lane lines may include the inclination and the width of the lane line. The inclination angle of the lane line may be calculated from the inclination angles of the straight lines representing the lane line; for example, the inclination angle of the lane line is the average inclination angle of the straight lines representing it. The width of the lane line may be obtained by calculating the distance between the two straight lines characterizing it; for example, referring to fig. 2, the lane line is characterized by line segments line1 and line2, and the width of the lane line may be the distance from the upper end point (p0) of line1 to line2, the distance from the lower end point (p4) of line2 to line1, or the average of these two distances.
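A sketch of this feature extraction, with invented endpoint coordinates for the two boundary segments (all numeric values are toy data):

```python
import math

def inclination_deg(p1, p2):
    # inclination of the segment with respect to the X axis, in degrees
    return math.degrees(math.atan2(p1[1] - p2[1], p1[0] - p2[0])) % 180.0

def distance_to_line(p, a, b):
    # distance from point p to the infinite line through a and b
    A, B, C = a[1] - b[1], b[0] - a[0], a[0] * b[1] - b[0] * a[1]
    return abs(A * p[0] + B * p[1] + C) / math.hypot(A, B)

# two parallel boundary segments of one lane line (toy coordinates)
line1 = ((0.0, 0.0), (0.0, 10.0))
line2 = ((0.15, 0.0), (0.15, 10.0))

# lane-line inclination: mean of the two boundary inclinations
inclination = (inclination_deg(*line1) + inclination_deg(*line2)) / 2.0
# lane-line width: mean of the two endpoint-to-opposite-boundary distances
width = (distance_to_line(line1[0], *line2)
         + distance_to_line(line2[1], *line1)) / 2.0
```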
Step S4 is then executed: the correction amounts of the external parameters are calculated according to the extracted features of the lane lines. The correction amounts may be calculated using the width and the inclination angle of the lane lines, and may include correction amounts of the yaw angle (Yaw), roll angle (Roll), pitch angle (Pitch), up-down offset (settlement amount), and left-right offset of each image pickup device of the vehicle-mounted looking-around system. Each image pickup device has corresponding lane lines in the 3D image (for example, the lane lines corresponding to the front camera are the two lane lines photographed by it, and the lane line corresponding to the left camera is the left lane line photographed by it). The external parameter correction amounts of each image pickup device may be calculated using the features of the lane lines corresponding to that image pickup device, or using the relationship between those features and the features of the lane lines corresponding to the other image pickup devices.
Specifically, the correction amount of the yaw angle β may be calculated from the inclination angle of the lane lines: the difference between the average inclination angle of the lane lines corresponding to the image capturing device and 90 degrees is computed, and the correction amount of the yaw angle β is, for example, this difference multiplied by the update rate. When the image capturing device corresponds to only one lane line, the inclination angle of that lane line serves as the average value.
The correction amount of the roll angle θ can be calculated using the width of the lane lines: either from the difference between the widths of the left and right lane lines, or from the difference between the distances from the end points of the two line segments representing a lane line to the respective opposite line segment (the line segment on which the end point does not lie). For example, the correction amount of the roll angle θ is the product of the update rate and the difference between the widths of the left and right lane lines, or the product of the update rate and the difference between the aforementioned end-point-to-opposite-segment distances.
The correction amount of the pitch angle α can be calculated using the trapezoid degree of the lane line. That is, the correction amount of the pitch angle α of an image capturing device may be calculated from the trapezoid degree of the lane line corresponding to it; for example, the correction amount of the pitch angle α is the product of that trapezoid degree and the update rate. The trapezoid degree of a lane line may be the difference between the width of the lane line and the average width of all lane lines in the 3D image.
The correction amount of the settlement amount, i.e., the movement of the image pickup device in the vehicle height direction, can also be calculated using the width of the lane lines. It can be obtained by calculating the difference between the width of the lane line corresponding to the image pickup device and the average width of all lane lines in the 3D image; for example, the correction amount of the settlement amount is the product of this width difference and the update rate.
The correction amount of the left-right offset is the movement of the imaging device in the lateral direction of the vehicle (for example, the X-axis direction of the world coordinate system), and may be calculated using the spatial positional relationship of the lane lines. The position of a lane line may be calculated from the coordinates of the end points of the line segments representing it (e.g., the coordinate values along the X-axis). When an image pickup device corresponds to both the left and the right lane lines, its left-right offset correction amount is obtained by calculating the difference between the position of its lane lines and the positions of all lane lines in the 3D image; for example, the correction amount is the product of this positional difference and the update rate. When an image capturing device corresponds to only one lane line (the left lane line or the right lane line), its left-right offset correction amount is obtained by calculating the difference between the position of that lane line and the positions of all lane lines on the same side in the 3D image; again, the correction amount is, for example, the product of the positional difference and the update rate.
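The per-camera correction formulas described above can be sketched as follows, with the update rate as a shared parameter. This is an illustrative reading of the description, not the claimed implementation; function names and the default rate are assumptions.

```python
def yaw_correction(mean_incl_deg, rate=0.1):
    """Yaw: deviation of the mean lane-line inclination from 90 degrees,
    scaled by the update rate."""
    return (mean_incl_deg - 90.0) * rate

def roll_correction(width_left, width_right, rate=0.1):
    """Roll: difference between the widths of the left and right lane lines."""
    return (width_left - width_right) * rate

def pitch_correction(width, mean_width_all, rate=0.1):
    """Pitch: 'trapezoid degree' = deviation of this lane line's width from
    the mean width over all lane lines in the 3D image. (Per the text, the
    settlement correction uses this same width deviation.)"""
    return (width - mean_width_all) * rate
```

Each correction is a small damped step; the iteration step S6 repeats these computations on the re-projected 3D image until they fall below a threshold.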
The "update rate" mentioned above may be a positive number smaller than 1. The smaller the update rate used to obtain a correction amount, the slower the update, so that more iterations of the correction amount are required in the subsequent steps; however, an update rate that is too large may cause the iteration to fail to converge. In this embodiment, the update rates used to obtain the correction amounts of the yaw angle, roll angle, pitch angle, up-down offset (settlement amount), and left-right offset may have the same value. In another embodiment, these update rates may have different values.
After the correction amounts are obtained, the external parameters are updated accordingly. Specifically, the sub-correction matrices corresponding to the correction amounts of the yaw angle, roll angle, pitch angle, settlement amount and left-right offset can be calculated respectively, the sub-correction matrices are multiplied to obtain a correction rotation-translation matrix, and this matrix is used to correct the external parameter matrix. For example, the correction rotation-translation matrix may be multiplied by the external parameter matrix to obtain the updated external parameter matrix. The sub-correction matrices, the correction rotation-translation matrix and the external parameter matrix have the same structure, for example all being 4×4 matrices.
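The matrix update can be sketched with plain 4×4 homogeneous matrices; a yaw rotation and a translation stand in for the full set of sub-corrections, and the axis conventions here are assumptions.

```python
import math

def rot_z(a):
    """Sub-correction matrix for a rotation by angle a (rad) about the
    vertical axis (sketching the yaw correction)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def trans(dx, dy, dz):
    """Sub-correction matrix for a translation (left-right offset,
    settlement amount)."""
    return [[1, 0, 0, dx], [0, 1, 0, dy], [0, 0, 1, dz], [0, 0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def update_extrinsic(extrinsic, sub_corrections):
    """Multiply the sub-correction matrices into one correction
    rotation-translation matrix, then apply it to the extrinsic matrix."""
    corr = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for m in sub_corrections:
        corr = matmul(corr, m)
    return matmul(corr, extrinsic)
```

All matrices share the same 4×4 structure, so the sub-corrections compose by ordinary matrix multiplication as the text describes.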
In this embodiment, as shown in fig. 1, the on-line calibration method of the vehicle-mounted looking-around system may further include step S6 of judging the correction amount. When the correction amount used for updating the external parameters is greater than or equal to a first threshold (e) and the iteration number (n) of the correction amount is less than a second threshold, the projective transformation is performed on the 2D image again using the updated external parameters to obtain an updated 3D image, an iterated correction amount is thereby obtained, and the external parameters are updated again with it (i.e., steps S2 to S5 are repeated). When the correction amount is smaller than the first threshold or the iteration number is greater than the second threshold, the updating of the external parameters stops. The first threshold and the second threshold may be set as desired. By adding this judging step, whether to recalculate the correction amount and update the external parameters is decided according to the judgment result, which can further reduce the error of the external parameters.
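The iteration of steps S2 to S6 can be sketched as a simple loop; `compute_correction` and `apply_correction` are hypothetical placeholders for the projection/detection/feature/update steps, and the threshold values are assumptions.

```python
def calibrate(extrinsic, compute_correction, apply_correction,
              eps=1e-3, max_iters=20):
    """Repeat steps S2-S5: recompute the correction from the re-projected
    3D image, stopping once the correction falls below the first threshold
    `eps` or the iteration count reaches the second threshold `max_iters`."""
    for _ in range(max_iters):
        c = compute_correction(extrinsic)
        if abs(c) < eps:          # step S6: correction small enough, stop
            break
        extrinsic = apply_correction(extrinsic, c)
    return extrinsic
```

With an update rate below 1, each pass shrinks the residual error, so the loop converges to within the first threshold unless the iteration cap is hit first.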
With the above on-line calibration method of the vehicle-mounted looking-around system, the external parameters of the image pickup devices can be updated while the vehicle is running, and the orientation of each image pickup device can be adjusted in time using the updated external parameters; that is, on-line self-calibration of the external parameters is realized. For example, the external parameter errors of five degrees of freedom (yaw angle, roll angle, pitch angle, up-down offset and left-right offset of the image pickup device) can be eliminated, which improves the image precision of the vehicle-mounted looking-around system and hence the safety of the vehicle. In addition, the method imposes few constraints on the vehicle-mounted looking-around system: it only uses the information of the lane lines, requires no specific scene and no laser equipment, and therefore has a wide application range and is easy to popularize and apply.
The above on-line calibration method of the vehicle-mounted looking-around system may be performed by a computer processor. For example, a computer readable storage medium (such as an optical disc or a memory in a computer system) stores computer instructions which, when executed by the processor, perform the on-line calibration method described in the above embodiment. The computer readable storage medium may be provided in the vehicle-mounted looking-around system, or independently of it, for example in the control system of the vehicle in which the vehicle-mounted looking-around system is installed. The vehicle may be a fuel vehicle, a hybrid electric vehicle or a pure electric vehicle.
The embodiment also provides a vehicle-mounted looking-around system. Fig. 3 is a schematic diagram of an on-vehicle looking around system according to an embodiment of the invention. The vehicle-mounted looking-around system comprises an online calibration device 2 and a plurality of camera devices 1, wherein the online calibration device 2 is used for carrying out online calibration on external parameters of the camera devices 1 in the vehicle-mounted looking-around system; wherein the external parameter has an initial value, and the plurality of image pickup devices 1 are set on the vehicle according to the initial value of the external parameter before updating the external parameter.
Specifically, as shown in fig. 3, the online calibration device includes an image extraction module 21, an image transformation module 22, a lane line detection module 23, a feature extraction module 24, a correction amount calculation module 25, and a parameter update module 26. The image extraction module 21 is configured to obtain a set of synchronous 2D images captured by the image capturing device while the vehicle is running, where each of the 2D images includes an image of a lane line. The image transformation module 22 is configured to perform projective transformation processing on the 2D image extracted by the image extraction module 21, so as to obtain a corresponding 3D image. The lane line detection module 23 is configured to detect the 3D image obtained by the image transformation module 22, and obtain lane line information. The feature extraction module 24 is configured to extract features of the lane line from the lane line information obtained by the lane line detection module 23. The correction amount calculating module 25 is configured to calculate a correction amount of the external parameter according to the lane line feature extracted by the feature extracting module 24. The parameter updating module 26 is configured to update the external parameter according to the correction amount output by the correction amount calculating module 25.
Specifically, the image pickup apparatus 1 has initial internal parameters and external parameters. The internal parameters refer to equipment parameters such as focal length, optical center, lens distortion and the like. The external parameters may include a rotation matrix and a translation matrix of a camera coordinate system in which the image capturing apparatus is located, transformed to a world coordinate system.
When performing the projection transformation on a 2D image to obtain the corresponding 3D image, the image transformation module 22 may use the matrices of the external parameters and the internal parameters to project at least part of the pixels in the 2D image from the camera coordinate system into the world coordinate system. The image transformation module 22 may also be configured to define a rendering range of the 3D image in the world coordinate system and, during the projection transformation, convert only the pixels corresponding to the rendering range into the world coordinate system, so as to reduce its workload and improve its efficiency.
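The mapping between the 2D image and world coordinates rests on the standard pinhole projection with the intrinsic matrix K and the extrinsic matrix [R|t]; a forward projection of a world point into the image is sketched below (inverting it for ground-plane points inside the rendering range yields the 3D image). The numeric values in the usage are hypothetical.

```python
def project(K, Rt, pw):
    """Project world point pw = (x, y, z) to pixel (u, v) using a 3x3
    intrinsic matrix K and a 3x4 extrinsic matrix [R|t]."""
    x, y, z = pw
    # camera coordinates: pc = R * pw + t
    pc = [sum(Rt[i][j] * (x, y, z, 1.0)[j] for j in range(4)) for i in range(3)]
    # homogeneous image coordinates: p = K * pc, then dehomogenize
    p = [sum(K[i][j] * pc[j] for j in range(3)) for i in range(3)]
    return p[0] / p[2], p[1] / p[2]
```

Restricting the inverse mapping to the rendering range means only world-plane points inside that range need to be looked up in the 2D image, which is exactly the workload reduction described above.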
The lane line detection module 23 may detect the information of the lane lines from the 3D image using a straight line detection method, such as the LSD (Line Segment Detector) algorithm or Hough line detection (Hough Line Detection). The lane line detection module 23 may be further configured to filter the detected straight lines, retaining those whose inclination angle lies within (90±n) degrees (for example, n is 30), whose pairwise distance lies within a first specified range and whose pairwise length difference lies within a second specified range, then merge the collinear straight lines with the same direction and pick out the longest paired straight lines to represent the lane lines.
The feature extraction module 24 may calculate and obtain the feature of the lane line using the feature point (i.e., the end point of the line segment) of the straight line (actually, the line segment) detected by the lane line detection module 23. The lane line features extracted by the feature extraction module 24 may include an inclination and a width of the lane line.
The correction amounts of the external parameters output by the correction amount calculation module 25 may include correction amounts of the yaw angle, roll angle, pitch angle, up-down offset, left-right offset, and the like of the image pickup devices of the vehicle-mounted looking-around system. The parameter updating module 26 may be further configured to calculate the correction rotation-translation matrix corresponding to these correction amounts and to correct the external parameter matrix according to it.
In this embodiment, as shown in fig. 3, the online calibration device may further include a correction amount judging module 27. When the correction amount used for updating the external parameters is greater than or equal to a first threshold and the iteration number of the correction amount is less than a second threshold, the correction amount judging module 27 performs the projective transformation on the 2D image again using the updated external parameters to obtain an updated 3D image, thereby obtains an iterated correction amount, and updates the external parameters again with it, until the correction amount is smaller than the first threshold or the iteration number is greater than the second threshold. With the correction amount judging module 27, the correction amount can be calculated and the external parameters corrected multiple times, further reducing the error of the external parameters.
In this embodiment, when the correction amount determining module 27 determines to stop updating the external parameter (i.e., when the correction amount is smaller than the first threshold or the iteration number of the correction amount is greater than the second threshold), the parameter updating module 26 may further adjust the orientation of the image capturing apparatus according to the updated external parameter.
The vehicle-mounted looking-around system of this embodiment comprises an online calibration device 2 and a plurality of image pickup devices 1, the online calibration device 2 comprising an image extraction module 21, an image transformation module 22, a lane line detection module 23, a feature extraction module 24, a correction amount calculation module 25 and a parameter updating module 26. The calibration of the external parameters by the online calibration device can be completed while the vehicle is running. Compared with off-line calibration, the external parameters are updated in time (especially when they change during driving) and the orientation of the image pickup devices is adjusted in time using the updated values. The vehicle-mounted looking-around system thus has a self-calibration function, which reduces the external parameter errors of the image pickup devices, improves the image precision of the system, and improves the safety of the vehicle.
It is understood that the vehicle-mounted looking-around system of the embodiments of the present invention may include a plurality of computers, hardware, devices, etc. interconnected by a communication unit such as a network, or a single computer, hardware or device implementing the processes of the present invention. The computer may include a central processing unit (CPU), a memory, and input and output components such as a keyboard, mouse, touch screen and display. The various modules in the online calibration device (image extraction module, image transformation module, lane line detection module, feature extraction module, correction amount calculation module, correction amount judging module, parameter updating module) may be combined into one module; any one of them may be split into a plurality of sub-units; or at least part of the functions of one or more of them may be combined with at least part of the functions of other modules and implemented in one module. According to an embodiment of the present invention, at least one of the modules in the online calibration device may be implemented at least partly as a hardware circuit, or in hardware or firmware in any other reasonable way of integrating or packaging circuits, or at least partly as a program code module which, when run by a computer controlling the online calibration device, performs the functions of the respective module.
It should be noted that the embodiments in this description are described with emphasis on their differences from one another; for the same or similar parts, reference may be made to the respective descriptions.
The foregoing description is only illustrative of the preferred embodiments of the present invention and is not intended to limit the scope of the claims. Any person skilled in the art may make possible variations and modifications to the technical solution of the present invention using the method and technical content disclosed above without departing from the spirit and scope of the invention; therefore, any simple modification, equivalent variation and modification made to the above embodiments according to the technical matter of the present invention falls within the scope of the technical solution of the present invention.

Claims (10)

1. The on-line calibration method of the vehicle-mounted looking-around system is characterized by being used for on-line calibration of external parameters of a camera device in the vehicle-mounted looking-around system; the external parameters have initial values and are used for setting an imaging device in the vehicle-mounted looking-around system; the camera device is arranged on the vehicle according to the initial value of the external parameter before updating the external parameter; the on-line calibration method of the vehicle-mounted looking-around system comprises the following steps:
Acquiring a group of synchronous 2D images shot by the camera device when a vehicle runs, wherein each 2D image comprises an image of a lane line;
performing projection transformation processing on the 2D image to obtain a corresponding 3D image;
detecting information of the lane lines in the 3D image, and extracting characteristics of the lane lines from the 3D image;
calculating to obtain the correction quantity of the external parameter according to the extracted characteristics of the lane lines; the correction amounts of the external parameters include correction amounts of roll angle, yaw angle, and pitch angle; the correction amount of the rolling angle is obtained by calculating the difference between the left lane line width and the right lane line width corresponding to the image pickup device, the lane line width is obtained by calculating the difference between the distances from the end points of two line segments representing the lane line to the opposite line segments, and one of the two line segments is the opposite line segment of the other line segment; the correction amount of the yaw angle is obtained by calculating the difference value between the average value of the inclination angles of the lane lines corresponding to the image pickup device and 90 degrees; the correction amount of the pitch angle is obtained by calculating the trapezoid degree of the lane line corresponding to the image pickup device, wherein the trapezoid degree of the lane line is the difference between the width of the lane line and the average value of the width of all the lane lines in the 3D image;
and updating the external parameter according to the correction amount.
2. The on-line calibration method of an on-vehicle look-around system according to claim 1, wherein the external parameters include a rotation matrix and a translation matrix of a camera coordinate system in which the image capturing device is located, which are transformed into a world coordinate system.
3. The on-line calibration method of the vehicle-mounted looking-around system according to claim 1, wherein the image capturing device further has internal parameters; in the on-line calibration method of the vehicle-mounted looking-around system, when the 2D image is subjected to the projection transformation processing to obtain the corresponding 3D image, at least part of the pixel points in the 2D image are projected from the camera coordinate system to the world coordinate system using the matrices of the external parameters and the internal parameters.
4. The on-line calibration method of the vehicle-mounted looking-around system according to claim 3, further comprising:
when the correction amount adopted for updating the external parameter is larger than or equal to a first threshold value and the iteration number of the correction amount is smaller than a second threshold value, performing projective transformation processing on the 2D image again by using the updated external parameter and obtaining an updated 3D image, further obtaining the correction amount after iteration, and updating the external parameter again by using the correction amount after iteration until the correction amount is smaller than the first threshold value or the iteration number of the correction amount is larger than the second threshold value.
5. The on-line calibration method of the vehicle-mounted looking-around system according to claim 3, wherein a rendering range of the 3D image in a world coordinate system is defined before the 2D image is subjected to projective transformation processing to obtain a corresponding 3D image; when the projection conversion processing is performed on the 2D image, only the pixel point projection corresponding to the rendering range in the 2D image is converted into the world coordinate system.
6. The on-line calibration method of the vehicle-mounted looking-around system according to claim 1, wherein the information of the lane line is detected from the 3D image by a straight line detection method.
7. The on-line calibration method of an on-vehicle see-around system according to claim 1, wherein the correction amount includes correction amounts of an up-down offset amount and a left-right offset amount of the image pickup device.
8. The on-line calibration method of an on-vehicle look-around system according to any one of claims 1 to 7, wherein the characteristics of the lane line include the inclination and width of the lane line.
9. The vehicle-mounted looking-around system is characterized by comprising an online calibration device and a plurality of camera devices, wherein the online calibration device is used for carrying out online calibration on external parameters of the camera devices in the vehicle-mounted looking-around system; the on-line calibration device of the vehicle-mounted look-around system comprises:
The image extraction module is used for acquiring a group of synchronous 2D images shot by the camera device when the vehicle runs, and each 2D image comprises an image of a lane line;
the image conversion module is used for carrying out projection conversion processing on the 2D image extracted by the image extraction module to obtain a corresponding 3D image;
the lane line detection module is used for detecting the 3D image obtained by the image transformation module to obtain lane line information;
the feature extraction module is used for extracting the features of the lane lines from the lane line information obtained by the lane line detection module;
the correction amount calculation module is used for calculating the correction amount of the external parameter according to the lane line characteristics extracted by the characteristic extraction module; the correction amounts of the external parameters include correction amounts of roll angle, yaw angle, and pitch angle; the correction amount of the rolling angle is obtained by calculating the difference between the left lane line width and the right lane line width corresponding to the image pickup device, the lane line width is obtained by calculating the difference between the distances from the end points of two line segments representing the lane line to the opposite line segments, and one of the two line segments is the opposite line segment of the other line segment; the correction amount of the yaw angle is obtained by calculating the difference value between the average value of the inclination angles of the lane lines corresponding to the image pickup device and 90 degrees; the correction amount of the pitch angle is obtained by calculating the trapezoid degree of the lane line corresponding to the image pickup device, wherein the trapezoid degree of the lane line is the difference between the width of the lane line and the average value of the width of all the lane lines in the 3D image; and
And the parameter updating module is used for updating the external parameters according to the correction quantity output by the correction quantity calculating module.
10. The vehicle-mounted look-around system of claim 9, wherein the on-line calibration device of the vehicle-mounted look-around system further comprises:
and the correction quantity judging module is used for carrying out projective transformation processing on the 2D image again by using the updated external parameter and obtaining an updated 3D image when the correction quantity adopted for updating the external parameter is larger than or equal to a first threshold value and the iteration number of the correction quantity is smaller than a second threshold value, so as to obtain the correction quantity after iteration, and updating the external parameter again by using the correction quantity after iteration until the correction quantity is smaller than the first threshold value or the iteration number of the correction quantity is larger than the second threshold value.
CN202011501750.9A 2020-12-17 2020-12-17 On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof Active CN112529966B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011501750.9A CN112529966B (en) 2020-12-17 2020-12-17 On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof

Publications (2)

Publication Number Publication Date
CN112529966A CN112529966A (en) 2021-03-19
CN112529966B true CN112529966B (en) 2023-09-15

Family

ID=75001346

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011501750.9A Active CN112529966B (en) 2020-12-17 2020-12-17 On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof

Country Status (1)

Country Link
CN (1) CN112529966B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114202588B (en) * 2021-12-09 2022-09-23 纵目科技(上海)股份有限公司 Method and device for quickly and automatically calibrating vehicle-mounted panoramic camera
CN114418862B (en) * 2022-03-31 2022-07-29 苏州挚途科技有限公司 Method, device and system for splicing side images
CN115273547B (en) * 2022-07-26 2023-07-21 上海工物高技术产业发展有限公司 Road anticollision early warning system
CN115082573B (en) * 2022-08-19 2023-04-11 小米汽车科技有限公司 Parameter calibration method and device, vehicle and storage medium
CN117218205A (en) * 2023-09-13 2023-12-12 北京斯年智驾科技有限公司 Camera external parameter correction method and system
CN117173257A (en) * 2023-11-02 2023-12-05 安徽蔚来智驾科技有限公司 3D target detection and calibration parameter enhancement method, electronic equipment and medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011118889A (en) * 2009-11-04 2011-06-16 Valeo Schalter & Sensoren Gmbh Video image-based road feature analysis, lane detection, and lane departure prevention method and device
JP2013222302A (en) * 2012-04-16 2013-10-28 Alpine Electronics Inc Mounting angle correction device for in-vehicle camera and mounting angle correction method
CN104851076A (en) * 2015-05-27 2015-08-19 Wuhan University of Technology Panoramic 360-degree-view parking assistance system for commercial vehicles and camera installation method
CN105608693A (en) * 2015-12-18 2016-05-25 Shanghai OFILM Smart Car Technology Co., Ltd. Vehicle-mounted panoramic around-view calibration system and method
CN106127787A (en) * 2016-07-01 2016-11-16 Beijing Meixun Meitong Information Technology Co., Ltd. Camera calibration method based on inverse projection transformation
WO2017122552A1 (en) * 2016-01-15 2017-07-20 Sony Corporation Image processing device and method, program, and image processing system
WO2018016599A1 (en) * 2016-07-21 2018-01-25 Isuzu Motors Limited Image processing device and image processing method
JP2018077713A (en) * 2016-11-10 2018-05-17 Suzuki Motor Corporation Lane marking detection system
CN109859278A (en) * 2019-01-24 2019-06-07 Huizhou Desay SV Automotive Co., Ltd. Calibration method and calibration system for extrinsic camera parameters of a vehicle-mounted camera system
KR20190085718A (en) * 2018-01-11 2019-07-19 Mando Hella Electronics Corp. Lane departure warning system
CN110211176A (en) * 2019-05-31 2019-09-06 UISEE Technology (Beijing) Co., Ltd. Camera extrinsic parameter correction system and method
CN110264525A (en) * 2019-06-13 2019-09-20 Huizhou Desay SV Intelligent Transportation Technology Research Institute Co., Ltd. Camera calibration method based on lane lines and a target vehicle
CN111243034A (en) * 2020-01-17 2020-06-05 Guangzhou Jinghua Precision Optics Co., Ltd. Panoramic parking assistance calibration method, device, equipment and storage medium
CN111815713A (en) * 2020-05-29 2020-10-23 Anhui Cowarobot Co., Ltd. Method and system for automatically calibrating camera extrinsic parameters

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6801591B2 (en) * 2017-06-06 2020-12-16 トヨタ自動車株式会社 Steering support device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Online Extrinsic Camera Calibration for Temporally Consistent IPM Using Lane Boundary Observations with a Lane Width Prior; Jeong-Kyun Lee et al.; Computer Vision and Pattern Recognition 2020; full text *
Comprehensive understanding of lane line information and a lane departure warning algorithm; Gong Jinliang et al.; Laser Journal; Vol. 41, No. 2; full text *

Also Published As

Publication number Publication date
CN112529966A (en) 2021-03-19

Similar Documents

Publication Publication Date Title
CN112529966B (en) On-line calibration method of vehicle-mounted looking-around system and vehicle-mounted looking-around system thereof
CN110567469B (en) Visual positioning method and device, electronic equipment and system
CN108986037B (en) Monocular vision odometer positioning method and positioning system based on semi-direct method
CN109903341B (en) Vehicle-mounted camera external parameter dynamic self-calibration method
CN111242031B (en) Lane line detection method based on high-precision map
JP4555876B2 (en) Car camera calibration method
CN110910453B (en) Vehicle pose estimation method and system based on non-overlapping view field multi-camera system
KR101592740B1 (en) Apparatus and method for correcting image distortion of wide angle camera for vehicle
CN110956661B (en) Method for calculating dynamic pose of visible light and infrared camera based on bidirectional homography matrix
CN112902874B (en) Image acquisition device and method, image processing method and device and image processing system
CN113362228A (en) Method and system for splicing panoramic images based on improved distortion correction and mark splicing
CN114283201A (en) Camera calibration method and device and road side equipment
CN115511974A (en) Rapid external reference calibration method for vehicle-mounted binocular camera
CN111105467B (en) Image calibration method and device and electronic equipment
CN109345591A (en) A kind of vehicle itself attitude detecting method and device
CN110796604A (en) Image correction method and device
CN113345032A (en) Wide-angle camera large-distortion image based initial image construction method and system
TWI424259B (en) Camera calibration method
JP2013024712A (en) Method and system for calibrating multiple camera
CN115100290B (en) Monocular vision positioning method, monocular vision positioning device, monocular vision positioning equipment and monocular vision positioning storage medium in traffic scene
CN116681776A (en) External parameter calibration method and system for binocular camera
CN111563936A (en) Camera external parameter automatic calibration method and automobile data recorder
CN116030139A (en) Camera detection method and device, electronic equipment and vehicle
CN114754779B (en) Positioning and mapping method and device and electronic equipment
US10643077B2 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant