CN112509054B - Camera external parameter dynamic calibration method - Google Patents

Camera external parameter dynamic calibration method Download PDF

Info

Publication number
CN112509054B
Authority
CN
China
Prior art keywords
lane line
preset
camera
vehicle
intersection points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010700766.6A
Other languages
Chinese (zh)
Other versions
CN112509054A (en)
Inventor
张蓉
张放
李晓飞
张德兆
王肖
霍舒豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Landshipu Information Technology Co ltd
Original Assignee
Chongqing Landshipu Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Landshipu Information Technology Co ltd filed Critical Chongqing Landshipu Information Technology Co ltd
Priority to CN202010700766.6A priority Critical patent/CN112509054B/en
Publication of CN112509054A publication Critical patent/CN112509054A/en
Application granted granted Critical
Publication of CN112509054B publication Critical patent/CN112509054B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a camera external parameter dynamic calibration method, which comprises the following steps: the camera acquires images according to a preset time interval to obtain a plurality of frames of first images; the lane line detection module inputs each frame of first image into a deep learning model to obtain lane line information; when the vehicle is in a straight running state, the lane line information is sent to the storage module; when the number of the lane line information reaches a first preset number, the calibration module selects lane line information of a second preset number; determining the ID of the selected lane line, and calculating the position coordinates of a plurality of groups of intersection points of the selected lane line and a preset calibration line; obtaining relative position coordinates according to a preset conversion coefficient and the position coordinates, and calculating the relative distance between two intersection points in each group of intersection points; obtaining standard deviation parameters according to the relative distance between each group of intersection points; and determining whether to adjust the external parameters of the camera according to the comparison result of the average value of the second preset number of standard deviation parameters and the preset optimal parameters.

Description

Camera external parameter dynamic calibration method
Technical Field
The invention relates to the technical field of automatic driving, in particular to a camera external parameter dynamic calibration method.
Background
In recent years, autonomous vehicles have been widely used in scenarios such as intelligent transportation, logistics distribution and cleaning work. As one of the main sensors of an autonomous vehicle, the camera is commonly used to detect obstacle positions while the vehicle is driving, and the accuracy of the detected obstacle position depends directly on the camera external parameters.
The current camera external parameter calibration is mainly divided into two types.
The first category depends on a specific calibration plate or calibration object. The calibration scene therefore has to be arranged manually, and the calibrated external parameters are a fixed set of values. Once the vehicle bumps, this fixed set of external parameters is no longer applicable.
The second category calibrates the external parameters using vanishing points or parallel lane lines. Although its requirements on the calibration scene are relatively low, the set of external parameters obtained from a single calibration is still fixed, so the method is likewise unsuitable for perceiving target positions while the vehicle is jolting.
In short, the camera external parameters of existing autonomous vehicles are all calibrated while the vehicle is on smooth, level roads. Once the vehicle bumps or drives up or down a slope, the accuracy of target positions in the vehicle reference coordinate system computed with these fixed external parameters drops, and the safety factor of the autonomous vehicle drops with it.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art and provides a camera external parameter dynamic calibration method that can perform initial external parameter calibration while the vehicle is offline and can also dynamically calibrate the camera external parameters in real time while the vehicle is running. In particular, when the vehicle bumps or drives up or down a ramp, the camera external parameters can be corrected automatically, which improves the position accuracy of visually detected targets and thereby improves the safety factor of vehicle operation.
In order to achieve the above object, the present invention provides a camera external parameter dynamic calibration method, which includes:
The camera acquires images of a fixed area in front of the vehicle according to a preset time interval, acquires a plurality of frames of first images and sends the first images to the lane line detection module; each frame of the first image has an image ID;
The lane line detection module inputs each frame of the first image into a deep learning model to obtain lane line information of each frame of the first image; the lane line information comprises an image ID, a lane line ID of each lane line and pixel coordinates corresponding to the lane line ID;
Determining whether the vehicle is in a straight-ahead state;
When the vehicle is in a straight running state, the lane line detection module sends the lane line information to the storage module; when the number of the lane line information in the storage module reaches a first preset number, the calibration module selects a second preset number of the lane line information from the storage module; the second preset number is smaller than the first preset number;
The calibration module takes any two lane line IDs in each of the second preset number of lane line information as selected lane line IDs, and calculates the position coordinates of multiple groups of intersection points of the selected lane lines with preset calibration lines according to the pixel coordinates corresponding to the selected lane line IDs and multiple preset ordinates; each group of intersection points comprises the two intersection points of the two lane lines corresponding to the selected lane line IDs on one preset ordinate;
The calibration module obtains the relative position coordinates of the plurality of groups of intersection points in a vehicle reference coordinate system according to a preset conversion coefficient and the position coordinates of the plurality of groups of intersection points, and calculates the relative distance between two intersection points in each group of intersection points according to the relative position coordinates of each group of intersection points; carrying out standard deviation calculation according to the relative distance between each group of intersection points in one lane line information to obtain standard deviation parameters corresponding to the lane line information; the standard deviation parameter has a corresponding image ID;
The calibration module determines whether to adjust the external parameters of the camera according to the comparison result of the average value of the second preset number of standard deviation parameters and preset optimal parameters;
When the mean value of the standard deviation parameters is not equal to a preset optimal parameter, the calibration module adjusts the external parameters according to a gradient descent method or a grid search method to obtain adjusted external parameters;
And the calibration module performs multiplication calculation according to a preset camera internal parameter and the adjusted external parameter, assigns the calculation result to the preset conversion coefficient, and executes the camera external parameter dynamic calibration method again.
Preferably, the calibration module determines whether to adjust the external parameters of the camera according to the comparison result of the average value of the second preset number of standard deviation parameters and the preset optimal parameters, and the method further includes:
and when the mean value of the standard deviation parameters is equal to a preset numerical value, the calibration module determines that the preset conversion coefficient is unchanged.
Further preferably, after the camera performs image acquisition on a fixed area in front of the vehicle according to a preset time interval to obtain a plurality of frames of first images, the dynamic calibration method further includes:
The camera sends the multi-frame first image to a target detection module;
The target detection module performs recognition processing on the first image according to a machine vision technology or an artificial neural network technology to obtain target information and sends the target information to the calibration module; the target information comprises a target ID and pixel coordinates corresponding to the target ID.
Further preferably, the dynamic calibration method further includes:
and the calibration module obtains the relative position coordinates of the target according to the conversion coefficient and the pixel coordinates of the target.
Preferably, the determining whether the vehicle is in a straight running state specifically includes:
The lane line detection module acquires a vehicle speed and a steering wheel angle from a controller of the vehicle, and judges whether the vehicle is in a straight running state according to the vehicle speed and the steering wheel angle.
Further preferably, when the vehicle speed is less than a preset vehicle speed and/or the steering wheel angle is greater than or equal to a preset angle, the lane line detection module judges that the vehicle is not in a straight running state, and stops sending the lane line information to the storage module;
and when the vehicle speed is greater than or equal to a preset vehicle speed and the steering wheel angle is smaller than a preset angle, the lane line detection module judges that the vehicle is in a straight running state.
Preferably, after the lane line detection module sends the lane line information to the storage module, the dynamic calibration method further includes:
And the storage module stores and counts the storage quantity of the lane line information and judges whether the storage quantity reaches a first preset quantity or not.
The camera external parameter dynamic calibration method provided by the invention can realize initial external parameter calibration in a vehicle off-line state, and can also carry out dynamic camera external parameter calibration in real time in the running process of the vehicle. Especially, under the condition that the vehicle bumps or goes up and down the ramp, the external parameter calibration parameters of the camera can be automatically corrected, the position accuracy of the visual detection target is improved, and the safety coefficient of vehicle running is further improved.
Drawings
FIG. 1 is a flow chart of a camera external parameter dynamic calibration method provided by an embodiment of the invention;
FIG. 2 is a schematic diagram of lane lines in the image coordinate system according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of calibration lines in the image coordinate system according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of calibration lines in the vehicle reference coordinate system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a target in the vehicle reference coordinate system according to an embodiment of the present invention.
Detailed Description
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
The camera external parameter dynamic calibration method provided by the invention can realize initial external parameter calibration in a vehicle off-line state, and can also carry out dynamic camera external parameter calibration in real time in the running process of the vehicle. Especially, under the condition that the vehicle bumps or goes up and down the ramp, the external parameter calibration parameters of the camera can be automatically corrected, the position accuracy of the visual detection target is improved, and the safety coefficient of vehicle running is further improved.
In order to facilitate understanding of the technical solution of the present invention, first, a pixel coordinate system, a vehicle reference coordinate system and camera external references will be described.
The pixel coordinate system is a two-dimensional coordinate system measured in pixels: the first pixel at the upper-left corner of the image is the origin, the horizontal rightward direction is the u axis, and the vertical downward direction is the v axis.
In the embodiment of the invention, the vehicle reference coordinate system takes the ground projection of the center point of the vehicle's rear axle as the origin, the direction perpendicular to the ground as the z axis, the direction straight ahead of the vehicle as the x axis, and the direction perpendicular to the x axis and pointing to the left side of the vehicle as the y axis.
The camera external parameters include a rotation matrix determined by 3 rotation parameters about the x, y and z axes and a translation matrix containing 3 translation parameters along the x, y and z axes. In the embodiment of the invention, the dynamically calibrated camera external parameters are these 3 rotation parameters and 3 translation parameters.
The rotation matrix describes the orientation of the axes of the vehicle reference coordinate system relative to the camera coordinate axes, and the translation matrix describes the position of the spatial origin in the camera coordinate system. Together, the rotation matrix and the translation matrix describe how to convert points from the vehicle reference coordinate system to the camera coordinate system.
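For illustration only, the following sketch shows how the 3 rotation parameters and 3 translation parameters can be assembled into the rotation matrix and translation matrix described above; the x-y-z (roll-pitch-yaw) rotation order, the use of radians, and the Python/numpy form are assumptions of the sketch, as the disclosure does not prescribe a rotation convention.

```python
# Illustrative sketch (not part of the original disclosure): building R and T
# from the 3 rotation parameters and 3 translation parameters, assuming an
# x-y-z (roll-pitch-yaw) Euler convention with angles in radians.
import numpy as np

def extrinsics_from_params(rx, ry, rz, tx, ty, tz):
    """Return the 3x3 rotation matrix R and 3x1 translation vector T."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx                   # vehicle reference frame -> camera frame
    T = np.array([[tx], [ty], [tz]])   # vehicle origin expressed in the camera frame
    return R, T
```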
Fig. 1 is a flowchart of a camera external parameter dynamic calibration method according to an embodiment of the present invention, and the technical scheme of the present invention is described in detail below with reference to fig. 1.
Step 101, a camera acquires images of a fixed area in front of a vehicle according to a preset time interval, obtains a plurality of frames of first images and sends the first images to a lane line detection module;
Specifically, the camera is mounted at the middle of the front end of the vehicle so that its lens faces directly ahead of the vehicle, i.e., on the y-axis of the vehicle coordinate system, and the translation of the camera relative to the origin of the vehicle reference coordinate system can be obtained by measuring with a graduated scale and from the vehicle model. The camera captures images of a fixed area in front of the vehicle at a preset time interval, and the image captured at each time point is regarded as one frame of first image. Each frame of first image has an image ID.
Step 102, the lane line detection module inputs each frame of first image into a deep learning model and obtains the lane line information output for each frame of first image;
Specifically, each frame of first image is input into a trained deep learning model, and the deep learning model identifies and semantically divides each frame of first image to obtain lane line information. The lane line information includes an image ID, a lane line ID of each lane line, and pixel coordinates corresponding to the lane line ID. That is, one lane line information corresponds to one frame of the first image, and each lane line information includes a plurality of lane lines.
FIG. 2 is a schematic diagram of lane lines in the image coordinate system according to an embodiment of the present invention, where each lane line appears as a set of discrete points.
Step 103, determining whether the vehicle is in a straight running state;
Specifically, the lane line detection module obtains a vehicle speed and a steering wheel angle from a controller of the vehicle, and judges whether the vehicle is in a straight running state according to the vehicle speed and the steering wheel angle.
When the vehicle speed is less than the preset vehicle speed and/or the steering wheel angle is greater than or equal to the preset angle, the lane line detection module judges that the vehicle is not in a straight running state, and step 130 is executed; when the vehicle speed is greater than or equal to the preset vehicle speed and the steering wheel angle is smaller than the preset angle, the lane line detection module determines that the vehicle is in a straight running state, and step 104 is executed.
Step 130, the lane line detection module stops sending the lane line information to the storage module;
specifically, when the vehicle is not in a straight running state, the camera external parameters are not dynamically calibrated.
Step 104, the lane line detection module sends the lane line information to the storage module;
specifically, when the vehicle is in a straight running state, dynamic calibration of camera external parameters is performed based on lane line information detected by the lane line detection module.
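A minimal sketch of the straight-driving judgment of steps 103, 130 and 104 is given below; the speed and angle units (km/h and degrees) and the threshold values SPEED_MIN and ANGLE_MAX are illustrative placeholders rather than values specified by the disclosure.

```python
# Minimal sketch of the straight-driving check; SPEED_MIN and ANGLE_MAX are
# placeholder thresholds, not values from the patent.
SPEED_MIN = 10.0   # preset vehicle speed (km/h, assumed)
ANGLE_MAX = 5.0    # preset steering wheel angle (degrees, assumed)

def is_driving_straight(speed_kmh: float, steering_angle_deg: float) -> bool:
    """Return True when lane line information should be sent to the storage module."""
    # Not straight when the vehicle is too slow and/or the wheel is turned too far.
    if speed_kmh < SPEED_MIN or abs(steering_angle_deg) >= ANGLE_MAX:
        return False
    return True
```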
Step 105, the storage module stores and counts the storage quantity of the lane line information;
Specifically, the storage module is a buffer space of fixed size. When the buffer space is exceeded, the storage module deletes a fixed number of the earliest stored lane line information entries to free the buffer space.
And step 106, judging whether the storage quantity reaches a first preset quantity.
When the number of lane line information in the storage module reaches the first preset number, step 107 is performed. When the number of lane line information in the storage module does not reach the first preset number, step 105 is executed again.
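The following is a rough sketch of the storage module behavior of steps 105 and 106; BUFFER_SIZE, RELEASE_COUNT and FIRST_PRESET_NUMBER are placeholder values, and the list-based buffer is only one possible realization.

```python
# Rough sketch of the storage module: a fixed-size buffer that releases the
# earliest entries when full and reports when the first preset number is reached.
BUFFER_SIZE = 200          # fixed size of the buffer space (placeholder)
RELEASE_COUNT = 50         # earliest entries deleted when the buffer is full (placeholder)
FIRST_PRESET_NUMBER = 100  # first preset number (placeholder)

storage: list[dict] = []

def store_lane_line_info(info: dict) -> bool:
    """Store one frame of lane line information; return True once the stored
    count reaches the first preset number (step 106)."""
    if len(storage) >= BUFFER_SIZE:
        del storage[:RELEASE_COUNT]    # free buffer space by dropping the oldest entries
    storage.append(info)
    return len(storage) >= FIRST_PRESET_NUMBER
```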
Step 107, the calibration module selects a second preset number of lane line information from the storage module;
Specifically, the calibration module randomly selects a second preset number of lane line information to reduce the influence of the lane line information of an abnormal frame on the external parameters of the camera. Wherein the second preset number is smaller than the first preset number.
Step 108, the calibration module takes any two lane line IDs in each of the second preset number of lane line information as selected lane line IDs, and calculates the position coordinates of multiple groups of intersection points of the selected lane lines with the preset calibration lines according to the pixel coordinates corresponding to the selected lane line IDs and multiple preset ordinates;
Specifically, the difference between every two adjacent preset ordinates is a fixed value, which may be changed as needed. Each lane line information contains multiple lane line IDs; the calibration module selects any two lane line IDs in each lane line information as the selected lane line IDs, and determines a corresponding group of intersection points on the two selected lane lines for each of the multiple preset ordinates. Each group of intersection points comprises the two intersection points of the two lane lines corresponding to the selected lane line IDs on one preset ordinate.
FIG. 3 is a schematic diagram of calibration lines in the image coordinate system provided by an embodiment of the present invention, and FIG. 4 is a schematic diagram of calibration lines in the vehicle reference coordinate system provided by an embodiment of the present invention. In FIG. 3, the two selected lane lines intersect the multiple ordinate lines in multiple groups of intersection points, which, after conversion into the vehicle coordinate system as shown in FIG. 4, are the groups A1A2, B1B2, C1C2, D1D2, E1E2 and F1F2.
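One possible realization of the intersection computation of step 108 is sketched below; obtaining the abscissa of each lane line at a preset ordinate by linear interpolation over its detected pixel points is an assumption of the sketch, not a method prescribed by the disclosure.

```python
# Sketch of the intersection computation: the abscissa of a lane line at each
# preset ordinate is obtained by linear interpolation of its detected pixel points.
import numpy as np

def intersections_with_ordinates(lane_points, preset_ordinates):
    """lane_points: (N, 2) array of (u, v) pixel coordinates of one selected lane line.
    Returns an (M, 2) array with one (u, v) intersection per preset ordinate."""
    pts = np.asarray(lane_points, dtype=float)
    order = np.argsort(pts[:, 1])                 # sort by v so interpolation is valid
    v_sorted, u_sorted = pts[order, 1], pts[order, 0]
    u_at = np.interp(preset_ordinates, v_sorted, u_sorted)
    return np.stack([u_at, np.asarray(preset_ordinates, dtype=float)], axis=1)

# Each group pairs the two selected lane lines on the same preset ordinate:
# groups = list(zip(intersections_with_ordinates(lane_a, ordinates),
#                   intersections_with_ordinates(lane_b, ordinates)))
```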
Step 109, the calibration module obtains the relative position coordinates of a plurality of groups of intersection points in a vehicle reference coordinate system according to the preset conversion coefficient and the position coordinates of a plurality of groups of intersection points, and calculates the relative distance between two intersection points in each group of intersection points according to the relative position coordinates of each group of intersection points;
Specifically, in the image coordinate system the two selected lane lines are not parallel; to determine whether they are actually parallel, the multiple groups of intersection points obtained in the previous step need to be converted into the vehicle reference coordinate system, and the relative distance between the two intersection points in each group is then calculated from their relative position coordinates in the vehicle reference coordinate system.
The formula for converting the vehicle reference coordinate system into the pixel coordinate system is as follows:
Zc · [u, v, 1]^T = K · [R T] · [Xw, Yw, Zw, 1]^T  (expression 1)
conversion coefficient Q = Zc / (K · [R T])  (expression 2)
wherein Zc is the scaling factor, K is the camera internal parameter (intrinsic) matrix, R is the rotation matrix, T is the translation matrix, (u, v) are pixel coordinates, and (Xw, Yw, Zw) are coordinates in the vehicle reference coordinate system.
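As one way to carry out the conversion of step 109, the sketch below back-projects a pixel onto the ground plane z = 0 of the vehicle reference coordinate system using the standard pinhole relation of expression 1; the ground-plane (homography) assumption is an illustrative choice of the sketch, not the literal wording of expression 2.

```python
# Sketch of the pixel-to-vehicle conversion, assuming the intersection points lie
# on the ground plane z = 0 of the vehicle reference coordinate system.
import numpy as np

def pixel_to_vehicle_ground(uv, K, R, T):
    """uv: (u, v) pixel coordinates. K: 3x3 intrinsics. R, T: extrinsics.
    Returns (x, y) on the ground plane of the vehicle reference coordinate system."""
    # For z = 0, expression 1 reduces to  Zc * [u, v, 1]^T = H * [x, y, 1]^T
    H = K @ np.column_stack([R[:, 0], R[:, 1], np.ravel(T)])
    xy1 = np.linalg.solve(H, np.array([uv[0], uv[1], 1.0]))
    return xy1[:2] / xy1[2]                        # divide out the scale factor

def relative_distance(p1_uv, p2_uv, K, R, T):
    """Relative distance of one group of intersection points in the vehicle frame."""
    p1 = pixel_to_vehicle_ground(p1_uv, K, R, T)
    p2 = pixel_to_vehicle_ground(p2_uv, K, R, T)
    return float(np.linalg.norm(p1 - p2))
```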
Step 110, standard deviation calculation is carried out according to the relative distance between each group of intersection points in one lane line information, and standard deviation parameters corresponding to the lane line information are obtained;
specifically, standard deviation calculation is performed through the relative distance between each group of intersection points in one lane line information, so as to judge whether two selected lane lines in the lane line information are parallel. The standard deviation parameter has a corresponding image ID.
The calculation formula of the standard deviation parameter is as follows:
COST = sqrt( ( (D(A1,A2) - D̄)^2 + (D(B1,B2) - D̄)^2 + (D(C1,C2) - D̄)^2 + (D(D1,D2) - D̄)^2 + (D(E1,E2) - D̄)^2 + (D(F1,F2) - D̄)^2 ) / 6 )
wherein D(A1,A2), D(B1,B2), D(C1,C2), D(D1,D2), D(E1,E2), D(F1,F2) are the relative distances of the groups of intersection points, D̄ is the average of the 6 relative distances, and COST is the standard deviation of the 6 relative distances.
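A short sketch of the COST calculation of step 110, following the formula reconstructed above:

```python
# Sketch of the COST (standard deviation parameter) for one frame of lane line information.
import numpy as np

def cost_from_distances(distances):
    """distances: relative distances D(A1,A2) ... D(F1,F2) of the groups of intersection points."""
    d = np.asarray(distances, dtype=float)
    d_mean = d.mean()                                   # average of the relative distances
    return float(np.sqrt(np.mean((d - d_mean) ** 2)))   # their (population) standard deviation

# Parallel lane lines give equal spacing on every calibration line, so COST == 0.
```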
Step 111, the calibration module determines whether to adjust the external parameters of the camera according to the comparison result of the average value of the second preset number of standard deviation parameters and the preset optimal parameters;
Specifically, when the relative distances of all groups of intersection points in one lane line information are identical, the two selected lane lines are judged to be parallel and the standard deviation parameter is 0. Conversely, whenever the standard deviation parameter is not 0, the two selected lane lines are not parallel.
The calibration module calculates the mean of the second preset number of standard deviation parameters, compares the result with the preset optimal parameter, and determines from the comparison whether the camera external parameters need to be adjusted. The optimal parameter may be 0 or an interval.
And when the mean value of the standard deviation parameters is not equal to the preset optimal parameters, the calibration module adjusts the external parameters according to a gradient descent method or a grid search method to obtain the adjusted external parameters. The calibration module performs multiplication calculation according to the preset camera internal parameters and the adjusted external parameters, assigns calculation results to preset conversion coefficients, and executes the camera external parameters dynamic calibration method again.
The gradient descent method searches for the minimum along the direction of steepest descent of the standard deviation parameter; the step length and the maximum number of iterations can be set as required, and iterating the gradient descent yields the minimum in each direction. The minimum in each direction corresponds to one of the camera external parameters, which gives the adjusted camera external parameters.
The grid search method evaluates the COST for every combination of the external parameters generated from a preset step length and from upper and lower adjustment bounds for each parameter, and then selects the combination of camera external parameters with the best result as the adjusted external parameters.
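The sketch below illustrates the grid search variant of the adjustment in step 111, reusing extrinsics_from_params, relative_distance and cost_from_distances from the earlier sketches; the parameter ranges and step sizes are placeholders, and a gradient descent implementation could be substituted.

```python
# Grid-search sketch for adjusting the external parameters (step 111).
import numpy as np
from itertools import product

def grid_search_extrinsics(frames, K, base_params, half_widths, steps):
    """frames: per-frame lists of intersection-point groups [(p1_uv, p2_uv), ...].
    base_params: current (rx, ry, rz, tx, ty, tz).
    half_widths, steps: search half-width and step per parameter (placeholders).
    Returns the parameter combination with the smallest mean COST."""
    axes = [np.arange(p - w, p + w + 1e-9, s)
            for p, w, s in zip(base_params, half_widths, steps)]
    best_params, best_cost = tuple(base_params), float("inf")
    for candidate in product(*axes):                       # all parameter combinations
        R, T = extrinsics_from_params(*candidate)
        per_frame_costs = []
        for groups in frames:
            dists = [relative_distance(p1, p2, K, R, T) for p1, p2 in groups]
            per_frame_costs.append(cost_from_distances(dists))
        mean_cost = float(np.mean(per_frame_costs))
        if mean_cost < best_cost:                          # keep the best combination
            best_params, best_cost = tuple(candidate), mean_cost
    return best_params, best_cost
```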
When the mean of the standard deviation parameters equals the preset value, the current camera external parameters match the current running state of the vehicle, and the calibration module keeps the preset conversion coefficient unchanged.
After the conversion coefficient is determined, the pixel coordinates of the target in the image coordinate system can be converted into the relative position coordinates of the target in the vehicle reference coordinate system, so that the position of the target relative to the vehicle can be obtained, and obstacle avoidance or route planning of the vehicle can be realized more accurately.
Specifically, the camera sends a plurality of frames of first images to the target detection module. The target detection module performs recognition processing on the first image according to a machine vision technology or an artificial neural network technology to obtain target information, and sends the target information to the calibration module. The target information comprises a target ID and pixel coordinates corresponding to the target ID. And the calibration module obtains the relative position coordinates of the target according to the conversion coefficient and the pixel coordinates of the target.
FIG. 5 is a schematic diagram of a target in the vehicle reference coordinate system according to an embodiment of the present invention, where the relative position coordinates of the target are its coordinates relative to the vehicle in the vehicle reference coordinate system. The target may be another vehicle currently traveling on the road, a pedestrian, or any other obstacle affecting the travel of the vehicle.
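For completeness, a small sketch of the target conversion just described, reusing the ground-plane back-projection from the earlier sketch; the returned field names are illustrative only.

```python
# Sketch of converting a detected target's pixel coordinates into relative position
# coordinates in the vehicle reference coordinate system.
def target_relative_position(target_uv, K, R, T):
    """Relative position of a target (vehicle, pedestrian, other obstacle),
    assuming it stands on the ground plane."""
    x_forward, y_left = pixel_to_vehicle_ground(target_uv, K, R, T)
    return {"x_forward": float(x_forward), "y_left": float(y_left)}
```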
The camera external parameter dynamic calibration method provided by the invention can realize initial external parameter calibration in a vehicle off-line state, and can also carry out dynamic camera external parameter calibration in real time in the running process of the vehicle. Especially, under the condition that the vehicle bumps or goes up and down the ramp, the external parameter calibration parameters of the camera can be automatically corrected, the position accuracy of the visual detection target is improved, and the safety coefficient of vehicle running is further improved.
Those of skill would further appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the elements and steps of the examples have been described generally in terms of functionality in the foregoing description to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The methods or steps of the methods described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may be disposed in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The foregoing embodiments are provided to illustrate the general principles of the invention and are not meant to limit the scope of the invention to the particular embodiments described; any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the invention are intended to be included within its scope.

Claims (7)

1. The camera external parameter dynamic calibration method is characterized by comprising the following steps of:
The camera acquires images of a fixed area in front of the vehicle according to a preset time interval, acquires a plurality of frames of first images and sends the first images to the lane line detection module; each frame of the first image has an image ID;
The lane line detection module inputs each frame of the first image into a deep learning model to obtain lane line information of each frame of the first image; the lane line information comprises an image ID, a lane line ID of each lane line and pixel coordinates corresponding to the lane line ID;
Determining whether the vehicle is in a straight-ahead state;
When the vehicle is in a straight running state, the lane line detection module sends the lane line information to the storage module; when the number of the lane line information in the storage module reaches a first preset number, the calibration module selects a second preset number of lane line information from the storage module; the second preset number is smaller than the first preset number;
The calibration module takes any two lane line IDs in each of the second preset number of lane line information as selected lane line IDs, and calculates the position coordinates of multiple groups of intersection points of the selected lane lines with preset calibration lines according to the pixel coordinates corresponding to the selected lane line IDs and multiple preset ordinates; each group of intersection points comprises the two intersection points of the two lane lines corresponding to the selected lane line IDs on one preset ordinate;
The calibration module obtains the relative position coordinates of the plurality of groups of intersection points in a vehicle reference coordinate system according to a preset conversion coefficient and the position coordinates of the plurality of groups of intersection points, and calculates the relative distance between two intersection points in each group of intersection points according to the relative position coordinates of each group of intersection points; carrying out standard deviation calculation according to the relative distance between each group of intersection points in one lane line information to obtain standard deviation parameters corresponding to the lane line information; the standard deviation parameter has a corresponding image ID;
The calibration module determines whether to adjust the external parameters of the camera according to the comparison result of the average value of the second preset number of standard deviation parameters and preset optimal parameters;
When the mean value of the standard deviation parameters is not equal to a preset optimal parameter, the calibration module adjusts the external parameters according to a gradient descent method or a grid search method to obtain adjusted external parameters;
and the calibration module performs multiplication calculation according to a preset camera internal parameter and the adjusted external parameter, assigns a calculation result to the preset conversion coefficient, and executes the camera external parameter dynamic calibration method again.
2. The method according to claim 1, wherein the calibration module determines whether to adjust the external parameters of the camera according to a comparison result between the average value of the second preset number of standard deviation parameters and a preset optimal parameter, and the method further comprises:
and when the mean value of the standard deviation parameters is equal to a preset numerical value, the calibration module determines that the preset conversion coefficient is unchanged.
3. The camera external parameter dynamic calibration method according to claim 2, wherein after the camera performs image acquisition on a fixed area in front of a vehicle according to a preset time interval to obtain a plurality of frames of first images, the dynamic calibration method further comprises:
The camera sends the multi-frame first image to a target detection module;
The target detection module performs recognition processing on the first image according to a machine vision technology or an artificial neural network technology to obtain target information and sends the target information to the calibration module; the target information comprises a target ID and pixel coordinates corresponding to the target ID.
4. The camera external parameter dynamic calibration method according to claim 3, further comprising:
and the calibration module obtains the relative position coordinates of the target according to the conversion coefficient and the pixel coordinates of the target.
5. The camera external parameter dynamic calibration method according to claim 1, wherein the determining whether the vehicle is in a straight-ahead state specifically comprises:
The lane line detection module acquires a vehicle speed and a steering wheel angle from a controller of the vehicle, and judges whether the vehicle is in a straight running state according to the vehicle speed and the steering wheel angle.
6. The camera external parameter dynamic calibration method according to claim 5, wherein when the vehicle speed is less than a preset vehicle speed and/or the steering wheel angle is greater than or equal to a preset angle, the lane line detection module judges that the vehicle is not in a straight running state, and stops sending the lane line information to the storage module;
and when the vehicle speed is greater than or equal to a preset vehicle speed and the steering wheel angle is smaller than a preset angle, the lane line detection module judges that the vehicle is in a straight running state.
7. The camera external parameter dynamic calibration method according to claim 1, wherein after the lane line detection module sends the lane line information to a storage module, the dynamic calibration method further comprises:
And the storage module stores and counts the storage quantity of the lane line information and judges whether the storage quantity reaches a first preset quantity or not.
CN202010700766.6A 2020-07-20 2020-07-20 Camera external parameter dynamic calibration method Active CN112509054B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010700766.6A CN112509054B (en) 2020-07-20 2020-07-20 Camera external parameter dynamic calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010700766.6A CN112509054B (en) 2020-07-20 2020-07-20 Camera external parameter dynamic calibration method

Publications (2)

Publication Number Publication Date
CN112509054A CN112509054A (en) 2021-03-16
CN112509054B (en) 2024-05-17

Family

ID=74953512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010700766.6A Active CN112509054B (en) 2020-07-20 2020-07-20 Camera external parameter dynamic calibration method

Country Status (1)

Country Link
CN (1) CN112509054B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114049404B (en) * 2022-01-12 2022-04-05 深圳佑驾创新科技有限公司 Method and device for calibrating internal phase and external phase of vehicle
CN114419165B (en) * 2022-01-17 2024-01-12 北京百度网讯科技有限公司 Camera external parameter correction method, camera external parameter correction device, electronic equipment and storage medium
CN114663529B (en) * 2022-03-22 2023-08-01 阿波罗智能技术(北京)有限公司 External parameter determining method and device, electronic equipment and storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106558080A (en) * 2016-11-14 2017-04-05 天津津航技术物理研究所 Join on-line proving system and method outside a kind of monocular camera
CN108216229A (en) * 2017-09-08 2018-06-29 北京市商汤科技开发有限公司 The vehicles, road detection and driving control method and device
CN109191531A (en) * 2018-07-30 2019-01-11 深圳市艾为智能有限公司 A kind of automatic outer ginseng scaling method of the rear in-vehicle camera based on lane detection
CN109389650A (en) * 2018-09-30 2019-02-26 京东方科技集团股份有限公司 A kind of scaling method of in-vehicle camera, device, vehicle and storage medium
CN109544633A (en) * 2017-09-22 2019-03-29 华为技术有限公司 Target ranging method, device and equipment
DE102017218722A1 (en) * 2017-10-19 2019-04-25 Robert Bosch Gmbh Environment detection system for detecting an environment of a vehicle and method for detecting an environment of a vehicle
CN109859278A (en) * 2019-01-24 2019-06-07 惠州市德赛西威汽车电子股份有限公司 The scaling method and calibration system joined outside in-vehicle camera system camera
CN110148177A (en) * 2018-02-11 2019-08-20 百度在线网络技术(北京)有限公司 For determining the method, apparatus of the attitude angle of camera, calculating equipment, computer readable storage medium and acquisition entity
CN110264525A (en) * 2019-06-13 2019-09-20 惠州市德赛西威智能交通技术研究院有限公司 A kind of camera calibration method based on lane line and target vehicle
CN110322702A (en) * 2019-07-08 2019-10-11 中原工学院 A kind of Vehicular intelligent speed-measuring method based on Binocular Stereo Vision System
CN110415298A (en) * 2019-07-22 2019-11-05 昆山伟宇慧创智能科技有限公司 A kind of calculation method for deviation
CN110412603A (en) * 2019-07-22 2019-11-05 昆山伟宇慧创智能科技有限公司 A kind of calibrating parameters adaptive updates method calculated for deviation
CN110782497A (en) * 2019-09-06 2020-02-11 腾讯科技(深圳)有限公司 Method and device for calibrating external parameters of camera
CN111179345A (en) * 2019-12-27 2020-05-19 大连海事大学 Method and system for automatically detecting violation behaviors of crossing lines of front vehicle based on vehicle-mounted machine vision

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10025317B2 (en) * 2016-09-30 2018-07-17 Faraday&Future Inc. Methods and systems for camera-based autonomous parking
US10769793B2 (en) * 2018-04-17 2020-09-08 Baidu Usa Llc Method for pitch angle calibration based on 2D bounding box and its 3D distance for autonomous driving vehicles (ADVs)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106558080A (en) * 2016-11-14 2017-04-05 天津津航技术物理研究所 Join on-line proving system and method outside a kind of monocular camera
CN108216229A (en) * 2017-09-08 2018-06-29 北京市商汤科技开发有限公司 The vehicles, road detection and driving control method and device
CN109544633A (en) * 2017-09-22 2019-03-29 华为技术有限公司 Target ranging method, device and equipment
DE102017218722A1 (en) * 2017-10-19 2019-04-25 Robert Bosch Gmbh Environment detection system for detecting an environment of a vehicle and method for detecting an environment of a vehicle
CN110148177A (en) * 2018-02-11 2019-08-20 百度在线网络技术(北京)有限公司 For determining the method, apparatus of the attitude angle of camera, calculating equipment, computer readable storage medium and acquisition entity
CN109191531A (en) * 2018-07-30 2019-01-11 深圳市艾为智能有限公司 A kind of automatic outer ginseng scaling method of the rear in-vehicle camera based on lane detection
CN109389650A (en) * 2018-09-30 2019-02-26 京东方科技集团股份有限公司 A kind of scaling method of in-vehicle camera, device, vehicle and storage medium
CN109859278A (en) * 2019-01-24 2019-06-07 惠州市德赛西威汽车电子股份有限公司 The scaling method and calibration system joined outside in-vehicle camera system camera
CN110264525A (en) * 2019-06-13 2019-09-20 惠州市德赛西威智能交通技术研究院有限公司 A kind of camera calibration method based on lane line and target vehicle
CN110322702A (en) * 2019-07-08 2019-10-11 中原工学院 A kind of Vehicular intelligent speed-measuring method based on Binocular Stereo Vision System
CN110415298A (en) * 2019-07-22 2019-11-05 昆山伟宇慧创智能科技有限公司 A kind of calculation method for deviation
CN110412603A (en) * 2019-07-22 2019-11-05 昆山伟宇慧创智能科技有限公司 A kind of calibrating parameters adaptive updates method calculated for deviation
CN110782497A (en) * 2019-09-06 2020-02-11 腾讯科技(深圳)有限公司 Method and device for calibrating external parameters of camera
CN111179345A (en) * 2019-12-27 2020-05-19 大连海事大学 Method and system for automatically detecting violation behaviors of crossing lines of front vehicle based on vehicle-mounted machine vision

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Automatic on-the-fly extrinsic camera calibration of onboard vehicular cameras; M. B. De Paula et al.; Expert Systems with Applications: An International Journal (ACM); 20140301; Vol. 41 (No. 4); 1997-2007 *
Online Extrinsic Parameters Calibration for Stereovision Systems Used in Far-Range Detection Vehicle Applications; Sergiu Nedevschi et al.; IEEE Transactions on Intelligent Transportation Systems; 20071206; Vol. 8 (No. 4); 651-660 *
A method for calibrating the extrinsic parameters between a pinhole camera and a 3D lidar; Han Zhengyong et al.; Transducer and Microsystem Technologies; 20180420; Vol. 37 (No. 04); 9-12+16 *
An efficient extrinsic calibration method between a 3D lidar and a camera; Liu Jinyue et al.; Chinese Journal of Scientific Instrument; 20191130; Vol. 40 (No. 11); 64-72 *

Also Published As

Publication number Publication date
CN112509054A (en) 2021-03-16

Similar Documents

Publication Publication Date Title
CN112509054B (en) Camera external parameter dynamic calibration method
CN110942449B (en) Vehicle detection method based on laser and vision fusion
US10860870B2 (en) Object detecting apparatus, object detecting method, and computer program product
CN110765922B (en) Binocular vision object detection obstacle system for AGV
US9311711B2 (en) Image processing apparatus and image processing method
CN109002039B (en) Obstacle avoidance reminding method, related device and computer readable storage medium
CN112698302B (en) Sensor fusion target detection method under bumpy road condition
CN109345593B (en) Camera posture detection method and device
CN105206109B (en) A kind of vehicle greasy weather identification early warning system and method based on infrared CCD
JP6233345B2 (en) Road surface gradient detector
CN107577996A (en) A kind of recognition methods of vehicle drive path offset and system
CN109752701A (en) A kind of road edge detection method based on laser point cloud
US10922817B2 (en) Perception device for obstacle detection and tracking and a perception method for obstacle detection and tracking
CN111967360B (en) Target vehicle posture detection method based on wheels
CN109074490A (en) Path detection method, related device and computer readable storage medium
CN109753841B (en) Lane line identification method and device
EP2960858A1 (en) Sensor system for determining distance information based on stereoscopic images
US20190102898A1 (en) Method and apparatus for monitoring region around vehicle
CN114179788B (en) Automatic parking method, system, computer readable storage medium and vehicle terminal
CN108470142B (en) Lane positioning method based on inverse perspective projection and lane distance constraint
CN109887273A (en) A kind of bridge mobile load Optimum Identification Method based on multi-source redundancy
CN115503747A (en) Road condition identification and reminding system based on intelligent automobile steer-by-wire system
CN105300390B (en) The determination method and device of obstructing objects movement locus
CN115082532B (en) Ship collision prevention method for river-crossing transmission line based on laser radar
CN113593035A (en) Motion control decision generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20220308

Address after: 401122 No.1, 1st floor, building 3, No.21 Yunzhu Road, Yubei District, Chongqing

Applicant after: Chongqing landshipu Information Technology Co.,Ltd.

Address before: B4-006, maker Plaza, 338 East Street, Huilongguan town, Changping District, Beijing 100096

Applicant before: Beijing Idriverplus Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant