CN116823958A - Yaw angle estimation method, device, equipment, vehicle and medium of vehicle-mounted camera

Info

Publication number
CN116823958A
Authority
CN
China
Prior art keywords
lane line
yaw angle
vehicle
lane
time
Prior art date
Legal status
Pending
Application number
CN202310429708.8A
Other languages
Chinese (zh)
Inventor
王明亮
陈文洋
常松涛
Current Assignee
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Apollo Zhilian Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Apollo Zhilian Beijing Technology Co Ltd filed Critical Apollo Zhilian Beijing Technology Co Ltd
Priority to CN202310429708.8A
Publication of CN116823958A

Landscapes

  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure provides a method, apparatus, device, vehicle and medium for yaw angle estimation of an onboard camera, relating to the field of data processing and in particular to intelligent driving and autonomous vehicles. The specific implementation scheme is as follows: during the running process of the vehicle, a first lane image is acquired at a first moment based on the vehicle-mounted camera, and a second lane image is acquired at a second moment based on the vehicle-mounted camera; a first lane line detected from the first lane image and a second lane line detected from the second lane image are acquired; when the first lane line and the second lane line meet the calibration conditions, a yaw angle estimate of the vehicle-mounted camera is determined based on a mapping relationship between the first lane line and the second lane line, where the mapping relationship maps detection points on the first lane line onto the second lane line. According to the embodiments of the present disclosure, neither manual calibration nor an additional sensor is needed, and the yaw angle estimation can be completed automatically while the vehicle is driving.

Description

Yaw angle estimation method, device, equipment, vehicle and medium of vehicle-mounted camera
Technical Field
The disclosure relates to the technical field of data processing, and in particular to the technical fields of intelligent driving, autonomous driving, vehicles, and the like.
Background
With the development of automobile technology, intelligent driving is used ever more widely. The vehicle-mounted camera is one of the most important sensors in intelligent driving and is responsible for detecting lane lines, obstacles, and the like. Due to external disturbances, deformation of the vehicle body, replacement of the vehicle-mounted camera or its accessories, and similar factors, the extrinsic parameters of the vehicle-mounted camera change accordingly, i.e., the extrinsic parameters drift. If the vehicle-mounted camera is then used to detect lane lines, the vehicle may drift left or right while driving, and a severe deviation may even cause a safety accident. Ensuring the accuracy of the camera extrinsic parameters is therefore very important.
Disclosure of Invention
The present disclosure provides a method, apparatus, device, vehicle and storage medium for yaw angle estimation of an onboard camera.
According to an aspect of the present disclosure, there is provided a yaw angle estimation method of an in-vehicle camera, including:
during the running process of the vehicle, acquiring a first lane image at a first moment based on the vehicle-mounted camera, and acquiring a second lane image at a second moment based on the vehicle-mounted camera;
acquiring a first lane line detected from the first lane image and a second lane line detected from the second lane image, wherein the first lane line and the second lane line both represent the same target lane line;
under the condition that the first lane line and the second lane line meet the calibration condition, determining a yaw angle estimated value of the vehicle-mounted camera based on a mapping relation between the first lane line at the first moment and the second lane line at the second moment; the mapping relationship is used for mapping the detection points on the first lane line to the second lane line.
According to another aspect of the present disclosure, there is provided a yaw angle estimation device of an in-vehicle camera, including:
the image acquisition module is used for acquiring a first lane image at a first moment based on the vehicle-mounted camera and acquiring a second lane image at a second moment based on the vehicle-mounted camera during the running process of the vehicle;
the lane line acquisition module is used for acquiring a first lane line detected from a first lane image and a second lane line detected from a second lane image, wherein the first lane line and the second lane line both represent the same target lane line;
the estimating module is used for determining a yaw angle estimated value of the vehicle-mounted camera based on the mapping relation between the first lane line at the first moment and the second lane line at the second moment under the condition that the first lane line and the second lane line meet the calibration condition; the mapping relationship is used for mapping the detection points on the first lane line to the second lane line.
According to another aspect of the present disclosure, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the embodiments of the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform a method according to any one of the embodiments of the present disclosure.
According to another aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method according to any of the embodiments of the present disclosure.
According to another aspect of the present disclosure, there is provided a vehicle including the aforementioned electronic device.
According to the embodiment of the disclosure, manual calibration is not needed, an additional sensor is not needed, and the estimation of the yaw angle can be automatically completed in the running process of the vehicle.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a flow diagram of a yaw angle estimation method of an in-vehicle camera according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of lane line detection results according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of estimating a yaw angle based on scheme 1 according to an embodiment of the present disclosure;
FIG. 4 is another flow diagram of a yaw angle estimation method of an in-vehicle camera according to an embodiment of the present disclosure;
FIG. 5 is a schematic view of estimating a yaw angle based on scheme 2 according to an embodiment of the present disclosure;
FIG. 6 is a frame diagram of estimating a yaw angle according to an embodiment of the present disclosure;
FIG. 7 is another flow diagram of a yaw angle estimation method of an in-vehicle camera according to an embodiment of the present disclosure;
FIG. 8 is a block diagram of a yaw angle estimation device for implementing an in-vehicle camera of an embodiment of the present disclosure;
fig. 9 is a block diagram of an electronic device used to implement a yaw angle estimation method of an in-vehicle camera of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Among the extrinsic parameters of a vehicle-mounted camera, the yaw angle is particularly important for lane line detection. When the yaw angle drifts, the vehicle may pull to the left or right, and a large drift affects driving safety.
In the related art, when the yaw angle drifts, the vehicle is sent to a repair shop, where the camera extrinsic parameters are calibrated manually to correct the yaw angle. However, this approach requires manual work and offers little automation; the user faces a long wait, and correcting the yaw angle is inefficient.
In another scheme, sensors such as lidar are combined with the vehicle-mounted camera for calibration, which makes it possible to check whether the camera extrinsic parameters have drifted. Although this scheme does not rely on manual intervention, it requires additional sensors, which are costly and unsuitable for purely vision-based autonomous driving solutions.
In view of this, the present disclosure provides a yaw angle estimation method for a vehicle-mounted camera. The method completes the yaw angle estimation while the vehicle is driving, without any additional sensor. As shown in fig. 1, the flow of the method includes:
s101, acquiring a first lane image at a first moment based on an on-board camera and acquiring a second lane image at a second moment based on the on-board camera during running of the vehicle.
That is, in the running process of the vehicle, image acquisition can be performed on a running lane in real time based on any vehicle-mounted camera, so as to obtain lane images corresponding to each acquisition time. In the embodiment of the disclosure, for facilitating understanding of the scheme, a first lane image and a second lane image acquired at two moments respectively are described.
After the lane images are obtained, lane line detection can be performed on each frame, so that lane lines are extracted from each frame of lane image. In implementation, the lane images can be acquired in real time and lane line detection performed on them in real time, so that, for the same physical lane line, corresponding lane lines are detected in different lane images. The detected lane lines may be stored for use in the subsequent processing flow.
S102, acquiring a first lane line detected from a first lane image and a second lane line detected from a second lane image, wherein the first lane line and the second lane line both represent the same target lane line.
In this embodiment of the present disclosure, the first lane line and the second lane line may be read from the lane line storage space; alternatively, after the first lane image and the second lane image are collected, lane line detection may be performed on the two frames separately to obtain the first lane line and the second lane line. To ensure the accuracy of the yaw angle estimation, the first lane line and the second lane line are detections of the same actual lane line, i.e., the target lane line.
S103, under the condition that the first lane line and the second lane line meet the calibration conditions, determining a yaw angle estimated value of the vehicle-mounted camera based on a mapping relation between the first lane line at the first moment and the second lane line at the second moment; the mapping relationship is used to map detection points on a first lane line to a second lane line.
If the first moment is earlier than the second moment, the lane line detected at the earlier moment is mapped onto the lane line detected at the later moment; if the first moment is later than the second moment, the lane line detected at the later moment is mapped onto the lane line detected at the earlier moment. In both cases, the first lane line is mapped onto the second lane line.
Different lane lines detected for the same target lane line have a mapping relationship between them; here, the mapping relationship is defined as mapping the first lane line onto the second lane line, from which the yaw angle can be solved.
The mapping relationship contains the yaw angle to be estimated; by requiring that the mapping bring the first lane line and the second lane line onto the same straight line, the yaw angle is estimated.
In summary, in the embodiments of the present disclosure, for a target lane line, a first lane line and a second lane line are extracted from lane images acquired at different moments, and the yaw angle can then be estimated based on the mapping relationship between the first lane line and the second lane line. This method requires neither manual calibration nor additional sensors; the yaw angle estimation is completed purely from visual information (namely the first lane image and the second lane image) while the vehicle is driving. The method can therefore complete the yaw angle estimation efficiently during driving.
In implementation, the calibration conditions are used to screen out, from the lane lines detected in the images, those suitable for estimating the yaw angle, for example lane lines that are observed clearly and completely. In some embodiments, to improve the accuracy of the yaw angle estimation, the calibration conditions may include at least one of:
condition 1), the length of the lane line is greater than a preset length;
the condition can select a detected complete and clear lane line so as to improve the accuracy of the yaw angle estimation by improving the accuracy of basic data.
Condition 2), the bending degree of the lane line is lower than the preset bending degree;
in implementation, a curve equation of the lane line is obtained through lane line detection. For example, a first lane line corresponds to a first curve equation and a second lane line corresponds to a second curve equation. For each lane line, a plurality of detection points on the target lane line may be acquired, and a curve equation may be fitted based on the plurality of detection points.
After the curve equation is obtained, its higher-order coefficients can be used to measure the curvature of the lane line. For example, the higher-order coefficients of the curve equation may be inspected; if they are close to 0, i.e., their absolute values are small enough, the curve equation can be judged to be close to a straight line. Thus, for either of the first lane line and the second lane line, when the absolute values of the coefficients of all terms above first order in the lane line's curve equation are smaller than a preset threshold, the curvature of the lane line is determined to be lower than the preset curvature.
This condition helps screen out straight, or nearly straight, lane lines. Nearly straight lane lines match the actual situation, and improving the accuracy of the basic data improves the accuracy of the yaw angle estimation.
Condition 3), the first lane line and the second lane line have a parallel relationship;
in implementation, a first linear equation of the first lane line and a second linear equation of the second lane line may be determined, and if a line represented by the first linear equation is parallel to a line represented by the second linear equation, it is determined that the first lane line and the second lane line have a parallel relationship.
In implementation, a plurality of detection points in the curve equation of the first lane line can be adopted to fit a first straight line equation of the first lane line. Similarly, a plurality of detection points in the curve equation of the second lane line can be adopted to fit a second straight line equation of the second lane line.
In another embodiment, a first observation point (i.e., the point closest to the vehicle) may be obtained from a plurality of points on the first lane line, and a first linear equation of the first lane line may be fitted based on the first observation point. For example, determining a tangential slope of the first observation point in a curve equation of the first lane line, using the tangential slope as a slope of the first line equation of the first lane line, and solving a straight line passing through the first observation point to obtain the first line equation of the first lane line.
Similarly, a first observation point (i.e., the detection point closest to the vehicle) may be acquired from the plurality of detection points of the second lane line, and a second linear equation of the second lane line may be fitted based on that first observation point. For example, the tangent slope at the first observation point in the curve equation of the second lane line is determined and taken as the slope of the second linear equation, and the straight line passing through the first observation point is solved to obtain the second linear equation of the second lane line.
Of course, when the curve equations of the first lane line and the second lane line both approximate straight lines, the first-order coefficients of the two curve equations may also be compared; if they differ only slightly, the first lane line and the second lane line can likewise be considered nearly parallel.
Requiring the first lane line and the second lane line to be parallel further ensures the quality of the detected lane lines, and compared with non-parallel lane lines, the mapping relationship between parallel lane lines is simpler, which improves the efficiency and accuracy of the yaw angle estimation.
Condition 4), the time interval between the first time and the second time is greater than a preset interval;
That is, the first moment and the second moment are required not to be too close. Because the vehicle collects lane images in real time through the vehicle-mounted camera while driving, a high acquisition frame rate means that multiple frames are captured at almost the same time, so the change between images is small. To have sufficient change for estimating the yaw angle, embodiments of the present disclosure require the time interval between the first moment and the second moment to be greater than a preset interval.
Condition 5) detecting at least n third lane lines between the first time and the second time, wherein n is a positive integer greater than or equal to 1, and the third lane lines are obtained by detecting lane lines of the target lane lines.
The purpose of condition 5) is the same as that of condition 4). In practice, therefore, condition 4) and condition 5) may be enforced as alternatives to each other.
In the examples of the disclosure, setting condition 5) ensures sufficient change between the first lane line and the second lane line, which facilitates estimating the yaw angle.
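For concreteness, the following is a minimal Python sketch of how conditions 1) through 4) could be checked for a pair of detected lane lines, each given as an N x 2 array of (x, y) detection points in the vehicle coordinate system. It is not taken from the patent: the function names, the cubic curve model, and the thresholds are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def fit_curve(points, degree=3):
    """Fit a polynomial curve y(x) to the detection points of a lane line."""
    return P.polyfit(points[:, 0], points[:, 1], degree)  # [a0, a1, a2, a3]

def is_nearly_straight(coeffs, eps=1e-4):
    """Condition 2): all coefficients above first order are close to zero."""
    return bool(np.all(np.abs(coeffs[2:]) < eps))

def tangent_line(coeffs, x0):
    """Approximate a lane line by its tangent y = c0 + c1*x at the first
    observation point x0 (the detection point closest to the vehicle)."""
    c1 = P.polyval(x0, P.polyder(coeffs))   # tangent slope at x0
    c0 = P.polyval(x0, coeffs) - c1 * x0    # line through the curve point
    return c0, c1

def meets_calibration_conditions(pts1, t1, pts2, t2,
                                 min_length=20.0, min_interval=0.5,
                                 slope_tol=0.01):
    """Combined check of conditions 1) to 4) for a candidate lane-line pair."""
    for pts in (pts1, pts2):
        if np.linalg.norm(pts[-1] - pts[0]) < min_length:        # condition 1)
            return False
    f1, f2 = fit_curve(pts1), fit_curve(pts2)
    if not (is_nearly_straight(f1) and is_nearly_straight(f2)):  # condition 2)
        return False
    _, c1i = tangent_line(f1, pts1[0, 0])
    _, c1j = tangent_line(f2, pts2[0, 0])
    if abs(c1i - c1j) > slope_tol:                               # condition 3)
        return False
    return (t2 - t1) > min_interval                              # condition 4)
```

Condition 5) would instead be checked at the sliding-window level, by counting the third lane lines detected between the two moments.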
In some embodiments, the vehicle performs image acquisition on the driving lane in real time based on the vehicle-mounted camera, obtains lane images in real time, and then performs lane line detection on them in real time to extract detection results for the same target lane line. Each lane line detected for the target lane line is stored; in implementation, a sliding window can be set, and the lane lines detected from the lane images that belong to the same sliding window are stored in groups by sliding window, with one group per sliding window.
Then, from the lane lines within the same sliding window, a first lane line and a second lane line are selected according to the calibration conditions.
Under the condition that the first lane line and the second lane line meet the calibration conditions, a yaw angle estimated value is obtained based on the mapping relation between the two lane lines.
As shown in fig. 2, the black solid line indicates an actual lane line, and the broken line indicates a detected lane line (the detected lane lines in the embodiment of the present disclosure are all lane lines under the vehicle coordinate system), and the detected lane line may have a certain misalignment with the actual lane line due to the deviation of the yaw angle. In implementation, any one actual lane line can be selected as a target lane line, and a mapping relationship between the first lane line and the second lane line is established.
Given the deviation shown in fig. 2, and since the first lane line and the second lane line are detections of the same target lane line at different moments, the mapping proceeds in three steps: the detection point on the first lane line is converted to the target lane line at the first moment through the yaw angle to be estimated, yielding a first intermediate point; the first intermediate point is converted to the target lane line at the second moment through the displacement of the vehicle between the first moment and the second moment, yielding a second intermediate point; and the second intermediate point is converted to the second lane line at the second moment through the negative of the yaw angle to be estimated.
The mapping relationship can be expressed as shown in formula (1):

$p'_0 = T_{yaw} \cdot T_{loc} \cdot (T_{yaw})^{-1} \cdot p_0$    (1)

In formula (1), $p_0$ represents a detection point on the first lane line; $T_{yaw}$ represents the transform of the yaw angle to be estimated, with $\theta$ denoting the yaw angle to be estimated; $T_{loc}$ represents the displacement change of the vehicle between the first moment and the second moment; $p'_0$ represents the result of mapping the detection point on the first lane line onto the second lane line; $\rho$ represents the rotation of the vehicle between the first moment and the second moment; $t_x$ represents the longitudinal displacement of the vehicle, and $t_y$ represents the lateral displacement of the vehicle. $t_x$ and $t_y$ are calculated based on the positioning data of the vehicle at the first moment and the second moment.
Based on this mapping relationship, points on the same lane line collected at different moments can be converted into one another. The mapping relationship accounts for both the yaw angle to be estimated and the displacement change of the vehicle, so it describes the conversion between the first lane line and the second lane line accurately and reasonably, which facilitates accurate estimation of the yaw angle.
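As a minimal sketch, formula (1) can be realized with homogeneous 2-D transforms, treating $T_{yaw}$ as a pure rotation by the yaw angle and $T_{loc}$ as a planar rigid motion built from $\rho$, $t_x$ and $t_y$. The exact sign conventions are assumptions, since the text does not fix them.

```python
import numpy as np

def rot_h(angle):
    """Homogeneous 2-D rotation matrix (no translation)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def t_loc(rho, tx, ty):
    """Vehicle motion between the two moments: rotation rho plus the
    longitudinal/lateral displacement (tx, ty) from the positioning data."""
    T = rot_h(rho)
    T[0, 2], T[1, 2] = tx, ty
    return T

def map_point(p0, theta, rho, tx, ty):
    """Formula (1): rotate the detection point onto the target lane line
    (yaw angle theta), apply the vehicle motion, then rotate back by the
    negative yaw angle (sign conventions here are assumptions)."""
    p_h = np.array([p0[0], p0[1], 1.0])
    T = rot_h(-theta) @ t_loc(rho, tx, ty) @ rot_h(theta)
    return (T @ p_h)[:2]
```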
Based on this mapping relationship, the embodiments of the present disclosure provide, by way of example, two schemes for estimating the yaw angle; each scheme is described below.
Scheme 1), deriving a yaw angle estimation formula based on a mapping relationship
In practice, since the mapping relationship contains the yaw angle used to map the first lane line onto the second lane line, a yaw angle estimation formula can be derived from the mapping relationship.
After the yaw angle estimation formula is derived, it can be stored. During driving, lane images are collected to obtain a first linear equation of the first lane line and a second linear equation of the second lane line, and the parameters of the first linear equation and of the second linear equation are substituted into the yaw angle estimation formula to obtain the yaw angle estimate.
Therefore, in the embodiments of the present disclosure, the parameters of the first linear equation of the first lane line and of the second linear equation of the second lane line are acquired while the vehicle is driving, and the yaw angle estimate is then determined by evaluating the yaw angle estimation formula. The flow is simple and efficient: when the vehicle drives slowly there is ample time to obtain the yaw angle estimate, and during high-speed driving the yaw angle can still be estimated in time, which improves the efficiency of intelligent driving.
In practice, the yaw angle estimation formula is obtained from the mapping relationship through a series of derivation steps, in which approximations, equivalent substitutions, and similar means are allowed.
Illustratively, assuming the first moment precedes the second moment, the first linear equation of the first lane line obtained at the first moment is expressed as formula (2):

$y = c_{0i} + c_{1i} x$    (2)

The second linear equation of the second lane line obtained at the second moment is expressed as formula (3):

$y = c_{0j} + c_{1j} x$    (3)

Accordingly, the yaw angle estimation formula can be expressed as shown in formula (4). In formula (4), $\theta$ represents the yaw angle estimate, $t_x$ represents the longitudinal displacement of the vehicle, $t_y$ represents the lateral displacement of the vehicle, $c_{0i}$ represents the intercept of the first linear equation, $c_{1i}$ represents the slope of the first linear equation, $\rho$ represents the rotation of the vehicle from the first moment to the second moment, and $c_{0j}$ represents the intercept of the second linear equation.
Based on the yaw angle estimation formula, the yaw angle estimate can be solved simply, conveniently and efficiently: the yaw angle can be estimated while the vehicle drives at low speed, and it can also be estimated rapidly during high-speed driving. The detected lane lines can therefore be compensated quickly, improving driving accuracy and safety.
As shown in fig. 3, the rectangular box in the figure indicates the vehicle. The vehicle detects the first lane line at time t (i.e., the first moment), whose first linear equation is l1, and detects the second lane line at time t+1 (i.e., the second moment), whose second linear equation is l2. The yaw angle corresponding to times t and t+1 can then be solved from the parameters of l1 and l2 through the yaw angle estimation formula.
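The closed form of formula (4) appears only as an image in the source and is not reproduced here, so the sketch below solves the same relationship numerically instead: it searches for the yaw angle under which formula (1) maps line l1 onto line l2. It reuses map_point from the earlier sketch; the function names, the sample abscissae and the search bound are illustrative assumptions, not the patent's formula.

```python
from scipy.optimize import minimize_scalar

def mapped_line(theta, c0i, c1i, rho, tx, ty):
    """Map two sample points of l1 (y = c0i + c1i*x) through formula (1)
    and refit a straight line y = c0 + c1*x through the mapped points."""
    (x1, y1), (x2, y2) = (map_point((x, c0i + c1i * x), theta, rho, tx, ty)
                          for x in (5.0, 30.0))
    c1 = (y2 - y1) / (x2 - x1)
    return y1 - c1 * x1, c1                 # (intercept, slope)

def estimate_yaw(c0i, c1i, c0j, c1j, rho, tx, ty, bound=0.2):
    """Find the yaw angle that makes the mapped l1 coincide with l2."""
    def residual(theta):
        c0m, c1m = mapped_line(theta, c0i, c1i, rho, tx, ty)
        return (c0m - c0j) ** 2 + (c1m - c1j) ** 2
    return minimize_scalar(residual, bounds=(-bound, bound),
                           method='bounded').x
```

In the patent's scheme 1, this search is replaced by the closed-form formula (4), which evaluates in constant time.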
Scheme 2), solving the yaw angle by optimizing an objective function
In this embodiment, an objective function may be constructed based on the mapping relationship, and the yaw angle estimation is completed by optimizing the objective function. This may be implemented as the flow shown in fig. 4, comprising:
s401, mapping a plurality of detection points on the first lane line onto a second linear equation of the second lane line through a mapping relation respectively to obtain a plurality of mapping points.
It is emphasized that embodiments of the present disclosure map a first lane line at a first time to a second lane line at a second time, thereby yielding a corresponding plurality of mapped points.
The mapping relationship shown in formula (1) can still be used here to obtain the plurality of mapping points.
S402, establishing an objective function, wherein the objective function is used for solving the sum of distances from the plurality of mapping points to a second linear equation of the second lane line.
That is, each mapping point should theoretically lie on the second linear equation of the second lane line, but due to the yaw angle deviation, the resulting mapping points may not. Thus, the distances from the mapping points to the second linear equation can be computed, and the sum of the distances from the mapping points of the first lane line to the second linear equation is taken as the objective function.
The objective function can be expressed as shown in formula (5):

$J = \sum_{i=1}^{n} d_i$    (5)

where $i$ indexes the $i$-th detection point on the first lane line, $d_i$ represents the distance from the mapping point of the $i$-th detection point to the second linear equation, and $n$ represents the total number of detection points on the first lane line.
S403, taking the minimum objective function as an optimization target, and adjusting the to-be-estimated yaw angle in the mapping relation.
When each detection point on the first lane line can be mapped accurately onto the second lane line, the distance from each mapping point to the second linear equation of the second lane line theoretically approaches 0. Thus, minimizing the objective function, i.e., the sum of the distances from the mapping points to the second linear equation, completes the optimization of the yaw angle to be estimated.
S404, under the condition that the minimum objective function is obtained, the value of the to-be-estimated yaw angle is taken as the estimated value of the yaw angle.
After optimization, when the objective function is minimized, the yaw angle has been corrected, and the value of the yaw angle to be estimated at that point is the required yaw angle estimate.
With the embodiments of the present disclosure, the yaw angle can thus be estimated by minimizing the objective function. The optimization takes the minimum distance as its criterion, the flow is short and easy to implement, and the calibration of the vehicle-mounted camera's yaw angle can be completed conveniently while driving, without manual work or additional sensors, which broadens the application scenarios of the embodiments of the present disclosure.
As shown in fig. 5, the two adjacent lane lines in fig. 5 represent the first lane line and the second lane line, respectively. The first lane line is mapped onto the second lane line and the distance d expressed by the objective function is computed, so the yaw angle can be solved by optimizing the objective function.
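A minimal sketch of scheme 2 follows, again reusing map_point from the sketch of formula (1); the point-to-line distance and the bounded scalar search are standard, but the function names and the search bound are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def point_line_distance(p, c0, c1):
    """Perpendicular distance from point p to the line y = c0 + c1*x."""
    return abs(c1 * p[0] - p[1] + c0) / np.hypot(c1, 1.0)

def estimate_yaw_by_objective(points1, c0j, c1j, rho, tx, ty, bound=0.2):
    """Minimize formula (5): the sum of distances from the mapped detection
    points of the first lane line to the second linear equation."""
    def objective(theta):
        return sum(point_line_distance(map_point(p, theta, rho, tx, ty),
                                       c0j, c1j)
                   for p in points1)
    return minimize_scalar(objective, bounds=(-bound, bound),
                           method='bounded').x
```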
In implementation, lane images can be acquired in real time and lane line detection performed in real time, and the detection results of the same target lane line in different frames are stored in the same sliding window. It can thus be understood that the same sliding window contains a plurality of first moments and a plurality of second moments. Every two lane lines meeting the calibration conditions form a lane line pair, each pair comprising a first lane line and a second lane line. To improve the accuracy of the yaw angle estimation, in the embodiments of the present disclosure the sliding window contains a plurality of moment pairs, each moment pair comprising a first moment and a corresponding second moment and corresponding to one lane line pair. On this basis, within the sliding window, a yaw angle estimate of the vehicle-mounted camera can be solved for each moment pair based on its first lane line and second lane line, yielding a plurality of yaw angle estimates; the mean of these yaw angle estimates is then solved to obtain the yaw angle calibration value of the vehicle-mounted camera.
For example, suppose the same sliding window contains lane line 1, lane line 2, lane line 3, lane line 4 and lane line 5, and that lane line 1 together with lane line 3, as well as lane line 1 together with lane line 4, meet the calibration conditions. One yaw angle estimate y1 is solved based on lane line 1 and lane line 3, another estimate y2 is solved based on lane line 1 and lane line 4, and the mean of y1 and y2 gives the yaw angle calibration value.
Of course, in other embodiments, detection results of different target lane lines may be stored in the same sliding window, for example a plurality of first lane lines and second lane lines of the left lane line together with a plurality of first lane lines and second lane lines of the right lane line. Several yaw angle estimates are obtained from the first and second lane lines of the left lane line, and further yaw angle estimates from the first and second lane lines of the right lane line. The mean of the yaw angle estimates of the left lane line and of the right lane line then gives the yaw angle calibration value for the sliding window.
It can be appreciated that, in the embodiments of the present disclosure, the target lane lines corresponding to the multiple moment pairs within the same sliding window may be the same or different.
Calibrating the yaw angle of the vehicle-mounted camera with the mean of multiple yaw angle estimates reduces the influence of accidental errors on the estimate and improves the accuracy of the yaw angle estimate.
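The mean filtering step is a one-liner; the sketch below assumes estimate_fn is any per-pair estimator, such as a wrapper around estimate_yaw from the scheme-1 sketch.

```python
def window_calibration_value(lane_line_pairs, estimate_fn):
    """Average the yaw angle estimates of all qualifying lane-line pairs
    in one sliding window into a single calibration value."""
    estimates = [estimate_fn(pair) for pair in lane_line_pairs]
    return sum(estimates) / len(estimates) if estimates else None
```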
Similarly, in the embodiments of the present disclosure, a plurality of sliding windows may be set, each of which yields a yaw angle calibration value. In implementation, the yaw angle calibration values of the vehicle-mounted camera are solved in the plurality of sliding windows to obtain a plurality of calibration values; when the error between the calibration values is smaller than an error threshold, the solving of calibration values stops, and the yaw angle of the vehicle-mounted camera is determined to have reached a steady state.
Under the condition that the yaw angle reaches a stable state, solving is stopped, so that the computing resources can be saved, and the use efficiency of the computing resources is improved.
In practice, the scheme provided by the embodiments of the present disclosure may start executing when the vehicle starts, and solving can stop once the yaw angle calibration has stabilized. When an accident occurs, or when the vehicle is restarted, the scheme is executed again.
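A sketch of the steady-state check across windows; the error threshold and the requirement of at least two windows are assumptions.

```python
def yaw_reached_steady_state(window_values, error_threshold=0.002):
    """Stop solving once the calibration values of the collected sliding
    windows agree to within the error threshold (radians, assumed)."""
    return (len(window_values) >= 2
            and max(window_values) - min(window_values) < error_threshold)
```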
Taking the foregoing scheme 1 as an example, the yaw angle estimation method of an embodiment of the present disclosure is described below. Fig. 6 is a schematic diagram of the overall framework of the scheme provided by the embodiments of the present disclosure. In fig. 6, lane line detection is performed on the lane images acquired by the vehicle-mounted camera, yielding the lane lines detected from the images together with the positioning data of each frame of lane image. The positioning data are position data obtained by the positioning operation: while the vehicle-mounted camera acquires lane images, the positioning module of the vehicle continuously performs positioning, so the positioning sample closest in time to a lane image's acquisition moment can be taken as that image's positioning data. This yields the input information.
The input data serve as upstream data for subsequent processing. In implementation, the positioning data are associated with the lane lines detected from the images, and each lane line is stored together with its corresponding positioning data in the same sliding window. A preprocessing operation is performed on the data within the same sliding window; once a first lane line and a second lane line meeting the calibration conditions are obtained, the yaw angle estimation operation is executed to obtain the yaw angle estimate, after which the lane lines are corrected.
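The association step amounts to a nearest-timestamp lookup; a sketch under the assumption that the positioning samples arrive as a time-sorted list:

```python
import bisect

def nearest_positioning(loc_times, loc_samples, image_time):
    """Return the positioning sample whose timestamp is closest to the
    lane image's acquisition time (loc_times must be sorted ascending)."""
    i = bisect.bisect_left(loc_times, image_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(loc_times)]
    best = min(candidates, key=lambda j: abs(loc_times[j] - image_time))
    return loc_samples[best]
```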
The framework of fig. 6 is further described below in conjunction with fig. 7, including the steps of:
s701, detecting a target lane line based on the vehicle-mounted camera, and caching the obtained lane line and corresponding positioning data into a sliding window. I.e. corresponds to the acquisition of input information in fig. 6.
S702, corresponding to the preprocessing operation in fig. 6: the data in the sliding window are filtered by lane line length and curve equation coefficients, keeping only the lane lines whose length exceeds the preset length and whose curvature is below the preset curvature, so that well-observed, nearly straight lane lines remain after filtering. Each lane line is then approximated by a straight line using its first observation point and the tangent slope there, yielding the linear equation corresponding to each lane line.
S703, lane line pairs spaced n lane lines apart are taken from the sliding window from front to back, yielding a plurality of lane line pairs, each comprising a first lane line and a second lane line. This corresponds to the data selection in fig. 6, which screens out the lane lines meeting the calibration conditions.
S704, for each lane line pair, calculating the displacement of the vehicle according to the positioning data of the first lane line and the positioning data of the second lane line in the lane line pair.
In practice, if a lane change is detected, the data in that sliding window are not used for calculation.
S705, calculating a yaw angle estimation value by using a yaw angle estimation formula based on the parameters of the linear equation of the first lane line and the linear equation of the second lane line.
S704 and S705 correspond to the geometric calculation process in fig. 6, and implement the estimation of the yaw angle.
S706, all lane line pairs in the sliding window are traversed to obtain a group of yaw angle estimates, and the mean of this group of estimates gives the final yaw angle calibration value. This corresponds to the mean filtering in fig. 6.
Finally, corresponding to the correction process in fig. 6, the lane lines are corrected with the obtained yaw angle calibration value to assist driving.
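Putting S702 to S706 together, a compact driver might look as follows; passes_preprocessing and relative_motion are hypothetical helpers standing in for the filtering of S702 and the displacement computation of S704, and estimate_fn could wrap estimate_yaw from the scheme-1 sketch.

```python
def calibrate_window(window, n_gap, estimate_fn):
    """One sliding window of the fig. 6 / fig. 7 pipeline in miniature."""
    kept = [obs for obs in window if passes_preprocessing(obs)]    # S702
    estimates = []
    for first, second in zip(kept, kept[n_gap:]):                  # S703
        rho, tx, ty = relative_motion(first.loc, second.loc)       # S704
        estimates.append(estimate_fn(first.line, second.line,
                                     rho, tx, ty))                 # S705
    return sum(estimates) / len(estimates) if estimates else None  # S706
```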
In summary, in the embodiments of the present disclosure, the yaw angle can be estimated and corrected while the vehicle is driving, thereby compensating the detected lane lines and supporting correct driving assistance.
Based on the same technical concept, in the embodiment of the present disclosure, as shown in fig. 8, there is also provided a yaw angle estimation device 800 of an in-vehicle camera, including:
the image acquisition module 801 is configured to acquire a first lane image at a first time based on the vehicle-mounted camera and acquire a second lane image at a second time based on the vehicle-mounted camera during a driving process of the vehicle;
a lane line acquisition module 802, configured to acquire a first lane line detected from a first lane image and a second lane line detected from a second lane image, where the first lane line and the second lane line each represent a same target lane line;
an estimation module 803, configured to determine a yaw angle estimation value of the vehicle-mounted camera based on a mapping relationship between the first lane line at the first time and the second lane line at the second time, when the first lane line and the second lane line meet the calibration condition; the mapping relationship is used for mapping the detection points on the first lane line to the second lane line.
In some embodiments, the calibration conditions include at least one of:
the length of the lane line is greater than the preset length;
the bending degree of the lane line is lower than the preset bending degree;
the first lane line and the second lane line have a parallel relationship;
The time interval between the first time and the second time is larger than the preset interval;
at least n third lane lines are detected between the first time and the second time, n is a positive integer greater than or equal to 1, and the third lane lines are obtained by detecting the lane lines of the target lane lines.
In some embodiments, the first intermediate point is obtained by converting the detection point on the first lane line to the target lane line at the first time by the yaw angle to be estimated, the second intermediate point is obtained by converting the first intermediate point to the target lane line at the second time by the displacement generated by the vehicle between the first time and the second time, and the second intermediate point is converted to the second lane line at the second time by the negative value of the yaw angle to be estimated in the mapping relation.
In some embodiments, the estimation module 803 includes:
the system comprises an acquisition unit, a calculation unit and a calculation unit, wherein the acquisition unit is used for acquiring a first linear equation of a first lane line and a second linear equation of a second lane line;
the estimating unit is used for taking the parameters of the first linear equation and the parameters of the second linear equation as the parameters of a yaw angle estimating formula to obtain a yaw angle estimated value; the yaw angle estimation formula is obtained by reasoning based on the mapping relation.
In some embodiments, in the yaw angle estimation formula, $\theta$ represents the yaw angle estimate, $t_x$ represents the longitudinal displacement of the vehicle, $t_y$ represents the lateral displacement of the vehicle, $c_{0i}$ represents the intercept of the first linear equation, $c_{1i}$ represents the slope of the first linear equation, $\rho$ represents the rotation of the vehicle from the first moment to the second moment, and $c_{0j}$ represents the intercept of the second linear equation.
In some embodiments, the estimation module 803 includes:
the mapping unit is used for mapping a plurality of detection points on the first lane line onto a second linear equation of the second lane line through a mapping relation respectively to obtain a plurality of mapping points;
the construction unit is used for establishing an objective function, and the objective function is used for solving the sum of distances from the plurality of mapping points to a second linear equation of the second lane line;
the optimization unit is used for adjusting the yaw angle to be estimated in the mapping relation by taking the minimized objective function as an optimization target;
and the determining unit is used for acquiring the value of the yaw angle to be estimated as the yaw angle estimated value under the condition of minimizing the objective function.
In some embodiments, the sliding window contains a plurality of moment pairs, each moment pair including a first moment and a corresponding second moment, and the apparatus further includes:
the mean value filtering module is used for respectively solving yaw angle estimated values of the vehicle-mounted camera based on the first lane line and the second lane line which are respectively corresponding at each moment in the sliding window to obtain a plurality of yaw angle estimated values; and solving the average value of the plurality of yaw angle estimated values to obtain a yaw angle calibration value of the vehicle-mounted camera.
In some embodiments, further comprising:
the control module is used for respectively solving yaw angle calibration values of the vehicle-mounted camera in the sliding windows to obtain a plurality of yaw angle calibration values; and under the condition that the error among the yaw angle calibration values is smaller than the error threshold value, stopping solving the yaw angle calibration values, and determining that the yaw angle of the vehicle-mounted camera reaches a stable state.
Descriptions of specific functions and examples of each module and unit of the apparatus in the embodiments of the present disclosure may refer to related descriptions of corresponding steps in the foregoing method embodiments, which are not repeated herein.
In the technical scheme of the disclosure, the acquisition, storage, application and the like of the related user personal information all conform to the regulations of related laws and regulations, and the public sequence is not violated.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 9 shows a schematic block diagram of an example electronic device 900 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile apparatuses, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing apparatuses. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 9, the apparatus 900 includes a computing unit 901 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 902 or a computer program loaded from a storage unit 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data required for the operation of the device 900 can also be stored. The computing unit 901, the ROM 902, and the RAM 903 are connected to each other by a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
Various components in device 900 are connected to I/O interface 905, including: an input unit 906 such as a keyboard, a mouse, or the like; an output unit 907 such as various types of displays, speakers, and the like; a storage unit 908 such as a magnetic disk, an optical disk, or the like; and a communication unit 909 such as a network card, modem, wireless communication transceiver, or the like. The communication unit 909 allows the device 900 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunications networks.
The computing unit 901 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 901 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 901 performs the respective methods and processes described above, for example, a yaw angle estimation method of the in-vehicle camera. For example, in some embodiments, the method of yaw angle estimation of an onboard camera may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 908. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 900 via the ROM 902 and/or the communication unit 909. When the computer program is loaded into the RAM 903 and executed by the computing unit 901, one or more steps of the yaw angle estimation method of the in-vehicle camera described above may be performed. Alternatively, in other embodiments, the computing unit 901 may be configured to perform the yaw angle estimation method of the onboard camera by any other suitable means (e.g. by means of firmware).
Various implementations of the systems and techniques described above can be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the disclosed aspects are achieved, and are not limited herein.
Based on the same technical idea, the disclosed embodiments also provide a vehicle, which may include an electronic device as shown in fig. 9.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions, improvements, etc. that are within the principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (20)

1. A yaw angle estimation method of an in-vehicle camera, comprising:
during running of a vehicle, acquiring a first lane image at a first moment based on the vehicle-mounted camera, and acquiring a second lane image at a second moment based on the vehicle-mounted camera;
acquiring a first lane line detected from the first lane image and a second lane line detected from the second lane image, wherein the first lane line and the second lane line both represent the same target lane line;
determining a yaw angle estimated value of the vehicle-mounted camera based on a mapping relationship between the first lane line at the first moment and the second lane line at the second moment under the condition that the first lane line and the second lane line meet a calibration condition; the mapping relationship is used for mapping detection points on the first lane line to the second lane line.
2. The method of claim 1, wherein the calibration conditions comprise at least one of:
the length of the lane line is greater than the preset length;
the bending degree of the lane line is lower than the preset bending degree;
the first lane line and the second lane line have a parallel relationship;
the time interval between the first time and the second time is larger than a preset interval;
at least n third lane lines are detected between the first time and the second time, n is a positive integer greater than or equal to 1, and the third lane lines are obtained by detecting the lane lines of the target lane lines.
3. The method according to claim 1 or 2, wherein a first intermediate point is obtained by converting a detection point on the first lane line to the target lane line at the first time through the yaw angle to be estimated; a second intermediate point is obtained by converting the first intermediate point to the target lane line at the second time through the displacement generated by the vehicle between the first time and the second time; and the second intermediate point is converted to the second lane line at the second time through a negative value of the yaw angle to be estimated.
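The chain of conversions in this claim can be written compactly. The sketch below assumes 2D points in the ground plane, a counter-clockwise rotation convention, and that the ego-motion is applied as p -> rot(-rho) @ (p - t); these conventions are assumptions made for illustration, not specified by the claim:

import numpy as np

def rot(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

def map_detection_point(p_first, theta, rho, t):
    # theta: yaw angle to be estimated; rho: vehicle rotation between the
    # two times; t: vehicle displacement (t_x, t_y) between the two times.
    first_intermediate = rot(theta) @ p_first                    # target lane line, first time
    second_intermediate = rot(-rho) @ (first_intermediate - t)   # target lane line, second time
    return rot(-theta) @ second_intermediate                     # second lane line, second time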
4. The method according to claim 1 or 3, wherein the determining the yaw angle estimated value of the vehicle-mounted camera based on the mapping relationship between the first lane line at the first time and the second lane line at the second time comprises:
acquiring a first linear equation of the first lane line and a second linear equation of the second lane line;
taking parameters of the first linear equation and parameters of the second linear equation as parameters of a yaw angle estimation formula to obtain the yaw angle estimated value, wherein the yaw angle estimation formula is derived from the mapping relationship.
5. The method of claim 4, wherein the yaw angle estimation formula is:
[formula reproduced as an image in the original publication]
wherein θ represents the yaw angle estimated value, t_x represents the longitudinal displacement of the vehicle, t_y represents the lateral displacement of the vehicle, c_{0i} represents the intercept of the first linear equation, c_{1i} represents the slope of the first linear equation, ρ represents the rotation of the vehicle from the first time to the second time, and c_{0j} represents the intercept of the second linear equation.
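The formula itself is not recoverable from this text. As an illustration of how a closed form in exactly these quantities can be obtained from the mapping of claim 3, the sympy sketch below imposes the condition that the mapped line has intercept c_0j and solves for θ after a small-angle linearization; the rotation and displacement conventions are assumptions carried over from the sketch under claim 3, so the resulting expression is not necessarily the patent's exact formula:

import sympy as sp

theta, rho, t_x, t_y, x = sp.symbols('theta rho t_x t_y x', real=True)
c_0i, c_1i, c_0j = sp.symbols('c_0i c_1i c_0j', real=True)

def rot(a):
    return sp.Matrix([[sp.cos(a), -sp.sin(a)], [sp.sin(a), sp.cos(a)]])

# A point on the first lane line: y = c_1i * x + c_0i.
p = sp.Matrix([x, c_1i * x + c_0i])

# Mapping chain of claim 3 under the assumed conventions.
q = rot(-theta) * (rot(-rho) * (rot(theta) * p - sp.Matrix([t_x, t_y])))

# Slope and intercept of the mapped line from its images of x = 0 and x = 1.
q0, q1 = q.subs(x, 0), q.subs(x, 1)
slope = (q1[1] - q0[1]) / (q1[0] - q0[0])
intercept = sp.simplify(q0[1] - slope * q0[0])

# Linearize in theta (small yaw error) and solve intercept == c_0j for theta.
intercept_lin = sp.series(intercept, theta, 0, 2).removeO()
theta_estimate = sp.solve(sp.Eq(intercept_lin, c_0j), theta)[0]
print(sp.simplify(theta_estimate))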
6. The method according to claim 1 or 3, wherein the determining the yaw angle estimated value of the vehicle-mounted camera based on the mapping relationship between the first lane line at the first time and the second lane line at the second time comprises:
mapping a plurality of detection points on the first lane line to a second linear equation of the second lane line through the mapping relationship, respectively, to obtain a plurality of mapped points;
establishing an objective function, wherein the objective function is used for solving a sum of distances from the plurality of mapped points to the second linear equation of the second lane line;
adjusting the yaw angle to be estimated in the mapping relationship by taking minimization of the objective function as an optimization target; and
under the condition that the objective function is minimized, taking the value of the yaw angle to be estimated as the yaw angle estimated value.
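A minimal sketch of this optimization, reusing rot and map_detection_point from the sketch under claim 3; the bounded search interval and scipy's scalar minimizer are illustrative choices, not requirements of the claim:

import numpy as np
from scipy.optimize import minimize_scalar

def estimate_yaw_by_optimization(points_first, c_1j, c_0j, rho, t):
    # points_first: detection points on the first lane line (N x 2 array).
    # (c_1j, c_0j): slope and intercept of the second linear equation.
    norm = np.hypot(c_1j, 1.0)

    def objective(theta):
        # Sum of distances from the mapped points to y = c_1j * x + c_0j.
        mapped = [map_detection_point(p, theta, rho, t) for p in points_first]
        return sum(abs(c_1j * q[0] - q[1] + c_0j) / norm for q in mapped)

    result = minimize_scalar(objective, bounds=(-0.2, 0.2), method='bounded')
    return result.x  # value of the yaw angle to be estimated at the minimum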
7. The method of any one of claims 1-6, wherein a sliding window comprises a plurality of time pairs, each time pair comprising a first time and a corresponding second time, the method further comprising:
in the sliding window, solving yaw angle estimated values of the vehicle-mounted camera based on the first lane line and the second lane line corresponding to each time pair, respectively, to obtain a plurality of yaw angle estimated values; and
solving a mean value of the plurality of yaw angle estimated values to obtain a yaw angle calibration value of the vehicle-mounted camera.
8. The method of claim 7, further comprising:
solving yaw angle calibration values of the vehicle-mounted camera in a plurality of sliding windows, respectively, to obtain a plurality of yaw angle calibration values; and
under the condition that errors among the plurality of yaw angle calibration values are smaller than an error threshold, stopping solving the yaw angle calibration values and determining that the yaw angle of the vehicle-mounted camera has reached a stable state.
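Claims 7 and 8 together describe a mean filter within each sliding window followed by a convergence test across windows. A minimal sketch, where the error among calibration values is measured as their spread (one of several reasonable readings of the claim):

import numpy as np

def yaw_calibration_value(window_estimates):
    # Mean of the per-time-pair yaw angle estimated values in one window.
    return float(np.mean(window_estimates))

def yaw_is_stable(calibration_values, error_threshold=1e-3):
    # Stop solving once the calibration values from successive sliding
    # windows agree to within the error threshold.
    values = np.asarray(calibration_values)
    return values.max() - values.min() < error_threshold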
9. A yaw angle estimation device of an in-vehicle camera, comprising:
an image acquisition module, configured to acquire a first lane image at a first time based on the vehicle-mounted camera and acquire a second lane image at a second time based on the vehicle-mounted camera during running of a vehicle;
a lane line acquisition module, configured to acquire a first lane line detected from the first lane image and a second lane line detected from the second lane image, wherein the first lane line and the second lane line both represent the same target lane line; and
an estimation module, configured to determine a yaw angle estimated value of the vehicle-mounted camera based on a mapping relationship between the first lane line at the first time and the second lane line at the second time under the condition that the first lane line and the second lane line satisfy a calibration condition, wherein the mapping relationship is used for mapping detection points on the first lane line to the second lane line.
10. The apparatus of claim 9, wherein the calibration condition comprises at least one of:
the length of the lane line is greater than a preset length;
the bending degree of the lane line is lower than a preset bending degree;
the first lane line and the second lane line have a parallel relationship;
the time interval between the first time and the second time is greater than a preset interval; and
at least n third lane lines are detected between the first time and the second time, where n is a positive integer greater than or equal to 1, and each third lane line is obtained by performing lane line detection on the target lane line.
11. The apparatus according to claim 9 or 10, wherein a first intermediate point is obtained by converting a detection point on the first lane line to the target lane line at the first time through the yaw angle to be estimated; a second intermediate point is obtained by converting the first intermediate point to the target lane line at the second time through the displacement generated by the vehicle between the first time and the second time; and the second intermediate point is converted to the second lane line at the second time through a negative value of the yaw angle to be estimated.
12. The apparatus of claim 9 or 11, wherein the estimation module comprises:
an acquisition unit, configured to acquire a first linear equation of the first lane line and a second linear equation of the second lane line; and
an estimation unit, configured to obtain the yaw angle estimated value by taking parameters of the first linear equation and parameters of the second linear equation as parameters of a yaw angle estimation formula, wherein the yaw angle estimation formula is derived from the mapping relationship.
13. The apparatus of claim 12, wherein the yaw angle estimation formula is:
[formula reproduced as an image in the original publication]
wherein θ represents the yaw angle estimated value, t_x represents the longitudinal displacement of the vehicle, t_y represents the lateral displacement of the vehicle, c_{0i} represents the intercept of the first linear equation, c_{1i} represents the slope of the first linear equation, ρ represents the rotation of the vehicle from the first time to the second time, and c_{0j} represents the intercept of the second linear equation.
14. The apparatus of claim 9 or 11, wherein the estimation module comprises:
a mapping unit, configured to map a plurality of detection points on the first lane line to a second linear equation of the second lane line through the mapping relationship, respectively, to obtain a plurality of mapped points;
an establishing unit, configured to establish an objective function, wherein the objective function is used for solving a sum of distances from the plurality of mapped points to the second linear equation of the second lane line;
an optimizing unit, configured to adjust the yaw angle to be estimated in the mapping relationship by taking minimization of the objective function as an optimization target; and
a determining unit, configured to take the value of the yaw angle to be estimated as the yaw angle estimated value under the condition that the objective function is minimized.
15. The apparatus of any one of claims 9-14, wherein a sliding window comprises a plurality of time pairs, each time pair comprising a first time and a corresponding second time, the apparatus further comprising:
a mean filtering module, configured to solve, in the sliding window, yaw angle estimated values of the vehicle-mounted camera based on the first lane line and the second lane line corresponding to each time pair, respectively, to obtain a plurality of yaw angle estimated values, and to solve a mean value of the plurality of yaw angle estimated values to obtain a yaw angle calibration value of the vehicle-mounted camera.
16. The apparatus of claim 15, further comprising:
a control module, configured to solve yaw angle calibration values of the vehicle-mounted camera in a plurality of sliding windows, respectively, to obtain a plurality of yaw angle calibration values, and, under the condition that errors among the plurality of yaw angle calibration values are smaller than an error threshold, to stop solving the yaw angle calibration values and determine that the yaw angle of the vehicle-mounted camera has reached a stable state.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-8.
19. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any of claims 1-8.
20. A vehicle comprising the electronic device of claim 17.
CN202310429708.8A 2023-04-20 2023-04-20 Yaw angle estimation method, device, equipment, vehicle and medium of vehicle-mounted camera Pending CN116823958A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310429708.8A CN116823958A (en) 2023-04-20 2023-04-20 Yaw angle estimation method, device, equipment, vehicle and medium of vehicle-mounted camera

Publications (1)

Publication Number Publication Date
CN116823958A (en) 2023-09-29

Family

ID=88121163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310429708.8A Pending CN116823958A (en) 2023-04-20 2023-04-20 Yaw angle estimation method, device, equipment, vehicle and medium of vehicle-mounted camera

Country Status (1)

Country Link
CN (1) CN116823958A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180001876A1 (en) * 2016-06-30 2018-01-04 Toyota Jidosha Kabushiki Kaisha Lane departure prevention device
WO2020038091A1 (en) * 2018-08-22 2020-02-27 北京市商汤科技开发有限公司 Intelligent driving control method and apparatus, electronic device, program and medium
US20200105017A1 (en) * 2018-09-30 2020-04-02 Boe Technology Group Co., Ltd. Calibration method and calibration device of vehicle-mounted camera, vehicle and storage medium
CN110320908A (en) * 2019-06-06 2019-10-11 华南农业大学 A kind of AGV real-time emulation system
US20210300469A1 (en) * 2020-03-26 2021-09-30 Honda Motor Co., Ltd. Travel control system
CN112560680A (en) * 2020-12-16 2021-03-26 北京百度网讯科技有限公司 Lane line processing method and device, electronic device and storage medium
CN112837352A (en) * 2021-04-20 2021-05-25 腾讯科技(深圳)有限公司 Image-based data processing method, device and equipment, automobile and storage medium
CN115704688A (en) * 2021-08-05 2023-02-17 上海博泰悦臻网络技术服务有限公司 High-precision map data relative position precision evaluation method, system, medium and terminal
CN114067001A (en) * 2022-01-14 2022-02-18 天津所托瑞安汽车科技有限公司 Vehicle-mounted camera angle calibration method, terminal and storage medium
CN114812575A (en) * 2022-03-15 2022-07-29 中汽创智科技有限公司 Correction parameter determining method and device, electronic equipment and storage medium
CN115511974A (en) * 2022-09-29 2022-12-23 浙江工业大学 Rapid external reference calibration method for vehicle-mounted binocular camera
CN115619875A (en) * 2022-10-21 2023-01-17 广州汽车集团股份有限公司 Vehicle-mounted camera calibration method and device, electronic equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JO, KICHUN et al.: "Precise Localization of an Autonomous Car Based on Probabilistic Noise Models of Road Surface Marker Features Using Multiple Cameras", IEEE Transactions on Intelligent Transportation Systems *
袁望方; 张昭; 马保宁; 曹树星: "Research on Lane Departure Warning Technology Based on Machine Vision" (in Chinese), Journal of Safety Science and Technology, no. 1
陈本智: "Lane Recognition and Departure Warning Based on a Hyperbolic Model" (in Chinese), Journal of Computer Applications, no. 09

Similar Documents

Publication Title
US20190195631A1 (en) Positioning method, positioning device, and robot
CN111649739B (en) Positioning method and device, automatic driving vehicle, electronic equipment and storage medium
CN112560680A (en) Lane line processing method and device, electronic device and storage medium
CN115690739B (en) Multi-sensor fusion obstacle presence detection method and automatic driving vehicle
CN110717141B (en) Lane line optimization method, device and storage medium
CN114419165A (en) Camera external parameter correcting method, device, electronic equipment and storage medium
CN112528927A (en) Confidence determination method based on trajectory analysis, roadside equipment and cloud control platform
CN114743178B (en) Road edge line generation method, device, equipment and storage medium
CN115900695A (en) Intelligent parking vehicle positioning method applied to vehicle
CN117745845A (en) Method, device, equipment and storage medium for determining external parameter information
CN116823958A (en) Yaw angle estimation method, device, equipment, vehicle and medium of vehicle-mounted camera
CN116543032A (en) Impact object ranging method, device, ranging equipment and storage medium
CN116772858A (en) Vehicle positioning method, device, positioning equipment and storage medium
CN114120252B (en) Automatic driving vehicle state identification method and device, electronic equipment and vehicle
CN114987497A (en) Backward lane line fitting method and device, electronic equipment and storage medium
CN113587940A (en) Lane line checking method and system based on vehicle turning radius and vehicle
CN114694121A (en) Lane line correction method, lane line correction device, electronic apparatus, storage medium, and program product
CN114049615B (en) Traffic object fusion association method and device in driving environment and edge computing equipment
CN112180382A (en) Self-adaptive 3D-LSLAM positioning method, device and system based on constant-speed model
CN112683216B (en) Method and device for generating vehicle length information, road side equipment and cloud control platform
CN117289686B (en) Parameter calibration method and device, electronic equipment and storage medium
CN115187957A (en) Ground element detection method, device, equipment, medium and product
CN116468801A (en) External parameter calibration method and device of vehicle-mounted looking-around camera, vehicle and storage medium
CN117360548A (en) Vehicle transverse control method, device, equipment and storage medium
CN117557535A (en) Map element detection method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination