Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic view illustrating an application scenario of a lane keeping method according to an embodiment of the present invention, and fig. 2 is a schematic flow chart of the lane keeping method according to an embodiment of the present invention. The lane keeping method is applied to a server. The server performs data interaction with a camera so as to combine the accuracy of lane line detection with the stability of the vehicle during lane keeping. First, internal and external parameter calibration of the camera is completed based on the camera parameters, and image grayscale processing and inverse perspective transformation are completed. Second, lane line feature extraction is completed by DBSCAN (Density-Based Spatial Clustering of Applications with Noise), and lane line fitting is realized by combining a moving least square method. Finally, the lane line potential field is fused with a lane keeping model for predictive control, and a vehicle longitudinal-lateral cooperative control module is completed to realize the lane keeping function. The method solves the problem of algorithm compatibility in solid-line and dashed-line scenarios and improves the stability of the algorithm during turning control, so that the vehicle has better tracking performance and tracking potential.
Fig. 2 is a schematic flow chart of a lane keeping method according to an embodiment of the present invention. As shown in fig. 2, the method includes the following steps S110 to S150.
And S110, acquiring camera calibration data.
In the present embodiment, the camera calibration data refers to the 2D pixel coordinates formed by the camera shooting the lane.
In an embodiment, referring to fig. 3, the step S110 may include steps S111 to S112.
And S111, calibrating the internal parameters and the external parameters of the camera based on the camera parameters to obtain a calibration result.
In this embodiment, the calibration result refers to the calibration content of the internal and external parameters of the camera.
And S112, acquiring 2D pixel coordinates according to the calibration result to obtain camera calibration data.
In this embodiment, camera parameter identification is a precondition for expressing the mapping relationship between spatial coordinates and pixel coordinates; the external and internal parameters of the camera are identified by combining information such as the installation position of the camera, the size of the lens, and the size of the output image. The transformation of any coordinate in 3D space into a 2D pixel coordinate is a perspective transformation, summarized as

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where $K$ is the camera intrinsic matrix, $R$ and $T$ are the camera extrinsic parameters, $(X_w, Y_w, Z_w)$ is the position coordinate of a point in the geodetic coordinate system, $Z_c$ is the Z-axis coordinate of the point in the camera coordinate system, and $(u, v)$ is the pixel coordinate of the point. This formula converts the 3D coordinate $(X_w, Y_w, Z_w)$ into the 2D pixel coordinate $(u, v)$. The output of this step is the 2D pixel coordinates $(u, v)$, which serve as the input for lane line extraction.
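By way of illustration, the perspective transformation of this step can be sketched as follows; the intrinsic matrix, rotation, and translation values are placeholder assumptions for the example, not calibration results of the embodiment.

```python
import numpy as np

# Placeholder intrinsic matrix K (assumed focal lengths and principal point).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Placeholder extrinsics: identity rotation and an assumed translation.
R = np.eye(3)
T = np.array([0.0, -1.5, 0.0])

def project(point_world):
    """Map a 3D point in the geodetic frame to the 2D pixel coordinate (u, v)."""
    p_cam = R @ point_world + T      # geodetic frame -> camera frame
    uvw = K @ p_cam                  # camera frame -> homogeneous pixel vector
    z_c = uvw[2]                     # Z-axis coordinate in the camera frame
    return uvw[0] / z_c, uvw[1] / z_c

u, v = project(np.array([2.0, 0.0, 10.0]))  # a point 10 m ahead, 2 m to the side
```

For the assumed parameters above, this point maps to pixel (480, 120); the same routine, with the embodiment's calibrated parameters substituted, would supply the 2D pixel coordinates used for lane line extraction.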
And S120, extracting the lane line according to the camera calibration data to obtain an extraction result.
In the present embodiment, the extracted result refers to the extracted lane line.
In an embodiment, referring to fig. 4, the step S120 may include steps S121 to S123.
And S121, performing image gray scale processing on the camera calibration data to obtain a processing result.
In this embodiment, the processing result refers to data formed by performing image gradation processing on the camera calibration data.
The image gray scale processing of data belongs to the prior art, and is not described herein again.
And S122, performing inverse perspective transformation on the processing result to obtain a transformation result.
In this embodiment, the transformation result is a result obtained by performing inverse perspective transformation on the processing result.
Specifically, the inverse perspective transformation mainly converts the pixel coordinate system into the world coordinate system, transforming the input image into a bird's-eye view in which real-world lane lines appear parallel and of equal width, so as to improve the detection accuracy of the lane lines. The coordinate conversion is realized mainly by

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \mathrm{IPM} \cdot \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}$$

where IPM is the inverse perspective matrix. The pixel coordinates $(u, v)$ are input and inversely converted by the inverse perspective matrix to output the 3D coordinates $(X, Y, Z)$.
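For lane points, which lie on the road surface, this inverse conversion can be sketched by inverting a ground-to-pixel homography; the intrinsic and pose values below are assumptions for the example, not the embodiment's calibration.

```python
import numpy as np

K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Assumed pose: the camera is pitched to look at the ground plane Z = 0.
R = np.array([[1.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
t = np.array([0.0, 1.5, 4.0])

# Forward homography from the ground plane to pixels, and its inverse (IPM).
H = K @ np.column_stack((R[:, 0], R[:, 1], t))
IPM = np.linalg.inv(H)

def pixel_to_ground(u, v):
    """Inversely convert the pixel coordinate (u, v) to ground coordinates (X, Y)."""
    w = IPM @ np.array([u, v, 1.0])
    return w[0] / w[2], w[1] / w[2]
```

Mapping every image pixel this way produces the bird's-eye view in which lane lines appear parallel and of equal width.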
And S123, performing feature extraction on the transformation result to obtain an extraction result.
In an embodiment, referring to fig. 4, the step S123 may include steps S1231 to S1232.
And S1231, performing clustering feature extraction on the transformation result for objects of the same density by adopting a DBSCAN clustering algorithm, so as to obtain feature points.

In this embodiment, the feature points refer to the result obtained by performing clustering feature extraction on objects of the same density using the DBSCAN clustering algorithm.
And S1232, identifying and screening the connection points among the feature points through parameter adjustment, so as to obtain an extraction result.

Specifically, after a series of 3D coordinates are obtained, the DBSCAN clustering algorithm is adopted to extract clustering features for objects of the same density; the connection points are then identified and screened through parameter adjustment. At this point, the lane lines are extracted as a series of feature points.
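The density-based clustering of steps S1231 to S1232 can be sketched with a minimal DBSCAN implementation; the eps and min_pts values and the synthetic point strips are assumptions for the example, not tuned parameters of the embodiment.

```python
import numpy as np

def dbscan(points, eps=0.5, min_pts=3):
    """Minimal DBSCAN sketch: label each point with a cluster id, -1 = noise."""
    n = len(points)
    labels = [None] * n
    cluster = -1

    def neighbors(i):
        d = np.linalg.norm(points - points[i], axis=1)
        return [j for j in range(n) if d[j] <= eps]

    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1           # provisionally noise
            continue
        cluster += 1                 # i is a core point: start a new cluster
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:
                seeds.extend(j_nbrs)  # expand only through core points
    return labels

# Two dense strips of bird's-eye-view points (two lane lines) plus one outlier.
left = np.array([[0.0, y] for y in np.arange(0, 2, 0.2)])
right = np.array([[3.5, y] for y in np.arange(0, 2, 0.2)])
noise = np.array([[10.0, 10.0]])
labels = dbscan(np.vstack([left, right, noise]), eps=0.5, min_pts=3)
```

Each dense strip receives its own cluster label, while isolated false detections are labelled -1 (noise) and screened out.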
And S130, performing lane line fitting according to the extraction result to obtain a fitting result.
In this embodiment, the fitting result refers to the result formed by fitting the series of extracted feature points. Specifically, curve fitting is performed on the extraction result based on a moving least square method to obtain the fitting result and ensure the curvature continuity of the lane line, as shown in fig. 8 to fig. 11.
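A minimal sketch of such a moving least squares fit is shown below: at each evaluation abscissa a local polynomial is fitted with Gaussian weights, so nearby feature points dominate; the degree and bandwidth values are illustrative assumptions.

```python
import numpy as np

def mls_fit(x, y, x_eval, degree=2, h=1.0):
    """Moving least squares: fit a local weighted polynomial at each
    evaluation abscissa and return the fitted ordinates."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    fitted = []
    for xe in x_eval:
        w = np.exp(-((x - xe) ** 2) / (2.0 * h ** 2))  # Gaussian kernel weights
        # np.polyfit weights multiply the residuals, so pass sqrt(w) to
        # minimise the w-weighted squared error.
        coeffs = np.polyfit(x, y, degree, w=np.sqrt(w))
        fitted.append(np.polyval(coeffs, xe))
    return np.array(fitted)
```

Because the weight window moves with the evaluation point, the fitted curve follows local lane geometry while remaining smooth, which is the curvature-continuity property relied on in this step.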
And S140, constructing a lane line potential field based on the transverse position of the vehicle according to the fitting result.
In this embodiment, the lane line potential field refers to a lane line composite potential field.
In an embodiment, referring to fig. 6, the step S140 may include steps S141 to S142.
And S141, determining the curve coordinate mapping relation of the lane line according to the fitting result.
In this embodiment, after the fitting of the lane line is completed, the absolute coordinates of the lane line are obtained, and then the conversion between the absolute coordinates and the road coordinates is realized based on the coordinate conversion theory, so as to obtain the curve coordinate mapping relationship of the lane line, as shown in fig. 12.
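The conversion from absolute coordinates to road (curve) coordinates can be sketched by projecting a point onto a sampled centre line, yielding the arc length s along the road and the signed lateral offset d; the straight test path below is an assumption for the example.

```python
import numpy as np

def to_curve_coords(center_line, point):
    """Project an absolute 2D point onto a polyline centre line and return
    curve coordinates (s, d): arc length along the line and signed lateral
    offset (positive to the left of the travel direction)."""
    path = np.asarray(center_line, float)
    point = np.asarray(point, float)
    seg = np.diff(path, axis=0)                      # segment vectors
    seg_len = np.linalg.norm(seg, axis=1)
    s_cum = np.concatenate([[0.0], np.cumsum(seg_len)])
    best_dist, best_s, best_d = np.inf, 0.0, 0.0
    for i in range(len(seg)):
        # parameter of the closest point on segment i, clipped to [0, 1]
        t = np.clip((point - path[i]) @ seg[i] / seg_len[i] ** 2, 0.0, 1.0)
        offset = point - (path[i] + t * seg[i])
        dist = np.linalg.norm(offset)
        if dist < best_dist:
            # 2D cross product gives the side of the travel direction
            cross = seg[i][0] * offset[1] - seg[i][1] * offset[0]
            best_dist = dist
            best_s = s_cum[i] + t * seg_len[i]
            best_d = dist if cross >= 0 else -dist
    return best_s, best_d
```

With the fitted lane centre line as the polyline, this mapping supplies the lateral abscissa d used by the potential field design of the following step.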
And S142, designing a lane line potential field function based on the transverse position of the vehicle according to the curve coordinate mapping relation of the lane line so as to construct a lane line potential field.
The main aim of the lane keeping function is to make the vehicle run stably within a certain range of the lane center line, rather than to sacrifice vehicle stability in order to follow the center line strictly. The lateral distance of the vehicle from the lane line in the curve coordinate system is set as d, as shown in detail in fig. 13.
Specifically, a lane line potential field function is designed based on the calculated lateral position of the vehicle, and a lane line composite potential field is constructed. Potential field functions are respectively defined on the lane boundaries and the lane center line in the structured road. In the designed lane line potential field functions, $U_{rep}$ and $U_{att}$ are respectively the repulsive field at the lane line boundary and the gravitational field at the lane center line under the curve coordinates; $b$ and $a$ are respectively the adjustment coefficients of the lane boundary and lane center potential fields; $d_{LR}$ and $d_{LL}$ are respectively the abscissae of the right and left lane lines; $d$ is the abscissa of the target vehicle; and $d_{LC}$ is the abscissa of the lane center line. The lane line potential field may be constructed by combining the identified lane lines with the potential field function, as shown in fig. 14.
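Such a composite field can be sketched with commonly used assumed forms: an inverse-square repulsive term at each lane boundary and a quadratic attractive term at the centre line. These forms and all numeric coefficients are assumptions of the example, not the embodiment's exact design.

```python
# Assumed lane geometry and coefficients (placeholders, in curve coordinates).
d_LL, d_LR = -1.75, 1.75     # left / right lane-line abscissae (3.5 m lane)
d_LC = 0.0                   # centre-line abscissa
a, b = 0.5, 0.1              # centre / boundary adjustment coefficients

def lane_potential(d):
    """Composite lane-line potential at the vehicle's lateral abscissa d."""
    # boundaries repel: potential grows sharply as d approaches a lane line
    u_rep = b * (1.0 / (d - d_LR) ** 2 + 1.0 / (d - d_LL) ** 2)
    # centre line attracts: quadratic well around d_LC
    u_att = a * (d - d_LC) ** 2
    return u_rep + u_att
```

Minimising such a field keeps the vehicle near the centre line without forcing it onto the line exactly, matching the stated design aim.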
S150, fusing the lane line potential field with a lane keeping model to obtain a lane keeping function.
In the present embodiment, the lane keeping function refers to a function for predicting lane keeping.
In an embodiment, referring to fig. 7, the step S150 may include steps S151 to S153.
And S151, calculating the transverse position and the longitudinal position of the vehicle from the lane line based on the lane keeping model.
In this embodiment, an optimization problem is constructed based on the model predictive control theory, the conversion between geodetic coordinates and road coordinates is realized based on the coordinate conversion theory, and the lateral and longitudinal positions of the vehicle from the lane line are calculated; the predicted values and the reference values in the prediction horizon are then generated from these quantities.
and S152, designing an objective function by combining the transverse distance, the reference transverse target and the lane line potential field.
In this embodiment, the objective function is

$$J = (V_{pre} - V_{ref})^T Q (V_{pre} - V_{ref}) + U^T R U$$

where $V_{pre}$ represents the $N_P$-step predicted state, $V_{ref}$ represents the $N_P$-step state reference, and $Q$ and $R$ are respectively the state weight coefficient matrix and the control weight coefficient matrix.
And S153, constructing an optimization problem by combining the lane line potential field, the objective function and the vehicle three-degree-of-freedom dynamic model to obtain a lane keeping function.
Combining step S142, step S151, step S152 and the vehicle three-degree-of-freedom dynamic model, a nonlinear, multi-objective, multi-constraint convex optimization problem is constructed as follows:

$$\min_{U} \; J \quad \text{s.t.} \quad x(k+1) = A_d x(k) + B_d u(k), \quad y(k) = C_d x(k) + D_d u(k), \quad y_{min} - \varepsilon \le y(k) \le y_{max} + \varepsilon$$

where $A_d$ and $B_d$ represent the state equation coefficient matrices of the discretized vehicle model, $C_d$ and $D_d$ represent the observation equation coefficient matrices, and $\varepsilon$ represents the relaxation factor, which converts the soft constraint into a semi-hard constraint. The semi-hard constraint allows the vehicle to deviate from the lane center line within a certain range at the cost of a penalty.
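A toy numerical sketch of such a predictive control step follows; the discrete matrices, weights, horizon, and the coarse search standing in for a convex solver are all assumptions of the example, not the embodiment's three-degree-of-freedom model.

```python
import numpy as np

# Toy discrete lateral model: state x = [lateral offset, heading error].
A_d = np.array([[1.0, 0.1],
                [0.0, 1.0]])
B_d = np.array([[0.0],
                [0.1]])
Q = np.diag([1.0, 0.1])      # state weight coefficient matrix
R = np.array([[0.01]])       # control weight coefficient matrix
N_P = 10                     # prediction horizon

def cost(x0, u_seq, x_ref):
    """Quadratic tracking cost accumulated over the prediction horizon."""
    x, J = x0.copy(), 0.0
    for u in u_seq:
        x = A_d @ x + B_d @ np.array([u])  # propagate the discrete model
        e = x - x_ref
        J += e @ Q @ e + u * R[0, 0] * u
    return J

# Coarse search over constant steering inputs as a stand-in for the solver.
x0 = np.array([0.5, 0.0])            # start half a metre off the centre line
x_ref = np.zeros(2)
candidates = np.linspace(-1.0, 1.0, 41)
best_u = min(candidates, key=lambda u: cost(x0, [u] * N_P, x_ref))
```

The search selects a steering input that drives the predicted lateral offset back toward the centre line; in the embodiment, a constrained convex solver with the relaxation factor takes the place of this grid search.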
The method of this embodiment first completes the calibration of the internal and external parameters of the camera based on the camera parameters, and completes image grayscale processing and inverse perspective transformation; second, it completes lane line feature extraction based on a spatial density clustering algorithm and realizes lane line fitting by combining a moving least square method; finally, it fuses the lane line potential field with the lane keeping model for predictive control and completes the vehicle longitudinal-lateral cooperative control module to realize the lane keeping function. The lateral distance predicted by the vehicle model, the reference lateral target and the lane line potential field are combined, the objective function is designed with vehicle passing efficiency and tracking accuracy as the targets, and a nonlinear, multi-objective, multi-constraint convex optimization problem is constructed by combining the corresponding formulas. The lane keeping function thus combines lane line detection accuracy with the stability of the vehicle during lane keeping, has good compatibility in solid-line and dashed-line scenarios, and has better stability in curve control. Compared with single lateral control, the vehicle tracking performance and tracking potential are improved.
According to the lane keeping method, camera calibration data are obtained by calibrating the internal and external parameters of the camera; lane lines are extracted and then fitted; a lane line potential field is constructed by combining the lateral position of the vehicle; and the lane line potential field is fused with a lane keeping model to obtain a lane keeping function. The lane keeping function can be used for lane keeping prediction, can respond to small-probability events in actual driving scenarios, is suitable for detecting curves and dashed lines, and improves stability during curve control.
Fig. 15 is a schematic block diagram of a lane keeping apparatus 300 according to an embodiment of the present invention. As shown in fig. 15, the present invention also provides a lane keeping device 300 corresponding to the above lane keeping method. The lane keeping apparatus 300 includes a unit for performing the above-described lane keeping method, and the apparatus may be configured in a server. Specifically, referring to fig. 15, the lane keeping device 300 includes a data acquisition unit 301, a lane line extraction unit 302, a fitting unit 303, a construction unit 304, and a fusion unit 305.
A data acquisition unit 301 configured to acquire camera calibration data; a lane line extraction unit 302, configured to extract a lane line according to the camera calibration data to obtain an extraction result; a fitting unit 303, configured to perform lane line fitting according to the extraction result to obtain a fitting result; a construction unit 304, configured to construct a lane line potential field based on a vehicle lateral position according to the fitting result; and a fusion unit 305, configured to fuse the lane line potential field and a lane keeping model to obtain a lane keeping function.
In one embodiment, as shown in fig. 16, the data acquisition unit 301 includes a calibration subunit 3011 and a coordinate acquisition subunit 3012.
The calibration subunit 3011 is configured to calibrate the camera internal parameter and the camera external parameter based on the camera parameter to obtain a calibration result; and a coordinate obtaining subunit 3012, configured to obtain 2D pixel coordinates according to the calibration result, so as to obtain camera calibration data.
In one embodiment, as shown in fig. 17, the lane line extraction unit 302 includes a processing subunit 3021, a transformation subunit 3022, and an extraction subunit 3023.
A processing subunit 3021, configured to perform image grayscale processing on the camera calibration data to obtain a processing result; a transformation subunit 3022, configured to perform inverse perspective transformation on the processing result to obtain a transformation result; an extracting subunit 3023, configured to perform feature extraction on the transform result to obtain an extraction result.
In an embodiment, as shown in fig. 18, the extracting subunit 3023 includes a feature point extracting module 30231 and a filtering module 30232.
A feature point extraction module 30231, configured to perform clustering feature extraction on the transformation result for objects of the same density by adopting a DBSCAN clustering algorithm, so as to obtain feature points; a screening module 30232, configured to identify and screen the connection points among the feature points through parameter adjustment, so as to obtain an extraction result.
In an embodiment, the fitting unit 303 is configured to perform curve fitting on the extracted result based on a moving least square method to obtain a fitting result.
In one embodiment, as shown in fig. 19, the building unit 304 includes a relationship determining subunit 3041 and a function building subunit 3042.
A relationship determination subunit 3041, configured to determine a curve coordinate mapping relationship of the lane line according to the fitting result; the function constructing subunit 3042 is configured to design a lane line potential field function based on the vehicle transverse position according to the curve coordinate mapping relationship of the lane line, so as to construct the lane line potential field.
In one embodiment, as shown in FIG. 20, the fusion unit 305 includes a position calculation subunit 3051, an objective function design subunit 3052, and a problem construction subunit 3053.
A position calculation subunit 3051 for calculating a lateral position and a longitudinal position of the vehicle from the lane line based on the lane keeping model; an objective function design subunit 3052, configured to design an objective function by combining the lateral distance, the reference lateral target, and the lane line potential field; and the problem construction subunit 3053, configured to construct an optimization problem by combining the lane line potential field, the objective function, and the vehicle three-degree-of-freedom dynamic model, so as to obtain a lane keeping function.
It should be noted that, as will be clear to those skilled in the art, the specific implementation process of the lane keeping device 300 and each unit can be referred to the corresponding description in the foregoing method embodiment, and for convenience and brevity of description, no further description is provided herein.
The lane keeping apparatus 300 described above may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 21.
Referring to fig. 21, fig. 21 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a server, wherein the server may be an independent server or a server cluster composed of a plurality of servers.
Referring to fig. 21, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032 comprises program instructions that, when executed, may cause the processor 502 to perform a lane keeping method.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for running the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 may be caused to execute a lane keeping method.
The network interface 505 is used for network communication with other devices. Those skilled in the art will appreciate that the configuration shown in fig. 21 is a block diagram of only a portion of the configuration associated with the present application and does not constitute a limitation of the computer device 500 to which the present application is applied, and that a particular computer device 500 may include more or less components than those shown, or combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following steps:
acquiring camera calibration data; extracting a lane line according to the camera calibration data to obtain an extraction result; performing lane line fitting according to the extraction result to obtain a fitting result; constructing a lane line potential field based on the transverse position of the vehicle according to the fitting result; and fusing the lane line potential field and a lane keeping model to obtain a lane keeping function.
In an embodiment, when the processor 502 implements the step of acquiring the camera calibration data, the following steps are specifically implemented:
calibrating the internal parameters and the external parameters of the camera based on the camera parameters to obtain a calibration result; and acquiring 2D pixel coordinates according to the calibration result to obtain camera calibration data.
In an embodiment, when the processor 502 implements the step of extracting the lane line according to the camera calibration data to obtain the extraction result, the following steps are specifically implemented:
performing image gray processing on the camera calibration data to obtain a processing result; carrying out inverse perspective transformation on the processing result to obtain a transformation result; and performing feature extraction on the transformation result to obtain an extraction result.
In an embodiment, when implementing the step of extracting the feature of the transformation result to obtain the extraction result, the processor 502 specifically implements the following steps:
adopting a DBSCAN clustering algorithm to perform clustering feature extraction on the transformation result for objects of the same density so as to obtain feature points; and identifying and screening the connection points among the feature points through parameter adjustment to obtain an extraction result.
In an embodiment, when implementing the step of performing lane line fitting according to the extraction result to obtain a fitting result, the processor 502 specifically implements the following steps:
and performing curve fitting on the extraction result based on a moving least square method to obtain a fitting result.
In an embodiment, when implementing the step of constructing the lane line potential field based on the lateral position of the vehicle according to the fitting result, the processor 502 specifically implements the following steps:
determining a curve coordinate mapping relation of the lane line according to the fitting result; and designing a potential field function of the lane line based on the transverse position of the vehicle according to the curve coordinate mapping relation of the lane line so as to construct the potential field of the lane line.
In an embodiment, when the step of fusing the lane line potential field and the lane keeping model to obtain the lane keeping function is implemented by the processor 502, the following steps are specifically implemented:
calculating a lateral position and a longitudinal position of the vehicle from the lane line based on the lane keeping model; designing a target function by combining the transverse distance, the reference transverse target and the lane line potential field; and constructing an optimization problem by combining the lane line potential field, the objective function and the vehicle three-degree-of-freedom dynamic model to obtain a lane keeping function.
It should be understood that in the embodiment of the present application, the processor 502 may be a Central Processing Unit (CPU), or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It will be understood by those skilled in the art that all or part of the flow of the method implementing the above embodiments may be implemented by a computer program instructing associated hardware. The computer program includes program instructions, and the computer program may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer-readable storage medium. The storage medium stores a computer program, wherein the computer program, when executed by a processor, causes the processor to perform the steps of:
acquiring camera calibration data; extracting a lane line according to the camera calibration data to obtain an extraction result; performing lane line fitting according to the extraction result to obtain a fitting result; constructing a lane line potential field based on the transverse position of the vehicle according to the fitting result; and fusing the lane line potential field and a lane keeping model to obtain a lane keeping function.
In an embodiment, when the step of obtaining camera calibration data is implemented by executing the computer program, the processor specifically implements the following steps:
calibrating the internal parameters and the external parameters of the camera based on the camera parameters to obtain a calibration result; and acquiring 2D pixel coordinates according to the calibration result to obtain camera calibration data.
In an embodiment, when the processor executes the computer program to implement the step of extracting lane lines according to the camera calibration data to obtain an extraction result, the following steps are specifically implemented:
performing image gray processing on the camera calibration data to obtain a processing result; carrying out inverse perspective transformation on the processing result to obtain a transformation result; and performing feature extraction on the transformation result to obtain an extraction result.
In an embodiment, when the processor executes the computer program to implement the step of extracting the feature of the transformation result to obtain the extraction result, the processor specifically implements the following steps:
adopting a DBSCAN clustering algorithm to perform clustering feature extraction on the transformation result for objects of the same density so as to obtain feature points; and identifying and screening the connection points among the feature points through parameter adjustment to obtain an extraction result.
In an embodiment, when the processor executes the computer program to implement the step of performing lane line fitting according to the extraction result to obtain a fitting result, the following steps are specifically implemented:
and performing curve fitting on the extraction result based on a moving least square method to obtain a fitting result.
In an embodiment, when the step of constructing the lane line potential field based on the lateral position of the vehicle according to the fitting result is implemented by the processor executing the computer program, the following steps are specifically implemented:
determining a curve coordinate mapping relation of the lane line according to the fitting result; and designing a potential field function of the lane line based on the transverse position of the vehicle according to the curve coordinate mapping relation of the lane line so as to construct the potential field of the lane line.
In an embodiment, when the processor executes the computer program to realize the step of fusing the lane line potential field and the lane keeping model to obtain the lane keeping function, the following steps are specifically realized:
calculating a lateral position and a longitudinal position of the vehicle from the lane line based on the lane keeping model; designing a target function by combining the transverse distance, the reference transverse target and the lane line potential field; and constructing an optimization problem by combining the lane line potential field, the objective function and the vehicle three-degree-of-freedom dynamic model to obtain a lane keeping function.
The storage medium may be any of various computer-readable storage media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of the two, and that the components and steps of the examples have been described above in general terms of their functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be merged, divided and deleted according to actual needs. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.