CN108801286B - Method and device for determining a driving trajectory - Google Patents

Method and device for determining a driving trajectory

Info

Publication number
CN108801286B
Authority
CN
China
Prior art keywords: lane line, predicted, current, line information, camera
Prior art date
Legal status
Active
Application number
CN201810866434.8A
Other languages
Chinese (zh)
Other versions
CN108801286A (en)
Inventor
张芬
杜金枝
王秀田
Current Assignee
Chery Automobile Co Ltd
Original Assignee
Chery Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Chery Automobile Co Ltd
Priority to CN201810866434.8A
Publication of CN108801286A
Application granted
Publication of CN108801286B


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3446 - Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes

Abstract

The disclosure provides a method and a device for determining a driving track, and belongs to the technical field of unmanned driving. The method is applied to an unmanned intelligent vehicle that comprises a processor, a camera and an inertia measurement component: the camera is mounted on the front windshield of the intelligent vehicle, the inertia measurement component is mounted at a preset position on the vehicle body, and the processor is electrically connected to the camera and to the inertia measurement component. The method comprises the following steps: the processor determines predicted lane line information within a preset time length based on the current lane line information sent by the camera and the current vehicle speed, current acceleration and current angular velocity sent by the inertia measurement component; the processor then determines a predicted travel track within the preset time length based on the current travel track and the predicted lane line information. By adopting the method and the device, the accuracy of the driving track can be improved.

Description

Method and device for determining a driving trajectory
Technical Field
The disclosure relates to the technical field of unmanned driving, in particular to a method and a device for determining a driving track.
Background
With the rapid development of science and technology, driverless technology has gradually matured, but the safety of unmanned intelligent vehicles while driving remains a focus of public attention.
In the related art, to ensure that an intelligent vehicle travels safely within its lane and avoids drifting across the lane lines, a camera is usually mounted on the front windshield of the intelligent vehicle to collect lane line information in front of the vehicle, so that the intelligent vehicle can generate a travel track for the next moment based on the current lane line information. In this way, the intelligent vehicle can determine a travel track from the lane line information sent by the camera and travel within the lane lines according to that track.
In carrying out the present disclosure, the inventors found that at least the following problems exist:
the intelligent vehicle determines the driving track only from the lane line information sent by the camera, so the accuracy of the resulting driving track is relatively low.
Disclosure of Invention
The embodiment of the disclosure provides a method and a device for determining a driving track, so as to solve the problems of the related art. The technical scheme is as follows:
according to the embodiment, the method for determining the driving track is applied to an unmanned intelligent vehicle, the intelligent vehicle comprises a processor, a camera and an inertia measurement component, the camera is installed on a front windshield of the intelligent vehicle, the inertia measurement component is installed at a preset position of a vehicle body of the intelligent vehicle, and the processor is respectively and electrically connected with the camera and the inertia measurement component;
wherein the method comprises the following steps:
the processor determines the predicted lane line information within a preset time length based on the current lane line information sent by the camera, the current speed, the current acceleration and the current angular speed sent by the inertia measurement component;
the processor determines a predicted travel track within a preset time period based on the current travel track and the predicted lane line information.
Optionally, the inertia measurement component is installed at a position of a center of gravity of a body of the smart car.
Optionally, the current lane line information includes a current vehicle heading angle and a current horizontal axis distance between a lane line where the intelligent vehicle is located and a center point of the camera, and the predicted lane line information includes a predicted vehicle heading angle and a predicted horizontal axis distance between the lane line where the intelligent vehicle is located and the center point of the camera.
Optionally, the processor determines the predicted lane line information within a preset time length based on the current lane line information sent by the camera, the current vehicle speed, the current acceleration and the current angular speed sent by the inertia measurement component, and the method includes:
the processor determines first predicted lane line information within a preset time length based on the current lane line information sent by the camera;
the processor determines second predicted lane line information within a preset time length by using a Kalman state equation based on the current lane line information, the current speed, the current acceleration and the current angular velocity sent by the inertia measurement component;
the processor determines predicted lane line information based on the first predicted lane line information and the second predicted lane line information.
Optionally, the processor determining, based on the current lane line information sent by the camera, the first predicted lane line information within the preset time length includes:
the processor determining, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information sent by the camera, the predicted heading angle θ1(t+t_0) and the predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information within the preset time length.
The processor determining, based on the current lane line information and the current vehicle speed, current acceleration and current angular velocity sent by the inertia measurement component, the second predicted lane line information within the preset time length by using a Kalman state equation includes:
the processor determining, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information, together with the current vehicle speed v_t, the current acceleration a_t and the current angular velocity w_t sent by the inertia measurement component, the predicted lateral-axis distance d2(t+t_0) and the predicted vehicle heading angle θ2(t+t_0) in the second predicted lane line information within the preset time length t_0 by using the Kalman state equations.
The processor determining the predicted lane line information based on the first predicted lane line information and the second predicted lane line information includes:
the processor determining, from the predicted heading angle θ1(t+t_0) and the predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information and the predicted vehicle heading angle θ2(t+t_0) and the predicted lateral-axis distance d2(t+t_0) in the second predicted lane line information, the predicted vehicle heading angle θ(t+t_0) and the predicted lateral-axis distance d(t+t_0) in the predicted lane line information by using the weighted formulas θ(t+t_0) = β1·θ1(t+t_0) + (1 - β1)·θ2(t+t_0) and d(t+t_0) = λ1·d1(t+t_0) + (1 - λ1)·d2(t+t_0), where β1 and λ1 are both weight coefficients.
According to the embodiment, the device for determining the driving track is applied to an unmanned intelligent vehicle, the intelligent vehicle comprises a camera and an inertia measurement component, the camera is installed on a front windshield of the intelligent vehicle, the inertia measurement component is installed at a preset position of a vehicle body of the intelligent vehicle, and the device is respectively electrically connected with the camera and the inertia measurement component;
wherein the apparatus comprises:
the lane line information prediction module is used for determining predicted lane line information within a preset time length based on the current lane line information sent by the camera, the current speed, the current acceleration and the current angular speed sent by the inertia measurement component;
and the driving track prediction module is used for determining a predicted driving track within a preset time length based on the current driving track and the predicted lane line information.
Optionally, the inertia measurement component is installed at a position of a center of gravity of a body of the smart car.
Optionally, the current lane line information includes a current vehicle heading angle and a current horizontal axis distance between a lane line where the intelligent vehicle is located and a center point of the camera, and the predicted lane line information includes a predicted vehicle heading angle and a predicted horizontal axis distance between the lane line where the intelligent vehicle is located and the center point of the camera.
Optionally, the lane line information prediction module includes:
the first prediction unit is used for determining first predicted lane line information within a preset time length based on the current lane line information sent by the camera;
the second prediction unit is used for determining second prediction lane line information within a preset time length by using a Kalman state equation based on the current lane line information, the current speed, the current acceleration and the current angular speed which are sent by the inertia measurement component;
a determination unit configured to determine predicted lane line information based on the first predicted lane line information and the second predicted lane line information.
Optionally, the first prediction unit is configured to determine, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information sent by the camera, the predicted heading angle θ1(t+t_0) and the predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information within the preset time length.
The second prediction unit is configured to determine, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information, together with the current vehicle speed v_t, the current acceleration a_t and the current angular velocity w_t sent by the inertia measurement component, the predicted lateral-axis distance d2(t+t_0) and the predicted vehicle heading angle θ2(t+t_0) in the second predicted lane line information within the preset time length t_0 by using the Kalman state equations.
The determining unit is configured to determine, from the predicted heading angle θ1(t+t_0) and the predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information and the predicted vehicle heading angle θ2(t+t_0) and the predicted lateral-axis distance d2(t+t_0) in the second predicted lane line information, the predicted vehicle heading angle θ(t+t_0) and the predicted lateral-axis distance d(t+t_0) in the predicted lane line information by using the weighted formulas θ(t+t_0) = β1·θ1(t+t_0) + (1 - β1)·θ2(t+t_0) and d(t+t_0) = λ1·d1(t+t_0) + (1 - λ1)·d2(t+t_0), where β1 and λ1 are both weight coefficients.
According to the present embodiment, there is provided an apparatus for determining a driving trajectory, the apparatus comprising a processor and a memory, the memory storing at least one instruction, the instruction being loaded and executed by the processor to implement the method for determining a driving trajectory according to the first aspect.
According to the present embodiment, a computer-readable storage medium is provided, and at least one instruction is stored in the storage medium, and the instruction is loaded and executed by a processor to implement the method for determining a driving trajectory.
The technical scheme provided by the embodiments of the present disclosure has at least the following beneficial effects:
in the embodiment of the disclosure, the unmanned intelligent vehicle comprises a camera mounted on a front windshield and an inertia measurement component mounted at a preset position of a vehicle body, and when a processor of the intelligent vehicle determines a driving track in the driving process, the intelligent vehicle can determine predicted lane line information within a preset time length based on current lane line information sent by the camera, current vehicle speed, current acceleration and current angular speed sent by the inertia measurement component; and then, determining the predicted driving track within the preset time length based on the current driving track and the predicted lane line information. Therefore, in the running process of the intelligent vehicle, when the running track is determined, the data of the camera and the data of the inertia measurement component are fused to obtain the predicted running track, and therefore the accuracy of the running track can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
Fig. 1 is a flowchart of a method for determining a driving trajectory according to an embodiment of the present disclosure;
FIG. 2 is a schematic view of a vehicle heading angle provided by an embodiment of the present disclosure;
FIG. 3 is a flow chart of a method of determining a travel trajectory provided by an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an apparatus for determining a driving trajectory according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an apparatus for determining a driving trajectory according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the present disclosure more apparent, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.
The embodiment of the disclosure provides a method for determining a driving track, applied to an unmanned intelligent vehicle. The intelligent vehicle may comprise a processor, one or more cameras and an inertia measurement component. Each camera may be a monocular camera or a binocular camera and may be mounted on the front windshield of the intelligent vehicle; placing the camera inside the vehicle, on the front windshield, protects the camera. The inertia measurement component, often abbreviated IMU (Inertial Measurement Unit), is used to measure the speed, acceleration and angular velocity of an object and is usually installed at the center of gravity of the measured object. The processor may be a chip added to the intelligent vehicle for determining the driving track, or it may be the intelligent vehicle's own Electronic Control Unit (ECU).
As shown in fig. 1, the processor may execute the following flow:
in step 101, the processor determines the predicted lane line information within a preset time length based on the current lane line information sent by the camera, the current vehicle speed, the current acceleration and the current angular speed sent by the inertia measurement component.
The preset time length may be any short duration set by a technician, for example 10 milliseconds or 20 milliseconds; the shorter the preset time length, the better the real-time performance of the driving track determined by the processor and the higher its accuracy.
The lane line information may include a vehicle heading angle and the lateral-axis distance between the lane line where the vehicle is located and the center point of the camera. As shown in the schematic diagram of Fig. 2, the vehicle heading angle γ is the angle, in the ground coordinate system (x0, y0), between the vehicle's center-of-mass velocity v and the horizontal axis x0; in Fig. 2, ε is the vehicle centroid slip angle and η is the vehicle yaw angle, so that γ = ε + η. Correspondingly, the current lane line information comprises the current vehicle heading angle and the current lateral-axis distance between the lane line where the intelligent vehicle is located and the center point of the camera, and the predicted lane line information comprises the predicted vehicle heading angle and the predicted lateral-axis distance between the lane line where the intelligent vehicle is located and the center point of the camera.
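As a concrete illustration of the quantities exchanged here, the following Python sketch groups them into simple types; the class and field names are illustrative and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class LaneLineInfo:
    """Lane line information as described above; field names are illustrative."""
    heading_angle: float      # vehicle heading angle (rad): centroid slip angle + yaw angle
    lateral_distance: float   # lateral-axis distance (m) between the lane line and the camera center point

@dataclass
class ImuSample:
    """Measurements sent by the inertia measurement component."""
    speed: float              # current vehicle speed v_t (m/s)
    acceleration: float       # current acceleration a_t (m/s^2)
    angular_velocity: float   # current angular velocity w_t (rad/s)
```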
As shown in fig. 3, step 101 may be performed according to the following flow:
in step 1011, the processor determines first predicted lane line information within a preset time length based on the current lane line information sent by the camera.
In implementation, while the intelligent vehicle is driving, the camera acquires images in front of the vehicle in real time. The camera may contain a small built-in processor that performs the corresponding processing on the acquired images: for example, it can extract the lane lines in an image, fit a lane line equation to the extracted lane lines, and then obtain the current lane line information from that equation. To reduce the processing load on the processor, the camera may, after obtaining the current lane line information, also determine the first predicted lane line information within the preset time length based on the current lane line information and then send the first predicted lane line information to the processor. Conversely, to reduce the complexity of the camera, the camera may simply send the current lane line information to the processor after determining it, and the processor then determines the first predicted lane line information within the preset time length based on the current lane line information. One common way to read the current lane line information off a fitted lane line equation is sketched after this paragraph.
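A minimal sketch of that extraction step, assuming the lane line is fitted as a second-order polynomial in a camera-centered frame (x forward, y lateral); the function name, polynomial order and sign conventions are assumptions rather than the patent's procedure.

```python
import numpy as np

def lane_line_info_from_points(xs, ys):
    """Fit a lane line equation y = c0 + c1*x + c2*x^2 to detected lane line
    points (x forward, y lateral, camera center point at the origin) and read
    off the current lane line information. Illustrative only; not the
    patent's fitting procedure."""
    c2, c1, c0 = np.polyfit(xs, ys, deg=2)
    lateral_distance = c0                  # lateral-axis distance of the lane line at the camera
    heading_angle = float(np.arctan(c1))   # angle between the lane line tangent and the forward axis at x = 0
    return heading_angle, lateral_distance
```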
The processor can predict the lane line information at the next moment from the current lane line information sent by the camera with a statistical method, that is, future data over a short time can be predicted from historical data. Correspondingly, the processor determines, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information sent by the camera, the predicted heading angle θ1(t+t_0) and the predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information within the preset time length t_0. The first-prediction formulas themselves are reproduced only as images in the original publication.
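Since the first-prediction formulas are only available as images, the sketch below shows one plausible reading of the "statistical" short-horizon prediction, a linear extrapolation from the two most recent camera samples; the function and its form are assumptions, not the patent's formula.

```python
def first_prediction(theta_prev, d_prev, theta_cur, d_cur, dt, t0):
    """Camera-only first prediction over the preset time length t0.

    Assumed linear extrapolation from the two most recent camera samples taken
    dt seconds apart; the patent's actual first-prediction formulas are given
    only as images and may differ."""
    theta_pred1 = theta_cur + (theta_cur - theta_prev) / dt * t0
    d_pred1 = d_cur + (d_cur - d_prev) / dt * t0
    return theta_pred1, d_pred1
```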
In step 1012, the processor determines second predicted lane line information within the preset time length by using a Kalman state equation, based on the current lane line information and the current vehicle speed, current acceleration and current angular velocity sent by the inertia measurement component.
The Kalman state equations themselves are reproduced only as formula images in the original publication. In those equations, d2(t+t_0) denotes the predicted lateral-axis distance in the second predicted lane line information, d(t) the current lateral-axis distance, v_t the current vehicle speed, t_0 the preset time length, a_t the current acceleration, θ2(t+t_0) the predicted vehicle heading angle in the second predicted lane line information, θ(t) the current heading angle, and w_t the current angular velocity.
As the Kalman state equations show, the processor can predict the second predicted lane line information within the preset time length from the current lane line information together with the data sent by the inertia measurement component.
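Because the state equations appear only as images, the Python sketch below substitutes a plausible constant-acceleration, constant-yaw-rate propagation of the lateral-axis distance and heading angle over t_0; it is an assumption standing in for the patent's Kalman state equations, not a reproduction of them.

```python
import math

def second_prediction(theta_t, d_t, v_t, a_t, w_t, t0):
    """IMU-based second prediction over the preset time length t0.

    Assumed kinematics: the heading angle advances with the measured angular
    velocity, and the lateral-axis distance grows with the lateral component
    of the distance travelled under constant acceleration. This stands in for
    the patent's Kalman state equations, which are given only as images."""
    travelled = v_t * t0 + 0.5 * a_t * t0 ** 2     # distance covered during t0
    theta_pred2 = theta_t + w_t * t0               # heading propagated by the yaw rate
    d_pred2 = d_t + travelled * math.sin(theta_t)  # lateral offset propagated along the heading
    return theta_pred2, d_pred2
```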
In step 1013, the processor determines predicted lane line information based on the first predicted lane line information and the second predicted lane line information.
After determining the first predicted lane line information and the second predicted lane line information, the processor may determine the predicted lane line information with a weighted formula. Writing θ1(t+t_0) and d1(t+t_0) for the predicted vehicle heading angle and predicted lateral-axis distance in the first predicted lane line information, and θ2(t+t_0) and d2(t+t_0) for those in the second predicted lane line information, the weighted formulas are:
d(t+t_0) = λ1·d1(t+t_0) + (1 - λ1)·d2(t+t_0)
θ(t+t_0) = β1·θ1(t+t_0) + (1 - β1)·θ2(t+t_0)
where d(t+t_0) is the predicted lateral-axis distance in the predicted lane line information, θ(t+t_0) is the predicted vehicle heading angle in the predicted lane line information, λ1 is the weight coefficient of the predicted lateral-axis distance in the first predicted lane line information, (1 - λ1) is the weight coefficient of the predicted lateral-axis distance in the second predicted lane line information, β1 is the weight coefficient of the predicted vehicle heading angle in the first predicted lane line information, and (1 - β1) is the weight coefficient of the predicted vehicle heading angle in the second predicted lane line information.
In step 102, the processor determines a predicted travel track within a preset time period based on the current travel track and the predicted lane line information.
In an implementation, the processor may determine the predicted travel track within the preset time length based on the current travel track and the determined predicted lane line information. For example, the processor may generate the predicted travel track within the preset time length and a preset distance and then send it to the vehicle's actuators, such as the steering and power systems, so that the intelligent vehicle travels according to the predicted travel track over the next preset time length and preset distance. As another example, the intelligent vehicle may further include a trajectory planning component: the processor sends the determined predicted lane line information to the trajectory planning component, which determines the predicted travel track for the coming preset time length based on the current travel track and the predicted lane line information, calculates the vehicle speed and vehicle heading angle required during that time, and sends the corresponding control instructions to the vehicle's actuators in real time, again so that the intelligent vehicle travels according to the predicted travel track over the next preset time length and preset distance.
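To show how step 102 could consume the fused prediction, here is a deliberately simple, hypothetical sketch: the patent only describes trajectory generation at a high level (processor or trajectory planning component, then the actuators), so the waypoint construction below is an assumption.

```python
import math

def predict_travel_track(x0, y0, v_t, theta_pred, t0, n_points=10):
    """Hypothetical sketch of step 102: extend the current travel track with
    waypoints for the next preset time length t0, following the fused
    predicted vehicle heading angle at the current speed. The patent leaves
    the concrete trajectory generation to the processor or a trajectory
    planning component, so this is illustrative only."""
    waypoints = []
    for i in range(1, n_points + 1):
        t = t0 * i / n_points
        waypoints.append((x0 + v_t * t * math.cos(theta_pred),
                          y0 + v_t * t * math.sin(theta_pred)))
    return waypoints
```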
Based on the above, the unmanned intelligent vehicle comprises a camera mounted on a front windshield and an inertia measurement component mounted at a preset position of a vehicle body, and when determining a running track, a processor of the intelligent vehicle may first determine predicted lane line information within a preset time length based on current lane line information sent by the camera, current vehicle speed, current acceleration and current angular speed sent by the inertia measurement component; and then, determining the predicted driving track within the preset time length based on the current driving track and the predicted lane line information. Therefore, in the running process of the intelligent vehicle, when the running track is determined, the data of the camera and the data of the inertia measurement component are fused to obtain the predicted running track, and therefore the accuracy of the running track can be improved.
In one possible application, the intelligent vehicle may be a household or commercial vehicle on the market, for example a vehicle used to transport goods. Such a vehicle can determine its driving track in real time according to the method of this embodiment and drive along the determined track in real time, which further improves the driving safety of the unmanned intelligent vehicle.
In the embodiment of the disclosure, the unmanned intelligent vehicle comprises a camera mounted on a front windshield and an inertia measurement component mounted at a preset position of a vehicle body, and when a processor of the intelligent vehicle determines a driving track in the driving process, the intelligent vehicle can determine predicted lane line information within a preset time length based on current lane line information sent by the camera, current vehicle speed, current acceleration and current angular speed sent by the inertia measurement component; and then, determining the predicted driving track within the preset time length and the preset distance based on the current driving track and the predicted lane line information. Therefore, in the running process of the intelligent vehicle, when the running track is determined, the data of the camera and the data of the inertia measurement component are fused to obtain the predicted running track, and therefore the accuracy of the running track can be improved.
Based on the same technical concept, the embodiment of the disclosure further provides a device for determining a driving track, the device is applied to an unmanned intelligent vehicle, the intelligent vehicle comprises a camera and an inertia measurement component, the camera is mounted on a front windshield of the intelligent vehicle, the inertia measurement component is mounted at a preset position of a vehicle body of the intelligent vehicle, and the device is electrically connected with the camera and the inertia measurement component respectively;
as shown in fig. 4, the apparatus includes:
a lane line information prediction module 410, configured to determine predicted lane line information within a preset time period based on current lane line information sent by the camera, and a current vehicle speed, a current acceleration, and a current angular velocity sent by the inertia measurement component;
and the driving track prediction module 420 is configured to determine a predicted driving track within a preset time duration based on the current driving track and the predicted lane line information.
Optionally, the inertia measurement component is installed at a position of a center of gravity of a body of the smart car.
Optionally, the current lane line information includes a current vehicle heading angle and a current horizontal axis distance between a lane line where the intelligent vehicle is located and a center point of the camera, and the predicted lane line information includes a predicted vehicle heading angle and a predicted horizontal axis distance between the lane line where the intelligent vehicle is located and the center point of the camera.
Optionally, as shown in fig. 5, the lane line information prediction module 410 includes:
a first prediction unit 411, configured to determine first predicted lane line information within a preset time duration based on current lane line information sent by the camera;
a second prediction unit 412, configured to determine, based on the current lane line information, the current vehicle speed, the current acceleration, and the current angular velocity sent by the inertial measurement unit, second prediction lane line information within a preset time duration by using a kalman state equation;
a determining unit 413 configured to determine predicted lane line information based on the first predicted lane line information and the second predicted lane line information.
Optionally, the first prediction unit 411 is configured to determine, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information sent by the camera, the predicted heading angle θ1(t+t_0) and the predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information within the preset time length.
The second prediction unit 412 is configured to determine, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information, together with the current vehicle speed v_t, the current acceleration a_t and the current angular velocity w_t sent by the inertia measurement component, the predicted lateral-axis distance d2(t+t_0) and the predicted vehicle heading angle θ2(t+t_0) in the second predicted lane line information within the preset time length t_0 by using the Kalman state equations.
The determining unit 413 is configured to determine, from the predicted heading angle θ1(t+t_0) and the predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information and the predicted vehicle heading angle θ2(t+t_0) and the predicted lateral-axis distance d2(t+t_0) in the second predicted lane line information, the predicted vehicle heading angle θ(t+t_0) and the predicted lateral-axis distance d(t+t_0) in the predicted lane line information by using the weighted formulas θ(t+t_0) = β1·θ1(t+t_0) + (1 - β1)·θ2(t+t_0) and d(t+t_0) = λ1·d1(t+t_0) + (1 - λ1)·d2(t+t_0), where β1 and λ1 are both weight coefficients.
In the embodiment of the disclosure, the unmanned intelligent vehicle comprises a camera mounted on a front windshield and an inertia measurement component mounted at a preset position of a vehicle body, and the device for determining a driving track of the intelligent vehicle can determine predicted lane line information within a preset time length based on current lane line information sent by the camera, current vehicle speed, current acceleration and current angular speed sent by the inertia measurement component; and then, determining the predicted driving track within the preset time length and the preset distance based on the current driving track and the predicted lane line information. Therefore, in the running process of the intelligent vehicle, when the running track is determined, the data of the camera and the data of the inertia measurement component are fused to obtain the predicted running track, and therefore the accuracy of the running track can be improved.
It should be noted that: in the device for determining a travel track according to the above embodiment, when determining a travel track, only the division of the above functional modules is taken as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the apparatus for determining a driving track and the method for determining a driving track provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present disclosure and should not be taken as limiting the invention, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (6)

1. The method for determining the driving track is characterized by being applied to an unmanned intelligent vehicle, wherein the intelligent vehicle comprises a processor, a camera and an inertia measurement component, the camera is installed on a front windshield of the intelligent vehicle, the inertia measurement component is installed at a preset position of a vehicle body of the intelligent vehicle, and the processor is respectively and electrically connected with the camera and the inertia measurement component;
wherein the method comprises the following steps:
the processor determines, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information sent by the camera, a predicted heading angle θ1(t+t_0) and a predicted lateral-axis distance d1(t+t_0) in first predicted lane line information within a preset time length;
the processor determines, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information, together with the current vehicle speed v_t, the current acceleration a_t and the current angular velocity w_t sent by the inertia measurement component, a predicted lateral-axis distance d2(t+t_0) and a predicted vehicle heading angle θ2(t+t_0) in second predicted lane line information within the preset time length t_0 by using Kalman state equations;
the processor determines, from the predicted heading angle θ1(t+t_0) and predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information and the predicted vehicle heading angle θ2(t+t_0) and predicted lateral-axis distance d2(t+t_0) in the second predicted lane line information, a predicted vehicle heading angle θ(t+t_0) and a predicted lateral-axis distance d(t+t_0) in predicted lane line information by using the weighted formulas θ(t+t_0) = β1·θ1(t+t_0) + (1 - β1)·θ2(t+t_0) and d(t+t_0) = λ1·d1(t+t_0) + (1 - λ1)·d2(t+t_0), where β1 and λ1 are both weight coefficients;
the processor determines a predicted travel track within a preset time period based on the current travel track and the predicted lane line information.
2. The method of claim 1, wherein the inertial measurement unit is mounted at a center of gravity of a body of the smart vehicle.
3. The method of claim 1, wherein the current lane line information comprises a current lateral axis distance between a lane line on which the intelligent vehicle is located and a center point of the camera, and the predicted lane line information comprises a predicted lateral axis distance between the lane line on which the intelligent vehicle is located and the center point of the camera.
4. A device for determining a driving track is characterized in that the device is applied to an unmanned intelligent vehicle, the intelligent vehicle comprises a camera and an inertia measurement component, the camera is mounted on a front windshield of the intelligent vehicle, the inertia measurement component is mounted at a preset position of a vehicle body of the intelligent vehicle, and the device is respectively electrically connected with the camera and the inertia measurement component;
wherein the apparatus comprises:
a first prediction unit, configured to determine, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information sent by the camera, a predicted heading angle θ1(t+t_0) and a predicted lateral-axis distance d1(t+t_0) in first predicted lane line information within a preset time length;
a second prediction unit, configured to determine, based on the current heading angle θ(t) and the current lateral-axis distance d(t) in the current lane line information, together with the current vehicle speed v_t, the current acceleration a_t and the current angular velocity w_t sent by the inertia measurement component, a predicted lateral-axis distance d2(t+t_0) and a predicted vehicle heading angle θ2(t+t_0) in second predicted lane line information within the preset time length t_0 by using Kalman state equations;
a determination unit, configured to determine, from the predicted heading angle θ1(t+t_0) and predicted lateral-axis distance d1(t+t_0) in the first predicted lane line information and the predicted vehicle heading angle θ2(t+t_0) and predicted lateral-axis distance d2(t+t_0) in the second predicted lane line information, a predicted vehicle heading angle θ(t+t_0) and a predicted lateral-axis distance d(t+t_0) in predicted lane line information by using the weighted formulas θ(t+t_0) = β1·θ1(t+t_0) + (1 - β1)·θ2(t+t_0) and d(t+t_0) = λ1·d1(t+t_0) + (1 - λ1)·d2(t+t_0), where β1 and λ1 are both weight coefficients;
and the driving track prediction module is used for determining a predicted driving track within a preset time length based on the current driving track and the predicted lane line information.
5. The apparatus of claim 4, wherein the inertial measurement unit is mounted at a center of gravity of a body of the smart car.
6. The apparatus of claim 4, wherein the current lane line information comprises a current lateral axis distance between a lane line on which the intelligent vehicle is located and a center point of the camera, and the predicted lane line information comprises a predicted lateral axis distance between the lane line on which the intelligent vehicle is located and the center point of the camera.
CN201810866434.8A 2018-08-01 2018-08-01 Method and device for determining a driving trajectory Active CN108801286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810866434.8A CN108801286B (en) 2018-08-01 2018-08-01 Method and device for determining a driving trajectory

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810866434.8A CN108801286B (en) 2018-08-01 2018-08-01 Method and device for determining a driving trajectory

Publications (2)

Publication Number Publication Date
CN108801286A CN108801286A (en) 2018-11-13
CN108801286B true CN108801286B (en) 2021-11-30

Family

ID=64078821

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810866434.8A Active CN108801286B (en) 2018-08-01 2018-08-01 Method and device for determining a driving trajectory

Country Status (1)

Country Link
CN (1) CN108801286B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109460739A (en) * 2018-11-13 2019-03-12 广州小鹏汽车科技有限公司 Method for detecting lane lines and device
CN109885066B (en) * 2019-03-26 2021-08-24 北京经纬恒润科技股份有限公司 Motion trail prediction method and device
CN110764505B (en) * 2019-11-03 2022-10-04 华中师范大学 Unmanned automobile control system
CN112487861A (en) * 2020-10-27 2021-03-12 爱驰汽车(上海)有限公司 Lane line recognition method and device, computing equipment and computer storage medium
CN112706785B (en) * 2021-01-29 2023-03-28 重庆长安汽车股份有限公司 Method and device for selecting cognitive target of driving environment of automatic driving vehicle and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011110156A3 (en) * 2010-03-06 2011-11-10 Continental Teves Ag & Co. Ohg Lane keeping system for a motor vehicle
CN103204162A (en) * 2012-01-11 2013-07-17 通用汽车环球科技运作有限责任公司 Lane Tracking System With Active Rear-steer
CN103786723A (en) * 2012-10-30 2014-05-14 谷歌公司 Controlling vehicle lateral lane positioning
CN104882025A (en) * 2015-05-13 2015-09-02 东华大学 Crashing detecting and warning method based on vehicle network technology
CN105691388A (en) * 2016-01-14 2016-06-22 南京航空航天大学 Vehicle collision avoidance system and track planning method thereof
CN107000745A (en) * 2014-11-28 2017-08-01 株式会社电装 The travel controlling system and travel control method of vehicle
CN107600073A (en) * 2017-08-10 2018-01-19 同济大学 A kind of vehicle centroid side drift angle estimating system and method based on Multi-source Information Fusion
CN107672589A (en) * 2017-09-26 2018-02-09 苏州观瑞汽车技术有限公司 A kind of track of vehicle real-time predicting method and device based on GPR Detection Data

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10351129B2 (en) * 2017-01-13 2019-07-16 Ford Global Technologies, Llc Collision mitigation and avoidance


Also Published As

Publication number Publication date
CN108801286A (en) 2018-11-13

Similar Documents

Publication Publication Date Title
CN108801286B (en) Method and device for determining a driving trajectory
KR102292277B1 (en) LIDAR localization inferring solutions using 3D CNN networks in autonomous vehicles
US11594011B2 (en) Deep learning-based feature extraction for LiDAR localization of autonomous driving vehicles
CN110967991B (en) Method and device for determining vehicle control parameters, vehicle-mounted controller and unmanned vehicle
CN108573271B (en) Optimization method and device for multi-sensor target information fusion, computer equipment and recording medium
CN110377025A (en) Sensor aggregation framework for automatic driving vehicle
CN108688660B (en) Operating range determining device
KR101975725B1 (en) Method and System for Determining Road Surface Friction of Autonomous Driving Vehicle Using Learning-based Model Predictive Control
CN108334077B (en) Method and system for determining unity gain for speed control of an autonomous vehicle
WO2018232681A1 (en) Traffic prediction based on map images for autonomous driving
KR20200096411A (en) LIDAR position estimation performing temporal smoothing using RNN and LSTM in autonomous vehicles
US10909377B2 (en) Tracking objects with multiple cues
CN112242069A (en) Method and device for determining vehicle speed
US11731612B2 (en) Neural network approach for parameter learning to speed up planning for complex driving scenarios
CN110562251A (en) automatic driving method and device
CN111076716B (en) Method, apparatus, device and computer-readable storage medium for vehicle localization
CN113859265B (en) Reminding method and device in driving process
CN110543148B (en) Task scheduling method and device
CN110654380A (en) Method and device for controlling a vehicle
US11328596B2 (en) Parking prediction
CN113815644B (en) System and method for reducing uncertainty in estimating autonomous vehicle dynamics
WO2020018140A1 (en) Ballistic estimnation of vehicle data
JP6854141B2 (en) Vehicle control unit
CN111126365B (en) Data acquisition method and device
US11912300B2 (en) Behavioral planning in autonomus vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant