CN111324120A - Cut-in and cut-out scene extraction method for automatic driving front vehicle - Google Patents


Info

Publication number
CN111324120A
Authority
CN
China
Prior art keywords
vehicle
lane
cut
lane line
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010121033.7A
Other languages
Chinese (zh)
Inventor
石娟
郭魁元
张志强
张晋崇
秦孔建
徐亮
张嘉芮
王霁宇
Current Assignee
China Automotive Technology and Research Center Co Ltd
CATARC Automotive Test Center Tianjin Co Ltd
Original Assignee
China Automotive Technology and Research Center Co Ltd
CATARC Automotive Test Center Tianjin Co Ltd
Priority date
Filing date
Publication date
Application filed by China Automotive Technology and Research Center Co Ltd, CATARC Automotive Test Center Tianjin Co Ltd filed Critical China Automotive Technology and Research Center Co Ltd
Priority to CN202010121033.7A
Publication of CN111324120A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276 - Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a method for extracting cut-in and cut-out scenes of the vehicle ahead for automated driving, comprising the following steps: S1, installing data acquisition equipment on the ego vehicle and collecting data on the ego vehicle, the target vehicle, and the lane lines; and S2, identifying cut-in and cut-out maneuvers of the target vehicle from the data collected in step S1, and extracting the corresponding video or images. The method can retrieve qualifying scenes directly from a scene database; lane-change scene data are extracted without manually reviewing videos, saving labor and time.

Description

Cut-in and cut-out scene extraction method for automatic driving front vehicle
Technical Field
The invention belongs to the technical field of unmanned driving, and in particular relates to a method for extracting cut-in and cut-out scenes of the vehicle ahead for automated driving.
Background
At present, automated driving has become one of the leading automotive technologies. SAE (the Society of Automotive Engineers) in the United States divides driving automation into six levels. Level 0 covers warning functions such as FCW and LDW, which only give the driver audible and visual alerts under certain conditions without controlling the vehicle. Level 1 is single-axis driver assistance, such as ACC (adaptive cruise control), which assists with longitudinal control, and LKA (lane keeping assist), which assists with lateral control. Level 2 is combined assistance, in which the control system controls the vehicle laterally and longitudinally at the same time. Level 3 is conditional automation. Level 4 is automated driving within specific scenarios. Level 5 is full automation.
As the level of automation rises, the automated driving control system gradually takes over driving tasks from the driver. To ensure the high safety and reliability of such systems, more and more developers verify system performance with a combination of road testing and simulation testing, and scenarios, as the basis of both kinds of testing, receive growing attention. The main scenario sources at present are scene data collected during natural driving, extreme scenarios collected on driving simulators, scenarios derived from traffic accidents, and test scenarios specified by traffic regulations. However, the collected data are large volumes of continuous recordings, whereas developers testing an automated driving system need specific scenarios such as a preceding-vehicle cut-in or an ego lane change.
Disclosure of Invention
In view of the above, the present invention is directed to a method for extracting cut-in and cut-out scenes of the vehicle ahead for automated driving, so as to solve the low extraction efficiency of existing methods, which capture scenes meeting specific requirements by manually reviewing videos or pictures.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
An extraction method for cut-in and cut-out scenes of the vehicle ahead for automated driving comprises the following steps:
S1, installing data acquisition equipment on the ego vehicle and collecting data on the ego vehicle, the target vehicle, and the lane lines;
S2, identifying cut-in and cut-out maneuvers of the target vehicle from the data collected in step S1, and extracting the corresponding video or images.
Further, in step S1, the acquisition equipment comprises: a millimeter-wave radar, a camera, a gyroscope sensor, and a steering wheel angle sensor;
the millimeter-wave radar is used to acquire the lateral distance, longitudinal distance, lateral relative speed, longitudinal relative speed, and relative acceleration between the vehicle and the target vehicle;
the camera is used to acquire the type and color of the lane lines and the distance between the vehicle and each lane line;
the gyroscope sensor is used to acquire the speed and yaw rate of the vehicle;
the steering wheel angle sensor is used to acquire the steering wheel angle of the vehicle.
Further, the camera is mounted on a front windshield of the vehicle;
the millimeter wave radar is installed on the front bumper of the vehicle.
Further, in step S2, the cut-in and cut-out of the target vehicle are recognized as follows:
S201, judging whether the vehicle is currently in a driving state; if so, proceeding to step S202; if not, repeating step S201;
S202, judging whether the driving state of the vehicle is lane changing or driving within the lane; if it is the lane-changing state, returning to step S201; if it is the in-lane driving state, proceeding to step S203;
S203, judging whether a target vehicle appears in the own lane, covering two cases: when the own lane has no target, a vehicle from an adjacent lane changes into the own lane; when the own lane already has a target, a closer vehicle cuts in from an adjacent lane; if either occurs, the scene is extracted.
Further, in step S201, the method for judging whether the vehicle is in a driving state is as follows:
the ego speed is examined; when the ego speed is zero, the vehicle is parked, and step S201 is repeated;
when the ego speed is greater than zero, the vehicle is driving, and the method proceeds to step S202.
Further, in step S202, whether the vehicle is approaching the left or right lane line is determined from the trend of its distance to each lane line, and from this it is determined whether the vehicle is changing lanes or driving within its lane. The specific method is as follows:
establishing a coordinate system with the midpoint of the vehicle front as the origin, the front-passenger direction as the positive x-axis, and the direction of travel as the positive y-axis, where a1 denotes the signed distance to the left line of the own lane, a2 the distance to the next lane line to the left, b1 the distance to the right line of the own lane, and b2 the distance to the next lane line to the right;
storing the lane-line distances of each frame as one row of a historical lane-line matrix; when the matrix reaches n rows (corresponding to n frames), the earliest row is deleted, so that the matrix always holds the latest n-1 frames of lane-line data;
using the historical lane-line matrix to judge whether the vehicle is currently driving within its lane or changing lanes:
first, the own-lane lines are identified among all currently detected lane lines: comparing a1 and a2, the larger is the left line of the own lane; comparing b1 and b2, the smaller is the right line of the own lane;
then the trend of the distances to these lines indicates whether the vehicle tends to change lanes or depart from its lane;
the lane-changing states are judged as follows:
if the distance |a1| to the left lane line decreases and a1 changes from negative to positive, so that this line switches from being the left line to being the right line of the vehicle, a lane change to the left is determined;
if the distance |b1| to the right lane line decreases and b1 changes from positive to negative, so that this line switches from being the right line to being the left line of the vehicle, a lane change to the right is determined;
the in-lane driving states are judged as follows:
if |a1| decreases but a1 does not change from negative to positive, while |b1| increases, the vehicle tends to drift left but no lane change occurs, and in-lane driving is determined;
if |b1| decreases but b1 does not change from positive to negative, while |a1| increases, the vehicle tends to drift right but no lane change occurs, and in-lane driving is determined.
Further, in step S203, the cut-in and cut-out of the target vehicle are judged as follows:
in the lane-keeping state, the state of the preceding vehicle is further classified as free driving, following, preceding-vehicle cut-in, or preceding-vehicle cut-out; the specific judgment is:
extracting the nearest target in the own lane:
while the vehicle is driving within its lane, the lateral distance Range_X and the longitudinal distance Range_Y between the vehicle and each target detected by the millimeter-wave radar are examined; first, targets with Range_X within [-2 m, 2 m] are screened out, then the target with the smallest Range_Y is selected and stored as one row of a historical target-information matrix, with four variables: target ID, Range_X, Range_Y, and relative longitudinal speed Speed_X; the maximum Range_Y is limited to 60 m;
when the historical target-information matrix reaches n rows, the earliest frame is deleted, so that the matrix always holds the latest n-1 frames of target information;
using this history, whether the nearest target of the current frame and that of the previous frame are the same target is decided from the change in Range_Y: if Range_Y does not jump and the absolute change is smaller than a set threshold, they are the same target and the vehicle is in the following state; if the absolute change exceeds the threshold, they are not the same target, and a further judgment is made: if the change in Range_Y is greater than the threshold, a preceding-vehicle cut-out is preliminarily determined; if the change is less than the negative threshold, a preceding-vehicle cut-in is preliminarily determined; and if there is no target in the current lane but a historical target existed in the previous frame, the target has cut out or driven away, and the vehicle is driving freely.
Compared with the prior art, the method for extracting cut-in and cut-out scenes of the vehicle ahead for automated driving has the following advantages:
the method can retrieve qualifying scenes directly from the scene database; lane-change scene data are extracted without manually reviewing videos, saving labor and time.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of an extraction method for cut-in and cut-out scenes of an automatic driving front vehicle according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of lane line distance information according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a process of determining whether the vehicle is driving in a lane change or lane according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating corresponding positions of a host vehicle and a target vehicle according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a process of selecting a nearest target in the lane according to an embodiment of the present invention;
fig. 6 is a flowchart illustrating the process of determining whether the target object is in a stable follow-up, cut-in, or cut-out state according to the embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention. Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," etc. may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art through specific situations.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
A forward-looking camera, a millimeter-wave radar, a gyroscope, and a steering wheel angle sensor are mounted on the running vehicle to acquire information about targets ahead, the lane lines, and the vehicle's own motion. The specific output signals are listed in the following table:
(Table of sensor output signals, rendered as an image in the original document.)
A scene database is constructed from the data collected by these onboard sensors. The forward-looking camera is mounted on the front windshield; it identifies lane lines under real road conditions and outputs the distance to the lines of the own lane, with the distances to adjacent lane lines calibrated beforehand to ensure signal accuracy. The millimeter-wave radar is mounted on the front bumper. The gyroscope has no mounting-position requirement and can be placed anywhere in the vehicle. The steering wheel angle sensor outputs the steering wheel angle of the ego vehicle. The signals output by all sensors are kept synchronized.
Using the signals listed above, extraction of the front-vehicle cut-in scene is carried out in three steps, as shown in fig. 1. The first step is to confirm that the vehicle is currently driving forward. The second step, once forward driving is confirmed, is to classify the driving state of the vehicle as either lane changing or driving within its lane. The third step is to determine whether a closer target has appeared in the own lane, covering two cases: when the own lane has no target, a vehicle from an adjacent lane changes into the own lane; when the own lane already has a target, a closer vehicle cuts in from an adjacent lane.
Each step is described in detail below.
(1) The first step: judging whether the vehicle is in a driving state
The speed of the ego vehicle is examined:
when the ego speed is zero, the vehicle is parked;
when the ego speed is greater than zero, the vehicle is driving forward, and the method proceeds to the second step.
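The first-step check can be pinned down in a few lines; the following is a minimal sketch, with the function and parameter names being illustrative rather than taken from the patent:

```python
def is_driving(ego_speed_mps: float) -> bool:
    """Step 1: the ego vehicle counts as driving forward only when its
    measured speed is strictly greater than zero; zero speed means parked."""
    return ego_speed_mps > 0.0
```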
(2) The second step: determining the driving tendency of the vehicle
This step judges whether the vehicle is approaching the left or right lane line from the trend of its distance to each lane line. The driving tendency is classified into two states: lane changing and driving within the lane.
The vehicle coordinate system follows the right-hand rule: the midpoint of the vehicle front is the origin, the front-passenger direction is the positive x-axis, and the direction of travel is the positive y-axis. a1 denotes the signed distance to the left line of the own lane, a2 the distance to the next lane line to the left, b1 the distance to the right line of the own lane, and b2 the distance to the next lane line to the right, as shown in fig. 2.
The four distances (a1, a2, b1, b2) of each frame are stored as one row of a historical lane-line matrix. When the matrix reaches 10 rows (corresponding to 10 frames; n can be chosen as needed), the earliest row is deleted, so that the matrix always holds the latest 9 (n-1) frames of lane-line data.
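The rolling history described above can be sketched with a bounded `deque`, which reproduces the delete-the-oldest-row behavior automatically (class and variable names are illustrative, not from the patent):

```python
from collections import deque


class LaneLineHistory:
    """Rolling matrix of per-frame lane-line distances (a1, a2, b1, b2).
    A deque with maxlen = n - 1 silently drops the oldest row when full,
    so the buffer always holds at most the latest n - 1 frames."""

    def __init__(self, n: int = 10):
        self.rows = deque(maxlen=n - 1)

    def push(self, a1: float, a2: float, b1: float, b2: float) -> None:
        # Appending to a full deque discards the earliest frame first.
        self.rows.append((a1, a2, b1, b2))
```

With n = 10, pushing 20 frames leaves exactly 9 rows, namely the 9 most recent frames.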
Using the historical lane-line matrix, whether the vehicle is currently driving within its lane or changing lanes can be judged from the change in the lane-line coefficients:
As shown in fig. 3, the own-lane lines are first identified among all currently detected lane lines: comparing a1 and a2, the larger is the left line of the own lane; comparing b1 and b2, the smaller is the right line of the own lane.
The trend of the distances to these lines then indicates whether the vehicle tends to change lanes or depart from its lane.
Lane changing:
1) When the distance |a1| to the left lane line decreases and a1 changes from negative to positive, so that this line switches from being the left line to being the right line of the vehicle, a lane change to the left is determined.
2) When the distance |b1| to the right lane line decreases and b1 changes from positive to negative, so that this line switches from being the right line to being the left line of the vehicle, a lane change to the right is determined.
Driving within the lane:
1) If |a1| decreases but a1 does not change from negative to positive, while the distance |b1| to the right lane line increases, the vehicle tends to drift left but no lane change occurs, and in-lane driving is determined.
2) If |b1| decreases but b1 does not change from positive to negative, while the distance |a1| to the left lane line increases, the vehicle tends to drift right but no lane change occurs, and in-lane driving is determined.
3) If neither distance shows a significant drift toward the left or right lane line, in-lane driving is likewise determined.
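The sign-change rules above reduce to a small classifier over two consecutive frames. The sketch below assumes the stated sign convention (a1 is negative while the own left line lies to the left of the origin, b1 is positive while the own right line lies to the right); all names are illustrative:

```python
def lane_state(a1_prev: float, a1_curr: float,
               b1_prev: float, b1_curr: float) -> str:
    """Classify ego motion from the signed distances to the own-lane
    left line (a1) and right line (b1) in the previous and current frame."""
    if a1_prev < 0.0 <= a1_curr:
        # Left line crossed from the left of the origin to its right.
        return "lane_change_left"
    if b1_prev > 0.0 >= b1_curr:
        # Right line crossed from the right of the origin to its left.
        return "lane_change_right"
    # Drift toward either line without crossing it counts as in-lane driving.
    return "in_lane"
```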
(3) The third step: determining whether a closer target has appeared
In the lane-keeping state, the state of the preceding vehicle is further classified as free driving, following, preceding-vehicle cut-in, or preceding-vehicle cut-out.
1) Extracting the nearest target in the own lane
While the vehicle is driving within its lane, the lateral distance Range_X and longitudinal distance Range_Y between the vehicle and each target detected by the millimeter-wave radar are examined. First, targets with Range_X within [-2 m, 2 m] are screened out; then the target with the smallest Range_Y is selected and stored as one row of the historical target-information matrix, with four variables (target ID, Range_X, Range_Y, relative longitudinal speed Speed_X). The maximum Range_Y is limited to 60 m (this can be adjusted to the distance accuracy of the sensors; beyond 60 m the vehicle is considered to be driving freely, and different limits can be set for different sensors). The geometry is shown in fig. 4.
When the historical target-information matrix reaches 10 rows (corresponding to 10 frames; n can be chosen as needed), the earliest frame is deleted, so that the matrix always holds the latest 9 (n-1) frames of target information. The flow is shown in fig. 5.
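The screening step amounts to a lateral gate followed by a nearest-in-range selection. A sketch follows, with illustrative field names, since the patent does not specify a radar interface at this level of detail:

```python
def nearest_in_lane_target(detections, lateral_lim=2.0, max_range_y=60.0):
    """detections: iterable of dicts with keys 'id', 'range_x', 'range_y',
    'speed_x'.  Keeps targets whose lateral offset Range_X lies in
    [-lateral_lim, lateral_lim] and whose longitudinal distance Range_Y is
    at most max_range_y, then returns the one with the smallest Range_Y,
    or None (free driving) when nothing qualifies."""
    in_lane = [d for d in detections
               if -lateral_lim <= d["range_x"] <= lateral_lim
               and d["range_y"] <= max_range_y]
    return min(in_lane, key=lambda d: d["range_y"], default=None)
```

A target at Range_X = 3 m fails the lateral gate, and one at Range_Y = 80 m fails the range gate, so neither can be selected even if it is the closest detection overall.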
2) Classifying free driving, following, cut-in, and cut-out (as shown in fig. 6)
Using the target history, whether the nearest target of the current frame and that of the previous frame are the same target is decided from the change in Range_Y. If the absolute change in Range_Y is less than 2 m (the threshold can be set as needed), they are the same target and the vehicle is in the following state. If the absolute change exceeds the threshold, they are not the same target, and a further judgment is made: if the change in Range_Y is greater than +2 m, a preceding-vehicle cut-out is preliminarily determined; if the change is less than -2 m, a preceding-vehicle cut-in is preliminarily determined. If there is no target in the current lane but a historical target existed in the previous frame, the target has cut out or driven away, and the vehicle is driving freely.
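The threshold logic above can be sketched as follows, where None stands for "no in-lane target within range in that frame" (a convention introduced for this sketch; function and state names are illustrative):

```python
def classify_front_vehicle(prev_y, curr_y, thresh=2.0):
    """prev_y / curr_y: Range_Y of the nearest in-lane target in the
    previous and current frame, or None when no target was within range."""
    if curr_y is None:
        # No in-lane target now: if one existed last frame, it cut out or
        # drove away; otherwise the ego vehicle was already driving freely.
        return "cut_out" if prev_y is not None else "free_driving"
    if prev_y is None:
        return "cut_in"  # a target newly appeared in the own lane
    delta = curr_y - prev_y
    if abs(delta) < thresh:
        return "following"  # same target, stable gap
    # A jump farther means the old target left the lane; a jump closer
    # means a new, nearer target cut in ahead.
    return "cut_out" if delta > 0 else "cut_in"
```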
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. A cut-in and cut-out scene extraction method for an automatic driving front vehicle is characterized by comprising the following steps:
S1, installing data acquisition equipment on the ego vehicle and collecting data on the ego vehicle, the target vehicle, and the lane lines;
S2, identifying cut-in and cut-out maneuvers of the target vehicle from the data collected in step S1, and extracting the corresponding video or images.
2. The automatic driving front-vehicle cut-in and cut-out scene extraction method according to claim 1, wherein in step S1 the acquisition equipment comprises: a millimeter-wave radar, a camera, a gyroscope sensor, and a steering wheel angle sensor;
the millimeter-wave radar is used to acquire the lateral distance, longitudinal distance, lateral relative speed, longitudinal relative speed, and relative acceleration between the vehicle and the target vehicle;
the camera is used to acquire the type and color of the lane lines and the distance between the vehicle and each lane line;
the gyroscope sensor is used to acquire the speed and yaw rate of the vehicle;
the steering wheel angle sensor is used to acquire the steering wheel angle of the vehicle.
3. The automatic driving front vehicle cut-in and cut-out scene extraction method according to claim 2, characterized in that: the camera is arranged on the front windshield of the vehicle;
the millimeter wave radar is installed on the front bumper of the vehicle.
4. The automatic driving front-vehicle cut-in and cut-out scene extraction method according to claim 1, wherein in step S2 the cut-in and cut-out of the target vehicle are recognized as follows:
S201, judging whether the vehicle is currently in a driving state; if so, proceeding to step S202; if not, repeating step S201;
S202, judging whether the driving state of the vehicle is lane changing or driving within the lane; if it is the lane-changing state, returning to step S201; if it is the in-lane driving state, proceeding to step S203;
S203, judging whether a target vehicle appears in the own lane, covering two cases: when the own lane has no target, a vehicle from an adjacent lane changes into the own lane; when the own lane already has a target, a closer vehicle cuts in from an adjacent lane; if either occurs, the scene is extracted.
5. The automatic driving front-vehicle cut-in and cut-out scene extraction method according to claim 4, wherein in step S201 the method for judging whether the vehicle is in a driving state is as follows:
the ego speed is examined; when the ego speed is zero, the vehicle is parked, and step S201 is repeated;
when the ego speed is greater than zero, the vehicle is driving, and the method proceeds to step S202.
6. The automatic driving front vehicle cut-in and cut-out scene extraction method according to claim 4, characterized in that in step S202, whether the vehicle approaches the left or right lane line is determined from the trend of change in the distances to the lane lines, and then whether the vehicle is changing lanes or driving in the lane is determined; the specific steps are as follows:
establishing a coordinate system with the middle of the vehicle head as the origin, the front-passenger direction as the positive X-axis direction, and the vehicle traveling direction as the positive Y-axis direction, wherein a1 denotes the distance to the left lane line of the own lane, a2 the distance to the adjacent lane line on the left, b1 the distance to the right lane line of the own lane, and b2 the distance to the adjacent lane line on the right;
storing the coefficients of the vehicle's lane lines as one row of a historical lane line matrix, and deleting the earliest frame when the matrix reaches n rows, the n rows corresponding to n frames of images, so that the historical lane line matrix always stores the latest n-1 frames of lane line data;
judging, by means of the historical lane line matrix, whether the vehicle is currently driving in the lane or changing lanes:
firstly, finding the vehicle's own lane lines among all current lane lines: comparing a1 and a2, the larger of the two is the vehicle's left lane line; comparing b1 and b2, the smaller of the two is the vehicle's right lane line;
then judging whether the vehicle has a tendency to change lanes or depart from the lane from the trend of change in the distances to the lane lines;
the method for judging the lane change state comprises the following steps:
if the distance |a1| to the left lane line becomes smaller, a1 changes from a negative value to a positive value, and that line changes from being the left lane line to being the right lane line, it is determined that a lane change to the left is performed;
if the distance |b1| to the right lane line becomes smaller, b1 changes from a positive value to a negative value, and that line changes from being the right lane line to being the left lane line, it is determined that a lane change to the right is performed;
the method for judging the driving state in the lane comprises the following steps:
if the distance |a1| to the left lane line becomes smaller but a1 does not change from a negative value to a positive value, while the distance |b1| to the right lane line becomes larger, the vehicle tends to depart to the left but no lane change occurs, and it is determined that the vehicle is driving in the lane;
if the distance |b1| to the right lane line becomes smaller but b1 does not change from a positive value to a negative value, while the distance |a1| to the left lane line becomes larger, the vehicle tends to depart to the right but no lane change occurs, and it is determined that the vehicle is driving in the lane.
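The sign-change rule of claim 6 can be sketched as follows, assuming the convention of the claim's coordinate system: a1 (own-lane left line) is negative and b1 (own-lane right line) is positive while the vehicle stays in its lane. The function name `classify_motion` and the two-frame comparison window are illustrative assumptions, not the claimed implementation.

```python
def classify_motion(a1_hist, b1_hist):
    """Classify in-lane driving vs. lane change from recent distances to the
    own-lane left line (a1, negative) and right line (b1, positive).

    a1_hist / b1_hist: sequences of recent frame values, oldest first.
    """
    a_prev, a_cur = a1_hist[-2], a1_hist[-1]
    b_prev, b_cur = b1_hist[-2], b1_hist[-1]

    # Left lane change: |a1| shrinks and a1 crosses from negative to positive
    # (the former left line is now on the vehicle's right).
    if abs(a_cur) < abs(a_prev) and a_prev < 0 <= a_cur:
        return "lane_change_left"
    # Right lane change: |b1| shrinks and b1 crosses from positive to negative.
    if abs(b_cur) < abs(b_prev) and b_prev > 0 >= b_cur:
        return "lane_change_right"
    # Otherwise: possibly drifting toward a line, but no crossing occurred.
    return "in_lane"
```

A drift toward a line without a sign change is reported as `"in_lane"`, matching the claim's distinction between lane departure tendency and an actual lane change.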
7. The automatic driving front vehicle cut-in and cut-out scene extraction method according to claim 4, characterized in that in step S203, the method for determining the cut-in and cut-out of the target vehicle is as follows:
in the lane keeping state, the state of the preceding vehicle is further judged and classified into free running, following, preceding-vehicle cut-in, and preceding-vehicle cut-out; the specific judgment method is as follows:
extracting the nearest target in the own lane:
when the vehicle is in the in-lane driving state, the lateral distance Range_X and the longitudinal distance Range_Y between the ego vehicle and each target object detected by the millimeter-wave radar are judged; Range_X is judged first, screening out the target objects whose Range_X lies within [-2 m, 2 m]; the target object with the smallest Range_Y is then selected and stored as one row of a historical target object information matrix, the four variables being the object ID, Range_X, Range_Y, and the relative speed Speed_X; Range_Y is set not to exceed 60 m at maximum;
when the historical target object information matrix reaches n rows, the earliest frame is deleted, thereby ensuring that the historical target object information matrix stores the latest n-1 frames of target object information;
using the historical target information, whether the current nearest target and the nearest target of the previous frame are the same target is judged from the variation of Range_Y: if the variation of Range_Y shows no abrupt jump and its absolute value is smaller than a set threshold, they are the same target and the current state is the following state; if the absolute value of the variation of Range_Y is larger than the set threshold, they are not the same target, and a further judgment is made: if the variation of Range_Y is positive, a preceding-vehicle cut-out is preliminarily determined; if the variation is negative, a preceding-vehicle cut-in is preliminarily determined; if there is no target in the current lane but a historical target existed in the previous frame, the preceding vehicle has cut out or driven away, and the ego vehicle is in free running.
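The target screening and Range_Y-variation judgment of claim 7 can be sketched as below. This is a hedged illustration: the function names, the tuple layout `(obj_id, range_x, range_y, speed_x)`, and the 5 m jump threshold are assumptions (the claim only specifies "a set threshold"); the ±2 m Range_X window and 60 m Range_Y cap are taken from the claim.

```python
def nearest_in_lane_target(detections, x_limit=2.0, y_limit=60.0):
    """Pick the nearest in-lane radar target, or None if the lane is empty.

    detections: list of (obj_id, range_x, range_y, speed_x) tuples.
    """
    in_lane = [d for d in detections
               if -x_limit <= d[1] <= x_limit and d[2] <= y_limit]
    return min(in_lane, key=lambda d: d[2]) if in_lane else None

def classify_event(prev_range_y, cur_range_y, threshold=5.0):
    """Classify the frame-to-frame change of Range_Y to the nearest target."""
    if prev_range_y is not None and cur_range_y is None:
        return "cut_out_or_free_running"  # target vanished from own lane
    if prev_range_y is None:
        return "no_history"               # nothing to compare against yet
    delta = cur_range_y - prev_range_y
    if abs(delta) < threshold:
        return "following"                # same target, following state
    # An abrupt jump means a different target replaced the previous one:
    # Range_Y jumping up suggests a cut-out, jumping down suggests a cut-in.
    return "cut_out" if delta > 0 else "cut_in"
```

In a pipeline, `nearest_in_lane_target` would feed each frame's Range_Y into the rolling matrix of claim 7, and `classify_event` would compare consecutive rows.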
CN202010121033.7A 2020-02-26 2020-02-26 Cut-in and cut-out scene extraction method for automatic driving front vehicle Pending CN111324120A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010121033.7A CN111324120A (en) 2020-02-26 2020-02-26 Cut-in and cut-out scene extraction method for automatic driving front vehicle


Publications (1)

Publication Number Publication Date
CN111324120A true CN111324120A (en) 2020-06-23

Family

ID=71165311

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010121033.7A Pending CN111324120A (en) 2020-02-26 2020-02-26 Cut-in and cut-out scene extraction method for automatic driving front vehicle

Country Status (1)

Country Link
CN (1) CN111324120A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016134093A (en) * 2015-01-21 2016-07-25 株式会社デンソー Vehicle cruise control device
JP2017041126A (en) * 2015-08-20 2017-02-23 株式会社デンソー On-vehicle display control device and on-vehicle display control method
CN106647776A (en) * 2017-02-24 2017-05-10 驭势科技(北京)有限公司 Judgment method and device for lane changing trend of vehicle and computer storage medium
CN110097785A (en) * 2019-05-30 2019-08-06 长安大学 A kind of front truck incision or urgent lane-change identification prior-warning device and method for early warning
CN110126730A (en) * 2018-02-02 2019-08-16 上海博泰悦臻电子设备制造有限公司 Vehicle lane change based reminding method and system
US20190291727A1 (en) * 2016-12-23 2019-09-26 Mobileye Vision Technologies Ltd. Navigation Based on Liability Constraints
CN110458050A (en) * 2019-07-25 2019-11-15 清华大学苏州汽车研究院(吴江) Vehicle based on Vehicular video cuts detection method and device


Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111967123A (en) * 2020-06-30 2020-11-20 中汽数据有限公司 Method for generating simulation test case in simulation test
CN111967123B (en) * 2020-06-30 2023-10-27 中汽数据有限公司 Method for generating simulation test cases in simulation test
CN111599181A (en) * 2020-07-22 2020-08-28 中汽院汽车技术有限公司 Typical natural driving scene recognition and extraction method for intelligent driving system test
CN111599181B (en) * 2020-07-22 2020-10-27 中汽院汽车技术有限公司 Typical natural driving scene recognition and extraction method for intelligent driving system test
CN114435389B (en) * 2020-11-02 2024-01-30 上海汽车集团股份有限公司 Vehicle control method and device and vehicle
CN114435389A (en) * 2020-11-02 2022-05-06 上海汽车集团股份有限公司 Vehicle control method and device and vehicle
CN112389430A (en) * 2020-11-06 2021-02-23 北京航空航天大学 Method for judging time period for switching lane of vehicle into fleet based on offset rate
CN112389430B (en) * 2020-11-06 2024-01-19 北京航空航天大学 Determination method for vehicle lane change cutting-in motorcade period based on offset rate
CN112634655A (en) * 2020-12-15 2021-04-09 北京百度网讯科技有限公司 Lane changing processing method and device based on lane line, electronic equipment and storage medium
CN112721932A (en) * 2021-01-25 2021-04-30 中国汽车技术研究中心有限公司 Method and device for determining vehicle lane change parameters, electronic equipment and medium
CN113343892A (en) * 2021-06-24 2021-09-03 东风汽车集团股份有限公司 Vehicle line-following driving scene extraction method
CN113569666A (en) * 2021-07-09 2021-10-29 东风汽车集团股份有限公司 Method for detecting continuous illegal lane change of vehicle and computer equipment
CN113569666B (en) * 2021-07-09 2023-12-15 东风汽车集团股份有限公司 Method for detecting continuous illegal lane change of vehicle and computer equipment
CN113868875A (en) * 2021-09-30 2021-12-31 天津大学 Method, device and equipment for automatically generating test scene and storage medium
CN113868875B (en) * 2021-09-30 2022-06-17 天津大学 Method, device and equipment for automatically generating test scene and storage medium
CN115223131A (en) * 2021-11-09 2022-10-21 广州汽车集团股份有限公司 Adaptive cruise following target vehicle detection method and device and automobile
CN114545385A (en) * 2022-02-18 2022-05-27 华域汽车系统股份有限公司 Fusion target detection method based on vehicle-mounted forward-looking camera and forward millimeter wave radar
CN117272690A (en) * 2023-11-21 2023-12-22 中汽智联技术有限公司 Method, equipment and medium for extracting dangerous cut-in scene of automatic driving vehicle
CN117272690B (en) * 2023-11-21 2024-02-23 中汽智联技术有限公司 Method, equipment and medium for extracting dangerous cut-in scene of automatic driving vehicle

Similar Documents

Publication Publication Date Title
CN111324120A (en) Cut-in and cut-out scene extraction method for automatic driving front vehicle
CN109421738B (en) Method and apparatus for monitoring autonomous vehicles
CN109421739B (en) Method and apparatus for monitoring autonomous vehicles
CN106485949B (en) The sensor of video camera and V2V data for vehicle merges
CN108146503B (en) Vehicle collision avoidance
CN112289067B (en) Signal display estimation system
DE112010002021B4 (en) Vehicle environment estimator
US8175334B2 (en) Vehicle environment recognition apparatus and preceding-vehicle follow-up control system
CN103287358B (en) For determining the out-of-alignment method of object sensor
CN110400478A (en) A kind of road condition notification method and device
CN111845728B (en) Driving assistance data acquisition method and system
KR20200014931A (en) Vehicle information storage method, vehicle driving control method, and vehicle information storage device
JP4937933B2 (en) Outside monitoring device
CN113470371B (en) Method, system, and computer-readable storage medium for identifying an offending vehicle
CN109544725B (en) Event-driven-based automatic driving accident intelligent processing method
CN111409455A (en) Vehicle speed control method and device, electronic device and storage medium
CN114194190A (en) Lane maneuver intention detection system and method
CN110920617A (en) Vehicle travel control system
CN114719840A (en) Vehicle intelligent driving guarantee method and system based on road characteristic fusion
CN113743356A (en) Data acquisition method and device and electronic equipment
JP6185367B2 (en) Driving assistance device
CN116443046A (en) Assistance system with a leader determination module for an automatic vehicle in a merge trajectory
CN115140029A (en) Safety capability detection method and device for automatic driving automobile
CN113386771A (en) Road model generation method and equipment
CN114084129A (en) Fusion-based vehicle automatic driving control method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200623