CN117125057A - Collision detection method, device, equipment and storage medium based on lane change of vehicle - Google Patents

Collision detection method, device, equipment and storage medium based on lane change of vehicle

Info

Publication number
CN117125057A
Authority
CN
China
Prior art keywords
vehicle
information
obstacle
collision
self
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311386257.0A
Other languages
Chinese (zh)
Other versions
CN117125057B (en)
Inventor
邢海涛
田维伟
杨磊
王渊哲
王志忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jika Intelligent Robot Co ltd
Original Assignee
Jika Intelligent Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jika Intelligent Robot Co ltd
Priority to CN202311386257.0A
Publication of CN117125057A
Application granted
Publication of CN117125057B
Legal status: Active
Anticipated expiration

Classifications

    • B – PERFORMING OPERATIONS; TRANSPORTING
    • B60 – VEHICLES IN GENERAL
    • B60W – CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 – Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 – Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 – Predicting travel path or likelihood of collision
    • B – PERFORMING OPERATIONS; TRANSPORTING
    • B60 – VEHICLES IN GENERAL
    • B60W – CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 – Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/18 – Propelling the vehicle
    • B60W30/18009 – Propelling the vehicle related to particular drive situations
    • B60W30/18163 – Lane change; Overtaking manoeuvres
    • B – PERFORMING OPERATIONS; TRANSPORTING
    • B60 – VEHICLES IN GENERAL
    • B60W – CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 – Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 – Planning or execution of driving tasks
    • B60W60/0011 – Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • Y – GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 – TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T – CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 – Road transport of goods or passengers
    • Y02T10/10 – Internal combustion engine [ICE] based vehicles
    • Y02T10/40 – Engine management systems

Abstract

The invention discloses a collision detection method, device, equipment and storage medium based on lane change of a vehicle. The method comprises: determining self-vehicle planning information and a target lane in a lane change scene, wherein the self-vehicle planning information comprises a self-vehicle planned path and self-vehicle planning time; detecting obstacles on the target lane through a vehicle sensing module to acquire sensing information; and performing detection according to the self-vehicle planning information and the sensing information to obtain collision information corresponding to the lane change scene. Because the sensing module is introduced to detect the target lane and the collision information is obtained based on the acquired sensing information and the self-vehicle planning information, the consumption of computing resources caused by introducing a prediction module is reduced, and the collision detection efficiency is greatly improved.

Description

Collision detection method, device, equipment and storage medium based on lane change of vehicle
Technical Field
The present invention relates to the field of automatic driving technologies, and in particular, to a method, an apparatus, a device, and a storage medium for detecting a collision based on lane changing of a vehicle.
Background
At present, intelligent vehicles are generally equipped with piloted driver-assistance functions. In complex traffic scenes, such as lane change scenes, the collision risk rises sharply, so collision detection is usually required to reduce the collision risk and ensure driving safety.
However, existing collision detection for lane change scenes usually relies on a prediction module to give the predicted trajectory of each dynamic target object and then checks, point by point along the planned path of the own vehicle, whether each path point collides, or adopts a two-dimensional collision detection method. With either approach, the consumption of computing resources grows rapidly as the number of detection targets increases, and the efficiency of collision detection drops significantly.
Disclosure of Invention
The invention provides a collision detection method, device and equipment based on vehicle lane change and a storage medium, so as to improve the efficiency of collision detection.
According to an aspect of the present invention, there is provided a collision detection method based on lane change of a vehicle, including:
determining self-vehicle planning information and a target lane in a lane change scene, wherein the self-vehicle planning information comprises a self-vehicle planning path and self-vehicle planning time;
detecting obstacles on the target lane through a vehicle sensing module to acquire sensing information;
and detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane changing scene.
According to another aspect of the present invention, there is provided a collision detection apparatus based on lane change of a vehicle, comprising:
the self-vehicle planning information and target lane determining module is used for determining the self-vehicle planning information and the target lane in a lane change scene, wherein the self-vehicle planning information comprises a self-vehicle planned path and self-vehicle planning time;
the perception information detection module is used for detecting the obstacle of the target lane through the vehicle perception module so as to acquire perception information;
and the collision information detection module is used for detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane change scene.
According to another aspect of the present invention, there is provided a computer apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of any one of the embodiments of the present invention.
According to another aspect of the invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to perform the method according to any of the embodiments of the invention.
According to the technical scheme, the target lane is detected by the introduced sensing module, and the collision information is obtained by detection based on the acquired sensing information and the self-vehicle planning information, so that the consumption of computing resources caused by introducing a prediction module is reduced, and the collision detection efficiency is greatly improved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a collision detection method based on lane change of a vehicle according to a first embodiment of the present invention;
FIG. 2 is a schematic view of collision detection of a stationary vehicle with an obstacle according to a first embodiment of the present invention;
FIG. 3 is a schematic view of collision detection for a stationary non-vehicle as an obstacle according to a first embodiment of the present invention;
FIG. 4 is a schematic diagram of determining an obstacle risk condition for a dynamic vehicle according to a first embodiment of the present invention;
FIG. 5 is a schematic view of determining longitudinal distances of a vehicle and a dynamic vehicle obstacle according to a first embodiment of the invention;
FIG. 6 is a schematic view of a collision detection of a dynamic vehicle in front of an obstacle according to a first embodiment of the present invention;
FIG. 7 is a schematic view of collision detection of a dynamic vehicle behind an automobile as an obstacle according to a first embodiment of the invention;
fig. 8 is a flowchart of a collision detection method based on lane change of a vehicle according to a second embodiment of the present invention;
fig. 9 is a schematic structural diagram of a collision detection apparatus based on lane change of a vehicle according to a third embodiment of the invention;
fig. 10 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, "comprises," "comprising," and "having" and any variations thereof are intended to cover a non-exclusive inclusion, such that a process, method, apparatus, article, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or device.
Example 1
Fig. 1 is a flowchart of a collision detection method based on lane change of a vehicle according to the first embodiment of the present invention. The method may be performed by a collision detection apparatus based on lane change of a vehicle, which may be implemented in hardware and/or software and may be integrated in an electronic device. As shown in fig. 1, the method includes:
Step S101, determining the vehicle planning information and the target lane in the lane change scene.
Optionally, determining the vehicle planning information and the target lane in the lane change scene includes: extracting own vehicle planning information from own vehicles in a lane change scene; acquiring a target direction of a self-vehicle needing lane changing according to a self-vehicle planning path in self-vehicle planning information; and determining a target lane according to the current driving lane and the target direction.
Specifically, in this embodiment, when the vehicle is traveling, the self-vehicle planned path and the self-vehicle planning time from the current position to the destination may be determined according to the destination input in advance by the user, and stored as self-vehicle planning information in the vehicle storage unit. In the lane change scene, the target direction in which the own vehicle needs to change lanes can then be obtained from the self-vehicle planned path; for example, if it is determined that the vehicle is currently traveling in the third lane and needs to change lanes to the left, the target lane is determined to be the left adjacent lane.
Step S102, detecting obstacles on a target lane through the vehicle sensing module to acquire sensing information.
Optionally, the vehicle sensing module comprises shooting equipment and radar equipment; obstacle detection is carried out on a target lane through a vehicle sensing module so as to obtain sensing information, and the method comprises the following steps: shooting an image of a target lane through shooting equipment to obtain an acquisition image; detecting an obstacle on a target lane through radar equipment to obtain radar detection information; and determining perception information according to the acquired image and radar detection information, wherein the perception information comprises target lane line coefficients or obstacle information.
Specifically, in the present embodiment, the prediction module is not used to predict the travel path of the target obstacle during collision detection, but the sensing module detects the obstacle on the target lane, and the vehicle sensing module in the present embodiment may be a photographing device and a radar device that are mounted at the vehicle head position. The radar device is used for sending radar waves to a preset range of the target lane and acquiring radar detection information according to reflected waves fed back by obstacles on the target lane.
After the acquired image is obtained, image recognition can be performed on it to determine whether an obstacle exists on the target lane; when an obstacle is determined to exist, the obstacle type, the obstacle state and the target lane line coefficients are further determined according to the recognition result. In addition, the obstacle position information and the obstacle running information can be further determined according to the radar detection information, wherein the obstacle running information may be the running speed of the obstacle, which can in particular be determined from the positional offset of the obstacle between adjacent unit moments. Thus, the perception information in the present embodiment includes the target lane line coefficients and the obstacle information, and the obstacle information further includes the obstacle type, the obstacle state, the obstacle position information, the obstacle running information and the like. Of course, the present embodiment is merely an example, and the specific content of the perception information is not limited.
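As an illustration of deriving the obstacle running speed from the positional offset between adjacent unit moments, the following is a minimal Python sketch; the function and variable names are assumptions for illustration and are not part of the patent disclosure:

```python
def estimate_obstacle_speed(pos_prev, pos_curr, dt=1.0):
    """Estimate an obstacle's running speed (m/s) from its positional
    offset between two adjacent detection moments dt seconds apart.
    Positions are (x, y) tuples in a common coordinate frame."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    return (dx * dx + dy * dy) ** 0.5 / dt
```

For example, an obstacle that moves 2 m longitudinally between radar detections 0.1 s apart would be estimated at 20 m/s.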
Step S103, detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane change scene.
Optionally, detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane change scene includes: extracting obstacle information from the perception information when it is determined that an obstacle exists on the target lane according to the acquired image, wherein the obstacle information includes an obstacle type, an obstacle state, obstacle position information, and obstacle operation information; when the obstacle state is static, static collision detection is carried out according to the vehicle planning information and the perception information to obtain static collision information corresponding to the lane change scene; when the obstacle state is dynamic, dynamic collision detection is carried out according to the vehicle planning information and the perception information to obtain dynamic collision information corresponding to the lane changing scene.
Specifically, in the present embodiment, after the perception information and the vehicle planning information are acquired, if it is determined that an obstacle exists in the target lane, the acquired obstacle information is extracted from the perception information. Since the obstacle states include static states and dynamic states, and the collision detection methods corresponding to the different states are different, the different obstacle states are specifically described in the present embodiment.
Optionally, static collision detection is performed according to the vehicle planning information and the perception information to obtain static collision information corresponding to the lane change scene, which includes: when the obstacle type is a vehicle, determining a vehicle coverage according to obstacle position information, wherein the obstacle position information comprises a vehicle center position coordinate, a transverse angle, a vehicle length and a vehicle width; acquiring the positions of all track points on a planning path of the vehicle, and detecting according to the positions of the track points and the vehicle coverage area to generate static vehicle collision information; when the type of the obstacle is non-vehicle, acquiring the position of each track point on the planning path of the vehicle and the vehicle position; and detecting according to the positions of the track points and the vehicle position to generate static non-vehicle collision information.
When the obstacle state is static, whether the obstacle is a vehicle is further determined according to the obstacle type. Fig. 2 is a schematic diagram of collision detection when the obstacle is a static vehicle. Because a vehicle occupies a certain space, the spatial coverage of the obstacle vehicle needs to be considered when acquiring the static collision information. To acquire the coverage occupied by the obstacle vehicle, the obstacle position information is extracted from the perception information, wherein the obstacle position information comprises the vehicle center position coordinate, the heading angle, the vehicle length and the vehicle width, so that the four vertex coordinates of the vehicle are determined from the acquired obstacle position information: a (ax, ay), b (bx, by), c (cx, cy) and d (dx, dy), and the vehicle coverage is determined from these four vertex coordinates. Since the positions of the track points on the own-vehicle planned path are known, whether each track point is within the vehicle coverage is determined in turn; for example, for any point W (x, y) on the own-vehicle planned path, the detection parameters k1, k2, k3 and k4 are calculated using the following formula (1).
k1 = (bx − ax)(y − ay) − (by − ay)(x − ax)
k2 = (cx − bx)(y − by) − (cy − by)(x − bx)
k3 = (dx − cx)(y − cy) − (dy − cy)(x − cx)
k4 = (ax − dx)(y − dy) − (ay − dy)(x − dx)    (1)
When k1, k2, k3 and k4 all have the same sign, the track point W can be determined to be within the vehicle coverage corresponding to a, b, c and d; otherwise, the track point is determined not to be within the vehicle coverage. When any track point on the own-vehicle planned path is determined to be within the vehicle coverage, prompt information that the current lane change has a collision risk is generated; when no track point is within the vehicle coverage, prompt information that the current lane change has no collision risk is generated, and the obtained prompt information is used as the static vehicle collision information.
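The same-sign inclusion test described above can be sketched as follows in Python; the function names are illustrative assumptions, not identifiers from the patent:

```python
def cross_sign(p, q, w):
    """Cross-product term for directed edge p->q and test point w."""
    return (q[0] - p[0]) * (w[1] - p[1]) - (q[1] - p[1]) * (w[0] - p[0])

def point_in_coverage(w, a, b, c, d):
    """True if track point w lies inside the quadrilateral a-b-c-d
    (vertices in order): all four edge cross products share one sign."""
    ks = [cross_sign(a, b, w), cross_sign(b, c, w),
          cross_sign(c, d, w), cross_sign(d, a, w)]
    return all(k >= 0 for k in ks) or all(k <= 0 for k in ks)

def static_vehicle_collision(path_points, a, b, c, d):
    """Collision risk if any planned track point falls inside the coverage."""
    return any(point_in_coverage(w, a, b, c, d) for w in path_points)
```

With a unit-square coverage, a track point at its center is flagged while one far outside is not.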
When the obstacle state is static and the obstacle is determined to be a non-vehicle according to the obstacle type, as shown in fig. 3, a schematic diagram of collision detection when the obstacle is a static non-vehicle, in this embodiment the obstacle position information and the position of each track point on the own-vehicle planned path are subjected to coordinate conversion so as to be uniformly converted into the road coordinate system. Then, whether the converted lateral coordinate of the static non-vehicle obstacle is greater than half of the vehicle width is judged; if so, prompt information that the current lane change has no collision risk is generated, otherwise prompt information that the current lane change has a collision risk is generated, and the obtained prompt information is used as the static non-vehicle collision information.
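The static non-vehicle check reduces to a width comparison in the road coordinate system; a minimal sketch, assuming the lateral coordinate is already measured from the planned path (names are illustrative, not from the patent):

```python
def static_non_vehicle_collision(obstacle_lateral, ego_width):
    """Return True (collision risk) when a static non-vehicle obstacle's
    absolute lateral offset from the planned path, in road coordinates,
    is within half the ego vehicle's width, i.e. inside the swept corridor."""
    return abs(obstacle_lateral) <= ego_width / 2.0
```

For an ego width of 1.8 m, an obstacle 0.5 m off the path is a risk, while one 1.2 m off is clear.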
Optionally, the method for obtaining the dynamic collision information corresponding to the lane change scene by performing dynamic collision detection according to the vehicle planning information and the perception information includes: when the obstacle type is a vehicle, determining the risk condition of the obstacle according to the obstacle position information and the target lane line coefficient; when the risk condition is a risk obstacle, determining the longitudinal distance between the current obstacle vehicle and the vehicle according to the obstacle position information and the vehicle position information; judging whether the longitudinal distance is within the safety range of the vehicle, if so, generating dynamic collision information according to the vehicle running information and the obstacle running information, otherwise, directly determining that collision risk exists, and taking the collision risk as the dynamic collision information.
Optionally, generating the dynamic collision information according to the vehicle running information and the obstacle running information includes: acquiring a terminal longitudinal position of the vehicle according to the vehicle running information and the vehicle planning time, determining a terminal transverse position of the vehicle according to the terminal longitudinal position of the vehicle and a preset lane line coefficient formula, and determining a terminal position of the vehicle according to the terminal longitudinal position of the vehicle and the terminal transverse position of the vehicle; acquiring the terminal longitudinal position of the obstacle vehicle according to the obstacle running information and the vehicle planning time, determining the terminal transverse position of the obstacle vehicle according to the terminal longitudinal position of the obstacle vehicle and a preset lane line coefficient formula, and determining the terminal position of the obstacle vehicle according to the terminal longitudinal position of the obstacle vehicle and the terminal transverse position of the obstacle vehicle; dynamic collision information is generated from the own vehicle end position and the obstacle vehicle end position.
Specifically, when it is determined that the obstacle is a dynamic vehicle, the collision detection mode adopted is different from that in the static scene. Firstly, the risk condition of the obstacle is determined according to the obstacle position information and the target lane line coefficients, namely, whether the obstacle is a risk obstacle in the lane change scene is determined. Fig. 4 is a schematic diagram of determining the risk condition of an obstacle that is a dynamic vehicle. When the coordinates of the obstacle are (X0, Y0), the coefficients of the first lane line shared by the target lane and the own lane are C0, C1, C2 and C3, and the coefficients of the second lane line of the target lane far from the own lane are NextC0, NextC1, NextC2 and NextC3, the lateral coordinate Y1 on the first lane line can be calculated according to the following formula (2):
Y1 = C0 + C1·X0 + C2·X0² + C3·X0³    (2)
the lateral coordinate Y2 on the second lane line is calculated according to the following formula (3):
Y2 = NextC0 + NextC1·X0 + NextC2·X0² + NextC3·X0³    (3)
When Y0 is determined to lie within the range between Y1 and Y2, the obstacle is determined to be a risk obstacle; otherwise, it is determined to be a non-risk obstacle. In addition, when the lane line coefficients of the target lane cannot be obtained from the perception information, it is only necessary to judge whether the distance between Y0 and Y1 is smaller than a limit value and whether Y0 lies outside the lane line on the left side of the host vehicle, and the risk condition of the obstacle is determined according to the judgment result. Of course, the present embodiment is merely illustrative, and the specific manner of determining the risk condition of the obstacle is not limited.
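The risk-obstacle test above evaluates the two cubic lane-line polynomials at the obstacle's longitudinal position and checks whether its lateral coordinate falls between them. A Python sketch with illustrative names and example coefficients (flat, straight lane lines) that are not from the patent:

```python
def lane_lateral(coeffs, x):
    """Evaluate the cubic lane-line formula y = C0 + C1*x + C2*x^2 + C3*x^3."""
    c0, c1, c2, c3 = coeffs
    return c0 + c1 * x + c2 * x * x + c3 * x ** 3

def is_risk_obstacle(x0, y0, first_lane, second_lane):
    """Obstacle at (x0, y0) is a risk obstacle when its lateral coordinate
    lies between the shared (first) and far (second) lane lines of the
    target lane at that longitudinal position."""
    y1 = lane_lateral(first_lane, x0)
    y2 = lane_lateral(second_lane, x0)
    return min(y1, y2) <= y0 <= max(y1, y2)
```

For straight lane lines at 1.75 m and 5.25 m, an obstacle at lateral 3.5 m is flagged; one at 0 m (in the own lane) is not.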
Specifically, when it is determined that a risk obstacle exists on the target lane, since the target lane is the left lane, the obstacle position information and the own-vehicle position information are transformed with the lane line shared by the target lane and the own lane as a reference, the transformed own-vehicle longitudinal coordinate X0 and the transformed obstacle longitudinal coordinate X1 are obtained, and the longitudinal distance D between the obstacle vehicle and the own vehicle is obtained from the two longitudinal coordinates, as shown in fig. 5, a schematic diagram of determining the longitudinal distance between the own vehicle and a dynamic vehicle obstacle. Since the vehicle safety range is preset, when the longitudinal distance is determined to be within the vehicle safety range, there is no immediate collision risk; however, because the own vehicle and the obstacle vehicle are both in dynamic motion, the dynamic collision information still needs to be generated from the own-vehicle running information and the obstacle running information. When the longitudinal distance is determined not to be within the vehicle safety range, prompt information that the current lane change has a collision risk is directly generated, and the prompt information is used as the dynamic collision information.
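This branching on the longitudinal distance can be sketched briefly; names and the safety-range value are illustrative assumptions:

```python
def longitudinal_distance(ego_x, obs_x):
    """Longitudinal distance D after projecting both vehicles onto the
    lane line shared by the target lane and the own lane."""
    return abs(obs_x - ego_x)

def needs_dynamic_check(distance, safety_range):
    """Within the safety range: no immediate risk, so run the dynamic
    end-position check next. Outside it: report collision risk directly."""
    return distance <= safety_range
```

With a 30 m safety range, a 25 m gap proceeds to the dynamic end-position check, while a 35 m gap (outside the range) is reported directly per the branch above.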
It is worth mentioning that, when generating the dynamic collision information from the own-vehicle running information and the obstacle running information, there are two scenarios: one in which the obstacle is in front of the own vehicle with a running speed greater than 0 and less than that of the own vehicle, as shown in fig. 6, a schematic diagram of collision detection when the obstacle is a dynamic vehicle in front of the own vehicle; and another in which the obstacle is behind the own vehicle with a running speed greater than that of the own vehicle, as shown in fig. 7, a schematic diagram of collision detection when the obstacle is a dynamic vehicle behind the own vehicle. Since the collision detection manner is substantially the same in both scenarios, this embodiment specifically describes the scenario of fig. 6, in which the obstacle is a dynamic vehicle in front of the own vehicle. When the own vehicle changes lanes, the product of the current own-vehicle speed v1 and the planning time t is determined as the traveled distance, the longitudinal position corresponding to the center line of the current lane is found according to this distance and taken as the own-vehicle end longitudinal position, the own-vehicle end longitudinal position is substituted into the lane line coefficient formula on the left side of the own vehicle, such as formula (2), to obtain the own-vehicle end lateral position, and the own-vehicle end position is determined from the longitudinal and lateral positions.
Similarly, the product of the obstacle vehicle's current speed v2 and the planning time t is determined, the obstacle-vehicle end longitudinal position is determined according to this product distance, and the obstacle-vehicle end longitudinal position is then substituted into the lane line coefficient formula of the obstacle vehicle, such as formula (3), to obtain the obstacle-vehicle end lateral position; the obstacle-vehicle end position is determined from the obtained lateral and longitudinal positions. Then, with the lane line shared by the target lane and the own lane as a reference, coordinate transformation is performed on the own-vehicle end position and the obstacle-vehicle end position, the transformed own-vehicle longitudinal coordinate egoX and the transformed obstacle-vehicle longitudinal coordinate objX are obtained, and whether the difference between egoX and objX is smaller than a limit value is judged; if so, prompt information that the current lane change has a collision risk is generated, otherwise prompt information that the current lane change has no collision risk is generated, and the prompt information is used as the dynamic collision information. Of course, the present embodiment only takes the obstacle located in front of the own vehicle as an example; the principle of lane-change collision detection when the obstacle is located behind the own vehicle is substantially the same and is not repeated here.
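The end-position prediction and final comparison can be sketched as follows; the cubic lane formula mirrors formulas (2) and (3), while the function names, starting stations, and limit value are illustrative assumptions:

```python
def lane_lateral(coeffs, x):
    """Cubic lane-line formula y = C0 + C1*x + C2*x^2 + C3*x^3."""
    c0, c1, c2, c3 = coeffs
    return c0 + c1 * x + c2 * x * x + c3 * x ** 3

def predict_end_position(x_now, v, plan_t, lane_coeffs):
    """End longitudinal position = current station + speed * planned
    lane-change time; end lateral position from the lane-line formula."""
    x_end = x_now + v * plan_t
    return x_end, lane_lateral(lane_coeffs, x_end)

def dynamic_collision(ego_end_x, obs_end_x, limit):
    """Collision risk when the predicted longitudinal gap between the
    two end positions, in the shared lane-line frame, is below the limit."""
    return abs(ego_end_x - obs_end_x) < limit
```

For an ego vehicle at station 0 m doing 20 m/s and an obstacle at 30 m doing 8 m/s over a 3 s lane change, the predicted gap is 6 m, which a 10 m limit flags as a collision risk.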
According to the method, the sensing module is introduced to detect the target lane, and collision information is obtained by detecting based on the obtained sensing information and the vehicle planning information, so that the consumption of calculation resources caused by the introduction of the prediction module is reduced, and the collision detection efficiency is greatly improved.
Example two
Fig. 8 is a flowchart of a collision detection method based on lane change of a vehicle according to a second embodiment of the present invention. On the basis of the first embodiment, after the collision information is acquired, the method further includes: performing collision early warning according to the acquired collision information. As shown in fig. 8, the method comprises the following steps:
Step S201, determining the vehicle planning information and the target lane in the lane change scene.
Optionally, determining the vehicle planning information and the target lane in the lane change scene includes: extracting own vehicle planning information from own vehicles in a lane change scene; acquiring a target direction of a self-vehicle needing lane changing according to a self-vehicle planning path in self-vehicle planning information; and determining a target lane according to the current driving lane and the target direction.
Step S202, obstacle detection is carried out on a target lane through the vehicle sensing module so as to acquire sensing information.
Optionally, the vehicle sensing module comprises shooting equipment and radar equipment; obstacle detection is carried out on a target lane through a vehicle sensing module so as to obtain sensing information, and the method comprises the following steps: shooting an image of a target lane through shooting equipment to obtain an acquisition image; detecting an obstacle on a target lane through radar equipment to obtain radar detection information; and determining perception information according to the acquired image and radar detection information, wherein the perception information comprises target lane line coefficients or obstacle information.
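A minimal sketch of how the captured image and radar detection might be combined into perception information, under the assumption that the image decides whether an obstacle is present and the radar supplies its measurements; the class and field names here are illustrative, not from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerceptionInfo:
    # Carries either the target lane line coefficients (no obstacle)
    # or the obstacle information, mirroring the "or" in the text.
    lane_coeffs: Optional[tuple] = None
    obstacle: Optional[dict] = None

def build_perception(obstacle_in_image, radar_detection, lane_coeffs):
    # The captured image determines whether an obstacle exists on the
    # target lane; radar detection supplies the position/speed data.
    if obstacle_in_image:
        return PerceptionInfo(obstacle=radar_detection)
    return PerceptionInfo(lane_coeffs=lane_coeffs)
```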
Step S203, detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane change scene.
Optionally, detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane change scene includes: extracting obstacle information from the perception information when it is determined that an obstacle exists on the target lane according to the acquired image, wherein the obstacle information includes an obstacle type, an obstacle state, obstacle position information, and obstacle operation information; when the obstacle state is static, static collision detection is carried out according to the vehicle planning information and the perception information to obtain static collision information corresponding to the lane change scene; when the obstacle state is dynamic, dynamic collision detection is carried out according to the vehicle planning information and the perception information to obtain dynamic collision information corresponding to the lane changing scene.
Optionally, static collision detection is performed according to the vehicle planning information and the perception information to obtain static collision information corresponding to the lane change scene, which includes: when the obstacle type is a vehicle, determining a vehicle coverage area according to the obstacle position information, wherein the obstacle position information comprises a vehicle center position coordinate, a lateral angle, a vehicle length and a vehicle width; acquiring the positions of all track points on the planning path of the vehicle, and detecting according to the positions of the track points and the vehicle coverage area to generate static vehicle collision information; when the obstacle type is non-vehicle, acquiring the position of each track point on the planning path of the vehicle and the obstacle position information; and detecting according to the positions of the track points and the obstacle position information to generate static non-vehicle collision information.
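The rectangular coverage test behind the static vehicle-collision check can be sketched as follows: each track point on the planned path is transformed into the obstacle vehicle's body frame and compared against the half length and half width. The names and the rectangle assumption are illustrative.

```python
import math

def point_in_vehicle_box(point, center, heading, length, width):
    # Rotate the track point into the obstacle vehicle's body frame
    # (heading is the lateral angle in radians), then test it against
    # the axis-aligned half extents of the coverage rectangle.
    dx, dy = point[0] - center[0], point[1] - center[1]
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    local_x = dx * cos_h + dy * sin_h
    local_y = -dx * sin_h + dy * cos_h
    return abs(local_x) <= length / 2 and abs(local_y) <= width / 2

def static_vehicle_collision(track_points, center, heading, length, width):
    # A static collision exists if any planned track point falls
    # inside the obstacle vehicle's coverage area.
    return any(point_in_vehicle_box(p, center, heading, length, width)
               for p in track_points)
```

For a 4 m by 2 m obstacle vehicle centered at the origin with zero heading, a track point at (1.0, 0.5) lies inside the coverage area, while one at (10.0, 10.0) does not.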
Optionally, dynamic collision detection is performed according to the vehicle planning information and the perception information to obtain dynamic collision information corresponding to the lane change scene, which includes: when the obstacle type is a vehicle, determining the risk condition of the obstacle according to the obstacle position information and the target lane line coefficient; when the risk condition is a risk obstacle, determining the longitudinal distance between the current obstacle vehicle and the own vehicle according to the obstacle position information and the own-vehicle position information; and judging whether the longitudinal distance is within the own-vehicle safety range: if so, generating dynamic collision information according to the own-vehicle running information and the obstacle running information; otherwise, directly determining that a collision risk exists, and taking the collision risk as the dynamic collision information.
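The branching logic of the dynamic check reads, in sketch form, as below. The interpretation of "within the safety range" (a gap at least as large as a minimum safe distance), the linear motion projection, and all names are assumptions for illustration.

```python
def dynamic_collision_info(longitudinal_dist, safety_range,
                           ego_speed, obs_speed, plan_time, limit):
    # Assumed interpretation: a gap smaller than the safety range means
    # the obstacle is too close, so a collision risk is reported
    # directly without further projection.
    if longitudinal_dist < safety_range:
        return "collision risk"
    # Otherwise project both vehicles over the planning time (linear
    # motion assumed) and compare the end longitudinal positions.
    ego_end = ego_speed * plan_time
    obs_end = longitudinal_dist + obs_speed * plan_time
    if abs(ego_end - obs_end) < limit:
        return "collision risk"
    return "no collision risk"
```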
Step S204, collision early warning is carried out according to the acquired collision information.
Specifically, in this embodiment, after the collision information is obtained, a different pre-warning mode is adopted depending on the collision information. For example, when the collision information indicates that the current lane change carries a collision risk, a flashing red light is used for the collision pre-warning; when the collision information indicates that the current lane change carries no collision risk, a steadily lit green light is used. Of course, this is only an example, and the specific mode adopted for collision pre-warning is not limited in this embodiment.
It should be noted that, in this embodiment, different pre-warning modes may be adopted for the static collision information and the dynamic collision information, for example, a light mode may be adopted for the static collision information to perform collision pre-warning, and a voice mode may be adopted for the dynamic collision information because of a higher collision coefficient.
For example, in a specific scenario, when it is determined that both a static obstacle and a dynamic obstacle exist on the target lane, the obtained static collision information indicates a collision risk for the current lane change, and the obtained dynamic collision information indicates no collision risk, then a flashing red light is used for the static collision warning, and the voice prompt "there is no dynamic collision risk for the current lane change" is used for the dynamic collision warning. In this way, the user can comprehensively understand the collision risk of the current lane change.
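The warning-mode selection described in this embodiment might be dispatched as follows; the message strings and mode names are illustrative, not mandated by the patent.

```python
def select_warning(info_type, has_risk):
    # Static collision information uses a light mode; dynamic collision
    # information uses voice, since the embodiment treats dynamic
    # obstacles as carrying a higher collision coefficient.
    if info_type == "static":
        return "red light flashing" if has_risk else "green light steady"
    if has_risk:
        return "voice: collision risk for the current lane change"
    return "voice: no dynamic collision risk for the current lane change"
```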
In this method, the vehicle sensing module is introduced to detect the target lane, and collision information is obtained directly from the acquired sensing information and the own-vehicle planning information, so that the computational resources that a dedicated prediction module would consume are saved and collision detection efficiency is greatly improved.
Example III
Fig. 9 is a schematic structural diagram of a collision detection device based on lane change of a vehicle according to a third embodiment of the present invention. As shown in fig. 9, the apparatus includes: a vehicle planning information and target lane determining module 310, a sensing information detection module 320, and a collision information detection module 330.
The vehicle planning information and target lane determining module 310 is configured to determine vehicle planning information and target lanes in a lane change scene, where the vehicle planning information includes a vehicle planning path and a vehicle planning time;
the sensing information detection module 320 is configured to detect an obstacle of the target lane by using the vehicle sensing module to obtain sensing information;
the collision information detection module 330 is configured to detect according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane change scene.
Optionally, the self-vehicle planning information and the target lane determining module are used for extracting the self-vehicle planning information from the self-vehicle in the lane change scene;
Acquiring a target direction of a self-vehicle needing lane changing according to a self-vehicle planning path in self-vehicle planning information;
and determining a target lane according to the current driving lane and the target direction.
Optionally, the vehicle sensing module comprises shooting equipment and radar equipment; the sensing information detection module is used for carrying out image shooting on the target lane through shooting equipment to obtain an acquired image;
detecting an obstacle on a target lane through radar equipment to obtain radar detection information;
and determining perception information according to the acquired image and radar detection information, wherein the perception information comprises target lane line coefficients or obstacle information.
Optionally, the collision information detection module includes: an obstacle information extraction unit for extracting obstacle information from the perception information when it is determined that an obstacle exists on the target lane according to the acquired image, wherein the obstacle information includes an obstacle type, an obstacle state, obstacle position information, and obstacle operation information;
the static collision information detection module is used for carrying out static collision detection according to the vehicle planning information and the perception information to obtain static collision information corresponding to the lane change scene when the obstacle state is static;
And the dynamic collision information detection module is used for carrying out dynamic collision detection according to the vehicle planning information and the perception information to obtain the dynamic collision information corresponding to the lane change scene when the obstacle state is dynamic.
Optionally, the static collision information detection module is used for determining the coverage area of the vehicle according to the obstacle position information when the obstacle type is the vehicle, wherein the obstacle position information comprises the central position coordinate, the transverse angle, the length and the width of the vehicle;
acquiring the positions of all track points on a planning path of the vehicle, and detecting according to the positions of the track points and the vehicle coverage area to generate static vehicle collision information;
when the type of the obstacle is non-vehicle, acquiring the position of each track point on the planning path of the vehicle and the position information of the obstacle;
and detecting according to the positions of the track points and the position information of the obstacle to generate static non-vehicle collision information.
Optionally, the dynamic collision information detection module is used for determining the risk condition of the obstacle according to the obstacle position information and the target lane line coefficient when the obstacle type is a vehicle;
determining the longitudinal distance between the current obstacle vehicle and the vehicle according to the obstacle position information and the vehicle position information when the risk condition is a risk obstacle;
Judging whether the longitudinal distance is within the safety range of the vehicle, if so, generating dynamic collision information according to the vehicle running information and the obstacle running information,
otherwise, directly determining that collision risk exists, and taking the collision risk as dynamic collision information.
Optionally, the dynamic collision information detection module is further configured to obtain a vehicle end longitudinal position according to the vehicle running information and the vehicle planning time, determine a vehicle end lateral position according to the vehicle end longitudinal position and a preset lane line coefficient formula, and determine a vehicle end position according to the vehicle end longitudinal position and the vehicle end lateral position;
acquiring the terminal longitudinal position of the obstacle vehicle according to the obstacle running information and the vehicle planning time, determining the terminal transverse position of the obstacle vehicle according to the terminal longitudinal position of the obstacle vehicle and a preset lane line coefficient formula, and determining the terminal position of the obstacle vehicle according to the terminal longitudinal position of the obstacle vehicle and the terminal transverse position of the obstacle vehicle;
dynamic collision information is generated from the own vehicle end position and the obstacle vehicle end position.
The collision detection device based on the lane change of the vehicle provided by the embodiment of the invention can execute the collision detection method based on the lane change of the vehicle provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the executed method.
Example IV
Fig. 10 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 10, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the respective methods and processes described above, such as a collision detection method based on a lane change of the vehicle.
In some embodiments, the vehicle lane-change based collision detection method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the vehicle lane-change based collision detection method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the vehicle lane-change based collision detection method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs, which may be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor may be a special purpose or general-purpose programmable processor, and may receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
A computer program for carrying out the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/operations specified in the flowcharts and/or block diagrams to be performed. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described herein may be implemented on a device having: a display device (e.g., a touch screen) for displaying information to the user; and keys, through which (or through the touch screen) the user may provide input to the device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A vehicle lane-changing based collision detection method, comprising:
determining self-vehicle planning information and a target lane in a lane change scene, wherein the self-vehicle planning information comprises a self-vehicle planning path and self-vehicle planning time;
detecting obstacles on the target lane through a vehicle sensing module to acquire sensing information;
and detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane changing scene.
2. The method of claim 1, wherein determining the vehicle planning information and the target lane in the lane change scenario comprises:
in the lane change scene, extracting the own vehicle planning information from the own vehicle;
acquiring a target direction of a self-vehicle needing lane changing according to the self-vehicle planning path in the self-vehicle planning information;
and determining the target lane according to the current driving lane and the target direction.
3. The method of claim 1, wherein the self-vehicle perception module comprises a camera device and a radar device;
the obstacle detection for the target lane by the vehicle sensing module to obtain sensing information comprises:
the shooting equipment shoots an image of the target lane to obtain an acquisition image;
detecting an obstacle on the target lane through the radar equipment to obtain radar detection information;
and determining the perception information according to the acquired image and the radar detection information, wherein the perception information comprises target lane line coefficients or obstacle information.
4. The method of claim 3, wherein the detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to a lane change scene comprises:
Extracting the obstacle information from the perception information when it is determined that an obstacle exists on the target lane according to the acquired image, wherein the obstacle information comprises an obstacle type, an obstacle state, obstacle position information and obstacle operation information;
when the obstacle state is static, static collision detection is carried out according to the self-vehicle planning information and the perception information to obtain static collision information corresponding to a lane change scene;
and when the obstacle state is dynamic, carrying out dynamic collision detection according to the vehicle planning information and the perception information to obtain dynamic collision information corresponding to the lane changing scene.
5. The method of claim 4, wherein the performing static collision detection according to the vehicle planning information and the perception information to obtain static collision information corresponding to a lane change scene comprises:
when the obstacle type is a vehicle, determining a vehicle coverage according to the obstacle position information, wherein the obstacle position information comprises a vehicle center position coordinate, a transverse angle, a vehicle length and a vehicle width;
acquiring the positions of all track points on the vehicle planning path, and detecting according to the positions of the track points and the vehicle coverage area to generate static vehicle collision information;
When the type of the obstacle is a non-vehicle, acquiring the position of each track point on the self-vehicle planning path and the position information of the obstacle;
and detecting according to the positions of the track points and the position information of the obstacle to generate static non-vehicle collision information.
6. The method of claim 4, wherein the performing dynamic collision detection according to the vehicle planning information and the sensing information to obtain dynamic collision information corresponding to a lane change scene comprises:
when the obstacle type is a vehicle, determining a risk condition of the obstacle according to the obstacle position information and the target lane line coefficient;
when the risk condition is a risk obstacle, determining the longitudinal distance between the current obstacle vehicle and the vehicle according to the obstacle position information and the vehicle position information;
judging whether the longitudinal distance is within a self-vehicle safety range, if so, generating the dynamic collision information according to the self-vehicle operation information and the obstacle operation information,
otherwise, directly determining that collision risk exists, and taking the collision risk as the dynamic collision information.
7. The method of claim 6, wherein the generating the dynamic collision information from the vehicle operation information and the obstacle operation information comprises:
Acquiring a terminal longitudinal position of the self-vehicle according to the self-vehicle running information and the self-vehicle planning time, determining a terminal transverse position of the self-vehicle according to the terminal longitudinal position of the self-vehicle and a preset lane line coefficient formula, and determining the terminal position of the self-vehicle according to the terminal longitudinal position of the self-vehicle and the terminal transverse position of the self-vehicle;
acquiring a final longitudinal position of the obstacle vehicle according to the obstacle running information and the vehicle planning time, determining a final transverse position of the obstacle vehicle according to the final longitudinal position of the obstacle vehicle and the preset lane line coefficient formula, and determining the final position of the obstacle vehicle according to the final longitudinal position of the obstacle vehicle and the final transverse position of the obstacle vehicle;
and generating the dynamic collision information according to the vehicle terminal position and the obstacle vehicle terminal position.
8. A collision detection device based on lane change of a vehicle, comprising:
the self-vehicle planning information and target lane determining module is used for determining self-vehicle planning information and a target lane in a lane change scene, wherein the self-vehicle planning information comprises a self-vehicle planning path and self-vehicle planning time;
the perception information detection module is used for detecting the obstacle of the target lane through the vehicle perception module so as to acquire perception information;
And the collision information detection module is used for detecting according to the vehicle planning information and the perception information to obtain collision information corresponding to the lane change scene.
9. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1-7 when executing the program.
10. A storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method of any of claims 1-7.
CN202311386257.0A 2023-10-25 2023-10-25 Collision detection method, device, equipment and storage medium based on lane change of vehicle Active CN117125057B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311386257.0A CN117125057B (en) 2023-10-25 2023-10-25 Collision detection method, device, equipment and storage medium based on lane change of vehicle


Publications (2)

Publication Number Publication Date
CN117125057A true CN117125057A (en) 2023-11-28
CN117125057B CN117125057B (en) 2024-01-30

Family

ID=88861315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311386257.0A Active CN117125057B (en) 2023-10-25 2023-10-25 Collision detection method, device, equipment and storage medium based on lane change of vehicle

Country Status (1)

Country Link
CN (1) CN117125057B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170364083A1 (en) * 2016-06-21 2017-12-21 Baidu Online Network Technology (Beijing) Co., Ltd. Local trajectory planning method and apparatus for smart vehicles
CN112416004A (en) * 2020-11-19 2021-02-26 腾讯科技(深圳)有限公司 Control method and device based on automatic driving, vehicle and related equipment
CN113479217A (en) * 2021-07-26 2021-10-08 惠州华阳通用电子有限公司 Lane changing and obstacle avoiding method and system based on automatic driving
CN115179949A (en) * 2022-09-13 2022-10-14 毫末智行科技有限公司 Vehicle pressure speed changing control method, device, equipment and storage medium
CN116252817A (en) * 2023-04-25 2023-06-13 中国第一汽车股份有限公司 Automatic driving lane change decision method, device, equipment and storage medium
CN116872921A (en) * 2023-07-26 2023-10-13 重庆长安汽车股份有限公司 Method and system for avoiding risks of vehicle, vehicle and storage medium


Also Published As

Publication number Publication date
CN117125057B (en) 2024-01-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant