CN117601855A - Target motion prediction method, system and storage medium - Google Patents

Target motion prediction method, system and storage medium

Info

Publication number
CN117601855A
CN117601855A (application CN202311569750.6A)
Authority
CN
China
Prior art keywords
target
self
vehicle
vehicle driving
longitudinal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311569750.6A
Other languages
Chinese (zh)
Inventor
张舒琦
马晓炜
孙超
田贺
芦畅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DIAS Automotive Electronic Systems Co Ltd
Original Assignee
DIAS Automotive Electronic Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DIAS Automotive Electronic Systems Co Ltd filed Critical DIAS Automotive Electronic Systems Co Ltd
Priority to CN202311569750.6A
Publication of CN117601855A
Legal status: Pending

Classifications

    • B60W30/0956 — Active safety systems: predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W30/09 — Active safety systems: taking automatic action to avoid collision, e.g. braking and steering
    • B60W2554/40 — Input parameters relating to dynamic objects, e.g. animals, windblown objects
    • B60W2554/4026 — Object type: cycles
    • B60W2554/4029 — Object type: pedestrians
    • B60W2554/4041 — Object characteristics: position
    • B60W2554/4042 — Object characteristics: longitudinal speed
    • B60W2554/4043 — Object characteristics: lateral speed
    • Y02T10/40 — Engine management systems (climate change mitigation technologies related to transportation)

Abstract

The invention discloses a target motion prediction method for identifying targets under a vehicle emergency braking working condition, comprising the following steps: sensing and identifying the target type; calculating, from the sensor-perceived environment data, whether the target is currently located within the self-vehicle driving lane; if the target is within the self-vehicle driving lane, predicting the target motion state with a uniform acceleration model using the perceived lateral and longitudinal velocity and lateral and longitudinal acceleration of the target; if the target is outside the self-vehicle driving lane, performing no lateral motion prediction for the target: the perceived lateral velocity and lateral acceleration are cleared, the longitudinal velocity and longitudinal acceleration are retained, and the uniform acceleration model predicts the target motion state; judging whether the target is currently within the self-vehicle driving trajectory; judging whether the target is within the self-vehicle driving trajectory at the TTC moment; and finally deciding whether to trigger AEB.

Description

Target motion prediction method, system and storage medium
Technical Field
The invention relates to the field of automobiles, in particular to a target motion prediction method, a target motion prediction system and a storage medium for identifying targets under emergency braking conditions of vehicles.
Background
With advances in intelligent driving technology and the improvement of related regulations, the automatic emergency braking (AEB) function has become an essential link in advanced driver assistance systems and unmanned driving systems. When the self-vehicle is about to collide with other road participants such as vehicles, pedestrians and bicycles, the automatic emergency braking function brakes in advance to mitigate or avoid the collision, thereby reducing traffic accidents and protecting the lives and property of drivers and passengers.
Target motion state prediction is an important link in the automatic emergency braking algorithm. Only by predicting the future motion state of the target with reasonable accuracy can the collision risk be correctly estimated and the automatic emergency braking function be triggered at the right time: triggering too early, or when no collision risk exists, degrades the driver's driving experience or causes rear-end accidents, while triggering too late fails to achieve the intended collision-mitigation effect.
The main difficulty in current target motion state prediction is that the target attributes output by environment perception are inaccurate, especially for smaller targets such as pedestrians, bicycles, battery cars (electric two-wheelers) and motorcycles; the perception results output by commonly used sensors such as cameras and radars often differ considerably from the target attributes in the real environment. If an erroneous perception result is used to predict the future motion state of the target, the risk assessment is often wrong, which produces false triggering of automatic emergency braking.
Therefore, a scheme is needed that can accurately predict the motion of a forward target and avoid false triggering of the self-vehicle's automatic emergency braking.
Disclosure of Invention
In this summary section, a selection of concepts is introduced in a simplified form; these concepts are described in further detail in the detailed description section. This summary is not intended to identify the key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The invention aims to provide a target motion prediction method capable of avoiding false triggering of emergency braking caused by inaccurate recognition of target motion.
To solve the above technical problem, the invention provides a target motion prediction method for targets such as pedestrians, bicycles, battery cars and motorcycles in a driving road environment, which reduces false triggering of the automatic emergency braking function for such targets.
The invention provides a target motion prediction method for identifying targets under a vehicle emergency braking working condition, comprising the following steps:
S1, sensing and identifying the target type; this step acquires data from a vehicle-mounted perception system, and any prior-art vehicle-mounted perception system meeting the requirements of the invention can be used, namely one that can sense and identify pedestrians, bicycles, battery cars and/or motorcycles;
S2, calculating, from the sensor-perceived environment data, whether the target is currently located within the self-vehicle driving lane;
S3, if the target is within the self-vehicle driving lane, predicting the target motion state with a uniform acceleration model using the perceived lateral and longitudinal velocity and lateral and longitudinal acceleration of the target;
if the target is outside the self-vehicle driving lane, performing no lateral motion prediction for the target: the perceived lateral velocity and lateral acceleration are cleared, the longitudinal velocity and longitudinal acceleration are retained, and the uniform acceleration model predicts the target motion state;
S4, judging whether the target is currently within the self-vehicle driving trajectory;
S5, judging whether the target is within the self-vehicle driving trajectory at the TTC moment;
S6, deciding whether to trigger AEB according to the judgment result of step S4, the judgment result of step S5 and the AEB danger threshold.
The target categories include: at least one of pedestrians, bicycles, battery cars and motorcycles.
Optionally, in a further refinement of the target motion prediction method, calculating whether the target is currently located within the self-vehicle driving lane comprises:
in the self-vehicle coordinate system, at the target's current longitudinal distance, calculating the lateral position x_lanemark_right of the right lane line of the self-vehicle driving lane, the lateral position x_lanemark_left of the left lane line, and the lateral position x_obj of the target;
if the right lane line type is a solid line or a road edge and x_obj < x_lanemark_right + C1, judging that the target is located outside the right lane line of the self-vehicle driving lane;
if the left lane line type is a solid line or a road edge and x_obj > x_lanemark_left - C1, judging that the target is located outside the left lane line of the self-vehicle driving lane;
otherwise, judging that the target is located within the lane lines of the self-vehicle driving lane;
where C1 is a calibration quantity, a threshold calibrated according to the sensor perception error during road testing.
Optionally, in a further refinement of the target motion prediction method, step S4 comprises:
the self-vehicle driving trajectory equation is calculated from the self-vehicle's current yaw rate, assuming circular motion;
the distances between the four corner points of the target and the center of the current self-vehicle driving trajectory are calculated and denoted y_corner1, y_corner2, y_corner3 and y_corner4, respectively;
y_min = min(abs(y_corner1), abs(y_corner2), abs(y_corner3), abs(y_corner4));
if the signs of y_corner1, y_corner2, y_corner3 and y_corner4 in this coordinate system are not all the same, or |y_min| < 0.5 × (vehicle width + target width) - C2, the target is judged to be currently within the self-vehicle driving trajectory, where C2 is a calibration quantity, a calibration threshold determined according to the sensor error range and the tolerated degree of target intrusion.
Optionally, in a further refinement of the target motion prediction method, step S5 comprises:
calculating the time TTC required for the self-vehicle to reach the target in the longitudinal direction;
the relative distance between the self-vehicle and the target is X_rel = X_obj - X_ego;
the relative velocity between the self-vehicle and the target is V_rel = V_obj - V_ego;
the relative acceleration between the self-vehicle and the target is A_rel = A_obj - A_ego;
where X_obj is the target longitudinal position, X_ego the self-vehicle longitudinal position, V_obj the target longitudinal velocity, V_ego the self-vehicle longitudinal velocity, A_obj the target longitudinal acceleration, and A_ego the self-vehicle longitudinal acceleration.
Optionally, in a further refinement of the target motion prediction method, step S6 comprises:
triggering AEB if the target is within the self-vehicle driving trajectory both at the current moment and at the TTC moment and the AEB danger threshold is exceeded.
The invention provides a computer-readable storage medium in which a computer program is stored which, when executed, is adapted to carry out the steps of any one of the above target motion prediction methods.
The present invention provides a target motion prediction system for a vehicle emergency braking system, comprising:
a data receiving unit, which acquires the perceived target type from the vehicle-mounted perception system;
a calculation unit, which calculates whether the target is currently located within the self-vehicle driving lane;
a first judgment unit, which selects the target lateral and longitudinal motion prediction parameters and model according to the positional relationship between the target and the lane lines of the self-vehicle driving lane;
a second judgment unit, which judges whether the target is currently within the self-vehicle driving trajectory;
a third judgment unit, which judges whether the target is within the self-vehicle driving trajectory at the TTC moment;
a fourth judgment unit, which decides whether to trigger AEB according to the judgment result of the second judgment unit, the judgment result of the third judgment unit and the AEB danger threshold.
Optionally, in a further refinement of the target motion prediction system, the calculation unit calculates whether the target is currently located within the self-vehicle driving lane as follows:
in the self-vehicle coordinate system, at the target's current longitudinal distance, calculating the lateral position x_lanemark_right of the right lane line of the self-vehicle driving lane, the lateral position x_lanemark_left of the left lane line, and the lateral position x_obj of the target;
if the right lane line type is a solid line or a road edge and x_obj < x_lanemark_right + C1, judging that the target is located outside the right lane line of the self-vehicle driving lane;
if the left lane line type is a solid line or a road edge and x_obj > x_lanemark_left - C1, judging that the target is located outside the left lane line of the self-vehicle driving lane;
otherwise, judging that the target is located within the lane lines of the self-vehicle driving lane;
where C1 is a calibration quantity, a threshold calibrated according to the sensor perception error during road testing.
Optionally, in a further refinement of the target motion prediction system, the first judgment unit selects the target lateral and longitudinal motion prediction parameters and model according to the positional relationship between the target and the lane lines of the self-vehicle driving lane as follows:
if the target is within the self-vehicle driving lane, predicting the target motion state with a uniform acceleration model using the perceived lateral and longitudinal velocity and lateral and longitudinal acceleration of the target;
if the target is outside the self-vehicle driving lane, performing no lateral motion prediction for the target: the perceived lateral velocity and lateral acceleration are cleared, the longitudinal velocity and longitudinal acceleration are retained, and the uniform acceleration model predicts the target motion state.
Optionally, in a further refinement of the target motion prediction system, the second judgment unit judges whether the target is currently within the self-vehicle driving trajectory as follows:
the self-vehicle driving trajectory equation is calculated from the self-vehicle's current yaw rate, assuming circular motion;
the distances between the four corner points of the target and the center of the current self-vehicle driving trajectory are calculated and denoted y_corner1, y_corner2, y_corner3 and y_corner4, respectively;
y_min = min(abs(y_corner1), abs(y_corner2), abs(y_corner3), abs(y_corner4));
if the signs of y_corner1, y_corner2, y_corner3 and y_corner4 in this coordinate system are not all the same, or |y_min| < 0.5 × (vehicle width + target width) - C2, the target is judged to be currently within the self-vehicle driving trajectory, where C2 is a calibration quantity, a calibration threshold determined according to the sensor error range and the tolerated degree of target intrusion.
Optionally, in a further refinement of the target motion prediction system, the third judgment unit judges whether the target is within the self-vehicle driving trajectory at the TTC moment as follows:
calculating the time TTC required for the self-vehicle to reach the target in the longitudinal direction;
the relative distance between the self-vehicle and the target is X_rel = X_obj - X_ego;
the relative velocity between the self-vehicle and the target is V_rel = V_obj - V_ego;
the relative acceleration between the self-vehicle and the target is A_rel = A_obj - A_ego;
where X_obj is the target longitudinal position, X_ego the self-vehicle longitudinal position, V_obj the target longitudinal velocity, V_ego the self-vehicle longitudinal velocity, A_obj the target longitudinal acceleration, and A_ego the self-vehicle longitudinal acceleration.
Optionally, in a further refinement of the target motion prediction system, the fourth judgment unit decides whether to trigger AEB according to the judgment result of the second judgment unit, the judgment result of the third judgment unit and the AEB danger threshold as follows:
triggering AEB if the target is within the self-vehicle driving trajectory both at the current moment and at the TTC moment and the AEB danger threshold is exceeded.
The invention selects the target lateral and longitudinal motion prediction parameters and model according to whether the target is currently located within the self-vehicle driving lane, and evaluates collision danger by predicting whether the target is within the self-vehicle driving trajectory at the current moment and at the TTC moment. This prediction method can eliminate false triggering of automatic emergency braking caused by inaccurate motion attributes of the perceived target (smaller targets such as pedestrians, bicycles, battery cars and motorcycles), for example false triggering where a collision risk exists at the current moment but no collision risk exists at the TTC moment. It thereby improves the driver's driving experience and allows the safety benefit of the automatic emergency braking function to be fully realized without causing safety problems such as rear-end collisions.
Drawings
The accompanying drawings are intended to illustrate the general features of the methods, structures and/or materials used in certain exemplary embodiments of the invention and to supplement the description in this specification. The drawings are, however, schematic and not to scale; they may not accurately reflect the precise structural or performance characteristics of any given embodiment and should not be construed as limiting the range of values or properties encompassed by the exemplary embodiments of the invention. The invention is described in further detail below with reference to the accompanying drawings and the detailed description:
FIG. 1 is a schematic diagram of the positional relationship among a host vehicle, a target, and lane lines (including road edges).
Fig. 2 is a schematic diagram of a positional relationship for determining whether a target is in a track of a vehicle at the current time.
Fig. 3 is a schematic diagram of the positional relationship for determining whether the target is within the self-vehicle driving trajectory at the TTC moment.
Detailed Description
Other advantages and technical effects of the present invention will become more fully apparent to those skilled in the art from the following disclosure, which is a detailed description of the present invention given by way of specific examples. The invention may be practiced or carried out in different embodiments, and details in this description may be applied from different points of view, without departing from the general inventive concept. It should be noted that the following embodiments and features in the embodiments may be combined with each other without conflict. The following exemplary embodiments of the present invention may be embodied in many different forms and should not be construed as limited to the specific embodiments set forth herein. It should be appreciated that these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the technical solution of these exemplary embodiments to those skilled in the art. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Like reference numerals refer to like elements throughout the several views. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
First embodiment
The invention provides a target motion prediction method for identifying targets under a vehicle emergency braking working condition, comprising the following steps:
S1, sensing and identifying the target type; the target categories include at least one of pedestrians, bicycles, battery cars and motorcycles; optionally, the invention achieves its best prediction effect for these target types;
S2, calculating, from the sensor-perceived environment data, whether the target is currently located within the self-vehicle driving lane;
referring to fig. 1, in the self-vehicle coordinate system (left positive, right negative), at the target's current longitudinal distance, the lateral position of the right lane line of the self-vehicle driving lane is x_lanemark_right, the lateral position of the left lane line is x_lanemark_left, and the lateral position of the target is x_obj;
if the right lane line type is a solid line or a road edge and x_obj < x_lanemark_right + C1, the target is judged to be located outside the right lane line of the self-vehicle driving lane;
if the left lane line type is a solid line or a road edge and x_obj > x_lanemark_left - C1, the target is judged to be located outside the left lane line of the self-vehicle driving lane;
otherwise, the target is judged to be located within the lane lines of the self-vehicle driving lane;
C1 is a calibration quantity, a threshold calibrated according to the sensor perception error during road testing;
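For illustration only, this step S2 lane-position check can be sketched as the following minimal Python example; the function and variable names are hypothetical and the left-positive self-vehicle coordinate system described above is assumed, so the sketch is not a prescribed implementation of the claimed method:

def classify_lane_position(x_obj, x_lanemark_left, x_lanemark_right,
                           left_type, right_type, c1):
    """Return 'outside_right', 'outside_left' or 'inside' for the target.

    x_obj             lateral position of the target [m]
    x_lanemark_left   lateral position of the left lane line [m]
    x_lanemark_right  lateral position of the right lane line [m]
    left_type, right_type  lane-line types, e.g. 'solid', 'dashed', 'road_edge'
    c1                calibrated margin covering sensor perception error [m]
    """
    blocking = ("solid", "road_edge")  # line types a small target is unlikely to cross
    if right_type in blocking and x_obj < x_lanemark_right + c1:
        return "outside_right"
    if left_type in blocking and x_obj > x_lanemark_left - c1:
        return "outside_left"
    return "inside"

Here the margin c1 plays the role of the calibration quantity C1 described above.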
S3, if the target is within the self-vehicle driving lane, predicting the target motion state with a uniform acceleration model using the perceived lateral and longitudinal velocity and lateral and longitudinal acceleration of the target;
if the target is outside the self-vehicle driving lane, the lane line (or road edge) is considered to present a certain obstruction to the target crossing laterally; under this condition the probability that a target such as a pedestrian, bicycle, battery car or motorcycle crosses the lane line (road edge) is small, so no lateral motion prediction is performed for the target: the perceived lateral velocity and lateral acceleration are cleared, the longitudinal velocity and longitudinal acceleration are retained, and the uniform acceleration model predicts the target motion state;
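As an illustrative sketch of step S3 (hypothetical names, not a prescribed implementation), the uniform acceleration prediction with the lateral components cleared for targets outside the self-vehicle driving lane can be written as:

def predict_target_state(pos_long, pos_lat, v_long, v_lat, a_long, a_lat,
                         in_lane, dt):
    """Predict the target position and velocity dt seconds ahead.

    pos_long, pos_lat  longitudinal / lateral position [m]
    v_long, v_lat      longitudinal / lateral velocity [m/s]
    a_long, a_lat      longitudinal / lateral acceleration [m/s^2]
    in_lane            True if the target is inside the self-vehicle driving lane
    """
    if not in_lane:
        v_lat, a_lat = 0.0, 0.0  # no lateral motion prediction outside the lane
    pos_long_pred = pos_long + v_long * dt + 0.5 * a_long * dt * dt  # uniform acceleration, longitudinal
    pos_lat_pred = pos_lat + v_lat * dt + 0.5 * a_lat * dt * dt      # uniform acceleration, lateral
    v_long_pred = v_long + a_long * dt
    v_lat_pred = v_lat + a_lat * dt
    return pos_long_pred, pos_lat_pred, v_long_pred, v_lat_pred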
S4, referring to FIG. 2, judging whether the target is currently within the self-vehicle driving trajectory comprises:
the self-vehicle driving trajectory equation is calculated from the self-vehicle's current yaw rate, assuming circular motion;
the distances between the four corner points of the target and the center of the current self-vehicle driving trajectory are calculated and denoted y_corner1, y_corner2, y_corner3 and y_corner4, respectively;
y_min = min(abs(y_corner1), abs(y_corner2), abs(y_corner3), abs(y_corner4));
if the signs of y_corner1, y_corner2, y_corner3 and y_corner4 in this coordinate system are not all the same, or |y_min| < 0.5 × (vehicle width + target width) - C2, the target is judged to be currently within the self-vehicle driving trajectory, where C2 is a calibration quantity, a calibration threshold determined according to the sensor error range and the tolerated degree of target intrusion.
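A minimal illustrative sketch of this step S4 check follows. It assumes the self-vehicle driving trajectory is approximated as a circle of radius v_ego / yaw_rate and that the corner offsets y_corner1 to y_corner4 are signed distances from the trajectory center line; these assumptions and the helper name are illustrative only:

import math

def target_in_trajectory(corners_xy, v_ego, yaw_rate, vehicle_width, target_width, c2):
    """corners_xy: four (longitudinal, lateral) target corner points in the self-vehicle frame."""
    if abs(yaw_rate) < 1e-4:
        # near-straight driving: the trajectory center line is the longitudinal axis
        y_corners = [lat for _, lat in corners_xy]
    else:
        r = v_ego / yaw_rate  # signed turn radius; turn center assumed at (0, r)
        y_corners = [math.hypot(lon, lat - r) - abs(r) for lon, lat in corners_xy]
    y_min = min(abs(y) for y in y_corners)
    straddles = min(y_corners) < 0.0 < max(y_corners)  # corner signs differ: target straddles the center line
    return straddles or y_min < 0.5 * (vehicle_width + target_width) - c2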
S5, referring to FIG. 3, judging whether the target is within the self-vehicle driving trajectory at the TTC moment comprises:
calculating the time TTC required for the self-vehicle to reach the target in the longitudinal direction;
the relative distance between the self-vehicle and the target is X_rel = X_obj - X_ego;
the relative velocity between the self-vehicle and the target is V_rel = V_obj - V_ego;
the relative acceleration between the self-vehicle and the target is A_rel = A_obj - A_ego;
where X_obj is the target longitudinal position, X_ego the self-vehicle longitudinal position, V_obj the target longitudinal velocity, V_ego the self-vehicle longitudinal velocity, A_obj the target longitudinal acceleration, and A_ego the self-vehicle longitudinal acceleration;
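The description does not spell out the TTC formula itself; a common choice, assumed here purely for illustration, is the smallest positive root of the relative uniform acceleration motion X_rel + V_rel·t + 0.5·A_rel·t² = 0, as sketched below with a hypothetical helper name:

import math

def longitudinal_ttc(x_rel, v_rel, a_rel, eps=1e-6):
    """Return the longitudinal time-to-collision in seconds, or None if no collision is predicted."""
    if abs(a_rel) < eps:  # (near-)constant relative velocity
        return -x_rel / v_rel if v_rel < -eps and x_rel > 0.0 else None
    disc = v_rel * v_rel - 2.0 * a_rel * x_rel  # discriminant of 0.5*a*t^2 + v*t + x = 0
    if disc < 0.0:
        return None
    roots = [(-v_rel - math.sqrt(disc)) / a_rel, (-v_rel + math.sqrt(disc)) / a_rel]
    positive = [t for t in roots if t > 0.0]
    return min(positive) if positive else None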
S6, triggering AEB if the target is within the self-vehicle driving trajectory both at the current moment and at the TTC moment and the AEB danger threshold is exceeded.
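A one-line illustrative sketch of the step S6 decision is given below; the danger metric and threshold semantics are assumptions, since the description only states that AEB is triggered when both in-trajectory checks hold and the AEB danger threshold is exceeded:

def should_trigger_aeb(in_traj_now, in_traj_at_ttc, danger_level, aeb_danger_threshold):
    # Trigger only when the target is in the driving trajectory now and at the TTC moment
    # and the (assumed) danger level exceeds the AEB danger threshold.
    return in_traj_now and in_traj_at_ttc and danger_level > aeb_danger_threshold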
Second embodiment
The invention provides a computer-readable storage medium in which a computer program is stored which, when executed, is adapted to carry out the steps of the target motion prediction method according to any one of claims 1-5.
Information storage may be implemented by any method or technology, on both volatile and non-volatile, removable and non-removable media. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media does not include transitory computer-readable media such as modulated data signals and carrier waves.
Furthermore, it will be understood that, although the terms "first," "second," etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the example embodiments of the present invention.
Third embodiment
the present invention provides a target motion prediction system for a vehicle emergency braking system, comprising:
a data receiving unit, which acquires the perceived target type from the vehicle-mounted perception system;
a calculation unit, which calculates whether the target is currently located within the self-vehicle driving lane, comprising:
in the self-vehicle coordinate system, at the target's current longitudinal distance, calculating the lateral position x_lanemark_right of the right lane line of the self-vehicle driving lane, the lateral position x_lanemark_left of the left lane line, and the lateral position x_obj of the target;
if the right lane line type is a solid line or a road edge and x_obj < x_lanemark_right + C1, judging that the target is located outside the right lane line of the self-vehicle driving lane;
if the left lane line type is a solid line or a road edge and x_obj > x_lanemark_left - C1, judging that the target is located outside the left lane line of the self-vehicle driving lane;
otherwise, judging that the target is located within the lane lines of the self-vehicle driving lane;
where C1 is a calibration quantity, a threshold calibrated according to the sensor perception error during road testing;
a first judgment unit, which selects the target lateral and longitudinal motion prediction parameters and model according to the positional relationship between the target and the lane lines of the self-vehicle driving lane, comprising:
if the target is within the self-vehicle driving lane, predicting the target motion state with a uniform acceleration model using the perceived lateral and longitudinal velocity and lateral and longitudinal acceleration of the target;
if the target is outside the self-vehicle driving lane, performing no lateral motion prediction for the target: the perceived lateral velocity and lateral acceleration are cleared, the longitudinal velocity and longitudinal acceleration are retained, and the uniform acceleration model predicts the target motion state;
a second judgment unit, which judges whether the target is currently within the self-vehicle driving trajectory, comprising:
the self-vehicle driving trajectory equation is calculated from the self-vehicle's current yaw rate, assuming circular motion;
the distances between the four corner points of the target and the center of the current self-vehicle driving trajectory are calculated and denoted y_corner1, y_corner2, y_corner3 and y_corner4, respectively;
y_min = min(abs(y_corner1), abs(y_corner2), abs(y_corner3), abs(y_corner4));
if the signs of y_corner1, y_corner2, y_corner3 and y_corner4 in this coordinate system are not all the same, or |y_min| < 0.5 × (vehicle width + target width) - C2, the target is judged to be currently within the self-vehicle driving trajectory, where C2 is a calibration quantity, a calibration threshold determined according to the sensor error range and the tolerated degree of target intrusion;
a third judgment unit, which judges whether the target is within the self-vehicle driving trajectory at the TTC moment, comprising: calculating the time TTC required for the self-vehicle to reach the target in the longitudinal direction;
the relative distance between the self-vehicle and the target is X_rel = X_obj - X_ego;
the relative velocity between the self-vehicle and the target is V_rel = V_obj - V_ego;
the relative acceleration between the self-vehicle and the target is A_rel = A_obj - A_ego;
where X_obj is the target longitudinal position, X_ego the self-vehicle longitudinal position, V_obj the target longitudinal velocity, V_ego the self-vehicle longitudinal velocity, A_obj the target longitudinal acceleration, and A_ego the self-vehicle longitudinal acceleration;
a fourth judgment unit, which decides whether to trigger AEB according to the judgment result of the second judgment unit, the judgment result of the third judgment unit and the AEB danger threshold, comprising: triggering AEB if the target is within the self-vehicle driving trajectory both at the current moment and at the TTC moment and the AEB danger threshold is exceeded.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The present invention has been described in detail by way of specific embodiments and examples, but these should not be construed as limiting the invention. Many variations and modifications may be made by one skilled in the art without departing from the principles of the invention, which is also considered to be within the scope of the invention.

Claims (12)

1. A target motion prediction method for identifying a target under a vehicle emergency braking working condition, characterized by comprising the following steps:
S1, sensing and identifying the target type;
S2, calculating, from the sensor-perceived environment data, whether the target is currently located within the self-vehicle driving lane;
S3, if the target is within the self-vehicle driving lane, predicting the target motion state with a uniform acceleration model using the perceived lateral and longitudinal velocity and lateral and longitudinal acceleration of the target;
if the target is outside the self-vehicle driving lane, performing no lateral motion prediction for the target: the perceived lateral velocity and lateral acceleration are cleared, the longitudinal velocity and longitudinal acceleration are retained, and the uniform acceleration model predicts the target motion state;
S4, judging whether the target is currently within the self-vehicle driving trajectory;
S5, judging whether the target is within the self-vehicle driving trajectory at the TTC moment;
S6, deciding whether to trigger AEB according to the judgment result of step S4, the judgment result of step S5 and the AEB danger threshold.
2. The target motion prediction method according to claim 1, characterized in that calculating whether the target is currently located within the self-vehicle driving lane comprises:
in the self-vehicle coordinate system, at the target's current longitudinal distance, calculating the lateral position x_lanemark_right of the right lane line of the self-vehicle driving lane, the lateral position x_lanemark_left of the left lane line, and the lateral position x_obj of the target;
if the right lane line type is a solid line or a road edge and x_obj < x_lanemark_right + C1, judging that the target is located outside the right lane line of the self-vehicle driving lane;
if the left lane line type is a solid line or a road edge and x_obj > x_lanemark_left - C1, judging that the target is located outside the left lane line of the self-vehicle driving lane;
otherwise, judging that the target is located within the lane lines of the self-vehicle driving lane;
where C1 is a calibration quantity, a threshold calibrated according to the sensor perception error during road testing.
3. The target motion prediction method according to claim 1, characterized in that step S4 comprises:
the self-vehicle driving trajectory equation is calculated from the self-vehicle's current yaw rate, assuming circular motion;
the distances between the four corner points of the target and the center of the current self-vehicle driving trajectory are calculated and denoted y_corner1, y_corner2, y_corner3 and y_corner4, respectively;
y_min = min(abs(y_corner1), abs(y_corner2), abs(y_corner3), abs(y_corner4));
if the signs of y_corner1, y_corner2, y_corner3 and y_corner4 in this coordinate system are not all the same, or |y_min| < 0.5 × (vehicle width + target width) - C2, the target is judged to be currently within the self-vehicle driving trajectory, where C2 is a calibration quantity, a calibration threshold determined according to the sensor error range and the tolerated degree of target intrusion.
4. The target motion prediction method according to claim 1, characterized in that step S5 comprises:
calculating the time TTC required for the self-vehicle to reach the target in the longitudinal direction;
the relative distance between the self-vehicle and the target is X_rel = X_obj - X_ego;
the relative velocity between the self-vehicle and the target is V_rel = V_obj - V_ego;
the relative acceleration between the self-vehicle and the target is A_rel = A_obj - A_ego;
where X_obj is the target longitudinal position, X_ego the self-vehicle longitudinal position, V_obj the target longitudinal velocity, V_ego the self-vehicle longitudinal velocity, A_obj the target longitudinal acceleration, and A_ego the self-vehicle longitudinal acceleration.
5. The target motion prediction method according to claim 1, characterized in that step S6 comprises:
triggering AEB if the target is within the self-vehicle driving trajectory both at the current moment and at the TTC moment and the AEB danger threshold is exceeded.
6. A computer-readable storage medium, characterized in that a computer program is stored therein which, when executed, is adapted to carry out the steps of the target motion prediction method according to any one of claims 1-5.
7. A target motion prediction system for a vehicle emergency braking system, characterized by comprising:
a data receiving unit, which acquires the perceived target type from the vehicle-mounted perception system;
a calculation unit, which calculates whether the target is currently located within the self-vehicle driving lane;
a first judgment unit, which selects the target lateral and longitudinal motion prediction parameters and model according to the positional relationship between the target and the lane lines of the self-vehicle driving lane;
a second judgment unit, which judges whether the target is currently within the self-vehicle driving trajectory;
a third judgment unit, which judges whether the target is within the self-vehicle driving trajectory at the TTC moment;
a fourth judgment unit, which decides whether to trigger AEB according to the judgment result of the second judgment unit, the judgment result of the third judgment unit and the AEB danger threshold.
8. The target motion prediction system according to claim 7, characterized in that the calculation unit calculates whether the target is currently located within the self-vehicle driving lane as follows:
in the self-vehicle coordinate system, at the target's current longitudinal distance, calculating the lateral position x_lanemark_right of the right lane line of the self-vehicle driving lane, the lateral position x_lanemark_left of the left lane line, and the lateral position x_obj of the target;
if the right lane line type is a solid line or a road edge and x_obj < x_lanemark_right + C1, judging that the target is located outside the right lane line of the self-vehicle driving lane;
if the left lane line type is a solid line or a road edge and x_obj > x_lanemark_left - C1, judging that the target is located outside the left lane line of the self-vehicle driving lane;
otherwise, judging that the target is located within the lane lines of the self-vehicle driving lane;
where C1 is a calibration quantity, a threshold calibrated according to the sensor perception error during road testing.
9. The target motion prediction system according to claim 7, characterized in that the first judgment unit selects the target lateral and longitudinal motion prediction parameters and model according to the positional relationship between the target and the lane lines of the self-vehicle driving lane as follows:
if the target is within the self-vehicle driving lane, predicting the target motion state with a uniform acceleration model using the perceived lateral and longitudinal velocity and lateral and longitudinal acceleration of the target;
if the target is outside the self-vehicle driving lane, performing no lateral motion prediction for the target: the perceived lateral velocity and lateral acceleration are cleared, the longitudinal velocity and longitudinal acceleration are retained, and the uniform acceleration model predicts the target motion state.
10. The target motion prediction system according to claim 7, characterized in that the second judgment unit judges whether the target is currently within the self-vehicle driving trajectory as follows:
the self-vehicle driving trajectory equation is calculated from the self-vehicle's current yaw rate, assuming circular motion;
the distances between the four corner points of the target and the center of the current self-vehicle driving trajectory are calculated and denoted y_corner1, y_corner2, y_corner3 and y_corner4, respectively;
y_min = min(abs(y_corner1), abs(y_corner2), abs(y_corner3), abs(y_corner4));
if the signs of y_corner1, y_corner2, y_corner3 and y_corner4 in this coordinate system are not all the same, or |y_min| < 0.5 × (vehicle width + target width) - C2, the target is judged to be currently within the self-vehicle driving trajectory, where C2 is a calibration quantity, a calibration threshold determined according to the sensor error range and the tolerated degree of target intrusion.
11. The target motion prediction system according to claim 7, characterized in that the third judgment unit judges whether the target is within the self-vehicle driving trajectory at the TTC moment as follows:
calculating the time TTC required for the self-vehicle to reach the target in the longitudinal direction;
the relative distance between the self-vehicle and the target is X_rel = X_obj - X_ego;
the relative velocity between the self-vehicle and the target is V_rel = V_obj - V_ego;
the relative acceleration between the self-vehicle and the target is A_rel = A_obj - A_ego;
where X_obj is the target longitudinal position, X_ego the self-vehicle longitudinal position, V_obj the target longitudinal velocity, V_ego the self-vehicle longitudinal velocity, A_obj the target longitudinal acceleration, and A_ego the self-vehicle longitudinal acceleration.
12. The target motion prediction system according to claim 7, characterized in that the fourth judgment unit decides whether to trigger AEB according to the judgment result of the second judgment unit, the judgment result of the third judgment unit and the AEB danger threshold as follows:
triggering AEB if the target is within the self-vehicle driving trajectory both at the current moment and at the TTC moment and the AEB danger threshold is exceeded.
CN202311569750.6A 2023-11-23 2023-11-23 Target motion prediction method, system and storage medium Pending CN117601855A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311569750.6A CN117601855A (en) 2023-11-23 2023-11-23 Target motion prediction method, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311569750.6A CN117601855A (en) 2023-11-23 2023-11-23 Target motion prediction method, system and storage medium

Publications (1)

Publication Number Publication Date
CN117601855A (en) 2024-02-27

Family

ID=89947433

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311569750.6A Pending CN117601855A (en) 2023-11-23 2023-11-23 Target motion prediction method, system and storage medium

Country Status (1)

Country Link
CN (1) CN117601855A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination