CN113771845A - Method, device, vehicle and storage medium for predicting vehicle track - Google Patents

Method, device, vehicle and storage medium for predicting vehicle track

Info

Publication number
CN113771845A
CN113771845A (application number CN202010515572.9A)
Authority
CN
China
Prior art keywords
vehicle
trajectory
target vehicle
coordinates
host
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010515572.9A
Other languages
Chinese (zh)
Inventor
唐帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Priority to CN202010515572.9A
Publication of CN113771845A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0956 Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/80 Spatial relation or speed relative to objects
    • B60W2556/00 Input parameters relating to data
    • B60W2556/45 External transmission of data to or from the vehicle
    • B60W2556/65 Data transmitted between vehicles

Abstract

The present disclosure provides a method of predicting a vehicle trajectory, an apparatus for predicting a vehicle trajectory, a vehicle, and a storage medium. The method comprises the following steps: acquiring a first trajectory of a target vehicle within a predetermined range from the host vehicle, wherein the first trajectory is predicted by the host vehicle based on a sensor signal associated with the target vehicle and sensed by a sensor of the host vehicle; receiving a second trajectory from the target vehicle via a vehicle-to-vehicle communication link, wherein the second trajectory is a planned trajectory planned by an autonomous driving system of the target vehicle; determining a first confidence associated with the first trajectory and a second confidence associated with the second trajectory; and fusing the first trajectory and the second trajectory using the first confidence and the second confidence to obtain a predicted trajectory of the target vehicle.

Description

Method, device, vehicle and storage medium for predicting vehicle track
Technical Field
The present disclosure relates generally to the field of driving assistance technology, and more particularly, to a method for predicting a vehicle trajectory, an apparatus for predicting a vehicle trajectory, a vehicle, and a storage medium.
Background
Autonomous driving techniques may involve a number of aspects, including environmental perception, behavioral decision-making, path planning, and motion control. Environmental perception is a precondition of autonomous driving; it is mainly used to acquire information about the surroundings, including road boundary detection, vehicle detection, pedestrian detection, and localization of obstacles (such as vehicles and pedestrians). Behavioral decision-making or prediction mainly makes judgments based on the information acquired by the perception system, such as predicting the next actions of other vehicles and pedestrians.
In a traffic scene, an autonomous vehicle needs to be able to predict the motion trends or trajectories of surrounding vehicles, so that dangerous situations can be avoided and a safe, comfortable driving experience ensured. However, due to the complexity of road traffic scenarios, accurately predicting the trends or trajectories of surrounding vehicles remains a challenging task for autonomous vehicles.
Disclosure of Invention
According to one aspect of the present disclosure, a method of predicting a vehicle trajectory is provided. The method comprises the following steps: acquiring a first trajectory of a target vehicle within a predetermined range from the host vehicle, wherein the first trajectory is predicted by the host vehicle based on a sensor signal associated with the target vehicle and sensed by a sensor of the host vehicle; receiving a second trajectory from the target vehicle via a vehicle-to-vehicle communication link, wherein the second trajectory is a planned trajectory planned by an autonomous driving system of the target vehicle; determining a first confidence associated with the first trajectory and a second confidence associated with the second trajectory; and fusing the first trajectory and the second trajectory using the first confidence and the second confidence to obtain a predicted trajectory of the target vehicle.
According to another aspect of the present disclosure, an apparatus for predicting a trajectory of a vehicle is provided. The device includes: a processor, and a memory storing a program. The program comprises instructions which, when executed by a processor, cause the processor to perform a method of predicting a trajectory of a vehicle according to an embodiment of the present disclosure.
According to another aspect of the present disclosure, a vehicle is provided. The vehicle comprises the device for predicting the track of the vehicle according to the embodiment of the disclosure.
According to another aspect of the present disclosure, a non-transitory computer-readable storage medium storing a program is provided. The program includes instructions that, when executed by one or more processors, cause the one or more processors to perform a method according to an embodiment of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the embodiments and, together with the description, serve to explain the exemplary implementations of the embodiments. The illustrated embodiments are for purposes of illustration only and do not limit the scope of the claims. Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
FIG. 1 is a schematic diagram illustrating vehicle trajectory prediction, according to some exemplary embodiments;
FIG. 2 is a flow chart illustrating a method of predicting vehicle trajectory according to some exemplary embodiments;
FIG. 3 is a block diagram illustrating an apparatus for predicting vehicle trajectories according to some exemplary embodiments;
FIG. 4 is a block diagram illustrating a vehicle according to some exemplary embodiments; and
FIG. 5 is a schematic diagram illustrating an application scenario for a vehicle, according to some exemplary embodiments.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, there may be one or more elements. Further, as used in this disclosure, the terms "and/or" and "at least one of" encompass any and all possible combinations of the listed items.
It is important for safe driving that an autonomous vehicle can accurately predict the trajectories of surrounding vehicles in time and take corresponding countermeasures accordingly. A typical method of predicting a vehicle trajectory performs the prediction based on sensor signals of the target vehicle sensed by sensors provided on the host vehicle itself. The target vehicle, that is, the vehicle whose trajectory is to be predicted, may be, for example, any vehicle within a preset distance range of the host vehicle that may affect the traveling of the host vehicle. In one example, some behavior of the target vehicle, such as the target vehicle turning on a turn signal, may be recognized through a vision camera, so that the possible driving intention and trajectory of the target vehicle can be predicted.
FIG. 1 shows a schematic diagram of an example 100 of vehicle trajectory prediction. As shown in fig. 1, the vision camera of the host vehicle 110 can take pictures of surrounding vehicles in real time. By analyzing the pictures, the host vehicle 110 can determine that the vehicle 120 (the target vehicle) ahead on the left has turned on the turn lamps 121 and 122 indicating a turn to the right, and thus that the vehicle 120 wants, or is about, to change lanes to the right. In conjunction with other perceived sensor signals of the vehicle 120 (e.g., speed, distance from the host vehicle 110, etc.), the host vehicle 110 may predict a turning trajectory for the vehicle 120, such as trajectory 130. This approach relies mainly on signals sensed by the sensors. However, limited by the sensing range or sensing accuracy of the sensors, the trajectory 130 of the vehicle 120 predicted in this way may differ from the actual travel trajectory of the vehicle 120 and may not be completely trustworthy.
Another method of predicting a vehicle trajectory is to receive (e.g., when the target vehicle is an autonomous vehicle) the planned trajectory produced by the target vehicle's autonomous driving system directly from the target vehicle, via a communication connection established with it, such as vehicle-to-vehicle (V2V) communication. For example, as shown in fig. 1, the host vehicle 110 may receive the planned trajectory 140 of the vehicle 120 directly from the vehicle 120. Due to possible delays in V2V communication, or an unstable communication link, the planned trajectory 140 received from the vehicle 120 may likewise differ from the actual travel trajectory of the vehicle 120 and may not be completely trustworthy.
The disclosed embodiments provide a method of predicting vehicle trajectories that combines sensor perception with V2V communication. A first trajectory of a target vehicle is acquired based on a sensor signal associated with the target vehicle as perceived by a sensor of the host vehicle. A second trajectory is received from the target vehicle via the V2V communication link. Then, a first confidence associated with the first trajectory and a second confidence associated with the second trajectory are determined. The first trajectory and the second trajectory are fused using the first confidence and the second confidence to obtain the predicted trajectory of the target vehicle. By fusing trajectories from different sources, the resulting predicted trajectory can more accurately reflect the upcoming trajectory of the target vehicle, improving driving safety.
FIG. 2 is a flow chart illustrating a method 200 of predicting a vehicle trajectory, according to some exemplary embodiments. The method 200 is described below in conjunction with the example 100 of fig. 1.
In step S201, a first trajectory 130 of the target vehicle 120 within a predetermined range from the host vehicle 110 is acquired.
In some embodiments, the first trajectory 130 is predicted by the host vehicle 110 based on sensor signals associated with the target vehicle 120 as perceived by sensors of the host vehicle 110. Limited by the range of its sensors, and by which other vehicles can actually affect its travel, the host vehicle 110 may only need to acquire the trajectories of vehicles within a predetermined range. The predetermined range may be, for example, 500 meters. In general, vehicles far away from the host vehicle 110 (for example, vehicles that are not in the same lane as the host vehicle 110 and are more than 500 meters away) have little or no influence on its traveling, so it is unnecessary to acquire and analyze information about them; this reduces the data processing load and improves processing efficiency.
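The range filtering described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the dictionary layout, field names, and the use of straight-line distance in the host vehicle's coordinate system are all assumptions.

```python
import math

def vehicles_in_range(vehicles, predetermined_range_m=500.0):
    """Keep only the vehicles within the predetermined range of the host.

    Each vehicle is assumed to carry an (x, y) position offset in meters,
    expressed in the host vehicle's coordinate system.
    """
    return [
        v for v in vehicles
        if math.hypot(v["x"], v["y"]) <= predetermined_range_m
    ]

surrounding = [
    {"id": "A", "x": 30.0, "y": 40.0},    # 50 m away: kept
    {"id": "B", "x": 400.0, "y": 500.0},  # about 640 m away: ignored
]
nearby = vehicles_in_range(surrounding)
```

Only vehicle "A" survives the filter; distant vehicle "B" is never analyzed, which is the data-reduction point the text makes.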
As used herein, the term "trajectory" may be understood or represented as a geometric model, such as a curve or a surface, in the coordinate system of a high-definition (HD) map or a highly-automated-driving (HAD) map. In one example, as described above, some behavior of the target vehicle 120, such as turning on turn signals or brake lights, may be recognized through the vision cameras, allowing a driving intention to be predicted, e.g., that the target vehicle 120 may want to change lanes, turn around, or slow down; the trajectory of the target vehicle 120 can then be predicted in conjunction with its other perceived sensor signals.
At step S202, the second trajectory 140 from the target vehicle 120 is received via the V2V communication link.
In some embodiments, the second trajectory 140 is a planned trajectory planned by an autonomous driving system of the target vehicle 120. In some examples, the target vehicle 120 may be an autonomous vehicle that acquires its own planned trajectory in real time and broadcasts some of its own information in real time. Alternatively, the target vehicle 120 may be a vehicle in an assisted driving mode. Driving assistance may include, for example, traffic jam assistance, highway driving assistance, automated parking assistance, lane keeping assistance, and the like. The information broadcast by the target vehicle 120 may include, for example, the planned trajectory of the target vehicle 120, as well as identification information of the target vehicle 120 as described below, and the like. The host vehicle 110 may receive the planned trajectory broadcast by the target vehicle 120 via the V2V communication link.
In some embodiments, there may be one or more vehicles within a predetermined range of the host vehicle 110, such as a front-left vehicle in an adjacent lane, a front-right vehicle in an adjacent lane, a vehicle directly ahead in the same lane, and so forth. Before determining the predicted trajectory of the target vehicle 120, the target vehicle 120 needs to be identified among the one or more vehicles surrounding the host vehicle 110. In some examples, the host vehicle 110 may identify the target vehicle 120 from vehicle identification information. Using one or more of its sensors, the host vehicle 110 may detect respective first identification information for one or more vehicles within its predetermined range. The host vehicle 110 may also receive respective second identification information broadcast by the one or more vehicles via the V2V communication link. By matching based on the similarity between the first identification information and the second identification information, the host vehicle 110 may identify the target vehicle 120 among the one or more vehicles.
In some examples, each of the first identification information and the second identification information may include at least one selected from the group consisting of: vehicle license plate number information, vehicle body color information, vehicle brand information, vehicle model information, and vehicle satellite navigation position information. For example, the host vehicle 110 may take pictures of one or more vehicles within the predetermined range via a camera and recognize the license plate number of each vehicle from the pictures. The host vehicle 110 may also receive the respective license plate number information broadcast by those vehicles via the V2V communication link. By comparing or matching license plate numbers, the host vehicle 110 may identify the target vehicle 120 among the one or more vehicles. In some examples, to identify the target vehicle 120 more reliably, the host vehicle 110 may also match on both the body color and the license plate number of the vehicle. The disclosed embodiments do not limit the manner in which the target vehicle 120 is matched, as long as the target vehicle 120 can be identified.
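The identification matching step can be sketched as below. The similarity measure (string ratio on the plate), the color-match boost, and the 0.8 threshold are all illustrative assumptions; the patent only requires that detected and broadcast identification information be matched by similarity.

```python
from difflib import SequenceMatcher

def plate_similarity(a, b):
    """Rough string similarity between two recognized plate numbers."""
    return SequenceMatcher(None, a, b).ratio()

def identify_target(detected, broadcasts, threshold=0.8):
    """Match one sensor-detected vehicle against V2V-broadcast identities."""
    best, best_score = None, 0.0
    for b in broadcasts:
        score = plate_similarity(detected["plate"], b["plate"])
        if detected.get("color") == b.get("color"):
            score = (score + 1.0) / 2.0  # boost when body color also matches
        if score > best_score:
            best, best_score = b, score
    return best if best_score >= threshold else None

detected = {"plate": "A12345", "color": "red"}
broadcasts = [
    {"plate": "B99999", "color": "blue", "vid": 1},
    {"plate": "A12345", "color": "red", "vid": 2},
]
match = identify_target(detected, broadcasts)
```

Here the second broadcast matches on both plate and color, so it is identified as the target; an OCR-degraded plate would still match as long as the combined score clears the threshold.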
At step S203, a first confidence level associated with the first trajectory 130 and a second confidence level associated with the second trajectory 140 are determined.
In some embodiments, each trajectory may include coordinates of the target vehicle 120, in the respective coordinate system, at the current time and at a plurality of future times after the current time. For example, for the current time t = t0, the plurality of future times may be t0 + 0.1 seconds, t0 + 0.2 seconds, …, t0 + 1 second, t0 + 2 seconds, or later. In some examples, the first trajectory 130 includes a first sequence of coordinates of the target vehicle 120 in a first coordinate system associated with the host vehicle 110. The first coordinate sequence includes a corresponding plurality of first coordinates of the target vehicle 120, in the first coordinate system, at a plurality of future times after the current time. The first coordinate system associated with the host vehicle 110 may include, for example, a local coordinate system associated with a sensor of the host vehicle 110 (e.g., a vision camera coordinate system, a lidar coordinate system, etc.), a body coordinate system of the host vehicle 110, and so forth. In one example, with the host vehicle 110 facing forward, the x-axis of its body coordinate system may point to the right of the vehicle, the y-axis may point in the direction of travel, the z-axis may point perpendicular to the ground toward the roof, and the origin may be the center of the rear axle. In other examples, the body coordinate system of the host vehicle 110 may be defined differently, for example with the x-axis as the traveling direction of the host vehicle 110. The disclosed embodiments are not limited in this respect.
In some examples, second trajectory 140 includes a second sequence of coordinates of target vehicle 120 in a second coordinate system associated with target vehicle 120. The second coordinate sequence includes: a corresponding plurality of second coordinates of the target vehicle 120 at a plurality of future times after the current time in the second coordinate system. The second coordinate system associated with target vehicle 120 may include, for example, a local coordinate system (e.g., a vision camera coordinate system, a lidar coordinate system, etc.) associated with sensors of target vehicle 120, a body coordinate system of target vehicle 120, and so forth. For example, the x-axis of the body coordinate system of target vehicle 120 may be the direction pointing to the right of target vehicle 120 when target vehicle 120 is facing forward, the y-axis may be the direction of forward travel of target vehicle 120, the z-axis may be the direction perpendicular to the ground pointing to the roof of target vehicle 120, and the origin may be the center of the rear axle of target vehicle 120. In other examples, the body coordinate system of the target vehicle 120 may also be defined in other forms, for example, the x-axis is the forward direction of the target vehicle 120. The disclosed embodiments are not limited in this respect.
In some examples, each trajectory is also provided with a respective confidence. For example, the first trajectory 130 may be represented as (x1(t), y1(t), k1), where x1(t) and y1(t) are the coordinates of the target vehicle 120 in the first coordinate system at time t, and k1 is the first confidence. The second trajectory 140 may be represented as (x2(t), y2(t), k2), where x2(t) and y2(t) are the coordinates of the target vehicle 120 in the second coordinate system at time t, and k2 is the second confidence.
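A trajectory in this representation, a timed coordinate sequence plus a confidence k, could be modeled as follows. The class and field names are hypothetical, introduced only to make the (x(t), y(t), k) representation concrete.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Trajectory:
    """A trajectory as described: timed (t, x, y) points plus a confidence."""
    points: List[Tuple[float, float, float]]  # (t, x, y) samples
    confidence: float                         # k, assumed to lie in [0, 1]

    def at(self, t: float) -> Optional[Tuple[float, float]]:
        """Return the (x, y) coordinate recorded for time t, if present."""
        for ti, x, y in self.points:
            if ti == t:
                return (x, y)
        return None

# e.g. a sensor-predicted first trajectory with confidence k1 = 0.4
first = Trajectory(points=[(0.0, 0.0, 0.0), (0.1, 0.1, 1.0)], confidence=0.4)
```

Both the sensor-predicted and the V2V-received trajectories can share this shape; only their coordinate systems and confidences differ.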
In some embodiments, the first confidence may be determined based on at least one of: the noise level of the sensor signal, the behavior of the target vehicle 120, and the accuracy of the perception and prediction algorithms used by the host vehicle 110 to predict the first trajectory 130. The first confidence may be inversely proportional to the noise level of the sensor signal; for example, the higher the noise level, the smaller the first confidence. Alternatively or additionally, the first confidence may be related to the behavior of the target vehicle 120; for example, if the host vehicle 110 detects uncertain movement of the target vehicle 120 (e.g., lateral drifting left and right within a lane), the first confidence may be set to a smaller value. Alternatively or additionally, the first confidence may be proportional to the accuracy of the perception and prediction algorithms used by the host vehicle 110 to predict the first trajectory 130; for example, the more accurate the algorithms, the larger the first confidence. In an example, the accuracy of the perception and prediction algorithms of the host vehicle 110 may be predetermined, e.g., a default value set when the autonomous driving system leaves the factory. The second confidence may be determined based on at least one of the trajectory planning accuracy of the target vehicle 120 and the noise level in the V2V communication link. The trajectory planning accuracy of the target vehicle 120 refers to the accuracy with which its autonomous driving system plans its trajectory; it may be predetermined, for example, as a default value set when the autonomous driving system leaves the factory. The second confidence may be proportional to the trajectory planning accuracy of the target vehicle 120.
For example, when the trajectory planning accuracy is high, the value of the second confidence may be high. Alternatively or additionally, the value of the second confidence level may be inversely proportional to the noise level in the V2V communication link. For example, when the noise level in the V2V communication link is large, the value of the second confidence may be small. In an example, the trajectory planning accuracy of the target vehicle 120 may be transmitted from the target vehicle 120 to the host-vehicle 110 via a V2V communication link.
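The confidence heuristics described above can be sketched as simple scoring functions. The particular formula here (a clamped product of factors, with an arbitrary halving for uncertain behavior) is an illustrative assumption; the text only constrains the direction of each dependency.

```python
def clamp01(v):
    """Keep a confidence within [0, 1]."""
    return max(0.0, min(1.0, v))

def first_confidence(sensor_noise, behavior_uncertain, algo_accuracy):
    """k1: falls with sensor noise, falls on erratic target behavior,
    rises with perception/prediction accuracy. All inputs in [0, 1]."""
    k1 = algo_accuracy * (1.0 - sensor_noise)
    if behavior_uncertain:  # e.g. lateral drifting within the lane
        k1 *= 0.5
    return clamp01(k1)

def second_confidence(planning_accuracy, link_noise):
    """k2: rises with the target's trajectory-planning accuracy,
    falls with noise in the V2V communication link."""
    return clamp01(planning_accuracy * (1.0 - link_noise))

k1 = first_confidence(sensor_noise=0.2, behavior_uncertain=False, algo_accuracy=0.9)
k2 = second_confidence(planning_accuracy=0.95, link_noise=0.1)
```

With moderately noisy sensors and a clean V2V link, the planned trajectory ends up weighted more heavily than the sensor prediction, matching the 40%/80% example given later for fig. 1.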
In step S204, the first trajectory 130 and the second trajectory 140 are fused using the first confidence and the second confidence to obtain a predicted trajectory of the target vehicle 120.
As described above, the first trajectory 130 includes a first sequence of coordinates of the target vehicle 120 in a first coordinate system associated with the host vehicle 110, and the second trajectory 140 includes a second sequence of coordinates of the target vehicle 120 in a second coordinate system associated with the target vehicle 120. In some examples, as previously described, the first coordinate system may include a local coordinate system associated with a sensor of the host vehicle 110, a body coordinate system of the host vehicle 110, etc., and the second coordinate system may include a local coordinate system associated with a sensor of the target vehicle 120, a body coordinate system of the target vehicle 120, etc. The first coordinate system may differ from the second coordinate system, in which case a coordinate conversion is required before subsequent processing or calculation. In this case, fusing the first trajectory 130 and the second trajectory 140 using the first confidence and the second confidence may include: converting the plurality of second coordinates in the second coordinate system into a plurality of third coordinates in the first coordinate system; fusing the plurality of first coordinates and the plurality of third coordinates in the first coordinate system using the first confidence and the second confidence to obtain a plurality of fourth coordinates; and fitting the plurality of fourth coordinates using a regression fitting method to obtain the predicted trajectory of the target vehicle 120.
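The three-step pipeline just listed (convert, fuse, fit) can be sketched end to end. Several choices here are assumptions: the conversion is a planar rigid transform, the fusion normalizes the two confidences so the weights sum to one, and the regression is a plain least-squares line standing in for whatever fitting method the system actually uses.

```python
import math

def to_first_frame(pt, yaw, tx, ty):
    """Rotate by yaw and translate an (x, y) point from the second
    coordinate system into the first (a planar rigid transform)."""
    x, y = pt
    return (x * math.cos(yaw) - y * math.sin(yaw) + tx,
            x * math.sin(yaw) + y * math.cos(yaw) + ty)

def fuse(first_pts, third_pts, k1, k2):
    """Per-time weighted combination of first and third coordinates,
    with the confidences normalized into weights (an assumption)."""
    w1, w2 = k1 / (k1 + k2), k2 / (k1 + k2)
    return [(w1 * x1 + w2 * x3, w1 * y1 + w2 * y3)
            for (x1, y1), (x3, y3) in zip(first_pts, third_pts)]

def fit_line(pts):
    """Least-squares fit y = a*x + b through the fused (fourth) coordinates."""
    n = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

first_pts = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]           # sensor-predicted
second_raw = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]          # in the second frame
third_pts = [to_first_frame(p, yaw=0.0, tx=0.0, ty=1.0) for p in second_raw]
fused = fuse(first_pts, third_pts, k1=0.4, k2=0.8)
a, b = fit_line(fused)
```

With the second trajectory offset by one meter laterally and twice the confidence of the first, the fitted line lands two-thirds of the way toward the V2V-planned trajectory.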
In some examples, the plurality of fourth coordinates may be fitted using any suitable regression fitting method to arrive at the predicted trajectory of the target vehicle 120. For example, the regression fitting method may include at least one selected from the group consisting of interpolation, smoothing, and least squares.
In some embodiments, after converting the coordinates of the target vehicle 120 from the second coordinate system into corresponding coordinates in the first coordinate system, fusing the plurality of first coordinates and the plurality of third coordinates using the first confidence and the second confidence includes: for each of the plurality of future times, calculating a weighted sum of the corresponding first coordinate and the corresponding third coordinate at that future time, with the first confidence used as the weighting coefficient for the first coordinate and the second confidence used as the weighting coefficient for the third coordinate.
For example, as shown in fig. 1, consider a coordinate 131 on the trajectory 130 (the first trajectory) at a future time (e.g., t = t0 + 1.2 seconds) and a coordinate 141 on the planned trajectory 140 (the second trajectory), where the confidence of the first trajectory 130 is 40% and the confidence of the second trajectory 140 is 80%. The fused coordinate, for example coordinate 150, can then be obtained by coordinate conversion and fusion processing. The predicted trajectory of the target vehicle 120 may be obtained by fitting the plurality of fused coordinates using a regression fitting method.
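A single fused point from this fig. 1 example can be worked through numerically. The coordinate values are invented for illustration, and normalizing the 40%/80% confidences into weights that sum to one is an assumption about how the weighted sum is taken.

```python
def fuse_point(p1, p3, k1, k2):
    """Fuse one first coordinate with one (already converted) third
    coordinate, weighting by normalized confidences."""
    w1, w2 = k1 / (k1 + k2), k2 / (k1 + k2)
    return (w1 * p1[0] + w2 * p3[0], w1 * p1[1] + w2 * p3[1])

coord_131 = (12.0, 3.0)   # from the first (sensor-predicted) trajectory
coord_141 = (12.6, 3.6)   # from the second (V2V planned) trajectory, converted
coord_150 = fuse_point(coord_131, coord_141, k1=0.4, k2=0.8)
```

With weights 1/3 and 2/3, the fused coordinate 150 sits twice as close to the planned point 141 as to the sensor-predicted point 131, reflecting the higher trust in the V2V trajectory.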
In some examples, after the predicted trajectory of the target vehicle 120 is obtained, a planned trajectory of the host vehicle 110 may also be determined based on it. For example, the planned trajectory of the host vehicle 110 may be determined from the predicted trajectory of the target vehicle 120 in combination with information such as the current traveling speed, acceleration, and position of the host vehicle 110 and the destination it has been instructed to reach. Following the planned trajectory keeps the host vehicle 110 on the desired path while adhering to traffic regulations and avoiding detected obstacles.
In some examples, after the planned trajectory is determined, the host vehicle 110 may also broadcast it via V2V communication. In other examples, after the planned trajectory is determined, the host vehicle 110 may further determine a collision, or a collision risk level, between the host vehicle 110 and the target vehicle 120 by combining the predicted trajectory of the target vehicle 120, the relative positional relationship between the host vehicle 110 and the target vehicle 120, and the like. For example, the collision or collision risk level may be determined under different scenarios (e.g., the vehicle ahead in the same lane decelerating, the vehicle ahead in the same lane accelerating or traveling at constant speed, a vehicle ahead in an adjacent lane cutting in, the vehicle ahead in the same lane changing lanes and exiting, etc.), and corresponding action taken according to the risk level.
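One common way to grade such collision risk, sketched here as an assumption rather than the patent's method, is a time-to-collision (TTC) estimate from the gap and closing speed implied by the two trajectories. The thresholds are illustrative.

```python
def risk_level(gap_m, closing_speed_mps):
    """Bucket collision risk into 'high', 'medium', or 'low' by TTC.

    gap_m: distance between host and target along the predicted paths.
    closing_speed_mps: positive when the gap is shrinking.
    """
    if closing_speed_mps <= 0.0:   # gap opening, e.g. lead car accelerating
        return "low"
    ttc = gap_m / closing_speed_mps
    if ttc < 2.0:                  # e.g. lead car in the same lane braking hard
        return "high"
    if ttc < 5.0:                  # e.g. adjacent-lane vehicle cutting in
        return "medium"
    return "low"

lead_braking = risk_level(gap_m=15.0, closing_speed_mps=10.0)  # TTC 1.5 s
cut_in = risk_level(gap_m=30.0, closing_speed_mps=8.0)         # TTC 3.75 s
```

Each of the scenarios listed in the text maps to a gap/closing-speed pair; the resulting level would then drive the countermeasure (warning, deceleration, evasive planning).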
According to another aspect of the disclosed embodiments, there is provided an apparatus for predicting a trajectory of a vehicle. FIG. 3 shows a block diagram of an apparatus 300 for predicting a vehicle trajectory, according to some exemplary embodiments. As shown in fig. 3, the apparatus 300 includes a processor 301 and a memory 302. The memory 302 is used to store a program comprising instructions that, when executed by the processor 301, cause the processor 301 to perform the method 200 described above and its various variants.
According to another aspect of the present disclosure, a vehicle is provided. FIG. 4 illustrates a block diagram of a vehicle 400, according to some exemplary embodiments. In addition to including some of the basic components of the vehicle, the vehicle 400 also includes the apparatus 300 described above with respect to fig. 3.
According to another aspect of the present disclosure, a non-transitory computer-readable storage medium storing a program is provided. The program includes instructions that, when executed by one or more processors, cause the one or more processors to perform the method 200 described above and its various variations.
Fig. 5 illustrates an example diagram of one application scenario of a vehicle 510, in which the vehicle 510 (the host vehicle) may communicate with various entities (e.g., another vehicle 520) via various communication links (e.g., V2V), according to some example embodiments. According to some embodiments, the vehicle 510 may be the vehicle 110 described above with respect to fig. 1 and the vehicle 400 described with respect to fig. 4.
The vehicle 510 may include sensors 511 for sensing its surroundings. The sensors 511 may include one or more of the following: ultrasonic sensors, millimeter-wave radar, lidar, vision cameras, and infrared cameras. Different sensors provide different detection accuracies and ranges. Ultrasonic sensors may be arranged around the vehicle and, exploiting the strong directionality of ultrasound, used to measure the distance between the vehicle and objects outside it. Millimeter-wave radar may be installed at the front, rear, or other positions of the vehicle and uses the characteristics of electromagnetic waves to measure the distance of external objects from the vehicle. Lidar may be mounted at the front, rear, or elsewhere on the vehicle to detect object edges and shape information, enabling object identification and tracking. Owing to the Doppler effect, radar devices can also measure changes in the speed of the vehicle and of moving objects. Cameras may be mounted at the front, rear, or elsewhere on the vehicle. Vision cameras can capture conditions inside and outside the vehicle in real time and present them to the driver and/or passengers; in addition, by analyzing the pictures they capture, information such as traffic light states, intersection conditions, and the running states of other vehicles can be acquired. Infrared cameras can capture objects under night-vision conditions.
The vehicle 510 may also include an output device 512. The output device 512 may include, for example, a display and a speaker to present various outputs or instructions. The display may be implemented as a touch screen, so that input can also be detected in different ways; a graphical user interface may be presented on the touch screen to let a user access and operate the corresponding controls.
The vehicle 510 may also include one or more controllers 513. The controller 513 may include a processor, such as a central processing unit (CPU), a graphics processing unit (GPU), or another special-purpose processor, in communication with various types of computer-readable storage devices or media (not shown). A computer-readable storage device or medium may be any non-transitory device that stores data, including but not limited to a magnetic disk drive, an optical storage device, solid-state memory, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, an optical disk or any other optical medium, read-only memory (ROM), random-access memory (RAM), cache memory, and/or any other memory chip or cartridge, and/or any other medium from which a computer can read data, instructions, and/or code. Some of the data in the computer-readable storage device or medium represents executable instructions used by the controller 513 to control the vehicle. The controller 513 may include an autonomous driving system for automatically controlling various actuators in the vehicle. The autonomous driving system is configured to control the powertrain, steering system, and braking system of the vehicle 510 via a plurality of actuators, in response to inputs from the sensors 511 or other input devices, so as to control acceleration, steering, and braking, respectively, without human intervention or with limited human intervention. Part of the processing functions of the controller 513 may be offloaded to cloud computing: some processing may be performed on an onboard processor while other processing is performed using computing resources in the cloud. According to some embodiments, the controller 513 and its associated computer-readable storage may be one example of the apparatus 300 of fig. 3 above.
The computer-readable storage associated with the controller 513 may be one example of the non-transitory computer-readable storage medium described above.
The vehicle 510 also includes a communication device 514. The communication device 514 includes a satellite positioning module capable of receiving satellite positioning signals from satellites 570 and generating coordinates based on these signals. The communication device 514 also includes modules for communicating with a mobile communication network 530, which may employ any suitable communication technology, such as current or evolving wireless technologies like GSM/GPRS, CDMA, LTE, or 5G. The communication device 514 may further include a Vehicle-to-Everything (V2X) module configured to enable communication between the vehicle and the outside world, for example Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication. In addition, the communication device 514 may include a module configured to communicate with a user terminal 540 (including but not limited to a smartphone, a tablet, or a wearable device such as a watch) via, for example, a wireless local area network using the IEEE 802.11 standards or Bluetooth. With the communication device 514, the vehicle 510 can access an online server 550 or a cloud server 560 via the wireless communication system; these servers are configured to provide the vehicle with corresponding data-processing, data-storage, and data-transmission services.
In addition, the vehicle 510 includes a powertrain, a steering system, a braking system, and the like, which are not shown in fig. 5, for implementing the driving functions of the motor vehicle.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the methods, systems, and apparatus described above are merely exemplary embodiments or examples, and that the scope of the present disclosure is not limited by these embodiments or examples but only by the claims as issued and their equivalents. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. Various elements in the embodiments or examples may be omitted or may be replaced with equivalents thereof. Further, the steps may be performed in an order different from that described in the present disclosure, and various elements in the embodiments or examples may be combined in various ways. It should be appreciated that, as technology evolves, many of the elements described herein may be replaced with equivalent elements that appear after the present disclosure.

Claims (12)

1. A method of predicting a vehicle trajectory, comprising:
acquiring a first track of a target vehicle within a predetermined range from a host vehicle, wherein the first track is predicted by the host vehicle based on a sensor signal associated with the target vehicle and perceived by a sensor of the host vehicle;
receiving a second trajectory of the target vehicle from the target vehicle via a vehicle-to-vehicle communication link, wherein the second trajectory is a planned trajectory planned by an autonomous driving system of the target vehicle;
determining a first confidence level associated with the first trajectory and a second confidence level associated with the second trajectory; and
fusing the first trajectory and the second trajectory using the first confidence level and the second confidence level to obtain a predicted trajectory of the target vehicle.
2. The method of claim 1, wherein determining a first confidence level associated with the first trajectory and a second confidence level associated with the second trajectory comprises:
determining the first confidence level based on at least one of a noise level of the sensor signal, a behavior of the target vehicle, and an accuracy of a perception and prediction algorithm used by the host vehicle to predict the first trajectory; and
determining the second confidence level based on at least one of a trajectory planning accuracy of the target vehicle and a noise level in the vehicle-to-vehicle communication link.
3. The method according to claim 1 or 2,
wherein the first trajectory comprises a first coordinate sequence of the target vehicle in a first coordinate system associated with the host vehicle, the first coordinate sequence comprising: a corresponding plurality of first coordinates of the target vehicle in the first coordinate system at a plurality of future times after the current time, and
wherein the second trajectory comprises a second coordinate sequence of the target vehicle in a second coordinate system associated with the target vehicle, the second coordinate sequence comprising: a corresponding plurality of second coordinates of the target vehicle in the second coordinate system at the plurality of future times subsequent to the current time.
4. The method of claim 3, wherein fusing the first trajectory and the second trajectory using the first confidence level and the second confidence level comprises:
converting the plurality of second coordinates into a plurality of third coordinates in the first coordinate system;
fusing the plurality of first coordinates and the plurality of third coordinates using the first confidence level and the second confidence level to obtain a plurality of fourth coordinates; and
fitting the plurality of fourth coordinates using a regression fitting method to obtain the predicted trajectory of the target vehicle.
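The coordinate conversion recited in claim 4 (expressing the V2V-received coordinates in the host vehicle's frame) can be sketched as a planar rigid-body transform. This is an illustrative sketch only, not the claimed implementation; the pose representation `(x, y, heading)` and the function name are assumptions.

```python
import math

def target_to_host_frame(points, target_pose_in_host):
    """Convert 2-D points from the target vehicle's coordinate system
    into the host vehicle's coordinate system by a rotation followed
    by a translation.

    target_pose_in_host: (x, y, heading) of the target-vehicle frame
    origin expressed in the host-vehicle frame, heading in radians.
    """
    tx, ty, yaw = target_pose_in_host
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    converted = []
    for x, y in points:
        # Rotate into the host frame's orientation, then translate.
        converted.append((tx + x * cos_y - y * sin_y,
                          ty + x * sin_y + y * cos_y))
    return converted
```

For example, a point one metre ahead of a target vehicle that sits at (1, 0) in the host frame and faces 90° left maps to roughly (1, 1) in the host frame.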
5. The method of claim 4, wherein fusing the plurality of first coordinates and the plurality of third coordinates using the first confidence level and the second confidence level comprises:
for each future time instance of the plurality of future time instances:
calculating a weighted sum of a corresponding first coordinate of the plurality of first coordinates at the future time instant and a corresponding third coordinate of the plurality of third coordinates at the future time instant,
wherein the first confidence level is used as the weighting coefficient of the corresponding first coordinate, and the second confidence level is used as the weighting coefficient of the corresponding third coordinate.
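The per-time-step weighted sum of claim 5 might be sketched as follows, assuming (as one plausible reading, not stated in the claims) that the two confidence levels are normalized so the weights sum to one:

```python
def fuse_trajectories(first_coords, third_coords, c1, c2):
    """Fuse two coordinate sequences in the same (host-vehicle)
    frame by a per-time-step weighted average, using the first and
    second confidence levels as weighting coefficients."""
    w1 = c1 / (c1 + c2)  # weight for the sensor-predicted coordinates
    w2 = c2 / (c1 + c2)  # weight for the V2V-reported coordinates
    return [(w1 * x1 + w2 * x2, w1 * y1 + w2 * y2)
            for (x1, y1), (x2, y2) in zip(first_coords, third_coords)]
```

With equal confidences the fused point is simply the midpoint of the two predictions; a higher second confidence pulls the fused trajectory toward the planned trajectory reported over V2V.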
6. The method of claim 4, wherein the regression fitting method comprises at least one selected from the group consisting of interpolation, smoothing, and least squares fitting.
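As one minimal example of the least-squares option in claim 6 (illustrative only; the patent does not specify the model order), the fused points could be fit with an ordinary least-squares line:

```python
def least_squares_line(points):
    """Fit y = a*x + b to a list of (x, y) points by ordinary
    least squares, returning the slope a and intercept b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    # Closed-form normal-equation solution for a straight line.
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

In practice a higher-order polynomial or spline would usually be fit over the fused coordinates to yield a smooth predicted trajectory.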
7. The method of claim 1 or 2, further comprising:
determining a planned trajectory of the host vehicle based on the predicted trajectory of the target vehicle; and
broadcasting the planned trajectory of the host vehicle.
8. The method of claim 1 or 2, further comprising:
detecting, with the sensor of the host vehicle, respective first identification information of one or more vehicles within the predetermined range from the host vehicle;
receiving respective second identification information from the one or more vehicles via the vehicle-to-vehicle communication link; and
matching the first identification information and the second identification information based on the similarity between the first identification information and the second identification information so that the host vehicle identifies the target vehicle from the one or more vehicles.
9. The method of claim 8, wherein each of the first identification information and the second identification information comprises at least one selected from the group consisting of:
vehicle license plate number, vehicle body color, vehicle make, vehicle model, and vehicle satellite navigation position.
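The similarity-based matching of claims 8 and 9 might, purely as an illustrative sketch, be realized by scoring each (detection, V2V report) pair over a subset of the listed fields. The field names, weights, and the 3 m position gate below are assumptions, not taken from the patent.

```python
def identity_similarity(det, msg):
    """Toy similarity score between sensor-detected identification
    info (det) and V2V-reported identification info (msg)."""
    score = 0.0
    if det.get("plate") and det["plate"] == msg.get("plate"):
        score += 0.5
    if det.get("color") == msg.get("color"):
        score += 0.2
    if det.get("model") == msg.get("model"):
        score += 0.1
    # Satellite-navigation positions within a few metres count as a match.
    dx = det["position"][0] - msg["position"][0]
    dy = det["position"][1] - msg["position"][1]
    if (dx * dx + dy * dy) ** 0.5 < 3.0:
        score += 0.2
    return score

def match_target(detections, v2v_reports, threshold=0.6):
    """Pair each V2V report with its best-scoring detection, so the
    host vehicle can identify which sensed vehicle sent which report."""
    matches = {}
    for vid, msg in v2v_reports.items():
        best = max(detections, key=lambda d: identity_similarity(d, msg))
        if identity_similarity(best, msg) >= threshold:
            matches[vid] = best
    return matches
```

A report is left unmatched when no detection clears the threshold, which guards against fusing a planned trajectory with the wrong sensed vehicle.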
10. An apparatus for predicting a trajectory of a vehicle, comprising:
a processor, and
a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1 to 9.
11. A vehicle comprising the apparatus of claim 10.
12. A non-transitory computer-readable storage medium storing a program, the program comprising instructions that when executed by one or more processors cause the one or more processors to perform the method of any one of claims 1-9.
CN202010515572.9A 2020-06-09 2020-06-09 Method, device, vehicle and storage medium for predicting vehicle track Pending CN113771845A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010515572.9A CN113771845A (en) 2020-06-09 2020-06-09 Method, device, vehicle and storage medium for predicting vehicle track


Publications (1)

Publication Number Publication Date
CN113771845A 2021-12-10

Family

ID=78834227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010515572.9A Pending CN113771845A (en) 2020-06-09 2020-06-09 Method, device, vehicle and storage medium for predicting vehicle track

Country Status (1)

Country Link
CN (1) CN113771845A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160167648A1 (en) * 2014-12-11 2016-06-16 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle interaction with external environment
US20160297436A1 (en) * 2015-04-09 2016-10-13 Hyundai Motor Company Apparatus and method for identifying surrounding vehicles
US20180143644A1 (en) * 2016-11-22 2018-05-24 Baidu Usa Llc Method and system to predict vehicle traffic behavior for autonomous vehicles to make driving decisions
CN109515436A (en) * 2018-12-07 2019-03-26 安徽江淮汽车集团股份有限公司 A kind of selection method of intelligent driving vehicle running path fusion
CN110164183A (en) * 2019-05-17 2019-08-23 武汉理工大学 A kind of safety assistant driving method for early warning considering his vehicle driving intention under the conditions of truck traffic
CN110364009A (en) * 2019-07-16 2019-10-22 华人运通(上海)自动驾驶科技有限公司 Traveling planing method, device, roadside device and storage medium based on roadside device
CN110497908A (en) * 2018-05-16 2019-11-26 通用汽车环球科技运作有限责任公司 Automated driving system and the control logic for using sensor fusion row intelligent vehicle control
CN110621955A (en) * 2017-03-17 2019-12-27 维宁尔美国公司 Grouping for efficient cooperative positioning computation
US20200089238A1 (en) * 2018-09-15 2020-03-19 Toyota Research Institute, Inc. Systems and methods for predicting the trajectory of a road agent external to a vehicle


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117011816A (en) * 2022-05-04 2023-11-07 动态Ad有限责任公司 Trace segment cleaning of trace objects
GB2618438A (en) * 2022-05-04 2023-11-08 Motional Ad Llc Track segment cleaning of tracked objects

Similar Documents

Publication Publication Date Title
US9922565B2 (en) Sensor fusion of camera and V2V data for vehicles
US20210122364A1 (en) Vehicle collision avoidance apparatus and method
US20180056998A1 (en) System and Method for Multi-Vehicle Path Planning Technical Field
US11959999B2 (en) Information processing device, information processing method, computer program, and mobile device
US11100675B2 (en) Information processing apparatus, information processing method, program, and moving body
CN111559383A (en) Method and system for determining Autonomous Vehicle (AV) motion based on vehicle and edge sensor data
JP2021099793A (en) Intelligent traffic control system and control method for the same
US11295477B1 (en) Deep learning-based camera calibration
US11708088B2 (en) Dynamically modifying collision avoidance response procedure in autonomous vehicles
JP2023126642A (en) Information processing device, information processing method, and information processing system
CN113692521A (en) Information processing apparatus, information processing method, and information processing program
CN112534297A (en) Information processing apparatus, information processing method, computer program, information processing system, and mobile apparatus
US20220253065A1 (en) Information processing apparatus, information processing method, and information processing program
CN112506177A (en) Behavior control apparatus and behavior control method for autonomous vehicle
CN113771845A (en) Method, device, vehicle and storage medium for predicting vehicle track
Gogineni Multi-sensor fusion and sensor calibration for autonomous vehicles
CN115840441A (en) Method for vehicle, system for vehicle and storage medium
JP2021068315A (en) Estimation method and estimation system of lane condition
JP2022028989A (en) Information processor, method for processing information, and program
EP4141482A1 (en) Systems and methods for validating camera calibration in real-time
US20240124029A1 (en) Selecting a vehicle action based on a combination of vehicle action intents
US11358598B2 (en) Methods and systems for performing outlet inference by an autonomous vehicle to determine feasible paths through an intersection
US20230382427A1 (en) Motion prediction in an autonomous vehicle using fused synthetic and camera images
US20220374734A1 (en) Multi-target tracking with dependent likelihood structures
US20240131984A1 (en) Turn signal assignment for complex maneuvers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination