GB2599939A - Method of updating the existance probability of a track in fusion based on sensor perceived areas - Google Patents

Method of updating the existance probability of a track in fusion based on sensor perceived areas

Info

Publication number
GB2599939A
GB2599939A GB2016383.8A GB202016383A GB2599939A GB 2599939 A GB2599939 A GB 2599939A GB 202016383 A GB202016383 A GB 202016383A GB 2599939 A GB2599939 A GB 2599939A
Authority
GB
United Kingdom
Prior art keywords
sensor
probability
tracks
existence
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2016383.8A
Other versions
GB202016383D0 (en)
Inventor
Burca Cristian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Continental Automotive Romania SRL
Original Assignee
Continental Automotive GmbH
Continental Automotive Romania SRL
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive GmbH, Continental Automotive Romania SRL filed Critical Continental Automotive GmbH
Priority to GB2016383.8A priority Critical patent/GB2599939A/en
Publication of GB202016383D0 publication Critical patent/GB202016383D0/en
Publication of GB2599939A publication Critical patent/GB2599939A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention refers to a method of updating the existence probability of a track, configured to be performed by means of a fusion system of an ego vehicle, an electronic control unit mounted on the ego vehicle and a multitude of sensors operatively coupled to the ego vehicle. On the basis of the sensor data a perceived area is computed for each sensor using environment data that is available to the fusion system then individual sensor tracks are generated based on the received sensor data and it is determined whether an association between the individual sensor tracks exists to provide associated and unassociated tracks. For each unassociated track the probability of the existence of a target vehicle is updated then it is determined if a track is visible or not by determining if it is inside the dynamic perceived area of at least one sensor and if so then the probability of existence of a target vehicle for the unassociated track is decreased by a first factor to obtain a probability of existence within the perceived area. If the unassociated track is outside the perceived area then either the probability of existence is left alone or it is reduced by a second factor lower than the first to provide a probability of existence outside the perceived area. The respective probabilities of existence inside or outside the perceived area are compared against a threshold and those unassociated tracks with a probability below the threshold are deleted while associated tracks and unassociated tracks with probabilities above the threshold are updated from which sensor confirmed tracks are then generated.

Description

Method of Updating the Existence Probability of a Track in Fusion Based on Sensor Perceived Areas

The invention relates to a method of updating the existence probability of a track in fusion based on sensor perceived areas.
Sensor fusion is a central topic in advanced driver assistance systems (ADAS) and autonomous driving. From object detection to free-space and road estimation, fusion of information is mandatory for a consistent and precise result. Dynamic and static objects, e.g. pedestrians, bicyclists, vehicles and trucks, are detected and tracked by object detection fusion algorithms, and the precision of detection is highly dependent on the quality of the sensors used. Algorithms that fuse sensor information are implemented so that they reach satisfactory results in certain scenarios; nevertheless, there are scenarios and use cases not yet covered, although they must be when considering autonomy at higher levels (L3, L4).
High-level fusion is the term used for object-level fusion. This approach is the lowest in complexity, but it is also sometimes hard to configure and prone to errors, as the sensors' internal algorithms and outputs are not always trustworthy. Mid- and low-level fusion uses mid- or low-level information as inputs (features, point clouds, or even pixels and peak points). Those algorithms are more complex to implement, which is why machine-learning approaches are often preferred for low-level fusion.
However, for all these fusion algorithms it is mandatory to collect as much information as possible that could contribute to a better fusion output. One contribution that could generate better outputs and, in some cases, avoid late detection is knowing the unknown: being aware of the space the sensors are, and are not, capable of perceiving. In current approaches this is limited to the theoretical field of view of the sensors, but the following description will show the importance of a detailed sensor perceived area inside fusion algorithms.
In the following description, object detection fusion algorithms are used as an example, but the underlying idea can be extrapolated to other fusion methods as well. In object detection fusion algorithms, the term "measurements" or "sensor tracks" is used to describe the objects (vehicles, trucks, pedestrians, bicyclists, others) that are detected and/or tracked by sensors, and "tracks" or "system tracks" is the term used for fusion objects, each track/object corresponding to a single hypothesized target.
Current approaches attach a probability of existence (PoE) to each track that the fusion algorithm creates. The value of the probability of existence for each track increases or decreases based on sensor inputs (if the track is identified frequently, by many sensors, the chance that the object is really there gets high). When the probability of existence of a track gets high enough, the track is confirmed and becomes part of the fusion output. Otherwise, when the probability of existence does not pass a defined threshold, fusion does not output the track, because the sensors often provide ghost objects that are just reflections or sensor false positives. Those should not be part of the output, as they may affect the driving function decisions (for example, triggering a full brake because of a false positive detection). There is also a lower threshold for the probability of existence that is used for track deletion. This means that if the track is no longer confirmed by any sensor measurements, the track will eventually be deleted ("killed") once its probability of existence has decreased far enough.
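As an illustration only (the patent does not give concrete numbers), the two-threshold PoE bookkeeping described above can be sketched as follows; the gain, decay and threshold values are invented placeholders:

```python
# Hypothetical sketch of the probability-of-existence (PoE) mechanism:
# confirmation raises PoE, missing confirmation lowers it, and two
# thresholds decide output and deletion. All numbers are illustrative.

CONFIRM_THRESHOLD = 0.8   # above this, the track is part of the fusion output
DELETE_THRESHOLD = 0.2    # below this, the track is deleted ("killed")

def update_poe(poe, confirmed_by_sensor, gain=0.15, decay=0.1):
    """Raise PoE when a sensor confirms the track, lower it otherwise."""
    if confirmed_by_sensor:
        return min(1.0, poe + gain)
    return max(0.0, poe - decay)

# A track repeatedly confirmed by sensors crosses the confirmation threshold:
poe = 0.5
for _ in range(3):
    poe = update_poe(poe, confirmed_by_sensor=True)
print(poe >= CONFIRM_THRESHOLD)  # True
```

A track that stops being confirmed is not deleted immediately; its PoE has to decay below the lower threshold first, which is exactly the delay the invention exploits for occluded tracks.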
An issue arises when a track is deleted after the sensors have stopped confirming it, not because the object is no longer there but because it is occluded or outside the area visible to the sensor(s). Deleting the track too early can lead to dangerous and unwanted scenarios, since if the track re-appears in the scene it takes some time until it is confirmed again as a new track. This late confirmation can have a big impact even on the safety of the autonomous vehicle.
Therefore, the technical problem to be solved in this context is to overcome the overall uncertainty related to occluded tracks caused by the probability-of-existence mechanism used by fusion algorithms. More specifically, the invention seeks answers to the following questions: how to identify occluded tracks in fusion algorithms; how to calculate a dynamic area perceived by the sensors; which occlusions to consider when calculating the dynamic perceived areas; and, finally, how to deal with occluded tracks in fusion algorithms.
Therefore, the goal of the invention is to solve the deficiencies of the mentioned prior art and to provide a method for computing and using a dynamic field of view of the sensors mounted on a vehicle, so that the fusion algorithms are aware of the unknown parts (non-visible areas) of this dynamic field of view and act according to this additional information.
This goal is achieved according to the invention by means of the technical characteristics mentioned in the independent claim, namely a method of updating the existence probability of a track in fusion based on sensor perceived areas.
Further advantageous embodiments are the subject matter of the dependent claims.
The subject-matter of the present invention is a method of updating the existence probability of a track, configured to be performed by means of a fusion system of an ego vehicle, an electronic control unit mounted on the ego vehicle and a multitude of sensors operatively coupled to the ego vehicle, the method comprising the following steps:
- receiving sensor data from the multitude of sensors;
- computing a dynamic perceived area for each sensor using environment data that is available to the fusion system;
- generating individual sensor tracks based on the received sensor data;
- determining whether an association between the individual sensor tracks exists and further providing associated and unassociated tracks;
- for each unassociated track, updating a probability of existence of a target vehicle;
- determining if a track is visible or not, i.e. determining if it is inside the dynamic perceived area of any sensor, as follows:
- if the respective unassociated track is inside the dynamic perceived area of at least one sensor, then decreasing the updated probability of existence by a first factor and obtaining a probability of existence inside the dynamic perceived area;
- if the respective unassociated track is outside the dynamic perceived area of every sensor, then either not intervening on the updated probability of existence or lowering the decrease of the updated probability of existence by applying a second factor lower than the first factor, obtaining a probability of existence outside the dynamic perceived area;
- comparing the respective probability of existence inside or outside the dynamic perceived area against a predefined threshold, and
- deleting the unassociated tracks with a respective probability of existence under the predefined threshold, or
- updating the associated tracks and the unassociated tracks with a respective probability of existence above the predefined threshold, and
- generating sensor confirmed tracks comprising updated associated tracks and updated unassociated tracks.
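The update of unassociated tracks described above can be sketched as follows. This is a hypothetical reading with made-up factor and threshold values; the visibility test itself (whether a track lies inside any sensor's dynamic perceived area) is assumed to be given:

```python
# Sketch of the perceived-area-aware PoE update for unassociated tracks.
# FIRST_FACTOR punishes tracks the sensors should have seen but did not;
# SECOND_FACTOR is milder, for tracks outside every perceived area
# (likely occluded). Values are illustrative, not from the patent.

FIRST_FACTOR = 0.2    # applied when the track is inside a perceived area
SECOND_FACTOR = 0.05  # lower factor, applied when it is outside every area
THRESHOLD = 0.3       # tracks falling below this are deleted

def update_unassociated(tracks, inside_any_perceived_area):
    """Return the surviving tracks after the perceived-area-aware PoE update.

    tracks: dict mapping track id -> probability of existence.
    inside_any_perceived_area: dict mapping track id -> bool.
    """
    survivors = {}
    for tid, poe in tracks.items():
        factor = FIRST_FACTOR if inside_any_perceived_area[tid] else SECOND_FACTOR
        poe = max(0.0, poe - factor)
        if poe >= THRESHOLD:  # below the threshold the track is deleted
            survivors[tid] = poe
    return survivors

tracks = {"visible_ghost": 0.4, "occluded": 0.4}
visible = {"visible_ghost": True, "occluded": False}
print(update_unassociated(tracks, visible))
# "visible_ghost" drops to 0.2 and is deleted; "occluded" keeps 0.35 and survives
```

The asymmetry between the two factors is the core of the method: a likely-occluded track decays slowly and so outlives a ghost detection that the sensors can see through but no longer confirm.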
The main advantages of adopting the method according to the invention consist in an efficient, timely use of track-related information (occluded tracks are no longer dismissed too quickly). In this way, a clear distinction is created between a track that is no longer detected by a sensor (although it should be) and a track that is hidden/occluded by other objects. Ambiguities caused by the difference between non-visible tracks and missing track detections around the vehicle are resolved.
Further special features and advantages of the present invention can be taken from the following description of advantageous embodiments by way of the accompanying drawings.
Figure 1 presents a schematic diagram of a track management unit, part of a multi-sensor data fusion system, according to the prior art; Figures 2 show several stages of computing a sensor perceived area, namely: Fig. 2a shows an object inside an ideal field of view of a sensor, namely an area perceived by the sensor as unobstructed by any occlusion; Figs. 2b, 2c illustrate the subtraction of non-visible areas behind the object detected within the ideal field of view; Fig. 2d shows an example of the dynamic field of view of a sensor detecting two different objects; Fig. 2e shows an illustration of a sensor field of view affected by object occlusions; Figures 3a-3c present a slope scenario; Figures 4a-4c present a blocked-view scenario; Fig. 5 is a schematic representation of a method of updating the existence probability of a track, according to the invention.
Referring now to Fig. 1, there is shown a diagram of a track management unit, part of a classic fusion system from the prior art. Based on raw sensor data provided by a multitude of sensors (e.g., radar, camera, acoustic, lidar sensors) and processed by a multi-sensorial object detection module, a detection list is supplied as input to a track management unit. The track management unit performs a series of operations such as track association, prediction, updating and deleting. The output consists of a list of confirmed tracks.
Figures 2 show several stages of computing a dynamic field of view of a sensor, also called a sensor perceived area, starting from an ideal (i.e. theoretical) field of view (Fig. 2a) within which an object is detected (Figs. 2b, 2c), and where non-visible areas occluded by a detected object are no longer perceived by the sensor and, consequently, are subtracted from the ideal field of view when computing the dynamic field of view. Fig. 2d presents a dynamic field of view for two detected objects, one of which may be static and the other mobile. Moreover, Fig. 2e illustrates how occlusions affect a sensor field of view.
Further on, it needs to be mentioned that this dynamic representation of a sensor field of view is used to deal with occluded tracks in an object detection fusion algorithm. Further explanation is given below about this dynamic representation of a sensor field of view (or sensor perceived areas) and the way these areas are integrated by fusion systems. Autonomous vehicles usually have a sensor configuration that is supposed to cover as much as possible of the area surrounding the ego vehicle. Object fusion algorithms collect information from every sensor of this configuration and output a set of static and dynamic objects (sensor tracks) that are present around the autonomous vehicle. Using these lists of objects - static and dynamic - the fusion algorithm infers the area that is detectable around the vehicle. Starting with a theoretical area (i.e., the ideal field of view), using a polygon representation for example, the algorithm subtracts the non-visible area generated by an object occlusion, as can be seen in Figs. 2b, 2c (the non-visible areas are created by projecting the edges of objects towards the horizon from the sensor position).
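The edge-projection construction just described can be sketched as follows. This is a minimal illustration with invented names: the two silhouette points of a detected object are projected away from the sensor out to its maximum range, forming a "shadow" quadrilateral; a real implementation would then subtract this polygon from the ideal field of view using a polygon library.

```python
import math

def shadow_polygon(sensor_xy, edge_a, edge_b, max_range):
    """Quadrilateral hidden behind the segment (edge_a, edge_b), seen from sensor_xy."""
    def project(p):
        # Push the silhouette point away from the sensor, out to the maximum range.
        dx, dy = p[0] - sensor_xy[0], p[1] - sensor_xy[1]
        s = max_range / math.hypot(dx, dy)
        return (sensor_xy[0] + dx * s, sensor_xy[1] + dy * s)
    # Near edge of the object, then its two projections on the range limit.
    return [edge_a, edge_b, project(edge_b), project(edge_a)]

# Object edge 10 m ahead of a sensor at the origin, sensor range 100 m:
shadow = shadow_polygon((0.0, 0.0), (-1.0, 10.0), (1.0, 10.0), 100.0)
```

Subtracting one such shadow per detected object, per sensor, yields the dynamic perceived area the description refers to.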
Along with object occlusions, road geometry influences may be considered. For example, the slope of a road may be read from high-definition maps to determine if visibility is affected by it. Figures 3a, 3b, 3c illustrate a slope scenario with an ego vehicle travelling on a sloped road, the ego vehicle being equipped with a fusion system comprising ADAS sensors and an electronic control unit including at least one processor able to process data received from the sensors and fuse them by means of a fusion algorithm. At a first moment T1 (see Fig. 3a), an oncoming vehicle is detected by the sensors of the ego vehicle and tracked as a target (all front sensors confirm its presence and the probability of existence exceeds a predefined threshold, e.g., 80%). As the ego vehicle advances, at a successive moment T2 (see Fig. 3b), the ego vehicle is no longer able to detect the target due to the slope; the probability of existence of the target decreases and the associated track would be abandoned (since the probability of existence drops under the predefined threshold). Furthermore, at another successive moment T3, the target re-appears in the field of view of the ego vehicle sensors and the front sensors start detecting the target again (see Fig. 3c). Until the two vehicles pass each other, the probability of existence does not have enough time to increase, therefore the target-associated track is confirmed very late or not at all.
Weather conditions and road infrastructure must also be considered here. Using sensor manufacturer information or data-driven approaches, the fusion algorithm may infer whether the field of view of a sensor is affected by external conditions (weather or infrastructure, for example tunnels). Those factors might reduce the range or azimuth of the sensor field of view, and this information should be used when calculating the dynamic field of view of a sensor.
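As a purely illustrative sketch of how such external-condition information might enter the computation, one could scale the theoretical sensor limits before building the dynamic field of view; the function name and scale factors below are made-up placeholders, not manufacturer data:

```python
def degraded_limits(range_m, azimuth_deg, weather_scale=1.0, infra_scale=1.0):
    """Shrink the theoretical sensor limits by externally derived factors.

    Applying one common scale to both range and azimuth is a crude
    simplification; a real system would model each limit separately.
    """
    s = weather_scale * infra_scale
    return range_m * s, azimuth_deg * s

# Heavy rain reducing a radar's usable limits (the 0.6 factor is invented):
rng, az = degraded_limits(150.0, 90.0, weather_scale=0.6)
```

The reduced range and opening angle would then feed the ideal-field-of-view construction in place of the nominal sensor specification.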
Related to these aspects, Figures 4a-4c show a blocked-view scenario handled by a classic fusion algorithm, with a similar representation of successive moments of time when an ego vehicle and another oncoming vehicle arrive at a junction from perpendicular directions. There are objects (buildings or other infrastructure) placed on one side of the road that obstruct, at some point, the field of view of the ego vehicle sensors. At a first moment T1 (according to Fig. 4a), the field of view of the ego vehicle sensors is not obstructed, so the oncoming vehicle is detected and tracked as a target; the front and side sensors of the ego vehicle confirm its presence and the probability of existence of the target exceeds the predefined threshold. As both the ego vehicle and the target advance, at a second moment T2 (as seen in Fig. 4b), the ego vehicle cannot detect the target anymore due to the occluding objects interposed between them; the probability of existence of the target decreases in this period and the track is abandoned right after the probability of existence drops under the predefined threshold. From the moment the target re-appears in the field of view of the ego vehicle sensors (moment T3, as seen in Fig. 4c) and the front or side sensors of the ego vehicle start detecting the target again, until the target passes, its probability of existence does not have enough time to increase back; therefore the track is confirmed very late - which is neither desired nor efficient.
Fig. 5 shows a schematic diagram of the method of updating the existence probability of a track, according to the invention.
The inventive method has three major steps:
S1: Computing an ideal field of view for each available sensor based on the sensor properties (sensor pose, range and angle) by creating a polygon of points that form the respective sensor's area of visibility;
S2: Computing a dynamic field of view for each available sensor, by analyzing environment information (e.g. road geometry limitations, static obstacles, traffic participants, weather conditions, infrastructure and so on), detecting occlusions and creating "shadow" polygons for each occlusion with respect to each sensor (as two sensors can see the same occlusion differently because of the position and angle at which they are mounted). These "shadow" polygons are areas behind occlusions that are no longer visible to the sensor; therefore, in order to compute the dynamic field of view, such non-visible areas are subtracted from the respective ideal field of view of each sensor. Basically, the dynamic field of view is an area dynamically perceived by the sensor, and as such it is called in the following description a dynamic perceived area;
S3: Including the computed dynamic perceived area as information in a track management unit of a fusion system by performing the following sub-steps:
a. when updating sensor tracks with new sensor measurements, performing an additional check for all unassociated sensor tracks, in order to determine whether a sensor track is visible or not - in other words, whether it is inside the dynamic perceived area or not - and updating the probability of existence of each sensor track;
b. if the sensor track is inside the dynamic perceived area, then decreasing the updated probability of existence by a first factor;
c. if the sensor track is outside the dynamic perceived area, then not intervening on the updated probability of existence or lowering the decrease by applying a second factor lower than the first factor.
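Step S1 can be sketched as a polygon fan built from the sensor pose, opening angle and range. The function name, sampling resolution and example mounting values are assumptions for illustration:

```python
import math

def ideal_fov_polygon(x, y, yaw, fov_deg, rng, n=16):
    """Polygon (list of (x, y)) approximating the sensor's theoretical visibility.

    The arc of the field of view is sampled at n+1 points, so the polygon
    consists of the apex (the sensor mount position) plus the arc samples.
    """
    half = math.radians(fov_deg) / 2.0
    pts = [(x, y)]  # apex at the sensor mount position
    for i in range(n + 1):
        a = yaw - half + (2.0 * half) * i / n
        pts.append((x + rng * math.cos(a), y + rng * math.sin(a)))
    return pts

# A hypothetical forward-facing sensor mounted 3.7 m ahead of the rear axle,
# with a 90-degree opening and 150 m range:
front_fov = ideal_fov_polygon(3.7, 0.0, 0.0, 90.0, 150.0)
```

Step S2 then subtracts the shadow polygons from this fan, and step S3 feeds the result into the track management unit as the dynamic perceived area.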
In this case the deletion of the sensor track is delayed, while the associated confidence ellipse grows accordingly (the prediction should be made using the same motion model used before).
The updating method according to the invention may be applied in any fusion algorithm, not only object detection. Road model fusion may apply it, and free-space estimation could also benefit from this method.
However, while certain embodiments of the present invention have been described in detail, those familiar with the art to which this invention relates will recognize various alternative designs and embodiments for practicing the invention as defined by the following claims.

Claims (4)

1. A method of updating the existence probability of a track, configured to be performed by means of a fusion system of an ego vehicle, an electronic control unit mounted on the ego vehicle and a multitude of sensors operatively coupled to the ego vehicle, the method comprising the following steps: - receiving sensor data from the multitude of sensors; - computing a dynamic perceived area for each sensor using environment data that is available to the fusion system; - generating individual sensor tracks based on the received sensor data; - determining whether an association between the individual sensor tracks exists and further providing associated and unassociated tracks; - for each unassociated track, updating a probability of existence of a target vehicle; - determining if a track is visible or not, meaning determining if it is inside the dynamic perceived area of any sensor, as follows: - if the respective unassociated track is inside the dynamic perceived area of at least one sensor, then decreasing the updated probability of existence by a first factor and obtaining a probability of existence inside the dynamic perceived area; - if the respective unassociated track is outside the dynamic perceived area of every sensor, then not intervening on the updated probability of existence or lowering the decrease of the updated probability of existence by applying a second factor lower than the first factor and obtaining a probability of existence outside the dynamic perceived area; - comparing the respective probability of existence inside or outside the dynamic perceived area against a predefined threshold and - deleting the unassociated tracks with a respective probability of existence under the predefined threshold or - updating the associated tracks and the unassociated tracks with a respective probability of existence above the predefined threshold, and - generating sensor confirmed tracks comprising updated associated tracks and updated unassociated tracks.
2. Method according to claim 1, wherein the computing of the dynamic perceived area for each sensor comprises: - computing a visibility area for each available sensor based on sensor properties, namely sensor pose, range and angle; - analysing environment information for each available sensor and detecting occlusions; - computing non-visible areas that are not visible for each sensor with respect to each occlusion; and - computing dynamic perceived areas by subtracting the respective non-visible area from the respective visibility area for each sensor.
3. Method according to claim 1, wherein environment information means road geometry limitations, static obstacles, traffic participants, weather conditions and infrastructure.
4. Fusion system for tracking a target vehicle, comprising: an electronic control unit mounted in a vehicle equipped with a multitude of sensors, the electronic control unit including a memory configured to store program instructions and at least one processor configured to execute the stored program instructions, wherein, upon executing the stored program instructions, the at least one processor is configured to operate as: - a multi-sensor tracking unit which receives sensor data from the multitude of sensors, computes a dynamic perceived area for each sensor and generates individual sensor tracks; and - a track management unit which manages the individual sensor tracks and outputs confirmed tracks by employing the method of updating the existence probability of a target vehicle according to claim 1.
GB2016383.8A 2020-10-15 2020-10-15 Method of updating the existance probability of a track in fusion based on sensor perceived areas Pending GB2599939A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2016383.8A GB2599939A (en) 2020-10-15 2020-10-15 Method of updating the existance probability of a track in fusion based on sensor perceived areas

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2016383.8A GB2599939A (en) 2020-10-15 2020-10-15 Method of updating the existance probability of a track in fusion based on sensor perceived areas

Publications (2)

Publication Number Publication Date
GB202016383D0 GB202016383D0 (en) 2020-12-02
GB2599939A true GB2599939A (en) 2022-04-20

Family

ID=73598566

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2016383.8A Pending GB2599939A (en) 2020-10-15 2020-10-15 Method of updating the existance probability of a track in fusion based on sensor perceived areas

Country Status (1)

Country Link
GB (1) GB2599939A (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112151B (en) * 2021-04-12 2023-12-08 河南鑫安利安全科技股份有限公司 Intelligent wind control evaluation method and system based on multidimensional sensing and enterprise data quantification
CN114136328B (en) * 2021-11-25 2024-03-12 北京经纬恒润科技股份有限公司 Sensor information fusion method and device
CN114463984B (en) * 2022-03-02 2024-02-27 智道网联科技(北京)有限公司 Vehicle track display method and related equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MICHAEL AEBERHARD ET AL: "Object existence probability fusion using dempster-shafer theory in a high-level sensor data fusion architecture", INTELLIGENT VEHICLES SYMPOSIUM (IV), 2011 IEEE, IEEE, 5 June 2011 (2011-06-05), pages 770 - 775, XP031998940, ISBN: 978-1-4577-0890-9, DOI: 10.1109/IVS.2011.5940430 *
MIRKO MAEHLISCH ET AL: "De-cluttering with Integrated Probabilistic Data Association for Multisensor Multitarget ACC Vehicle Tracking", INTELLIGENT VEHICLES SYMPOSIUM, 2007 IEEE, IEEE, PI, 1 June 2007 (2007-06-01), pages 178 - 183, XP031126941, ISBN: 978-1-4244-1067-5 *

Also Published As

Publication number Publication date
GB202016383D0 (en) 2020-12-02

Similar Documents

Publication Publication Date Title
US11353553B2 (en) Multisensor data fusion method and apparatus to obtain static and dynamic environment features
GB2599939A (en) Method of updating the existance probability of a track in fusion based on sensor perceived areas
EP3208635B1 (en) Vision algorithm performance using low level sensor fusion
CN110386065B (en) Vehicle blind area monitoring method and device, computer equipment and storage medium
CN111932901B (en) Road vehicle tracking detection apparatus, method and storage medium
CN110264495B (en) Target tracking method and device
US8452528B2 (en) Visual recognition area estimation device and driving support device
CN113370911B (en) Pose adjustment method, device, equipment and medium of vehicle-mounted sensor
CN109448439B (en) Vehicle safe driving method and device
US20220373353A1 (en) Map Updating Method and Apparatus, and Device
WO2019202628A1 (en) Road surface detector, image display device using road surface detector, obstacle detector using road surface detector, image display method using road surface detection method, and obstacle detection method using road surface detection method
JP6557923B2 (en) On-vehicle radar device and area detection method
CN105654031A (en) Systems and methods for object detection
CN113281760B (en) Obstacle detection method, obstacle detection device, electronic device, vehicle and storage medium
KR20230032632A (en) Vehicle and controlling method of vehicle
WO2022242465A1 (en) Method and apparatus for fusing data of multiple sensors
CN115469312A (en) Method and device for detecting passable area of vehicle, electronic device and storage medium
CN106405539B (en) Vehicle radar system and method for removing a non-interesting object
CN115223131A (en) Adaptive cruise following target vehicle detection method and device and automobile
CN114084129A (en) Fusion-based vehicle automatic driving control method and system
CN117416349A (en) Automatic driving risk pre-judging system and method based on improved YOLOV7-Tiny and SS-LSTM in V2X environment
CN116612638A (en) Traffic collision accident detection method, device and readable medium
CN116563801A (en) Traffic accident detection method, device, electronic equipment and medium
CN115359332A (en) Data fusion method and device based on vehicle-road cooperation, electronic equipment and system
CN114359346A (en) Point cloud data processing method and device, nonvolatile storage medium and processor