GB2594079A - A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system - Google Patents

A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system

Info

Publication number
GB2594079A
Authority
GB
United Kingdom
Prior art keywords
object tracking
assistance system
tracking
motor vehicle
sensor fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2005547.1A
Other versions
GB202005547D0 (en
Inventor
Pilania Vinay
Welz Tobias
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to GB2005547.1A priority Critical patent/GB2594079A/en
Publication of GB202005547D0 publication Critical patent/GB202005547D0/en
Publication of GB2594079A publication Critical patent/GB2594079A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/251Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/20Road profile
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/35Data fusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Abstract

At least one object in the surroundings of an at least partially autonomous motor vehicle 10 is tracked by an assistance system 12 of the vehicle 10. The object is detected by at least two detection devices 16, 18, 20 of the assistance system, and the data of the two detection devices are fused together as sensor fusion data 24. The object is tracked in a Frenet frame depending on the sensor fusion data by an electronic computing device 22. The sensor fusion data are transferred to a supplementary partial multi-object tracking framework 26 of the electronic computing device, which considers road topography information of the road where the object is located; the object tracking is performed depending on the sensor fusion data and the road topography information. Motion models for the objects may be used for applying a one-step prediction.

Description

A METHOD FOR TRACKING AT LEAST ONE OBJECT IN THE SURROUNDINGS OF AN AUTONOMOUS MOTOR VEHICLE WITH A FRENET FRAME, AS WELL AS AN ASSISTANCE SYSTEM
FIELD OF THE INVENTION
[0001] The present disclosure relates to the field of automobiles. More specifically the present disclosure relates to a method for tracking at least one object in the surroundings of an at least partially autonomous motor vehicle.
BACKGROUND INFORMATION
[0002] In general, the state-of-the-art approach is to consider object tracking at the sensor fusion stage. The key idea is to work with the raw sensor information, in particular with the raw sensor data from the sensors, then detect the relevant objects in the raw data, fuse the raw data from the different sensors if multiple sensors are involved, and finally track the outcome of the sensor fusion, for example the position, the velocity and the shape of each object. This stage may be called low-level object tracking.
[0003] For instance, DE 10 2015 221 626 A1 discloses a procedure for determining a trajectory for guiding a vehicle. The method comprises determining a guidance reference curve along which the vehicle is to be guided. The method further comprises the determination of a linear station reference curve such that the linear station reference curve meets a higher continuity requirement than the guidance reference curve. The method further comprises determining initial values for a plurality of state variables of the vehicle relative to the linear station reference curve. Furthermore, the method comprises determining a trajectory based on the initial values, on the reference curve and on a model of the dynamics of the vehicle. Determining the trajectory comprises determining a quality function that takes into account a deviation of the trajectory from the guidance reference curve.
[0004] However, this state of the art relates to a method and a corresponding device for determining a trajectory for the control and/or regulation of the transverse and/or longitudinal guidance of a vehicle during a driving maneuver; no refined tracking of an object in the surroundings is possible. Therefore, there is a need in the art for an effective method of validating and enhancing the outcome of low-level object tracking.
SUMMARY OF THE INVENTION
[0005] It is an object of the invention to provide a method and an assistance system by which a refined tracking of the at least one object in the surroundings of a motor vehicle can be realized.
[0006] This object is solved by a method and an assistance system according to the independent claims. Advantages and embodiments are disclosed in the dependent claims.
[0007] One aspect of the invention relates to a method for tracking at least one object in the surrounding of an at least partially autonomous motor vehicle by an assistance system of the motor vehicle, where the object is detected by at least two detection devices of the motor vehicle and data of the two detection devices are fused together to create sensor fusion data within the assistance system. The sensor fusion data may be further utilized to track an object in a Frenet frame by an electronic computing device.
[0008] In an embodiment, the sensor fusion data are transferred to a supplementary partial multi-object tracking framework of the electronic computing device, wherein the supplementary partial multi-object tracking framework considers road topography information of the road where the object is located, and object tracking is performed depending on the sensor fusion data and the road topography information.
[0009] Improved object tracking over prior solutions can be achieved using the aforementioned methodology. The method allows the computing system to estimate a refined object position, velocity or shape, as well as other quantities related to the concept of this disclosure.
[0010] In particular, the sensor fusion of the at least two detection devices is computed by the electronic computing device. Although not limited to the following definition, sensor fusion is the combination of sensor data or data derived from disparate sources, in particular from the at least two detection devices, such that the resulting information has less uncertainty than would be possible if these sources were used individually.
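The uncertainty reduction can be made concrete with a minimal sketch (an illustration under assumed independent scalar Gaussian measurements, not part of the disclosure): fusing two position estimates by inverse-variance weighting always yields a variance smaller than either input.

```python
def fuse_measurements(x1: float, var1: float, x2: float, var2: float):
    """Inverse-variance weighted fusion of two independent measurements of the
    same quantity, e.g. an object position seen by two detection devices."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused_x = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # always smaller than min(var1, var2)
    return fused_x, fused_var

# Hypothetical radar and camera estimates of the same longitudinal position:
print(fuse_measurements(12.4, 0.25, 12.9, 0.50))  # -> (12.566..., 0.1666...)
```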
[0011] In other words, this disclosure presents a Frenet frame based partial multi-object tracking approach for the at least partially autonomous motor vehicle, in particular for autonomous driving of the motor vehicle. For the purpose of this disclosure, this is defined as high-level object tracking, wherein the outcome of a low-level object tracking, which can be the output of the sensor fusion, is the input. The high-level partial object tracking approach refines the estimated position, velocity or shape from the low-level tracking by considering the road topography as additional information, which may not be available at the low-level object tracking or whose use may be discouraged at the low level due to functional safety related issues. The reason for calling it a partial object tracking is that the measurement update step of the estimation algorithm is not considered; for example, no Kalman filter or particle filter is applied.
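For readers unfamiliar with the Frenet frame, the following sketch projects a Cartesian point onto a polyline lane centerline to obtain the Frenet coordinates (s, d), i.e. arc length along the reference line and signed lateral offset. The polyline representation and all names are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def cartesian_to_frenet(x: float, y: float, ref_xy) -> tuple:
    """Project the Cartesian point (x, y) onto a polyline reference line
    (lane centerline) and return the Frenet coordinates (s, d)."""
    ref = np.asarray(ref_xy, dtype=float)      # (N, 2) centerline vertices
    seg = np.diff(ref, axis=0)                 # segment vectors
    seg_len = np.linalg.norm(seg, axis=1)
    cum_s = np.concatenate(([0.0], np.cumsum(seg_len)))

    p = np.array([x, y])
    rel = p - ref[:-1]                         # point relative to segment starts
    t = np.clip(np.einsum('ij,ij->i', rel, seg) / seg_len**2, 0.0, 1.0)
    proj = ref[:-1] + t[:, None] * seg         # closest point on each segment
    i = int(np.argmin(np.linalg.norm(p - proj, axis=1)))

    s = float(cum_s[i] + t[i] * seg_len[i])    # longitudinal coordinate
    normal = np.array([-seg[i, 1], seg[i, 0]]) / seg_len[i]
    d = float(np.dot(p - proj[i], normal))     # signed lateral offset
    return s, d

# A point 1 m left of a straight 10 m centerline, 3 m along it:
print(cartesian_to_frenet(3.0, 1.0, [(0.0, 0.0), (10.0, 0.0)]))  # (3.0, 1.0)
```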
[0012] The high-level partial object tracking is advantageous for the following reasons. Firstly, the method allows the computing system to derive robust information about the moving obstacles surrounding the partially autonomous motor vehicle during occlusion handling. Secondly, from a functional safety point of view, if the low-level object tracking pipeline breaks during the operation of the autonomous motor vehicle, the high-level partial object tracking acts as a memory, storing/tracking the information for a time period long enough for the at least partially autonomous motor vehicle to make an emergency stop, for example. Lastly, high-level partial object tracking may be used to overcome short-term sensor/fusion failures or occlusion.
[0013] In an advantageous form of the embodiment, a measurement update step of an estimation algorithm for the object tracking is unconsidered by the supplementary partial multi-object tracking framework.
[0014] In an embodiment, appropriate motion models for the at least one object may be used for applying a one-step prediction of the supplementary partial multi-object tracking framework.
[0015] In an embodiment, after the one-step prediction is performed, the Frenet states of the object are associated with new detections of the detection devices.
[0016] In a preferred embodiment, the association for a Frenet state is limited to at least one distance threshold for a data association by the supplementary partial multi-object tracking framework. The distances for the threshold may be computed based on the road topology and/or reachable space.
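A minimal sketch of these embodiments taken together (a Frenet state, a one-step prediction, and distance-threshold gating) follows. The constant-velocity model and the threshold values are illustrative assumptions; per the disclosure, thresholds may instead be computed from road topology and/or reachable space.

```python
from dataclasses import dataclass

@dataclass
class FrenetState:
    s: float      # arc length along the lane centerline [m]
    d: float      # lateral offset from the centerline [m]
    s_dot: float  # longitudinal velocity [m/s]
    d_dot: float  # lateral velocity [m/s]

def predict_one_step(st: FrenetState, dt: float) -> FrenetState:
    """One-step prediction with an assumed constant-velocity motion model."""
    return FrenetState(st.s + st.s_dot * dt, st.d + st.d_dot * dt,
                       st.s_dot, st.d_dot)

def within_gate(track: FrenetState, detection: FrenetState,
                s_thresh: float = 5.0, d_thresh: float = 1.5) -> bool:
    """Distance-threshold gate limiting which detections a Frenet state may
    be associated with (threshold values are placeholders)."""
    return (abs(track.s - detection.s) <= s_thresh and
            abs(track.d - detection.d) <= d_thresh)
```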
[0017] Another aspect of the invention relates to an assistance system for an object tracking of at least one object in a surrounding of an at least partially autonomous motor vehicle, with at least two detection devices and an electronic computing device, wherein the assistance system is configured to perform a method according to the first aspect. In particular, the method is performed by the assistance system.
[0018] A further aspect of the invention relates to a motor vehicle with an assistance system according to the preceding aspect.
[0019] Further advantageous embodiments of the method are to be regarded as advantageous embodiments of the assistance system as well as of the motor vehicle. For this purpose, the assistance system as well as the motor vehicle comprise substantive features which facilitate performance of the method or of advantageous embodiments thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] Further advantages, features, and details of the invention derive from the following description of the preferred embodiment as well as from the drawing. The features and feature combinations previously mentioned in the description as well as the features and feature combinations mentioned in the following description of the figure and/or shown in the figure alone can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
[0021] The only figure shows a block diagram of an embodiment of a motor vehicle with an assistance system.
[0022] In the figure the same elements or elements having the same function are indicated by the same references.
DETAILED DESCRIPTION
[0023] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0024] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within its scope.
[0025] The terms "comprises", "comprising", or any other variations thereof are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup, device or method. In other words, one or more elements in a system or apparatus preceded by "comprises... a" does not, without more constraints, preclude the existence of other or additional elements in the system or method.
[0026] In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which specific embodiments in which the disclosure may be practiced are shown by way of illustration. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0027] FIG. 1 shows a schematic block diagram of an embodiment of a motor vehicle 10 with an embodiment of an assistance system 12. The motor vehicle 10 is at least partially autonomous; in particular, the motor vehicle 10 is fully autonomous. The assistance system 12 is configured to perform an object tracking 14 of at least one object in a surrounding of the at least partially autonomous motor vehicle 10. The assistance system 12 comprises at least two detection devices 16, 18, 20. In an embodiment, the assistance system 12 comprises at least three detection devices 16, 18, 20. For example, the assistance system 12 may comprise a first detection device 16, a second detection device 18 and a third detection device 20. Although not limited to the following examples, a detection device 16, 18, 20 may be a camera, a lidar sensor, an ultrasonic sensor or a radar sensor, or any equivalent applicable for the purpose of this disclosure. Moreover, the assistance system 12 comprises at least one electronic computing device 22.
[0028] FIG. 1 shows a method for an object tracking 14 of at least one object in the surrounding of the at least partially autonomous motor vehicle 10 by the assistance system 12 of the motor vehicle 10, by which the object is detected by at least two of the detection devices 16, 18, 20 of the assistance system 12. The detected data of the two detection devices 16, 18, 20 are then fused together to create sensor fusion data 24 by the electronic computing device 22 of the assistance system 12, wherein the object is tracked in a Frenet frame depending on the sensor fusion data 24.
[0029] In an embodiment, the sensor fusion data 24 are transferred to a supplementary partial multi-object tracking framework 26 of the electronic computing device 22, wherein the supplementary partial multi-object tracking framework 26 considers road topography information of the road where the object is located, and the object tracking is performed depending on the sensor fusion data and the road topography information.
[0030] FIG. 1 further shows that, from the sensor fusion data 24, the dynamic objects 28 are transferred to a map association 30. A stored map 32 is transferred to the map association 30 as well as to the supplementary partial multi-object tracking framework 26. The map association 30 produces assigned dynamic objects 34, and the assigned dynamic objects 34 are transferred to a data association 36 of the supplementary partial multi-object tracking framework 26. Motion models 38, which describe different mathematical models of different types of moving objects, are transferred to the data association 36. The data association 36 is then transferred to a block 50 that assigns operations to the data association result. This block is transferred to a track updating 40. The track updating 40 is transferred to the internal tracks 42 and is connected via a feedback loop 44 with the motion models 38. Furthermore, the internal tracks 42 are transferred to a model creator 46. An additional content 48 is transferred to the model creator 46, and the model creator 46 is connected to the object tracking 14.
[0031] It should be appreciated that in the partial object tracking 14 the measurement update step of the estimation algorithm is unconsidered. Furthermore, it is shown that appropriate motion models 38 for the at least one object are used for applying a one-step prediction of the supplementary partial multi-object tracking framework 26.
[0032] Furthermore, it is shown that, after the one-step prediction is performed, the Frenet states of the object are associated with new detections of the detection devices 16, 18, 20. Furthermore, it is shown that the association for a Frenet state is limited to at least one distance threshold for the data association 36 by the supplementary partial multi-object tracking framework 26.
[0033] In particular, FIG. 1 shows the supplementary partial object tracking stage as a part of the environmental model provider. Its input is the latest list of assigned dynamic objects 34 and the list of internal tracks 42 from the last iteration. Its output is the list of internal tracks 42, and it furthermore has access to additional environment information such as the stored map 32. The data structure of the internal tracks 42 is comparable to the one of the assigned dynamic objects 34. The first step of the partial tracking is applying a plurality of motion models 38 to the tracks from the last iteration. A selection algorithm is used to individually choose a suitable motion model 38 for each of the internal tracks 42. After the one-step prediction, the dynamic objects are associated to the predicted internal tracks 42. The data association 36 results come in the form of a list of associations from internal tracks 42 to dynamic objects. Once the data association 36 results are available, another algorithm assigns update operations to the track-to-object associations. This algorithm is called the track update stage. Once the update operations are assigned to the data associations 36, they are executed, modifying the list of internal tracks 42 and resulting in the new/updated list of internal tracks 42.
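The iteration just described can be condensed into a runnable sketch. Every structure and rule below is a simplified assumption standing in for the figure's blocks (motion model 38, data association 36, operation assignment 50, track updating 40); it is not the disclosed implementation.

```python
def partial_tracking_iteration(tracks: list, detections: list, dt: float) -> list:
    """One iteration: predict, associate within a gate, assign and execute
    update operations. Tracks/detections are dicts with keys id, s, d, v."""
    # One-step prediction (block 38): constant velocity along the lane.
    predicted = [{**t, "s": t["s"] + t["v"] * dt} for t in tracks]

    # Data association (block 36): nearest neighbour within a distance gate.
    def match(t):
        cands = [m for m in detections
                 if abs(t["s"] - m["s"]) <= 5.0 and abs(t["d"] - m["d"]) <= 1.5]
        return min(cands, key=lambda m: abs(t["s"] - m["s"]), default=None)

    # Operation assignment (block 50) and track updating (block 40):
    # matched detections overwrite the track (no Kalman update); unmatched
    # tracks coast on their prediction, e.g. to bridge occlusion; unmatched
    # detections initialize new tracks.
    updated, used = [], set()
    for t in predicted:
        m = match(t)
        if m is not None and id(m) not in used:
            used.add(id(m))
            updated.append({**m, "id": t["id"]})
        else:
            updated.append(t)
    next_id = max((t["id"] for t in tracks), default=0) + 1
    for m in detections:
        if id(m) not in used:
            updated.append({**m, "id": next_id})
            next_id += 1
    return updated
```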
[0034] In particular, FIG. 1 shows that there is a Frenet frame based partial multi-object tracking approach for autonomous driving. This stage is in particular called high-level object tracking, where the outcome of the low-level object tracking is the input. In particular, the output of the sensor fusion 24 is called the low-level object tracking. That is, the high-level partial object tracking approach tries to refine the estimated position, velocity, shape or more from the low-level tracking by considering the road topography as additional information, which may not be available at the low-level object tracking or whose use may be discouraged at the low level due to functional safety related issues. The reason for calling it a partial object tracking is that the measurement update step of the estimation algorithm is not considered.
[0035] This high-level partial object tracking is also important, first, for reasoning about occlusion handling, that is, to have robust information about the moving obstacles surrounding the at least partially autonomous motor vehicle 10. Second, from a functional safety point of view: if the low-level object tracking pipeline breaks for any reason during operation, the high-level partial object tracking acts as a memory, storing/tracking the information for a time period long enough for the at least partially autonomous motor vehicle 10 to make an emergency stop.
[0036] Using road topography at the high-level partial object tracking enhances the reasoning about the outcome of the low-level tracking, but at the same time it makes the high-level partial tracking more complicated. Sometimes a moving obstacle may partially overlap two lanes; in this case, it can be associated with two different lanes. Similarly, at intersections a moving obstacle may be associated with more than two lanes. Due to multi-lane association, a moving obstacle has multiple Frenet states. In reality, there is only one Cartesian state but multiple Frenet states; the multiple Frenet states result from transforming the problem from the Cartesian frame to a Frenet frame. Each of these Frenet states, for example of a moving obstacle, needs to be associated with the new data, which in the current method is the output of the low-level tracking, and then tracked separately.
[0037] It should be appreciated that the state of a moving obstacle then consists of its Cartesian coordinate state and all of its Frenet states obtained after associating it to the road lanes.
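A possible data structure for this combined state is sketched below; the field names, lane identifiers and values are hypothetical illustrations, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class LaneFrenetState:
    lane_id: str  # lane whose centerline the state is expressed against
    s: float      # arc length along that lane [m]
    d: float      # signed lateral offset [m]

@dataclass
class MovingObstacle:
    """One Cartesian state plus one Frenet state per associated lane."""
    x: float
    y: float
    frenet_states: dict = field(default_factory=dict)  # lane_id -> LaneFrenetState

# An obstacle straddling two lanes at an intersection carries two Frenet states:
obs = MovingObstacle(x=104.2, y=-7.5)
obs.frenet_states["lane_12"] = LaneFrenetState("lane_12", s=33.1, d=0.9)
obs.frenet_states["lane_27"] = LaneFrenetState("lane_27", s=8.4, d=-1.1)
```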
[0038] The high-level partial object tracking method includes the following steps. In a first step, the one-step prediction is applied to the existing tracks, in particular from the previous iteration. To apply the one-step prediction, appropriate motion models 38 are used; for example, a pedestrian, a bicycle and a car will each have different motion models 38. The motion model 38 also changes based on the curvature of the road. After the one-step prediction, the Frenet states are associated with new measurements, which are also associated with the road lanes. This sub-step can be called data association 36. As stated earlier, the measurement is the road-lane-associated output of the low-level object tracking. The road lane association is an additional step after the low-level tracking and before processing its outcome in the high-level partial tracking. It uses an evidence theory based approach to associate the moving obstacles to the road lanes. Data association 36 for such partial object tracking is not trivial, and it is not straightforward to use the existing approaches. Therefore, in addition to the Frenet frame based partial object tracking framework 26, a new data association technique is provided.
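The per-class, curvature-dependent motion model selection mentioned above might look as follows; the class names, the lateral-acceleration heuristic for curves and all parameter values are illustrative assumptions.

```python
def car_model(s: float, v: float, dt: float, curvature: float) -> float:
    """Vehicles slow down in curves: cap the speed so that the lateral
    acceleration v^2 * curvature stays below an assumed comfort limit."""
    v_max = (0.3 * 9.81 / max(abs(curvature), 1e-6)) ** 0.5
    return s + min(v, v_max) * dt

def pedestrian_model(s: float, v: float, dt: float, curvature: float) -> float:
    """Pedestrians are essentially unaffected by road curvature."""
    return s + v * dt

MOTION_MODELS = {"car": car_model, "bicycle": car_model,
                 "pedestrian": pedestrian_model}

def one_step_predict(obj_class: str, s: float, v: float, dt: float,
                     curvature: float = 0.0) -> float:
    return MOTION_MODELS[obj_class](s, v, dt, curvature)

print(one_step_predict("car", s=10.0, v=20.0, dt=0.1, curvature=0.05))
# The curve limits the car to about 7.67 m/s, so s advances by ~0.77 m.
```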
[0039] The data association 36 strategy may be considered as a low-level association and a high-level association. The low-level association is between the Frenet states of existing tracks and the Frenet states of a new measurement. The high-level association is between a moving obstacle and a new measurement. Note that both moving obstacles and new measurements have the same structural representation, consisting of a Cartesian state and multiple Frenet states, depending on the association to road lanes. Since there could be many possible associations, the association for a Frenet state is limited to a distance threshold. Based on the distance threshold, possible lane sequences are computed, and only new measurements on these lanes are associated with the concerned Frenet track. The data association 36 confidence of the high-level association between a moving obstacle, in particular an existing track, and a new measurement is computed based on the confidences of the low-level associations; in particular, the maximum may be used. After the data association 36, the remaining usual steps, for example track discontinuation, track initialization, merging and splitting, are implemented. Importantly, these steps are slightly adjusted to the Frenet based framework and differ from the existing state of the art.
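As a small sketch of the confidence rule at the end of the paragraph (the maximum operator is named in the disclosure; everything else is an assumed minimal form):

```python
def high_level_confidence(low_level_confidences: list) -> float:
    """Confidence of the high-level association between an existing track and
    a new measurement, taken as the maximum of the low-level (per-Frenet-state)
    association confidences."""
    return max(low_level_confidences, default=0.0)

# Track and measurement share three lane-wise Frenet associations:
print(high_level_confidence([0.42, 0.87, 0.63]))  # -> 0.87
```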
[0040] In particular, the measurement update step is not applied. Instead, if there is a strong association, the value of a new measurement is simply copied to the associated, in particular existing, track. It is assumed that this strong association is established after intelligent reasoning to validate the authenticity of the new measurement, since a new measurement could be a false positive or a false negative.
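A minimal sketch of this copy-instead-of-update rule follows; the dict keys are hypothetical stand-ins for the real track structure.

```python
def apply_strong_association(track: dict, measurement: dict) -> dict:
    """For a strong, validated association the measurement values simply
    overwrite the track state; only the track identity is kept. No Kalman
    measurement update is performed."""
    return {**measurement, "id": track["id"]}

old_track = {"id": 7, "s": 41.0, "d": 0.3, "v": 13.9}
new_meas = {"s": 42.1, "d": 0.2, "v": 14.2}
print(apply_strong_association(old_track, new_meas))
# -> {'s': 42.1, 'd': 0.2, 'v': 14.2, 'id': 7}
```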
[0041] In particular, a Frenet frame based partial object tracking framework is provided for autonomous driving, together with a technique for Frenet frame based data association 36.
Reference signs
10 motor vehicle
12 assistance system
14 object tracking
16 first detection device
18 second detection device
20 third detection device
22 electronic computing device
24 sensor fusion
26 supplementary partial multi-object tracking framework
28 dynamic objects
30 map association
32 stored map
34 assigned dynamic objects
36 data association
38 motion model
40 track updating
42 internal tracks
44 feedback loop
46 model creator
48 additional content
50 assign operations to data association result

Claims (6)

  1. A method for tracking (14) at least one object in a surrounding of an at least partially autonomous motor vehicle (10) by an assistance system (12) of the motor vehicle (10), by which the object is detected by at least two detection devices (16, 18, 20) of the assistance system (12) and data of the two detection devices (16, 18, 20) are fused together as sensor fusion data (24) within the assistance system (12), wherein the object is tracked in a Frenet frame depending on the sensor fusion data (24) by an electronic computing device (22), characterized in that the sensor fusion data (24) are transferred to a supplementary partial multi-object tracking framework (26) of the electronic computing device (22), wherein the supplementary partial multi-object tracking framework (26) considers road topography information of the road where the object is, and the object tracking is performed depending on the sensor fusion data (24) and the road topography information.
  2. The method according to claim 1, characterized in that a measurement update step of an estimation algorithm for the object tracking (14) is unconsidered by the supplementary partial multi-object tracking framework (26).
  3. The method according to claim 1 or 2, characterized in that appropriate motion models (38) for the at least one object are used for applying a one-step prediction of the supplementary partial multi-object tracking framework (26).
  4. The method according to claim 3, characterized in that, after the one-step prediction is performed, Frenet states of the object are associated with new detections of the detection devices (16, 18, 20).
  5. The method according to any one of the preceding claims, characterized in that the association for a Frenet state is limited to at least one distance threshold for the data association (36) by the supplementary partial multi-object tracking framework (26).
  6. An assistance system (12) for tracking (14) of at least one object in a surrounding of an at least partially autonomous motor vehicle (10), with at least two detection devices (16, 18, 20) and an electronic computing device (22), wherein the assistance system (12) is configured to perform a method according to any one of claims 1 to 5.
GB2005547.1A 2020-04-16 2020-04-16 A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system Withdrawn GB2594079A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2005547.1A GB2594079A (en) 2020-04-16 2020-04-16 A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2005547.1A GB2594079A (en) 2020-04-16 2020-04-16 A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system

Publications (2)

Publication Number Publication Date
GB202005547D0 GB202005547D0 (en) 2020-06-03
GB2594079A true GB2594079A (en) 2021-10-20

Family

ID=70860159

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2005547.1A Withdrawn GB2594079A (en) 2020-04-16 2020-04-16 A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system

Country Status (1)

Country Link
GB (1) GB2594079A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100253540A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Enhanced road vision on full windshield head-up display
WO2014179109A1 (en) * 2013-05-03 2014-11-06 Google Inc. Predictive reasoning for controlling speed of a vehicle
WO2019010659A1 (en) * 2017-07-13 2019-01-17 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for trajectory determination
US20190382031A1 (en) * 2018-06-18 2019-12-19 Baidu Usa Llc Methods for handling sensor failures in autonomous driving vehicles
CN110647151A (en) * 2019-10-16 2020-01-03 北京京东乾石科技有限公司 Coordinate conversion method and device, computer readable storage medium and electronic equipment
US20200050214A1 (en) * 2017-04-26 2020-02-13 Bayerische Motoren Werke Aktiengesellschaft Method, Computer Program Product, Computer-Readable Medium, Control Unit, and Vehicle Comprising the Control Unit for Determining a Collective Maneuver of at Least Two Vehicles
CN111002980A (en) * 2019-12-10 2020-04-14 苏州智加科技有限公司 Road obstacle trajectory prediction method and system based on deep learning
US20200167934A1 (en) * 2018-11-27 2020-05-28 GM Global Technology Operations LLC Systems and methods for applying maps to improve object tracking, lane-assignment and classification


Also Published As

Publication number Publication date
GB202005547D0 (en) 2020-06-03

Similar Documents

Publication Publication Date Title
US9707959B2 (en) Driving assistance apparatus
JP6822752B2 (en) Driving assistance technology for active vehicle control
Amditis et al. A situation-adaptive lane-keeping support system: Overview of the safelane approach
WO2017163667A1 (en) Driving assistance method, driving assistance device which utilizes same, autonomous driving control device, vehicle, driving assistance system, and program
Sontges et al. Worst-case analysis of the time-to-react using reachable sets
JP2018206036A (en) Vehicle control system, method thereof and travel support server
JP2004017876A (en) On-vehicle obstacle detection device
US11400942B2 (en) Vehicle lane trajectory probability prediction utilizing kalman filtering with neural network derived noise
US10803307B2 (en) Vehicle control apparatus, vehicle, vehicle control method, and storage medium
Jeong et al. Bidirectional long short-term memory-based interactive motion prediction of cut-in vehicles in urban environments
US11789141B2 (en) Omnidirectional sensor fusion system and method and vehicle including the same
US11748593B2 (en) Sensor fusion target prediction device and method for vehicles and vehicle including the device
Lu et al. Intention prediction-based control for vehicle platoon to handle driver cut-in
Jo et al. Track fusion and behavioral reasoning for moving vehicles based on curvilinear coordinates of roadway geometries
US20230271621A1 (en) Driving assistance device, learning device, driving assistance method, medium with driving assistance program, learned model generation method, and medium with learned model generation program
GB2594079A (en) A method for tracking at least one object in the surroundings of an autonomous motor vehicle with a Frenet frame, as well as an assistance system
US20210271902A1 (en) Sensor recognition integration device
KR20200133122A (en) Apparatus and method for preventing vehicle collision
KR102498328B1 (en) Method And Apparatus for Vehicle State Transition Learning Based on Vehicle State Based Model
CN115195718A (en) Lane keeping auxiliary driving method and system and electronic equipment
Daniel Trajectory generation and data fusion for control-oriented advanced driver assistance systems
CN113591673A (en) Method and device for recognizing traffic signs
CN113302108A (en) Method, device, computer program and computer program product for operating a vehicle
Tyagi et al. Autonomous vehicle algorithm decision-making considering other road users
US20230373498A1 (en) Detecting and Determining Relevant Variables of an Object by Means of Ultrasonic Sensors

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)