GB2607299A - Track fusion for an autonomous vehicle - Google Patents

Track fusion for an autonomous vehicle

Info

Publication number
GB2607299A
GB2607299A GB2107786.2A GB202107786A
Authority
GB
United Kingdom
Prior art keywords
data
track
fusion
information
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2107786.2A
Other versions
GB202107786D0 (en)
Inventor
Peukert Janis
Duraisamy Bharanidhar
Schwarz Tilo
Yuan Ting
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to GB2107786.2A priority Critical patent/GB2607299A/en
Publication of GB202107786D0 publication Critical patent/GB202107786D0/en
Publication of GB2607299A publication Critical patent/GB2607299A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3407Route searching; Route guidance specially adapted for specific applications
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726Multiple target tracking
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles

Abstract

Means of fusing track data captured by a plurality of sensors in order to guide an autonomous vehicle. The track fusion data is produced by: receiving sensor data from at least one sensor 1-4, generating and validating track data, and obtaining track fusion data using one of a plurality of fusion algorithms 11-13 to fuse the track data. Validation may include information about completeness or covariance of track data, or missing sensor data. The fusion algorithm may be based on reconstructed covariance or decorrelated tracking and may be based on self-learned feature information. The sensor units may comprise radar, cameras, LIDAR, or an ultrasonic unit. The method enables inputs from multiple types of sensor to be combined to provide more complete track data.

Description

Track fusion for an autonomous vehicle
FIELD OF THE INVENTION
[0001] The invention relates to a method of providing track fusion data to guide a motion of an autonomous vehicle, wherein the method comprises the steps of receiving sensor data from a plurality of sensors and generating track data from the sensor data. Furthermore, the present invention relates to a method of controlling one or more components of an autonomous vehicle. Finally, the present invention relates to a system for providing track fusion data to guide a motion of an autonomous vehicle, wherein the system includes a plurality of sensors for providing sensor data and tracking means for generating track data from the sensor data.
BACKGROUND INFORMATION
[0002] Autonomous driving requires comprehensive and precise knowledge of the surrounding objects, especially of dynamic objects including stop-and-go objects. However, due to sensor perception limitations, which may deliver only partial knowledge of an object, and vehicle communication constraints, which may arise from a limited bandwidth when sending raw observations for fusion, an accurate track fusion approach has to be designed to handle different incomplete-information issues. Specifically, a track fusion module is critical to provide accurate information for subsequent planning and situation analysis. However, due to the multi-step estimation process performed during implementation, the track covariance might be missing, unreliable or only partially available. This leads to serious difficulties in performing a correct (or even reasonable) track fusion.
[0003] US9910441B2 discloses an adaptive autonomous vehicle planner method with generated trajectories to influence navigation of autonomous vehicles. In particular, the method may include receiving path data to navigate from a first geographic location to a second geographic location, generating data representing a trajectory with which to control a motion of the autonomous vehicle based on the path data, generating data representing a contingent trajectory, monitoring generation of the trajectory and implementing the contingent trajectory subsequent to an absence of the trajectory.
[0004] Furthermore, AU2009200855B2 discloses a method for generating a model of an environment in which a plurality of equipment units are deployed for the extraction of at least one resource from the environment. The system comprises a pre-extraction or in-ground modelling unit configured to receive data from a first plurality of heterogeneous sensors in the environment and to fuse the data into a pre-extraction model descriptive of the environment and the at least one resource. An equipment modelling unit is configured to receive equipment data relating to the plurality of equipment units operating in the environment and to combine the equipment data into an equipment model. A post-extraction or out-of-ground modelling unit is configured to receive data from a second plurality of sensors and to fuse the data into a post-extraction model descriptive of material extracted from the environment, wherein at least one of the equipment units operates to extract the at least one resource from the environment. Information from at least one of the pre-extraction model, the equipment model and the post-extraction model is communicable to the equipment units for use in controlling operation of the equipment units in the environment.
SUMMARY OF THE INVENTION
[0005] It is the object of the present invention to provide reliable track fusion data even if track data may be missing, incomplete or unreliable.
[0006] This object is solved by a method according to claim 1 and a device according to claim 6. Further developments are defined in the subclaims.
[0007] Accordingly, there is provided a method of providing track fusion data to guide a motion of an autonomous vehicle, the method comprising the steps of receiving sensor data from a plurality of sensors, generating track data from the sensor data, validating the track data and providing a respective validation result, and choosing one of a plurality of fusion algorithms for fusing the track data in order to obtain the track fusion data.
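To make the claimed sequence of steps concrete, the following minimal Python sketch shows one possible way to wire them together. All names (Track, provide_track_fusion_data, validate, fusers) are illustrative assumptions and do not appear in the patent.

```python
# Minimal sketch of the claimed flow; all names are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

import numpy as np


@dataclass
class Track:
    state: np.ndarray                  # e.g. [x, y, vx, vy]
    covariance: Optional[np.ndarray]   # may be missing or unreliable
    source: str                        # sensor that produced the track


def provide_track_fusion_data(
    tracks: List[Track],
    validate: Callable[[List[Track]], str],
    fusers: Dict[str, Callable[[List[Track]], Track]],
) -> Track:
    """Validate the track data, then choose one of several fusion algorithms."""
    validation_result = validate(tracks)   # e.g. "full", "no_covariance", "no_sensor"
    fuse = fusers[validation_result]       # pick the matching fusion algorithm
    return fuse(tracks)                    # obtain the track fusion data
```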
[0008] This approach provides a solution to handle various situations in which the track data may be decent or incomplete. For example, the track covariance could be missing, unreliable or only partially available due to no or only partial knowledge of the inner filter design or the tracker parameter settings.
[0009] In one embodiment of the inventive method, the validation result includes information about the completeness of the track data and/or information about covariance and/or information about missing sensor data.
[0010] In another embodiment of the inventive method, the fusion algorithms include an algorithm based on reconstructed covariance or decorrelated tracking.
[0011] In another embodiment of the inventive method, the fusion algorithms include an algorithm based on self-learned feature information.
[0012] Furthermore, there may be provided a method of controlling one or more components of an autonomous vehicle by providing track fusion data according to the methods described above.
[0013] Additionally, there is provided a system for providing track fusion data to guide a motion of an autonomous vehicle, the system including: a plurality of sensors for providing sensor data, tracking means for generating track data from the sensor data, validating means for validating the track data and providing a respective validation result, and fusion means for choosing one of a plurality of fusion algorithms for fusing the track data in order to obtain the track fusion data.
[0014] In a specific embodiment of the inventive system, the sensor means includes at least one of a radar unit, a camera unit, a LiDAR unit and an ultrasonic unit.
[0015] Further advantages, features, and details of the invention derive from the following description of preferred embodiments as well as from the drawing. The features and feature combinations previously mentioned in the description as well as the features and feature combinations mentioned in the following description of the figure and/or shown in the figure alone can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWING
[0016] The novel features and characteristics of the disclosure are set forth in the appended claims. The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and together with the description, serve to explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described below, by way of example only, and with reference to the accompanying figures.
[0017] The attached drawing Fig.1 shows a workflow of an embodiment of a generic track fusion.
DETAILED DESCRIPTION
[0018] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration". Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0019] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawing and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0020] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion so that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by "comprises" or "comprise" does not or do not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
[0021] In the following detailed description of the embodiment of the disclosure, reference is made to the accompanying drawing that forms part hereof, and in which is shown by way of illustration a specific embodiment in which the disclosure may be practiced. This embodiment is described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0022] The present disclosure relates to the field of autonomous vehicles, and more specifically to techniques, including artificial intelligence and machine learning-based techniques, used by an autonomous vehicle management system of an autonomous vehicle for improving the sensory fusion system of an autonomous driving vehicle.
[0023] Autonomous driving requires comprehensive and precise knowledge of the surrounding objects, both static and dynamic objects. However, sensors, for example, a camera, radar, Light Detection and Ranging (LiDAR), ultrasound, etc., have limited bandwidth. These limitations in the sensory fusion process result in partial detection of objects surrounding the autonomous vehicle.
[0024] During the sensory fusion process, numerous difficulties may arise. For instance, cameras can provide object category information, but they are inaccurate in depth estimation and also very sensitive to severe weather conditions. LiDAR can provide object shape, but since it relies on the reflection of laser pulses, heavy rain or fog (undesirable weather conditions) may significantly lower the effectiveness of LiDAR and make it difficult to get an accurate picture of the vehicle's surroundings. In contrast, radar sensors are not affected by weather conditions and provide reliable radial velocity information; however, the resulting data set is typically not dense enough for accurate object shape estimation. Therefore, this disclosure introduces the track fusion model, where the model is designed to monitor and compensate the accuracy of the sensory fusion data.
[0025] The fusion model further comprises a track information validator, where the information validator further comprises a system identifier configured to validate the tracked data and consequently determine the usability of this data.
[0026] In an embodiment, the validation result may include information about the completeness of the track data. For example, the validator may find that sensor information is missing. In more detail, the validator may assess the completeness of the track data with respect to information about covariance. For example, track data from different sources may or may not be covariant. Alternatively or additionally, the validation result may include information about missing sensor data. This information may specifically relate to a sensor type which did not deliver the necessary information.
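As a purely illustrative data structure, the validation result described in this paragraph could be represented as follows; the field names are assumptions, not terms used in the patent.

```python
# Illustrative representation of the validation result; field names are assumptions.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ValidationResult:
    track_complete: bool                  # completeness of the track data
    covariance_present: bool              # covariance information available at all?
    covariance_consistent: bool           # covariance statistically plausible?
    missing_sensors: List[str] = field(default_factory=list)  # e.g. ["radar"]
```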
[0027] In a further embodiment, if the usability of the data is validated, one of several fusion algorithms for fusing the track data may be chosen in order to obtain the track fusion data. One or more of these fusion algorithms may be based on reconstructed covariance or decorrelated tracking. The fusion algorithms may also include an algorithm based on self-learned feature information. Such a fusion algorithm may be obtained by any machine learning approach applicable to this invention; specifically, such machine learning can be realized with deep neural networks. The plurality of fusion algorithms may also include a traditional fusion algorithm. Any of these algorithms may be executed in track fusers with or without a memory. Furthermore, a track fuser may or may not use feedback. All of the above-mentioned track fusers or fusion algorithms may be used individually or in combination.
[0028] In an embodiment, Fig. 1 shows a block diagram of a track fusion module. The single blocks can be regarded as hardware blocks and/or as software blocks. The block diagram represents one specific embodiment of the present invention; however, it may represent a method or a device. Fig. 1 further shows the vehicle comprising a plurality of sensors for gathering data about the environment of the vehicle. The respective sensors may include a radar 1, a camera 2, a LiDAR 3, an ultrasonic unit 4 and so on. The vehicle further includes a plurality of local trackers 5.
[0029] In an embodiment, the plurality of local trackers 5 are configured to receive the sensor data and convert these data into track data. The local trackers 5 may further comprise a plurality of tracker units (not shown in the figure), for instance one tracker or multiple trackers for each sensor or any combination thereof. The track data from the plurality of local trackers 5 are then used as the input of a fusion center 6. The fusion center 6 may be a generic track fusion toolbox. The fusion center 6 includes a track information validator 7, where the track information validator 7 is used to carry out an analysis of the track data using a system identifier (not shown in the figure).
[0030] In a specific embodiment, a valid track fusion requires the associated covariance data received with the track data to be statistically consistent. However, due to a wrong filter design or a bandwidth limitation, the track and/or its associated covariance might be incomplete. In order to obtain this information, a chi-square test can be used to validate the situation, for instance.
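One common form of such a chi-square check, sketched below under the assumption of two unbiased track estimates of the same object with approximately independent errors, gates the squared Mahalanobis distance between the tracks against a chi-square quantile; the patent does not prescribe this particular formulation.

```python
# Sketch of a chi-square consistency gate for two tracks of the same object.
# Assumes approximately independent track errors; not prescribed by the patent.
import numpy as np
from scipy.stats import chi2


def covariance_consistent(x1, P1, x2, P2, confidence=0.99) -> bool:
    """Return True if the track difference is statistically consistent
    with the reported covariances."""
    diff = np.asarray(x1, float) - np.asarray(x2, float)
    S = np.asarray(P1, float) + np.asarray(P2, float)   # combined covariance, no cross term
    d2 = float(diff @ np.linalg.solve(S, diff))         # squared Mahalanobis distance
    return d2 <= chi2.ppf(confidence, df=diff.size)
```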
[0031] The track information validator 7 may further classify the track data. In the present example this classification may be made into three classes, as shown in Fig. 1. In the present case the three classes are full information 8, missing/unreliable covariance information 9 and missing sensor information 10. If the track data is complete, it comprises sensor information and covariance information; in this case, the track data is classified as "full information" 8. If the covariance of the track data is missing or unreliable, the track data is classified as "missing/unreliable covariance information" 9. Otherwise, if sensor information is missing in the track data, the track data is classified as "missing sensor information" 10.
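The three-way classification can be expressed as a simple rule. The sketch below is only an illustration of the logic described above; the string labels and parameter names are assumptions.

```python
# Illustrative three-way classification corresponding to blocks 8, 9 and 10 in Fig. 1.
from typing import List


def classify_track_data(missing_sensors: List[str],
                        covariance_present: bool,
                        covariance_consistent: bool) -> str:
    if missing_sensors:                                       # class 10
        return "missing_sensor_information"
    if not covariance_present or not covariance_consistent:  # class 9
        return "missing_unreliable_covariance_information"
    return "full_information"                                 # class 8
```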
[0032] Based on this classification, a track fusion algorithm is selected. In the present case the following three algorithms are available: a track fuser 11 with/without memory and with/without feedback, a track fuser 12 with reconstructed covariance or a decorrelated tracker, and a track fuser 13 with self-learned feature information. If the track data is classified as "full information" 8, the fusion center 6 chooses the track fuser 11 with/without memory and with/without feedback. In this case the track fuser may comprise any known track fusing algorithm. The fusing algorithm may or may not be equipped with a feedback loop. Furthermore, the track fuser may or may not employ a memory.
[0033] If the track data is classified as "missing/unreliable covariance information" 9, the fusion center 6 selects the track fuser 12 with reconstructed covariance or a decorrelated tracker. In this case the missing or unreliable covariance information is reconstructed in order to use it for track fusion. Otherwise, if the track data is classified as "missing sensor information" 10, the fusion center 6 may select the track fuser 13 with self-learned feature information. In this case the missing sensor information is estimated by a machine learning algorithm.
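The patent does not name a specific reconstruction or decorrelation algorithm. Covariance intersection is one widely used option for fusing two track estimates whose cross-correlation is unknown, and is sketched here only as an example of that class of techniques.

```python
# Covariance intersection: one standard way to fuse two estimates whose
# cross-covariance is unknown. Example only; not named by the patent.
import numpy as np
from scipy.optimize import minimize_scalar


def covariance_intersection(x1, P1, x2, P2):
    """Fuse (x1, P1) and (x2, P2) by minimising the trace of the fused covariance."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    P1_inv = np.linalg.inv(np.asarray(P1, float))
    P2_inv = np.linalg.inv(np.asarray(P2, float))

    def fused_trace(w):
        return np.trace(np.linalg.inv(w * P1_inv + (1.0 - w) * P2_inv))

    w = minimize_scalar(fused_trace, bounds=(1e-6, 1.0 - 1e-6), method="bounded").x
    P = np.linalg.inv(w * P1_inv + (1.0 - w) * P2_inv)
    x = P @ (w * P1_inv @ x1 + (1.0 - w) * P2_inv @ x2)
    return x, P
```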
[0034] A health monitor 14 may supervise the process in the fusion center 6. Specifically, it may control the classification of the track data and the respective selection of the track fuser. For monitoring the health value of the track fusion, a statistical consistency check may be performed. The statistical consistency check requires that the corresponding dimensionless value be contained within a confidence region. If the value is consecutively out of this range for a certain amount of time, this indicates an improper case.
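A minimal sketch of such a monitor is shown below, assuming a fixed confidence region and a fixed number of consecutive out-of-range updates before the fusion is flagged; both thresholds are assumptions, not values given in the patent.

```python
# Sketch of the health monitor's consistency check; thresholds are assumptions.
class FusionHealthMonitor:
    def __init__(self, lower: float, upper: float, max_consecutive: int = 10):
        self.lower, self.upper = lower, upper     # bounds of the confidence region
        self.max_consecutive = max_consecutive    # "certain amount of time"
        self._out_of_range = 0

    def update(self, consistency_value: float) -> bool:
        """Return True while the track fusion is still considered healthy."""
        if self.lower <= consistency_value <= self.upper:
            self._out_of_range = 0
        else:
            self._out_of_range += 1
        return self._out_of_range < self.max_consecutive
```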
[0035] In an alternative embodiment, the motivation for using the track fusion toolbox is to handle severe conditions for safety with the best available information. This solution handles the common practical issues in intelligent vehicle applications, specifically when information is incomplete, sensor data is missing or the system fails.
[0036] The track fusion toolbox is generic, since according to the information validation module one can choose among the following different solutions for handling the corresponding issues (a possible grouping by information class is sketched after this list):
1. Track fusion without memory and without feedback.
2. Track fusion with memory and no feedback.
3. Track fusion with memory and feedback.
4. Track fusion without memory and with feedback.
5. De-correlated tracker.
6. Artificial intelligence boosted track fuser.
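One purely illustrative grouping of these six options by validated information class, consistent with paragraphs [0032] and [0033] but not fixed by the patent, could look as follows; the labels are assumptions.

```python
# Illustrative grouping of the toolbox options by information class (an assumption).
TRACK_FUSION_TOOLBOX = {
    "full_information": [
        "fusion_without_memory_without_feedback",   # option 1
        "fusion_with_memory_without_feedback",      # option 2
        "fusion_with_memory_with_feedback",         # option 3
        "fusion_without_memory_with_feedback",      # option 4
    ],
    "missing_unreliable_covariance_information": [
        "decorrelated_tracker",                     # option 5 (or reconstructed covariance)
    ],
    "missing_sensor_information": [
        "ai_boosted_track_fuser",                   # option 6
    ],
}
```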
Reference signs
1 Radar
2 Camera
3 LiDAR
4 Ultrasonic unit
5 Local trackers
6 Fusion center
7 Track information validator
8 Full information
9 Missing/unreliable covariance information
10 Missing sensor information
11 Track fuser with/without memory
12 Track fuser with reconstructed covariance or de-correlated tracker
13 Track fuser with self-learned feature information
14 Health monitor

Claims (7)

  1. A method of providing track fusion data to guide a motion of an autonomous vehicle, the method comprising the steps of: - receiving sensor data from at least one sensor device (1, 2, 3, 4) and - generating track data from the sensor data, characterized by - validating the track data and providing a respective validation result, - choosing one of a plurality of fusion algorithms (11, 12, 13) for fusing the track data in order to obtain the track fusion data.
  2. The method according to claim 1, characterized in that the validation result comprises information about the completeness of the track data and/or information about covariance and/or information about missing sensor data.
  3. The method according to claim 1 or 2, characterized in that the fusion algorithms (11, 12, 13) further include an algorithm based on reconstructed covariance or decorrelated tracking (12).
  4. The method according to any one of claims 1 to 3, characterized in that the fusion algorithms (11, 12, 13) include an algorithm based on self-learned feature information (13).
  5. A method of controlling one or more components of an autonomous vehicle by providing track fusion data according to the method of any one of the preceding claims.
  6. A device for providing track fusion data to guide a motion of an autonomous vehicle, the device including: - a plurality of sensors (1, 2, 3, 4) for providing sensor data and - a tracking system (5) for generating track data from the sensor data, characterized by - a validating system (7) for validating the track data and providing a respective validation result, and - a fusion center (6) for choosing one of a plurality of fusion algorithms (11, 12, 13) for fusing the track data in order to obtain the track fusion data.
  7. The device according to claim 6, characterized in that the plurality of sensors (1, 2, 3, 4) comprises at least one of a radar unit (1), a camera unit (2), a LiDAR unit (3) and an ultrasonic unit (4).
GB2107786.2A 2021-06-01 2021-06-01 Track fusion for an autonomous vehicle Withdrawn GB2607299A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2107786.2A GB2607299A (en) 2021-06-01 2021-06-01 Track fusion for an autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2107786.2A GB2607299A (en) 2021-06-01 2021-06-01 Track fusion for an autonomous vehicle

Publications (2)

Publication Number Publication Date
GB202107786D0 GB202107786D0 (en) 2021-07-14
GB2607299A true GB2607299A (en) 2022-12-07

Family

ID=76741322

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2107786.2A Withdrawn GB2607299A (en) 2021-06-01 2021-06-01 Track fusion for an autonomous vehicle

Country Status (1)

Country Link
GB (1) GB2607299A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115061139A (en) * 2022-07-01 2022-09-16 重庆邮电大学 Multi-sensor fusion method and system for intelligent driving vehicle

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2009200855B2 (en) 2008-03-04 2014-05-15 Technological Resources Pty. Limited Method and system for exploiting information from heterogeneous sources
US9910441B2 (en) 2015-11-04 2018-03-06 Zoox, Inc. Adaptive autonomous vehicle planner logic
WO2018154367A1 (en) * 2017-02-23 2018-08-30 Kpit Technologies Limited System and method for target track management of an autonomous vehicle
CN107192998A (en) * 2017-04-06 2017-09-22 中国电子科技集团公司第二十八研究所 A kind of adapter distribution track data fusion method based on covariance target function

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DURAISAMY BHARANIDHAR ET AL: "Influence of the Sensor Local Track Covariance on the Track-to-Track Sensor Fusion", 2015 IEEE 18TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS, IEEE, 15 September 2015 (2015-09-15), pages 2643 - 2650, XP032804337, DOI: 10.1109/ITSC.2015.425 *

Also Published As

Publication number Publication date
GB202107786D0 (en) 2021-07-14


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)