GB2603120A - A method for creating a unified view of the surroundings of an autonomous machine by a sensor fusion system, as well as a corresponding sensor fusion system


Info

Publication number
GB2603120A
Authority
GB
United Kingdom
Prior art keywords
cell
information
sensor fusion
fusion system
surroundings
Prior art date
Legal status
Withdrawn
Application number
GB2100786.9A
Other versions
GB202100786D0 (en)
Inventor
Govindachar Suresh
Vatavu Andrei
Current Assignee
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date
Filing date
Publication date
Application filed by Daimler AG
Priority to GB2100786.9A
Publication of GB202100786D0
Publication of GB2603120A
Current legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle


Abstract

The invention relates to a method and a corresponding system for analyzing the surroundings of an autonomous machine 10 by a sensor fusion system 12 of the autonomous machine 10, the method comprising the steps of: capturing data of the surroundings by at least one capturing device 14 of the sensor fusion system 12 and processing a plurality of information items of the data by the capturing device 14 (S1, S2); summarizing each information item of the at least one capturing device 14 in at least one cell out of a plurality of cells to a common cell information, wherein a cell describes a predetermined region of the surroundings captured by the capturing device 14 (S3, S4); determining a state of each cell depending on the cell information, by an electronic computing device 16 of the sensor fusion system 12, using a particle filter termed a glint filter, which treats each input datum as a separate item of information about a glint and estimates this information using a collection of particles (S5, S6); and clustering the plurality of cells depending on the determined state of each cell for analyzing the surroundings (S7).

Description

A METHOD FOR CREATING A UNIFIED VIEW OF THE SURROUNDINGS OF AN AUTONOMOUS MACHINE BY A SENSOR FUSION SYSTEM, AS WELL AS A CORRESPONDING SENSOR FUSION SYSTEM
FIELD OF THE INVENTION
[0001] The invention relates to the field of autonomous machines. More specifically, the invention relates to a method for providing a unified view of the surroundings of an autonomous machine by a sensor fusion system of the autonomous machine, as well as to a corresponding sensor fusion system.
BACKGROUND INFORMATION
[0002] Sensor fusion systems are measurement systems that extract information from data provided by sensors. The data provided by sensors is a collection of local information; for example, a camera sensor provides pixels, which are information about a tiny region. The camera processor might convert pixels to stixels, which are information about a slightly larger region, and provide the stixels to the higher levels of the measurement system. The problem for a measurement system is to extract large-scale information from the small-scale information provided by the sensors.
[0003] US 6,944,566 B2 discloses a multi-sensor data fusion system and method that provide an additive fusion technique including a modified belief function to adaptively weight the contributions from a plurality of sensors in the system and to produce multiple reliability terms, including reliability terms associated with noise for low-SNR situations. During a predetermined tracking period, data is received from each individual sensor in the system and a predetermined algorithm is performed to generate sensor reliability functions for each sensor based on each sensor's SNR, using at least one additional reliability factor associated with noise. Each sensor reliability function may be individually weighted based on the SNR for each sensor and other factors. Additive calculations are performed on the reliability functions to produce at least one system reliability function, which provides a confidence level for the multi-sensor system relating to the correct classification of desired objects.
SUMMARY OF THE INVENTION
[0004] It is an object of the invention to provide a method as well as a corresponding sensor fusion system by which the surroundings of an autonomous machine, as captured by sensors, are analyzed more effectively, resulting in a unified view of the surroundings that incorporates all of the data.
[0005] This object is achieved by a method as well as by a corresponding sensor fusion system according to the independent claims. Advantageous configurations are presented in the dependent claims.
[0006] One aspect of the invention relates to a method for analyzing the surroundings of an autonomous machine by a sensor fusion system of the autonomous machine. Data of the surroundings are captured by at least one capturing device of the sensor fusion system, and a plurality of information items of the data are processed by the capturing device. Each information item of the at least one capturing device is summarized in at least one cell out of a plurality of cells to a common cell information, wherein a cell describes a predetermined region of the surroundings captured by the capturing device. A state of each cell is determined, depending on the cell information, by an electronic computing device of the sensor fusion system using a mathematical glint filter, which treats each input datum as a separate item of information about a glint and estimates this information using a collection of particles. The plurality of cells are clustered depending on the determined state of each cell for analyzing the surroundings.
[0007] In an embodiment, each datum of the capturing device, which may also be referred to as a sensor, is treated as a separate item of information, which may also be called a glint. This information is estimated using a small collection of particles, wherein this collection of particles constitutes an estimator, which is used for estimating the glint. One of the constraints on the evolution of these particles is that particles that do not efficiently reflect the glint are sampled out, thereby ensuring spatial locality of the particles of each glint. The method also supports initializing multiple estimators for each glint.
[0008] The processing involved in the glint filter makes it suitable for efficient execution on a variety of electronic computing devices such as CPUs, GPUs, FPGAs, and ASICs. The processing deals with local information, and the information for the various local regions may be processed independently. There is no need to start with entities, for example no pre-clustering, and there is no need to correlate entities. The problem of ensuring that particles do indeed contribute to the state of the associated glint is solved by incorporating the distance to the deterministic predicted cell during the re-sampling of the particles.
[0009] In an embodiment, the cell information describes a time stamp when the cell information was captured, and/or a height of the cell and/or a velocity of the cell and/or an occupancy of the cell and/or a freespace of the cell and/or a color of the cell.
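As a rough illustration of such a cell information record, consider the following minimal Python sketch; the field names, types, and defaults are assumptions chosen for illustration and are not taken from the patent:

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class CellInformation:
        """One glint: the fused sensor information for a single cell."""
        timestamp: float                                 # when the cell's region was sensed
        height: Optional[float] = None                   # estimated height, if sensed
        velocity: Optional[Tuple[float, float]] = None   # (vx, vy) estimate, if sensed
        occupancy: float = 0.0                           # belief mass for "occupied"
        freespace: float = 0.0                           # belief mass for "free"
        color: Optional[Tuple[int, int, int]] = None     # RGB, if a camera contributed
        sources: Tuple[str, ...] = ()                    # capturing devices that contributed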
[0010] In another embodiment, the cell information is a grid of information or a sparse collection of information.
[0011] In another embodiment, the state of the cell is estimated as statistically aggregated information of the cell information at the cell.
[0012] Another aspect of the invention relates to a sensor fusion system for an autonomous machine for analyzing the surroundings of the autonomous machine, comprising at least one capturing device and at least one electronic computing device, wherein the sensor fusion system is configured to perform a method according to the preceding aspect. In particular, the method is performed by the sensor fusion system.
[0013] Another aspect of the invention relates to an autonomous machine with a sensor fusion system according to the preceding aspect. The autonomous machine may be for example an at least partially autonomous motor vehicle.
[0014] Advantageous forms of the method are to be regarded as advantageous forms of the sensor fusion system as well as of the autonomous machine. The sensor fusion system and the autonomous machine therefore comprise means for performing the method.
[0015] Further advantages, features, and details of the invention derive from the following description of a preferred embodiment as well as from the drawing. The features and feature combinations previously mentioned in the description as well as the feature and feature combinations mentioned in the following description of the figure and/or shown in the figure alone can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWING
[0016] The novel features and characteristics of the disclosure are set forth in the independent claims. The accompanying drawing, which is incorporated in and constitutes part of this disclosure, illustrates an exemplary embodiment and, together with the description, serves to explain the disclosed principles. The same reference signs are used throughout the figure to refer to identical features and components. Some embodiments of the system and/or methods in accordance with embodiments of the present subject matter are described below, by way of example only, and with reference to the accompanying figure.
[0017] The drawing shows in:
[0018] Fig. 1 a schematic block diagram of an embodiment of an autonomous machine with an embodiment of a sensor fusion system.
[0019] In the figure same elements or elements having the same function are indicated by the same reference signs.
DETAILED DESCRIPTION
[0020] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0021] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawing and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0022] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, so that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such a setup, device, or method. In other words, one or more elements in a system or apparatus preceded by "comprises" or "comprise" do not, without more constraints, preclude the existence of other or additional elements in the system or method.
[0023] In the following detailed description of the embodiment of the disclosure, reference is made to the accompanying drawing that forms part hereof, and in which is shown by way of illustration a specific embodiment in which the disclosure may be practiced. The embodiment is described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0024] Fig. 1 shows a schematic block diagram of an autonomous machine 10 with an embodiment of a sensor fusion system 12, wherein the sensor fusion system 12 is configured to perform a method according to an embodiment of the invention. The sensor fusion system 12 comprises at least one capturing device 14 and at least one electronic computing device 16. The autonomous machine 10 may be for example an at least partially autonomous motor vehicle.
[0025] The method is for analyzing the surroundings of the autonomous machine 10. In a first step S1, the capturing device 14 captures data of the surroundings. In a second step S2, the data captured by the capturing device 14 are processed into a plurality of information items, the extent of the plurality depending on the covariance, tolerance, or inverse model of the capturing device. In a third step S3, for each cell, the information items of the at least one capturing device 14 are summarized, and in a fourth step S4 the summarized information items corresponding to some small duration of capture time are combined into a common cell information, wherein a cell describes a predetermined region of the surroundings captured by the capturing device 14. In a fifth step S5, a mathematical glint filter is used by the electronic computing device 16 of the sensor fusion system 12 in order to determine, in a sixth step S6, a state of each cell depending on the cell information. The glint filter treats each input datum as a separate item of information about a glint and estimates this information using a collection of particles. In a seventh step S7, the plurality of cells are clustered depending on the determined state of each cell for analyzing the surroundings.
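The sequence of steps S1 to S7 can be summarized as a structural Python sketch; all function and parameter names below are assumptions chosen for illustration, and the concrete combination, filtering, and clustering routines are supplied by the caller:

    def fusion_cycle(devices, combine, glint_step, estimate_state, cluster):
        """Structural sketch of steps S1-S7 of the described method."""
        items = []
        for dev in devices:
            raw = dev.capture()                   # S1: capture surroundings data
            items.extend(dev.process(raw))        # S2: process into (cell, info) pairs

        per_cell = {}
        for cell, info in items:                  # S3: summarize items per cell
            per_cell.setdefault(cell, []).append(info)
        cell_infos = {c: combine(v) for c, v in per_cell.items()}   # S4: common cell info

        glints = glint_step(cell_infos)           # S5: run the glint filter
        states = {c: estimate_state(g) for c, g in glints.items()}  # S6: cell states
        return cluster(states)                    # S7: cluster cells by state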
[0026] According to an embodiment, the sensor fusion system 12 treats each input datum as a separate item of information and estimates this information using a small collection of particles. The sensor fusion system 12 processes the input data from each capturing device 14 to result in information about a small neighborhood called a cell. Multiple data from the same input source and/or from multiple input sources that are associated with the same cell are combined using techniques such as Dempster-Shafer combination to result in a unified input information for that cell. The unified information may continue to maintain a list of the different sources that have provided data for that cell. The information about the cell may include the time stamp of when the physical location associated with the cell was sensed by the capturing device 14 and may include estimates by the capturing device 14 of other parameters sensed at the physical location of the cell. The sensor fusion system 12 implements the glint filter for the cell information, which may be called a glint. The sensor fusion system 12 predicts particles and updates their weights as the result of various comparisons, such as of occupancy, velocity, or semantic information. The sensor fusion system 12 re-samples the particles of a glint based on the spatial distance to the deterministic predicted cell, optionally together with the weights of the particles. Thereby the sensor fusion system 12 ensures that the glint represents local information. The sensor fusion system 12 re-samples the collection of all glints based on the weights of the glints. The sensor fusion system 12 estimates the state of each cell as the statistically aggregated information of the glint at the cell. The sensor fusion system 12 may then cluster the cells based on their states.
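Dempster's rule of combination, one of the techniques named above, could be sketched for a simple occupancy frame as follows; representing each source by masses over {occupied, free, unknown} is an assumption for illustration:

    def combine_dempster_shafer(m1, m2):
        """Combine two basic belief assignments over {occ, free, unk}.

        m1, m2: dicts with keys 'occ', 'free', 'unk' whose values sum to 1.
        """
        # Conflict mass: one source says occupied while the other says free.
        conflict = m1['occ'] * m2['free'] + m1['free'] * m2['occ']
        norm = 1.0 - conflict
        if norm <= 0.0:
            raise ValueError("total conflict between the two sources")
        return {
            'occ':  (m1['occ'] * m2['occ'] + m1['occ'] * m2['unk']
                     + m1['unk'] * m2['occ']) / norm,
            'free': (m1['free'] * m2['free'] + m1['free'] * m2['unk']
                     + m1['unk'] * m2['free']) / norm,
            'unk':  (m1['unk'] * m2['unk']) / norm,
        }

For example, combining a camera mass of {occ: 0.6, free: 0.1, unk: 0.3} with a radar mass of {occ: 0.5, free: 0.2, unk: 0.3} normalizes away a conflict mass of 0.17 and yields approximately {occ: 0.76, free: 0.13, unk: 0.11}.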
[0027] In an embodiment, data from each capturing device 14, which may also be referred to as a sensor, is processed to result in information about a small neighborhood, for example a 10 cm x 10 cm region, which may be called the cell. Given a cell, it is possible that the same capturing device 14 provides multiple items of data for that cell and/or that multiple capturing devices 14 provide data for that cell. Such multiple data per cell are combined via techniques such as Dempster-Shafer combination to result in a unified measurement information for that cell, which may be called the cell information. This unified cell information may continue to maintain a list of the different capturing devices 14 that have provided data for that cell. The information about the cell could include the time stamp of when the physical location associated with the cell was sensed by the sensor, for example when the sensing took place, and could include estimates corresponding to height, velocity, presence or occupancy, absence or free-space, color, or, in general, whatever was sensed or detected at the physical location associated with the cell. The glint is the information of a cell, in other words the cell information. It is irrelevant whether the glints, that is the information of the various cells, are maintained as a grid of glints or, more efficiently and preferably, as a sparse collection of glints.
[0028] The mathematical glint filter is in particular a special particle filter, for example a Bayesian filter implemented by the Monte Carlo method using particles. The reason for a probabilistic rather than a deterministic approach is to account for the covariances or tolerances in the input data. In the sensor fusion system 12 implemented with the glint filter, the particles help account for the covariances in the inputs.
[0029] When a glint is detected for a cell for the first time, a small number of particles is spawned such that the distribution of information among the particles is the information of the glint. Some of the attributes of the glint may be unknown; for example, if none of the capturing devices 14 can measure velocity, then one has a choice in assigning an initial velocity. In such cases, the initialization phase of step S5 may make a single choice and initialize a single glint, or make multiple choices and initialize multiple glints for the same cell, which may be referred to as the multi-hypothesis glint filter. The fifth step S5 may further involve the phases of prediction, particle measurement update, glint measurement update, glint state estimation, cell state estimation, glint re-weighting, new glint initialization, particle re-sampling, and glint re-sampling.
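This initialization might be sketched as follows; the particle count, the noise scales, and the per-particle state layout [x, y, vx, vy] are illustrative assumptions:

    import numpy as np

    def init_glints(cell_info, cell_xy, n_particles=16,
                    velocity_hypotheses=((0.0, 0.0),)):
        """Spawn glints for a cell seen for the first time.

        A single entry in velocity_hypotheses gives the single-choice
        initialization; several entries give the multi-hypothesis variant.
        """
        glints = []
        for vx, vy in velocity_hypotheses:
            particles = np.empty((n_particles, 4))   # columns: x, y, vx, vy
            particles[:, 0:2] = cell_xy + np.random.normal(0.0, 0.05, (n_particles, 2))
            particles[:, 2] = vx + np.random.normal(0.0, 0.5, n_particles)
            particles[:, 3] = vy + np.random.normal(0.0, 0.5, n_particles)
            weights = np.full(n_particles, 1.0 / n_particles)
            glints.append({'particles': particles, 'weights': weights,
                           'info': cell_info})
        return glints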
[0030] The equations for measuring the glint are not applied to the information of the glint itself; rather, they are applied to the particles of the glint, and the glint is estimated as the statistically aggregated value of the particles. Thus, for example, for the next time step, the prediction of a glint is not the prediction of the glint itself but is the statistically aggregated information of the predicted particles of the glint. The aggregated locations of the particles form the predicted cell. The cell that is the prediction of the glint itself is termed the deterministic predicted cell and plays a role in the processing of the glint filter.
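One reading of this distinction, as a sketch under an assumed constant-velocity motion model and an assumed cell size:

    import numpy as np

    def predict(glint, dt, cell_size=0.1, process_noise=0.05):
        """Predict a glint and derive both kinds of predicted cell."""
        p, w = glint['particles'], glint['weights']

        # Deterministic predicted cell: propagate the glint's own aggregated
        # state with the motion model, without any noise.
        state = np.average(p, axis=0, weights=w)        # aggregated [x, y, vx, vy]
        det_xy = state[0:2] + state[2:4] * dt
        deterministic_cell = tuple(np.floor(det_xy / cell_size).astype(int))

        # Particle prediction: constant velocity plus process noise.
        p[:, 0:2] += p[:, 2:4] * dt + np.random.normal(0.0, process_noise, (len(p), 2))

        # Predicted cell: aggregated location of the predicted particles.
        mean_xy = np.average(p[:, 0:2], axis=0, weights=w)
        predicted_cell = tuple(np.floor(mean_xy / cell_size).astype(int))
        return predicted_cell, deterministic_cell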
[0031] Although at the beginning of each time step all the particles of a glint have the same weight, during the time step each particle is predicted and has its own weight updated. This updated weight may be the result of various comparisons, such as of occupancy, velocity, or semantic information.
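A minimal weight update along these lines might compare each particle against the measured cell information; the Gaussian velocity likelihood and the handling of occupancy as a glint-level weight are assumptions:

    import numpy as np

    def update_weights(glint, measured, vel_sigma=1.0):
        """Re-weight the particles of a glint against a measured cell info.

        measured: dict with an 'occ' mass and optionally a 'velocity' estimate.
        """
        p, w = glint['particles'], glint['weights']
        # Velocity comparison: per-particle Gaussian likelihood.
        if measured.get('velocity') is not None:
            dv = p[:, 2:4] - np.asarray(measured['velocity'])
            w *= np.exp(-0.5 * np.sum(dv * dv, axis=1) / vel_sigma ** 2)
            w /= w.sum()                       # renormalize the distribution
        # Occupancy comparison: taken here to affect the weight of the glint
        # as a whole, which is later used when resampling all glints.
        glint['weight'] = glint.get('weight', 1.0) * measured['occ']
        return glint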
[0032] The measured information for the glint at this new time step is itself a glint, which the new measurements may or may not provide at the predicted cell. The measurement update is then the combining of the predicted glint and this measured glint, where the measured glint might not exist if the new measurements contain no information for the predicted cell.
[0033] The state of each cell is estimated as the statistically aggregated information of the glints at the cell (step S6). The cells may then be clustered based on their states (step S7). This collection of clustered cells is the estimation of the surroundings. It is the result of the sensor fusion system 12 based on the glint filter.
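As an illustration of the clustering in step S7, occupied cells could be grouped by grid connectivity; the patent does not prescribe a clustering technique, so the connected-component approach and the occupancy threshold below are assumptions:

    from collections import deque

    def cluster_cells(states, occ_threshold=0.5):
        """Group occupied cells into clusters via 8-neighbour connectivity.

        states: dict mapping (ix, iy) cell indices to dicts with an 'occ' value.
        """
        occupied = {c for c, s in states.items() if s['occ'] >= occ_threshold}
        clusters, seen = [], set()
        for start in occupied:
            if start in seen:
                continue
            cluster, queue = [], deque([start])
            seen.add(start)
            while queue:
                cx, cy = queue.popleft()
                cluster.append((cx, cy))
                for dx in (-1, 0, 1):
                    for dy in (-1, 0, 1):
                        nb = (cx + dx, cy + dy)
                        if nb in occupied and nb not in seen:
                            seen.add(nb)
                            queue.append(nb)
            clusters.append(cluster)
        return clusters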
[0034] The glint filter also includes the re-sampling step. Particles of a glint are re-sampled based on their spatial distance to the deterministic predicted cell, optionally together with their weights. This ensures that the glint represents local information. The collection of all glints is re-sampled based on the weights of the glints. If, for some application, there is a tendency for the cells to end up with a large number of glints, it might be beneficial to down-sample the number of glints of such cells.
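The locality-preserving particle re-sampling could be sketched as follows; combining the particle weight with a Gaussian locality score, and the distance scale, are assumptions:

    import numpy as np

    def resample_particles(glint, deterministic_cell, cell_size=0.1, dist_sigma=0.3):
        """Resample particles, favouring those near the deterministic
        predicted cell so the glint keeps representing local information."""
        p, w = glint['particles'], glint['weights']
        center = (np.asarray(deterministic_cell) + 0.5) * cell_size
        dist = np.linalg.norm(p[:, 0:2] - center, axis=1)
        score = w * np.exp(-0.5 * (dist / dist_sigma) ** 2)   # weight x locality
        score /= score.sum()
        idx = np.random.choice(len(p), size=len(p), p=score)
        glint['particles'] = p[idx].copy()
        glint['weights'] = np.full(len(p), 1.0 / len(p))      # equal weights again
        return glint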
[0035] In alternative embodiments, autonomous machines 10 may include autonomous robots, carts, cars, trucks, drones, missiles, submarines, and surveillance systems. These autonomous machines 10 would have one or more of a variety of sensors such as radar, camera, sonar etc., and data from these sensors would need to be processed by a sensor fusion system 12 to result in a unified view of the surroundings. Specifically, the invention relates to a method for providing a unified view of the surroundings of an autonomous machine 10 by a sensor fusion system 12 of the autonomous machine 10, as well as to a corresponding sensor fusion system 12, with the sensor fusion system being implemented with a novel particle filter termed the glint filter.
Reference Signs
10 autonomous machine (motor vehicle)
12 sensor fusion system
14 capturing device
16 electronic computing device
S1 first step
S2 second step
S3 third step
S4 fourth step
S5 fifth step
S6 sixth step
S7 seventh step

Claims (5)

1. A method for analyzing the surroundings of an autonomous machine (10) by a sensor fusion system (12) of the autonomous machine (10), the method comprising the steps:
- capturing data of the surroundings by at least one capturing device (14) of the sensor fusion system (12) and processing a plurality of information of the data by the capturing device (14) (S1, S2);
- summarizing each information of the at least one capturing device (14) in at least one cell out of a plurality of cells to a common cell information, wherein a cell describes a predetermined region of the surroundings captured by the capturing device (14) (S3, S4);
- determining a state of each cell depending on the cell information by using a mathematical glint filter, which treats each input data as a separate item of information about a glint and estimates this information using a collection of particles, by an electronic computing device (16) of the sensor fusion system (12) (S5, S6); and
- clustering the plurality of cells depending on the determined state of each cell for analyzing the surroundings (S7).

2. The method according to claim 1, characterized in that the cell information describes a time stamp of when the cell information was captured, and/or a height of the cell and/or a velocity of the cell and/or an occupancy of the cell and/or a free-space of the cell and/or a color of the cell.

3. The method according to claim 1 or 2, characterized in that the cell information is a grid of information or a sparse collection of information.

4. The method according to any one of claims 1 to 3, characterized in that the state of the cell is estimated as the statistically aggregated information of the cell information at the cell.

5. A sensor fusion system (12) for an autonomous machine (10) for analyzing the surroundings of the autonomous machine (10), comprising at least one capturing device (14) and at least one electronic computing device (16), wherein the sensor fusion system (12) is configured to perform a method according to any one of claims 1 to 4.
GB2100786.9A 2021-01-21 2021-01-21 A method for creating a unified view of the surroundings of an autonomous machine by a sensor fusion system, as well as a corresponding sensor fusion system Withdrawn GB2603120A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2100786.9A GB2603120A (en) 2021-01-21 2021-01-21 A method for creating a unified view of the surroundings of an autonomous machine by a sensor fusion system, as well as a corresponding sensor fusion system


Publications (2)

Publication Number Publication Date
GB202100786D0 (en) 2021-03-10
GB2603120A (en) 2022-08-03

Family ID: 74859067

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2100786.9A Withdrawn GB2603120A (en) 2021-01-21 2021-01-21 A method for creating a unified view of the surroundings of an autonomous machine by a sensor fusion system, as well as a corresponding sensor fusion system

Country Status (1)

Country Link
GB (1) GB2603120A (en)



Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6944566B2 (en) 2002-03-26 2005-09-13 Lockheed Martin Corporation Method and system for multi-sensor data fusion using a modified dempster-shafer theory

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TANZMEISTER GEORG ET AL: "Grid-based mapping and tracking in dynamic environments using a uniform evidential environment representation", 2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), IEEE, 31 May 2014 (2014-05-31), pages 6090 - 6095, XP032650735, DOI: 10.1109/ICRA.2014.6907756 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2617557A (en) * 2022-04-08 2023-10-18 Mercedes Benz Group Ag A display device for displaying an information of surroundings of a motor vehicle as well as a method for displaying an information

Also Published As

Publication number Publication date
GB202100786D0 (en) 2021-03-10


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)