WO2022115993A1 - Method and apparatus for tuning sensor fusion weights - Google Patents

Info

Publication number
WO2022115993A1
Authority
WO
WIPO (PCT)
Prior art keywords
fusion, station, fusion weights, weights, multiple sensors
Prior art date
Application number
PCT/CN2020/133093
Other languages
French (fr)
Inventor
Marc Patrick ZAPF
Jinyao LIANG
Boqi REN
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH
Priority to CN202080107583.1A (published as CN116648600A)
Priority to PCT/CN2020/133093 (published as WO2022115993A1)
Priority to DE112020007428.5T (published as DE112020007428T5)
Publication of WO2022115993A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 3/08 Indicating or recording apparatus with provision for safeguarding the apparatus, e.g. against abnormal operation or breakdown
    • G01D 21/02 Measuring two or more variables by means not covered by a single other subclass
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/865 Combination of radar systems with lidar systems
    • G01S 13/867 Combination of radar systems with cameras
    • G01S 13/91 Radar or analogous systems specially adapted for specific applications, for traffic control
    • G01S 13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles

Definitions

  • aspects of the present disclosure relate generally to sensor fusion, and more particularly, to tuning fusion weights for fusing information from multiple sensors of a station.
  • Intelligent transportation systems such as traffic monitoring apparatus, autonomous vehicles and so on rely on a variety of sensors to perceive the environment. For example, various operations, such as detecting and tracking vehicles, pedestrians, or other objects of the environment, speed detection, position detection, and so on, can be performed based on data captured by the variety of sensors.
  • a vehicle may have different types of sensors equipped, such as cameras, LiDAR (Light Detection and Ranging), radar, ultrasonic sensors, and so on.
  • the different types of sensors may have their respective advantages and weaknesses.
  • cameras can perceive color and texture information from the world and are good at classifying objects; however, their detection range is limited and they perform poorly in limited lighting or adverse weather conditions.
  • LiDARs provide precise distance information, have a large detection range and are able to detect small objects; they work well at night but do not provide color information, and their performance decreases in heavy rain.
  • Radars provide precise distance and velocity information and work well in bad weather conditions but have a rather low resolution.
  • in order to improve the performance of sensor-based measurements, the data from different sensors can be fused, which is known as sensor fusion.
  • there is a need for enhancement to improve the reliability and robustness of the sensor fusion technique.
  • a method for tuning sensor fusion weights for a station comprises: determining one or more conditions relating to operation performance of multiple sensors of the station; determining fusion weights for the multiple sensors of the station based on the one or more conditions; obtaining fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station; adjusting the fusion weights for the multiple sensors of the station based on the obtained fusion weights for the multiple neighboring stations; and performing fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.
  • a method for tuning sensor fusion weights for a station comprises: determining noise variance metrics for multiple sensors of the station; determining at least one of a weather condition and a lighting condition for the station; determining fusion weights for the multiple sensors of the station based on the noise variance metrics and the at least one of the weather condition and the lighting condition; and performing fusion of information obtained from the multiple sensors of the station based on the fusion weights.
  • a computer system which comprises one or more processors and one or more storage devices storing computer-executable instructions that, when executed, cause the one or more processors to perform the operations of the method as mentioned above as well as to perform the operations of the method provided according to aspects of the disclosure.
  • there is provided one or more computer-readable storage media storing computer-executable instructions that, when executed, cause one or more processors to perform the operations of the method as mentioned above as well as to perform the operations of the method provided according to aspects of the disclosure.
  • the apparatus comprises a plurality of sensors for obtaining sensor data and a processing system to execute instructions that cause the processing system to perform the operations of the method as mentioned above as well as to perform the operations of the method provided according to aspects of the disclosure.
  • by using the sensor fusion weight tuning method of the disclosure, the reliability and robustness of operations utilizing sensor fusion may be improved.
  • Fig. 1 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
  • Fig. 2 is a block diagram illustrating exemplary sensor fusion methods according to aspects of the disclosure.
  • Fig. 3 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
  • Fig. 4 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
  • Fig. 5 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • Fig. 6 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • Fig. 7 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • Fig. 8 is a flowchart illustrating an exemplary method for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • Fig. 9 is a flowchart illustrating an exemplary method for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • Fig. 10 is a block diagram illustrating an exemplary computing system 1000 according to aspects of the disclosure.
  • Fig. 1 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
  • the apparatus 1 illustrated in Fig. 1 may be a vehicle such as an autonomous vehicle, a stationary surveillance device such as one mounted along a road, or the like, or may be a part of such a vehicle or surveillance device.
  • the apparatus 1 can be equipped with various sensors 10.
  • the various sensors 10 may include a camera system 110, a LiDAR system 120, and a radar system 130.
  • the set of sensors 10 of the apparatus 1 may include other types of sensors in addition to the illustrated example sensors 110-130, for example, various other sensor resources such as sonar, ultrasonic sensors, proximity sensors, infrared sensors, wheel speed sensors, rain sensors and so on may be included in the set of sensors 10.
  • the set of sensors 10 of the apparatus 1 may not include all three types of sensors 110-130; any combination of at least two of the sensors 110-130 may be equipped on the apparatus 1.
  • the apparatus 1 may include a processing system 20.
  • the processing system 20 may be implemented in various ways, for example, the processing system 20 may include one or more processors and/or controllers, which may execute software to perform various operations or functions, such as operations or functions according to various aspects of the disclosure.
  • the processing system 20 may receive sensor data from the sensors 10 such as sensors 110-130, and perform various operations by analyzing the sensor data.
  • the processing system 20 may include a condition detection module 210, a fusion weight tuning module 220, and a sensor fusion module 230. It is appreciated that the modules 210-230 may be implemented in various ways, for example, may be implemented as software modules or functions which are executable by processors and/or controllers.
  • the condition detection module 210 may be configured to determine one or more conditions relating to operation performance of at least a part of multiple sensors 10 of the apparatus 1.
  • the condition relating to the operation performance of a sensor may be a noise variance metric of the sensor.
  • the noise variance metric may describe how reliable the information from the sensor is at a given time point.
  • image noise may be utilized as a parameter to determine the noise variance metric for the camera sensors 110.
  • the signal noise may be utilized as a parameter to determine the noise variance metric for the LiDAR sensors 120 or the radar sensors 130.
  • the object detection confidence may be utilized as a parameter to determine the noise variance metric for a sensor 10. It is appreciated that any method for determining the noise variance metric for a sensor 10 is applicable to the embodiments of the disclosure, and the disclosure is not limited to a specific method for determining noise variance metric of a sensor.
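
As a minimal sketch of how a noise variance metric might be computed and mapped to a fusion weight, the following snippet uses inverse-variance weighting. The disclosure does not prescribe a specific formula, so the function names, the choice of residuals and the `eps` regularizer below are illustrative assumptions only:

```python
import numpy as np

def noise_variance_metric(residuals: np.ndarray) -> float:
    """Estimate a noise variance metric from recent sensor residuals.

    The residuals could be image-noise samples for a camera, signal-noise
    samples for a LiDAR or radar, or (1 - detection confidence) values;
    the text leaves the exact parameter open.
    """
    return float(np.var(residuals))

def weight_from_variance(variance: float, eps: float = 1e-6) -> float:
    # Inverse-variance weighting: a noisier sensor receives a smaller weight.
    return 1.0 / (variance + eps)

# Example: a noisier camera channel yields a smaller raw fusion weight.
rng = np.random.default_rng(0)
w_camera = weight_from_variance(noise_variance_metric(rng.normal(0.0, 2.0, 100)))
w_lidar = weight_from_variance(noise_variance_metric(rng.normal(0.0, 0.5, 100)))
```
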
  • the condition relating to the operation performance of a sensor may be a weather condition in which the sensors 10 of the apparatus work.
  • the condition detection module 210 may determine the weather condition based on the sensor data from some sensors 10. For example, a current weather condition may be determined based on the image data captured by the camera system 110.
  • an image classification model may be utilized to determine the current weather condition by classifying the captured images as one of multiple weather conditions, which may include for example sunny, foggy, misty, rainy, snowy, and so on. It is appreciated that any method for determining the weather condition is applicable to the embodiments of the disclosure, and the disclosure is not limited to a specific method for determining the weather condition.
  • the condition relating to the operation performance of a sensor may be a lighting condition in which the sensors 10 of the apparatus work.
  • the condition detection module 210 may determine the lighting condition based on the sensor data from some sensors 10. For example, a current lighting condition may be determined based on the image data captured by the camera system 110.
  • an image classification model may be utilized to determine the current lighting condition by classifying the captured images as one of multiple lighting conditions, which may include for example sun glare, normal lighting, low lighting, night, and so on.
  • the lighting condition may be determined based on the intensity of the captured image. It is appreciated that any method for determining the lighting condition is applicable to the embodiments of the disclosure, and the disclosure is not limited to a specific method for determining the lighting condition.
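
To make the condition detection concrete, here is a small sketch of the intensity-based lighting classification mentioned above; the thresholds and label strings are assumptions, and a weather classifier would typically be a trained image classification model rather than a hand-written rule:

```python
import numpy as np

# Exemplary condition labels from the text; a trained classifier would
# map camera frames to one of the weather classes.
WEATHER_CONDITIONS = ["sunny", "foggy", "misty", "rainy", "snowy"]
LIGHTING_CONDITIONS = ["sun glare", "normal lighting", "low lighting", "night"]

def classify_lighting(image: np.ndarray) -> str:
    """Heuristic lighting classification from mean image intensity
    (one of the options the text allows; thresholds are assumed)."""
    mean_intensity = float(image.mean())
    if mean_intensity > 220.0:
        return "sun glare"
    if mean_intensity > 100.0:
        return "normal lighting"
    if mean_intensity > 40.0:
        return "low lighting"
    return "night"

dark_frame = np.full((480, 640), 20, dtype=np.uint8)
print(classify_lighting(dark_frame))  # -> "night"
```
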
  • the fusion weight tuning module 220 may be configured to determine the fusion weights for respective sensors based on the condition detected by the condition detection module 210.
  • the fusion weight tuning module 220 may determine the fusion weights for sensors 110, 120 and 130 respectively based on the noise variance metrics of the sensors 110, 120 and 130, and then the sensor fusion module 230 may perform fusion of the information obtained from the sensors 110, 120 and 130 based on their fusion weights.
  • the fusion weight tuning module 220 may determine the fusion weights for sensors 110, 120 and 130 respectively based on the weather condition detected by the condition detection module 210, and then the sensor fusion module 230 may perform fusion of the information obtained from the sensors 110, 120 and 130 based on their fusion weights.
  • the fusion weight for the cameras 110 may be reduced in accordance with the severity of the weather condition, as cameras work less well in bad weather; thus fusion weights for the cameras 110 may be decreased in bad weather conditions and may be increased in good weather conditions.
  • fusion weight adjustment may be defined based on intrinsic sensor characteristics. For example, radar sensors 130 usually work well in bad weather conditions, so fusion weights for the radar sensors 130 may be increased in bad weather and may be decreased in good weather.
  • the performance of LiDAR sensors 120 decreases in heavy rain, so fusion weights for the LiDAR sensors 120 may be decreased in heavy rain.
  • the exemplary weather conditions 1-4 may be, for example, sunny, foggy, rainy and snowy.
  • the fusion weights Wc i, Wl i and Wr i for each of the sensors 110-130 under each of the weather conditions i = 1-4 may be defined in Table 1.
  • when, for example, weather condition 3 (rainy) is detected, the fusion weight tuning module 220 may set the fusion weights for the camera 110, LiDAR 120 and radar 130 to Wc 3, Wl 3 and Wr 3, respectively.
  • the fusion weights as exampled in Table 1 may be defined heuristically, for example, may be defined according to experiments or experience and may be updated as appropriate.
  • the fusion weights may be set as a zero value or non-zero value, and the disclosure is not limited to specific fusion weight values. It is appreciated that the fusion weights of respective sensors in respective weather conditions may be determined in any applicable way. For example, an image classification model may be used to decide a fusion weight for a certain sensor in consideration of weather condition by classifying images from the camera 110.
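
A heuristic weight table of the kind Table 1 describes could be encoded as a simple lookup; the numeric values below are purely illustrative assumptions, not values from the disclosure, and an analogous table keyed on lighting condition would play the role of Table 2:

```python
# Illustrative stand-in for Table 1: fusion weights Wc/Wl/Wr per weather condition.
WEATHER_WEIGHT_TABLE = {
    "sunny": {"camera": 1.0, "lidar": 1.0, "radar": 0.8},
    "foggy": {"camera": 0.3, "lidar": 0.7, "radar": 1.0},
    "rainy": {"camera": 0.5, "lidar": 0.6, "radar": 1.0},
    "snowy": {"camera": 0.4, "lidar": 0.5, "radar": 1.0},
}

def weather_fusion_weights(condition: str) -> dict:
    """Look up the heuristic per-sensor fusion weights for a weather condition."""
    return WEATHER_WEIGHT_TABLE[condition]

print(weather_fusion_weights("rainy"))  # analog of Wc 3, Wl 3, Wr 3
```
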
  • the fusion weight tuning module 220 may determine the fusion weights for sensors 110, 120 and 130 respectively based on the lighting condition or lighting level detected by the condition detection module 210, and then the sensor fusion module 230 may perform fusion of the information obtained from the sensors 110, 120 and 130 based on their fusion weights.
  • the fusion weight for the cameras 110 may be reduced in accordance with the severity of the lighting condition, as cameras work less well in bad lighting; thus fusion weights for the cameras 110 may be decreased in bad lighting conditions and may be increased in good lighting conditions.
  • LiDAR sensors 120 and radar sensors 130 both work well in bad lighting conditions, so fusion weights for the LiDAR sensors 120 and radar sensors 130 may be increased in bad lighting conditions and may be decreased in good lighting conditions.
  • the exemplary lighting conditions 1-4 may be, for example, sun glare, normal lighting, low lighting and night.
  • the fusion weights Wc i, Wl i and Wr i for each of the sensors 110-130 under each of the lighting conditions i = 1-4 may be defined in Table 2.
  • when, for example, lighting condition 4 (night) is detected, the fusion weight tuning module 220 may set the fusion weights for the camera 110, LiDAR 120 and radar 130 to Wc 4, Wl 4 and Wr 4, respectively.
  • the fusion weights as exampled in Table 2 may be defined heuristically, for example, may be defined according to experiments or experience and may be updated as appropriate.
  • the fusion weights may be set as a zero value or non-zero value, and the disclosure is not limited to specific fusion weight values. It is appreciated that the fusion weights of respective sensors in respective lighting conditions may be determined in any applicable way. For example, an image classification model may be used to decide a fusion weight for a certain sensor in consideration of lighting condition by classifying images from the camera 110.
  • Fig. 2 is a block diagram illustrating sensor fusion methods schematically according to aspects of the disclosure.
  • Sensor fusion can generally be divided into three main categories: low-level sensor fusion (LLF) , mid-level fusion (MLF) , and high-level sensor fusion (HLF) .
  • the diagrams A and B in Fig. 2 respectively show HLF and LLF architectures.
  • in HLF, object detection operations as shown in blocks 240A to 240C may be performed on sensor data captured by respective sensors, examples of which may be the cameras 110, LiDAR 120 and radar 130 shown in Fig. 1.
  • each of the object detection modules 240A to 240C may generate a list of objects based on the sensor data from the corresponding sensor.
  • the lists of objects generated by the modules 240A to 240C may be provided to the sensor fusion module 230, which is also shown in Fig. 1.
  • the sensor fusion module 230 may fuse the object lists, where the respective object lists obtained from different sensors such as sensors 110-130 may be given different weights in the fusion.
  • in LLF, raw data from the sensors such as sensors 110-130 are fused in the fusion module 230, where the raw data obtained from different sensors such as sensors 110-130 may be given different weights in the fusion. Then the detection module 240 performs object detection based on the fused data.
  • MLF is an abstraction on top of LLF.
  • in MLF, features are extracted from raw data of multiple sensors such as sensors 110-130, and the extracted features are fused by the fusion module 230, where the intermediate data such as the extracted features obtained from different sensors such as sensors 110-130 may be given different weights in the fusion. Then the detection module 240 performs object detection based on the fused data.
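
As an illustration of weighted fusion in the HLF style, the sketch below fuses per-sensor object position estimates with a normalized weighted average; the disclosure does not fix the fusion rule, so this is only one plausible instantiation with assumed sensor names and values:

```python
import numpy as np

def fuse_estimates(estimates: dict, weights: dict) -> np.ndarray:
    """Normalized weighted average of per-sensor object position estimates."""
    total = sum(weights[name] for name in estimates)
    fused = sum(weights[name] * np.asarray(pos, dtype=float)
                for name, pos in estimates.items())
    return fused / total

fused_position = fuse_estimates(
    {"camera": [10.2, 3.1], "lidar": [10.0, 3.0], "radar": [10.4, 3.3]},
    {"camera": 0.3, "lidar": 1.0, "radar": 0.7},
)
```
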
  • Fig. 3 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
  • the condition detection module 210 may include a weather condition detection module 2110, a lighting condition detection module 2120, a camera noise detection module 2130, a LiDAR noise detection module 2140, and a radar noise detection module 2150.
  • the weather condition detection module 2110 may determine a current weather condition based on the images captured by the camera system 110.
  • the weather condition detection module 2110 may process a predefined number of image frames which are captured in a predefined time period, so as to determine the current weather based on these image frames. Predefining the time period helps avoid rapid shifts of fusion weights due to short-term, extreme weather measurements.
  • the weather condition detection module 2110 may be implemented as an image classification model which classifies the captured images as one of multiple weather conditions, such as sunny, foggy, misty, rainy, snowy, and so on. It is appreciated that any method for determining the weather condition is applicable for the implementation of the weather condition detection module 2110.
  • the lighting condition detection module 2120 may determine a current lighting condition based on the images captured by the camera system 110.
  • the lighting condition detection module 2120 may process a predefined number of image frames which are captured in a predefined time period, so as to determine the current lighting condition based on these image frames. Predefining the time period helps avoid rapid shifts of fusion weights due to short-term, extreme lighting measurements.
  • the lighting condition detection module 2120 may be implemented as an image classification model which classifies the captured images as one of multiple lighting conditions, such as sun glare, normal lighting, low lighting, night, and so on. It is appreciated that any method for determining the lighting condition is applicable for the implementation of the lighting condition detection module 2120.
  • the weather condition detection module 2110 and the lighting condition detection module 2120 may be implemented as a single image classification model which classifies the captured images as one of multiple weather conditions and one of multiple lighting conditions. It is appreciated that any method for determining the weather condition and lighting condition together is applicable for implementing the weather condition detection module 2110 and the lighting condition detection module 2120 as one module.
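
One way to realize the "predefined number of image frames in a predefined time period" smoothing described above is a majority vote over a sliding window, so a single extreme frame cannot abruptly flip the detected condition (and thus the fusion weights); the window size and the voting rule are assumptions:

```python
from collections import Counter, deque

class SmoothedConditionDetector:
    """Majority vote over the last N per-frame classifications, to avoid
    rapid fusion-weight shifts caused by short-term, extreme measurements."""

    def __init__(self, window: int = 30):
        self.history = deque(maxlen=window)

    def update(self, frame_condition: str) -> str:
        self.history.append(frame_condition)
        return Counter(self.history).most_common(1)[0][0]

detector = SmoothedConditionDetector(window=5)
for condition in ["sunny", "sunny", "foggy", "sunny", "sunny"]:
    current = detector.update(condition)
print(current)  # -> "sunny": the single foggy frame is outvoted
```
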
  • the camera noise detection module 2130, the LiDAR noise detection module 2140 and the radar noise detection module 2150 may respectively determine current noise variance metrics for the camera system 110, LiDAR system 120 and radar system 130 based on a parameter of the sensors.
  • the parameter may be the signal noise of the sensors 110-130.
  • the parameter may be the object detection confidence. It is appreciated that any method for determining the noise variance metric for a sensor is applicable in aspects of the disclosure.
  • the processing system 20 of the apparatus 1 includes object detection modules 240A to 240C.
  • the object detection modules 240A to 240C respectively perform object detection operations on sensor data captured by respective sensors such as the cameras 110, LiDAR 120 and radar 130, so as to generate a list of objects. It is appreciated that, in the example of taking the object detection confidence as the parameter for determining the noise variance metric, the operations of the noise detection modules 2130-2150 may be associated with the operations of the object detection modules 240A to 240C.
  • modules 240A to 240C may perform other operations or functions in other embodiments, for example, the operations performed by the modules 240A to 240C may be position detection, speed detection, tracking and so on, and accordingly the corresponding operation confidence may be taken as the parameter for determining the noise variance metric by the noise detection modules 2130-2150.
  • the fusion weight tuning module 220 may determine the fusion weights for sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 and the weather condition, which are respectively detected by the camera noise detection module 2130, the LiDAR noise detection module 2140 and the weather condition detection module 2110.
  • the fusion weights for the sensors 110 and 120 determined based on the noise variance metrics of the sensors 110 and 120 may be adjusted utilizing the fusion weights for the sensors 110 and 120 determined based on the current weather, so as to obtain the adjusted fusion weights for the sensors 110 and 120.
  • the adjusted fusion weight for one sensor may be the product of the fusion weights for the sensor associated with the respective conditions.
  • for example, the adjusted fusion weight for the camera 110 may be the product of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current weather.
  • the adjusted fusion weight for the LiDAR 120 may be the product of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current weather.
  • alternatively, the adjusted fusion weight for one sensor may be the sum of the fusion weights for the sensor associated with the respective conditions.
  • for example, the adjusted fusion weight for the camera 110 may be the sum of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current weather.
  • the adjusted fusion weight for the LiDAR 120 may be the sum of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current weather.
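
The product and sum combinations just described can be written compactly as below; the final normalization step, which makes the adjusted weights sum to one, is an added assumption rather than something the text requires:

```python
def combine_condition_weights(per_condition: list, mode: str = "product") -> dict:
    """Combine several per-condition weight tables (noise, weather, lighting)
    into one adjusted, normalized weight per sensor."""
    sensors = list(per_condition[0].keys())
    if mode == "product":
        combined = {s: 1.0 for s in sensors}
        for table in per_condition:
            for s in sensors:
                combined[s] *= table[s]
    else:  # mode == "sum"
        combined = {s: sum(table[s] for table in per_condition) for s in sensors}
    total = sum(combined.values())
    return {s: w / total for s, w in combined.items()}

adjusted = combine_condition_weights([
    {"camera": 0.8, "lidar": 1.2},  # from noise variance metrics
    {"camera": 0.3, "lidar": 0.7},  # from the current weather (e.g. foggy)
])
```
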
  • the sensor fusion module 230 may fuse the lists of objects received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220. By adjusting the fusion weights determined based on the noise variance metrics with the fusion weights determined based on the current weather, the reliability of the sensor fusion may be improved, and potential issues associated with using only the noise variance metrics for fusion weight tuning may be mitigated. For example, in foggy conditions, camera data may be used to detect objects at close distance with high confidence, but the system may be unaware that the distance tracking capability based on the camera data is impaired.
  • in this case, the fusion weights for the camera 110 may be reduced in a timely manner and the fusion weights for the LiDAR 120 may be increased, so as to mitigate the potential problem in such circumstances.
  • the fusion weight tuning module 220 may determine the fusion weights for sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 and the lighting condition, which are respectively detected by the camera noise detection module 2130, the LiDAR noise detection module 2140 and the lighting condition detection module 2120.
  • the fusion weights for the sensors 110 and 120 determined based on the noise variance metrics of the sensors 110 and 120 may be adjusted utilizing the fusion weights for the sensors 110 and 120 determined based on the current lighting level, so as to obtain the adjusted fusion weights for the sensors 110 and 120.
  • for example, the adjusted fusion weight for the camera 110 may be the product of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current lighting condition.
  • the adjusted fusion weight for the LiDAR 120 may be the product of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current lighting condition.
  • alternatively, the adjusted fusion weight for the camera 110 may be the sum of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current lighting condition.
  • the adjusted fusion weight for the LiDAR 120 may be the sum of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current lighting condition.
  • the sensor fusion module 230 may fuse the lists of objects received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220.
  • the fusion weight tuning module 220 may determine the fusion weights for sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 as well as the weather condition and the lighting condition, which are respectively detected by the camera noise detection module 2130, the LiDAR noise detection module 2140, the weather condition detection module 2110 and the lighting condition detection module 2120.
  • the fusion weights for the sensors 110 and 120 determined based on the noise variance metrics of the sensors 110 and 120 may be adjusted utilizing the fusion weights for the sensors 110 and 120 determined based on the current weather and the fusion weights for the sensors 110 and 120 determined based on the current lighting level, so as to obtain the adjusted fusion weights for the sensors 110 and 120.
  • for example, the adjusted fusion weight for the camera 110 may be the product of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110, the fusion weight for the camera 110 determined based on the current weather and the fusion weight for the camera 110 determined based on the current lighting condition.
  • the adjusted fusion weight for the LiDAR 120 may be the product of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120, the fusion weight for the LiDAR 120 determined based on the current weather and the fusion weight for the LiDAR 120 determined based on the current lighting condition.
  • alternatively, the adjusted fusion weight for the camera 110 may be the sum of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110, the fusion weight for the camera 110 determined based on the current weather and the fusion weight for the camera 110 determined based on the current lighting condition.
  • the adjusted fusion weight for the LiDAR 120 may be the sum of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120, the fusion weight for the LiDAR 120 determined based on the current weather and the fusion weight for the LiDAR 120 determined based on the current lighting condition.
  • the sensor fusion module 230 may fuse the lists of objects received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220.
  • the fusion weight tuning module 220 may determine the fusion weights for any two or more of the sensors 110, 120 and 130 based on the noise variance metrics of the two or more sensors as well as at least one of the weather condition and the lighting condition. For example, the fusion weights for the two or more sensors determined based on their noise variance metrics may be adjusted utilizing at least one of the fusion weights for the two or more sensors determined based on the current weather and the fusion weights for the two or more sensors determined based on the current lighting level, so as to obtain the adjusted fusion weights for the two or more sensors.
  • the sensor fusion module 230 may fuse the lists of objects received from the corresponding two or more of the object detection modules 240A-240C based on the fusion weights received from the fusion weight tuning module 220.
  • Fig. 4 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
  • the structure illustrated in Fig. 4 differs from that of Fig. 3 in that the object detection module 240 operates after the sensor fusion module 230.
  • the sensor fusion module 230 performs LLF fusion on the raw data captured by the different sensors such as sensors 110-130, or performs MLF fusion on the extracted features obtained from different sensors such as sensors 110-130, as explained above with reference to Fig. 2. Then the detection module 240 performs object detection based on the fused data.
  • Other parts of Fig. 4 are similar to those of Fig. 3 and the details are not repeated here.
  • the information to be fused at the sensor fusion module 230 may be target measurements such as detected objects, speeds, positions and so on, raw data captured by the different sensors, or intermediate data such as extracted features.
  • the different types of data to be fused at the sensor fusion module 230 may be commonly referred to as information obtained from sensors.
  • Fig. 5 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • each of the cars 510-560 may be or may include the apparatus 1 illustrated in Figs. 1, 3 and 4, and may be referred to as a station, a movable station and so on.
  • the station 510 may determine one or more conditions relating to operation performance of its multiple sensors such as the camera 110, the LiDAR 120, the radar 130 and so on. As illustrated above, the one or more conditions may be the noise variance metrics of the respective sensors 10, the weather condition, the lighting condition and so on. Then the station 510 may determine fusion weights for the multiple sensors based on the one or more conditions. It is appreciated that the multiple sensors may be any two or more of the sensors 110-130 as well as other possible sensors equipped on the station 510. There may be a situation in which the fusion weights for the multiple sensors determined by the station 510 based on its own sensor data are not reliably or consistently accurate due to temporary and extreme measurements from the sensors, especially for the cameras 110.
  • the station 510 may obtain fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station 510, and adjust the fusion weights for the multiple sensors of the station 510 based on the obtained fusion weights for the multiple neighboring stations. It is appreciated that the station 510 may communicate wirelessly with the neighboring stations via various networks, such as a Wireless Local Area Network (WLAN), a 4G wireless communication network, a 5G wireless communication network, future evolved networks like 6G, and so on. The communication network is not illustrated in Fig. 5 for simplicity.
  • the neighboring stations may be for example at least a part of the stations 520 to 560 as illustrated in Fig. 5 as well as other stations as appropriate.
  • the predefined range of the station 510 may be a predefined distance from the station 510.
  • the neighboring stations located within the predefined distance from the station 510 may have similar conditions such as weather condition and lighting condition, and may provide their current fusion weights to the station 510 for it to adjust its fusion weights.
  • the predefined range of the station 510 may be a predefined number of neighboring stations around the station 510.
  • the station 510 may receive fusion weights from the predefined number of neighboring stations around the station 510.
  • the predefined range of the station 510 may be a combination of the above mentioned predefined distance from the station 510 and the above mentioned predefined number of neighboring stations around the station 510.
  • the stations 520-560 are taken as the neighboring stations and the sensors 110-120 are taken as the multiple sensors in the following description.
  • the fusion weight tuning module 220 of the station 510 may perform the function of adjusting the fusion weights for the multiple sensors 110-120 of the station 510 based on the obtained fusion weights of the multiple sensors 110-120 of the multiple neighboring stations 520-560. Then the sensor fusion module 230 of the station 510 may perform fusion of information obtained from the multiple sensors 110-120 of the station 510 based on the adjusted fusion weights.
  • the station 510 may adjust the fusion weight for its sensor 110 by averaging the fusion weight for its sensor 110 and the obtained fusion weights for the sensor 110 of the multiple neighboring stations 520-560.
  • the station 510 may adjust the fusion weight for its sensor 120 by averaging the fusion weight for its sensor 120 and the obtained fusion weights for the sensor 120 of the multiple neighboring stations 520-560.
  • the station 510 may adjust the fusion weight for its sensor 110 or 120 by computing a weighted average of the fusion weights for the sensor 110 or 120 across the multiple stations 510-560.
  • the averaging coefficients applied to the fusion weights from the neighboring stations 520-560 may be inversely proportional to the distances between the respective neighboring stations 520-560 and the station 510.
  • the station 510 may discard abnormal values from the fusion weights for the sensor 110 or 120 of the multiple stations 510-560, and then adjust the fusion weight for its sensor 110 or 120 by averaging or weighted averaging the remaining fusion weights for the sensor 110 or 120 of the multiple stations 510-560.
  • the abnormal values may be determined in any suitable way.
  • the abnormal values may be determined based on a predefined variance range, where a value among the multiple fusion weight values of the multiple stations 510-560 is taken as an abnormal value if its deviation from the mean is outside the variance range; otherwise, it is taken as a normal value.
  • when the fusion weight determined by the station 510 for its sensor 110 or 120 is a normal value, the station 510 may take the fusion weight as its adjusted fusion weight; otherwise, it may discard this abnormal value and obtain the adjusted fusion weight for the sensor 110 or 120 based on the fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560.
  • the station 510 may obtain the adjusted fusion weight for its sensor 110 or 120 by voting among the fusion weights for the sensor 110 or 120 of the multiple stations 510-560. For example, if a majority of the fusion weights for the sensor 110 or 120 of the multiple stations 510-560 indicate the same value, the station 510 may take this value as its adjusted fusion weight for its sensor 110 or 120.
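
The neighbor-based adjustment strategies above might look like the following sketch: a plain average, a weighted average with coefficients inversely proportional to distance, and outlier rejection against the mean of the neighborhood. The `max_dev` threshold and the own-weight coefficient are assumed parameters:

```python
import numpy as np

def average_adjust(own_w: float, neighbor_ws: list) -> float:
    """Plain average of the station's own weight and its neighbors' weights."""
    return float(np.mean([own_w] + list(neighbor_ws)))

def inverse_distance_adjust(own_w: float, neighbor_ws: list,
                            distances: list, own_coef: float = 1.0) -> float:
    """Weighted average where each neighbor's coefficient is inversely
    proportional to its distance from the station."""
    coefs = np.array([own_coef] + [1.0 / d for d in distances])
    values = np.array([own_w] + list(neighbor_ws))
    return float(np.dot(coefs, values) / coefs.sum())

def reject_abnormal_then_average(own_w: float, neighbor_ws: list,
                                 max_dev: float = 0.2) -> float:
    """Discard values deviating from the mean by more than max_dev,
    then average the remaining (normal) values."""
    values = np.array([own_w] + list(neighbor_ws))
    kept = values[np.abs(values - values.mean()) <= max_dev]
    return float(kept.mean())

print(reject_abnormal_then_average(0.9, [0.45, 0.5, 0.55]))  # own value is outvoted
```
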
  • the fusion weight tuning module 220 of the station 510 may determine fusion weights for its sensor 110 or 120 based on the noise variance metrics of its sensors 110 and 120 and the weather and/or lighting condition.
  • the station 510 may obtain the fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560, which are determined based on the noise variance condition and the weather and/or lighting condition for the multiple neighboring stations 520-560. Then the station 510 may adjust the fusion weights for its sensor 110 or 120 based on the obtained fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560.
  • the fusion weight tuning module 220 of the station 510 may determine first fusion weights for its sensor 110 or 120 based on the noise variance metrics of the sensors 110 and 120, and determine second fusion weights and/or third fusion weights for its sensor 110 or 120 based on the weather and/or lighting condition.
  • the station 510 may obtain second fusion weights and/or third fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560, which are determined based on the weather and/or lighting condition for the multiple neighboring stations 520-560.
  • the station 510 may adjust the second fusion weights and/or the third fusion weights for its sensor 110 or 120 based on the obtained second fusion weights and/or third fusion weights for the multiple neighboring stations 520-560.
  • the station 510 may obtain the adjusted fusion weights for its sensor 110 or 120 based on the first fusion weights determined by itself and the adjusted second fusion weights and/or third fusion weights for its sensor 110 or 120.
  • Fig. 6 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • each of the devices 610-660 may be or may include the apparatus 1 illustrated in Figs. 1, 3 and 4, and may be referred to as a station, a stationary station and so on.
  • the operations of the surveillance devices 610-660 are similar to those of the cars 510-560 illustrated in Fig. 5 in terms of the aspects of the disclosure. Taking the device 610 as an example, its operation for adjusting the fusion weights for its sensors based on information from neighboring stations such as stations 620-660 is similar to that of the car 510 as detailed above; therefore, the details are not repeated.
  • the multiple neighboring stations located within the predefined range of the station 610 may be a predefined set of neighboring stations related to the station 610.
  • the predefined set of neighboring stations related to the station 610 may be set, for example, at the time of deployment of the stations 610-660.
  • Fig. 7 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • multiple surveillance devices 710-760 and cars 770-790 each may be or may include the apparatus 1 illustrated in Figs. 1, 3 and 4, and may be referred to as a station.
  • each of the surveillance devices 710-760 and cars 770-790 may operate in the same way as the stations 510 and 610 described above with reference to Figs. 5 and 6, and the neighboring stations for a particular station may include both movable stations 770-790 and stationary stations 710-760. Taking station 770 as an example, the neighboring stations within a predefined range of it may include devices 710-760 and cars 780-790.
  • a network device may perform the operations on behalf of the station. For example, the network device may obtain the related data from stations such as stations 510-560 and/or stations 610-660, and calculate and provide the adjusted fusion weights for each of these stations.
  • Fig. 8 is a flowchart illustrating an exemplary method for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • one or more conditions relating to operation performance of multiple sensors of the station may be determined.
  • fusion weights for the multiple sensors of the station may be determined based on the one or more conditions.
  • fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station may be obtained.
  • the fusion weights for the multiple sensors of the station may be adjusted based on the obtained fusion weights for the multiple neighboring stations.
  • fusion of information obtained from the multiple sensors of the station may be performed based on the adjusted fusion weights.
  • the adjusted fusion weight for each of the multiple sensors of the station may be obtained by averaging or weighted averaging the determined fusion weight for the sensor of the station and the obtained fusion weights for the sensor of the multiple neighboring stations.
  • abnormal values from the determined fusion weights for the multiple sensors of the station and the obtained fusion weights for the multiple sensors of the multiple neighboring stations may be discarded.
  • the adjusted fusion weight for each of the multiple sensors of the station may be obtained by averaging or weighted averaging the remaining fusion weights for the sensor.
  • the adjusted fusion weight for each of the multiple sensors of the station may be obtained by voting among the determined fusion weight for the sensor of the station and the obtained fusion weights for the sensor of the multiple neighboring stations.
  • whether the determined fusion weight for each of the multiple sensors of the station is a normal value or an abnormal value may be determined based on the obtained fusion weights for the sensor of the multiple neighboring stations.
  • the determined fusion weight for the sensor of the station may be taken as its adjusted fusion weight if it is determined as the normal value.
  • the adjusted fusion weight for the sensor of the station may be obtained based on at least part of the obtained fusion weights for the sensor of the multiple neighboring stations if the determined fusion weight for the sensor of the station is determined as the abnormal value.
  • the multiple neighboring stations located within the predefined range of the station may be at least one of: multiple neighboring stations located within a predefined distance from the station, a predefined number of neighboring stations around the station, or a predefined set of neighboring stations related to the station.
  • the one or more conditions may comprise one or more of a weather condition, a lighting condition and a noise variance condition for each of the multiple sensors.
  • the noise variance condition may comprise noise variance metrics of a parameter for each of the multiple sensors.
  • the one or more conditions may comprise the noise variance condition and at least one of the weather condition and the lighting condition.
  • the fusion weights for the multiple sensors of the station may be determined based on the noise variance condition and at least one of the weather condition and the lighting condition. And the fusion weights for the multiple sensors of the station may be adjusted based on the fusion weights for the multiple neighboring stations, which are determined based on the noise variance condition and at least one of the weather condition and the lighting condition for the multiple neighboring stations.
  • first fusion weights for the multiple sensors of the station may be determined based on the noise variance condition.
  • at least one of second fusion weights and third fusion weights for the multiple sensors of the station may be determined based on at least one of the weather condition and the lighting condition.
  • the at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station may be adjusted based on at least one of second fusion weights and third fusion weights for the multiple sensors of the multiple neighboring stations, which are determined based on at least one of the weather condition and the lighting condition for the multiple neighboring stations.
  • the adjusted fusion weights for the multiple sensors of the station may be obtained based on the first fusion weights and the adjusted at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station.
  • At least one of weather condition and lighting condition may be determined based on image data from a camera sensor.
  • the multiple sensors may comprise at least two of a camera sensor, a radar sensor, and a LiDAR sensor. In an embodiment, the multiple sensors may comprise a camera sensor and a radar sensor. In an embodiment, the multiple sensors may comprise a camera sensor and a LiDAR sensor. In an embodiment, the multiple sensors may comprise a camera sensor, a radar sensor, and a LiDAR sensor.
  • each of the station and the multiple neighboring stations is one of a mobile station such as a vehicle and a stationary station such as a surveillance device mounted on a pole along a road.
  • Fig. 9 is a flowchart illustrating an exemplary method for tuning sensor fusion weights for a station according to aspects of the disclosure.
  • noise variance metrics for multiple sensors of the station may be determined.
  • at least one of a weather condition and a lighting condition for the station may be determined.
  • fusion weights for the multiple sensors of the station may be determined based on the noise variance metrics and the at least one of the weather condition and the lighting condition.
  • fusion of information obtained from the multiple sensors of the station may be performed based on the fusion weights.
  • first fusion weights for the multiple sensors of the station may be determined based on the noise variance metrics of the multiple sensors. At least one of second fusion weights and third fusion weights for the multiple sensors of the station may be determined based on at least one of the weather condition and the lighting condition for the station. And the fusion weights for the multiple sensors of the station may be determined based on the first fusion weights and the at least one of the second fusion weights and the third fusion weights.
  • At least one of the weather condition and the lighting condition for the station may be determined based on image data from a camera sensor.
  • At least one of the weather condition and the lighting condition for the station may be determined by analyzing a predefined number of image frames captured by the camera sensor.
  • fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station may be obtained.
  • the fusion weights for the multiple sensors of the station may be adjusted based on obtained fusion weights for the multiple sensors of the multiple neighboring stations.
  • the fusion of information obtained from the multiple sensors of the station may be performed based on the adjusted fusion weights.
  • At least one of second fusion weights and third fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station may be obtained, where the at least one of the second fusion weights and the third fusion weights for the multiple sensors of multiple neighboring stations may be determined based on at least one of the weather condition and the lighting condition for the multiple neighboring stations.
  • the at least one of the second fusion weights and third fusion weights for the multiple sensors of the station may be adjusted based on the obtained at least one of second fusion weights and third fusion weights for the multiple sensors of the multiple neighboring stations.
  • the adjusted fusion weights for the multiple sensors of the station may be obtained based on the first fusion weights and the adjusted at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station.
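
Putting this Fig. 9 style variant together, a compact sketch follows: first fusion weights come from the local noise variance metrics, the weather/lighting-based second fusion weights are averaged with those reported by neighboring stations, and the factors are then multiplied and normalized. The composition by multiplication and the normalization are assumptions consistent with, but not mandated by, the text:

```python
import numpy as np

def tune_station_weights(first_w: dict, second_w: dict,
                         neighbor_second_ws: list) -> dict:
    """Two-stage tuning: local first weights (noise metrics) combined with
    neighbor-adjusted second weights (weather/lighting), then normalized."""
    adjusted_second = {
        s: float(np.mean([second_w[s]] + [n[s] for n in neighbor_second_ws]))
        for s in second_w
    }
    combined = {s: first_w[s] * adjusted_second[s] for s in first_w}
    total = sum(combined.values())
    return {s: w / total for s, w in combined.items()}

weights = tune_station_weights(
    first_w={"camera": 0.8, "lidar": 1.1},              # from noise variance metrics
    second_w={"camera": 0.4, "lidar": 0.7},             # from the local weather condition
    neighbor_second_ws=[{"camera": 0.5, "lidar": 0.7},  # reported by neighboring stations
                        {"camera": 0.45, "lidar": 0.65}],
)
```
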
  • Fig. 10 is a block diagram illustrating an exemplary computing system according to aspects of the disclosure.
  • the computing system 1000 may comprise at least one processor 1010.
  • the computing system 1000 may further comprise at least one storage device 1020. It should be appreciated that the computing system 1000 may be configured to implement the processing system 20 as illustrated in Figs. 1, 3 and 4.
  • the storage device 1020 may store computer-executable instructions that, when executed, cause the processor 1010 to determine one or more conditions relating to operation performance of multiple sensors of a station, determine fusion weights for the multiple sensors of the station based on the one or more conditions, obtain fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station, adjust the fusion weights for the multiple sensors of the station based on the obtained fusion weights for the multiple neighboring stations, and perform fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.
  • the storage device 1020 may store computer-executable instructions that, when executed, cause the processor 1010 to determine noise variance metrics for multiple sensors of a station, determine at least one of a weather condition and a lighting condition for the station, determine fusion weights for the multiple sensors of the station based on the noise variance metrics and the at least one of the weather condition and the lighting condition, and perform fusion of information obtained from the multiple sensors of the station based on the fusion weights.
  • the storage device 1020 may store computer-executable instructions that, when executed, cause the processor 1010 to perform any operations according to the embodiments of the present disclosure as described in connection with Figs. 1-9.
  • the embodiments of the present disclosure may be embodied in a computer-readable medium such as non-transitory computer-readable medium.
  • the non-transitory computer-readable medium may comprise instructions that, when executed, cause one or more processors to perform any operations according to the embodiments of the present disclosure as described in connection with Figs. 1-10.
  • the embodiments of the present disclosure may be embodied as a computer program product which comprises instructions that, when executed, cause one or more processors to perform any operations according to the embodiments of the present disclosure as described in connection with Figs. 1-10.
  • modules in the apparatuses described above may be implemented in various approaches. These modules may be implemented as hardware, software, or a combination thereof. Moreover, any of these modules may be further functionally divided into sub-modules or combined together.

Abstract

The present disclosure provides a method for tuning sensor fusion weights for a station. The method comprises: determining one or more conditions relating to operation performance of multiple sensors of the station; determining fusion weights for the multiple sensors of the station based on the one or more conditions; obtaining fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station; adjusting the fusion weights for the multiple sensors of the station based on the obtained fusion weights for the multiple neighboring stations; and performing fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.

Description

METHOD AND APPARATUS FOR TUNING SENSOR FUSION WEIGHTS FIELD
Aspects of the present disclosure relate generally to sensor fusion, and more particularly, to tuning fusion weights for fusing information from multiple sensors of a station.
BACKGROUND
Intelligent transportation systems such as traffic monitoring apparatus, autonomous vehicles and so on rely on a variety of sensors to perceive the environment. For example, various operations, such as detecting and tracking vehicles, pedestrians, or other objects of the environment, speed detection, position detection, and so on, can be performed based on data captured by the variety of sensors.
Taking vehicles as an example, a vehicle may have different types of sensors equipped, such as cameras, LiDAR (Light Detection And Ranging) , radar, ultrasonic sensors, and so on. The different types of sensors may have their respective advantages and weaknesses. For example, cameras can perceive color and texture information from the world and are good at classifying objects, however, their detection range is limited and they perform poorly in limited lighting or adverse weather conditions. LiDARs provide precise distance information, have a large detection range and are able to detect small objects, they work well at night but do not provide color information and their performance decreases in heavy rain. Radars provide precise distance and velocity information and work well in bad weather conditions but have a rather low resolution. In order to improve performance of the sensor based measurements, the data from different sensors can be fused, which is known as sensor fusion.
Enhancements are needed to improve the reliability and robustness of sensor fusion techniques.
SUMMARY
According to an embodiment, there is provided a method for tuning sensor fusion weights for a station. The method comprises: determining one or more conditions relating to operation performance of multiple sensors of the station; determining fusion weights for the multiple sensors of the station based on the one or more conditions; obtaining fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station; adjusting the fusion weights for the multiple sensors of the station based on the obtained fusion weights for the multiple neighboring stations; and performing fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.
According to an embodiment, there is provided a method for tuning sensor fusion weights for a station. The method comprises: determining noise variance metrics for multiple sensors of the station; determining at least one of a weather condition and a lighting condition for the station; determining fusion weights for the multiple sensors of the station based on the noise variance metrics and the at least one of the weather condition and the lighting condition; and performing fusion of information obtained from the multiple sensors of the station based on the fusion weights.
According to an embodiment, there is provided a computer system, which comprises one or more processors and one or more storage devices storing computer-executable instructions that, when executed, cause the one or more processors to perform the operations of the methods mentioned above as well as the operations of the methods provided according to aspects of the disclosure.
According to an embodiment, there is provided one or more computer-readable storage media storing computer-executable instructions that, when executed, cause one or more processors to perform the operations of the methods mentioned above as well as the operations of the methods provided according to aspects of the disclosure.
According to an embodiment, there is provided an apparatus for adjusting sensor fusion weights. The apparatus comprises a plurality of sensors for obtaining sensor data and a processing system configured to execute instructions that cause the processing system to perform the operations of the methods mentioned above as well as the operations of the methods provided according to aspects of the disclosure.
By using the sensor fusion weight tuning method of the disclosure, the reliability and robustness of operations utilizing sensor fusion may be improved. Other advantages of the disclosure will be explained in the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosed aspects will hereinafter be described in connection with the appended drawings that are provided to illustrate and not to limit the disclosed aspects.
Fig. 1 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
Fig. 2 is a block diagram illustrating exemplary sensor fusion methods  according to aspects of the disclosure.
Fig. 3 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
Fig. 4 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
Fig. 5 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
Fig. 6 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
Fig. 7 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
Fig. 8 is a flowchart illustrating an exemplary method for tuning sensor fusion weights for a station according to aspects of the disclosure.
Fig. 9 is a flowchart illustrating an exemplary method for tuning sensor fusion weights for a station according to aspects of the disclosure.
Fig. 10 is a block diagram illustrating an exemplary computing system 1000 according to aspects of the disclosure.
DETAILED DESCRIPTION
The present disclosure will now be discussed with reference to several exemplary implementations. It is to be understood that these implementations are discussed only for enabling those skilled in the art to better understand and thus implement the embodiments of the present disclosure, rather than suggesting any limitations on the scope of the present disclosure.
Fig. 1 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
The apparatus 1 illustrated in Fig. 1 may be a vehicle such as an autonomous vehicle, a stationary surveillance device such as a surveillance device mounted along a road, or the like, or may be a part of such a vehicle, surveillance device, or the like.
The apparatus 1 can be equipped with various sensors 10. In the illustrated example of Fig. 1, the various sensors 10 may include a camera system 110, a LiDAR system 120, and a radar system 130. It is appreciated that the set of sensors 10 of the apparatus 1 may include other types of sensors in addition to the illustrated example sensors 110-130; for example, various other sensor resources such as sonar, ultrasonic sensors, proximity sensors, infrared sensors, wheel speed sensors, rain sensors and so on may be included in the set of sensors 10. On the other hand, the set of sensors 10 of the apparatus 1 need not include all three types of sensors 110-130; any combination of at least two of the sensors 110-130 may be equipped on the apparatus 1.
The apparatus 1 may include a processing system 20. The processing system 20 may be implemented in various ways; for example, the processing system 20 may include one or more processors and/or controllers, which may execute software to perform various operations or functions, such as operations or functions according to various aspects of the disclosure.
The processing system 20 may receive sensor data from the sensors 10 such as sensors 110-130, and perform various operations by analyzing the sensor data. In the example of Fig. 1, the processing system 20 may include a condition detection module 210, a fusion weight tuning module 220, and a sensor fusion module 230. It is appreciated that the modules 210-230 may be implemented in various ways, for example, may be implemented as software modules or functions which are executable by processors and/or controllers.
The condition detection module 210 may be configured to determine one or more conditions relating to operation performance of at least a part of multiple sensors 10 of the apparatus 1.
The condition relating to the operation performance of a sensor may be a noise variance metric of the sensor. The noise variance metric may describe how reliable the information from the sensor is at a given time point. For example, image noise may be utilized as a parameter to determine the noise variance metric for the camera sensors 110. Similarly, the signal noise may be utilized as a parameter to determine the noise variance metric for the LiDAR sensors 120 or the radar sensors 130. As another example, given an exemplary operation of object detection, the object detection confidence may be utilized as a parameter to determine the noise variance metric for a sensor 10. It is appreciated that any method for determining the noise variance metric for a sensor 10 is applicable to the embodiments of the disclosure, and the disclosure is not limited to a specific method for determining the noise variance metric of a sensor.
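By way of a non-limiting illustration, the following Python sketch shows one plausible mapping from an operation confidence, such as an object detection confidence, to a noise variance metric; the function name, the exact mapping and the epsilon constant are assumptions made for illustration only and are not prescribed by the disclosure.

    # Illustrative sketch only: derive a noise variance metric from an
    # operation confidence (e.g. object detection confidence in [0, 1]).
    # High confidence yields a low noise metric and vice versa; the
    # mapping and the epsilon guard are hypothetical choices.
    def noise_metric_from_confidence(confidence, eps=1e-6):
        return (1.0 - confidence) / max(confidence, eps)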
The condition relating to the operation performance of a sensor may be a weather condition in which the sensors 10 of the apparatus work. The condition detection module 210 may determine the weather condition based on the sensor data from some of the sensors 10. For example, a current weather condition may be determined based on the image data captured by the camera system 110. In an implementation, an image classification model may be utilized to determine the current weather condition by classifying the captured images as one of multiple weather conditions, which may include for example sunny, foggy, misty, rainy, snowy, and so on. It is appreciated that any method for determining the weather condition is applicable to the embodiments of the disclosure, and the disclosure is not limited to a specific method for determining the weather condition.
The condition relating to the operation performance of a sensor may be a lighting condition in which the sensors 10 of the apparatus work. The condition detection module 210 may determine the lighting condition based on the sensor data from some of the sensors 10. For example, a current lighting condition may be determined based on the image data captured by the camera system 110. In an implementation, an image classification model may be utilized to determine the current lighting condition by classifying the captured images as one of multiple lighting conditions, which may include for example sun glare, normal lighting, low lighting, night, and so on. In another implementation, the lighting condition may be determined based on the intensity of the captured image. It is appreciated that any method for determining the lighting condition is applicable to the embodiments of the disclosure, and the disclosure is not limited to a specific method for determining the lighting condition.
The fusion weight tuning module 220 may be configured to determine the fusion weights for respective sensors based on the condition detected by the condition detection module 210.
For example, the fusion weight tuning module 220 may determine the fusion weights for sensors 110, 120 and 130 respectively based on the noise variance metrics of the sensors 110, 120 and 130, and then the sensor fusion module 230 may perform fusion of the information obtained from the sensors 110, 120 and 130 based on their fusion weights. Generally, the higher the noise variance metric of a sensor, the lower the fusion weight given to that sensor. It is appreciated that the sensor fusion may be performed between any two or more of the sensors 110, 120, 130 as well as other applicable sensors 10.
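As a non-limiting sketch of this principle, inverse-variance weighting is one common way to turn noise variance metrics into normalized fusion weights; the sensor names and variance values below are hypothetical.

    # Illustrative sketch: inverse-variance weighting. A higher noise
    # variance metric yields a lower normalized fusion weight.
    def weights_from_noise_variance(noise_variances):
        inverse = {name: 1.0 / max(var, 1e-9)
                   for name, var in noise_variances.items()}
        total = sum(inverse.values())
        # Normalize so the fusion weights sum to 1.
        return {name: value / total for name, value in inverse.items()}

    # Hypothetical variance values: LiDAR is least noisy here, so it
    # receives the highest fusion weight.
    weights = weights_from_noise_variance(
        {"camera": 0.20, "lidar": 0.05, "radar": 0.10})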
In another example, the fusion weight tuning module 220 may determine the fusion weights for sensors 110, 120 and 130 respectively based on the weather condition detected by the condition detection module 210, and then the sensor fusion module 230 may perform fusion of the information obtained from the sensors 110, 120 and 130 based on their fusion weights. Generally, the fusion weight for the cameras 110 may be reduced relative to the severity of the weather condition, as cameras work less well in bad weather; thus fusion weights for the cameras 110 may be decreased in bad weather conditions and increased in good weather conditions. For other sensors, fusion weight adjustment may be defined based on intrinsic sensor characteristics. For example, radar sensors 130 usually work well in bad weather conditions, so fusion weights for the radar sensors 130 may be increased in bad weather and decreased in good weather. The performance of LiDAR sensors 120 decreases in heavy rain, so fusion weights for the LiDAR sensors 120 may be decreased in rain.
Table 1

                         Camera weight    LiDAR weight    Radar weight
    Weather condition 1  Wc1              Wl1             Wr1
    Weather condition 2  Wc2              Wl2             Wr2
    Weather condition 3  Wc3              Wl3             Wr3
    Weather condition 4  Wc4              Wl4             Wr4
An example of fusion weight settings for multiple sensors of an apparatus is illustrated in Table 1 above. The exemplary weather conditions 1-4 may be, for example, sunny, foggy, rainy and snowy. The fusion weights for each of the sensors 110-130 under each of the weather conditions 1-4 may be defined in Table 1. For example, when weather condition 3 "rainy" is detected, the fusion weight tuning module 220 may set the fusion weights for the camera 110, LiDAR 120 and radar 130 as Wc3, Wl3 and Wr3 respectively. The fusion weights exemplified in Table 1 may be defined heuristically, for example according to experiments or experience, and may be updated as appropriate. It is appreciated that the fusion weights may be set as zero or non-zero values, and the disclosure is not limited to specific fusion weight values. It is appreciated that the fusion weights of the respective sensors in the respective weather conditions may be determined in any applicable way. For example, an image classification model may be used to decide a fusion weight for a certain sensor in consideration of the weather condition by classifying images from the camera 110.
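As a non-limiting sketch, such a table may be realized as a simple lookup structure; all weight values below are hypothetical placeholders rather than values prescribed by the disclosure.

    # Illustrative sketch of a heuristic condition-to-weight lookup
    # table in the spirit of Table 1. All values are placeholders.
    WEATHER_WEIGHTS = {
        "sunny": {"camera": 1.0, "lidar": 1.0, "radar": 0.8},
        "foggy": {"camera": 0.3, "lidar": 0.7, "radar": 1.0},
        "rainy": {"camera": 0.5, "lidar": 0.6, "radar": 1.0},
        "snowy": {"camera": 0.4, "lidar": 0.5, "radar": 1.0},
    }

    def weights_for_weather(condition):
        # Fall back to neutral weights for an unrecognized condition.
        neutral = {"camera": 1.0, "lidar": 1.0, "radar": 1.0}
        return WEATHER_WEIGHTS.get(condition, neutral)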
In another example, the fusion weight tuning module 220 may determine the fusion weights for sensors 110, 120 and 130 respectively based on the lighting condition or lighting level detected by the condition detection module 210, and then the sensor fusion module 230 may perform fusion of the information obtained from the sensors 110, 120 and 130 based on their fusion weights. Generally, the fusion weight for the cameras 110 may be reduced relative to the severity of the lighting condition, as cameras work less well in bad lighting; thus fusion weights for the cameras 110 may be decreased in bad lighting conditions and increased in good lighting conditions. For example, LiDAR sensors 120 and radar sensors 130 both work well in bad lighting conditions, so fusion weights for the LiDAR sensors 120 and radar sensors 130 may be increased in bad lighting conditions and decreased in good lighting conditions.
Table 2

                          Camera weight    LiDAR weight    Radar weight
    Lighting condition 1  Wc1              Wl1             Wr1
    Lighting condition 2  Wc2              Wl2             Wr2
    Lighting condition 3  Wc3              Wl3             Wr3
    Lighting condition 4  Wc4              Wl4             Wr4
An example of fusion weight settings for multiple sensors is illustrated in Table 2 above. The exemplary lighting conditions 1-4 may be, for example, sun glare, normal lighting, low lighting and night. The fusion weights for each of the sensors 110-130 under each of the lighting conditions 1-4 may be defined in Table 2. For example, when lighting condition 4 "night" is detected, the fusion weight tuning module 220 may set the fusion weights for the camera 110, LiDAR 120 and radar 130 as Wc4, Wl4 and Wr4 respectively. The fusion weights exemplified in Table 2 may be defined heuristically, for example according to experiments or experience, and may be updated as appropriate. It is appreciated that the fusion weights may be set as zero or non-zero values, and the disclosure is not limited to specific fusion weight values. It is appreciated that the fusion weights of the respective sensors in the respective lighting conditions may be determined in any applicable way. For example, an image classification model may be used to decide a fusion weight for a certain sensor in consideration of the lighting condition by classifying images from the camera 110.
Fig. 2 is a block diagram illustrating sensor fusion methods schematically according to aspects of the disclosure.
Sensor fusion can generally be divided into three main categories: low-level sensor fusion (LLF), mid-level fusion (MLF), and high-level sensor fusion (HLF). Diagrams A and B in Fig. 2 show the HLF and LLF architectures, respectively.
Given an exemplary operation of object detection, as illustrated in diagram A (left), in HLF, object detection operations as shown in blocks 240A to 240C may be performed on sensor data captured by the respective sensors, examples of which may be the cameras 110, LiDAR 120 and radar 130 shown in Fig. 1. Each of the object detection modules 240A to 240C may generate a list of objects based on the sensor data from the corresponding sensor. The lists of objects generated by the modules 240A to 240C may be provided to the sensor fusion module 230, which is also shown in Fig. 1. The sensor fusion module 230 may fuse the object lists, where the respective object lists obtained from the different sensors such as sensors 110-130 may be given different weights in the fusion.
As illustrated in diagram B (right), in LLF, raw data from the sensors such as sensors 110-130 are fused in the fusion module 230, where the raw data obtained from the different sensors such as sensors 110-130 may be given different weights in the fusion. Then the detection module 240 performs object detection based on the fused data.
MLF is an abstraction on top of LLF. In MLF, features are extracted from the raw data of multiple sensors such as sensors 110-130, and the extracted features are fused by the fusion module 230, where the intermediate data such as the extracted features obtained from the different sensors such as sensors 110-130 may be given different weights in the fusion. Then the detection module 240 performs object detection based on the fused data.
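As a non-limiting sketch of how fusion weights may enter an HLF stage, the fragment below fuses the position estimates of a single object that has already been associated across sensors; the input format and the preceding association step are assumptions made for illustration.

    # Illustrative HLF-style sketch: weighted fusion of per-sensor
    # position estimates for one already-associated object. The tuple
    # format (sensor name, (x, y)) and the weight values are hypothetical.
    def fuse_positions(estimates, weights):
        total = sum(weights[name] for name, _ in estimates)
        x = sum(weights[name] * pos[0] for name, pos in estimates) / total
        y = sum(weights[name] * pos[1] for name, pos in estimates) / total
        return (x, y)

    fused = fuse_positions(
        [("camera", (10.2, 3.1)), ("lidar", (10.0, 3.0)),
         ("radar", (10.4, 3.3))],
        {"camera": 0.2, "lidar": 0.5, "radar": 0.3})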
It is appreciated that various sensor fusion methods such as the illustrated HLF, MLF and LLF methods and so on are applicable to the embodiments of the disclosure, and the disclosure is not limited to any specific sensor fusion algorithm or method.
Fig. 3 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
The same or similar labels as in Figs. 1 and 2 are used in Fig. 3, and details of the similar modules or blocks are not repeated in the description of Fig. 3.
In the example of Fig. 3, the condition detection module 210 may include a weather condition detection module 2110, a lighting condition detection module 2120, a camera noise detection module 2130, a LiDAR noise detection module 2140, and a radar noise detection module 2150.
The weather condition detection module 2110 may determine a current weather condition based on the images captured by the camera system 110. In an embodiment, the weather condition detection module 2110 may process a predefined number of image frames captured in a predefined time period, so as to determine the current weather condition based on these image frames. Predefining the time period helps avoid rapid shifts of fusion weights due to short-term, extreme weather measurements. In an implementation, the weather condition detection module 2110 may be implemented as an image classification model which classifies the captured images as one of multiple weather conditions, such as sunny, foggy, misty, rainy, snowy, and so on. It is appreciated that any method for determining the weather condition is applicable for the implementation of the weather condition detection module 2110.
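As a non-limiting sketch of this smoothing idea, the fragment below classifies each incoming frame and reports the majority condition over a sliding window; the classify_frame callable and the window size are hypothetical.

    # Illustrative sketch: majority vote over the last N per-frame
    # classifications, so that a few extreme frames do not cause a
    # rapid shift of the reported weather condition.
    from collections import Counter, deque

    FRAME_WINDOW = 50  # hypothetical predefined number of frames
    recent = deque(maxlen=FRAME_WINDOW)

    def update_weather(frame, classify_frame):
        recent.append(classify_frame(frame))  # e.g. "sunny", "foggy"
        return Counter(recent).most_common(1)[0][0]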
The lighting condition detection module 2120 may determine a current lighting condition based on the images captured by the camera system 110. In an embodiment, the lighting condition detection module 2120 may process a predefined number of image frames captured in a predefined time period, so as to determine the current lighting condition based on these image frames. Predefining the time period helps avoid rapid shifts of fusion weights due to short-term, extreme lighting measurements. In an implementation, the lighting condition detection module 2120 may be implemented as an image classification model which classifies the captured images as one of multiple lighting conditions, such as sun glare, normal lighting, low lighting, night, and so on. It is appreciated that any method for determining the lighting condition is applicable for the implementation of the lighting condition detection module 2120.
In an implementation, the weather condition detection module 2110 and the lighting condition detection module 2120 may be implemented as a single image classification model which classifies the captured images as one of multiple weather conditions and one of multiple lighting conditions. It is appreciated that any method for determining the weather condition and the lighting condition together is applicable for implementing the weather condition detection module 2110 and the lighting condition detection module 2120 as one module.
The camera noise detection module 2130, the LiDAR noise detection module 2140 and the radar noise detection module 2150 may respectively determine current noise variance metrics for the camera system 110, LiDAR system 120 and radar system 130 based on a parameter of the sensors. For example, the parameter may be the signal noise of the sensors 110-130. As another example, given an exemplary operation of object detection, the parameter may be the object detection confidence. It is appreciated that any method for determining the noise variance metric for a sensor is applicable in aspects of the disclosure.
In the example of Fig. 3, the processing system 20 of the apparatus 1 includes object detection modules 240A to 240C. The object detection modules 240A to 240C respectively perform object detection operations on sensor data captured by the respective sensors, such as the cameras 110, LiDAR 120 and radar 130, so as to generate lists of objects. It is appreciated that, when the object detection confidence is taken as the parameter for determining the noise variance metric, the operations of the noise detection modules 2130-2150 may be associated with the operations of the object detection modules 240A to 240C. It is appreciated that the modules 240A to 240C may perform other operations or functions in other embodiments; for example, the operations performed by the modules 240A to 240C may be position detection, speed detection, tracking and so on, and accordingly the corresponding operation confidence may be taken as the parameter for determining the noise variance metric by the noise detection modules 2130-2150.
In an embodiment, the fusion weight tuning module 220 may determine the fusion weights for sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 and the weather condition, which are respectively detected by the camera noise detection module 2130, the LiDAR noise detection module 2140 and the weather condition detection module 2110. For example, the fusion weights for the sensors 110 and 120 determined based on the noise variance metrics of the sensors 110 and 120 may be adjusted utilizing the fusion weights for the sensors 110 and 120 determined based on the current weather, so as to obtain the adjusted fusion weights for the sensors 110 and 120. In an example, the adjusted fusion weight for one sensor may be the product of the fusion weights for the sensor associated with the respective conditions. In this example, the adjusted fusion weight for the camera 110 may be the product of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current weather, and the adjusted fusion weight for the LiDAR 120 may be the product of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current weather. In another example, the adjusted fusion weight for one sensor may be the sum of the fusion weights for the sensor associated with the respective conditions. In this example, the adjusted fusion weight for the camera 110 may be the sum of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current weather, and the adjusted fusion weight for the LiDAR 120 may be the sum of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current weather.
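As a non-limiting sketch, the two combination rules described above may be expressed as follows; the renormalization step and the mode names are illustrative assumptions rather than requirements of the disclosure.

    # Illustrative sketch: combine per-sensor weights derived from the
    # noise variance metrics with weights derived from the current
    # weather, either by product or by sum, then renormalize.
    def combine(noise_weights, weather_weights, mode="product"):
        if mode == "product":
            combined = {s: noise_weights[s] * weather_weights[s]
                        for s in noise_weights}
        else:  # "sum"
            combined = {s: noise_weights[s] + weather_weights[s]
                        for s in noise_weights}
        total = sum(combined.values())
        return {s: w / total for s, w in combined.items()}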
The sensor fusion module 230 may fuse the lists of objects received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220. By adjusting the fusion weights determined based on the noise variance metrics with the fusion weights determined based on the current weather, the reliability of the sensor fusion may be improved, since some potential issues associated with using only the noise variance metrics for fusion weight tuning may be mitigated. For example, in foggy conditions, camera data may be used to detect objects at close distance with high confidence, while the system may be unaware that the distance tracking capability based on camera data is impaired. By taking the weather condition as additional information in addition to the noise variance metrics when determining the fusion weights of the sensors, the fusion weight for the camera 110 may be reduced in a timely manner and the fusion weight for the LiDAR 120 may be increased, so as to mitigate the potential problem in such circumstances.
In an embodiment, the fusion weight tuning module 220 may determine the fusion weights for sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 and the lighting condition, which are respectively detected by the camera noise detection module 2130, the LiDAR noise detection module 2140 and the lighting condition detection module 2120. For example, the fusion weights for the sensors 110 and 120 determined based on the noise variance metrics of the sensors 110 and 120 may be adjusted utilizing the fusion weights for the sensors 110 and 120 determined based on the current lighting level, so as to obtain the adjusted fusion weights for the sensors 110 and 120. In an example, the adjusted fusion weight for the camera 110 may be the product of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current lighting condition, and the adjusted fusion weight for the LiDAR 120 may be the product of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current lighting condition. In another example, the adjusted fusion weight for the camera 110 may be the sum of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110 and the fusion weight for the camera 110 determined based on the current lighting condition, and the adjusted fusion weight for the LiDAR 120 may be the sum of the fusion weight for the LiDAR 120 determined based on the noise variance metric of the LiDAR 120 and the fusion weight for the LiDAR 120 determined based on the current lighting condition. The sensor fusion module 230 may fuse the lists of objects received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220.
In an embodiment, the fusion weight tuning module 220 may determine the fusion weights for sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 as well as the weather condition and the lighting condition, which are respectively detected by the camera noise detection module 2130, the LiDAR noise detection module 2140, the weather condition detection module 2110 and the lighting condition detection module 2120. For example, the fusion weights for the sensors 110 and 120 determined based on the noise variance metrics of the sensors 110 and 120 may be adjusted utilizing the fusion weights for the sensors 110 and 120 determined based on the current weather and the fusion weights for the sensors 110 and 120 determined based on the current lighting level, so as to obtain the adjusted fusion weights for the sensors 110 and 120. In an example, the adjusted fusion weight for the camera 110 may be the product of the fusion weight for the camera 110 determined based on the noise variance metric of the camera 110, the fusion weight for the camera 110 determined based on the current weather and the fusion weight for the camera 110 determined based on the current lighting condition, and the adjusted fusion weight for the LiDAR 120 may be the product of the corresponding three fusion weights for the LiDAR 120. In another example, the adjusted fusion weight for the camera 110 may be the sum of the three fusion weights for the camera 110, and the adjusted fusion weight for the LiDAR 120 may be the sum of the corresponding three fusion weights for the LiDAR 120. The sensor fusion module 230 may fuse the lists of objects received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220.
It is appreciated that the fusion weight tuning module 220 may determine the fusion weights for any two or more of the sensors 110, 120 and 130 based on the noise variance metrics of the two or more sensors as well as at least one of the weather condition and the lighting condition. For example, the fusion weights for the two or more sensors determined based on their noise variance metrics may be adjusted utilizing at least one of the fusion weights for the two or more sensors determined based on the current weather and the fusion weights for the two or more sensors determined based on the current lighting level, so as to obtain the adjusted fusion weights for the two or more sensors. The sensor fusion module 230 may fuse the lists of objects received from the corresponding two or more of the object detection modules 240A-240C based on the fusion weights received from the fusion weight tuning module 220.
Fig. 4 is a block diagram illustrating an exemplary apparatus according to aspects of the disclosure.
The structure illustrated in Fig. 4 differs from that of Fig. 3 in that the object detection module 240 operates after the sensor fusion module 230. The sensor fusion module 230 performs LLF fusion on the raw data captured by the different sensors such as sensors 110-130, or performs MLF fusion on the extracted features obtained from the different sensors such as sensors 110-130, as explained above with reference to Fig. 2. Then the detection module 240 performs object detection based on the fused data. Other parts of Fig. 4 are similar to those of Fig. 3 and the details are not repeated here.
As explained, the information to be fused at the sensor fusion module 230 may be the target measurements such as the detected objects, speed, position and so on, the raw data captured by the different sensors, or the intermediate data such as the extracted features. These different types of data to be fused at the sensor fusion module 230 may be commonly referred to as information obtained from sensors.
Fig. 5 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
In the example of Fig. 5, multiple cars 510-560 are running on a road. Each of the cars 510-560 may be or may include the apparatus 1 illustrated in Figs. 1, 3 and 4, and may be referred to as a station, a movable station and so on.
Taking the station 510 as an example, the station 510 may determine one or more conditions relating to operation performance of its multiple sensors, such as the camera 110, the LiDAR 120, the radar 130 and so on. As illustrated above, the one or more conditions may be the noise variance metrics of the respective sensors 10, the weather condition, the lighting condition and so on. Then the station 510 may determine fusion weights for the multiple sensors based on the one or more conditions. It is appreciated that the multiple sensors may be any two or more of the sensors 110-130 as well as other possible sensors equipped on the station 510. There may be situations in which the fusion weights determined by the station 510 based on its own sensor data are not reliably or consistently accurate due to temporary, extreme measurements from the sensors, especially from the cameras 110.
In order to mitigate the influence of such situations, the station 510 may obtain fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station 510, and adjust the fusion weights for the multiple sensors of the station 510 based on the obtained fusion weights for the multiple neighboring stations. It is appreciated that the station 510 may communicate wirelessly with the neighboring stations via various networks, such as a Wireless Local Area Network (WLAN), a 4G wireless communication network, a 5G wireless communication network, future evolved networks such as 6G, and so on. The communication network is not illustrated in Fig. 5 for simplicity.
The neighboring stations may be, for example, at least a part of the stations 520 to 560 as illustrated in Fig. 5 as well as other stations as appropriate. The predefined range of the station 510 may be a predefined distance from the station 510. For example, the neighboring stations located within the predefined distance from the station 510 may experience similar conditions, such as weather and lighting conditions, and may provide their current fusion weights to the station 510 for it to adjust its own fusion weights. The predefined range of the station 510 may alternatively be a predefined number of neighboring stations around the station 510. For example, the station 510 may receive fusion weights from the predefined number of neighboring stations around the station 510. The predefined range of the station 510 may also be a combination of the above-mentioned predefined distance from the station 510 and the above-mentioned predefined number of neighboring stations around the station 510.
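As a non-limiting sketch, selecting neighbors by distance and capping their number may look as follows; the Station objects with a position attribute are a hypothetical data model assumed for illustration.

    # Illustrative sketch: choose neighboring stations within a
    # predefined distance, keeping at most a predefined number of the
    # closest ones. Stations are assumed to expose a .position
    # attribute holding (x, y) coordinates.
    import math

    def select_neighbors(station, candidates, max_distance, max_count):
        def dist(a, b):
            return math.hypot(a.position[0] - b.position[0],
                              a.position[1] - b.position[1])
        nearby = [c for c in candidates if dist(station, c) <= max_distance]
        nearby.sort(key=lambda c: dist(station, c))
        return nearby[:max_count]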
For the sake of simplicity, the stations 520-560 are taken as the neighboring stations and the sensors 110-120 are taken as the multiple sensors in the following description.
In an implementation, the fusion weight tuning module 220 of the station 510 may perform the function of adjusting the fusion weights for the multiple sensors 110-120 of the station 510 based on the obtained fusion weights of the multiple sensors 110-120 of the multiple neighboring stations 520-560. Then the sensor fusion module 230 of the station 510 may perform fusion of information obtained from the multiple sensors 110-120 of the station 510 based on the adjusted fusion weights.
In an implementation, the station 510, particularly the fusion weight tuning module 220, may adjust the fusion weight for its sensor 110 by averaging the fusion weight for its sensor 110 and the obtained fusion weights for the sensor 110 of the multiple neighboring stations 520-560. Similarly, the station 510 may adjust the fusion weight for its sensor 120 by averaging the fusion weight for its sensor 120 and the obtained fusion weights for the sensor 120 of the multiple neighboring stations 520-560.
In an implementation, the station 510 may adjust the fusion weight for its sensor 110 or 120 by weighted averaging the fusion weights for the sensor 110 or 120 of the multiple stations 510-560. When calculating the weighted average fusion weight for the sensor 110 or 120 among the multiple stations 510-560, the calculation weights for the fusion weights from the neighboring stations 520-560 may be inversely proportional to the distances between the respective neighboring stations 520-560 and the station 510.
In an implementation, the station 510 may discard abnormal values from the fusion weights for the sensor 110 or 120 of the multiple stations 510-560, and then adjust the fusion weight for its sensor 110 or 120 by averaging or weighted averaging the remaining fusion weights for the sensor 110 or 120 of the multiple stations 510-560. The abnormal values may be determined in any suitable way. For example, the abnormal values may be determined based on a predefined variance range, where a value among the multiple fusion weight values of the multiple stations 510-560 is taken as an abnormal value if its deviation from the mean falls outside the variance range; otherwise, it is taken as a normal value.
In an implementation, if the station 510 determines that the fusion weight for its sensor 110 or 120 is a normal value, it may take the fusion weight as its adjusted fusion weight; otherwise, it may discard this abnormal value and obtain the adjusted fusion weight for the sensor 110 or 120 based on the fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560.
In an implementation, the station 510 may obtain the adjusted fusion weight for its sensor 110 or 120 by voting among the fusion weights for the sensor 110 or 120 of the multiple stations 510-560. For example, if a majority of the fusion weights for the sensor 110 or 120 of the multiple stations 510-560 indicate the same value, the station 510 may take this value as its adjusted fusion weight for its sensor 110 or 120.
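As a non-limiting sketch, the adjustment strategies described above may be expressed for a single sensor's fusion weight as follows; the deviation threshold and the assumption of nonzero distances are hypothetical.

    # Illustrative sketch of the adjustment strategies for one sensor:
    # own_w is the station's own fusion weight, neighbor_ws the weights
    # obtained from the neighboring stations.
    from collections import Counter
    from statistics import mean

    def adjust_by_average(own_w, neighbor_ws):
        return mean([own_w] + neighbor_ws)

    def adjust_by_inverse_distance(own_w, neighbor_ws, distances):
        # Closer neighbors count more; the station itself gets weight 1.
        # Distances are assumed to be nonzero.
        pairs = [(own_w, 1.0)] + [(w, 1.0 / d)
                                  for w, d in zip(neighbor_ws, distances)]
        total = sum(c for _, c in pairs)
        return sum(w * c for w, c in pairs) / total

    def adjust_discarding_outliers(own_w, neighbor_ws, max_dev=0.2):
        values = [own_w] + neighbor_ws
        center = mean(values)
        kept = [v for v in values if abs(v - center) <= max_dev] or values
        return mean(kept)

    def adjust_by_vote(own_w, neighbor_ws):
        # Most common value wins; assumes quantized weight values.
        return Counter([own_w] + neighbor_ws).most_common(1)[0][0]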
As described above with reference to Fig. 3, the fusion weight tuning module 220 of the station 510 may determine fusion weights for its sensors 110 and 120 based on the noise variance metrics of its sensors 110 and 120 and the weather and/or lighting condition. In an implementation, the station 510 may obtain the fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560, which are determined based on the noise variance condition and the weather and/or lighting condition for the multiple neighboring stations 520-560. Then the station 510 may adjust the fusion weights for its sensor 110 or 120 based on the obtained fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560.
As described above with reference to Fig. 3, the fusion weight tuning module 220 of the station 510 may determine first fusion weights for its sensors 110 and 120 based on their noise variance metrics, and determine second fusion weights and/or third fusion weights for its sensors 110 and 120 based on the weather and/or lighting condition. In an implementation, the station 510 may obtain second fusion weights and/or third fusion weights for the sensor 110 or 120 of the multiple neighboring stations 520-560, which are determined based on the weather and/or lighting condition for the multiple neighboring stations 520-560. The station 510 may adjust the second fusion weights and/or the third fusion weights for its sensor 110 or 120 based on the obtained second fusion weights and/or third fusion weights for the multiple neighboring stations 520-560. The station 510 may then obtain the adjusted fusion weights for its sensor 110 or 120 based on the first fusion weights determined by itself and the adjusted second fusion weights and/or third fusion weights for its sensor 110 or 120.
Fig. 6 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
In the example of Fig. 6, multiple surveillance devices 610-660 are set along a roadside. Each of the devices 610-660 may be or may include the apparatus 1 illustrated in Figs. 1, 3 and 4, and may be referred to as a station, a stationary station and so on.
The structure and function of the surveillance devices 610-660 are similar to those of the cars 510-560 illustrated in Fig. 5 in terms of the aspects of the disclosure. Taking the device 610 as an example, its operation for adjusting the fusion weights for its sensors based on information from neighboring stations such as stations 620-660 is similar to that of the car 510 as detailed above; therefore the details are not repeated.
In an implementation, as the devices 610-660 are stationary stations, the multiple neighboring stations located within the predefined range of the station 610 may be a predefined set of neighboring stations related to the station 610. For example, the predefined set of neighboring stations related to the station 610 may be set at the time of deployment of the stations 610-660.
Fig. 7 illustrates an exemplary scenario for tuning sensor fusion weights for a station according to aspects of the disclosure.
In the example of Fig. 7, multiple surveillance devices 710-760 and cars 770-790 each may be or may include the apparatus 1 illustrated in Figs. 1, 3 and 4, and may be referred to as a station.
Each of the surveillance devices 710-760 and cars 770-790 may operate in the same way as the stations 510 and 610 described above with reference to Figs. 5 and 6, and the neighboring stations for a particular station may include both movable stations 770-790 and stationary stations 710-760. Taking station 770 as an example, the neighboring stations within a predefined range of it may include the devices 710-760 and the cars 780-790.
Although it is described in the examples of Figs. 5-7 that a station such as the station 510 performs the operations of tuning the sensor fusion weights, it is appreciated that a network device may perform the operations on behalf of the station. For example, the network device may obtain the related data from stations such as the stations 510-560 and/or the stations 610-660, and calculate and provide the adjusted fusion weights for each station.
Fig. 8 is a flowchart illustrating an exemplary method for tuning sensor  fusion weights for a station according to aspects of the disclosure.
At step 810, one or more conditions relating to operation performance of multiple sensors of the station may be determined.
At step 820, fusion weights for the multiple sensors of the station may be determined based on the one or more conditions.
At step 830, fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station may be obtained.
At step 840, the fusion weights for the multiple sensors of the station may be adjusted based on the obtained fusion weights for the multiple neighboring stations.
At step 850, fusion of information obtained from the multiple sensors of the station may be performed based on the adjusted fusion weights.
In an embodiment, the adjusted fusion weight for each of the multiple sensors of the station may be obtained by averaging or weighted averaging the determined fusion weight for the sensor of the station and the obtained fusion weights for the sensor of the multiple neighboring stations.
In an embodiment, abnormal values may be discarded from the determined fusion weights for the multiple sensors of the station and the obtained fusion weights for the multiple sensors of the multiple neighboring stations. The adjusted fusion weight for each of the multiple sensors of the station may then be obtained by averaging or weighted averaging the remaining fusion weights for the sensor.
In an embodiment, the adjusted fusion weight for each of the multiple sensors of the station may be obtained by voting among the determined fusion weight for the sensor of the station and the obtained fusion weights for the sensor of the multiple neighboring stations.
In an embodiment, whether the determined fusion weight for each of the multiple sensors of the station is a normal value or an abnormal value may be determined based on the obtained fusion weights for the sensor of the multiple neighboring stations. The determined fusion weight for the sensor of the station may be taken as its adjusted fusion weight if it is determined as the normal value. And the adjusted fusion weight for the sensor of the station may be obtained based on at least part of the obtained fusion weights for the sensor of the multiple neighboring stations if the determined fusion weight for the sensor of the station is determined as the abnormal value.
In an embodiment, the multiple neighboring stations located within the predefined range of the station may be at least one of: multiple neighboring stations located within a predefined distance from the station, a predefined number of neighboring stations around the station, or a predefined set of neighboring stations related to the station.
In an embodiment, the one or more conditions may comprise one or more of a weather condition, a lighting condition and a noise variance condition for each of the multiple sensors. The noise variance condition may comprise noise variance metrics of a parameter for each of the multiple sensors.
In an embodiment, the one or more conditions may comprise the noise variance condition and at least one of the weather condition and the lighting condition.
In an embodiment, the fusion weights for the multiple sensors of the station may be determined based on the noise variance condition and at least one of the weather condition and the lighting condition. And the fusion weights for the multiple sensors of the station may be adjusted based on the fusion weights for the multiple neighboring stations, which are determined based on the noise variance condition and at least one of the weather condition and the lighting condition for the multiple neighboring stations.
In an embodiment, first fusion weights for the multiple sensors of the station may be determined based on the noise variance condition, and at least one of second fusion weights and third fusion weights for the multiple sensors of the station may be determined based on at least one of the weather condition and the lighting condition. The at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station may be adjusted based on at least one of second fusion weights and third fusion weights for the multiple sensors of the multiple neighboring stations, which are determined based on at least one of the weather condition and the lighting condition for the multiple neighboring stations. And the adjusted fusion weights for the multiple sensors of the station may be obtained based on the first fusion weights and the adjusted at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station.
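As a non-limiting sketch of this embodiment for a single sensor, the noise-based first fusion weight may stay local while the condition-based second fusion weight is smoothed over the neighbors before the final combination; the averaging and the product combination are illustrative assumptions among the options described above.

    # Illustrative sketch: keep the noise-based first weight local,
    # smooth the weather-based second weight over neighboring stations,
    # then combine the two (here by product, as one of the options).
    from statistics import mean

    def adjusted_weight(first_w, own_second_w, neighbor_second_ws):
        smoothed_second = mean([own_second_w] + neighbor_second_ws)
        return first_w * smoothed_second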
In an embodiment, at least one of the weather condition and the lighting condition may be determined based on image data from a camera sensor.
In an embodiment, the multiple sensors may comprise at least two of a camera sensor, a radar sensor, and a LiDAR sensor. In an embodiment, the multiple sensors may comprise a camera sensor and a radar sensor. In an embodiment, the multiple sensors may comprise a camera sensor and a LiDAR sensor. In an embodiment, the multiple sensors may comprise a camera sensor, a radar sensor, and a LiDAR sensor.
In an embodiment, each of the station and the multiple neighboring stations is one of a mobile station, such as a vehicle, and a stationary station, such as a surveillance device mounted on a pole along a road.
Fig. 9 is a flowchart illustrating an exemplary method for tuning sensor fusion weights for a station according to aspects of the disclosure.
At step 910, noise variance metrics for multiple sensors of the station may be determined.
At step 920, at least one of a weather condition and a lighting condition for the station may be determined.
At step 930, fusion weights for the multiple sensors of the station may be determined based on the noise variance metrics and the at least one of the weather condition and the lighting condition.
At step 940, fusion of information obtained from the multiple sensors of the station may be performed based on the fusion weights.
In an embodiment, first fusion weights for the multiple sensors of the station may be determined based on the noise variance metrics of the multiple sensors. At least one of second fusion weights and third fusion weights for the multiple sensors of the station may be determined based on at least one of the weather condition and the lighting condition for the station. And the fusion weights for the multiple sensors of the station may be determined based on the first fusion weights and the at least one of the second fusion weights and the third fusion weights.
In an embodiment, at least one of the weather condition and the lighting condition for the station may be determined based on image data from a camera sensor.
In an embodiment, at least one of the weather condition and the lighting condition for the station may be determined by analyzing a predefined number of image frames captured by the camera sensor.
In an embodiment, fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station may be obtained. The fusion weights for the multiple sensors of the station may be adjusted based on the obtained fusion weights for the multiple sensors of the multiple neighboring stations. The fusion of information obtained from the multiple sensors of the station may be performed based on the adjusted fusion weights.
In an embodiment, at least one of second fusion weights and third fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station may be obtained, where the at least one of the second fusion weights and the third fusion weights for the multiple sensors of multiple neighboring stations may be determined based on at least one of the weather condition  and the lighting condition for the multiple neighboring stations. The at least one of the second fusion weights and third fusion weights for the multiple sensors of the station may be adjusted based on the obtained at least one of second fusion weights and third fusion weights for the multiple sensors of the multiple neighboring stations. And the adjusted fusion weights for the multiple sensors of the station may be obtained based on the first fusion weights and the adjusted at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station.
Fig. 10 is a block diagram illustrating an exemplary computing system according to aspects of the disclosure.
The computing system 1000 may comprise at least one processor 1010. The computing system 1000 may further comprise at least one storage device 1020. It should be appreciated that the computing system 1000 may be configured to implement the processing system 20 as illustrated in Figs. 1, 3 and 4.
The storage device 1020 may store computer-executable instructions that, when executed, cause the processor 1010 to determine one or more conditions relating to operation performance of multiple sensors of a station, determine fusion weights for the multiple sensors of the station based on the one or more conditions, obtain fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station, adjust the fusion weights for the multiple sensors of the station based on the obtained fusion weights for the multiple neighboring stations, and perform fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.
The storage device 1020 may store computer-executable instructions that, when executed, cause the processor 1010 to determine noise variance metrics for multiple sensors of a station, determine at least one of a weather condition and a lighting condition for the station, determine fusion weights for the multiple sensors of the station based on the noise variance metrics and the at least one of the weather condition and the lighting condition, and perform fusion of information obtained from the multiple sensors of the station based on the fusion weights.
It should be appreciated that the storage device 1020 may store computer-executable instructions that, when executed, cause the processor 1010 to perform any operations according to the embodiments of the present disclosure as described in connection with FIGs. 1-9.
The embodiments of the present disclosure may be embodied in a computer-readable medium such as non-transitory computer-readable medium. The non-transitory computer-readable medium may comprise instructions that, when executed, cause one or more processors to perform any operations according to the  embodiments of the present disclosure as described in connection with FIGs. 1-10.
The embodiments of the present disclosure may be embodied as a computer program product which comprises instructions that, when executed, cause one or more processors to perform any operations according to the embodiments of the present disclosure as described in connection with FIGs. 1-10.
It should be appreciated that all the operations in the methods described above are merely exemplary, and the present disclosure is not limited to any operations in the methods or sequence orders of these operations, and should cover all other equivalents under the same or similar concepts.
It should also be appreciated that all the modules in the apparatuses described above may be implemented in various approaches. These modules may be implemented as hardware, software, or a combination thereof. Moreover, any of these modules may be further functionally divided into sub-modules or combined together.
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein. All structural and functional equivalents to the elements of the various aspects described throughout the present disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. 

Claims (22)

  1. A method for tuning sensor fusion weights for a station, comprising:
    determining one or more conditions relating to operation performance of multiple sensors of the station;
    determining fusion weights for the multiple sensors of the station based on the one or more conditions;
    obtaining fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station;
    adjusting the fusion weights for the multiple sensors of the station based on the obtained fusion weights for the multiple neighboring stations; and
    performing fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.
  2. The method of claim 1, wherein the adjusting the fusion weights for the multiple sensors of the station comprises:
    obtaining the adjusted fusion weight for each of the multiple sensors of the station by averaging or weighted averaging the determined fusion weight for the sensor of the station and the obtained fusion weights for the sensor of the multiple neighboring stations.
  3. The method of claim 1, wherein the adjusting the fusion weights for the multiple sensors of the station comprises:
    discarding abnormal values from the determined fusion weights for the multiple sensors of the station and the obtained fusion weights for the multiple sensors of the multiple neighboring stations; and
    obtaining the adjusted fusion weight for each of the multiple sensors of the station by averaging or weighted averaging the remaining fusion weights for the sensor.
  4. The method of claim 1, wherein the adjusting the fusion weights for the multiple sensors of the station comprises:
    obtaining the adjusted fusion weight for each of the multiple sensors of the station by voting among the determined fusion weight for the sensor of the station and the obtained fusion weights for the sensor of the multiple neighboring stations.
  5. The method of claim 1, wherein the adjusting the fusion weights for the multiple sensors of the station comprises:
    determining whether the determined fusion weight for each of the multiple sensors of the station is a normal value or an abnormal value based on the obtained fusion weights for the sensor of the multiple neighboring stations;
    taking the determined fusion weight for the sensor of the station as its adjusted fusion weight if it is determined as the normal value; and
    obtaining the adjusted fusion weight for the sensor of the station based on at least part of the obtained fusion weights for the sensor of the multiple neighboring stations if the determined fusion weight for the sensor of the station is determined as the abnormal value.
  6. The method of claim 1, wherein the multiple neighboring stations located within the predefined range of the station are at least one of:
    multiple neighboring stations located within a predefined distance from the station;
    a predefined number of neighboring stations around the station; or
    a predefined set of neighboring stations related to the station.
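The three neighbor-selection modes of claim 6 above may be sketched as follows; the coordinate fields, station records, and thresholds are illustrative assumptions:

    import math

    def select_neighbors(station_xy, stations, max_dist=None, k=None, fixed_ids=None):
        # Three selection modes: a predefined set of station ids, all stations
        # within a radius, or the k nearest stations.
        def dist(s):
            return math.hypot(s["x"] - station_xy[0], s["y"] - station_xy[1])
        if fixed_ids is not None:
            return [s for s in stations if s["id"] in fixed_ids]
        ranked = sorted(stations, key=dist)
        if max_dist is not None:
            ranked = [s for s in ranked if dist(s) <= max_dist]
        return ranked if k is None else ranked[:k]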
  7. The method of claim 1, wherein the one or more conditions comprise one or more of a weather condition, a lighting condition, and a noise variance condition for each of the multiple sensors.
  8. The method of claim 7, wherein the one or more conditions comprise the noise variance condition and at least one of the weather condition and the lighting condition.
  9. The method of claim 7, wherein
    the determining fusion weights for the multiple sensors of the station comprises: determining the fusion weights for the multiple sensors of the station based on the noise variance condition and at least one of the weather condition and the lighting condition; and
    the adjusting the fusion weights for the multiple sensors of the station comprises: adjusting the fusion weights for the multiple sensors of the station based on the fusion weights for the multiple neighboring stations, which are determined based on the noise variance condition and at least one of the weather condition and the lighting condition for the multiple neighboring stations.
  10. The method of claim 7, wherein
    the determining fusion weights for the multiple sensors of the station comprises: determining first fusion weights for the multiple sensors of the station based on the noise variance condition and determining at least one of second fusion weights and third fusion weights for the multiple sensors of the station based on at least one of the weather condition and the lighting condition; and
    the adjusting the fusion weights for the multiple sensors of the station comprises: adjusting the at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station based on at least one of second fusion weights and third fusion weights for the multiple sensors of the multiple neighboring stations, which are determined based on at least one of the weather condition and the lighting condition for the multiple neighboring stations, and obtaining the adjusted fusion weights for the multiple sensors of the station based on the first fusion weights and the adjusted at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station.
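A sketch of claim 10's combination of noise-based first weights with weather-based second and lighting-based third weights; multiplicative merging with renormalization is one assumed rule, as the claim does not fix a combination formula:

    def combine_weight_sets(first, second=None, third=None):
        # Merge the noise-based first weights with the weather-based second and/or
        # lighting-based third weights per sensor, then renormalize to sum to 1.
        combined = dict(first)
        for extra in (second, third):
            if extra:
                for sensor in combined:
                    combined[sensor] *= extra.get(sensor, 1.0)
        total = sum(combined.values())
        return {s: v / total for s, v in combined.items()} if total else combined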
  11. The method of claim 1, wherein the determining one or more conditions comprises determining at least one of a weather condition and a lighting condition based on image data from a camera sensor.
  12. The method of claim 1, wherein the multiple sensors comprise at least two of a camera sensor, a radar sensor, and a LiDAR sensor.
  13. The method of claim 1, wherein each of the station and the multiple neighboring stations is one of a mobile station and a stationary station.
  14. A method for tuning sensor fusion weights for a station, comprising:
    determining noise variance metrics for multiple sensors of the station;
    determining at least one of a weather condition and a lighting condition for the station;
    determining fusion weights for the multiple sensors of the station based on the noise variance metrics and the at least one of the weather condition and the lighting condition; and
    performing fusion of information obtained from the multiple sensors of the station based on the fusion weights.
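For claim 14 above, one common way to turn noise variance metrics into fusion weights is inverse-variance weighting, shown below as an assumption rather than the claimed formula:

    def weights_from_noise_variance(variances):
        # Inverse-variance weighting: noisier sensors receive smaller weights.
        inv = {sensor: 1.0 / v for sensor, v in variances.items() if v > 0}
        total = sum(inv.values())
        return {sensor: w / total for sensor, w in inv.items()}

For example, variances of {"camera": 0.04, "radar": 0.01, "lidar": 0.02} yield weights of roughly 0.14, 0.57, and 0.29, respectively.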
  15. The method of claim 14, wherein the determining fusion weights for the multiple sensors of the station comprises:
    determining first fusion weights for the multiple sensors of the station based on the noise variance metrics of the multiple sensors;
    determining at least one of second fusion weights and third fusion weights for the multiple sensors of the station based on at least one of the weather condition and the lighting condition for the station; and
    determining the fusion weights for the multiple sensors of the station based on the first fusion weights and the at least one of the second fusion weights and the third fusion weights.
  16. The method of claim 15, wherein the determining at least one of a weather condition and a lighting condition for the station comprises:
    determining at least one of the weather condition and the lighting condition for the station based on image data from a camera sensor.
  17. The method of claim 16, wherein the determining at least one of the weather condition and the lighting condition for the station based on image data from a camera sensor comprises: determining at least one of the weather condition and the lighting condition for the station by analyzing a predefined number of image frames captured by the camera sensor.
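A sketch of claim 17's frame-based analysis for the lighting condition, assuming the frames arrive as grayscale NumPy arrays and using illustrative brightness thresholds:

    import numpy as np

    def lighting_from_frames(frames, dark=60.0, bright=180.0):
        # Classify the lighting condition from the mean gray level of a
        # predefined number of frames (frames: iterable of grayscale arrays).
        avg = float(np.mean([np.mean(f) for f in frames]))
        if avg < dark:
            return "dark"
        return "bright" if avg > bright else "normal"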
  18. The method of claim 14, further comprising:
    obtaining fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station; and
    adjusting the fusion weights for the multiple sensors of the station based on the obtained fusion weights for the multiple sensors of the multiple neighboring stations,
    wherein the performing fusion of information comprises performing fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.
  19. The method of claim 15, further comprising:
    obtaining at least one of second fusion weights and third fusion weights for the multiple sensors of multiple neighboring stations located within a predefined range of the station, which are determined based on at least one of the weather condition and the lighting condition for the multiple neighboring stations;
    adjusting the at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station based on the obtained at least one of the second fusion weights and the third fusion weights for the multiple sensors of the multiple neighboring stations; and
    obtaining the adjusted fusion weights for the multiple sensors of the station based on the first fusion weights and the adjusted at least one of the second fusion weights and the third fusion weights for the multiple sensors of the station,
    wherein the performing fusion of information comprises performing fusion of information obtained from the multiple sensors of the station based on the adjusted fusion weights.
  20. A computer system, comprising:
    one or more processors; and
    one or more storage devices storing computer-executable instructions that, when executed, cause the one or more processors to perform the operations of the method of one of claims 1-19.
  21. One or more computer readable storage media storing computer-executable instructions that, when executed, cause one or more processors to perform the operations of the method of one of claims 1-19.
  22. An apparatus for adjusting sensor fusion weights, comprising:
    a plurality of sensors for obtaining sensor data; and
    a processing system to execute instructions that cause the processing system to perform the operations of the method of one of claims 1-19.

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202080107583.1A CN116648600A (en) 2020-12-01 2020-12-01 Method and apparatus for tuning sensor fusion weights
PCT/CN2020/133093 WO2022115993A1 (en) 2020-12-01 2020-12-01 Method and apparatus for tuning sensor fusion weights
DE112020007428.5T DE112020007428T5 (en) 2020-12-01 2020-12-01 METHOD AND DEVICE FOR TUNING SENSOR FUSION WEIGHTS

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/133093 WO2022115993A1 (en) 2020-12-01 2020-12-01 Method and apparatus for tuning sensor fusion weights

Publications (1)

Publication Number Publication Date
WO2022115993A1 (en) 2022-06-09

Family

ID=81852901

Country Status (3)

Country Link
CN (1) CN116648600A (en)
DE (1) DE112020007428T5 (en)
WO (1) WO2022115993A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040130630A1 (en) * 2003-01-03 2004-07-08 Ostromek Timothy E. Method and system for real-time image fusion
CN105352535A (en) * 2015-09-29 2016-02-24 河海大学 Measurement method on the basis of multi-sensor date fusion
CN107703935A (en) * 2017-09-12 2018-02-16 安徽胜佳和电子科技有限公司 Multiple data weighting fusions carry out method, storage device and the mobile terminal of avoidance
CN108934028A (en) * 2018-07-05 2018-12-04 电子科技大学 A kind of more Bernoulli Jacob's filter distribution fusion methods
CN111428759A (en) * 2020-03-10 2020-07-17 北京新能源汽车技术创新中心有限公司 Data fusion method, electronic device and storage medium
CN112003891A (en) * 2020-07-16 2020-11-27 山东省网联智能车辆产业技术研究院有限公司 Multi-sensing data fusion method for intelligent networked vehicle controller

Also Published As

Publication number Publication date
DE112020007428T5 (en) 2023-06-29
CN116648600A (en) 2023-08-25

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20963842; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 202080107583.1; Country of ref document: CN)
122 Ep: pct application non-entry in european phase (Ref document number: 20963842; Country of ref document: EP; Kind code of ref document: A1)