CN116648600A - Method and apparatus for tuning sensor fusion weights
- Publication number: CN116648600A (application number CN202080107583.1A)
- Authority: CN (China)
- Prior art keywords: sensors, site, fusion, weights, weight
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01D21/02 — Measuring two or more variables by means not covered by a single other subclass
- G01D3/08 — Indicating or recording apparatus with provision for safeguarding the apparatus, e.g. against abnormal operation, against breakdown
- G01S13/865 — Combination of radar systems with lidar systems
- G01S13/867 — Combination of radar systems with cameras
- G01S13/91 — Radar or analogous systems specially adapted for traffic control
- G01S13/931 — Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
Abstract
The present disclosure provides a method for tuning sensor fusion weights for a site. The method comprises: determining one or more conditions related to the operational performance of a plurality of sensors of the site; determining fusion weights for the plurality of sensors of the site based on the one or more conditions; obtaining fusion weights of a plurality of sensors of a plurality of neighboring sites located within a predefined range of the site; adjusting the fusion weights of the plurality of sensors of the site based on the obtained fusion weights of the plurality of neighboring sites; and performing fusion of information obtained from the plurality of sensors of the site based on the adjusted fusion weights.
Description
Technical Field
Aspects of the present disclosure relate generally to sensor fusion, and more particularly to tuning fusion weights for fusing information from multiple sensors of a site.
Background
Intelligent transportation systems, such as traffic monitoring devices, autonomous vehicles, etc., rely on a variety of sensors for sensing the environment. For example, various operations may be performed based on data captured by the sensors, such as detecting and tracking vehicles, pedestrians, or other objects in the environment, speed detection, position detection, and the like.
Taking a vehicle as an example, the vehicle may be equipped with different types of sensors, such as cameras, LiDAR (light detection and ranging), radar, ultrasonic sensors, etc. Different types of sensors have their corresponding strengths and weaknesses. For example, cameras perceive color and texture information from the world and are adept at classifying objects; however, their detection range is limited and they perform poorly under limited lighting or adverse weather conditions. LiDAR provides accurate distance information, has a large detection range, is able to detect small objects, and works well at night, but it does not provide color information and its performance degrades during heavy rain. Radar provides accurate distance and speed information and works well in bad weather conditions, but has a fairly low resolution. To improve the performance of sensor-based measurements, data from different sensors may be fused, which is referred to as sensor fusion.
There is a need to improve the reliability and robustness of sensor fusion techniques.
Disclosure of Invention
According to an embodiment, a method for tuning sensor fusion weights of a site is provided. The method comprises: determining one or more conditions related to the operational performance of a plurality of sensors of the site; determining fusion weights for the plurality of sensors of the site based on the one or more conditions; obtaining fusion weights of a plurality of sensors of a plurality of neighboring sites located within a predefined range of the site; adjusting the fusion weights of the plurality of sensors of the site based on the obtained fusion weights of the plurality of neighboring sites; and performing fusion of information obtained from the plurality of sensors of the site based on the adjusted fusion weights.
According to an embodiment, a method for tuning sensor fusion weights of a site is provided. The method comprises: determining noise variance metrics for a plurality of sensors of the site; determining at least one of a weather condition and a lighting condition of the site; determining fusion weights for the plurality of sensors of the site based on the noise variance metrics and the at least one of the weather condition and the lighting condition; and performing fusion of information obtained from the plurality of sensors of the site based on the fusion weights.
According to an embodiment, a computer system is provided that includes one or more processors and one or more storage devices storing computer-executable instructions that, when executed, cause the one or more processors to perform the operations of the methods described above, as well as the operations of the methods provided in accordance with aspects of the present disclosure.
According to an embodiment, one or more computer-readable storage media are provided storing computer-executable instructions that, when executed, cause one or more processors to perform the operations of the methods described above, as well as the operations of the methods provided in accordance with aspects of the present disclosure.
According to an embodiment, an apparatus for tuning sensor fusion weights is provided, the apparatus comprising: a plurality of sensors for obtaining sensor data; and a processing system configured to execute instructions that cause the processing system to perform the operations of the methods described above, as well as the operations of the methods provided in accordance with aspects of the present disclosure.
By using the sensor fusion weight tuning methods of the present disclosure, the reliability and robustness of operations utilizing sensor fusion can be improved. Other advantages of the present disclosure will be explained in the following description.
Drawings
The disclosed aspects will hereinafter be described in conjunction with the appended drawings, which are provided to illustrate and not to limit the disclosed aspects.
Fig. 1 is a block diagram illustrating an exemplary apparatus according to aspects of the present disclosure.
Fig. 2 is a block diagram illustrating an exemplary sensor fusion method in accordance with aspects of the present disclosure.
Fig. 3 is a block diagram illustrating an exemplary apparatus according to aspects of the present disclosure.
Fig. 4 is a block diagram illustrating an exemplary apparatus according to aspects of the present disclosure.
Fig. 5 illustrates an exemplary scenario of sensor fusion weights for tuning a site in accordance with aspects of the present disclosure.
Fig. 6 illustrates an exemplary scenario of sensor fusion weights for tuning a site in accordance with aspects of the present disclosure.
Fig. 7 illustrates an exemplary scenario of sensor fusion weights for tuning a site in accordance with aspects of the present disclosure.
Fig. 8 is a flow chart illustrating an exemplary method for tuning sensor fusion weights for a site in accordance with aspects of the present disclosure.
Fig. 9 is a flow chart illustrating an exemplary method for tuning sensor fusion weights for a site in accordance with aspects of the present disclosure.
Fig. 10 is a block diagram illustrating an exemplary computing system 1000 in accordance with aspects of the disclosure.
Detailed Description
The present disclosure will now be discussed with reference to several exemplary embodiments. It should be understood that these embodiments are discussed only to enable those skilled in the art to better understand the embodiments of the present disclosure and thus to practice the embodiments of the present disclosure, and are not meant to imply any limitation on the scope of the present disclosure.
Fig. 1 is a block diagram illustrating an exemplary apparatus according to aspects of the present disclosure.
The apparatus 1 shown in fig. 1 may be a vehicle such as an autonomous vehicle, a stationary monitoring device such as a monitoring device installed along a road, or the like, or may be part of a vehicle, a monitoring device, or the like.
The apparatus 1 may be equipped with various sensors 10. In the example illustrated in fig. 1, the sensors 10 may include a camera system 110, a LiDAR system 120, and a radar system 130. It should be appreciated that the set of sensors 10 of the apparatus 1 may include other types of sensors in addition to the exemplary sensors 110-130 shown; for example, various other sensor resources such as sonar, ultrasonic, proximity, infrared, wheel speed, rain, etc. may be included in the set of sensors 10. On the other hand, the set of sensors 10 of the apparatus 1 need not include all three types of sensors 110-130; any combination of at least two of the sensors 110-130 may be equipped on the apparatus 1.
The apparatus 1 may comprise a processing system 20. The processing system 20 may be implemented in various ways, for example, the processing system 20 may include one or more processors and/or controllers that may execute software to perform various operations or functions, such as operations or functions in accordance with various aspects of the present disclosure.
Processing system 20 may receive sensor data from sensors 10, such as sensors 110-130, and perform various operations by analyzing the sensor data. In the example of fig. 1, the processing system 20 may include a condition detection module 210, a fusion weight tuning module 220, and a sensor fusion module 230. It is to be appreciated that modules 210-230 may be implemented in various manners, e.g., as software modules or functions executable by a processor and/or controller.
The condition detection module 210 may be configured to determine one or more conditions related to the operational performance of at least a portion of the plurality of sensors 10 of the apparatus 1.
The condition related to the operational performance of a sensor may be a noise variance metric of the sensor. The noise variance metric describes how reliable information from the sensor is at a point in time. For example, image noise may be used as a parameter to determine the noise variance metric of the camera sensor 110. Similarly, signal noise may be used as a parameter to determine the noise variance metric of the LiDAR sensor 120 or the radar sensor 130. As another example, given the exemplary operation of object detection, object detection confidence may be used as a parameter to determine the noise variance metric of a sensor 10. It should be appreciated that any method for determining a noise variance metric of the sensors 10 is applicable to embodiments of the present disclosure, and the present disclosure is not limited to a particular method for determining a noise variance metric.
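As a non-limiting illustration, a noise variance metric could be derived from recent detection confidences and mapped to a fusion weight by inverse-noise weighting. The Python sketch below is hypothetical; the disclosure does not prescribe this particular formula.

```python
import numpy as np

def noise_variance_metric(confidences):
    # Hypothetical metric: variance of recent per-detection confidence
    # scores plus a penalty for low mean confidence, so an unstable or
    # low-confidence sensor yields a higher (noisier) metric.
    c = np.asarray(confidences, dtype=float)
    return float(np.var(c) + (1.0 - c.mean()))

def weight_from_noise(metric, eps=1e-6):
    # Inverse-noise weighting: the higher the noise variance metric,
    # the lower the sensor's fusion weight.
    return 1.0 / (metric + eps)
```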
The condition related to the operational performance of the sensors may be the weather condition in which the sensors 10 of the apparatus are operating. The condition detection module 210 may determine weather conditions based on sensor data from some of the sensors 10. For example, the current weather condition may be determined based on image data captured by the camera system 110. In one embodiment, an image classification model may be used to determine the current weather condition by classifying the captured image as one of a plurality of weather conditions, which may include, for example, sunny, foggy, rainy, snowy, and the like. It should be understood that any method for determining weather conditions is applicable to embodiments of the present disclosure, and the present disclosure is not limited to a particular method for determining weather conditions.
The condition related to the operational performance of the sensors may be the lighting condition under which the sensors 10 of the apparatus are operating. The condition detection module 210 may determine the lighting condition based on sensor data from some of the sensors 10. For example, the current lighting condition may be determined based on image data captured by the camera system 110. In one embodiment, an image classification model may be used to determine the current lighting condition by classifying the captured image as one of a plurality of lighting conditions, which may include, for example, sunlight glare, normal lighting, low lighting, night time, and the like. In another embodiment, the lighting condition may be determined based on the intensity of the captured image. It should be understood that any method for determining lighting conditions is applicable to embodiments of the present disclosure, and the present disclosure is not limited to a particular method for determining lighting conditions.
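A minimal sketch of the intensity-based variant is shown below; the class names follow the conditions listed above, while the cut-off values are assumptions for illustration only.

```python
import numpy as np

def classify_lighting(gray_frame, cuts=(40.0, 110.0, 200.0)):
    # cuts are hypothetical mean-intensity thresholds on a 0-255 scale.
    mean_intensity = float(np.mean(gray_frame))
    if mean_intensity < cuts[0]:
        return "night time"
    if mean_intensity < cuts[1]:
        return "low lighting"
    if mean_intensity < cuts[2]:
        return "normal lighting"
    return "sunlight glare"
```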
The fusion weight tuning module 220 may be configured to determine fusion weights for respective sensors based on the conditions detected by the condition detection module 210.
For example, the fusion weight tuning module 220 may determine fusion weights for the sensors 110, 120, and 130 based on the noise variance metrics of the sensors 110, 120, and 130, respectively, and then the sensor fusion module 230 may perform fusion of information obtained from the sensors 110, 120, and 130 based on their fusion weights. Generally, the higher the noise variance metric of a given sensor, the lower its fusion weight. It should be appreciated that sensor fusion may occur between any two or more of the sensors 110, 120, 130 and other applicable sensors 10.
As another example, the fusion weight tuning module 220 may determine fusion weights for the sensors 110, 120, and 130, respectively, based on the weather conditions detected by the condition detection module 210, and then the sensor fusion module 230 may perform fusion of information obtained from the sensors 110, 120, and 130 based on their fusion weights. Generally, because cameras do not work well in poor weather, the fusion weight of the camera 110 may be reduced in bad weather conditions and increased in good weather conditions. For other sensors, fusion weight adjustments may be defined based on inherent sensor characteristics. For example, the radar sensor 130 generally works well in bad weather conditions, so its fusion weight may be increased in bad weather and decreased in good weather. The performance of the LiDAR sensor 120 degrades during heavy rain, so its fusion weight may be decreased during rain.
TABLE 1

|                     | Camera weight | LiDAR weight | Radar weight |
|---------------------|---------------|--------------|--------------|
| Weather condition 1 | Wc1           | Wl1          | Wr1          |
| Weather condition 2 | Wc2           | Wl2          | Wr2          |
| Weather condition 3 | Wc3           | Wl3          | Wr3          |
| Weather condition 4 | Wc4           | Wl4          | Wr4          |
An example of fusion weight settings for multiple sensors of the apparatus is shown in Table 1 above. Exemplary weather conditions 1-4 may be, for example, sunny, foggy, rainy, and snowy. The fusion weights for each of the sensors 110-130 under each of the weather conditions 1-4 may be defined in Table 1. For example, when weather condition 3 "rainy" is detected, the fusion weight tuning module 220 may set the fusion weights of the camera 110, LiDAR 120, and radar 130 to Wc3, Wl3, and Wr3, respectively. The fusion weights illustrated in Table 1 may be heuristically defined, e.g., defined experimentally or empirically and updated as appropriate. It should be understood that the fusion weights may be set to zero or non-zero values, and the present disclosure is not limited to particular fusion weight values. It should be appreciated that the fusion weights of the respective sensors under the respective weather conditions may be determined in any suitable manner. For example, an image classification model may be used to determine the fusion weights of particular sensors by classifying images from the camera 110, taking into account weather conditions.
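Such a table might be represented as a simple lookup, as in the sketch below; the numeric values are placeholders chosen only to reflect the qualitative trends described above (camera weight down and radar weight up in bad weather), not values from the disclosure.

```python
# Placeholder weight table mirroring the structure of Table 1.
FUSION_WEIGHTS_BY_WEATHER = {
    "sunny": {"camera": 1.0, "lidar": 1.0, "radar": 0.8},
    "foggy": {"camera": 0.3, "lidar": 0.8, "radar": 1.0},
    "rainy": {"camera": 0.5, "lidar": 0.6, "radar": 1.0},
    "snowy": {"camera": 0.4, "lidar": 0.6, "radar": 1.0},
}

def weights_for_weather(condition):
    # E.g. weights_for_weather("rainy") -> the row for weather condition 3.
    return FUSION_WEIGHTS_BY_WEATHER[condition]
```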
As another example, the fusion weight tuning module 220 may determine fusion weights for the sensors 110, 120, and 130, respectively, based on the lighting conditions or illumination levels detected by the condition detection module 210, and then the sensor fusion module 230 may perform fusion of information obtained from the sensors 110, 120, and 130 based on their fusion weights. Generally, because cameras do not work well under poor lighting, the fusion weight of the camera 110 may be reduced under poor lighting conditions and increased under good lighting conditions. Both the LiDAR sensor 120 and the radar sensor 130 work well under poor lighting conditions, so their fusion weights may be increased under poor lighting conditions and decreased under good lighting conditions.
TABLE 2

|                      | Camera weight | LiDAR weight | Radar weight |
|----------------------|---------------|--------------|--------------|
| Lighting condition 1 | Wc1           | Wl1          | Wr1          |
| Lighting condition 2 | Wc2           | Wl2          | Wr2          |
| Lighting condition 3 | Wc3           | Wl3          | Wr3          |
| Lighting condition 4 | Wc4           | Wl4          | Wr4          |
An example of fusion weight settings for multiple sensors is shown in Table 2 above. Exemplary lighting conditions 1-4 may be, for example, sunlight glare, normal lighting, low lighting, and night time. The fusion weights for each of the sensors 110-130 under each of the lighting conditions 1-4 may be defined in Table 2. For example, when lighting condition 4 "night time" is detected, the fusion weight tuning module 220 may set the fusion weights of the camera 110, LiDAR 120, and radar 130 to Wc4, Wl4, and Wr4, respectively. The fusion weights illustrated in Table 2 may be heuristically defined, e.g., defined experimentally or empirically and updated as appropriate. It should be understood that the fusion weights may be set to zero or non-zero values, and the present disclosure is not limited to particular fusion weight values. It should be appreciated that the fusion weights of the respective sensors under the respective lighting conditions may be determined in any suitable manner. For example, an image classification model may be used to determine the fusion weights of particular sensors by classifying images from the camera 110, taking into account lighting conditions.
Fig. 2 is a block diagram schematically illustrating a sensor fusion method according to aspects of the present disclosure.
Sensor fusion can generally be divided into three main categories: low-level sensor fusion (LLF), mid-level sensor fusion (MLF), and high-level sensor fusion (HLF). Panels A and B in fig. 2 show the HLF architecture and the LLF architecture, respectively.
Given the exemplary operation of object detection as shown in the left panel A, in HLF, object detection operations as shown in blocks 240A through 240C may be performed on sensor data captured by respective sensors, examples of which may be the camera 110, LiDAR 120, and radar 130 shown in fig. 1. Each of the object detection modules 240A-240C may generate an object list based on sensor data from its corresponding sensor. The object lists generated by modules 240A-240C may be provided to the sensor fusion module 230, which is also shown in fig. 1. The sensor fusion module 230 may fuse the object lists, where corresponding object lists obtained from different sensors, such as the sensors 110-130, may be given different weights at the time of fusion.
As shown in the right panel B, in LLF, raw data from sensors such as the sensors 110-130 are fused in the fusion module 230, where raw data obtained from different sensors may be given different weights at the time of fusion. The detection module 240 then performs object detection based on the fused data.
MLF is an abstraction over LLF. In MLF, features are extracted from the raw data of multiple sensors, such as the sensors 110-130, and the extracted features are fused by the fusion module 230, where the intermediate data, i.e., the extracted features, obtained from different sensors may be given different weights at the time of fusion. The detection module 240 then performs object detection based on the fused data.
It should be understood that various sensor fusion methods, such as the illustrated HLF, MLF, and LLF methods, are applicable to embodiments of the present disclosure, and the present disclosure is not limited to any particular sensor fusion algorithm or method.
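For concreteness, a weighted HLF-style combination of per-sensor estimates for one tracked object might look like the following sketch (a normalized weighted average; the disclosure does not mandate a specific fusion formula).

```python
import numpy as np

def fuse_estimates(estimates, weights):
    # estimates: one row per sensor (e.g. a 2-D object position from
    # camera, LiDAR, and radar); weights: per-sensor fusion weights.
    x = np.asarray(estimates, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                     # normalize the fusion weights
    return (w[:, None] * x).sum(axis=0)

# Camera, LiDAR, and radar estimates of the same object's position:
fused = fuse_estimates([[10.2, 3.1], [10.0, 3.0], [10.5, 2.9]],
                       weights=[0.5, 1.0, 0.8])
```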
Fig. 3 is a block diagram illustrating an exemplary apparatus according to aspects of the present disclosure.
The same or similar labels as represented in fig. 1 and 2 are used in fig. 3, and details of similar modules or blocks will not be repeated in the description of fig. 3.
In the example of fig. 3, the condition detection module 210 may include a weather condition detection module 2110, a lighting condition detection module 2120, a camera noise detection module 2130, a LiDAR noise detection module 2140, and a radar noise detection module 2150.
The weather condition detection module 2110 may determine the current weather condition based on images captured by the camera system 110. In one embodiment, the weather condition detection module 2110 may process a predefined number of image frames captured over a predefined period of time to determine the current weather based on the plurality of image frames. Using a predefined time period helps avoid rapid shifts of the fusion weights due to short-term, extreme weather measurements. In one embodiment, the weather condition detection module 2110 may be implemented as an image classification model that classifies a captured image as one of a plurality of weather conditions, such as sunny, foggy, rainy, snowy, and the like. It should be appreciated that any method for determining weather conditions is applicable to embodiments of the weather condition detection module 2110.
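One way to realize such smoothing is to keep a sliding window of per-frame classifications and report the majority label, as in the hypothetical sketch below (classify_frame stands in for the image classification model, and the window size is an assumption).

```python
from collections import Counter, deque

class SmoothedConditionDetector:
    def __init__(self, classify_frame, window=30):
        # classify_frame: assumed external per-frame classifier, e.g.
        # returning "sunny", "foggy", "rainy", or "snowy".
        self.classify_frame = classify_frame
        self.history = deque(maxlen=window)

    def update(self, frame):
        self.history.append(self.classify_frame(frame))
        # The majority label over the window is the reported condition,
        # damping short-term, extreme measurements.
        return Counter(self.history).most_common(1)[0][0]
```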
The lighting condition detection module 2120 may determine the current lighting condition based on images captured by the camera system 110. In one embodiment, the lighting condition detection module 2120 may process a predefined number of image frames captured within a predefined time period to determine the current lighting condition based on the plurality of image frames. Using a predefined time period helps avoid rapid shifts of the fusion weights due to short-term, extreme illumination measurements. In one embodiment, the lighting condition detection module 2120 may be implemented as an image classification model that classifies a captured image as one of a plurality of lighting conditions, such as sunlight glare, normal lighting, low lighting, night time, and the like. It should be appreciated that any method for determining lighting conditions is applicable to embodiments of the lighting condition detection module 2120.
In one embodiment, the weather condition detection module 2110 and the lighting condition detection module 2120 may be implemented as a single image classification model that classifies a captured image into one of a plurality of weather conditions and one of a plurality of lighting conditions. It should be appreciated that any method for jointly determining weather conditions and lighting conditions is suitable for implementing the weather condition detection module 2110 and the lighting condition detection module 2120 as one module.
Camera noise detection module 2130, liDAR noise detection module 2140, and radar noise detection module 2150 may determine current noise variance metrics of camera system 110, liDAR system 120, and radar system 130, respectively, based on parameters of the sensors. For example, the parameter may be signal noise of the sensors 110-130. As another example, given the exemplary operation of object detection, the parameter may be object detection confidence. It should be appreciated that any method for determining a noise variance measure of a sensor is applicable to aspects of the present disclosure.
In the example of fig. 3, the processing system 20 of the apparatus 1 includes object detection modules 240A to 240C. The object detection modules 240A to 240C perform object detection operations on sensor data captured by respective sensors such as the camera 110, the LiDAR 120, and the radar 130, respectively, to generate an object list. It should be appreciated that in examples where object detection confidence is used as a parameter for determining a noise variance metric, the operation of the noise detection modules 2130-2150 may be associated with the operation of the object detection modules 240A-240C. It should be appreciated that in other embodiments, the modules 240A-240C may perform other operations or functions, for example, the operations performed by the modules 240A-240C may be position detection, velocity detection, tracking, etc., and thus, the corresponding operational confidence may be taken as a parameter for determining the noise variance measure by the noise detection modules 2130-2150.
In one embodiment, the fusion weight tuning module 220 may determine the fusion weights of the sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 and the weather condition, which are detected by the camera noise detection module 2130, the LiDAR noise detection module 2140, and the weather condition detection module 2110, respectively. For example, the fusion weights of the sensors 110 and 120 determined based on their noise variance metrics may be adjusted using the fusion weights of the sensors 110 and 120 determined based on the current weather, to obtain adjusted fusion weights of the sensors 110 and 120. In one example, the adjusted fusion weight of a sensor may be the product of that sensor's fusion weights associated with the respective conditions. In this example, the adjusted fusion weight of the camera 110 may be the product of the fusion weight of the camera 110 determined based on its noise variance metric and the fusion weight of the camera 110 determined based on the current weather, and the adjusted fusion weight of the LiDAR 120 may be the product of the fusion weight of the LiDAR 120 determined based on its noise variance metric and the fusion weight of the LiDAR 120 determined based on the current weather. As another example, the adjusted fusion weight of a sensor may be the sum of that sensor's fusion weights associated with the respective conditions. In this example, the adjusted fusion weight of the camera 110 may be the sum of the fusion weight of the camera 110 determined based on its noise variance metric and the fusion weight of the camera 110 determined based on the current weather, and the adjusted fusion weight of the LiDAR 120 may be the sum of the fusion weight of the LiDAR 120 determined based on its noise variance metric and the fusion weight of the LiDAR 120 determined based on the current weather.
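The product and summation options might be expressed as in the sketch below; which combination rule (and which weight values) to use is a design choice, and the numbers here are illustrative only.

```python
import math

def combine_weights(condition_weights, mode="product"):
    # condition_weights: a sensor's fusion weights from the respective
    # conditions, e.g. [noise-based weight, weather-based weight].
    if mode == "product":
        return math.prod(condition_weights)
    return sum(condition_weights)

camera_weight = combine_weights([0.9, 0.3])         # product: 0.27
lidar_weight = combine_weights([0.8, 0.7], "sum")   # sum: 1.5
```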
The sensor fusion module 230 may fuse the object lists received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220. Adjusting the fusion weights determined based on the noise variance metrics with the fusion weights determined based on the current weather improves the reliability of sensor fusion, as some potential problems associated with using only the noise variance metrics for fusion weight tuning may be alleviated. For example, in a foggy condition, camera data may be used to detect objects at close range with high confidence, while the impairment of camera-based long-range tracking ability goes unnoticed. By taking the weather condition into account as additional information beyond the noise variance metrics when determining the fusion weights of the sensors, the fusion weight of the camera 110 can be reduced in time and the fusion weight of the LiDAR 120 can be increased to mitigate potential problems in such situations.
In one embodiment, the fusion weight tuning module 220 may determine the fusion weights of the sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 and the lighting condition, which are detected by the camera noise detection module 2130, the LiDAR noise detection module 2140, and the lighting condition detection module 2120, respectively. For example, the fusion weights of the sensors 110 and 120 determined based on their noise variance metrics may be adjusted using the fusion weights of the sensors 110 and 120 determined based on the current illumination level, to obtain adjusted fusion weights of the sensors 110 and 120. In one example, the adjusted fusion weight of the camera 110 may be the product of the fusion weight of the camera 110 determined based on its noise variance metric and the fusion weight of the camera 110 determined based on the current lighting condition, and the adjusted fusion weight of the LiDAR 120 may be the product of the fusion weight of the LiDAR 120 determined based on its noise variance metric and the fusion weight of the LiDAR 120 determined based on the current lighting condition. As another example, the adjusted fusion weight of the camera 110 may be the sum of the fusion weight of the camera 110 determined based on its noise variance metric and the fusion weight of the camera 110 determined based on the current lighting condition, and the adjusted fusion weight of the LiDAR 120 may be the sum of the fusion weight of the LiDAR 120 determined based on its noise variance metric and the fusion weight of the LiDAR 120 determined based on the current lighting condition. The sensor fusion module 230 may fuse the object lists received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220.
In one embodiment, the fusion weight tuning module 220 may determine the fusion weights of the sensors 110 and 120 based on the noise variance metrics of the sensors 110 and 120 and the weather and lighting conditions, which are detected by the camera noise detection module 2130, the LiDAR noise detection module 2140, the weather condition detection module 2110, and the lighting condition detection module 2120, respectively. For example, the fusion weights of the sensors 110 and 120 determined based on their noise variance metrics may be adjusted using the fusion weights of the sensors 110 and 120 determined based on the current weather and the fusion weights of the sensors 110 and 120 determined based on the current lighting level, to obtain adjusted fusion weights of the sensors 110 and 120. In one example, the adjusted fusion weight of the camera 110 may be the product of the fusion weight of the camera 110 determined based on its noise variance metric, the fusion weight of the camera 110 determined based on the current weather, and the fusion weight of the camera 110 determined based on the current lighting condition, and the adjusted fusion weight of the LiDAR 120 may be the product of the fusion weight of the LiDAR 120 determined based on its noise variance metric, the fusion weight of the LiDAR 120 determined based on the current weather, and the fusion weight of the LiDAR 120 determined based on the current lighting condition. As another example, the adjusted fusion weight of the camera 110 may be the sum of the fusion weight of the camera 110 determined based on its noise variance metric, the fusion weight of the camera 110 determined based on the current weather, and the fusion weight of the camera 110 determined based on the current lighting condition, and the adjusted fusion weight of the LiDAR 120 may be the sum of the fusion weight of the LiDAR 120 determined based on its noise variance metric, the fusion weight of the LiDAR 120 determined based on the current weather, and the fusion weight of the LiDAR 120 determined based on the current lighting condition. The sensor fusion module 230 may fuse the object lists received from the object detection modules 240A-240B based on the fusion weights received from the fusion weight tuning module 220.
It should be appreciated that the fusion weight tuning module 220 may determine the fusion weights of any two or more of the sensors 110, 120, and 130 based on the noise variance metrics of the two or more sensors and at least one of the weather condition and the lighting condition. For example, at least one of the fusion weights of the two or more sensors determined based on the current weather and the fusion weights of the two or more sensors determined based on the current lighting level may be utilized to adjust the fusion weights of the two or more sensors determined based on their noise variance metrics, in order to obtain adjusted fusion weights of the two or more sensors. The sensor fusion module 230 may fuse the object lists received from the corresponding two or more of the object detection modules 240A-240C based on the fusion weights received from the fusion weight tuning module 220.
Fig. 4 is a block diagram illustrating an exemplary apparatus according to aspects of the present disclosure.
The structure shown in fig. 4 differs from that of fig. 3 in that the object detection module 240 operates after the sensor fusion module 230. The sensor fusion module 230 performs LLF on raw data captured by different sensors, such as the sensors 110-130, or performs MLF on extracted features obtained from different sensors, as explained above with reference to fig. 2. The detection module 240 then performs object detection based on the fused data. Other portions of fig. 4 are similar to those of fig. 3, and the details are not repeated here.
As explained, the information to be fused at the sensor fusion module 230 may be target measurements such as detected objects, velocities, and locations, raw data captured by different sensors, or intermediate data such as extracted features. The different types of data to be fused at the sensor fusion module 230 may be generally referred to as information obtained from the sensors.
Fig. 5 illustrates an exemplary scenario of sensor fusion weights for tuning a site in accordance with aspects of the present disclosure.
In the example of fig. 5, a plurality of automobiles 510-560 are traveling on a roadway. Each of the automobiles 510-560 may be or may include the apparatus 1 shown in figs. 1, 3, and 4, and may be referred to as a site, a mobile site, or the like.
Taking the site 510 as an example, the site 510 may determine one or more conditions related to the operational performance of its multiple sensors (such as the camera 110, LiDAR 120, radar 130, etc.). As indicated above, the one or more conditions may be the noise variance metrics of the respective sensors 10, weather conditions, lighting conditions, etc. The site 510 may then determine fusion weights for the multiple sensors based on the one or more conditions. It should be appreciated that the multiple sensors may be any two or more of the sensors 110-130 as well as other possible sensors provided on the site 510. Due to temporary, extreme measurements from the sensors, particularly the camera 110, the fusion weights that the site 510 determines based on its own sensor data may not always be reliable or accurate.
To mitigate this, the site 510 may obtain fusion weights of a plurality of sensors of a plurality of neighboring sites that are located within a predefined range of the site 510, and adjust the fusion weights of the plurality of sensors of the site 510 based on the obtained fusion weights of the plurality of neighboring sites. It should be appreciated that the site 510 may communicate wirelessly with the neighboring sites via various networks such as a Wireless Local Area Network (WLAN), a 4G wireless communication network, a 5G wireless communication network, a future evolution network (e.g., 6G), etc. For simplicity, the communication network is not shown in fig. 5.
For example, the neighboring sites may be at least a portion of the sites 520-560 shown in fig. 5, and may be other sites as appropriate. The predefined range of the site 510 may be a predefined distance from the site 510. For example, neighboring sites located within a predefined distance from the site 510 are likely to have similar conditions, such as weather and lighting conditions, and may provide their current fusion weights to the site 510 for it to adjust its fusion weights. The predefined range of the site 510 may alternatively be a predefined number of neighboring sites around the site 510. For example, the site 510 may receive the fusion weights from a predefined number of neighboring sites around it. The predefined range of the site 510 may also be a combination of the predefined distance from the site 510 and the predefined number of neighboring sites around the site 510.
For simplicity, in the following description, sites 520-560 are referred to as neighboring sites, and sensors 110-120 are referred to as multiple sensors.
In one embodiment, the fusion weight tuning module 220 of the site 510 may perform the function of adjusting the fusion weights of the plurality of sensors 110-120 of the site 510 based on the obtained fusion weights of the plurality of sensors 110-120 of the plurality of neighboring sites 520-560. The sensor fusion module 230 of the site 510 may then perform fusion of information obtained from the plurality of sensors 110-120 of the site 510 based on the adjusted fusion weights.
In one embodiment, the site 510 (and in particular, its fusion weight tuning module 220) may adjust the fusion weight of its sensor 110 by averaging the fusion weight of its sensor 110 with the obtained fusion weights of the sensors 110 of the plurality of neighboring sites 520-560. Similarly, the site 510 may adjust the fusion weight of its sensor 120 by averaging the fusion weight of its sensor 120 with the obtained fusion weights of the sensors 120 of the plurality of neighboring sites 520-560.
In one embodiment, the site 510 may adjust the fusion weight of its sensor 110 or 120 by taking a weighted average of the fusion weights of the sensors 110 or 120 of the multiple sites 510-560. When calculating the weighted-average fusion weight of a sensor 110 or 120 across the multiple sites 510-560, the calculation weight applied to the fusion weight from a neighboring site 520-560 may be inversely proportional to the distance between that neighboring site and the site 510.
In one embodiment, the site 510 may discard outliers among the fusion weights of the sensors 110 or 120 of the multiple sites 510-560 and then adjust the fusion weight of its sensor 110 or 120 by averaging or weighted averaging the remaining fusion weights. The outliers may be determined in any suitable manner. For example, the outliers may be determined based on a predefined variance range, where a value among the multiple fusion weight values of the multiple sites 510-560 is taken as an outlier if its deviation from the mean is outside the variance range; otherwise, the value is taken as a normal value.
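The outlier filter and the distance-weighted average might be combined as in the following sketch; the variance range and the exact inverse-distance formula are assumptions for illustration, not values fixed by the disclosure.

```python
import numpy as np

def adjust_with_neighbors(own_weight, neighbor_weights, neighbor_distances,
                          variance_range=0.2):
    # Pool the site's own weight (at distance 0) with the neighbors'.
    w = np.array([own_weight] + list(neighbor_weights), dtype=float)
    d = np.array([0.0] + list(neighbor_distances), dtype=float)
    # Discard values deviating from the mean by more than the range.
    keep = np.abs(w - w.mean()) <= variance_range
    if not keep.any():
        keep[:] = True  # fall back to all values if everything deviates
    w, d = w[keep], d[keep]
    # Calculation weights inversely proportional to distance.
    calc = 1.0 / (1.0 + d)
    return float((calc * w).sum() / calc.sum())
```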
In one embodiment, if the site 510 determines that the fusion weight of its sensor 110 or 120 is a normal value, the site may treat that fusion weight as its adjusted fusion weight; otherwise, the site may discard the outlier and obtain the adjusted fusion weight of the sensor 110 or 120 based on the fusion weights of the sensors 110 or 120 of the multiple neighboring sites 520-560.
In one embodiment, the site 510 may obtain the adjusted fusion weight of its sensor 110 or 120 by voting among the fusion weights of the sensors 110 or 120 of the multiple sites 510-560. For example, if most of the fusion weights of the sensors 110 or 120 of the multiple sites 510-560 indicate the same value, the site 510 may treat that value as the adjusted fusion weight of its sensor 110 or 120.
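A voting variant could bucket near-equal weight values and adopt the majority, as sketched below; the rounding step is an assumption, since the disclosure does not specify how votes are counted.

```python
from collections import Counter

def vote_weight(own_weight, neighbor_weights, ndigits=1):
    votes = [round(w, ndigits) for w in [own_weight] + list(neighbor_weights)]
    value, count = Counter(votes).most_common(1)[0]
    # Adopt the majority value; keep the site's own weight otherwise.
    return value if count > len(votes) / 2 else own_weight
```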
As described above with reference to fig. 3, the fusion weight tuning module 220 of the site 510 may determine the fusion weight of its sensor 110 or 120 based on the noise variance metrics of its sensors 110 and 120 and the weather and/or lighting conditions. In one embodiment, the site 510 may obtain the fusion weights of the sensors 110 or 120 of the plurality of neighboring sites 520-560, the fusion weights being determined based on the noise variance conditions and the weather and/or lighting conditions of the plurality of neighboring sites 520-560. The site 510 may then adjust the fusion weight of its sensor 110 or 120 based on the obtained fusion weights of the sensors 110 or 120 of the plurality of neighboring sites 520-560.
As described above with reference to fig. 3, the fusion weight tuning module 220 of the site 510 may determine a first fusion weight of the sensor 110 or 120 based on the noise variance metrics of its sensors 110 and 120, and determine a second fusion weight and/or a third fusion weight of its sensor 110 or 120 based on the weather and/or lighting conditions. In one embodiment, the site 510 may obtain the second fusion weights and/or the third fusion weights of the sensors 110 or 120 of the plurality of neighboring sites 520-560, those weights being determined based on the weather and/or lighting conditions of the plurality of neighboring sites 520-560. The site 510 may adjust the second and/or third fusion weights of its sensor 110 or 120 based on the obtained second and/or third fusion weights of the plurality of neighboring sites 520-560. The site 510 may then obtain the adjusted fusion weight of its sensor 110 or 120 based on the first fusion weight it determined itself and the adjusted second and/or third fusion weights of its sensor 110 or 120.
Fig. 6 illustrates an exemplary scenario of sensor fusion weights for tuning a site in accordance with aspects of the present disclosure.
In the example of FIG. 6, a plurality of monitoring devices 610-660 are disposed along the roadside. Each of the devices 610-660 may be or may include the apparatus 1 shown in fig. 1, 3, and 4, and may be referred to as a site, a fixed site, or the like.
For aspects of the present disclosure, the structure and function of the monitoring devices 610-660 are similar to those of the automobiles 510-560 shown in fig. 5. Taking the device 610 as an example, its operation for adjusting its sensor fusion weights based on information from neighboring sites such as the sites 620-660 is similar to that of the automobile 510 as detailed above, so those details are not repeated.
In one embodiment, since the devices 610-660 are fixed sites, the plurality of neighboring sites located within the predefined range of the site 610 may be a predefined set of neighboring sites related to the site 610. For example, the predefined set of neighboring sites related to the site 610 may be configured when the sites 610-660 are deployed.
Fig. 7 illustrates an exemplary scenario of sensor fusion weights for tuning a site in accordance with aspects of the present disclosure.
In the example of fig. 7, the monitoring devices 710-760 and the automobiles 770-790 may each be or include the apparatus 1 shown in figs. 1, 3, and 4, and may be referred to as sites.
Each of the monitoring devices 710-760 and automobiles 770-790 may operate in the same manner as the sites 510 and 610 described above with reference to figs. 5 and 6, and the neighboring sites of a particular site may include both the movable sites 770-790 and the stationary sites 710-760. Taking the site 770 as an example, the neighboring sites within its predefined range may include the devices 710-760 and the automobiles 780-790.
Although the operations of tuning sensor fusion weights are described in the examples of figs. 5-7 as being performed by a site, such as the site 510, it should be understood that a network device may perform the operations on behalf of the sites. For example, the network device may obtain relevant data from sites such as the sites 510-560 and/or the sites 610-660, and calculate and provide adjusted fusion weights for each of the sites 510-560.
Fig. 8 is a flow chart illustrating an exemplary method for tuning sensor fusion weights for a site in accordance with aspects of the present disclosure.
At step 810, one or more conditions related to the operational performance of a plurality of sensors of a site may be determined.
At step 820, fusion weights for the plurality of sensors of the site may be determined based on the one or more conditions.
At step 830, fusion weights for a plurality of sensors of a plurality of neighboring sites that are within a predefined range of the site may be obtained.
At step 840, the fusion weights of the plurality of sensors of the site may be adjusted based on the obtained fusion weights of the plurality of neighboring sites.
At step 850, fusion of information obtained from multiple sensors of the site may be performed based on the adjusted fusion weights.
In one embodiment, the adjusted fusion weight for each of the plurality of sensors of the site may be obtained by averaging or weighted averaging the determined fusion weight of that sensor of the site and the obtained fusion weights of the corresponding sensors of the plurality of neighboring sites.
In one embodiment, outliers among the determined fusion weights of the plurality of sensors of the site and the obtained fusion weights of the plurality of sensors of the plurality of neighboring sites may be discarded, and the adjusted fusion weight for each of the plurality of sensors of the site may be obtained by averaging or weighted averaging the remaining fusion weights.
In one embodiment, the adjusted fusion weight for each of the plurality of sensors of the site may be obtained by voting among the determined fusion weight of that sensor of the site and the obtained fusion weights of the corresponding sensors of the plurality of neighboring sites.
In one embodiment, the determined fusion weight for each of the plurality of sensors of the site may be classified as a normal value or an outlier based on the obtained fusion weights of the sensors of the plurality of neighboring sites. If the determined fusion weight of a sensor of the site is determined to be a normal value, the determined fusion weight may be taken as its adjusted fusion weight; and if the determined fusion weight of a sensor of the site is determined to be an outlier, the adjusted fusion weight of that sensor may be obtained based on at least a portion of the obtained fusion weights of the sensors of the plurality of neighboring sites.
In one embodiment, the plurality of neighboring sites that are within a predefined range of the site are at least one of: a plurality of neighboring sites located within a predefined distance from the site; a predefined number of neighboring sites around the site; or a predefined set of neighboring sites related to the site.
In one embodiment, the one or more conditions may include one or more of a weather condition, a lighting condition, and a noise variance condition for each of the plurality of sensors. The noise variance condition may include a noise variance metric determined from a parameter of each of the plurality of sensors.
In one embodiment, the one or more conditions may include a noise variance condition and at least one of a weather condition and a lighting condition.
In one embodiment, the fusion weights of the plurality of sensors of the site may be determined based on the noise variance condition and at least one of the weather condition and the lighting condition. And the fusion weights of the plurality of sensors of the site may be adjusted based on the fusion weights of the plurality of neighboring sites, which are determined based on the noise variance conditions of the plurality of neighboring sites and at least one of their weather conditions and lighting conditions.
In one embodiment, a first fusion weight of the plurality of sensors of the site may be determined based on the noise variance condition, and at least one of a second fusion weight and a third fusion weight of the plurality of sensors of the site may be determined based on at least one of the weather condition and the lighting condition. The at least one of the second fusion weight and the third fusion weight of the plurality of sensors of the site may be adjusted based on at least one of the second fusion weights and the third fusion weights of the plurality of sensors of the plurality of neighboring sites, which are determined based on at least one of the weather conditions and lighting conditions of the plurality of neighboring sites. And the adjusted fusion weights of the plurality of sensors of the site may be obtained based on the first fusion weight of the plurality of sensors of the site and the adjusted at least one of the second fusion weight and the third fusion weight of the plurality of sensors of the site.
In one implementation, at least one of weather conditions and lighting conditions may be determined based on image data from a camera sensor.
In one embodiment, the plurality of sensors may include at least two of a camera sensor, a radar sensor, and a LiDAR sensor. In one embodiment, the plurality of sensors may include a camera sensor and a radar sensor. In one embodiment, the plurality of sensors may include a camera sensor and a LiDAR sensor. In one embodiment, the plurality of sensors may include a camera sensor, a radar sensor, and a LiDAR sensor.
In one embodiment, each of the site and the plurality of neighboring sites is one of a mobile site, such as a vehicle, and a stationary site, such as a monitoring device mounted on a pole along a roadway.
Fig. 9 is a flow chart illustrating an exemplary method for tuning sensor fusion weights for a site in accordance with aspects of the present disclosure.
At step 910, noise variance metrics for a plurality of sensors of a site may be determined.
At step 920, at least one of weather conditions and lighting conditions of the site may be determined.
At step 930, fusion weights for a plurality of sensors of the site may be determined based on the noise variance metric and at least one of the weather condition and the lighting condition.
At step 940, fusion of information obtained from the plurality of sensors of the site may be performed based on the fusion weights.
In one embodiment, a first fusion weight for the plurality of sensors of the site may be determined based on the noise variance metrics of the plurality of sensors. At least one of a second fusion weight and a third fusion weight of the plurality of sensors of the site may be determined based on at least one of the weather condition and the lighting condition of the site. And the fusion weights for the plurality of sensors of the site may be determined based on the first fusion weight and the at least one of the second fusion weight and the third fusion weight.
In one implementation, at least one of weather conditions and lighting conditions of the site may be determined based on image data from the camera sensor.
In one embodiment, at least one of weather conditions and lighting conditions of the site may be determined by analyzing a predefined number of image frames captured by the camera sensor.
In one embodiment, fusion weights for a plurality of sensors of a plurality of neighboring sites that are within a predefined range of the site may be obtained. The fusion weights of the plurality of sensors of the site may be adjusted based on the obtained fusion weights of the plurality of sensors of the plurality of neighboring sites. Fusion of information obtained from multiple sensors of a site may be performed based on the adjusted fusion weights.
In one embodiment, at least one of the second fusion weights and the third fusion weights of a plurality of sensors of a plurality of neighboring sites located within a predefined range of the site may be obtained, where the at least one of the second fusion weights and the third fusion weights of the plurality of sensors of the plurality of neighboring sites may be determined based on at least one of the weather conditions and lighting conditions of the plurality of neighboring sites. At least one of the second fusion weight and the third fusion weight of the plurality of sensors of the site may be adjusted based on the obtained at least one of the second fusion weights and the third fusion weights of the plurality of sensors of the plurality of neighboring sites. And the adjusted fusion weights of the plurality of sensors of the site may be obtained based on the first fusion weight of the plurality of sensors of the site and the adjusted at least one of the second fusion weight and the third fusion weight of the plurality of sensors of the site.
FIG. 10 is a block diagram illustrating an exemplary computing system in accordance with aspects of the present disclosure.
The computing system 1000 may include at least one processor 1010. The computing system 1000 may also include at least one storage device 1020. It should be appreciated that the computing system 1000 may be configured to implement the processing system 20 as shown in fig. 1, 3, and 4.
The storage device 1020 may store computer-executable instructions that, when executed, cause the processor 1010 to: determine one or more conditions related to operational performance of a plurality of sensors of the site; determine fusion weights for the plurality of sensors of the site based on the one or more conditions; obtain fusion weights of a plurality of sensors of a plurality of neighboring sites located within a predefined range of the site; adjust the fusion weights of the plurality of sensors of the site based on the obtained fusion weights of the plurality of neighboring sites; and perform fusion of information obtained from the plurality of sensors of the site based on the adjusted fusion weights.
The storage device 1020 may also store computer-executable instructions that, when executed, cause the processor 1010 to: determine noise variance metrics for a plurality of sensors of the site; determine at least one of weather conditions and lighting conditions of the site; determine fusion weights for the plurality of sensors of the site based on the noise variance metrics and the at least one of the weather conditions and the lighting conditions; and perform fusion of information obtained from the plurality of sensors of the site based on the fusion weights.
It should be appreciated that the storage device 1020 may store computer executable instructions that, when executed, cause the processor 1010 to perform any operations according to embodiments of the present disclosure as described in connection with fig. 1-9.
Embodiments of the present disclosure may be embodied in a computer-readable medium, such as a non-transitory computer-readable medium. The non-transitory computer-readable medium may include instructions that, when executed, cause one or more processors to perform any operations according to embodiments of the present disclosure as described in connection with fig. 1-10.
Embodiments of the present disclosure may be embodied as a computer program product comprising instructions that, when executed, cause one or more processors to perform any operations in accordance with embodiments of the present disclosure as described in connection with fig. 1-10.
It should be understood that all operations in the above-described methods are merely exemplary, and that the present disclosure is not limited to these operations or to their order, but should be construed to cover all other equivalents under the same or similar concepts.
It should also be understood that all of the modules in the apparatus described above may be implemented in a variety of ways. These modules may be implemented as hardware, software, or a combination thereof. Furthermore, any of these modules may be functionally further divided into sub-modules or may be combined together.
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Accordingly, the claims are not intended to be limited to the aspects shown herein. All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference.
Claims (22)
1. A method for tuning sensor fusion weights of a site, the method comprising:
determining one or more conditions related to operational performance of a plurality of sensors of the site;
determining fusion weights for the plurality of sensors of the site based on the one or more conditions;
obtaining fusion weights for the plurality of sensors of a plurality of neighboring sites located within a predefined range of the site;
adjusting the fusion weights of the plurality of sensors of the site based on the obtained fusion weights of the plurality of neighboring sites; and
performing fusion of information obtained from the plurality of sensors of the site based on the adjusted fusion weights.
2. The method of claim 1, wherein adjusting the fusion weights of the plurality of sensors of the site comprises:
the adjusted fusion weight for each of the plurality of sensors of the site is obtained by averaging or weighted averaging the determined fusion weights for the sensors of the site and the obtained fusion weights for the sensors of the plurality of neighboring sites.
3. The method of claim 1, wherein adjusting the fusion weights of the plurality of sensors of the site comprises:
discarding outliers from among the determined fusion weights of the plurality of sensors of the site and the obtained fusion weights of the plurality of sensors of the plurality of neighboring sites; and
obtaining the adjusted fusion weight for each sensor of the plurality of sensors of the site by averaging or weighted averaging the remaining fusion weights of the sensor.
4. The method of claim 1, wherein adjusting the fusion weights of the plurality of sensors of the site comprises:
the adjusted fusion weight for each of the plurality of sensors of the site is obtained by voting between the determined fusion weights for the sensors of the site and the obtained fusion weights for the sensors of the plurality of neighboring sites.
5. The method of claim 1, wherein adjusting the fusion weights of the plurality of sensors of the site comprises:
determining, based on the obtained fusion weights of the corresponding sensors of the plurality of neighboring sites, whether the determined fusion weight of each sensor of the plurality of sensors of the site is a normal value or an outlier;
if the determined fusion weight of the sensor of the site is determined to be a normal value, taking the determined fusion weight of the sensor of the site as its adjusted fusion weight; and
if the determined fusion weight of the sensor of the site is determined to be an outlier, obtaining the adjusted fusion weight of the sensor of the site based on at least a portion of the obtained fusion weights of the corresponding sensors of the plurality of neighboring sites.
6. The method of claim 1, wherein the plurality of neighboring sites that are within the predefined range of the site are at least one of:
a plurality of neighboring sites located within a predefined distance from the site;
a predefined number of neighboring sites around the site; or
a predefined set of neighboring sites related to the site.
7. The method of claim 1, wherein the one or more conditions comprise one or more of a weather condition, a lighting condition, and a noise variance condition of each of the plurality of sensors.
8. The method of claim 7, wherein the one or more conditions include the noise variance condition and at least one of the weather condition and the lighting condition.
9. The method of claim 7, wherein
determining the fusion weights for the plurality of sensors of the site comprises: determining the fusion weights for the plurality of sensors of the site based on the noise variance condition and at least one of the weather condition and the lighting condition; and
adjusting the fusion weights of the plurality of sensors of the site comprises: adjusting the fusion weights of the plurality of sensors of the site based on the fusion weights of the plurality of neighboring sites, the fusion weights of the plurality of neighboring sites being determined based on the noise variance condition and at least one of the weather condition and the lighting condition of the plurality of neighboring sites.
10. The method of claim 7, wherein
determining the fusion weights for the plurality of sensors of the site comprises: determining first fusion weights for the plurality of sensors of the site based on the noise variance condition, and determining at least one of second fusion weights and third fusion weights for the plurality of sensors of the site based on at least one of the weather condition and the lighting condition; and
adjusting the fusion weights of the plurality of sensors of the site comprises: adjusting the at least one of the second fusion weights and the third fusion weights of the plurality of sensors of the site based on at least one of second fusion weights and third fusion weights of the plurality of sensors of the plurality of neighboring sites, the second fusion weights and the third fusion weights of the plurality of sensors of the plurality of neighboring sites being determined based on at least one of the weather conditions and the lighting conditions of the plurality of neighboring sites, and obtaining the adjusted fusion weights of the plurality of sensors of the site based on the first fusion weights of the plurality of sensors of the site and the adjusted at least one of the second fusion weights and the third fusion weights of the plurality of sensors of the site.
11. The method of claim 1, wherein determining the one or more conditions comprises: determining at least one of a weather condition and a lighting condition based on image data from a camera sensor.
12. The method of claim 1, wherein the plurality of sensors comprises at least two of a camera sensor, a radar sensor, and a LiDAR sensor.
13. The method of claim 1, wherein each of the site and the plurality of neighboring sites is one of a mobile site and a stationary site.
14. A method for tuning sensor fusion weights of a site, comprising:
determining noise variance metrics for a plurality of sensors of the site;
determining at least one of weather conditions and lighting conditions for the site;
determining fusion weights for the plurality of sensors of the site based on the noise variance metrics and the at least one of the weather conditions and the lighting conditions; and
performing fusion of information obtained from the plurality of sensors of the site based on the fusion weights.
15. The method of claim 14, wherein determining the fusion weights for the plurality of sensors of the site comprises:
determining first fusion weights for the plurality of sensors of the site based on the noise variance metrics of the plurality of sensors;
determining at least one of second fusion weights and third fusion weights for the plurality of sensors of the site based on the at least one of the weather conditions and the lighting conditions of the site; and
determining the fusion weights for the plurality of sensors of the site based on the first fusion weights and the at least one of the second fusion weights and the third fusion weights.
16. The method of claim 15, wherein determining the at least one of the weather conditions and the lighting conditions of the site comprises:
at least one of the weather condition and the lighting condition of the site is determined based on image data from a camera sensor.
17. The method of claim 16, wherein determining at least one of the weather condition and the lighting condition of the site based on image data from a camera sensor comprises: at least one of the weather condition and the lighting condition of the site is determined by analyzing a predefined number of image frames captured by the camera sensor.
18. The method of claim 14, further comprising:
obtaining fusion weights of a plurality of sensors of a plurality of adjacent sites located within a predefined range of the site; and
adjusting the fusion weights of the plurality of sensors of the site based on the obtained fusion weights of the plurality of sensors of the plurality of neighboring sites,
wherein performing the fusion of information comprises: performing fusion of information obtained from the plurality of sensors of the site based on the adjusted fusion weights.
19. The method of claim 15, further comprising:
obtaining at least one of second fusion weights and third fusion weights of the plurality of sensors of a plurality of neighboring sites located within a predefined range of the site, the second fusion weights and the third fusion weights being determined based on at least one of the weather conditions and the lighting conditions of the plurality of neighboring sites;
adjusting the at least one of the second fusion weights and the third fusion weights of the plurality of sensors of the site based on the obtained at least one of the second fusion weights and the third fusion weights of the plurality of sensors of the plurality of neighboring sites; and
obtaining adjusted fusion weights for the plurality of sensors of the site based on the first fusion weights of the plurality of sensors of the site and the adjusted at least one of the second fusion weights and the third fusion weights,
wherein performing the fusion of information comprises: performing fusion of information obtained from the plurality of sensors of the site based on the adjusted fusion weights.
20. A computer system, comprising:
one or more processors; and
one or more storage devices storing computer-executable instructions that, when executed, cause the one or more processors to perform operations of the method of one of claims 1 to 19.
21. One or more computer-readable storage media storing computer-executable instructions that, when executed, cause one or more processors to perform the operations of the method of one of claims 1 to 19.
22. An apparatus for tuning sensor fusion weights, comprising:
a plurality of sensors configured to obtain sensor data; and
a processing system configured to execute instructions that cause the processing system to perform the operations of the method of one of claims 1 to 19.
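For concreteness, the outlier-discarding and voting adjustment strategies recited in claims 3 and 4 might look like the sketch below. The two-sigma outlier test and the quantized-voting scheme are assumptions, since the claims do not fix these details.

```python
import statistics

def adjust_by_outlier_filter(own_weight, neighbor_weights, k=2.0):
    """Claim 3 style (assumed test): discard weights farther than
    k standard deviations from the mean, then average the rest."""
    values = [own_weight] + list(neighbor_weights)
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    kept = [v for v in values if sigma == 0 or abs(v - mu) <= k * sigma]
    return statistics.mean(kept)

def adjust_by_voting(own_weight, neighbor_weights, n_bins=10):
    """Claim 4 style (assumed scheme): quantize each weight into a bin
    and return the center of the most frequent bin."""
    values = [own_weight] + list(neighbor_weights)
    bins = [round(v * n_bins) for v in values]
    winner = max(set(bins), key=bins.count)
    return winner / n_bins
```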
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/133093 (WO2022115993A1) | 2020-12-01 | 2020-12-01 | Method and apparatus for tuning sensor fusion weights |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116648600A | 2023-08-25 |
Family
ID=81852901
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080107583.1A (Pending) | Method and apparatus for tuning sensor fusion weights | 2020-12-01 | 2020-12-01 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN116648600A |
DE (1) | DE112020007428T5 |
WO (1) | WO2022115993A1 |
Also Published As
Publication number | Publication date |
---|---|
WO2022115993A1 | 2022-06-09 |
DE112020007428T5 | 2023-06-29 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |