AU2020103979A4 - Multi-sensor cooperative target tracking system - Google Patents

Multi-sensor cooperative target tracking system

Info

Publication number
AU2020103979A4
Authority
AU
Australia
Prior art keywords
target
observation data
sensor
module
trajectory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2020103979A
Inventor
Peizhi Cui
Zhen Lei
Yan Sun
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Academy of Armored Forces of PLA
Original Assignee
Academy of Armored Forces of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Academy of Armored Forces of PLA filed Critical Academy of Armored Forces of PLA
Priority to AU2020103979A priority Critical patent/AU2020103979A4/en
Application granted granted Critical
Publication of AU2020103979A4 publication Critical patent/AU2020103979A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726Multiple target tracking
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G7/00Direction control systems for self-propelled missiles
    • F41G7/20Direction control systems for self-propelled missiles based on continuous observation of target position
    • F41G7/30Command link guidance systems
    • F41G7/301Details
    • F41G7/303Sighting or tracking devices especially provided for simultaneous observation of the target and of the missile
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present disclosure relates to a multi-sensor cooperative target tracking system. The tracking system includes: an observation data obtaining module, configured to obtain observation data from multiple sensors and parse the observation data to determine parsed observation data; a preprocessing module, configured to preprocess the parsed observation data to determine preprocessed observation data; a first determining module, configured to determine, based on a current type of weather, whether multiple sensors of different types work simultaneously; a target trajectory fusion module, configured to: determine a first target trajectory based on preprocessed observation data of each sensor when the multiple sensors of different types work simultaneously, and fuse multiple first target trajectories, to determine a fused target trajectory; and a target trajectory output module, configured to determine a first target trajectory based on preprocessed observation data of a single sensor when one sensor works. According to the present disclosure, all-weather observation ability can be improved.
[FIG. 1: observation data obtaining module 101; preprocessing module 102; first determining module 103; target trajectory fusion module 104; target trajectory output module 105]

Description

[FIG. 1, sheet 1/4: observation data obtaining module 101; preprocessing module 102; first determining module 103; target trajectory fusion module 104; target trajectory output module 105]
MULTI-SENSOR COOPERATIVE TARGET TRACKING SYSTEM
TECHNICAL FIELD
[001] The present disclosure relates to the field of target tracking, and in particular to a multi-sensor cooperative target tracking system.
BACKGROUND
[002] During target tracking, video stream information of a target is detected by a sensor. The data information and image information of the video stream need to be fused separately, and the resulting situation information is displayed on a host computer.
[003] In an existing target tracking system, only one type of sensor is used for target recognition, such as an infrared sensor or a visible light sensor. In operation, the infrared sensor is far superior to radar in target recognition performance, offers excellent concealment, and is less susceptible to interference. However, the infrared sensor has a relatively short operating range and is sensitive to the meteorological environment, such as clouds, rain, or fog, which results in poor all-weather working ability.
[004] Imaging from the visible light sensor has rich color, edge and texture information, which can intuitively show details of a target, has a relatively high resolution, and is easy for human eyes to observe. However, the visible light sensor is not capable of all-weather imaging, is sensitive to light conditions, and is prone to overexposure or underexposure. It can be seen that the existing target tracking system is greatly impacted by environmental factors, and has poor all-weather working ability.
SUMMARY
[005] The present disclosure aims to provide a multi-sensor cooperative target tracking system, to solve a problem of poor all-weather working ability of an existing target tracking system.
[006] To achieve the above purpose, the present disclosure provides the following technical solutions.
[007] A multi-sensor cooperative target tracking system includes: an observation data obtaining module, configured to obtain observation data from multiple sensors and parse the observation data to determine parsed observation data, where the observation data includes a longitude, latitude, and elevation of a target and a type of the target, and types of the sensors include an infrared sensor, a visible light sensor, and a radar; a preprocessing module, configured to preprocess the parsed observation data to determine preprocessed observation data; a first determining module, configured to determine, based on a current type of weather, whether multiple sensors of different types work simultaneously; a target trajectory fusion module, configured to: determine a first target trajectory based on preprocessed observation data of each sensor when the multiple sensors of different types work simultaneously, and fuse multiple first target trajectories, to determine a fused target trajectory; and a target trajectory output module, configured to determine a first target trajectory based on preprocessed observation data of a single sensor when one sensor works.
[008] Preferably, the preprocessing module specifically includes: a time registration unit, configured to perform time registration on the parsed observation data to determine registered observation data; and a filtering unit, configured to filter the registered observation data to determine the preprocessed observation data.
[009] Preferably, the first determining module specifically includes: a single sensor determining unit, configured to determine that there is only one sensor at work when the current type of weather is good weather; and a unit for determining that the multiple sensors work simultaneously, configured to determine, when the current type of weather is bad weather, that the multiple sensors of different types work simultaneously.
[0010] Preferably, the target trajectory fusion module specifically includes: an association unit, configured to determine a trajectory correlation of the multiple first target trajectories by using a data correlation algorithm, and determine multiple first target trajectories of a same target based on the trajectory correlation; and a fusion unit, configured to fuse the multiple first target trajectories of the same target to determine a fused target trajectory.
[0011] Preferably, the fusion unit specifically includes: a fusion subunit, configured to fuse the multiple first target trajectories of the same target based on a covariance convex algorithm to determine the fused target trajectory.
[0012] The embodiments provided in the present disclosure achieve the following technical effects. The present disclosure provides a multi-sensor cooperative target tracking system in which, depending on the type of weather, multiple sensors are made to work simultaneously, achieving all-weather operation of the system and improving all-weather observation ability and tracking accuracy.
BRIEF DESCRIPTION OF DRAWINGS
[0013] In order to illustrate the examples of the present disclosure or the technical solutions of the prior art, the accompanying drawing to be used will be described briefly below. Notably, the following accompanying drawing merely illustrates some examples of the present disclosure, but other accompanying drawings can also be obtained by those of ordinary skill in the art based on the accompanying drawing without any creative efforts.
[0014] FIG. 1 is a structural diagram of a multi-sensor cooperative target tracking system according to the present disclosure;
[0015] FIG. 2 is a flowchart of the single-sensor multi-target tracking algorithm;
[0016] FIG. 3 is a schematic diagram of a distributed structure; and
[0017] FIG. 4 is a tracking flowchart of NN track correlation, KF filtering, and the CC algorithm.
DESCRIPTION OF EMBODIMENTS
[0018] The following clearly and completely describes the technical solutions in the examples of the present disclosure with reference to accompanying drawings in the examples of the present disclosure. Apparently, the described examples are merely a part rather than all of the examples of the present disclosure. All other examples obtained by a person of ordinary skill in the art based on the examples of the present disclosure without creative efforts shall fall within the protection scope of the present disclosure.
[0019] The objective of the present disclosure is to provide a multi-sensor cooperative target tracking system, which can improve all-weather observation ability.
[0020] To make the foregoing objective, features, and advantages of the present disclosure clearer and more comprehensible, the present disclosure is further described in detail below with reference to the accompanying drawings and specific embodiments.
[0021] FIG. 1 is a structural diagram of a multi-sensor cooperative target tracking system according to the present disclosure. As shown in FIG. 1, the multi-sensor cooperative target tracking system includes an observation data obtaining module 101, a preprocessing module 102, a first determining module 103, a target trajectory fusion module 104, and a target trajectory output module 105.
[0022] The observation data obtaining module 101 is configured to obtain observation data from multiple sensors and parse the observation data to determine parsed observation data. The observation data includes a longitude, latitude, and elevation of a target and a type of the target, and types of the sensors include an infrared sensor, a visible light sensor, and a radar.
[0023] The preprocessing module 102 is configured to preprocess the parsed observation data to determine preprocessed observation data.
[0024] The first determining module 103 is configured to determine, based on a current type of weather, whether multiple sensors of different types work simultaneously.
[0025] The target trajectory fusion module 104 is configured to: determine a first target trajectory based on preprocessed observation data of each sensor when the multiple sensors of different types work simultaneously, and fuse multiple first target trajectories, to determine a fused target trajectory.
[0026] The target trajectory output module 105 is configured to determine a first target trajectory based on preprocessed observation data of a single sensor when one sensor works.
[0027] The multi-sensor cooperative target tracking system provided in the present disclosure may be described in the following aspects.
[0028] (1) Data input and data parsing
[0029] Data is input through a network port. Data input to the multi-sensor fusion module on a single platform is indexed in order from platform position information (ground Cartesian coordinates x, y, h) to a sensor number (sensor type, namely infrared, visible light, or radar, plus a sequence number).
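By way of illustration only, a parsed observation record of the kind described above might be represented as follows; this is a minimal Python sketch, and all field names are assumptions introduced here rather than part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One parsed observation from a single sensor (hypothetical layout)."""
    platform_x: float   # platform position, ground Cartesian x (m)
    platform_y: float   # platform position, ground Cartesian y (m)
    platform_h: float   # platform height h (m)
    sensor_type: str    # "infrared", "visible_light", or "radar"
    sensor_seq: int     # sequence number of the sensor
    gps_time: float     # GPS time label of the data frame (s)
    target_lon: float   # target longitude (deg)
    target_lat: float   # target latitude (deg)
    target_elev: float  # target elevation (m)
    target_type: str    # classified type of the target
```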
[0030] (2) Data preprocessing (space-time alignment)
[0031] When performing time registration on data from multiple sensors, the platform with the longer scanning period is usually taken as the time reference for managing the data of each platform. Commonly used methods include the least squares criterion and interpolation/extrapolation.
[0032] According to the present disclosure, the data frames of sensors with different sampling frequencies and start times are aligned to a common GPS time label. Each target position, initially described in a sensor-relative coordinate system, is transformed into the same GPS geodetic coordinate system information B, L, and H (geodetic latitude, longitude, and height).
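As an illustrative sketch of this time-registration step (assuming interpolation is chosen over the least squares criterion, and that each sensor reports timestamped target positions), the following aligns one sensor's samples onto a common GPS time grid; the function and variable names are hypothetical.

```python
import numpy as np

def time_register(track_t, track_xyz, common_t):
    """Interpolate one sensor's track onto a common GPS time grid.

    track_t   : (N,) sample times of this sensor (GPS seconds, increasing)
    track_xyz : (N, 3) target positions at those times
    common_t  : (M,) common time labels (e.g., the grid of the slower sensor)
    Returns (M, 3) positions aligned to common_t.
    """
    return np.column_stack([
        np.interp(common_t, track_t, track_xyz[:, k]) for k in range(3)
    ])

# Example: align a 20 Hz sensor onto the 10 Hz grid of the slower sensor.
t_fast = np.arange(0.0, 2.0, 0.05)
xyz_fast = np.column_stack([3.0 * t_fast, 1.0 * t_fast, np.zeros_like(t_fast)])
t_common = np.arange(0.0, 2.0, 0.1)
aligned = time_register(t_fast, xyz_fast, t_common)
```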
[0033] (3) Processing of redundant information and false alarm targets in the fusion process of data from multiple sensors on a single platform.
[0034] In the fusion process of data from the single platform, the data information of a target acquired by the three types of sensors includes longitude information, latitude information, elevation information, and type information of the target. In the information fusion process, data fusion means the organic combination of redundant and complementary information from different sources, different modes, different media, different times, and different representations, so as to finally obtain a more accurate description of the perceived object. In this process, redundant information is merged and false alarm targets are eliminated.
[0035] For redundant information and false alarm targets, there may be the following situations: I. Due to random interference in the environment and measurement errors of a sensor during detection, the data acquired by the sensor contains redundant information and false alarm targets. II. When multiple sensors work simultaneously, it must be determined whether targets with similar trajectories acquired by the multiple sensors are the same target. In situation I, for redundant information and measurement errors, an appropriate filtering method is selected according to the system, and the true value is estimated and predicted from the observation data. In this process, to determine whether the observation data of a sensor is a false alarm target, a data association algorithm is needed to determine whether the data comes from a target or from environmental interference. If the data comes from a target, it is further necessary to determine which target it comes from. In situation II, after data preprocessing, trajectory information from the multiple sensors for multiple targets is obtained. However, for different sensors, it is impossible to directly determine whether two trajectories describe the same target. For example, for a same target A, sensor 1 returns one trajectory and sensor 2 returns another. At this time, the description label from sensor 1 for target A is x, and the description label from sensor 2 for target A is y.
[0036] Based on a data association (DA) algorithm, a trajectory correlation is determined, and an appropriate relevance threshold is set. If the relevance is greater than the threshold, the trajectories can be identified as the same target. The quantity of targets may then be determined after identification, and all trajectories of each target can be correlated, as sketched below.
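A minimal sketch of this thresholding logic follows; the inverse-mean-distance relevance measure and the threshold value are assumptions introduced here, since the disclosure does not fix a particular DA metric.

```python
import numpy as np

def trajectory_relevance(track_a, track_b):
    """Relevance of two time-registered tracks (higher = more alike).

    track_a, track_b : (M, 3) positions on the same common time grid.
    Uses the inverse of the mean point-wise distance; any monotone
    similarity measure could be substituted.
    """
    d = np.linalg.norm(track_a - track_b, axis=1).mean()
    return 1.0 / (1.0 + d)

def same_target(track_a, track_b, threshold=0.5):
    """Identify two tracks as the same target if relevance > threshold."""
    return trajectory_relevance(track_a, track_b) > threshold
```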
[0037] (4) Situation where a single sensor works.
[0038] When the weather is good or the situation is not complicated, multiple sensors do not need to work together, and only one of the radar, infrared, and visible light sensors works. If the observation data returned by that sensor were the state of a real target, with no missed detections, false detections, or observation noise, the position state of each target at each sampling time could be updated directly to obtain the trajectory of the target. Taking redundant information and false alarm targets into account, and combining the characteristics of the data fusion processing algorithm, the data processing flow is shown in FIG. 2.
[0039] The data preprocessing includes the data parsing and other processes. A nearest neighbor (NN) algorithm is used to remove false alarm targets, classify the targets, determine which target the acquired data comes from, and perform trajectory correlation. A filtering algorithm is mainly used to further process the observation data, estimate state information of the target more accurately based on filtering, and finally obtain estimation information of the target to predict its future trend. In the simulation process, a second-order constant velocity (CV) model is used to describe the motion model of a target.
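The following sketch illustrates this single-sensor processing loop under stated assumptions: a second-order CV motion model, position-only measurements, NN association with a simple Euclidean distance gate (a Mahalanobis gate on the innovation covariance would be more rigorous), and a standard Kalman filter (KF). All names are hypothetical.

```python
import numpy as np

def cv_matrices(dt):
    """Second-order constant-velocity model, state [x, vx, y, vy]."""
    F = np.array([[1, dt, 0, 0],
                  [0, 1,  0, 0],
                  [0, 0,  1, dt],
                  [0, 0,  0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 0, 1, 0]], dtype=float)   # position-only measurements
    return F, H

def kf_step(x, P, z, F, H, Q, R):
    """One Kalman predict/update cycle for a single track."""
    x = F @ x                        # predict state
    P = F @ P @ F.T + Q              # predict covariance
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)          # update with measurement z
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def nn_associate(x_pred, H, measurements, gate=9.0):
    """Nearest-neighbour association: pick the measurement closest to the
    predicted position; reject all (treat as false alarm) if outside the gate."""
    z_pred = H @ x_pred
    d2 = [float((z - z_pred) @ (z - z_pred)) for z in measurements]
    i = int(np.argmin(d2))
    return i if d2[i] < gate else None
```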
[0040] (5) Situation where multiple sensors work simultaneously.
[0041] Information fusion systems may be classified into three categories based on their hardware architectures: centralized, distributed, and hybrid. In the centralized structure, original measurement data from each sensor node is transmitted to a fusion center, and the fusion center completes the processing of data from each node, such as time alignment, coordinate conversion, data association, and track fusion. The tracking result of the system depends entirely on the fusion center: once the fusion center fails, the entire system may completely collapse. Therefore, this structure is not commonly used in engineering, and a fusion result based on this structure is often used as a reference for performance evaluation of distributed and hybrid fusion algorithms.
[0042] As shown in FIG. 3, in the distributed fusion structure, each sensor uses its own measurement data to track a target separately and form a local estimation, then sends the estimation result to the fusion center, and the fusion center fuses the local estimations from the sensors into a joint estimation of the target. The distributed fusion structure has not only local independent tracking ability but also global monitoring ability. When the performance of a sensor decreases or a sensor malfunctions, the observation result of that sensor has little impact on the performance and estimation result of the entire information fusion system, which effectively improves the survivability of the system. Compared with the centralized structure, the distributed structure places lower requirements on the bandwidth of the data bus and the data processing capacity of the fusion center, and offers faster calculation, better reliability, and better scalability. Therefore, the present disclosure intends to use the distributed structure to fuse multi-sensor data information.
[0043] In the present disclosure, when the weather is bad or the situation is complicated, multiple sensors need to work simultaneously and their data is fused to improve accuracy. In this case, the distributed fusion structure is used to process the data from the multiple sensors, realizing multi-target tracking with multi-sensor fusion on a single platform.
[0044] When multiple sensors on a single platform work simultaneously, the distributed fusion structure is used to process the data from the multiple sensors. First, the data from each sensor is processed following the flow used when a single sensor works alone, separately yielding trajectory information for multiple targets. Then, the trajectory information obtained by each sensor is transmitted to the fusion center, giving trajectory information from the multiple sensors for the multiple targets. However, for different sensors, it is impossible to directly determine whether two trajectories describe the same target. Therefore, a data association algorithm is required to process the trajectories and determine a trajectory correlation. An appropriate relevance threshold is set; if the relevance is greater than the threshold, the trajectories are identified as the same target, and the quantity of targets may then be determined. After trajectory information from different sensors is correlated and recognized as the same target trajectory, the trajectory information from the multiple sensors is fused based on a fusion algorithm, and finally the target trajectory information fused by the multiple sensors on the single platform is output, as summarized in the skeleton below.
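The distributed flow described above can be summarized by the following skeleton; the callables stand for the local tracking, association, and fusion steps sketched elsewhere in this description, and the API is hypothetical.

```python
def distributed_fusion(sensor_tracks, associate, fuse):
    """Fusion-centre pass over local track estimates (hypothetical API).

    sensor_tracks : list of per-sensor track lists; each track is an
                    (x, P) local estimate produced by that sensor's own KF.
    associate     : callable grouping tracks that describe the same target
    fuse          : callable fusing one group into a joint estimate
    Returns the fused trajectory estimates, one per physical target.
    """
    groups = associate(sensor_tracks)          # data association step
    return [fuse(group) for group in groups]   # track-to-track fusion
```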
[0045] A K-means algorithm is used to correlate the trajectory information to determine whether two trajectories describe the same target. For the selection of the quantity of clusters, the Calinski-Harabasz criterion is used.
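A sketch of this clustering step using scikit-learn follows; treating each time-registered track as one flattened feature vector is an assumption about the feature representation, not a detail fixed by the disclosure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import calinski_harabasz_score

def cluster_tracks(tracks, k_max=6):
    """Group time-registered tracks by target with K-means; the number of
    clusters is chosen by the Calinski-Harabasz criterion.

    tracks : (n_tracks, M, 3) array, all on the same common time grid.
    Returns cluster labels (tracks sharing a label = same target).
    """
    X = tracks.reshape(len(tracks), -1)        # one feature vector per track
    best_labels, best_score = None, -np.inf
    for k in range(2, min(k_max, len(tracks) - 1) + 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        score = calinski_harabasz_score(X, labels)
        if score > best_score:
            best_labels, best_score = labels, score
    return best_labels
```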
[0046] For trajectories determined by the data association algorithm to describe the same target, a covariance convex (CC) track fusion algorithm is used to fuse the trajectory information of that target from the multiple sensors.
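In its simple convex-combination form (neglecting cross-covariances between sensors, an assumption made for this sketch), the covariance convex fusion of local estimates (x_i, P_i) might look as follows.

```python
import numpy as np

def cc_fuse(estimates):
    """Covariance convex (CC) combination of local track estimates.

    estimates : list of (x, P) pairs for the same target from different
                sensors. Computes P_f = (sum_i P_i^-1)^-1 and
                x_f = P_f @ sum_i P_i^-1 @ x_i, i.e. an inverse-covariance
                weighted average; cross-covariances are neglected.
    """
    P_inv = sum(np.linalg.inv(P) for _, P in estimates)
    P_f = np.linalg.inv(P_inv)
    x_f = P_f @ sum(np.linalg.inv(P) @ x for x, P in estimates)
    return x_f, P_f
```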
[0047] (6) An example of NN track correlation, KF filtering, and the CC algorithm.
[0048] Taking the observation of two moving targets by two sensors as an example, both targets adopt the CV model. The specific algorithm flow is shown in the tracking flowchart of FIG. 4.
[0049] (7) Choice of filtering algorithm
[0050] In the present disclosure, the filtering step has relatively high real-time requirements and must handle non-linear, non-Gaussian conditions; the present disclosure therefore intends to use a particle flow filter (PFF) algorithm rather than a traditional particle filter (PF). Compared with a traditional PF, the PFF is many orders of magnitude faster, and for difficult high-order filtering problems it is several orders of magnitude more accurate than an extended Kalman filter (EKF). In addition, the PFF does not suffer from the particle degeneracy problem of the PF.
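Since the disclosure does not specify which PFF variant is intended, the following is only a sketch of one published form, the exact flow of Daum and Huang for a linear Gaussian measurement model, Euler-integrated over pseudo-time; all names are hypothetical.

```python
import numpy as np

def pff_update(particles, x_bar, P, H, R, z, n_steps=20):
    """Particle flow measurement update: exact-flow form for a linear
    Gaussian measurement z = H x + v, v ~ N(0, R) (after Daum and Huang).

    particles : (N, d) samples drawn from the prior
    x_bar, P  : prior (predicted) mean and covariance
    Particles migrate from prior to posterior by integrating
    dx/dlam = A(lam) x + b(lam) over pseudo-time lam in [0, 1]; no
    resampling is involved, which is why the flow avoids the particle
    degeneracy of a conventional PF.
    """
    d = len(x_bar)
    dlam = 1.0 / n_steps
    for step in range(n_steps):
        lam = (step + 0.5) * dlam   # midpoint of this pseudo-time step
        S = lam * H @ P @ H.T + R
        A = -0.5 * P @ H.T @ np.linalg.solve(S, H)
        b = (np.eye(d) + 2.0 * lam * A) @ (
            (np.eye(d) + lam * A) @ P @ H.T @ np.linalg.solve(R, z)
            + A @ x_bar)
        particles = particles + dlam * (particles @ A.T + b)  # Euler step
    return particles
```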
[0051] Each embodiment of the present specification is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may refer to each other.
[0052] In this specification, several examples are used to illustrate the principles and implementations of the present disclosure. The description of the foregoing embodiments is intended to help illustrate the method of the present disclosure and its core principles. In addition, those of ordinary skill in the art can make various modifications in terms of specific implementations and scope of application in accordance with the teachings of the present disclosure. In conclusion, the content of the present specification shall not be construed as a limitation on the present disclosure.

Claims (5)

The claims defining the invention are as follows:
1. A multi-sensor cooperative target tracking system, comprising: an observation data obtaining module, configured to obtain observation data from multiple sensors and parse the observation data to determine parsed observation data, wherein the observation data comprises a longitude, latitude, and elevation of a target and a type of the target, and types of the sensors comprise an infrared sensor, a visible light sensor, and a radar; a preprocessing module, configured to preprocess the parsed observation data to determine preprocessed observation data; a first determining module, configured to determine, based on a current type of weather, whether multiple sensors of different types work simultaneously; a target trajectory fusion module, configured to: determine a first target trajectory based on preprocessed observation data of each sensor when the multiple sensors of different types work simultaneously, and fuse multiple first target trajectories, to determine a fused target trajectory; and a target trajectory output module, configured to determine a first target trajectory based on preprocessed observation data of a single sensor when one sensor works.
2. The multi-sensor cooperative target tracking system according to claim 1, wherein the preprocessing module specifically comprises: a time registration unit, configured to perform time registration on the parsed observation data to determine registered observation data; and a filtering unit, configured to filter the registered observation data to determine the preprocessed observation data.
3. The multi-sensor cooperative target tracking system according to claim 1, wherein the first determining module specifically comprises: a single sensor determining unit, configured to determine that there is only one sensor at work when the current type of weather is good weather; and a unit for determining that the multiple sensors work simultaneously, configured to determine, when the current type of weather is bad weather, that the multiple sensors of different types work simultaneously.
4. The multi-sensor cooperative target tracking system according to claim 1, wherein the target trajectory fusion module specifically comprises: an association unit, configured to determine a trajectory correlation of the multiple first target trajectories by using a data correlation algorithm, and determine multiple first target trajectories of a same target based on the trajectory correlation; and a fusion unit, configured to fuse the multiple first target trajectories of the same target to determine a fused target trajectory.
5. The multi-sensor cooperative target tracking system according to claim 4, wherein the fusion unit specifically comprises: a fusion subunit, configured to fuse the multiple first target trajectories of the same target based on a covariance convex algorithm to determine the fused target trajectory.
AU2020103979A 2020-12-09 2020-12-09 Multi-sensor cooperative target tracking system Ceased AU2020103979A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020103979A AU2020103979A4 (en) 2020-12-09 2020-12-09 Multi-sensor cooperative target tracking system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2020103979A AU2020103979A4 (en) 2020-12-09 2020-12-09 Multi-sensor cooperative target tracking system

Publications (1)

Publication Number Publication Date
AU2020103979A4 (en) 2021-02-18

Family

ID=74591547

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020103979A Ceased AU2020103979A4 (en) 2020-12-09 2020-12-09 Multi-sensor cooperative target tracking system

Country Status (1)

Country Link
AU (1) AU2020103979A4 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113552551A (en) * 2021-07-23 2021-10-26 中国人民解放军海军航空大学 Direct correlation method for distributed 2D sensor network tracks
CN113552551B (en) * 2021-07-23 2023-08-15 中国人民解放军海军航空大学 Distributed 2D sensor network track direct correlation method
CN115861366A (en) * 2022-11-07 2023-03-28 成都融达昌腾信息技术有限公司 Multi-source perception information fusion method and system for target detection
CN115861366B (en) * 2022-11-07 2024-05-24 成都融达昌腾信息技术有限公司 Multi-source perception information fusion method and system for target detection
CN116150299A (en) * 2023-04-21 2023-05-23 北京航空航天大学 Multi-source track association method guided by credibility propagation network

Similar Documents

Publication Publication Date Title
AU2020103979A4 (en) Multi-sensor cooperative target tracking system
CN109212521B (en) Target tracking method based on fusion of forward-looking camera and millimeter wave radar
CN113671480A (en) Radar and video fusion traffic target tracking method, system, equipment and terminal
CN113487759B (en) Parking patrol method and device, mobile patrol equipment and patrol system
CN112991391A (en) Vehicle detection and tracking method based on radar signal and vision fusion
CN111391823A (en) Multilayer map making method for automatic parking scene
EP3910533B1 (en) Method, apparatus, electronic device, and storage medium for monitoring an image acquisition device
CN111937036A (en) Method, apparatus, and computer-readable storage medium having instructions for processing sensor data
CN111611901A (en) Vehicle reverse running detection method, device, equipment and storage medium
KR101678004B1 (en) node-link based camera network monitoring system and method of monitoring the same
CN115965655A (en) Traffic target tracking method based on radar-vision integration
CN112684430A (en) Indoor old person walking health detection method and system, storage medium and terminal
Gies et al. Environment perception framework fusing multi-object tracking, dynamic occupancy grid maps and digital maps
CN115034324B (en) Multi-sensor fusion perception efficiency enhancement method
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
WO2019100354A1 (en) State sensing method and related apparatus
CN118334915B (en) Unmanned aerial vehicle route conflict optimal scheduling system and method
CN118033622A (en) Target tracking method, device, equipment and computer readable storage medium
CN116863382A (en) Expressway multi-target tracking method based on radar fusion
Rameshbabu et al. Target tracking system using kalman filter
Rieken et al. Sensor scan timing compensation in environment models for automated road vehicles
Mamchenko et al. Algorithm for sensor data merging using analytical module for priority sensor selection
Domhof et al. Multi-sensor object tracking performance limits by the cramer-rao lower bound
Huang et al. Radar-camera fusion for ground-based perception of small uav in urban air mobility
Becker et al. Identification of vehicle tracks and association to wireless endpoints by multiple sensor modalities

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)