CN117269951A - Target tracking method for air-ground multi-view information enhancement - Google Patents

Target tracking method for air-ground multi-view information enhancement

Info

Publication number
CN117269951A
CN117269951A (application CN202311051377.5A)
Authority
CN
China
Prior art keywords
tracking
waveform data
daytime
radar
shielding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311051377.5A
Other languages
Chinese (zh)
Other versions
CN117269951B (en)
Inventor
陈宇
仇梓峰
靳锴
朱良彬
王雅涵
白慧慧
杨健
陈韬亦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jiaotong University
CETC 54 Research Institute
Original Assignee
Beijing Jiaotong University
CETC 54 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jiaotong University, CETC 54 Research Institute filed Critical Beijing Jiaotong University
Priority to CN202311051377.5A priority Critical patent/CN117269951B/en
Publication of CN117269951A publication Critical patent/CN117269951A/en
Application granted granted Critical
Publication of CN117269951B publication Critical patent/CN117269951B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415Identification of targets based on measurements of movement associated with the target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a target tracking method with air-ground multi-view information enhancement. The method comprises: collecting unmanned aerial vehicle monitoring data Gn; collecting unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times; inputting the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times into a target object tracking condition cluster matching recognition model and judging whether the problems of different scales and shielding of the target object tracking condition occur; recording the scale, shielding time and place when such problems occur; obtaining target tracking information from the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times; and processing the multi-source heterogeneous, low-quality information through a target object tracking sparse Bayesian characteristic model, with real-time monitoring by the monitoring radar combined with common monitoring data. The invention takes the duration of the different-scale and shielding problems as one of the inputs of the neural network to obtain the overall multi-source heterogeneous, low-quality information, and combines the local and the overall to process the multi-source heterogeneous, low-quality information more accurately.

Description

Target tracking method for air-ground multi-view information enhancement
Technical Field
The invention relates to the field of target tracking, in particular to a target tracking method for enhancing air-ground multi-view information.
Background
Radar target tracking is one of the important directions of radar technology application. By transmitting radio wave signals and receiving the reflected signals, motion state information such as the distance from the target to the electromagnetic wave transmitting point, its rate of change (radial velocity), azimuth and altitude is calculated, so that the accurate position of the target is obtained.
Strong tracking filtering is a common method in radar target tracking and can achieve a high-precision tracking effect. Its basic idea is to correct the prediction error covariance matrix in real time by means of an adaptive suboptimal fading factor, that is, to directly correct the estimate of the target motion state, without considering the updating and estimation of the model parameters. However, during radar target tracking, the acquired motion state data are of low quality and shielding information is not processed, so that researchers cannot analyse and adjust the model parameters adequately, which increases the difficulty of tracking the target motion state.
Disclosure of Invention
The object of the present invention is to provide a target tracking method with enhanced air-ground multi-view information, which is used for solving the above problems in the prior art.
In a first aspect, an embodiment of the invention provides a target tracking method for air-ground multi-view information enhancement, which comprises the following steps:
Collecting unmanned aerial vehicle monitoring data Gn; the unmanned aerial vehicle monitoring data Gn are target position data obtained by measuring preset unit time and place;
collecting radar reflected waveform data Dz of unmanned aerial vehicles at different times; the unmanned aerial vehicle monitoring radar reflection waveform data Dz is radar reflection waveform data of a target object reflection echo received by the monitoring radar;
inputting the radar reflection waveform data Dz of the unmanned aerial vehicle monitoring at different times into a target object tracking condition clustering matching recognition model, and judging whether the problems of different scales and shielding of the target object tracking condition occur or not;
when the problems of different scales and shielding of the target object tracking condition occur, marking the scale and shielding information, and recording the scale, the shielding time and the place;
obtaining target tracking information by using the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times; the target tracking information comprises a target tracking radius Dz_u and a radar reflection area Sx;
and processing multi-source heterogeneous and low-quality information by utilizing the unmanned aerial vehicle monitoring data Gn, the target object tracking information, the scale, the shielding time and the place and by utilizing a target object tracking sparse Bayesian characteristic model.
The cluster matching recognition model has the expression:
where F_x represents the scale error of the tracked target object, M_j represents the standard scale of the tracked target object, R represents the Euclidean distance of the scale, n represents the number of iterations, N represents the number of cluster centers, η represents the convergence coefficient of the scale, and β(E_n) represents the dynamic distribution function of the tracked target object; the radar reflection area error of the tracked target object is defined analogously, where B_y represents the standard radar reflection area of the tracked target object, Q represents the Euclidean distance of the radar reflection area, λ represents the convergence coefficient of the radar reflection area, L(t) represents the error accumulation change function of the radar reflection area, and U represents the tensor product;
judging whether the target object tracking condition has different scales and shielding problems or not, wherein the expression is as follows:
wherein W represents a scale change threshold value, and H represents a radar reflection area threshold value;
the sparse Bayesian characteristic model has the expression:
where Q_vcx represents the screened multi-source heterogeneous, low-quality information set, u represents a distance change factor, P represents the distance of the tracked target object, F_ry represents a dynamically changing value between the occlusion locations, a further term represents the time affecting the scale change, i represents the number of iterations, I represents the total number of sparse Bayesian nodes, Lc represents the change amount of the occlusion area, E represents the area occlusion factor, σ represents the scale change factor, and G_ry represents the scale change interval of the tracked target object.
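The formulas themselves appear as images in the original filing and are not reproduced here. As a rough, non-limiting sketch of the decision quantities defined above, the following Python code forms a clustered scale error and a radar reflection area error and tests them against the thresholds W and H; the function names, the simple k-means step and the error definitions are assumptions made for illustration rather than the patented expressions.

```python
import numpy as np

def cluster_matching_errors(scales, areas, standard_scale, standard_area,
                            n_clusters=3, eta=0.9, lam=0.9, n_iters=10):
    """Illustrative stand-in for the cluster matching recognition model.

    scales, areas: 1-D sequences of tracked-target scale and radar
    reflection area measurements taken from the waveform data Dz at
    different times.  Returns (F_x, area_error).
    """
    scales = np.asarray(scales, dtype=float)
    areas = np.asarray(areas, dtype=float)

    # Simple 1-D k-means giving N cluster centers for the scale values.
    centers = np.linspace(scales.min(), scales.max(), n_clusters)
    for _ in range(n_iters):
        labels = np.argmin(np.abs(scales[:, None] - centers[None, :]), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = scales[labels == k].mean()

    # Scale error F_x: Euclidean distance of the cluster centers to the
    # standard scale M_j, damped by the convergence coefficient eta (assumption).
    F_x = eta * np.linalg.norm(centers - standard_scale) / n_clusters

    # Reflection-area error: Euclidean distance to the standard area B_y,
    # damped by the convergence coefficient lambda (assumption).
    area_error = lam * np.linalg.norm(areas - standard_area) / len(areas)
    return F_x, area_error

def has_scale_or_shielding_problem(F_x, area_error, W, H):
    """Judgement step: a different-scale / shielding problem is flagged when
    either error exceeds its threshold (W: scale change threshold,
    H: radar reflection area threshold)."""
    return F_x > W or area_error > H
```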
Optionally, the inputting of the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times into the target object tracking condition cluster matching recognition model and the judging of whether the problems of different scales and shielding of the target object tracking condition occur include:
acquiring daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb; the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb are radar reflection waveform data selected from the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times;
collecting night unmanned aerial vehicle monitoring radar reflection waveform data Dzy; the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy are the radar reflection waveform data, among the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times, whose reflection area changes most with respect to the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and whose monitoring node lies after the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb;
respectively inputting the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy into a target object tracking condition clustering matching recognition model to obtain daytime target object tracking radar reflectivity and night target object tracking radar reflectivity; the daytime target tracking radar reflectivity corresponds to the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb; the night target tracking radar reflectivity corresponds to night unmanned aerial vehicle monitoring radar reflection waveform data Dzy;
And comparing the daytime target tracking radar reflectivity with the night target tracking radar reflectivity by utilizing the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, and judging whether the target tracking condition is different in scale and the shielding problem occurs.
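A minimal sketch of the day/night frame selection described above is given below; the array layout and function name are assumptions, and the selection rule simply picks, among the monitoring nodes after the daytime frame Dzb, the frame whose reflection area differs most from that of Dzb.

```python
import numpy as np

def select_day_night_frames(frames, areas, day_index):
    """Pick the day/night frame pair described above.

    frames:    list of radar reflection waveform arrays Dz at different times.
    areas:     per-frame radar reflection areas (same length as frames).
    day_index: index of the daytime frame Dzb.
    Returns (Dzb, Dzy): Dzy is the frame after Dzb whose reflection area
    differs most from that of Dzb.
    """
    areas = np.asarray(areas, dtype=float)
    later = np.arange(day_index + 1, len(frames))
    if later.size == 0:
        raise ValueError("no monitoring node after the daytime frame")
    night_index = later[np.argmax(np.abs(areas[later] - areas[day_index]))]
    return frames[day_index], frames[night_index]
```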
Optionally, the comparing of the daytime target object tracking radar reflectivity and the night target object tracking radar reflectivity by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, and the judging of whether the problems of different scales and shielding of the target object tracking condition occur, include:
the daytime shielding area is obtained by utilizing the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity; the daytime occlusion area represents the position occluded in the daytime unmanned monitoring radar reflection waveform data Dzb;
the night shielding area is obtained by utilizing the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity; the night occlusion area represents the position occluded in night drone monitoring radar reflection waveform data Dzy;
obtaining an echo reflection variance by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime shielding area and the night shielding area; the echo reflection variance represents the change condition of the reflected target object in the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy;
when the echo reflection variance exceeds a target object tracking preset change interval, the problems of different scales and shielding of the target object tracking condition occur;
when the echo reflection variance is within a preset change interval of target object tracking, the problems of non-uniform scale and shielding of the target object tracking condition are not caused.
Optionally, the obtaining of the daytime shielding area by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target object tracking radar reflectivity and the night target object tracking radar reflectivity includes:
collecting background noise radar reflection waveform data; the background noise radar reflection waveform data represents noise radar reflection waveform data when the monitoring radar is not shielded;
the radar reflection waveform data Dzb monitored by the unmanned aerial vehicle in the daytime is subjected to noise filtering to obtain radar reflection waveform data of tracking noise of a target object in the daytime;
Subtracting noise in the background noise radar reflection waveform data from noise in the daytime target tracking noise radar reflection waveform data to obtain daytime target tracking noise difference radar reflection waveform data;
calibrating a value smaller than a noise threshold value in the daytime target tracking noise difference radar reflection waveform data as a reference value to obtain daytime noise shielding radar reflection waveform data;
determining whether a shielding area exists in the daytime noise shielding radar reflection waveform data by utilizing the daytime noise shielding radar reflection waveform data, the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity;
when the shielding area exists in the daytime noise shielding radar reflection waveform data, the area exceeding the reference value in the daytime noise shielding radar reflection waveform data is taken as the daytime shielding area.
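The following sketch illustrates one possible reading of the daytime shielding-area extraction above: filter the daytime waveform data, subtract the background noise, calibrate values below the noise threshold to the reference value, and keep the region exceeding the reference value. The Gaussian smoothing used as the noise filter and the array representation are assumptions, not elements of the claim.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def daytime_shielding_area(dzb, background_noise, noise_threshold,
                           reference_value=0.0, sigma=1.0):
    """Illustrative version of the daytime shielding-area extraction.

    dzb:              daytime monitoring radar reflection waveform data (2-D array).
    background_noise: noise waveform data recorded with no shielding present.
    Returns a boolean mask of the daytime shielding area.
    """
    # Noise filtering of the daytime data; the residual after Gaussian
    # smoothing is treated as the daytime target tracking noise (assumption).
    day_noise = dzb - gaussian_filter(dzb, sigma=sigma)

    # Daytime target tracking noise-difference data: daytime noise minus
    # the background noise.
    noise_diff = day_noise - background_noise

    # Calibrate values below the noise threshold to the reference value.
    masked = np.where(np.abs(noise_diff) < noise_threshold,
                      reference_value, np.abs(noise_diff))

    # The daytime shielding area is the region exceeding the reference value.
    return masked > reference_value
```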
Optionally, the determining of whether a shielding area exists in the daytime noise shielding radar reflection waveform data by using the daytime noise shielding radar reflection waveform data, the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target object tracking radar reflectivity and the night target object tracking radar reflectivity comprises the following steps:
Collecting the tracking characteristic area of a target object in the daytime; the daytime target tracking characteristic area is a position corresponding to the boundary of daytime noise shielding radar reflection waveform data in the daytime target tracking radar reflectivity;
collecting different reflection characteristics of the same target object tracking; the target object tracking different reflection characteristics represent the reflection characteristics at the position of the daytime target object tracking characteristic area in the daytime target object tracking radar reflectivity;
collecting different reflection characteristics tracked by a target object in the daytime; the tracking of different reflection characteristics of the target object in the daytime is that the same target object tracks one of the different reflection characteristics of the target object;
collecting different reflection characteristics of the interference target object tracking; the interference target object tracking different reflection characteristics represent tracking different reflection characteristics of the target object in different monitoring points of the central environment by taking the tracking different reflection characteristics of the target object in the daytime as the central environment;
taking the different reflection characteristics tracked by the target object in the daytime and the different reflection characteristics tracked by the interference target object as inputs, and calculating the output of the convolutional neural network to obtain the fusion reflection characteristics;
and determining whether a shielding area exists in the daytime noise shielding radar reflection waveform data by using the night target object tracking radar reflectivity, the daytime target object tracking different reflection characteristics and the fusion reflection characteristics.
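Because the convolutional network that produces the fusion reflection characteristic is not specified in detail, the sketch below uses a deliberately small architecture whose channel counts and kernel sizes are assumptions; it only illustrates the idea of fusing the daytime tracking reflection characteristic with the interference tracking reflection characteristic.

```python
import torch
import torch.nn as nn

class ReflectionFeatureFusion(nn.Module):
    """Toy convolutional fusion of the daytime target object tracking
    reflection characteristic with the interference tracking reflection
    characteristic.  The architecture is an assumption made for
    illustration only."""

    def __init__(self, in_channels=2, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, 1, kernel_size=3, padding=1),
        )

    def forward(self, day_feature, interference_feature):
        # Stack the two feature maps along the channel axis and fuse them.
        x = torch.cat([day_feature, interference_feature], dim=1)
        return self.net(x)

# Usage sketch with hypothetical 32x32 feature maps (batch of 1).
fusion = ReflectionFeatureFusion()
day = torch.randn(1, 1, 32, 32)
interference = torch.randn(1, 1, 32, 32)
fused_reflection_feature = fusion(day, interference)
```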
Optionally, the determining whether the shielding area exists in the noise shielding radar reflection waveform data in the daytime by utilizing the reflectivity of the target object tracking radar in the night, the tracking of different reflection characteristics and the fusion reflection characteristics of the target object in the daytime includes:
collecting different reflection characteristics of night target object tracking; the night target object tracking different reflection characteristics are the reflection characteristics, in the night target object tracking radar reflectivity, at the positions corresponding to the daytime target object tracking different reflection characteristics;
collecting and monitoring different reflection characteristics of the interference target object tracking; the monitoring of the different reflection characteristics of the interference target object tracking is that the target object tracking at night takes the different reflection characteristics as the center and does not belong to the influence factors of the shielding area;
subtracting different reflection characteristics of tracking the target object in the daytime from different reflection characteristics of tracking the monitoring interference target object to obtain an environmental difference influence factor; monitoring interference targets to track different reflection characteristics to correspondingly obtain different environmental difference influence factors; an environmental difference influence factor corresponds to a monitoring interference target object to track different reflection characteristics; each environmental difference influencing factor has a different influencing factor value; solving the error of each influence factor value in the environment difference influence factors aiming at each environment difference influence factor, collecting error values, and correspondingly collecting different error values by different influence factor values; summing the different error values to obtain a summed error value; calculating an arithmetic covariance of the sum error value, and taking the arithmetic covariance as an environment standard deviation; each environmental difference influence factor corresponds to one environmental standard deviation, and different environmental difference influence factors correspond to different environmental standard deviations;
When the environmental standard deviation exceeds the fluctuation range of the environmental difference, the noise shielding radar reflection waveform data in the daytime is considered to have shielding area.
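A loose rendering of the environmental-difference test above is sketched below; the standard deviation of the per-factor error sums is used in place of the "arithmetic covariance" named in the text, which is an assumption.

```python
import numpy as np

def shielding_by_environment_difference(day_feature, interference_features,
                                        fluctuation_range):
    """Loose rendering of the environmental-difference test.

    day_feature:           1-D array, daytime target object tracking
                           different reflection characteristic.
    interference_features: list of 1-D arrays, monitored interference
                           target object tracking different reflection
                           characteristics.
    Returns True when a shielding area is judged to exist in the daytime
    noise shielding radar reflection waveform data.
    """
    error_sums = []
    for feat in interference_features:
        # Environmental difference influence factor for this feature.
        influence = feat - day_feature
        # Error of each influence-factor value (deviation from its mean),
        # summed into a single error value for this factor.
        error_sums.append(np.sum(np.abs(influence - influence.mean())))
    # Spread of the summed errors, used as the environmental standard
    # deviation (stand-in for the "arithmetic covariance" in the text).
    environment_std = float(np.std(error_sums))
    return environment_std > fluctuation_range
```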
Optionally, the obtaining of the echo reflection variance by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime shielding area and the night shielding area includes:
fusing the daytime shielding area and the night shielding area to obtain a fused shielding area; the fusion shielding area represents an area containing a daytime target tracking area and a night target tracking area;
calibrating a scattering value outside the fused shielding area in the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb as a reference value to obtain daytime background object tracking radar reflection waveform data;
calibrating a scattering value outside the fusion shielding area in the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy to be a reference value to obtain night background object tracking radar reflection waveform data;
converting the daytime background object tracking radar reflection waveform data into video signals to obtain daytime video background object tracking radar reflection waveform data;
converting the night background object tracking radar reflection waveform data into video signals to obtain night video background object tracking radar reflection waveform data;
Collecting radio frequency difference radar reflection waveform data; the radio frequency difference radar reflection waveform data are radar reflection waveform data formed by absolute values of tracking variances of different background targets; the background object tracking variance is obtained by subtracting the night video background object tracking radar reflection waveform data from the daytime video background object tracking radar reflection waveform data median value;
converting the radio frequency difference radar reflected waveform data into scattering, and filtering noise to obtain noise difference radar reflected waveform data;
and classifying noise values in the noise difference radar reflection waveform data to obtain echo reflection variances.
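The following sketch condenses the echo-reflection-variance computation above: fuse the two shielding areas, calibrate scatter outside the fused area to the reference value, difference the two frames, filter the noise and take the variance. The ScanStreamer scatter/video conversions are skipped and the arrays are differenced directly, which is an assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def echo_reflection_variance(dzb, dzy, day_mask, night_mask,
                             reference_value=0.0, sigma=1.0):
    """Sketch of the echo-reflection-variance computation; the video-signal
    conversion performed with ScanStreamer in the embodiment is omitted."""
    fused_mask = day_mask | night_mask           # fused shielding area

    # Calibrate scatter outside the fused shielding area to the reference value.
    day_bg = np.where(fused_mask, dzb, reference_value)
    night_bg = np.where(fused_mask, dzy, reference_value)

    # Radio-frequency-difference data: absolute value of the background
    # object tracking difference.
    rf_diff = np.abs(day_bg - night_bg)

    # Noise filtering of the difference data (Gaussian smoothing assumed).
    noise_diff = rf_diff - gaussian_filter(rf_diff, sigma=sigma)

    # "Classifying" the noise values is reduced here to taking their
    # variance inside the fused shielding area.
    return float(np.var(noise_diff[fused_mask]))

def tracking_problem_detected(variance, preset_interval=10.0):
    """Decision from the earlier optional block: a different-scale /
    shielding problem is flagged when the variance leaves the preset
    change interval (10 cm^2 in the embodiment)."""
    return variance > preset_interval
```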
Optionally, the processing of the multi-source heterogeneous, low-quality information through the target object tracking sparse Bayesian characteristic model by using the unmanned aerial vehicle monitoring data Gn, the target object tracking information, and the scale, shielding time and place includes:
inputting the unmanned aerial vehicle monitoring data Gn and the target tracking information into a daytime target tracking sparse Bayesian characteristic model to obtain a daytime processing result; different daytime processing results are correspondingly acquired by different target object tracking information;
when the target object tracking information is marked with the problem of different scales and shielding, storing the daytime scale, shielding time and place at the cloud end, moving the target object tracking information forward, and repeating the judgment of the marked different-scale and shielding problem until the target object tracking information is marked with the problem of different scales and shielding again;
when the target object tracking information is marked with the problem of different scales and shielding again, storing the night scale, shielding time and place at the cloud end, inputting the unmanned aerial vehicle monitoring data Gn and the different target object tracking information into the target object tracking sparse Bayesian characteristic model, and obtaining different night processing results;
inputting the unmanned aerial vehicle monitoring data Gn, target tracking information, daytime scale, shielding time and place, night scale, shielding time and place into a night target tracking sparse Bayesian characteristic model to obtain overall multi-source heterogeneous and low-quality information;
and performing extended Kalman filtering on the local multi-source heterogeneous, low-quality information and the overall multi-source heterogeneous, low-quality information.
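The extended Kalman filtering applied to the local and overall information is not detailed in this section; the sketch below shows a generic extended Kalman filter predict/update cycle with a placeholder constant-velocity motion model and position measurements, all of which are assumptions for illustration.

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of a generic extended Kalman filter.
    The motion model f and measurement model h are placeholders; the patent
    does not specify them in this section."""
    # Predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update
    H = H_jac(x_pred)
    y = z - h(x_pred)                       # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: constant-velocity state [px, py, vx, vy] with position measurements.
dt = 1.0
A = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
f = lambda x: A @ x
F_jac = lambda x: A
h = lambda x: x[:2]
H_jac = lambda x: np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)
R = 0.5 * np.eye(2)
x, P = np.zeros(4), np.eye(4)
x, P = ekf_step(x, P, z=np.array([1.0, 2.0]), f=f, F_jac=F_jac,
                h=h, H_jac=H_jac, Q=Q, R=R)
```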
Optionally, the inputting of the unmanned aerial vehicle monitoring data Gn, the target object tracking information, the daytime scale, shielding time and place, and the night scale, shielding time and place into the night target object tracking sparse Bayesian characteristic model to obtain the overall multi-source heterogeneous, low-quality information includes:
subtracting the daytime scale, the shielding time and the place from the night scale, the shielding time and the place to obtain monitoring time; the monitoring time is the time between two scales, the shielding time and the place point;
adding the different target object tracking radii Dz_u to obtain a total target object tracking radius Dk_u;
adding the different radar reflection areas Sx to obtain a total radar reflection area Sx_j;
and inputting the unmanned aerial vehicle monitoring data Gn, the total target object tracking radius Dk_u, the total radar reflection area Sx_j and the monitoring time into the target object tracking sparse Bayesian characteristic model to obtain the overall multi-source heterogeneous, low-quality information.
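A minimal sketch of this aggregation step is given below; the dictionary layout, the timestamp convention for the shielding records and the function name are assumptions.

```python
import numpy as np

def build_overall_model_input(gn, radii_dz_u, areas_sx,
                              day_record, night_record):
    """Assemble the inputs of the night sparse Bayesian characteristic model.

    day_record / night_record: (scale, shielding_time, place) tuples, with
    shielding_time given as a timestamp in seconds (assumption).
    """
    total_radius_dk_u = float(np.sum(radii_dz_u))      # sum of the Dz_u values
    total_area_sx_j = float(np.sum(areas_sx))          # sum of the Sx values
    monitoring_time = night_record[1] - day_record[1]  # time between the two records
    return {
        "Gn": gn,
        "Dk_u": total_radius_dk_u,
        "Sx_j": total_area_sx_j,
        "monitoring_time": monitoring_time,
    }
```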
Compared with the prior art, the embodiment of the invention achieves the following beneficial effects:
The embodiment of the invention provides a target tracking method for air-ground multi-view information enhancement, which comprises the following steps: collecting unmanned aerial vehicle monitoring data Gn, the unmanned aerial vehicle monitoring data Gn being target position data measured at a preset unit time and place; collecting unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times, the unmanned aerial vehicle monitoring radar reflection waveform data Dz being radar reflection waveform data of the target object reflection echo received by the monitoring radar; inputting the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times into a target object tracking condition cluster matching recognition model, and judging whether the problems of different scales and shielding of the target object tracking condition occur; when the problems of different scales and shielding of the target object tracking condition occur, marking the scale and shielding information and recording the scale, the shielding time and the place; obtaining target tracking information by using the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times, the target tracking information comprising a target tracking radius Dz_u and a radar reflection area Sx; and processing multi-source heterogeneous, low-quality information through a target object tracking sparse Bayesian characteristic model by using the unmanned aerial vehicle monitoring data Gn, the target object tracking information, and the scale, shielding time and place.
Since the echo reflection quantity is controlled over a period of time so as to reduce the processing burden at the time of target monitoring, the length of this period may affect the processing result. Therefore, the monitoring radar is used for real-time monitoring, and the result is processed in combination with the common monitoring data. In the technical scheme for processing heterogeneous, low-quality information from different environments provided by the embodiment of the application, the problems of different scales and shielding are detected first, so as to judge whether the echo reflection quantity is controlled. The radar reflection waveform data are first separated from the background radar reflection waveform data to detect shielding, and whether the detected shielding is genuine shielding is then judged according to the motion condition in two adjacent pieces of radar reflection waveform data. In this process, the local target object tracking condition of each piece of unmanned aerial vehicle monitoring radar reflection waveform data Dz is calculated separately before and after the different-scale problem occurs. The recorded duration of the different-scale and shielding problems is then used as an influence factor and as an input of the neural network, and the overall multi-source heterogeneous, low-quality information over a period of time is calculated. The embodiment of the application combines the local target object tracking condition and the overall target object tracking condition to process the multi-source heterogeneous, low-quality information more accurately, thereby realizing accurate tracking of the target.
Drawings
FIG. 1 is a flow chart of a method of an embodiment of the present invention.
Fig. 2 is a schematic diagram of the cell composition of an embodiment of the present invention.
Description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, rather than all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art on the basis of the embodiments of the present invention without creative effort shall fall within the scope of the present invention.
Examples
As shown in fig. 1, an embodiment of the present invention provides a target tracking method for air-ground multi-view information enhancement, which includes:
step A1: collecting unmanned aerial vehicle monitoring data Gn; the unmanned aerial vehicle monitoring data Gn are target position data obtained by measuring preset unit time and place.
The unmanned aerial vehicle monitoring data Gn are data obtained by detecting targets in different environments. The target position data in this embodiment is the concentration of the reflected echo. The preset unit time and place in this example are a standard day and night and a monitoring area with a radius of 10 km. The target position data include echoes of natural factors and human factors.
Step A2: and collecting radar reflected waveform data Dz of unmanned aerial vehicle monitoring at different times. The unmanned aerial vehicle monitoring radar reflection waveform data Dz is radar reflection waveform data of a target object reflection echo received by the monitoring radar.
The unmanned aerial vehicle monitoring radar reflection waveform data Dz is radar reflection waveform data of real-time monitoring target object conditions.
Step A3: inputting the radar reflection waveform data Dz of the unmanned aerial vehicle monitoring at different times into a target object tracking condition clustering matching recognition model, and judging whether the problems of different scales and shielding of the target object tracking condition occur or not.
Step A4: when the problems of different scales and shielding of the target object tracking condition occur, marking the scales and shielding information, marking the problems of different scales and shielding, and recording the scales, the shielding time and the place. And discarding the unmanned aerial vehicle monitoring radar reflection waveform data Dz.
Step A5: and when the problems of non-uniform scale and shielding of the target tracking condition do not occur, obtaining the target tracking information. The target tracking information comprises a target tracking radius Dz u And radar reflection area Sx.
The method comprises the steps of adding the area quantity in the daytime noise shielding radar reflection waveform data obtained in the process of judging whether the tracking condition of the target object is different in scale and shielding problem to obtain the tracking radius Dz of the target object u . And adding noise values in the daytime noise shielding radar reflection waveform data obtained in the process of judging whether the tracking condition of the target object is different in scale and shielding problem to obtain a radar reflection area Sx.
Step A6: and processing multi-source heterogeneous and low-quality information by utilizing the unmanned aerial vehicle monitoring data Gn, the target object tracking information, the scale, the shielding time and the place and by utilizing a target object tracking sparse Bayesian characteristic model.
The cluster matching recognition model has the expression:
where F_x represents the scale error of the tracked target object, M_j represents the standard scale of the tracked target object, R represents the Euclidean distance of the scale, n represents the number of iterations, N represents the number of cluster centers, η represents the convergence coefficient of the scale, and β(E_n) represents the dynamic distribution function of the tracked target object; the radar reflection area error of the tracked target object is defined analogously, where B_y represents the standard radar reflection area of the tracked target object, Q represents the Euclidean distance of the radar reflection area, λ represents the convergence coefficient of the radar reflection area, L(t) represents the error accumulation change function of the radar reflection area, and U represents the tensor product;
judging whether the problems of different scales and shielding of the tracking condition of the target object occur or not, wherein the expression is as follows:
Wherein W represents a scale change threshold value, and H represents a radar reflection area threshold value;
the sparse Bayesian characteristic model has the expression:
where Q_vcx represents the screened multi-source heterogeneous, low-quality information set, u represents a distance change factor, P represents the distance of the tracked target object, F_ry represents a dynamically changing value between the occlusion locations, a further term represents the time affecting the scale change, i represents the number of iterations, I represents the total number of sparse Bayesian nodes, Lc represents the change amount of the occlusion area, E represents the area occlusion factor, σ represents the scale change factor, and G_ry represents the scale change interval of the tracked target object.
Optionally, the inputting of the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times into the target object tracking condition cluster matching recognition model and the judging of whether the problems of different scales and shielding of the target object tracking condition occur include:
and acquiring radar reflection waveform data Dzb of the unmanned aerial vehicle in the daytime. The daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb is radar reflection waveform data in the different time unmanned aerial vehicle monitoring radar reflection waveform data Dz.
And acquiring night unmanned aerial vehicle monitoring radar reflection waveform data Dzy. The night unmanned aerial vehicle monitoring radar reflection waveform data Dzy are the radar reflection waveform data, among the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times, whose reflection area changes most with respect to the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and whose monitoring node lies after the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb.
And respectively inputting the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy into a target object tracking condition clustering matching recognition model to obtain daytime target object tracking radar reflectivity and night target object tracking radar reflectivity. The daytime target tracking radar reflectivity corresponds to the daytime drone monitoring radar reflection waveform data Dzb. The night target tracking radar reflectivity corresponds to the night drone monitoring radar reflection waveform data Dzy.
In this embodiment, the target object tracking condition cluster matching recognition model introduces a 7D feature vector based on a Line Mod algorithm to replace an original 3D space vector. The 7D feature vector includes a 3D spatial position feature vector (X, Y, Z) and a 4D pose vector (gradient direction, gradient magnitude, surface normal vector direction, surface normal vector magnitude). When feature information is calculated, each feature point is influenced by the 7D feature vector, a more reasonable feature block can be obtained, and features with obvious correlation on the surface of an object can be well separated, so that the obtained template has more logic correlation and identification. In the feature information dimension reduction process, a gradient reduction method is adopted to reduce the dimension of the feature information containing 7D feature vectors into 3D, then feature point mean clustering is adopted to realize total matching of templates, redundant clustering is automatically eliminated, a new feature template with a large amount of unique feature information is obtained, and recognition accuracy can be obviously improved in the matching process.
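As a rough illustration of the 7D feature construction and feature-point clustering described above, the sketch below assembles 7-D feature vectors, reduces them to 3-D and clusters them; PCA and plain k-means are stand-ins for the gradient dimension-reduction and feature-point mean clustering named in the text, and the single-angle encoding of the two directions is a simplification.

```python
import numpy as np

def seven_d_features(positions, grad_dirs, grad_mags, normal_dirs, normal_mags):
    """Assemble 7-D feature vectors: 3-D spatial position (X, Y, Z) plus
    gradient direction/magnitude and surface-normal direction/magnitude
    (each direction reduced to a single angle here, a simplification)."""
    return np.column_stack([positions, grad_dirs, grad_mags,
                            normal_dirs, normal_mags])

def reduce_and_cluster(features, n_clusters=8, n_iters=20, seed=0):
    """Reduce the 7-D features to 3-D and cluster the feature points.
    PCA and plain k-means are illustrative stand-ins for the patent's
    gradient dimension reduction and feature-point mean clustering."""
    rng = np.random.default_rng(seed)

    # PCA via SVD: project the centred 7-D features onto 3 components.
    centred = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    reduced = centred @ vt[:3].T                     # 7-D -> 3-D

    # Plain k-means over the reduced feature points.
    centers = reduced[rng.choice(len(reduced), n_clusters, replace=False)]
    for _ in range(n_iters):
        d = np.linalg.norm(reduced[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = reduced[labels == k].mean(axis=0)
    return reduced, labels, centers
```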
In the embodiment of the invention, the shielding radar reflection waveform data is input into a slicing structure, slicing operation is carried out on the shielding radar reflection waveform data, and the shielding radar reflection waveform data with high resolution is split into shielding radar reflection waveform data with different low resolutions by using a method of column separation sampling and splicing. The method comprises the steps of inputting different low-resolution shielding radar reflection waveform data into a cross-stage local structure, dividing the different low-resolution shielding radar reflection waveform data into two branches, respectively performing convolution operation to halve the number of channels, then performing parameter quantity reduction operation on one branch, and combining the two branches to enable a model to learn more characteristics. And inputting the radar reflectivity with more learned features into an upsampling pyramid structure for upsampling, and fusing the features to obtain multi-scale feature output. The downsampling pyramid structure downsamples from bottom to top, so that the top layer features contain strong shielding position information, radar reflectivities of different sizes all contain strong shielding feature information, and accurate prediction of shielding radar reflection waveform data of different sizes is guaranteed. And finally outputting the tracking radar reflectivity of the target object in the daytime and the tracking radar reflectivity of the target object at night. Radar reflectivity (daytime target tracking radar reflectivity and night target tracking radar reflectivity) characterizes the occlusion presence value, the center point position of the occlusion presence area and the width and height of the occlusion presence area.
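The slicing and cross-stage local operations described above are illustrated by the toy PyTorch sketch below; the channel counts, kernel sizes and the trivial pyramid-style fusion at the end are assumptions chosen only to show the data flow, not the network used in the embodiment.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Slice(nn.Module):
    """Column/row-interleaved slicing: split a high-resolution waveform map
    into four lower-resolution maps and stack them on the channel axis."""
    def forward(self, x):
        return torch.cat([x[..., ::2, ::2], x[..., 1::2, ::2],
                          x[..., ::2, 1::2], x[..., 1::2, 1::2]], dim=1)

class CSPBlock(nn.Module):
    """Cross-stage partial block: halve the channels into two branches,
    process one branch, then merge, so the model can learn more features
    with fewer parameters.  Channel counts are illustrative."""
    def __init__(self, channels):
        super().__init__()
        half = channels // 2
        self.split1 = nn.Conv2d(channels, half, 1)
        self.split2 = nn.Conv2d(channels, half, 1)
        self.process = nn.Sequential(
            nn.Conv2d(half, half, 3, padding=1), nn.ReLU(),
            nn.Conv2d(half, half, 3, padding=1), nn.ReLU())
        self.merge = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        return self.merge(torch.cat([self.process(self.split1(x)),
                                     self.split2(x)], dim=1))

# Minimal usage: slice a 1-channel waveform map, run one CSP block, then
# downsample and upsample back as a crude pyramid-style fusion.
x = torch.randn(1, 1, 64, 64)
sliced = Slice()(x)                                   # -> (1, 4, 32, 32)
feat = CSPBlock(4)(sliced)                            # -> (1, 4, 32, 32)
down = nn.Conv2d(4, 4, 3, stride=2, padding=1)(feat)  # -> (1, 4, 16, 16)
fused = feat + F.interpolate(down, scale_factor=2)    # pyramid-style fusion
```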
And comparing the tracking radar reflectivity of the daytime target object with the tracking radar reflectivity of the night target object by utilizing the monitoring radar reflection waveform data Dzb of the daytime unmanned aerial vehicle and the monitoring radar reflection waveform data Dzy of the night unmanned aerial vehicle, and judging whether the tracking condition of the target object has different scales and shielding problems.
By the method, the characteristics of the radar reflection waveform data Dz of the unmanned aerial vehicle are extracted through the target object tracking situation cluster matching recognition model, and the difference of the characteristics (the target object tracking radar reflectivity in the daytime and the target object tracking radar reflectivity at night) of the radar reflection waveform data Dz of the two unmanned aerial vehicle are compared, so that whether the target object tracking situation is different in scale and the shielding problem is judged. While the data is stored to facilitate later processing of the multi-source heterogeneous, low quality information.
Optionally, the comparing of the daytime target object tracking radar reflectivity and the night target object tracking radar reflectivity by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, and the judging of whether the problems of different scales and shielding of the target object tracking condition occur, include:
the daytime shielding area is obtained by utilizing the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity; the daytime occlusion area represents the position of occlusion in the daytime drone monitoring radar reflection waveform data Dzb.
The night shielding area is obtained by utilizing the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity; the night occlusion area represents the position occluded in the night drone monitoring radar reflection waveform data Dzy.
The night shielding area acquisition method is the same as the day shielding area acquisition method. Background noise radar reflected waveform data is collected.
The background noise radar reflection waveform data represents noise radar reflection waveform data when the monitoring radar is not shielded. And carrying out noise filtering on the radar reflection waveform data Dzy monitored by the unmanned aerial vehicle at night to obtain radar reflection waveform data of target tracking noise at night. And subtracting the noise in the background noise radar reflection waveform data from the noise in the night target tracking noise radar reflection waveform data to obtain night target tracking noise difference radar reflection waveform data. And calibrating a value smaller than a noise threshold value in the night target object tracking noise difference radar reflection waveform data as a reference value to obtain night noise shielding radar reflection waveform data. And determining whether the area is a shielding area by utilizing the night noise shielding radar reflection waveform data, the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy and the radar reflectivity tracked by the two targets. And when the area is the shielding area, the area exceeding the reference value in the night noise shielding radar reflection waveform data is taken as the night shielding area.
The echo reflection variance is obtained by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime shielding area and the night shielding area. The echo reflection variance represents the change of the reflected target object in the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy.
When the echo reflection variance exceeds a target object tracking preset change interval, the problems of different scales and shielding of the target object tracking condition occur.
In this embodiment, the target tracking preset change interval is 10 cm².
When the echo reflection variance is within a preset change interval of target object tracking, the problems of non-uniform scale and shielding of the target object tracking condition are not caused.
By the method, the area where the shielding is located is detected by the radar reflectivity, and the shielding change condition in the two pieces of radar reflection waveform data is obtained according to the different shielding areas of the two pieces of radar reflection waveform data and the radio frequency variance between the shielding areas, so that whether the target object tracking condition is different in scale and the shielding problem is judged.
Optionally, the obtaining of the daytime shielding area by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target object tracking radar reflectivity and the night target object tracking radar reflectivity includes:
Collecting background noise radar reflection waveform data; the background noise radar reflection waveform data represent the radar reflection waveform data when the monitoring radar is not shielded.
And filtering noise from the radar reflection waveform data Dzb monitored by the daytime unmanned aerial vehicle to obtain the radar reflection waveform data of the daytime target tracking noise.
The method for filtering the background noise radar reflection waveform data and the method for filtering the daytime target tracking noise radar reflection waveform data noise is the same. The noise is gaussian white noise.
And subtracting the noise in the background noise radar reflection waveform data from the noise in the daytime target tracking noise radar reflection waveform data to obtain daytime target tracking noise difference radar reflection waveform data.
And calibrating a value smaller than a noise threshold value in the daytime target tracking noise difference radar reflection waveform data as a reference value to obtain daytime noise shielding radar reflection waveform data.
In this embodiment, the target tracking preset change interval is 20 cm².
And determining whether a shielding area exists in the daytime noise shielding radar reflection waveform data by utilizing the daytime noise shielding radar reflection waveform data, the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity.
When the shielding area exists in the daytime noise shielding radar reflection waveform data, the area exceeding the reference value in the daytime noise shielding radar reflection waveform data is taken as the daytime shielding area.
By the method, the shielding area is approximately obtained according to the difference between the background noise radar reflection waveform data and the daytime target tracking noise radar reflection waveform data. However, due to the fact that noise and possibly illumination are different between the two pictures, and other shielding objects are generated, shielding characteristics are required to be further detected, and further whether shielding area exists in noise shielding radar reflection waveform data in the daytime, namely whether the detected area is actually the shielding area or not is judged.
Optionally, the determining of whether a shielding area exists in the daytime noise shielding radar reflection waveform data by using the daytime noise shielding radar reflection waveform data, the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target object tracking radar reflectivity and the night target object tracking radar reflectivity comprises the following steps:
and acquiring the tracking characteristic area of the object in the daytime. And the daytime target tracking characteristic area is the corresponding position of the boundary of the daytime noise shielding radar reflection waveform data in the daytime target tracking radar reflectivity.
The same target object tracking different reflection characteristics are collected. The target object tracking different reflection characteristics represent the reflection characteristics at the position of the daytime target object tracking characteristic area in the daytime target object tracking radar reflectivity.
And collecting different reflection characteristics tracked by the target object in the daytime. The tracking of different reflection characteristics of the target object in the daytime is that the same target object tracks one of the different reflection characteristics of the target object to track the different reflection characteristics.
Different reflection characteristics of the interference target object tracking are acquired. The interference target object tracking different reflection characteristics represent tracking different reflection characteristics of the target object in different monitoring points of the central environment by taking the tracking different reflection characteristics of the target object in the daytime as the central environment.
And taking the different reflection characteristics tracked by the target object in the daytime and the different reflection characteristics tracked by the interference target object as inputs, calculating the output of the convolutional neural network, and fusing to obtain fused reflection characteristics.
Whether a shielding area exists in the daytime noise shielding radar reflection waveform data is then determined by using the night target object tracking radar reflectivity, the daytime target object tracking different reflection characteristics and the fusion reflection characteristics.
By the method, the area obtained by radar reflection waveform data is mapped into radar reflectances, and the reflection characteristics of the boundary in the two radar reflectances are judged to be the shielding area because the shielding moves towards the periphery. Meanwhile, as the shielding and diffusion states of the shielding and the environmental position are close, the characteristic of the environmental shielding is combined by using a fusion method.
Optionally, the determining whether the shielding area exists in the noise shielding radar reflection waveform data in the daytime by utilizing the reflectivity of the target object tracking radar in the night, the tracking of different reflection characteristics and the fusion reflection characteristics of the target object in the daytime includes:
and (5) collecting different reflection characteristics of the target object tracking at night. And the night target object tracks the reflection characteristics of the positions corresponding to the different reflection characteristics in the daytime target object tracking radar reflectivity.
And collecting and monitoring different reflection characteristics of the interference target object tracking. The monitoring of the different reflection characteristics of the interference target object tracking is that the target object tracking at night is centered on the different reflection characteristics and does not belong to the influence factors of the shielding area.
The daytime target object tracking different reflection characteristics are subtracted from the monitored interference target object tracking different reflection characteristics to obtain environmental difference influence factors. Different monitored interference target object tracking reflection characteristics correspond to different environmental difference influence factors; one environmental difference influence factor corresponds to one monitored interference target object tracking reflection characteristic. Each environmental difference influence factor has different influence factor values. For each environmental difference influence factor, the error of each influence factor value is calculated and the error values are collected; different influence factor values correspond to different error values. The different error values are summed to obtain a summed error value. The arithmetic covariance of the summed error value is calculated and taken as the environmental standard deviation. Each environmental difference influence factor corresponds to one environmental standard deviation, and different environmental difference influence factors correspond to different environmental standard deviations. When the environmental standard deviation exceeds the environmental difference fluctuation range, it is determined that a shielding area exists in the daytime noise shielding radar reflection waveform data.
The environmental difference fluctuation range of the present embodiment is 5 cm².
By the above method, the shielded motion state diffuses outwards, so the state is identified from the shielded motion condition in the two pieces of radar reflection waveform data. Because the shielding only diffuses into the environment over a short time, one piece of shielding radar reflection waveform data is taken as a base point, and the motion condition is judged according to the characteristic condition of the corresponding position in the other piece of radar reflection waveform data, so as to judge whether shielding occurs, that is, whether a shielding area exists in the daytime noise shielding radar reflection waveform data.
Optionally, obtaining the echo reflection variance by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime shielding area and the night shielding area includes:
and fusing the daytime shielding area and the night shielding area to obtain a fused shielding area. The fusion shielding area represents an area containing both a daytime target tracking area and a night target tracking area.
And calibrating a scattering value outside the fused shielding area in the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb as a reference value to obtain daytime background object tracking radar reflection waveform data.
And calibrating a scattering value outside the fused shielding area in the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy to be a reference value to obtain night background object tracking radar reflection waveform data.
And converting the daytime background object tracking radar reflection waveform data into video signals to obtain daytime video background object tracking radar reflection waveform data.
And converting the night background object tracking radar reflection waveform data into video signals to obtain night video background object tracking radar reflection waveform data.
Wherein the conversion of scatter to video signals is performed using a ScanStreamer.
Collecting radio frequency difference radar reflection waveform data. The radio frequency difference radar reflection waveform data are radar reflection waveform data formed from the absolute values of the background object tracking variances. The background object tracking variance is obtained by subtracting the night video background object tracking radar reflection waveform data from the median value of the daytime video background object tracking radar reflection waveform data.
And converting the radio frequency difference radar reflected waveform data into scattering, and filtering noise to obtain noise difference radar reflected waveform data.
Wherein the conversion of video signals into scatter is performed using a ScanStreamer.
And classifying noise values in the noise difference radar reflection waveform data to obtain echo reflection variances.
By this method, because the shielding concentration is represented in the radar reflection waveform data by colour depth, the noise values are used to judge the shielding concentration after the background factors are removed, and the echo reflection condition is obtained. The variances of the three channels of the radar reflection waveform data video signals are calculated separately, followed by further operations such as noise filtering, so that the variance between the two pieces of radar reflection waveform data can be judged in terms of hue, saturation and brightness, and the echo reflection condition can be obtained more accurately.
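As a rough illustration (not the patent's exact procedure), the sketch below treats each "video signal" form of the waveform data as a three-channel HSV-like array; the function name, the mask representation and the noise threshold are assumptions, and the ScanStreamer conversion is replaced by this array view.

```python
import numpy as np

def echo_reflection_variance(dzb, dzy, day_mask, night_mask,
                             reference_value=0.0, noise_threshold=0.1):
    """Illustrative sketch: derive an echo reflection variance from day/night
    waveform data given as 3-channel arrays and their shielding-area masks."""
    dzb, dzy = np.asarray(dzb, float), np.asarray(dzy, float)
    fused_mask = day_mask | night_mask                            # fused shielding area
    day = np.where(fused_mask[..., None], dzb, reference_value)   # values outside the area -> reference
    night = np.where(fused_mask[..., None], dzy, reference_value)
    # background object tracking variance: night data subtracted from the daytime median
    diff = np.median(day, axis=(0, 1), keepdims=True) - night
    rf_difference = np.abs(diff)                                  # radio frequency difference data
    rf_difference[rf_difference < noise_threshold] = 0.0          # crude noise filtering
    # per-channel (hue / saturation / brightness) variances summarised as a single value
    channel_vars = [rf_difference[..., c].var() for c in range(rf_difference.shape[-1])]
    return float(np.mean(channel_vars))
```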
Optionally, processing the multi-source heterogeneous and low-quality information by using the unmanned aerial vehicle monitoring data Gn, the target object tracking information, the scale, the shielding time and the place and by using the target object tracking sparse Bayesian characteristic model includes:
and inputting the unmanned aerial vehicle monitoring data Gn and the target tracking information into a daytime target tracking sparse Bayesian characteristic model to obtain a daytime processing result. Different target object tracking information correspondingly acquires different daytime processing results.
Wherein one daytime processing result corresponds to one piece of target object tracking information. The different target tracking information is obtained from the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times before the target tracking information changes.

When the target object tracking information corresponds to the marked problem of different scales and shielding, the daytime scale, shielding time and place are stored in the cloud, the target object tracking information is moved forward, and the marked problem of different scales and shielding is judged repeatedly until the target object tracking information corresponds to the marked problem of different scales and shielding again;

when the target object tracking information corresponds to the marked problem of different scales and shielding again, the night scale, shielding time and place are stored in the cloud, and the unmanned aerial vehicle monitoring data Gn and the different target object tracking information are input into the target object tracking sparse Bayesian characteristic model to obtain different night processing results.

Wherein one night processing result corresponds to one piece of target object tracking information. The different target tracking information is obtained from the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times after the target tracking information changes again.
And inputting the unmanned aerial vehicle monitoring data Gn, target tracking information, daytime scale, shielding time and place, night scale, shielding time and place into a night target tracking sparse Bayesian characteristic model to obtain overall multi-source heterogeneous and low-quality information.
The structure of the object tracking sparse Bayesian characteristic model is shown in fig. 2.
The daytime processing results and the different night processing results are then processed by using extended Kalman filtering.

Wherein the multi-source heterogeneous, low-quality information includes primary clutter, secondary clutter and tertiary clutter. The network output corresponds to a value on which subsequent operations such as averaging are carried out, and the value is given a meaning: for example, an output value of 1 represents primary clutter, 2 represents secondary clutter and 3 represents tertiary clutter, with a higher level indicating more chaotic data.

By this method, the multi-source heterogeneous, low-quality information of each piece of radar reflection waveform data is judged first, and the time of the problem of different scales and shielding is stored, which controls whether the multi-source heterogeneous, low-quality information at that moment is skipped and tracks the change in the time of the problem. During monitoring the target controls the reflection quantity for a period of time so as to reduce the processing result, and this control time, that is, the duration of the problem of different scales and shielding, influences the processing result. Therefore, while the problem of different scales and shielding persists, the recorded duration of the problem is used as an influence factor and fed into the neural network to calculate the multi-source heterogeneous, low-quality information over the whole period. Combining the local and the overall results yields the processed multi-source heterogeneous, low-quality information more accurately.
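A schematic Python sketch of this control flow follows; the day/night sparse Bayesian characteristic models and the cloud store are placeholder callables and dictionaries, the dictionary keys are invented for illustration, and the mapping of outputs 1-3 to clutter levels follows the example above.

```python
from statistics import mean

CLUTTER_LEVELS = {1: "primary clutter", 2: "secondary clutter", 3: "tertiary clutter"}

def process_low_quality_info(gn, tracking_info_stream, day_model, night_model, cloud_store):
    """Illustrative day/night control flow for the multi-source heterogeneous,
    low-quality information (all interfaces here are placeholders)."""
    day_results, night_results = [], []
    marked = None
    for info in tracking_info_stream:                        # tracking information at successive times
        if marked is None and not info.get("marked_problem"):
            day_results.append(day_model(gn, info))          # one daytime processing result per info
        elif marked is None:
            marked = {"scale": info["scale"], "time": info["time"], "place": info["place"]}
            cloud_store["daytime"] = marked                   # store daytime scale, shielding time and place
        elif info.get("marked_problem"):
            cloud_store["night"] = {"scale": info["scale"], "time": info["time"], "place": info["place"]}
            night_results.append(night_model(gn, info))       # night processing result
            break
        # otherwise the problem has been marked once: move the information forward and keep judging
    results = day_results + night_results
    overall = round(mean(results)) if results else 1          # 1 = primary, 2 = secondary, 3 = tertiary
    return CLUTTER_LEVELS.get(overall, "tertiary clutter")
```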
Optionally, inputting the unmanned aerial vehicle monitoring data Gn, the target object tracking information, the daytime scale, shielding time and place, and the night scale, shielding time and place into the night target object tracking sparse Bayesian characteristic model to obtain the overall multi-source heterogeneous and low-quality information includes:
Subtracting the daytime scale, shielding time and place from the night scale, shielding time and place to obtain the monitoring time; the monitoring time is the time between the two scale, shielding time and place points.

Adding the different target tracking radii Dzu to obtain the total target tracking radius Dku.

Adding the different radar reflection areas Sx to obtain the total radar reflection area Sxj.

Wherein the total target tracking radius Dku and the total radar reflection area Sxj are in a fixed ratio.

Inputting the unmanned aerial vehicle monitoring data Gn, the total target tracking radius Dku, the total radar reflection area Sxj and the monitoring time into the target object tracking sparse Bayesian characteristic model to obtain the overall multi-source heterogeneous and low-quality information.
By this method, because the monitoring time has an influence, the target object tracking is judged with respect to time; the monitoring time controls whether the input is applied, acting in effect as a switch, while the directly measured monitoring data are input as they are.
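A small sketch of this aggregation step under the same placeholder assumptions (feature_model and the mark dictionaries are illustrative, not defined by the patent):

```python
def overall_low_quality_info(gn, radii, areas, day_mark, night_mark, feature_model):
    """Illustrative aggregation feeding the target tracking sparse Bayesian
    characteristic model (feature_model and the mark dictionaries are placeholders)."""
    # monitoring time: night scale/shielding time-and-place minus the daytime one
    monitoring_time = night_mark["time"] - day_mark["time"]
    dku = sum(radii)   # total target tracking radius Dku from the different radii Dzu
    sxj = sum(areas)   # total radar reflection area Sxj from the different areas Sx
    # Dku and Sxj are stated to keep a fixed ratio; only the totals and the time are passed on
    return feature_model(gn, dku, sxj, monitoring_time)
```

In this reading, the monitoring time plays the switch-like role described above, while Gn is passed through unchanged.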
According to the method, features of the unmanned aerial vehicle monitoring radar reflection waveform data Dz are extracted by the target object tracking condition cluster matching recognition model to detect the radar reflectivity and locate the shielded area; from the different shielding areas of the two pieces of radar reflection waveform data and the radio frequency variance between them, the change of the shielding across the two pieces of data is obtained, so that whether the target object tracking condition suffers from the problem of different scales and shielding is judged. The data are also stored to facilitate the later processing of the multi-source heterogeneous, low-quality information.

Because the two pieces of data have different noise and possibly different illumination, and other shielding objects or other causes may appear, the shielding characteristics also need further detection. Since the shielding moves towards the periphery, the shielding area obtained from the radar reflection waveform data is mapped into the radar reflectivities, and whether it is a shielding area is judged from the reflection characteristics at the boundary in the two radar reflectivities.

Meanwhile, since the diffusion state of the shielding is close to that of the surrounding environment, a fusion method is used to incorporate the characteristics of the environmental shielding. The motion state of the shielding is outward diffusion, so the state is identified from the motion of the shielding across the two pieces of radar reflection waveform data: one piece of shielded radar reflection waveform data is taken as a base point, and the motion is judged from the characteristics at the corresponding position in the other piece of radar reflection waveform data, thereby judging whether shielding has occurred.

The shielding concentration is represented in the radar reflection waveform data by colour depth, so the noise values are used to judge the shielding concentration after the background factors are removed; the variances of the three channels of the radar reflection waveform data video signals are calculated separately and followed by further operations such as noise filtering, so that the variance between the two pieces of radar reflection waveform data can be judged and the echo reflection condition obtained more accurately. After the shielding area is obtained, the multi-source heterogeneous, low-quality information is processed. The multi-source heterogeneous, low-quality information of each piece of radar reflection waveform data is judged first, and the time of the problem of different scales and shielding is stored, which controls whether the information at that moment is skipped and tracks the change of the problem time. During monitoring the target controls the reflection quantity for a period of time so as to reduce the processing result, and this control time, that is, the duration of the problem of different scales and shielding, influences the processing result. Therefore, while the problem persists, the recorded duration is used as an influence factor and fed into the neural network to calculate the multi-source heterogeneous, low-quality information over the whole period; combining the local and the overall results yields the processed multi-source heterogeneous, low-quality information more accurately.
Examples
As shown in FIG. 2, the target tracking method enhanced by air-ground multi-view information is realized by different units, comprising a data acquisition unit, a scale non-uniformity and shielding problem monitoring unit, a marking unit, a target object tracking detection unit and a multi-source heterogeneous and low-quality information processing unit.
The data acquisition unit is used for acquiring unmanned aerial vehicle monitoring data Gn. The unmanned aerial vehicle monitoring data Gn are target position data obtained by measuring preset unit time and place. And collecting radar reflected waveform data Dz of unmanned aerial vehicle monitoring at different times. The unmanned aerial vehicle monitoring radar reflection waveform data Dz is radar reflection waveform data of a target object reflection echo received by the monitoring radar.
The scale non-uniformity and shielding problem monitoring unit is used for inputting radar reflection waveform data Dz of unmanned aerial vehicle monitoring at different times into a target object tracking condition cluster matching recognition model to judge whether the scale non-uniformity and shielding problem of the target object tracking condition occur or not.
The marking unit is used for marking the scale and shielding information when the scale of the tracking condition of the target object is different and the shielding problem occurs, and recording the scale, the shielding time and the place.
The target object tracking detection unit is used for obtaining target tracking information from the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times; the target tracking information comprises a target tracking radius Dzu and a radar reflection area Sx.
The multi-source heterogeneous and low-quality information processing unit is used for processing multi-source heterogeneous and low-quality information by utilizing the unmanned aerial vehicle monitoring data Gn, the target object tracking information, the scale, the shielding time and the place and by utilizing the target object tracking sparse Bayesian characteristic model.
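Purely as a sketch of how these five units could be composed in code (none of the class or method names below are defined by the patent), the division of labour maps onto a small pipeline object:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List

@dataclass
class AirGroundTrackingPipeline:
    """Illustrative composition of the five units; all names are assumptions."""
    recognition_model: Callable[[List[Any]], bool]       # cluster matching recognition model
    feature_model: Callable[..., Any]                     # sparse Bayesian characteristic model
    marks: Dict[str, Any] = field(default_factory=dict)   # recorded scale, shielding time and place

    def acquire(self, gn: Any, dz_frames: List[Dict[str, Any]]) -> None:
        # data acquisition unit: store Gn and the different-time waveform data Dz
        self.gn, self.dz_frames = gn, dz_frames

    def monitor_problem(self) -> bool:
        # scale non-uniformity and shielding problem monitoring unit
        return self.recognition_model(self.dz_frames)

    def mark(self, scale: Any, time: Any, place: Any) -> None:
        # marking unit: record scale, shielding time and place
        self.marks.update(scale=scale, time=time, place=place)

    def detect(self) -> Dict[str, List[float]]:
        # target object tracking detection unit: tracking radius Dzu and reflection area Sx
        return {"Dzu": [f.get("radius", 0.0) for f in self.dz_frames],
                "Sx": [f.get("area", 0.0) for f in self.dz_frames]}

    def process(self) -> Any:
        # multi-source heterogeneous and low-quality information processing unit
        return self.feature_model(self.gn, self.detect(), self.marks)
```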
With respect to the units in the above embodiment, the specific manner in which the respective units perform their operations has been described in detail in the embodiments of the method, and will not be elaborated here.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the elements in the apparatus of the embodiments may be adaptively changed and disposed in one or a different apparatus than the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into different sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in an apparatus according to embodiments of the present invention may be implemented in practice using a microprocessor or digital signal processor (DSP). The present invention can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may be in the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim.

Claims (9)

1. The target tracking method for the air-ground multi-view information enhancement is characterized by comprising the following steps of:
step A1: collecting unmanned aerial vehicle monitoring data Gn; the unmanned aerial vehicle monitoring data Gn are target object position data measured in preset unit time and place;
step A2: collecting radar reflected waveform data Dz of unmanned aerial vehicles at different times; the unmanned aerial vehicle monitoring radar reflection waveform data Dz is radar reflection waveform data of a target object reflection echo received by the monitoring radar;
step A3: inputting the radar reflection waveform data Dz of the unmanned aerial vehicle monitoring at different times into a target object tracking condition clustering matching recognition model, and judging whether the problems of different scales and shielding of the target object tracking condition occur or not;
step A4: when the tracking condition of the target object has different scales and the shielding problem occurs, marking the scales and shielding information, and recording the scales, the shielding time and the place;
step A5: obtaining target tracking information by using the unmanned aerial vehicle monitoring radar reflection waveform data Dz at different times; the target tracking information comprises a target tracking radius Dzu and a radar reflection area Sx;
step A6: processing multi-source heterogeneous and low-quality information by utilizing the unmanned aerial vehicle monitoring data Gn, target object tracking information, scale, shielding time and place and by utilizing a target object tracking sparse Bayesian characteristic model;
The expression of the cluster matching recognition model is as follows:
wherein Fx represents the scale error of the tracking target object, Mj represents the standard scale of the tracking target object, R represents the Euclidean distance of the scale, N represents the number of iterations, n represents the number of clustering centers, η represents the convergence coefficient of the scale, and β(En) represents the dynamic distribution function of the tracking target object; Ha represents the radar reflection area error of the tracking target object, By represents the standard radar reflection area of the tracking target object, Q represents the Euclidean distance of the radar reflection area, λ represents the convergence coefficient of the radar reflection area, L(t) represents the error accumulation change function of the radar reflection area, and ⊗ represents the tensor product;
judging whether the target object tracking condition has different scales and shielding problems or not, wherein the expression is as follows:
wherein W represents a scale change threshold value, and H represents a radar reflection area threshold value;
the expression of the sparse Bayesian characteristic model is as follows:
wherein Qvcx represents the screened-out multi-source heterogeneous and low-quality information set, u represents the distance change factor, P represents the distance of the tracking target object, Fry represents the dynamic change value between the shielding place and the shielding value, ζ represents the time influencing the scale change, i represents the number of iterations, I represents the total number of sparse Bayesian nodes, lc represents the shielding area change quantity, E represents the area shielding factor, σ represents the scale change factor, and Gry represents the scale change interval of the tracking target object.
2. The method for target tracking with enhanced air-ground multi-view information according to claim 1, wherein the step of inputting the radar reflection waveform data Dz of unmanned aerial vehicle monitoring at different times into a target tracking condition cluster matching recognition model to judge whether the target tracking condition is different in scale and the shielding problem is caused comprises the following steps:
acquiring radar reflected waveform data Dzb of the unmanned aerial vehicle in the daytime; the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb are radar reflection waveform data in the different time unmanned aerial vehicle monitoring radar reflection waveform data Dz;
collecting radar reflection waveform data Dzy of the unmanned aerial vehicle at night; the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy is, among the different-time unmanned aerial vehicle monitoring radar reflection waveform data Dz, the radar reflection waveform data whose reflection area changes most relative to the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and whose monitoring node lies after that of the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb;
respectively inputting the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy into a target object tracking condition clustering matching recognition model to obtain daytime target object tracking radar reflectivity and night target object tracking radar reflectivity; the daytime target tracking radar reflectivity corresponds to the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb; the night target tracking radar reflectivity corresponds to night unmanned aerial vehicle monitoring radar reflection waveform data Dzy;
And comparing the daytime target tracking radar reflectivity with the night target tracking radar reflectivity by utilizing the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, and judging whether the target tracking condition is different in scale and the shielding problem occurs.
3. The method for tracking the target with enhanced air-ground multi-view information according to claim 2, wherein the steps of comparing the daytime target tracking radar reflectivity with the night target tracking radar reflectivity by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb and the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, and judging whether the target tracking situation has different scales and a shielding problem comprise:
the daytime shielding area is obtained by utilizing the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity; the daytime occlusion area represents the position occluded in the daytime unmanned monitoring radar reflection waveform data Dzb;
the night shielding area is obtained by utilizing the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity; the night occlusion area represents the position occluded in night drone monitoring radar reflection waveform data Dzy;
Acquiring echo reflection variance by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, the daytime shielding area and the night shielding area; the echo reflection variance represents the change condition of a reflection target object in the radar reflection waveform data Dzb monitored by the unmanned aerial vehicle in the daytime and the radar reflection waveform data Dzy monitored by the unmanned aerial vehicle at night;
when the echo reflection variance exceeds a target object tracking preset change interval, judging that the scale of the target object tracking condition is different and the shielding problem occurs; otherwise, judging that the problems of non-uniform scale and shielding of the tracking condition of the target object do not occur.
4. The method for tracking the target with enhanced air-ground multi-view information according to claim 3, wherein the obtaining the daytime shielding area by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity comprises:
collecting background noise radar reflection waveform data; the background noise radar reflection waveform data represents noise radar reflection waveform data when the monitoring radar is not shielded;
the radar reflection waveform data Dzb monitored by the unmanned aerial vehicle in the daytime is subjected to noise filtering to obtain radar reflection waveform data of tracking noise of a target object in the daytime;
Subtracting noise in the background noise radar reflection waveform data from noise in the daytime target tracking noise radar reflection waveform data to obtain daytime target tracking noise difference radar reflection waveform data;
calibrating a value smaller than a noise threshold value in the daytime target tracking noise difference radar reflection waveform data as a reference value to obtain daytime noise shielding radar reflection waveform data;
determining whether a shielding area exists in the daytime noise shielding radar reflection waveform data by utilizing the daytime noise shielding radar reflection waveform data, the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity;
when the shielding area exists in the daytime noise shielding radar reflection waveform data, the area exceeding the reference value in the daytime noise shielding radar reflection waveform data is taken as the daytime shielding area.
5. The method for tracking the target with enhanced air-to-ground multi-view information according to claim 4, wherein the determining whether the shielding area exists by using the daytime noise shielding radar reflection waveform data, the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the daytime target tracking radar reflectivity and the night target tracking radar reflectivity comprises:
Collecting the tracking characteristic area of a target object in the daytime; the daytime target tracking characteristic area is a position corresponding to the boundary of daytime noise shielding radar reflection waveform data in the daytime target tracking radar reflectivity;
collecting different reflection characteristics of the same target object tracking; the same target object tracking different reflection characteristics represent the reflection characteristics at the position of the daytime target tracking characteristic area in the daytime target tracking radar reflectivity;
collecting different reflection characteristics tracked by a target object in the daytime; the tracking of different reflection characteristics of the target object in the daytime is that the same target object tracks one of the different reflection characteristics of the target object;
collecting different reflection characteristics of the interference target object tracking; the interference target tracking different reflection characteristics represent the target tracking different reflection characteristics at different monitoring points of the surrounding environment, taking the daytime target tracking different reflection characteristic as the center of that environment;
inputting different reflection characteristics of tracking the target object in the daytime and different reflection characteristics of tracking the interference target object into a convolutional neural network to obtain fusion reflection characteristics;
and determining whether the shielding area exists in the noise shielding radar reflection waveform data in the daytime by utilizing the reflectivity of the target object tracking radar in the night, the different reflection characteristics and the fusion reflection characteristics of the target object tracking in the daytime.
6. The method for tracking the target with enhanced air-ground multi-view information according to claim 5, wherein the step of determining whether the noise shielding radar reflection waveform data in the daytime has a shielding area by using the night target tracking radar reflectivity, the daytime target tracking different reflection characteristics and the fusion reflection characteristics comprises the steps of:
collecting different reflection characteristics tracked by a target object at night; the different reflection characteristics of the target object tracking at night are reflection characteristics of the target object tracking radar at night in the corresponding positions of the different reflection characteristics of the target object tracking at daytime in the reflectivity of the target object tracking radar at night;
collecting and monitoring different reflection characteristics of the interference target object tracking; the monitoring of the different reflection characteristics of the interference target object tracking is that the target object tracking at night takes the different reflection characteristics as the center and does not belong to the influence factors of the shielding area;
subtracting different reflection characteristics of tracking the target object in the daytime from different reflection characteristics of tracking the monitoring interference target object to obtain an environmental difference influence factor; monitoring interference targets to track different reflection characteristics to correspondingly obtain different environmental difference influence factors; an environmental difference influence factor corresponds to a monitoring interference target object to track different reflection characteristics; each environmental difference influencing factor has a different influencing factor value; solving the error of each influence factor value in the environment difference influence factors aiming at each environment difference influence factor, collecting error values, and correspondingly collecting different error values by different influence factor values; summing the different error values to obtain a summed error value; calculating an arithmetic covariance of the sum error value, and taking the arithmetic covariance as an environment standard deviation; each environmental difference influence factor corresponds to one environmental standard deviation, and different environmental difference influence factors correspond to different environmental standard deviations;
And when the environmental standard deviation exceeds the fluctuation range of the environmental difference, judging that the noise shielding radar reflection waveform data has shielding area in the daytime.
7. The method for tracking an object with enhanced air-ground multi-view information according to claim 3, wherein the obtaining echo reflection variance by using the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb, the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy, a daytime shielding area and a night shielding area comprises:
fusing the daytime shielding area and the night shielding area to obtain a fused shielding area; the fusion shielding area represents an area containing a daytime target tracking area and a night target tracking area;
calibrating a scattering value outside the fused shielding area in the daytime unmanned aerial vehicle monitoring radar reflection waveform data Dzb as a reference value to obtain daytime background object tracking radar reflection waveform data;
calibrating a scattering value outside the fusion shielding area in the night unmanned aerial vehicle monitoring radar reflection waveform data Dzy to be a reference value to obtain night background object tracking radar reflection waveform data;
converting the daytime background object tracking radar reflection waveform data into video signals to obtain daytime video background object tracking radar reflection waveform data;
Converting the night background object tracking radar reflection waveform data into video signals to obtain night video background object tracking radar reflection waveform data;
collecting radio frequency difference radar reflection waveform data; the radio frequency difference radar reflection waveform data are radar reflection waveform data formed by absolute values of tracking variances of different background targets; the background object tracking variance is obtained by subtracting the night video background object tracking radar reflection waveform data from the daytime video background object tracking radar reflection waveform data median value;
converting the radio frequency difference radar reflected waveform data into scattering, and filtering noise to obtain noise difference radar reflected waveform data;
and classifying noise values in the noise difference radar reflection waveform data to obtain echo reflection variances.
8. The method for target tracking with enhanced air-ground multi-view information according to claim 6, wherein the processing multi-source heterogeneous and low-quality information by using the unmanned aerial vehicle monitoring data Gn, target tracking information, scale, occlusion time and location and by using a target tracking sparse bayesian feature model comprises: inputting the unmanned aerial vehicle monitoring data Gn and the target tracking information into a daytime target tracking sparse Bayesian characteristic model to obtain a daytime processing result; different daytime processing results are correspondingly acquired by different target object tracking information;
When the target object tracking information is in the marked scale difference and the shielding problem, storing the daytime scale, the shielding time and the place at the cloud end, moving the target object tracking information forward, and repeatedly judging the situation of the marked scale difference and the shielding problem until the target object tracking information is in the marked scale difference and the shielding problem again;
when the target object tracking information is the problem of different mark scales and shielding, storing the night scale, shielding time and place at the cloud end, inputting the unmanned aerial vehicle monitoring data Gn and different target object tracking information into a target object tracking sparse Bayesian characteristic model, and obtaining different night processing results;
inputting the unmanned aerial vehicle monitoring data Gn, target tracking information, daytime scale, shielding time and place, night scale, shielding time and place into a night target tracking sparse Bayesian characteristic model to obtain overall multi-source heterogeneous and low-quality information;
and carrying out extended Kalman filtering processing on the local multi-source heterogeneous, low-quality information and the overall multi-source heterogeneous, low-quality information.
9. The method for target tracking with enhanced air-ground multi-view information according to claim 8, wherein the step of inputting the unmanned aerial vehicle monitoring data Gn, target tracking information, day scale, shielding time and place, and night scale, shielding time and place into a night target tracking sparse bayesian characteristic model to obtain overall multi-source heterogeneous and low-quality information comprises the steps of:
Subtracting the daytime scale, the shielding time and the place from the night scale, the shielding time and the place to obtain monitoring time; the monitoring time is the time between two scales, the shielding time and the place point;
adding the different target tracking radii Dzu to obtain a total target tracking radius Dku;

adding the different radar reflection areas Sx to obtain a total radar reflection area Sxj;

and inputting the unmanned aerial vehicle monitoring data Gn, the total target tracking radius Dku, the total radar reflection area Sxj and the monitoring time into the target object tracking sparse Bayesian characteristic model to obtain the overall multi-source heterogeneous and low-quality information.
CN202311051377.5A 2023-08-21 2023-08-21 Target tracking method for air-ground multi-view information enhancement Active CN117269951B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311051377.5A CN117269951B (en) 2023-08-21 2023-08-21 Target tracking method for air-ground multi-view information enhancement

Publications (2)

Publication Number Publication Date
CN117269951A true CN117269951A (en) 2023-12-22
CN117269951B CN117269951B (en) 2024-03-26

Family

ID=89211246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311051377.5A Active CN117269951B (en) 2023-08-21 2023-08-21 Target tracking method for air-ground multi-view information enhancement

Country Status (1)

Country Link
CN (1) CN117269951B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120308124A1 (en) * 2011-06-02 2012-12-06 Kriegman-Belhumeur Vision Technologies, Llc Method and System For Localizing Parts of an Object in an Image For Computer Vision Applications
US20150198711A1 (en) * 2014-01-16 2015-07-16 GM Global Technology Operations LLC Object fusion system of multiple radar imaging sensors
CA3025355A1 (en) * 2016-05-27 2017-11-30 Rhombus Systems Group, Inc. Radar system to track low flying unmanned aerial vehicles and objects
CN108665479A (en) * 2017-06-08 2018-10-16 西安电子科技大学 Infrared object tracking method based on compression domain Analysis On Multi-scale Features TLD
CN107480704A (en) * 2017-07-24 2017-12-15 南开大学 It is a kind of that there is the real-time vision method for tracking target for blocking perception mechanism
CN109085571A (en) * 2018-08-20 2018-12-25 中国人民解放军海军航空大学 Hypersonic method for tracking target based on triple bayesian criterions
WO2022036733A1 (en) * 2020-08-20 2022-02-24 南京航空航天大学 Low interception-oriented networking radar dwell time and radiation power joint optimization method
CN114299417A (en) * 2021-12-09 2022-04-08 连云港杰瑞电子有限公司 Multi-target tracking method based on radar-vision fusion
CN116266360A (en) * 2021-12-16 2023-06-20 长安大学 Vehicle target detection tracking method based on multi-source information fusion

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ALEKSANDAR BOJCHEVSKI ET AL.: "Bayesian Robust Attributed Graph Clustering: Joint Learning of Partial Anomalies and Group Structure", THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, vol. 32, no. 1, 29 April 2018 (2018-04-29) *
MAO HUAN ET AL.: "Multi-object tracking method combining re-identification features and motion prediction", Radio Communications Technology, vol. 49, no. 4, 30 April 2023 (2023-04-30) *
WANG YANCHUAN; HUANG HAI; LI SHAOMEI; GAO CHAO: "Correlation filter tracking based on online detection and scale adaptation", Acta Optica Sinica, no. 02, 17 October 2017 (2017-10-17) *
YAN XIAOWEN; XIE JIETENG: "Visual tracking based on the Bayesian method", Internet of Things Technologies, no. 04, 20 April 2015 (2015-04-20) *

Also Published As

Publication number Publication date
CN117269951B (en) 2024-03-26

Similar Documents

Publication Publication Date Title
CN111553859B (en) Laser radar point cloud reflection intensity completion method and system
CN109635685B (en) Target object 3D detection method, device, medium and equipment
JP2021523443A (en) Association of lidar data and image data
CN113866742B (en) Method for point cloud processing and target classification of 4D millimeter wave radar
CN111027401A (en) End-to-end target detection method with integration of camera and laser radar
CN114299417A (en) Multi-target tracking method based on radar-vision fusion
Hinz Detection and counting of cars in aerial images
CN108345823B (en) Obstacle tracking method and device based on Kalman filtering
CN114022830A (en) Target determination method and target determination device
CN111340855A (en) Road moving target detection method based on track prediction
WO2020250020A9 (en) Lidar and radar based tracking and mapping system and method thereof
CN114495064A (en) Monocular depth estimation-based vehicle surrounding obstacle early warning method
US11281916B2 (en) Method of tracking objects in a scene
Yaghoobi Ershadi et al. Vehicle tracking and counting system in dusty weather with vibrating camera conditions
CN115187941A (en) Target detection positioning method, system, equipment and storage medium
CN113253269B (en) SAR self-focusing method based on image classification
JP7418476B2 (en) Method and apparatus for determining operable area information
CN117269951B (en) Target tracking method for air-ground multi-view information enhancement
CN116978009A (en) Dynamic object filtering method based on 4D millimeter wave radar
CN117029840A (en) Mobile vehicle positioning method and system
CN117148315B (en) Unmanned automobile operation detection method and system
US11592565B2 (en) Flexible multi-channel fusion perception
US20230025579A1 (en) High-definition mapping
EP4099211A1 (en) Method and device for training a machine learning algorithm
Wu et al. The design and implementation of real-time automatic vehicle detection and counting system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant