CN103679128A - Anti-cloud-interference airplane target detection method - Google Patents
- Publication number: CN103679128A (application CN201210359066.0A; granted as CN103679128B)
- Authority: CN (China)
- Prior art keywords: target, candidate target, image, candidate, frame
- Prior art date: 2012-09-24
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications (landscape)
- Image Analysis (AREA)
Abstract
The invention belongs to the field of image processing and relates to an electro-optical method for detecting airborne targets, in particular an anti-cloud-interference aircraft target detection method for use during aircraft landing. The concrete steps are as follows: (1) filter the image; (2) compute the mean and standard deviation; (3) compute the optimal segmentation threshold; (4) segment the image; (5) determine the candidate target regions; (6) compute the candidate targets' length-width ratios; (7) screen the candidates by the length-width ratio; (8) compute and store the center coordinates of all candidates; (9) determine the candidates that persist across frames; (10) determine the first-level candidate targets; (11) compute the change of each first-level candidate's horizontal coordinate over 5 frames; (12) screen the first-level candidates by that change; (13) compute the candidates' tracks; (14) further obtain the tracked targets' positions; (15) compute the second-level candidates' position-coordinate estimates; (16) compute the distances between the estimated and actual positions in the 7th and 9th frames; (17) determine the target; and (18) finally determine the target.
Description
Technical field
The invention belongs to the field of image processing, specifically to electro-optical detection of airborne targets, and relates to an anti-cloud-interference method for detecting aircraft targets during landing.
Background technology
One of the main functions of an airport low-altitude three-dimensional prevention and control system is monitoring the aircraft landing process. Landing is the phase of flight with the highest accident probability, and complete monitoring of it helps capture information such as the aircraft's flight attitude, providing important evidence for accident investigation and the elimination of potential hazards. In the airport three-dimensional prevention and control system, an infrared detector images the low-altitude region of the airport; when a landing aircraft appears in the field of view, the target is acquired by its features, its position in the image is determined, and a tracking process is started that follows the target's motion until it lands safely on the runway.
Acquisition of the aircraft target is critical to tracking during landing. When the aircraft has just entered the infrared detector's field of view, it occupies only a small area in the image and is easily drowned by the detector's noise, and the weather over the airport's low-altitude region also directly affects the acquisition process. In particular, when the sky is cloudy, cloud clusters of similar shape are easily mis-detected as the aircraft during acquisition; effective identification of the aircraft target against a heavily clouded background is therefore the crux of the problem.
Because cloud layers and aircraft targets differ markedly in apparent speed, cloud noise can be rejected effectively on the basis of the target's motion features, allowing correct identification of the aircraft. To meet the airport three-dimensional video surveillance system's need to acquire the aircraft during landing, a method combining the target's geometric features and motion features is urgently needed: analysis of the geometric features effectively rejects target regions whose shape clearly does not match an aircraft signature.
Summary of the invention
The technical problem the present invention solves is to provide a method combining the target's geometric features and motion features that extracts small airborne targets from the infrared surveillance image by image processing, performs target recognition based on the targets' motion features, rejects cloud noise, and obtains the real aircraft target.
To achieve this, the technical scheme adopted by the present invention is:
An anti-cloud-interference aircraft target detection method, applied in the aircraft target detection system of an airport low-altitude three-dimensional prevention and control system during aircraft landing. According to the kinetic characteristics of the landing aircraft in the infrared image, the motion features of low-altitude targets are analyzed, cloud noise is removed, and the landing aircraft is determined among the low-altitude targets to be confirmed. The concrete steps are as follows:
(1) Filtering:
In the low-altitude landing-pattern surveillance infrared image, determine the low-altitude region of the image; this region is determined by the camera's mounting position and field of view.
In the low-altitude region where the aircraft target appears, perform Top-Hat morphological filtering first in the horizontal direction and then in the vertical direction; the filter-element lengths are determined by the target size in the actual image.
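The two-pass filtering above can be sketched in a few lines. This is a minimal numpy/scipy illustration rather than the patent's DSP implementation, and the default filter lengths 27 and 21 are the values given later in the specific embodiment.

```python
import numpy as np
from scipy.ndimage import white_tophat

def suppress_background(img, h_len=27, v_len=21):
    """Top-Hat filter the low-altitude region: horizontal pass, then vertical.

    White top-hat = image - morphological opening; it removes structures
    wider than the structuring element and keeps small bright targets."""
    out = white_tophat(img, size=(1, h_len))   # horizontal pass (1 x h_len element)
    out = white_tophat(out, size=(v_len, 1))   # vertical pass (v_len x 1 element)
    return out
```

On a flat background the top-hat response is zero, so only compact bright spots (potential targets) survive the two passes.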
(2) Compute the mean and standard deviation:
Compute the mean μ and the standard deviation σ of the Top-Hat-filtered image region.
Let the filtered image I_f have r rows and c columns. The mean μ is expressed as:
μ = (1 / (r·c)) · Σ_{i=1..r} Σ_{j=1..c} I_f(i, j)
and the standard deviation σ is expressed as:
σ = sqrt( (1 / (r·c)) · Σ_{i=1..r} Σ_{j=1..c} (I_f(i, j) − μ)² )
where I_f(i, j) denotes the pixel value at row i, column j of image I_f.
(3) Compute the optimal segmentation threshold:
Over the pixels of the image whose values exceed μ + σ, compute the optimal segmentation threshold Th with the maximum between-class variance (Otsu) criterion.
(4) Segment the image:
Segment the filtered image I_f with the threshold Th to form the segmented image I_s.
The segmentation rule is:
I_s(i, j) = 1 if I_f(i, j) ≥ Th, and I_s(i, j) = 0 otherwise,
where I_f(i, j) and I_s(i, j) denote the pixel values at row i, column j of images I_f and I_s, respectively.
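Steps (2)–(4) can be combined into one sketch: compute μ and σ, run the maximum between-class variance (Otsu) search only over pixels brighter than μ + σ, and binarize. This is an illustrative numpy version; the 64-bin histogram is an assumption, since the patent does not fix the histogram granularity.

```python
import numpy as np

def segment(i_f):
    """Steps (2)-(4): threshold the filtered image I_f into a binary I_s."""
    mu, sigma = i_f.mean(), i_f.std()          # step (2)
    bright = i_f[i_f > mu + sigma]             # step (3): restrict to bright pixels
    hist, edges = np.histogram(bright, bins=64)
    centers = (edges[:-1] + edges[1:]) / 2.0
    total = hist.sum()
    grand = (hist * centers).sum()
    best_var, th = -1.0, centers[0]
    w0 = sum0 = 0.0
    for k in range(64):                        # Otsu: maximize between-class variance
        w0 += hist[k]
        sum0 += hist[k] * centers[k]
        if w0 == 0 or w0 == total:
            continue
        m0, m1 = sum0 / w0, (grand - sum0) / (total - w0)
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, th = var, centers[k]
    return (i_f >= th).astype(np.uint8)        # step (4): segmented image I_s
```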
(5) Determine the candidate target regions:
Count the pixels of each connected region of 1-valued pixels in the segmented image and remove isolated points consisting of a single pixel; sort the remaining connected regions by pixel count in descending order and select the 5 largest regions as candidate targets.
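A sketch of this connected-region selection, using scipy labeling; 8-connectivity is assumed here, since the patent does not state which connectivity is used.

```python
import numpy as np
from scipy.ndimage import label

def candidate_regions(i_s, keep=5):
    """Label connected regions of 1-pixels, drop single-pixel isolated points,
    and keep the `keep` largest regions (as arrays of (row, col) coordinates)."""
    lab, n = label(i_s, structure=np.ones((3, 3), dtype=int))  # 8-connected
    sizes = np.bincount(lab.ravel())[1:]       # pixel count per labeled region
    order = np.argsort(sizes)[::-1]            # largest first
    ids = [i + 1 for i in order if sizes[i] > 1]
    return [np.argwhere(lab == i) for i in ids[:keep]]
```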
(6) Compute the candidate targets' length-width ratios:
The length parameter lgh is the horizontal pixel count of the target region's bounding rectangle, and the width parameter wdh is its vertical pixel count; the target's length-width ratio lw is defined as:
lw = lgh / wdh
(7) Screen the candidate targets by the length-width ratio:
Reject from the candidate target regions those whose length-width ratio lw is less than 1 or greater than 10.
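Steps (6)–(7) reduce to a bounding-rectangle measurement and a range test. A sketch operating on pixel-coordinate arrays such as those produced by the region-labeling step (the +1 counts the pixel extents inclusively):

```python
import numpy as np

def screen_by_aspect_ratio(regions, lo=1.0, hi=10.0):
    """Keep regions whose bounding-rectangle length-width ratio lw = lgh / wdh
    lies in [lo, hi]; each region is an array of (row, col) pixel coordinates."""
    kept = []
    for pix in regions:
        lgh = pix[:, 1].max() - pix[:, 1].min() + 1   # horizontal extent
        wdh = pix[:, 0].max() - pix[:, 0].min() + 1   # vertical extent
        lw = lgh / wdh
        if lo <= lw <= hi:
            kept.append(pix)
    return kept
```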
(8) Compute and store the center coordinates of all candidate targets:
A candidate target's width is the difference between the abscissas of the rightmost and leftmost pixels of its region, its height is the difference between the ordinates of the topmost and bottommost pixels, and its area is the number of pixels the region covers.
Let the p pixels covered by a candidate target have coordinates (x_1, y_1), (x_2, y_2), …, (x_p, y_p); the candidate target's center coordinates (cx, cy) are then:
cx = (1/p) · Σ_{n=1..p} x_n,  cy = (1/p) · Σ_{n=1..p} y_n
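The center of step (8) is just the mean of the covered pixel coordinates. A one-function sketch; here pixels are (row, col) pairs, so the column mean gives cx and the row mean gives cy:

```python
import numpy as np

def center(pix):
    """Center (cx, cy) of a candidate target from its covered pixels."""
    cy, cx = pix.mean(axis=0)   # mean row -> cy, mean column -> cx
    return cx, cy
```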
(9) Determine the candidate targets that persist across frames:
After repeating steps (1)–(8) for each image frame, record the center coordinates of every frame's candidate targets. For a candidate target T1 whose center coordinates in frame k are (cx_k, cy_k), find the candidate target in frame k−1 whose center coordinates (cx_{k−1}, cy_{k−1}) are nearest in Euclidean distance to the center of T1. When that distance
sqrt( (cx_k − cx_{k−1})² + (cy_k − cy_{k−1})² )
also satisfies the association constraint, the centers of the two candidates in consecutive frames are judged nearest, and the target with center (cx_{k−1}, cy_{k−1}) in frame k−1 is confirmed to be the same target as T1.
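The frame-to-frame association of step (9) is a nearest-neighbor match with a distance gate. A sketch; the gate value `max_dist` is an assumption, since the patent leaves the association threshold to the system:

```python
import math

def associate(prev, curr, max_dist=8.0):
    """Match each current-frame candidate center to the nearest previous-frame
    center; the pair counts as the same target only when that nearest distance
    also passes the distance gate. prev, curr: lists of (cx, cy) centers.
    Returns {current_index: previous_index}."""
    matches = {}
    for ci, (cx, cy) in enumerate(curr):
        best, best_d = None, float("inf")
        for pi, (px, py) in enumerate(prev):
            d = math.hypot(cx - px, cy - py)   # Euclidean center distance
            if d < best_d:
                best, best_d = pi, d
        if best is not None and best_d <= max_dist:
            matches[ci] = best
    return matches
```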
(10) Determine the first-level candidate targets:
If a target exists in 5 consecutive frames, confirm it as a first-level candidate target and record its position information in those 5 frames. For the candidate target numbered i, the position information is its abscissa cx and ordinate cy in the image; its positions in the 5 consecutive frames are denoted (cx_k^i, cy_k^i), k = 1, …, 5.
(11) Compute the change of each first-level candidate's horizontal coordinate over the 5 frames:
For the candidate target numbered i, the horizontal-coordinate-change parameter hc is the absolute value of the difference between the abscissas of its 1st-frame and 5th-frame positions:
hc = | cx_1^i − cx_5^i |
(12) Screen the first-level candidates by the horizontal-coordinate change to obtain the second-level candidates:
Remove from the candidates those whose horizontal-coordinate change hc is less than the system threshold T_hc; the remaining targets are the second-level candidate targets.
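Steps (11)–(12) as a sketch; the default T_hc = 10 is the value given later in the specific embodiment:

```python
def screen_by_lateral_motion(tracks, t_hc=10.0):
    """tracks: {target_id: [(cx, cy), ...]} with 5 consecutive positions.
    Keep targets whose horizontal displacement over the 5 frames,
    hc = |cx_1 - cx_5|, reaches the threshold T_hc."""
    out = {}
    for tid, pos in tracks.items():
        hc = abs(pos[0][0] - pos[4][0])   # |cx_1 - cx_5|
        if hc >= t_hc:                    # targets with hc < T_hc are removed
            out[tid] = pos
    return out
```

Static cloud clusters barely move horizontally between frames, so they fail this test while the landing aircraft passes it.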
(13) Compute the candidate targets' tracks:
For the second-level candidate tracking target numbered i, the horizontal and vertical position-parameter equations are cx = a_0 + a_1·t and cy = b_0 + b_1·t, where the parameter t is the image-frame number of the target region.
Using straight-line least-squares fitting with the position information of the 1st, 3rd and 5th frames, fit the position-parameter equations to obtain the computed values a_0^i, a_1^i, b_0^i, b_1^i of a_0, a_1, b_0, b_1.
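The line fit of step (13) can be written with numpy's least-squares solver; frames 1, 3 and 5 give three equations for the two unknowns of each axis:

```python
import numpy as np

def fit_track(pos5):
    """Fit cx = a0 + a1*t and cy = b0 + b1*t by least squares using the
    positions of frames 1, 3 and 5 (pos5: list of (cx, cy) for frames 1..5)."""
    t = np.array([1.0, 3.0, 5.0])
    A = np.vstack([np.ones(3), t]).T                  # design matrix [1, t]
    cx = np.array([pos5[k - 1][0] for k in (1, 3, 5)])
    cy = np.array([pos5[k - 1][1] for k in (1, 3, 5)])
    (a0, a1), _, _, _ = np.linalg.lstsq(A, cx, rcond=None)
    (b0, b1), _, _, _ = np.linalg.lstsq(A, cy, rcond=None)
    return a0, a1, b0, b1
```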
(14) Further obtain the tracked targets' positions:
In the subsequent 4 frames, obtain the target positions of the second-level candidate target regions by the method described in step (9), giving the position coordinates of target i in frames 6–9; for the second-level candidate numbered i, the positions in frames 6–9 are denoted (cx_k^i, cy_k^i), k = 6, …, 9.
(15) Compute the second-level candidates' position-coordinate estimates:
Using the parameters of the position-parameter equations obtained in step (13), compute each second-level candidate's position-coordinate estimates in the 7th and 9th frames.
For the second-level candidate numbered i, the estimates in the 7th and 9th frames are:
(a_0^i + 7·a_1^i, b_0^i + 7·b_1^i) and (a_0^i + 9·a_1^i, b_0^i + 9·b_1^i)
(16) Compute the distances between the estimated and actual position coordinates in the 7th and 9th frames:
For the second-level candidate numbered i, these distances are denoted cd_7^i and cd_9^i: the Euclidean distances between the estimated and measured positions in the 7th and 9th frames, respectively.
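Steps (15)–(16) evaluate the fitted line at t = 7 and t = 9 and measure the Euclidean distance to the measured positions:

```python
import math

def track_errors(a0, a1, b0, b1, pos7, pos9):
    """cd_7 and cd_9: distances between the positions predicted by the fitted
    line at t = 7 and t = 9 and the actually measured positions pos7, pos9."""
    cd7 = math.hypot(a0 + 7 * a1 - pos7[0], b0 + 7 * b1 - pos7[1])
    cd9 = math.hypot(a0 + 9 * a1 - pos9[0], b0 + 9 * b1 - pos9[1])
    return cd7, cd9
```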
(17) Determine the target:
Reject from the second-level candidates those for which either cd_7^i or cd_9^i is greater than the system threshold T_cd; if both cd_7^i and cd_9^i are less than the threshold T_cd, target i can be identified as the landing aircraft target.
(18) Finally determine the target:
When step (17) confirms more than one aircraft target, select the target with the smallest sum cd_7 + cd_9 and confirm it as the landing aircraft target.
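The final decision of steps (17)–(18), a threshold test followed by a tie-break on the smallest error sum, in sketch form; the default T_cd = 10 is the value given later in the specific embodiment:

```python
def confirm_aircraft(cands, t_cd=10.0):
    """Steps (17)-(18): reject candidates with cd_7 or cd_9 above T_cd; if
    several survive, pick the one with the smallest cd_7 + cd_9.
    cands: {target_id: (cd7, cd9)}. Returns the aircraft id, or None."""
    ok = {i: cd7 + cd9 for i, (cd7, cd9) in cands.items()
          if cd7 < t_cd and cd9 < t_cd}
    if not ok:
        return None
    return min(ok, key=ok.get)
```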
Further, in the anti-cloud-interference aircraft target detection method described above, in step (12) the system threshold T_hc is taken as 10.
Further, in the anti-cloud-interference aircraft target detection method described above, in step (17) the system threshold T_cd is taken as 10.
Further, in the anti-cloud-interference aircraft target detection method described above, in step (1) the camera image resolution is 640 × 256, and the region from row 65 to row 192 in the middle half of the image is chosen as the low-altitude region.
Further, in the anti-cloud-interference aircraft target detection method described above, in step (1), when performing the Top-Hat morphological filtering, the filter element is 27 pixels horizontally and 21 pixels vertically.
During landing the aircraft is easily disturbed by cloud layers in the sky; in particular, when the sky is cloudy, cloud clusters of similar shape are very easily mis-detected as the aircraft during acquisition. The technical solution of the present invention identifies the landing aircraft by combining the target's geometric features and motion features, ensuring that during landing the electro-optical tracking system can stably obtain the aircraft's position in consecutive video frames even in the presence of heavy cloud noise.
Brief description of the drawing
Fig. 1 is the flow chart of the method of the technical solution of the present invention.
Embodiment
The technical solution of the present invention is further elaborated below with reference to the accompanying drawing.
In the airport three-dimensional prevention and control system, a method combining the target's geometric features and motion features is adopted to identify the aircraft target during landing. First, the background is suppressed by Top-Hat filtering; threshold segmentation of the filtered image then yields the potential target regions. Computing the geometric features of the potential regions removes spurious regions; the target's motion features and trajectory are then used to identify the real aircraft target.
The geometric features of a target include the height, width and aspect ratio of the target region; testing them rejects target regions whose shape clearly does not match an aircraft signature. The same operations are repeated over the subsequent frames, and the region information from the multiple frames is associated. Because cloud layers and aircraft differ markedly in apparent speed — clouds are static or nearly static in the image while the aircraft is in motion — the displacement of each target across successive frames is computed; if the displacement exceeds a threshold, the target is taken as an aircraft candidate, thereby removing non-moving targets.
Because during landing the aircraft moves in a straight line within the fixed camera's scene while interference moves irregularly, the aircraft target is finally confirmed by its trajectory. For each aircraft candidate, a least-squares line is fitted to its position history over n image frames to obtain the candidate's track; the distance between the target and this straight track is computed in subsequent frames, and if the distance exceeds a threshold over several frames the target is removed. Otherwise the target is confirmed as the aircraft target.
Based on the above design, the anti-cloud-interference aircraft target detection method proposed by the present invention is implemented on an image-signal-processor hardware platform built around the TMS320C6416 DSP produced by TI. The software is implemented in a mixture of TMS320C6416 C and assembly language, compiled and burned onto the image-signal-processor hardware, and loaded and run automatically by the DSP at power-up.
As shown in Fig. 1, the technical solution of the present invention is specifically: an anti-cloud-interference aircraft target detection method for the aircraft landing process, applied in the aircraft target detection system of an airport low-altitude three-dimensional prevention and control system. According to the kinetic characteristics of the landing aircraft in the infrared image, the motion features of low-altitude targets are analyzed, cloud noise is removed, and the landing aircraft is determined among the low-altitude targets to be confirmed. The concrete steps are as follows:
(1) Filtering:
In the low-altitude landing-pattern surveillance infrared image, determine the low-altitude region of the image; this region is determined by the camera's mounting position and field of view.
In the low-altitude region where the aircraft target appears, perform Top-Hat morphological filtering first in the horizontal direction and then in the vertical direction; the filter-element lengths are determined by the target size in the actual image.
In this specific embodiment, the camera image resolution is 640 × 256, and the region from row 65 to row 192 in the middle half of the image is chosen as the low-altitude region; the filter element is chosen as 27 pixels horizontally and 21 pixels vertically.
(2) Compute the mean and standard deviation:
Compute the mean μ and the standard deviation σ of the Top-Hat-filtered image region.
Let the filtered image I_f have r rows and c columns. The mean μ is expressed as:
μ = (1 / (r·c)) · Σ_{i=1..r} Σ_{j=1..c} I_f(i, j)
and the standard deviation σ is expressed as:
σ = sqrt( (1 / (r·c)) · Σ_{i=1..r} Σ_{j=1..c} (I_f(i, j) − μ)² )
where I_f(i, j) denotes the pixel value at row i, column j of image I_f.
(3) Compute the optimal segmentation threshold:
Over the pixels of the image whose values exceed μ + σ, compute the optimal segmentation threshold Th with the maximum between-class variance (Otsu) criterion.
(4) Segment the image:
Segment the filtered image I_f with the threshold Th to form the segmented image I_s.
The segmentation rule is:
I_s(i, j) = 1 if I_f(i, j) ≥ Th, and I_s(i, j) = 0 otherwise,
where I_f(i, j) and I_s(i, j) denote the pixel values at row i, column j of images I_f and I_s, respectively.
(5) Determine the candidate target regions:
Count the pixels of each connected region of 1-valued pixels in the segmented image and remove isolated points consisting of a single pixel; sort the remaining connected regions by pixel count in descending order and select the 5 largest regions as candidate targets.
(6) Compute the candidate targets' length-width ratios:
The length parameter lgh is the horizontal pixel count of the target region's bounding rectangle, and the width parameter wdh is its vertical pixel count; the target's length-width ratio lw is defined as:
lw = lgh / wdh
(7) Screen the candidate targets by the length-width ratio:
Reject from the candidate target regions those whose length-width ratio lw is less than 1 or greater than 10.
(8) Compute and store the center coordinates of all candidate targets:
A candidate target's width is the difference between the abscissas of the rightmost and leftmost pixels of its region, its height is the difference between the ordinates of the topmost and bottommost pixels, and its area is the number of pixels the region covers.
Let the p pixels covered by a candidate target have coordinates (x_1, y_1), (x_2, y_2), …, (x_p, y_p); the candidate target's center coordinates (cx, cy) are then:
cx = (1/p) · Σ_{n=1..p} x_n,  cy = (1/p) · Σ_{n=1..p} y_n
(9) Determine the candidate targets that persist across frames:
After repeating steps (1)–(8) for each image frame, record the center coordinates of every frame's candidate targets. For a candidate target T1 whose center coordinates in frame k are (cx_k, cy_k), find the candidate target in frame k−1 whose center coordinates (cx_{k−1}, cy_{k−1}) are nearest in Euclidean distance to the center of T1. When that distance
sqrt( (cx_k − cx_{k−1})² + (cy_k − cy_{k−1})² )
also satisfies the association constraint, the centers of the two candidates in consecutive frames are judged nearest, and the target with center (cx_{k−1}, cy_{k−1}) in frame k−1 is confirmed to be the same target as T1.
(10) Determine the first-level candidate targets:
If a target exists in 5 consecutive frames, confirm it as a first-level candidate target and record its position information in those 5 frames. For the candidate target numbered i, the position information is its abscissa cx and ordinate cy in the image; its positions in the 5 consecutive frames are denoted (cx_k^i, cy_k^i), k = 1, …, 5.
(11) Compute the change of each first-level candidate's horizontal coordinate over the 5 frames:
For the candidate target numbered i, the horizontal-coordinate-change parameter hc is the absolute value of the difference between the abscissas of its 1st-frame and 5th-frame positions:
hc = | cx_1^i − cx_5^i |
(12) Screen the first-level candidates by the horizontal-coordinate change to obtain the second-level candidates:
Remove from the candidates those whose horizontal-coordinate change hc is less than the system threshold T_hc; the remaining targets are the second-level candidate targets.
In this specific embodiment, the system threshold T_hc is taken as 10.
(13) Compute the candidate targets' tracks:
For the second-level candidate tracking target numbered i, the horizontal and vertical position-parameter equations are cx = a_0 + a_1·t and cy = b_0 + b_1·t, where the parameter t is the image-frame number of the target region.
Using straight-line least-squares fitting with the position information of the 1st, 3rd and 5th frames, fit the position-parameter equations to obtain the computed values a_0^i, a_1^i, b_0^i, b_1^i of a_0, a_1, b_0, b_1.
(14) Further obtain the tracked targets' positions:
In the subsequent 4 frames, obtain the target positions of the second-level candidate target regions by the method described in step (9), giving the position coordinates of target i in frames 6–9; for the second-level candidate numbered i, the positions in frames 6–9 are denoted (cx_k^i, cy_k^i), k = 6, …, 9.
(15) Compute the second-level candidates' position-coordinate estimates:
Using the parameters of the position-parameter equations obtained in step (13), compute each second-level candidate's position-coordinate estimates in the 7th and 9th frames.
For the second-level candidate numbered i, the estimates in the 7th and 9th frames are:
(a_0^i + 7·a_1^i, b_0^i + 7·b_1^i) and (a_0^i + 9·a_1^i, b_0^i + 9·b_1^i)
(16) Compute the distances between the estimated and actual position coordinates in the 7th and 9th frames:
For the second-level candidate numbered i, these distances are denoted cd_7^i and cd_9^i: the Euclidean distances between the estimated and measured positions in the 7th and 9th frames, respectively.
(17) Determine the target:
Reject from the second-level candidates those for which either cd_7^i or cd_9^i is greater than the system threshold T_cd; if both cd_7^i and cd_9^i are less than the threshold T_cd, target i can be identified as the landing aircraft target.
In this specific embodiment, the system threshold T_cd is taken as 10.
(18) Finally determine the target:
When step (17) confirms more than one aircraft target, select the target with the smallest sum cd_7 + cd_9 and confirm it as the landing aircraft target.
Claims (5)
1. An anti-cloud-interference aircraft target detection method, applied in the aircraft target detection system of an airport low-altitude three-dimensional prevention and control system during aircraft landing, characterized in that:
according to the kinetic characteristics of the landing aircraft in the infrared image, the motion features of low-altitude targets are analyzed, cloud noise is removed, and the landing aircraft is determined among the low-altitude targets to be confirmed; the concrete steps are as follows:
(1) Filtering:
In the low-altitude landing-pattern surveillance infrared image, determine the low-altitude region of the image; this region is determined by the camera's mounting position and field of view.
In the low-altitude region where the aircraft target appears, perform Top-Hat morphological filtering first in the horizontal direction and then in the vertical direction; the filter-element lengths are determined by the target size in the actual image.
(2) Compute the mean and standard deviation:
Compute the mean μ and the standard deviation σ of the Top-Hat-filtered image region.
Let the filtered image I_f have r rows and c columns. The mean μ is expressed as:
μ = (1 / (r·c)) · Σ_{i=1..r} Σ_{j=1..c} I_f(i, j)
and the standard deviation σ is expressed as:
σ = sqrt( (1 / (r·c)) · Σ_{i=1..r} Σ_{j=1..c} (I_f(i, j) − μ)² )
where I_f(i, j) denotes the pixel value at row i, column j of image I_f.
(3) Compute the optimal segmentation threshold:
Over the pixels of the image whose values exceed μ + σ, compute the optimal segmentation threshold Th with the maximum between-class variance (Otsu) criterion.
(4) Segment the image:
Segment the filtered image I_f with the threshold Th to form the segmented image I_s.
The segmentation rule is:
I_s(i, j) = 1 if I_f(i, j) ≥ Th, and I_s(i, j) = 0 otherwise,
where I_f(i, j) and I_s(i, j) denote the pixel values at row i, column j of images I_f and I_s, respectively.
(5) Determine the candidate target regions:
Count the pixels of each connected region of 1-valued pixels in the segmented image and remove isolated points consisting of a single pixel; sort the remaining connected regions by pixel count in descending order and select the 5 largest regions as candidate targets.
(6) Compute the candidate targets' length-width ratios:
The length parameter lgh is the horizontal pixel count of the target region's bounding rectangle, and the width parameter wdh is its vertical pixel count; the target's length-width ratio lw is defined as:
lw = lgh / wdh
(7) Screen the candidate targets by the length-width ratio:
Reject from the candidate target regions those whose length-width ratio lw is less than 1 or greater than 10.
(8) Compute and store the center coordinates of all candidate targets:
A candidate target's width is the difference between the abscissas of the rightmost and leftmost pixels of its region, its height is the difference between the ordinates of the topmost and bottommost pixels, and its area is the number of pixels the region covers.
Let the p pixels covered by a candidate target have coordinates (x_1, y_1), (x_2, y_2), …, (x_p, y_p); the candidate target's center coordinates (cx, cy) are then:
cx = (1/p) · Σ_{n=1..p} x_n,  cy = (1/p) · Σ_{n=1..p} y_n
(9) Determine the candidate targets that persist across frames:
After repeating steps (1)–(8) for each image frame, record the center coordinates of every frame's candidate targets. For a candidate target T1 whose center coordinates in frame k are (cx_k, cy_k), find the candidate target in frame k−1 whose center coordinates (cx_{k−1}, cy_{k−1}) are nearest in Euclidean distance to the center of T1. When that distance
sqrt( (cx_k − cx_{k−1})² + (cy_k − cy_{k−1})² )
also satisfies the association constraint, the centers of the two candidates in consecutive frames are judged nearest, and the target with center (cx_{k−1}, cy_{k−1}) in frame k−1 is confirmed to be the same target as T1.
(10) Determine the first-level candidate targets:
If a target exists in 5 consecutive frames, confirm it as a first-level candidate target and record its position information in those 5 frames. For the candidate target numbered i, the position information is its abscissa cx and ordinate cy in the image; its positions in the 5 consecutive frames are denoted (cx_k^i, cy_k^i), k = 1, …, 5.
(11) Compute the change of each first-level candidate's horizontal coordinate over the 5 frames:
For the candidate target numbered i, the horizontal-coordinate-change parameter hc is the absolute value of the difference between the abscissas of its 1st-frame and 5th-frame positions:
hc = | cx_1^i − cx_5^i |
(12) Screen the first-level candidates by the horizontal-coordinate change to obtain the second-level candidates:
Remove from the candidates those whose horizontal-coordinate change hc is less than the system threshold T_hc; the remaining targets are the second-level candidate targets.
(13) Compute the candidate targets' tracks:
For the second-level candidate tracking target numbered i, the horizontal and vertical position-parameter equations are cx = a_0 + a_1·t and cy = b_0 + b_1·t, where the parameter t is the image-frame number of the target region.
Using straight-line least-squares fitting with the position information of the 1st, 3rd and 5th frames, fit the position-parameter equations to obtain the computed values a_0^i, a_1^i, b_0^i, b_1^i of a_0, a_1, b_0, b_1.
(14) Further obtain the tracked targets' positions:
In the subsequent 4 frames, obtain the target positions of the second-level candidate target regions by the method described in step (9), giving the position coordinates of target i in frames 6–9; for the second-level candidate numbered i, the positions in frames 6–9 are denoted (cx_k^i, cy_k^i), k = 6, …, 9.
(15) Compute the second-level candidates' position-coordinate estimates:
Using the parameters of the position-parameter equations obtained in step (13), compute each second-level candidate's position-coordinate estimates in the 7th and 9th frames.
For the second-level candidate numbered i, the estimates in the 7th and 9th frames are:
(a_0^i + 7·a_1^i, b_0^i + 7·b_1^i) and (a_0^i + 9·a_1^i, b_0^i + 9·b_1^i)
(16) Compute the distances between the estimated and actual position coordinates in the 7th and 9th frames:
For the second-level candidate numbered i, these distances are denoted cd_7^i and cd_9^i: the Euclidean distances between the estimated and measured positions in the 7th and 9th frames, respectively.
(17) Determine the target:
Reject from the second-level candidates those for which either cd_7^i or cd_9^i is greater than the system threshold T_cd; if both cd_7^i and cd_9^i are less than the threshold T_cd, target i can be identified as the landing aircraft target.
(18) Finally determine the target:
When step (17) confirms more than one aircraft target, select the target with the smallest sum cd_7 + cd_9 and confirm it as the landing aircraft target.
2. The anti-cloud-interference aircraft target detection method of claim 1, characterized in that in step (12) the system threshold T_hc is taken as 10.
3. The anti-cloud-interference aircraft target detection method of claim 1, characterized in that in step (17) the system threshold T_cd is taken as 10.
4. The anti-cloud-interference aircraft target detection method of claim 1, characterized in that in step (1) the camera image resolution is 640 × 256 and the region from row 65 to row 192 in the middle half of the image is chosen as the low-altitude region.
5. The anti-cloud-interference aircraft target detection method of claim 1, characterized in that in step (1), when performing the Top-Hat morphological filtering, the filter element is 27 pixels horizontally and 21 pixels vertically.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210359066.0A CN103679128B (en) | 2012-09-24 | 2012-09-24 | A kind of Aircraft Targets detection method of anti-interference of clouds |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103679128A true CN103679128A (en) | 2014-03-26 |
CN103679128B CN103679128B (en) | 2016-09-28 |
Family
ID=50316620
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210359066.0A Active CN103679128B (en) | 2012-09-24 | 2012-09-24 | Anti-cloud-interference airplane target detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103679128B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102004922A (en) * | 2010-12-01 | 2011-04-06 | 南京大学 | High-resolution remote sensing image plane extraction method based on skeleton characteristic |
CN102298698A (en) * | 2011-05-30 | 2011-12-28 | 河海大学 | Remote sensing image airplane detection method based on fusion of angle points and edge information |
2012
- 2012-09-24 CN CN201210359066.0A patent/CN103679128B/en active Active
Non-Patent Citations (2)
Title |
---|
WANG WEIHUA ET AL.: "A robust detection algorithm for small, dim infrared targets against complex cloud backgrounds", Signal Processing * |
WANG WEIHUA ET AL.: "An infrared moving small-target detection algorithm based on spatio-temporal fusion filtering", Infrared and Laser Engineering * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105738870A (en) * | 2014-12-10 | 2016-07-06 | 上海机电工程研究所 | Multi-mode filtering method |
CN105738870B (en) * | 2014-12-10 | 2018-03-30 | 上海机电工程研究所 | Multi-mode filtering method |
CN109886132A (en) * | 2019-01-25 | 2019-06-14 | 北京市遥感信息研究所 | Cloud-sea background aircraft target detection method, apparatus and system |
CN109886132B (en) * | 2019-01-25 | 2020-12-15 | 北京市遥感信息研究所 | Method, device and system for detecting aircraft targets against a cloud-sea background |
CN111767914A (en) * | 2019-04-01 | 2020-10-13 | 佳能株式会社 | Target object detection device and method, image processing system, and storage medium |
CN111222511A (en) * | 2020-04-13 | 2020-06-02 | 中山大学 | Infrared unmanned aerial vehicle target detection method and system |
CN111222511B (en) * | 2020-04-13 | 2020-07-24 | 中山大学 | Infrared unmanned aerial vehicle target detection method and system |
CN114046696A (en) * | 2021-09-24 | 2022-02-15 | 中国人民解放军空军工程大学 | Method for acquiring dynamic diffusion characteristics of foil-type infrared surface-source decoys in tests |
CN114360296A (en) * | 2021-12-15 | 2022-04-15 | 中国飞行试验研究院 | Fully automatic aircraft approach and landing monitoring method based on ground-based electro-optical equipment |
CN114360296B (en) * | 2021-12-15 | 2024-04-09 | 中国飞行试验研究院 | Fully automatic aircraft approach and landing monitoring method based on ground-based electro-optical equipment |
Also Published As
Publication number | Publication date |
---|---|
CN103679128B (en) | 2016-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111461023B (en) | Method for quadruped robot to automatically follow pilot based on three-dimensional laser radar | |
CN110532889B (en) | Track foreign matter detection method based on rotor unmanned aerial vehicle and YOLOv3 | |
CN114842438B (en) | Terrain detection method, system and readable storage medium for automatic driving automobile | |
CN103679128A (en) | Anti-cloud-interference airplane target detection method | |
CN105758397B (en) | An aircraft camera positioning method | |
CN107609522A (en) | An information-fusion vehicle detection system based on lidar and machine vision | |
CN106845364B (en) | Rapid automatic target detection method | |
CN107193011A (en) | A method for quickly calculating vehicle speed within the region of interest of an autonomous vehicle | |
CN111213155A (en) | Image processing method, device, movable platform, unmanned aerial vehicle and storage medium | |
CN107444665A (en) | An autonomous landing method for unmanned aerial vehicles | |
CN105158762A (en) | Identifying and tracking convective weather cells | |
CN101847265A (en) | Method for extracting moving objects and partitioning multiple objects used in bus passenger flow statistical system | |
CN103824070A (en) | Rapid pedestrian detection method based on computer vision | |
CN109682378A (en) | A UAV indoor positioning and multi-object tracking method based entirely on visual information | |
CN105931217A (en) | Image processing technology-based airport pavement FOD (foreign object debris) detection method | |
CN115249349A (en) | Point cloud denoising method, electronic device and storage medium | |
CN102111530A (en) | Device and method for movable object detection | |
CN114038193A (en) | Intelligent traffic flow data statistical method and system based on unmanned aerial vehicle and multi-target tracking | |
CN113763427A (en) | Multi-target tracking method based on coarse-fine shielding processing | |
CN105810023A (en) | Automatic airport undercarriage retraction and extension monitoring system and method | |
CN114926422A (en) | Method and system for detecting boarding and alighting passenger flow | |
CN105023231A (en) | Bus data acquisition method based on video recognition and cell phone GPS | |
KR101441422B1 (en) | Decision-Making Device and Method using Ontology technique to predict a collision with obstacle during take-off and landing of aircraft | |
Truong-Hong et al. | Automatic detection of road edges from aerial laser scanning data | |
Fakhfakh et al. | Weighted v-disparity approach for obstacles localization in highway environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |