CN106951823A - Automatic identification method for aircraft undercarriage in infrared images - Google Patents

Automatic identification method for aircraft undercarriage in infrared images

Info

Publication number
CN106951823A
CN106951823A (application CN201710065439.6A; granted as CN106951823B)
Authority
CN
China
Prior art keywords
directions
undercarriage
area
infrared image
pos
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710065439.6A
Other languages
Chinese (zh)
Other versions
CN106951823B (en)
Inventor
白俊奇
刘文
王寿峰
常传文
章林
陈宇寒
朱伟
刘文松
苗锋
郑浩
陈福玉
龙超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Lesi Electronic Equipment Co., Ltd.
Original Assignee
CETC 28 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 28 Research Institute filed Critical CETC 28 Research Institute
Priority to CN201710065439.6A priority Critical patent/CN106951823B/en
Publication of CN106951823A publication Critical patent/CN106951823A/en
Application granted granted Critical
Publication of CN106951823B publication Critical patent/CN106951823B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/28Indexing scheme for image data processing or generation, in general involving image processing hardware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an automatic undercarriage identification method for infrared images, comprising the following steps: Step 1, input an infrared image I; Step 2, compute the histogram His of I and calculate the foreground/background segmentation threshold Th; Step 3, binarize I with Th to obtain the binary map BiImg; Step 4, locate the positions Pos1 and Pos2 of the two aircraft engines; Step 5, compute the front and rear undercarriage distribution regions Area1, Area2 and Area3; Step 6, compute the undercarriage characteristic parameters F1, F2 and F3 within these regions; Step 7, complete undercarriage identification from F1, F2 and F3. The method achieves a high recognition rate with a small computational load and is easy to implement in real time in hardware.

Description

Automatic identification method for aircraft undercarriage in infrared images
Technical field
The present invention relates to an automatic method for identifying the state of an aircraft undercarriage in infrared images.
Background technology
Full deployment and locking of the undercarriage is a prerequisite for a safe landing. At present, although both military and civil aircraft are equipped with mechanical, light and instrument undercarriage indicators, and some aircraft also carry external cameras through which the undercarriage can be observed, ground observers are still frequently posted during landing to confirm manually that the undercarriage is deployed, so as to guarantee absolute safety. Manual observation, however, is limited by weather, time of day, illumination intensity and many other external factors; in extreme cases it may fail to provide timely and effective undercarriage status information and cannot meet the need for quick decisions before landing. Adding a fully automatic, infrared-image-based undercarriage identification capability to electro-optical landing-aid systems is therefore of great significance.
In infrared target detection and recognition, scholars at home and abroad have carried out extensive research and obtained many results. The main idea is to separate the target from the background using characteristic information of the target itself, such as edges in the image, local gray mean and variance, and motion speed. The undercarriage, however, cannot simply be handled by traditional target detection: foreground segmentation does not directly separate the undercarriage region from the overall aircraft structure for analysis. The following key problems remain to be solved: (1) accurately locating the aircraft scale and attitude under an uncertain field of view and extracting the aircraft structure; (2) adaptively segmenting, locating and extracting the undercarriage regions based on the aircraft outline; (3) automatically identifying the undercarriage state within the extracted regions; (4) most existing target recognition methods are computationally heavy and hard to implement in real time in hardware.
Summary of the invention
The purpose of the present invention is to design, for aircraft with two engines suspended under the wings, an automatic infrared-image undercarriage identification technique that is simple, widely applicable, effective, and suitable for real-time hardware implementation.
The technical solution of the invention is implemented in the following steps:
Step 1: input an infrared image I.
Step 2: compute the histogram His of I and calculate the foreground/background segmentation threshold Th.
Step 3: binarize I with Th to obtain the binary map BiImg.
Step 4: locate the positions Pos1 and Pos2 of the two aircraft engines.
Step 5: compute the front and rear undercarriage distribution regions Area1, Area2 and Area3.
Step 6: compute the undercarriage characteristic parameters F1, F2 and F3 within these regions.
Step 7: complete undercarriage identification from F1, F2 and F3.
Step 2 comprises the following sub-steps:
Step 2-1: compute the histogram His(k), k = 0, ..., 255, of the infrared image I, where k denotes the gray level.
Step 2-2: set a threshold T1 and slide a 1×N window over His from high gray levels to low. If the pixel count His(k) of a gray level inside the window satisfies His(k) > T1, that gray level is judged to be background; otherwise it is judged to be foreground. Here N is the window width, 5 ≤ N ≤ 15, and 100 ≤ T1 ≤ 150.
Step 2-3: count the number Count of gray levels in the 1×N window with His(k) > T1. If Count/N > 0.8, the k value at the window centre (position N/2) is taken as the segmentation threshold Th; otherwise the window continues to slide from high gray levels to low until the threshold is found.
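This sliding-window threshold search can be sketched in NumPy as follows. The scanning direction, the His(k) > T1 background test, the 0.8 ratio, and the window-centre convention follow the text; the toy histogram and its resulting threshold are purely illustrative.

```python
import numpy as np

def segmentation_threshold(hist, T1=120, N=10):
    """Slide a 1xN window over the histogram from high gray levels to low.

    A gray level whose pixel count exceeds T1 is treated as background;
    the search stops when more than 80% of the bins in the window exceed
    T1, and the gray level at the window centre is returned as Th.
    (Sketch of Step 2; parameter ranges 5 <= N <= 15, 100 <= T1 <= 150.)
    """
    for k in range(255, N - 2, -1):          # high -> low gray level
        window = hist[k - N + 1:k + 1]       # 1xN window ending at level k
        count = int(np.sum(window > T1))     # background levels in window
        if count / N > 0.8:
            return k - N // 2                # k value at the window centre
    return None                              # no threshold found

# Toy histogram: many dark background pixels, sparse bright foreground.
hist = np.zeros(256, dtype=int)
hist[:200] = 500                             # background gray levels
hist[200:] = 10                              # foreground gray levels
th = segmentation_threshold(hist)            # 195 for this toy histogram
```

The first window where more than 80% of bins exceed T1 ends at gray level 200 here, so the returned threshold is its centre, 195.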
Step 4 comprises:
The infrared gray value depends on the contrast between target and background: the larger the difference, the higher the gray value. In the infrared signature of an aircraft, the engines have the highest gray values. To avoid the influence of blind pixels and noise, the histogram His is searched from high gray levels to low, and the first gray level whose pixel count satisfies His(k) > 40 is taken as the engine gray value. The positions Pos1 and Pos2 of the two engines are then located in the infrared image from this gray value; their coordinates are (x1, y1) and (x2, y2), where x1, y1 are the abscissa and ordinate of Pos1, x2, y2 are the abscissa and ordinate of Pos2, and k denotes the gray level.
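Step 4 can be sketched as below. The high-to-low histogram scan with the His(k) > 40 test follows the text; splitting the engine pixels left/right about their median x coordinate and taking centroids is an assumed, simplified way to obtain the two positions (the text only states that the positions are located from the engine gray value). The toy coordinates reproduce the Fig. 2a engine positions.

```python
import numpy as np

def engine_gray_level(hist, min_count=40):
    """Scan from high gray levels down; the first level whose pixel
    count exceeds min_count is taken as the engine gray value."""
    for k in range(255, -1, -1):
        if hist[k] > min_count:
            return k
    return None

def locate_engines(img, k):
    """Locate the two engines as centroids of the pixels at gray level k,
    split about the median x coordinate (an assumed clustering step)."""
    ys, xs = np.nonzero(img == k)
    mid = np.median(xs)
    left = xs <= mid
    pos1 = (int(xs[left].mean()), int(ys[left].mean()))
    pos2 = (int(xs[~left].mean()), int(ys[~left].mean()))
    return pos1, pos2

# Toy image: two 7x7 bright patches at gray 240 on a dark background.
img = np.zeros((512, 640), dtype=np.uint8)
img[301:308, 317:324] = 240                  # left engine
img[293:300, 395:402] = 240                  # right engine
hist = np.bincount(img.ravel(), minlength=256)
k = engine_gray_level(hist)                  # 240: 7*7*2 = 98 pixels > 40
pos1, pos2 = locate_engines(img, k)          # (320, 304) and (398, 296)
```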
Step 5 comprises:
The undercarriage distribution characteristic is: the nose-gear region Area1 lies near the midline between the two engines, and the rear-gear regions Area2 and Area3 lie near the engine positions Pos1 and Pos2 respectively. Taking the engine coordinates (x1, y1) and (x2, y2) as reference points, with x1 < x2, a rectangular coordinate system is set up and, writing d = |x1 − x2|:
Area1 extends from (x1 + x2)/2 − d/10 to (x1 + x2)/2 + d/10 in the horizontal x direction, and from (y1 + y2)/2 to (y1 + y2)/2 + d/T2 in the vertical y direction.
Area2 extends from x1 to x1 + d/4 in the horizontal x direction, and from (y1 + y2)/2 to (y1 + y2)/2 + d/T2 in the vertical y direction.
Area3 extends from x2 − d/4 to x2 in the horizontal x direction, and from (y1 + y2)/2 to (y1 + y2)/2 + d/T2 in the vertical y direction.
where T2 is the aspect (length-width) ratio, 2 ≤ T2 ≤ 4.
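The region construction of Step 5 can be sketched as follows. The exact extents are not fully legible in the text: the vertical extent, from (y1 + y2)/2 to (y1 + y2)/2 + d/T2 with d = |x1 − x2|, follows the normalization used by the F parameters, while the horizontal fractions (d/10 nose-gear half-width, d/4 rear-gear width) are inferred from the Fig. 2a worked example in the embodiment and are an assumption, not the authoritative formulas.

```python
def gear_regions(pos1, pos2, T2=3):
    """Compute the three landing-gear search rectangles (Step 5).

    Returns (x_start, x_end, y_start, y_end) for Area1 (nose gear),
    Area2 (rear left) and Area3 (rear right). The d/10 and d/4 horizontal
    fractions are inferred from the Fig. 2a numbers and are assumptions.
    """
    (x1, y1), (x2, y2) = pos1, pos2
    d = abs(x1 - x2)
    yc = (y1 + y2) / 2                       # baseline between the engines
    y0, y1e = round(yc), round(yc + d / T2)  # common vertical extent
    xc = (x1 + x2) / 2                       # engine midline
    area1 = (round(xc - d / 10), round(xc + d / 10), y0, y1e)  # nose
    area2 = (x1, round(x1 + d / 4), y0, y1e)                   # rear left
    area3 = (round(x2 - d / 4), x2, y0, y1e)                   # rear right
    return area1, area2, area3

# Fig. 2a engine positions reproduce the embodiment's region values.
a1, a2, a3 = gear_regions((320, 304), (398, 296), T2=3)
```

With the Fig. 2a inputs this yields Area1 = (351, 367, 300, 326), Area2 = (320, 340, 300, 326) and Area3 = (378, 398, 300, 326), matching the values reported in the embodiment.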
Step 6 comprises:
In the binary map BiImg, let ymax1, ymax2 and ymax3 be the maximum y positions of pixels with value 1 in the front and rear undercarriage regions Area1, Area2 and Area3 respectively. The characteristic parameters F1, F2 and F3 are then:
F1 = (|ymax1| − (y1 + y2)/2) / (|x1 − x2|/T2);
F2 = (|ymax2| − (y1 + y2)/2) / (|x1 − x2|/T2);
F3 = (|ymax3| − (y1 + y2)/2) / (|x1 − x2|/T2).
Step 7 comprises:
If F1 > T3, the nose gear is judged deployed; otherwise it is judged retracted.
If F2 > T3, the rear-left gear is judged deployed; otherwise it is judged retracted.
If F3 > T3, the rear-right gear is judged deployed; otherwise it is judged retracted.
where 0.5 ≤ T3 ≤ 1. The specific values of T1, T2 and T3 are chosen according to the actual conditions.
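Steps 6 and 7 together can be sketched as below, using the F parameter definition and the T3 decision rule from the text. The toy binary image and region tuples are illustrative; the single-column "strut" of foreground pixels stands in for a deployed gear leg.

```python
import numpy as np

def gear_states(bi_img, regions, pos1, pos2, T2=3, T3=0.5):
    """Compute F1..F3 and the deployed/retracted decisions (Steps 6-7).

    For each region, find the maximum y coordinate of foreground
    (value-1) pixels; F_i = (|ymax_i| - (y1+y2)/2) / (|x1-x2|/T2),
    and F_i > T3 means that gear is judged deployed.
    """
    (x1, y1), (x2, y2) = pos1, pos2
    base = (y1 + y2) / 2                     # engine baseline
    scale = abs(x1 - x2) / T2                # expected gear length
    states = []
    for (xs, xe, ys, ye) in regions:
        sub = bi_img[ys:ye + 1, xs:xe + 1]
        rows = np.nonzero(sub)[0]
        ymax = ys + int(rows.max()) if rows.size else ys
        f = (abs(ymax) - base) / scale
        states.append(bool(f > T3))          # True -> gear deployed
    return states

# Toy binary image: a vertical strut of 1s inside the rear-left region.
bi = np.zeros((512, 640), dtype=np.uint8)
bi[300:322, 330] = 1
regions = [(351, 367, 300, 326), (320, 340, 300, 326), (378, 398, 300, 326)]
states = gear_states(bi, regions, (320, 304), (398, 296))
```

Here only the rear-left region contains foreground (ymax = 321, F2 ≈ 0.81 > 0.5), so `states` is `[False, True, False]`.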
Beneficial effects: compared with the prior art, the invention has the following notable advantages: (1) the adaptive target segmentation based on histogram statistics is unaffected by aircraft scale and attitude and can segment the target accurately; (2) by building an undercarriage identification model, the undercarriage can be identified accurately even when it covers few pixels; (3) using the aircraft engines as reference for locating the undercarriage improves the recognition rate; (4) there are no high-order operations or complex structures, so the computational load is small and real-time hardware implementation is easy.
Brief description of the drawings
The present invention is further illustrated below with reference to the drawings and the specific embodiment; the above and other advantages of the invention will become more apparent.
Fig. 1 is the flow chart of the infrared-image undercarriage automatic identification method of the present invention.
Fig. 2a is an original image.
Fig. 2b is its binary map.
Fig. 2c is the long-range undercarriage recognition result.
Fig. 3a is an original image.
Fig. 3b is its binary map.
Fig. 3c is the close-range undercarriage recognition result.
Specific embodiments
The present invention is further described below with reference to the drawings and an embodiment.
Embodiment
The focal-plane array of the thermal imager is 640 × 512 and the frame rate is 50 frames per second. The image processing platform uses a DSP+FPGA architecture; the infrared-image undercarriage automatic identification method runs on the DSP processor and meets the real-time processing requirement. As shown in Fig. 1, the specific implementation steps are as follows:
(1) Input an infrared image I.
The DSP processor input image I is an 8-bit digital image of size 640 × 512.
(2) Compute the histogram His of I and calculate the foreground/background segmentation threshold Th.
The histogram is His(k), k = 0, ..., 255. With threshold T1 = 120 and a 1×N window of N = 10, the condition Count/N > 0.8 yields the segmentation threshold Th = 130.
(3) Binarize the image I with the threshold Th = 130 to obtain the binary map BiImg, as shown in Fig. 2b and Fig. 3b.
(4) Locate the engine positions Pos1 and Pos2, i.e. (x1, y1) and (x2, y2).
In the histogram His(k), k = 0, ..., 255, the first gray level (searching from high to low) with pixel count His(k) > 40 is taken as the engine gray value, which here is 240. From gray value 240 the engine positions Pos1 and Pos2 are located in the infrared image: (320, 304) and (398, 296) in Fig. 2a, and (158, 412) and (182, 396) in Fig. 3a.
(5) Compute the front and rear undercarriage regions Area1, Area2 and Area3.
The regions are given by the expressions of Step 5. Taking the aspect ratio T2 = 3, the following values are obtained:
In Fig. 2a: Area1 runs from 351 to 367 in the x direction and from 300 to 326 in the y direction; Area2 from 320 to 340 in x and from 300 to 326 in y; Area3 from 378 to 398 in x and from 300 to 326 in y.
In Fig. 3a: Area1 runs from 208 to 232 in the x direction and from 404 to 445 in the y direction; Area2 from 158 to 189 in x and from 404 to 445 in y; Area3 from 251 to 282 in x and from 404 to 445 in y.
(6) Compute the undercarriage characteristic parameters F1, F2 and F3 in Area1, Area2 and Area3.
From the formulas for F1, F2 and F3: in Fig. 2a, F1 = 0.53, F2 = 0.81 and F3 = 0.79; in Fig. 3a, F1 = 0.57, F2 = 0.83 and F3 = 0.85.
(7) Complete undercarriage identification from the characteristic parameters F1, F2 and F3.
Taking T3 = 0.5 and applying the identification rule of Step 7, the undercarriages in Fig. 2a and Fig. 3a are identified, as shown in Fig. 2c and Fig. 3c.
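The reported value F2 = 0.81 for Fig. 2a is consistent with the published engine coordinates, as a short arithmetic check shows. The foreground maximum y = 321 used here is back-computed from F2 and is an assumption, not a value stated in the text.

```python
# Check the reported Fig. 2a value F2 against the published numbers:
# engines at (320, 304) and (398, 296), aspect ratio T2 = 3.
y1, y2 = 304, 296
x1, x2 = 320, 398
T2 = 3
base = (y1 + y2) / 2        # 300.0, midpoint of the engine ordinates
scale = abs(x1 - x2) / T2   # 26.0, the |x1 - x2| / T2 normalization
F2 = (321 - base) / scale   # foreground maximum y = 321 (back-computed)
```

This gives F2 ≈ 0.808, which rounds to the reported 0.81.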
The invention provides an automatic undercarriage identification method for infrared images; there are many specific ways to implement the technical solution, and the above is only a preferred embodiment of the invention. It should be noted that a person of ordinary skill in the art can make several improvements and refinements without departing from the principle of the invention, and these should also be regarded as falling within the protection scope of the invention. Components not specified in this embodiment can be implemented with existing techniques.

Claims (6)

1. An automatic infrared-image undercarriage identification method, characterised by comprising the following steps:
Step 1: input an infrared image I;
Step 2: compute the histogram His of I and calculate the foreground/background segmentation threshold Th;
Step 3: binarize I with Th to obtain the binary map BiImg;
Step 4: locate the positions Pos1 and Pos2 of the two aircraft engines;
Step 5: compute the front and rear undercarriage distribution regions Area1, Area2 and Area3;
Step 6: compute the undercarriage characteristic parameters F1, F2 and F3 within these regions;
Step 7: complete undercarriage identification from F1, F2 and F3.
2. The method according to claim 1, characterised in that Step 2 comprises:
Step 2-1: compute the histogram His(k), k = 0, ..., 255, of the infrared image I, where k denotes the gray level;
Step 2-2: set a threshold T1 and slide a 1×N window over His from high gray levels to low; if the pixel count His(k) of a gray level inside the window satisfies His(k) > T1, that gray level is judged background, otherwise foreground, where N is the window width, 5 ≤ N ≤ 15, and 100 ≤ T1 ≤ 150;
Step 2-3: count the number Count of gray levels in the 1×N window with His(k) > T1; if Count/N > 0.8, the k value at the window centre (position N/2) is taken as the segmentation threshold Th, otherwise the window continues to slide from high gray levels to low until the threshold is found.
3. The method according to claim 2, characterised in that Step 4 comprises:
searching the histogram His from high gray levels to low, taking the first gray level with pixel count His(k) > 40 as the engine gray value, and locating the positions Pos1 and Pos2 of the two engines in the infrared image from this gray value; their coordinates are (x1, y1) and (x2, y2), where x1, y1 are the abscissa and ordinate of engine position Pos1 and x2, y2 are the abscissa and ordinate of engine position Pos2.
4. The method according to claim 3, characterised in that Step 5 comprises:
the undercarriage distribution characteristic is: the nose-gear region Area1 lies near the midline between the two engines, and the rear-gear regions Area2 and Area3 lie near the engine positions Pos1 and Pos2 respectively; taking the engine coordinates (x1, y1) and (x2, y2) as reference points, with x1 < x2, a rectangular coordinate system is set up and, writing d = |x1 − x2|:
Area1 extends from (x1 + x2)/2 − d/10 to (x1 + x2)/2 + d/10 in the horizontal x direction, and from (y1 + y2)/2 to (y1 + y2)/2 + d/T2 in the vertical y direction;
Area2 extends from x1 to x1 + d/4 in the horizontal x direction, and from (y1 + y2)/2 to (y1 + y2)/2 + d/T2 in the vertical y direction;
Area3 extends from x2 − d/4 to x2 in the horizontal x direction, and from (y1 + y2)/2 to (y1 + y2)/2 + d/T2 in the vertical y direction;
wherein T2 is the aspect (length-width) ratio, 2 ≤ T2 ≤ 4.
5. The method according to claim 4, characterised in that Step 6 comprises:
in the binary map BiImg, let ymax1, ymax2 and ymax3 be the maximum y positions of pixels with value 1 in the front and rear undercarriage regions Area1, Area2 and Area3 respectively; the characteristic parameters F1, F2 and F3 are then:
F1 = (|ymax1| − (y1 + y2)/2) / (|x1 − x2|/T2);
F2 = (|ymax2| − (y1 + y2)/2) / (|x1 − x2|/T2);
F3 = (|ymax3| − (y1 + y2)/2) / (|x1 − x2|/T2).
6. The method according to claim 5, characterised in that Step 7 comprises:
if F1 > T3, the nose gear is judged deployed, otherwise retracted;
if F2 > T3, the rear-left gear is judged deployed, otherwise retracted;
if F3 > T3, the rear-right gear is judged deployed, otherwise retracted;
wherein 0.5 ≤ T3 ≤ 1.
CN201710065439.6A 2017-02-06 2017-02-06 Automatic identification method for aircraft undercarriage in infrared images Active CN106951823B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710065439.6A CN106951823B (en) 2017-02-06 2017-02-06 Automatic identification method for aircraft undercarriage in infrared images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710065439.6A CN106951823B (en) 2017-02-06 2017-02-06 Automatic identification method for aircraft undercarriage in infrared images

Publications (2)

Publication Number Publication Date
CN106951823A true CN106951823A (en) 2017-07-14
CN106951823B CN106951823B (en) 2019-11-15

Family

ID=59465334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710065439.6A Active CN106951823B (en) 2017-02-06 2017-02-06 Automatic identification method for aircraft undercarriage in infrared images

Country Status (1)

Country Link
CN (1) CN106951823B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107633505A * 2017-08-24 2018-01-26 南京理工大学 An undercarriage detection method based on target gray-distribution characteristics
CN109614864A * 2018-11-06 2019-04-12 南京莱斯电子设备有限公司 Method for detecting retractable state of undercarriage of multi-model aircraft at ground-based view angle
WO2022071892A1 (en) * 2020-10-01 2022-04-07 Chew Rong Jie David A system for detecting the deployment of a landing gear of an aircraft and a method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016011099A1 (en) * 2014-07-18 2016-01-21 Sikorsky Aircraft Corporation System for determining weight-on-wheels using lidar
CN105810023A (en) * 2016-05-16 2016-07-27 福建福光股份有限公司 Automatic airport undercarriage retraction and extension monitoring system and method
CN106210061A (en) * 2016-07-14 2016-12-07 桂林长海发展有限责任公司 A kind of automatic recognition system of undercarriage folding and unfolding
CN106203353A (en) * 2016-07-14 2016-12-07 桂林长海发展有限责任公司 The detecting system of a kind of undercarriage and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016011099A1 (en) * 2014-07-18 2016-01-21 Sikorsky Aircraft Corporation System for determining weight-on-wheels using lidar
CN105810023A (en) * 2016-05-16 2016-07-27 福建福光股份有限公司 Automatic airport undercarriage retraction and extension monitoring system and method
CN106210061A (en) * 2016-07-14 2016-12-07 桂林长海发展有限责任公司 A kind of automatic recognition system of undercarriage folding and unfolding
CN106203353A (en) * 2016-07-14 2016-12-07 桂林长海发展有限责任公司 The detecting system of a kind of undercarriage and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
陈镇 et al.: "A segmentation method for aircraft infrared images and its implementation", Semiconductor Optoelectronics (《半导体光电》) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107633505A * 2017-08-24 2018-01-26 南京理工大学 An undercarriage detection method based on target gray-distribution characteristics
CN109614864A * 2018-11-06 2019-04-12 南京莱斯电子设备有限公司 Method for detecting retractable state of undercarriage of multi-model aircraft at ground-based view angle
CN109614864B (en) * 2018-11-06 2021-08-27 南京莱斯电子设备有限公司 Method for detecting retractable state of undercarriage of multi-model aircraft at ground-based view angle
WO2022071892A1 (en) * 2020-10-01 2022-04-07 Chew Rong Jie David A system for detecting the deployment of a landing gear of an aircraft and a method thereof

Also Published As

Publication number Publication date
CN106951823B (en) 2019-11-15

Similar Documents

Publication Publication Date Title
Aquino et al. Automated early yield prediction in vineyards from on-the-go image acquisition
CN103942803B (en) SAR (Synthetic Aperture Radar) image based automatic water area detection method
CN105373135B (en) A kind of method and system of aircraft docking guidance and plane type recognition based on machine vision
CN109145872B (en) CFAR and Fast-RCNN fusion-based SAR image ship target detection method
CN104268877B (en) A kind of infrared image sea horizon self-adapting detecting method
CN103336966B (en) A kind of weed images discrimination method being applied to agricultural intelligent machine
CN106951823A (en) A kind of infrared image undercarriage automatic identifying method
CN105547287B (en) A kind of irregular coelonavigation sight information extracting method
CN111126184B (en) Post-earthquake building damage detection method based on unmanned aerial vehicle video
CN106709426A (en) Ship target detection method based on infrared remote sensing image
CN105005813A (en) Insect pest analyzing and counting method and system
CN109614864B (en) Method for detecting retractable state of undercarriage of multi-model aircraft at ground-based view angle
CN105043395B (en) A kind of real-time Dynamic Location method of aircraft menology soft landing
CN106023199B (en) A kind of flue gas blackness intelligent detecting method based on image analysis technology
CN104182992A (en) Method for detecting small targets on the sea on the basis of panoramic vision
CN106558031A (en) A kind of image enchancing method of the colored optical fundus figure based on imaging model
CN110455201A (en) Stalk plant height measurement method based on machine vision
CN105023272A (en) Crop leaf insect pest detection method and system
CN111259763A (en) Target detection method and device, electronic equipment and readable storage medium
CN113554694A (en) Method and system for acquiring infrared effective shielding area in smoke screen release process
CN105354547A (en) Pedestrian detection method in combination of texture and color features
CN103177244B (en) Method for quickly detecting target organisms in underwater microscopic images
CN109215059A Local data correlation method for moving-vehicle tracking in aerial video
CN105139358A (en) Video raindrop removing method and system based on combination of morphology and fuzzy C clustering
KR20180096966A (en) Automatic Counting Method of Rice Plant by Centroid of Closed Rice Plant Contour Image

Legal Events

Date Code Title Description
PB01: Publication
SE01: Entry into force of request for substantive examination
TA01: Transfer of patent application right
Effective date of registration: 2018-11-13
Address after: Building 5, Tianan Digital City, 36 Yongfeng Avenue, Qinhuai District, Nanjing, Jiangsu, 210007
Applicant after: Nanjing Lesi Electronic Equipment Co., Ltd.
Address before: No. 1 Muxuyuan East Street, Nanjing, Jiangsu, 210007
Applicant before: CETC 28 Research Institute (China Electronics Technology Group Corporation)
GR01: Patent grant