CN107958461A - Binocular-vision-based carrier aircraft target tracking method - Google Patents
Binocular-vision-based carrier aircraft target tracking method
- Publication number: CN107958461A
- Application number: CN201711123360.0A
- Authority: CN (China)
- Prior art keywords: depth information; carrier aircraft; feature point; image; target
- Prior art date: 2017-11-14
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.): Pending
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments (under G06T7/00—Image analysis; G06T7/20—Analysis of motion)
- G06T7/55—Depth or shape recovery from multiple images (under G06T7/00—Image analysis; G06T7/50—Depth or shape recovery)
- G06T2207/10004—Still image; Photographic image (under G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/10—Image acquisition modality)
- G06T2207/20081—Training; Learning (under G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/20—Special algorithmic details)
- G06T2207/20084—Artificial neural networks [ANN] (under G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/20—Special algorithmic details)
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a binocular-vision-based carrier aircraft target tracking method comprising the following steps. Step 1: arrange a binocular optical system at the widest part of the carrier aircraft platform and adjust the focal length so that the target lies within the overlapping field of view of the binocular optical system; the binocular optical system acquires two image channels, denoted L1 and L2. Step 2: match the two acquired images with the SIFT method and screen the matching results to obtain the screened feature points; compute the parallax of the two images from the feature points; convert the parallax into depth information; add the obtained depth information to the original image to obtain an optical image carrying depth information. Step 5: train a CNN. Step 6: recognize the optical image carrying depth information with the trained CNN to obtain the target recognition result. Compared with a laser radar, the application has the advantage of small volume and mass.
Description
Technical field
The present invention relates to the technical field of autonomous aircraft target recognition, and in particular to a binocular-vision-based carrier aircraft target tracking method.
Background art
During mission execution, an aircraft often needs to reconnoiter and identify a target of interest (ToI); to date, this task is still performed mainly by human operators. Mission requirements may also make it impossible for the aircraft to obtain, from other aircraft in a formation, the multi-dimensional information needed to complete identification of the target.
Thus, a technical solution is desired that overcomes, or at least mitigates, at least one of the above drawbacks of the prior art.
Summary of the invention
It is an object of the invention to provide a binocular-vision-based carrier aircraft target tracking method that overcomes, or at least mitigates, at least one of the above drawbacks of the prior art.
To achieve the above object, the present invention provides a binocular-vision-based carrier aircraft target tracking method comprising the following steps:
Step 1: arrange a binocular optical system at the widest part of the carrier aircraft platform and adjust the focal length so that the target lies within the overlapping field of view of the binocular optical system; the binocular optical system acquires two image channels, denoted L1 and L2;
Step 2: match the two acquired images with the SIFT method and screen the matching results to obtain the screened feature points; compute the parallax of the two images from the feature points; convert the parallax into depth information; add the obtained depth information to one of the original images to obtain an optical image carrying depth information;
Step 5: train a CNN;
Step 6: recognize the optical image carrying depth information with the trained CNN to obtain the target recognition result.
Preferably, step 2 comprises:
Step 21: perform SIFT matching on L1 and L2 to obtain the target's feature points, yielding feature-point set S;
Step 22: process S by computing, from the relative positions of the matched feature points in S, the distribution vector V = [k, d], where k is the slope of the line joining the two matched feature points and d is the length of that line; according to the required recognition precision, select the region where the feature points are densest, discard the feature points outside that region, and form a new feature-point set S_ni;
Step 23: cluster the new feature-point set S_ni, choosing the number of clusters C according to the required recognition precision, to obtain C clusters;
Step 24: compute the center of each cluster; among that cluster's feature points, take the point with the smallest Euclidean distance to the cluster center as the center point of the cluster, and compute its transverse and longitudinal parallax between the two images;
Step 25: recover the depth of each characteristic region from the transverse and longitudinal parallaxes obtained in step 24; the calculation formula is
X/u = Y/v = L/f
where X is the lateral distance between the target and the observation platform, Y is the longitudinal distance between the target and the observation platform, L is the distance between the target and the observation platform parallel to the optical axis, f is the current camera focal length, and u and v denote the transverse and longitudinal parallax components obtained in step 24;
Step 26: normalize the obtained depth information, form a matrix of the same size as the original image from the distribution of the characteristic regions, and take its element-wise product with the grayscale image obtained in step 24 to obtain the optical image containing depth information.
Preferably, step 6 is specifically:
Recognition is performed by a decision formula in which δ is a preset value: when the result is greater than δ, the captured image is considered to contain the target to be identified; R_i is the CNN recognition result for a characteristic region, and S_i is the area of that region.
The binocular-vision-based carrier aircraft target tracking method of the application provides an autonomous single-aircraft method for recognizing a point target. Using the parallax information collected about the target by the binocular optical system, the carrier aircraft can fuse the images obtained by the two optical acquisition devices, combine them with its current spatial coordinates, add the computed depth information to the target imagery, and finally complete autonomous recognition of the target on a single platform by means of a convolutional neural network (CNN).
An advantage of the application is that the aircraft can perform ranging without radiating any energy at the target, which enhances its stealth. Because no energy-radiating module is required, the system is smaller in both volume and mass than a laser radar, and its imaging speed is faster.
Brief description of the drawings
Fig. 1 is a flow diagram of the binocular-vision-based carrier aircraft target tracking method according to an embodiment of the application.
Embodiment
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Throughout the drawings, identical or similar reference numerals denote identical or similar elements, or elements having identical or similar functions. The described embodiments are only some, not all, of the embodiments of the present invention. The embodiments described below with reference to the drawings are exemplary and are intended to explain the present invention; they shall not be construed as limiting the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention. The embodiments of the present invention are described in detail below with reference to the drawings.
In the description of the present invention, it should be understood that orientation or positional terms such as "center", "longitudinal", "transverse", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" indicate orientations or positional relationships based on the drawings, are used only to facilitate and simplify the description of the present invention, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they are therefore not to be construed as limiting the scope of protection of the present invention.
Fig. 1 is a flow diagram of the binocular-vision-based carrier aircraft target tracking method according to an embodiment of the application.
As shown in Fig. 1, the binocular-vision-based carrier aircraft target tracking method comprises the following steps:
Step 1: arrange a binocular optical system at the widest part of the carrier aircraft platform and adjust the focal length so that the target lies within the overlapping field of view of the binocular optical system; the binocular optical system acquires two image channels, denoted L1 and L2;
Step 2: match the two acquired images with the SIFT method and screen the matching results to obtain the screened feature points; compute the parallax of the two images from the feature points; convert the parallax into depth information; add the obtained depth information to one of the original images to obtain an optical image carrying depth information;
Step 5: train a CNN (a minimal, illustrative training sketch follows this step list);
Step 6: recognize the optical image carrying depth information with the trained CNN to obtain the target recognition result.
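For Step 5, the patent does not specify the CNN architecture, dataset or training procedure. The following is a minimal, hypothetical sketch of how such a classifier could be trained in PyTorch on depth-weighted grayscale patches; the layer sizes, input resolution and hyperparameters are assumptions, not part of the disclosure.

```python
# Hypothetical Step 5 sketch: train a small CNN on depth-weighted grayscale patches.
# The patent does not disclose an architecture; all sizes and hyperparameters are assumed.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # assumes 64x64 inputs

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def train(model, images, labels, epochs: int = 10):
    # images: (N, 1, 64, 64) float tensor of depth-weighted patches; labels: (N,) long tensor
    loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model
```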
The binocular-vision-based carrier aircraft target tracking method of the application provides an autonomous single-aircraft method for recognizing a point target. Using the parallax information collected about the target by the binocular optical system, the carrier aircraft can fuse the images obtained by the two optical acquisition devices, combine them with its current spatial coordinates, add the computed depth information to the target imagery, and finally complete autonomous recognition of the target on a single platform by means of a convolutional neural network (CNN).
An advantage of the application is that the aircraft can perform ranging without radiating any energy at the target, which enhances its stealth. Because no energy-radiating module is required, the system is smaller in both volume and mass than a laser radar, and its imaging speed is faster.
In the present embodiment, step 2 includes:
Step 21: perform SIFT matching on L1 and L2 to obtain the target's feature points, yielding feature-point set S;
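As an illustration of Step 21, the sketch below uses OpenCV's SIFT implementation and Lowe's ratio test to build the matched feature-point set S. OpenCV and the 0.75 ratio threshold are assumptions; the patent only requires SIFT matching followed by screening.

```python
# Sketch of Step 21 (assumes OpenCV with SIFT available, e.g. opencv-python >= 4.4):
# detect SIFT keypoints in both images and keep good matches via Lowe's ratio test.
import cv2

def sift_match(img_l1, img_l2, ratio=0.75):
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_l1, None)
    kp2, des2 = sift.detectAndCompute(img_l2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    # Feature-point set S: pairs of matched pixel coordinates (left point, right point)
    S = []
    for m, n in matcher.knnMatch(des1, des2, k=2):
        if m.distance < ratio * n.distance:
            S.append((kp1[m.queryIdx].pt, kp2[m.trainIdx].pt))
    return S
```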
Step 22: process S by computing, from the relative positions of the matched feature points in S, the distribution vector V = [k, d], where k is the slope of the line joining the two matched feature points and d is the length of that line; according to the required recognition precision, select the region where the feature points are densest, discard the feature points outside that region, and form a new feature-point set S_ni;
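One possible reading of Step 22 is sketched below: the slope k and length d of each matched pair's connecting line are histogrammed, and only pairs falling in the densest (k, d) bin are kept. The 2-D histogram and bin count are assumed implementation details, not part of the patent text.

```python
# Sketch of Step 22: compute V = [k, d] for each matched pair and keep only the pairs
# in the densest region of (k, d) space; the binning strategy is an assumption.
import numpy as np

def filter_by_density(S, bins=20):
    p1 = np.array([p for p, _ in S], dtype=float)
    p2 = np.array([q for _, q in S], dtype=float)
    dx, dy = (p2 - p1).T
    k = dy / (dx + 1e-9)                 # slope of the line joining the matched points
    d = np.hypot(dx, dy)                 # length of that line
    hist, k_edges, d_edges = np.histogram2d(k, d, bins=bins)
    ki, di = np.unravel_index(hist.argmax(), hist.shape)
    keep = ((k >= k_edges[ki]) & (k <= k_edges[ki + 1]) &
            (d >= d_edges[di]) & (d <= d_edges[di + 1]))
    return [pair for pair, ok in zip(S, keep) if ok]   # new feature-point set S_ni
```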
Step 23: cluster the new feature-point set S_ni, choosing the number of clusters C according to the required recognition precision, to obtain C clusters;
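Step 23 does not name a clustering algorithm; the sketch below assumes k-means over the left-image coordinates of S_ni, with C chosen by the caller.

```python
# Sketch of Step 23: cluster the retained left-image feature points into C clusters.
# k-means is an assumed choice; the patent only requires clustering into C clusters.
import numpy as np
from sklearn.cluster import KMeans

def cluster_features(S_ni, C=3):
    pts = np.array([p for p, _ in S_ni], dtype=float)   # left-image coordinates
    labels = KMeans(n_clusters=C, n_init=10).fit_predict(pts)
    return labels
```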
Step 24: compute the center of each cluster; among that cluster's feature points, take the point with the smallest Euclidean distance to the cluster center as the center point of the cluster, and compute its transverse and longitudinal parallax between the two images;
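A sketch of Step 24, under the assumption that each matched pair stores left- and right-image pixel coordinates: for every cluster, the feature point nearest its centroid is selected, and its coordinate differences between the two images are taken as the transverse and longitudinal parallax.

```python
# Sketch of Step 24: pick the feature point closest to each cluster centroid and use its
# left/right match to compute that cluster's transverse and longitudinal parallax.
import numpy as np

def cluster_parallax(S_ni, labels):
    p1 = np.array([p for p, _ in S_ni], dtype=float)
    p2 = np.array([q for _, q in S_ni], dtype=float)
    parallaxes = {}
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        centroid = p1[idx].mean(axis=0)
        best = idx[np.argmin(np.linalg.norm(p1[idx] - centroid, axis=1))]
        du, dv = p1[best] - p2[best]          # transverse / longitudinal parallax
        parallaxes[int(c)] = (float(du), float(dv), tuple(p1[best]))
    return parallaxes
```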
Step 25: recover the depth of each characteristic region from the transverse and longitudinal parallaxes obtained in step 24; the calculation formula is
X/u = Y/v = L/f
where X is the lateral distance between the target and the observation platform, Y is the longitudinal distance between the target and the observation platform, L is the distance between the target and the observation platform parallel to the optical axis, f is the current camera focal length, and u and v denote the transverse and longitudinal parallax components obtained in step 24;
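The relation X/u = Y/v = L/f yields the lateral and longitudinal offsets once L is known, but the translated text does not spell out how L itself is obtained from the two parallax components. The sketch below assumes the classic rectified-stereo relation L = f·B/d, with B the camera baseline and d the transverse parallax in pixels; this is an interpretation, not a quotation of the patent.

```python
# Sketch of Step 25, applying X/u = Y/v = L/f. Recovering L via L = f * B / disparity
# (rectified stereo, baseline B) is an assumption layered on top of the patent formula.
def recover_depth(du, dv, u, v, f, baseline):
    L = f * baseline / abs(du) if du else float('inf')   # depth along the optical axis
    X = u * L / f                                         # lateral offset of the target
    Y = v * L / f                                         # longitudinal offset of the target
    return X, Y, L
```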
Step 26: normalize the obtained depth information, form a matrix of the same size as the original image from the distribution of the characteristic regions, and take its element-wise product with the grayscale image obtained in step 24 to obtain the optical image containing depth information.
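A minimal sketch of Step 26, assuming each feature point simply carries its cluster's recovered depth; spreading the depth over the full characteristic region is omitted for brevity.

```python
# Sketch of Step 26: write each feature point's cluster depth into a zero matrix the same
# size as the image, normalise it to [0, 1], and multiply element-wise with the grayscale
# image to obtain an optical image weighted by depth.
import numpy as np

def fuse_depth(gray, labels, points, depths):
    # gray: HxW grayscale image; points: Nx2 left-image coords; depths: per-cluster L values
    depth_map = np.zeros(gray.shape, dtype=float)
    for (x, y), c in zip(points.astype(int), labels):
        depth_map[y, x] = depths[int(c)]
    if depth_map.max() > 0:
        depth_map /= depth_map.max()                     # normalise depth to [0, 1]
    return gray.astype(float) * depth_map                # image carrying depth information
```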
In the present embodiment, step 6 is specifically:
Recognition is performed by a decision formula in which δ is a preset value: when the result is greater than δ, the captured image is considered to contain the target to be identified; R_i is the CNN recognition result for a characteristic region, and S_i is the area of that region.
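The decision formula itself is not reproduced in the translated text. One plausible reading, shown below, is an area-weighted combination of the per-region CNN scores R_i compared against the preset threshold δ; treat this strictly as an assumption.

```python
# Sketch of the Step 6 decision rule (assumed form, since the formula image is missing):
# area-weighted average of the per-region CNN scores, compared with the threshold delta.
def contains_target(R, S, delta):
    # R: CNN recognition scores per characteristic region; S: areas of those regions
    score = sum(r * s for r, s in zip(R, S)) / (sum(S) or 1.0)
    return score > delta
```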
Finally, it should be noted that the above embodiments are merely illustrative of the technical solutions of the present invention and are not limiting. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (3)
- 1. A binocular-vision-based carrier aircraft target tracking method, characterized in that the binocular-vision-based carrier aircraft target tracking method comprises the following steps: Step 1: arrange a binocular optical system at the widest part of the carrier aircraft platform and adjust the focal length so that the target lies within the overlapping field of view of the binocular optical system, the binocular optical system acquiring two image channels denoted L1 and L2; Step 2: match the two acquired images with the SIFT method and screen the matching results to obtain the screened feature points, compute the parallax of the two images from the feature points, convert the parallax into depth information, and add the obtained depth information to one of the original images to obtain an optical image carrying depth information; Step 5: train a CNN; Step 6: recognize the optical image carrying depth information with the trained CNN to obtain the target recognition result.
- 2. The binocular-vision-based carrier aircraft target tracking method as claimed in claim 1, characterized in that step 2 comprises: Step 21: perform SIFT matching on L1 and L2 to obtain the target's feature points, yielding feature-point set S; Step 22: process S by computing, from the relative positions of the matched feature points in S, the distribution vector V = [k, d], where k is the slope of the line joining the two matched feature points and d is the length of that line, select the region where the feature points are densest according to the required recognition precision, discard the feature points outside that region, and form a new feature-point set S_ni; Step 23: cluster S_ni, choosing the number of clusters C according to the required recognition precision, to obtain C clusters; Step 24: compute the center of each cluster, take, among that cluster's feature points, the point with the smallest Euclidean distance to the cluster center as the center point of the cluster, and compute its transverse and longitudinal parallax between the two images; Step 25: recover the depth of each characteristic region from the transverse and longitudinal parallaxes obtained in step 24 by the formula X/u = Y/v = L/f, where X is the lateral distance between the target and the observation platform, Y is the longitudinal distance between the target and the observation platform, L is the distance between the target and the observation platform parallel to the optical axis, f is the current camera focal length, and u and v denote the transverse and longitudinal parallax components; Step 26: normalize the obtained depth information, form a matrix of the same size as the original image from the distribution of the characteristic regions, and take its element-wise product with the grayscale image obtained in step 24 to obtain the optical image containing depth information.
- 3. The binocular-vision-based carrier aircraft target tracking method as claimed in claim 2, characterized in that step 6 is specifically: recognition is performed by a decision formula in which δ is a preset value; when the result is greater than δ, the captured image is considered to contain the target to be identified, where R_i is the CNN recognition result for a characteristic region and S_i is the area of that region.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711123360.0A | 2017-11-14 | 2017-11-14 | Binocular-vision-based carrier aircraft target tracking method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107958461A true CN107958461A (en) | 2018-04-24 |
Family
ID=61964723
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711123360.0A (Pending) | Binocular-vision-based carrier aircraft target tracking method | 2017-11-14 | 2017-11-14 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107958461A (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101344965A (en) * | 2008-09-04 | 2009-01-14 | 上海交通大学 | Tracking system based on binocular camera shooting |
CN101504287A (en) * | 2009-01-22 | 2009-08-12 | 浙江大学 | Attitude parameter evaluation method for unmanned vehicle independent landing based on visual information |
CN101720047A (en) * | 2009-11-03 | 2010-06-02 | 上海大学 | Method for acquiring range image by stereo matching of multi-aperture photographing based on color segmentation |
CN102779347A (en) * | 2012-06-14 | 2012-11-14 | 清华大学 | Method and device for tracking and locating target for aircraft |
WO2016100814A1 (en) * | 2014-12-19 | 2016-06-23 | United Technologies Corporation | Multi-modal sensor data fusion for perception systems |
CN105759834A (en) * | 2016-03-09 | 2016-07-13 | 中国科学院上海微系统与信息技术研究所 | System and method of actively capturing low altitude small unmanned aerial vehicle |
CN107329490A (en) * | 2017-07-21 | 2017-11-07 | 歌尔科技有限公司 | Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109636851A (en) * | 2018-11-13 | 2019-04-16 | 中国科学院计算技术研究所 | Binocular-vision-based targeted positioning method for delivering hazardous chemical accident treatment agents |
CN109636851B (en) * | 2018-11-13 | 2020-12-29 | 中国科学院计算技术研究所 | Dangerous chemical accident treatment agent delivery targeting positioning method based on binocular vision |
CN110009675A (en) * | 2019-04-03 | 2019-07-12 | 北京市商汤科技开发有限公司 | Method, apparatus, medium and device for generating a disparity map |
CN111145223A (en) * | 2019-12-16 | 2020-05-12 | 盐城吉大智能终端产业研究院有限公司 | Multi-camera personnel behavior track identification analysis method |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 2018-04-24