CN103164853B - Matchability assessment method for imaging matching aided navigation - Google Patents

Matchability assessment method for imaging matching aided navigation

Info

Publication number
CN103164853B
CN103164853B (application CN201310048804.4A)
Authority
CN
China
Prior art keywords
matching
neural network
assessment
match
artificial neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201310048804.4A
Other languages
Chinese (zh)
Other versions
CN103164853A (en)
Inventor
张明照
杨维忠
牟建华
夏克寒
闫志强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Third Institute Of Equipment Research Institute Of Second Artillery Of C
Original Assignee
Third Institute Of Equipment Research Institute Of Second Artillery Of C
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Third Institute Of Equipment Research Institute Of Second Artillery Of C filed Critical Third Institute Of Equipment Research Institute Of Second Artillery Of C
Priority to CN201310048804.4A priority Critical patent/CN103164853B/en
Publication of CN103164853A publication Critical patent/CN103164853A/en
Application granted granted Critical
Publication of CN103164853B publication Critical patent/CN103164853B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The present invention relates to a matchability assessment method for imaging matching aided navigation, the steps comprising: (1) selecting a parameter set for matchability assessment; (2) determining the assessment result categories; (3) constructing a pattern-recognition artificial neural network that takes the matchability assessment parameter set as input and the assessment result as output; (4) training the pattern-recognition artificial neural network with prior knowledge; (5) performing matchability assessment with the neural network trained in step (4): the above parameters of a reference map are fed to the pattern-recognition artificial neural network as input, and the network outputs the assessment result. The parameter set comprises the reference map gray variance, the correlation surface main peak value and the main peak sharpness. The method not only avoids the trouble of selecting parameter thresholds and the risk of misjudgment caused by improperly chosen thresholds, but also makes full use of continuously accumulated prior knowledge to progressively improve the accuracy of the assessment.

Description

Matchability assessment method for imaging matching aided navigation
Technical field
The invention belongs to the technical field of aircraft aided navigation, and in particular relates to a matchability assessment method for imaging matching aided navigation.
Technical background
Scene matching technology uses an imaging sensor (visible-light, infrared, radar, etc.) to acquire, in real time during flight, a scene image of a predetermined area (hereinafter the real-time image), and registers it against reference image data stored on board (hereinafter the reference map) to compute the aircraft's current absolute position or its position relative to a target. Scene matching offers high navigation accuracy and strong autonomy, and plays an important role in aircraft aided navigation. Some applications require selecting preferred scene matching areas; others require assessing, for a given scene matching area, its reliability and accuracy for scene matching, in other words its matchability. Both can be done based on characteristics of the reference map such as structure and texture.
At present, for aircraft using visible-light, infrared or radar imaging matching aided navigation, matchability can be assessed from the reference map before use. Parameters used for the assessment include the reference map gray variance, independent pixel count, information entropy, correlation surface main peak value, main-to-secondary peak ratio, main peak sharpness, feature scene information quantity, complexity and eccentricity. When performing matchability assessment, a threshold range must be set for each parameter. The influence of threshold selection on the assessment result can only be obtained statistically; there is no explicit formula, so a reasonable threshold range can only be found by repeated trial. Because there are many parameters, this trial workload is very large. Moreover, once the threshold ranges are fixed, assessment knowledge verified afterwards cannot automatically improve the accuracy of the assessment unless new threshold ranges are determined by further manual trial. With current imaging matching methods, therefore, thresholds are hard to determine and efficiency is low.
Summary of the invention
The object of the invention is to overcome these limitations of the prior art and to provide a matchability assessment method for imaging matching aided navigation based on pattern recognition.
To achieve the above object, the invention provides a matchability assessment method for imaging matching aided navigation, shown in Figure 1, whose steps are:
(1) Select the parameter set used for matchability assessment. Candidate parameters include the reference map gray variance, independent pixel count, information entropy, correlation surface main peak value, main-to-secondary peak ratio, main peak sharpness, feature scene information quantity, complexity and eccentricity. In general, at least the reference map gray variance, correlation surface main peak value and main peak sharpness should be selected. Selecting more parameters improves the accuracy of the assessment, but increases the parameter computation time and the number of input nodes of the artificial neural network.
(2) Determine the assessment result categories. The result may be unmatchable or matchable (denoted 0 and 1 respectively); or unmatchable, matchable with large error, or matchable with small error (denoted 0, 1 and 2 respectively); or unmatchable, matchable with high mismatch probability, or matchable with low mismatch probability (also denoted 0, 1 and 2 respectively); and so on.
(3) Construct a pattern-recognition artificial neural network that takes the matchability assessment parameter set as input and the assessment result as output. The network is a multi-layer feed-forward network trained with the BP (back-propagation) learning algorithm; each input node corresponds to one parameter of a sample, and the number of output-layer nodes equals the number of assessment result categories, each output node corresponding to one category.
(4) Train the neural network with prior knowledge. Real-time imaging maps are obtained by captive-flight or simulation tests and matched against the corresponding reference maps, and the correctness and accuracy of each match are confirmed by manual interpretation. The above parameters are computed for each reference map to form one sample. After a number of samples and their matching results have been accumulated, they are used to train the network: typically 80% of the samples are used for training and 20% for verification. If the assessment accuracy of the network still fails to meet the requirement after repeated training, the number of hidden nodes is adjusted and training is repeated until the requirement is met.
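The 80%/20% split of the accumulated samples described above can be sketched as follows (a minimal illustration; the function name, the fixed seed and the shuffling are assumptions, not details fixed by the patent):

```python
import numpy as np

def split_samples(X, Y, train_frac=0.8, seed=0):
    """Shuffle the accumulated samples and split them into a training
    set (about 80%) and a verification set (about 20%)."""
    idx = np.random.default_rng(seed).permutation(len(X))
    k = int(round(len(X) * train_frac))
    return (X[idx[:k]], Y[idx[:k]]), (X[idx[k:]], Y[idx[k:]])
```

The retraining loop of step (4) would then train on the first set, check accuracy on the second, and re-run with a different hidden-node count until the accuracy requirement is met.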
(5) Perform matchability assessment with the trained network. When assessing, compute the above parameters of the reference map and feed them to the network as input; the network's output is the assessment result.
The advantage of the invention is that, because pattern recognition is introduced, the matchability assessment method for imaging matching aided navigation needs no manually set thresholds for the assessment parameters, and newly accumulated matching knowledge can be used to further train the pattern-recognition artificial neural network, further improving the reliability of its judgments.
Brief description of the drawings
Fig. 1 is a flow diagram of the matchability assessment method for imaging matching aided navigation of the invention;
Fig. 2 is a structural diagram of the artificial neural network adopted by the invention;
Fig. 3a shows the training results of the artificial neural network in an embodiment of the method;
Fig. 3b shows the verification results of the artificial neural network in the embodiment;
Fig. 3c shows the test results of the artificial neural network in the embodiment;
Fig. 3d shows the overall results of the artificial neural network in the embodiment.
Embodiment
For a fuller understanding of the invention and of its objects and advantages, the invention is now described in detail with reference to the drawings and an embodiment.
In matchability assessment of radar scenes, the reference map gray variance, independent pixel count, information entropy, correlation surface main peak value, main-to-secondary peak ratio, main peak sharpness, feature scene information quantity, complexity and eccentricity are selected as the matchability assessment parameter set, i.e. as the neural network input. Here,
(1) Independent pixel count
The independent pixel count N is defined as:
N = (W / Lx) × (H / Ly)
where W is the image width, H is the image height, and Lx and Ly are the correlation lengths in the x and y directions; the correlation length is the displacement increment at which the autocorrelation coefficient drops to 0.368.
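As a sketch of the definition above: the correlation length can be estimated as the first lag at which the autocorrelation coefficient falls below 0.368 (≈ 1/e), after which N follows directly. A minimal NumPy illustration (the lag-search procedure is an assumption; the patent only gives the definition):

```python
import numpy as np

def correlation_length(img, axis):
    """First displacement at which the autocorrelation coefficient,
    computed over all pixel pairs at that lag, drops below 0.368."""
    x = img - img.mean()
    var = (x * x).mean()
    n = img.shape[axis]
    for lag in range(1, n):
        if axis == 1:
            r = (x[:, :-lag] * x[:, lag:]).mean() / var
        else:
            r = (x[:-lag, :] * x[lag:, :]).mean() / var
        if r < 0.368:
            return lag
    return n

def independent_pixel_count(img):
    """N = (W / Lx) * (H / Ly) for a 2-D gray image."""
    h, w = img.shape
    lx = correlation_length(img, axis=1)
    ly = correlation_length(img, axis=0)
    return (w / lx) * (h / ly)
```

For pure noise the correlation length is one pixel in each direction, so N equals the total pixel count; smoother images give longer correlation lengths and fewer independent pixels.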
(2) Information entropy
Let the set of gray levels taken by the pixels of an image be {b_i}, and let the probability of gray level b_i be P(b_i). The average information of the image, i.e. its information entropy, is:
H = -Σ_{i=1}^{L} P(b_i) · log2 P(b_i)
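A minimal sketch of this entropy computation (histogram-based, assuming 8-bit gray levels; terms with P(b_i) = 0 contribute nothing):

```python
import numpy as np

def image_entropy(img, levels=256):
    """Information entropy H = -sum_i P(b_i) * log2 P(b_i) over the
    gray levels actually present in the image."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # skip absent levels (0*log 0 := 0)
    return float(-(p * np.log2(p)).sum())
```

A constant image has entropy 0; an image split evenly between two gray levels has entropy 1 bit.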
(3) Correlation surface main peak sharpness
The correlation surface is computed with the normalized cross-correlation algorithm.
The main peak sharpness is characterized by the ratio of the mean of the 8 pixels surrounding the main peak of the correlation surface to the main peak value.
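The two quantities can be sketched as a brute-force normalized cross-correlation surface and the 8-neighbour mean-to-peak ratio (a minimal illustration; the edge handling and function names are assumptions, not specified by the patent):

```python
import numpy as np

def ncc_surface(ref, tmpl):
    """Normalized cross-correlation of a template over a reference
    image, evaluated at every fully-contained ("valid") position."""
    th, tw = tmpl.shape
    t = tmpl - tmpl.mean()
    tn = np.sqrt((t * t).sum())
    H = ref.shape[0] - th + 1
    W = ref.shape[1] - tw + 1
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            win = ref[i:i + th, j:j + tw]
            w = win - win.mean()
            denom = np.sqrt((w * w).sum()) * tn
            out[i, j] = (w * t).sum() / denom if denom > 0 else 0.0
    return out

def main_peak_sharpness(surface):
    """Mean of the 8 pixels around the main peak, divided by the peak."""
    i, j = np.unravel_index(np.argmax(surface), surface.shape)
    pad = np.pad(surface, 1, mode='edge')
    nb = pad[i:i + 3, j:j + 3].copy()   # 3x3 block centred on the peak
    peak = nb[1, 1]
    nb[1, 1] = 0.0
    return float(nb.sum() / 8.0 / peak)
```

A sharp, isolated peak gives a ratio well below 1; a broad, flat correlation surface gives a ratio close to 1, signalling poor matchability.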
(4) Feature scene information quantity
For a binary image, the information quantity can be computed as the ratio of the feature scene area to the total reference map area; a simpler method is to count the number of feature scene pixels.
(5) Feature scene complexity
For a binary image, the complexity of the feature scene is usually reflected by its perimeter and area. Let P denote the perimeter of the feature scene and A its area; the feature scene complexity C is then: C = P² / A.
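C = P²/A can be sketched with a pixel-count area and a boundary-pixel perimeter (estimating the perimeter as the number of foreground pixels with at least one background 4-neighbour is an assumption; the patent does not fix a discretization):

```python
import numpy as np

def complexity(binary):
    """C = P^2 / A for a binary feature scene: A is the foreground
    pixel count, P the number of foreground pixels that touch the
    background through a 4-neighbour."""
    b = binary.astype(bool)
    area = b.sum()
    pad = np.pad(b, 1)
    # a pixel is interior iff all four 4-neighbours are foreground
    interior = (pad[:-2, 1:-1] & pad[2:, 1:-1] &
                pad[1:-1, :-2] & pad[1:-1, 2:])
    perim = (b & ~interior).sum()
    return perim ** 2 / area
```

Compact shapes (low perimeter for their area) give small C; ragged, elongated feature scenes give large C.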
(6) Feature scene eccentricity
For a binary image, the eccentricity is computed as:
p/q = [(A + B) + √((A − B)² + 4H²)] / [(A + B) − √((A − B)² + 4H²)]
where A = Σ m_i (y_i² + z_i²) and B = Σ m_i (x_i² + z_i²) are the moments of inertia about the X and Y coordinate axes respectively, H = Σ m_i x_i y_i is the product of inertia, and m_i is the binary image pixel value.
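A minimal sketch under the definitions above, with z_i = 0 for a planar image; taking the coordinates about the centroid is an assumption made here so that the ratio is translation-invariant:

```python
import numpy as np

def eccentricity(binary):
    """Axis-length ratio p/q from the second-order moments of a
    binary image (m_i = 1 on the foreground, z_i = 0)."""
    ys, xs = np.nonzero(binary)
    x = xs - xs.mean()
    y = ys - ys.mean()
    A = (y * y).sum()          # moment of inertia about the X axis
    B = (x * x).sum()          # moment of inertia about the Y axis
    H = (x * y).sum()          # product of inertia
    d = np.sqrt((A - B) ** 2 + 4 * H ** 2)
    return ((A + B) + d) / ((A + B) - d)
```

A symmetric blob gives a ratio near 1; the more elongated the feature scene, the larger the ratio.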
The assessment results are matchable and unmatchable, denoted 1 and 2 respectively.
As shown in Figure 2, a BP artificial neural network with 9 inputs and 2 outputs is constructed; the number of hidden nodes is initially set to 10.
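A minimal NumPy sketch of such a 9-10-2 back-propagation network (sigmoid units, full-batch gradient descent on squared error; the learning rate, initialization and training loop are assumptions, not values given by the patent):

```python
import numpy as np

class BPNet:
    """9-input, 10-hidden, 2-output multi-layer feed-forward network
    trained with the back-propagation (BP) learning algorithm."""
    def __init__(self, n_in=9, n_hidden=10, n_out=2, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = 0.5 * rng.standard_normal((n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = 0.5 * rng.standard_normal((n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    @staticmethod
    def _sig(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, X):
        self.h = self._sig(X @ self.w1 + self.b1)   # hidden activations
        self.o = self._sig(self.h @ self.w2 + self.b2)
        return self.o

    def train_step(self, X, Y):
        """One full-batch gradient-descent step on squared error."""
        n = len(X)
        o = self.forward(X)
        do = (o - Y) * o * (1 - o)                  # output-layer delta
        dh = (do @ self.w2.T) * self.h * (1 - self.h)
        self.w2 -= self.lr * self.h.T @ do / n
        self.b2 -= self.lr * do.mean(0)
        self.w1 -= self.lr * X.T @ dh / n
        self.b1 -= self.lr * dh.mean(0)
        return float(((o - Y) ** 2).mean())
```

Each output node corresponds to one assessment category, so the predicted category is the index of the largest output; changing `n_hidden` when accuracy stalls corresponds to the hidden-node adjustment of step (4).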
57 samples were obtained, i.e. the above parameters were computed for the reference maps corresponding to 57 scene matching areas, and it was verified by simulation and by real-time image data obtained in captive flight whether matching succeeds in these areas. 42 samples randomly drawn from the 57 were used for neural network training. Of the training samples, 32 matchable areas (76.2%, denoted 1) were identified as matchable; 0 matchable areas (0.0%) were identified as unmatchable (denoted 2); 4 unmatchable areas (9.5%) were identified as matchable; and 6 unmatchable areas (14.3%) were identified as unmatchable. The recognition accuracy was thus 90.5% and the misjudgment rate 9.5%, as shown in Fig. 3a. Of the remaining 15 samples, 9 were randomly drawn to verify the network: recognition accuracy 88.9%, misjudgment rate 11.1%, as shown in Fig. 3b. The remaining 6 samples were used for testing: recognition accuracy 100%, misjudgment rate 0%, as shown in Fig. 3c. The combined training and test results are shown in Fig. 3d. As can be seen from Figs. 3a to 3d, in the training stage the network misjudged 4 unmatchable samples as matchable, in the verification stage it misjudged 1 matchable sample as unmatchable, and in the test stage all judgments were correct.
As described above, for aircraft using visible-light, infrared or radar imaging matching aided navigation, matchability can be assessed from the reference map before use. Parameters used for the assessment include the reference map gray variance, independent pixel count, information entropy, correlation surface main peak value, main-to-secondary peak ratio, main peak sharpness, feature scene information quantity, complexity and eccentricity. Because there are many parameters, selecting their threshold ranges for matchability assessment is difficult. The invention therefore proposes a matchability assessment technique based on pattern recognition: the assessment parameter set is the input of a pattern-recognition artificial neural network and the assessment result is its output; prior knowledge accumulated from captive-flight or simulation results is used to train and verify the network; and after repeated training, verification and tuning, the network can be used for matchability assessment. The method not only avoids the trouble of selecting parameter thresholds and the risk of misjudgment caused by improperly chosen thresholds, but also makes full use of continuously accumulated prior knowledge to progressively improve the accuracy of the assessment.
Finally, it should be noted that the above embodiment only illustrates, and does not restrict, the technical solution of the invention. Although the invention has been described in detail with reference to an embodiment, those of ordinary skill in the art should understand that modifications or equivalent substitutions of the technical solution of the invention that do not depart from its spirit and scope are all covered by the claims of the invention.

Claims (4)

1. A matchability assessment method for imaging matching aided navigation, the steps comprising:
(1) selecting the parameter set used for matchability assessment, said parameter set comprising: the reference map gray variance, the correlation surface main peak value and the main peak sharpness;
(2) determining the assessment result categories;
(3) constructing a pattern-recognition artificial neural network that takes the matchability assessment parameter set as input and the assessment result as output;
(4) training the pattern-recognition artificial neural network with prior knowledge;
wherein said training of the neural network with prior knowledge comprises: obtaining real-time imaging maps by captive-flight or simulation tests, matching them against the corresponding reference maps, and confirming the correctness and accuracy of each match by manual interpretation; computing the above parameters of each reference map to form one sample; accumulating a number of samples together with their matching results for training of the artificial neural network, 80% of the samples being used for training and 20% for verification; and, when the assessment accuracy of the network still fails to meet the requirement after repeated training, adjusting the number of hidden nodes and retraining until the requirement is met;
(5) performing matchability assessment with the neural network trained in step (4): the above parameters of a reference map are fed to the pattern-recognition artificial neural network as input, and the network outputs the assessment result.
2. The matchability assessment method for imaging matching aided navigation of claim 1, wherein the parameter set further comprises: the independent pixel count, information entropy, main-to-secondary peak ratio, feature scene information quantity, complexity and eccentricity.
3. The matchability assessment method for imaging matching aided navigation of claim 1, wherein the assessment result categories comprise: unmatchable or matchable; unmatchable, matchable with large error, or matchable with small error; or unmatchable, matchable with high mismatch probability, or matchable with low mismatch probability.
4. The matchability assessment method for imaging matching aided navigation of claim 1, wherein the pattern-recognition artificial neural network is a multi-layer feed-forward network trained with the back-propagation learning algorithm, each input node of the network corresponding to one parameter of a sample, the number of output-layer nodes equaling the number of assessment result categories, and each output node corresponding to one assessment result category.
CN201310048804.4A 2013-02-06 2013-02-06 Matchability assessment method for imaging matching aided navigation Expired - Fee Related CN103164853B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310048804.4A CN103164853B (en) 2013-02-06 2013-02-06 Matchability assessment method for imaging matching aided navigation


Publications (2)

Publication Number Publication Date
CN103164853A CN103164853A (en) 2013-06-19
CN103164853B true CN103164853B (en) 2015-12-09

Family

ID=48587909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310048804.4A Expired - Fee Related CN103164853B (en) 2013-02-06 2013-02-06 Matchability assessment method for imaging matching aided navigation

Country Status (1)

Country Link
CN (1) CN103164853B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106934455B (en) * 2017-02-14 2019-09-06 华中科技大学 Remote sensing image optics adapter structure choosing method and system based on CNN
CN116342912B (en) * 2023-05-30 2023-08-11 中国铁路设计集团有限公司 Heterogeneous remote sensing image matching method and system based on correlation peak analysis

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6085135A (en) * 1997-02-20 2000-07-04 Claas Kgaa Method for agricultural map image display
CN1311879A (en) * 1998-05-29 2001-09-05 印得拉网络技术有限公司 An autopoietic network system endowed with distributed artificial intelligence for the supply of high volume high speed multimedia telesthesia, telemetry, telekinesis, telepresence


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on the application of information entropy in navigation sensor fault diagnosis; Qian Huaming et al.; Journal of System Simulation; 2010-02-28; vol. 22; pp. 216-219 *
A new method for evaluating missile hit accuracy considering reference map errors; Xia Kehan et al.; Missiles and Space Vehicles; 2011-12-31; No. 5; pp. 38-39, 49 *

Also Published As

Publication number Publication date
CN103164853A (en) 2013-06-19


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20151209

Termination date: 20190206
