CN107038713A - A moving-target capture method fusing an optical flow method and a neural network - Google Patents

A moving-target capture method fusing an optical flow method and a neural network Download PDF

Info

Publication number
CN107038713A
CN107038713A CN201710236616.2A
Authority
CN
China
Prior art keywords
moving target
point
target
optical flow
snake
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710236616.2A
Other languages
Chinese (zh)
Inventor
朱平
甄子洋
覃海群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201710236616.2A priority Critical patent/CN107038713A/en
Publication of CN107038713A publication Critical patent/CN107038713A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30241 Trajectory

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a moving-target capture method that fuses an optical flow method with a neural network. First, the position of the moving target is detected in the image using an improved optical flow method. Then, the position of the moving target is detected in the image using a pulse-coupled neural network model. Finally, the detection results based on the optical flow method and on the neural network are combined by fused filtering to obtain an accurate moving-target position. The invention combines the advantages of optical flow and neural networks and can determine the position of a moving target accurately, quickly, and completely.

Description

A moving-target capture method fusing an optical flow method and a neural network
Technical field
The invention belongs to the technical field of image target detection, and more particularly relates to a moving-target capture method that fuses an optical flow method with a neural network.
Background technology
In recent years, with the development of multimedia, electronics, and communication technology, computer vision and digital image processing have received increasing attention from scholars and researchers at home and abroad. Moving-target detection is an important branch of computer vision and digital image processing technology. It is widely used in robot navigation, intelligent video surveillance, industrial inspection, aerospace, and many other fields, and is of great significance in both theoretical research and practical application. Moving-target detection separates moving objects from images that contain background. There is currently extensive research in this area at home and abroad; optical flow, frame differencing, and background differencing are all common methods.
In 1981, Horn and Schunck proposed the basic optical flow equation and a method for computing the optical flow field. Optical flow has important applications in object segmentation, recognition, tracking, and military navigation. Compared with work abroad, domestic research on optical-flow-based localization, navigation, and control of multi-rotor UAVs started late. Through efforts in recent years, domestic institutions such as the National University of Defense Technology, Northwestern Polytechnical University, and Harbin Institute of Technology have achieved considerable results in UAV navigation based on polarized-light navigation methods, but research based on optical flow is still in its infancy. The optical flow method maps particular motions captured by the camera to the characteristic point-transformation rules of a certain class; combined with optical flow sensors and image processing techniques, it yields data such as the position, attitude, and velocity of an object. Thompson proposed a method that obtains information about an object from the difference between the background optical flow direction determined by the motion epipolar constraint and the optical flow direction of the moving target. Yang Wei et al. of Harbin Institute of Technology proposed a new detection method for complex environments, but its applicability is limited. Constrained by the state of the art, the overall research is still at an early stage. Compared with research abroad, China still lags in the concepts, theoretical methods, and practical applications of UAV optical-flow localization, navigation, and control; further in-depth research in this field is therefore urgently needed.
Summary of the invention
To solve the technical problems raised in the background above, the present invention provides a moving-target capture method that fuses an optical flow method with a neural network. The position information detected by the optical flow method and by the neural network is fused, so that the target detection result is more complete and accurate, and the influence of camera motion is eliminated to a certain extent, making the detection result more reasonable and reliable.
To achieve the above technical purpose, the technical scheme of the invention is:
A moving-target capture method fusing an optical flow method and a neural network, comprising the following steps:
(1) detecting the position of the moving target in the image using an improved optical flow method: first, the rough contour of the moving target is selected by the Snake greedy algorithm, and then the position of the moving target is detected using the LK algorithm;
(2) detecting the position of the moving target in the image using a pulse-coupled neural network model;
(3) performing fused filtering on the target detection results of step (1) and step (2) to obtain an accurate moving-target position.
Further, in step (1), the Snake greedy algorithm is as follows:
Let A_i be the i-th snake point. The energy equation is discretized as:
E = \sum_{i=1}^{n} \min_{j} [ \alpha E_{cont}(A_{i,j}) + \beta E_{curv}(A_{i,j}) + \gamma E_{img}(A_{i,j}) ]   (1)
In formula (1), winsize is the size of the search window, A_{i,j} denotes candidate position j of the i-th snake point within the search window, and j ranges over the points of the window other than the snake point itself;
E_{cont}(A_{i,j}), E_{curv}(A_{i,j}), E_{img}(A_{i,j}) are the continuity energy, the curvature energy, and the image energy respectively, and α, β, γ are the corresponding weights. Their discrete values are:
E_{cont}(A_{i,j}) = | \bar{d} - \|A_{i,j} - A_{i-1}\| |   (2)
E_{curv}(A_{i,j}) = \|A_{i-1} - 2A_{i,j} + A_{i+1}\|^2   (3)
E_{img}(A_{i,j}) = (minGray - I(A_{i,j})) / (maxGray - minGray)   (4)
In formulas (2)-(4), \bar{d} is the average distance between snake points, \bar{d} = (1/n) \sum_{t=1}^{n} \|A_t - A_{t-1}\|, n is the number of snake points chosen, t is the index in the order the snake points were chosen, I(A_{i,j}) is the gray value at position j of the i-th snake point within the search window, and maxGray and minGray are the maximum and minimum gray values in the window, respectively.
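The greedy snake update described above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the energy forms follow the standard Williams-Shah greedy snake that formulas (2)-(4) describe, the image is a plain list-of-lists grayscale array, and all function and parameter names are hypothetical.

```python
import math

def greedy_snake_step(img, snake, alpha=1.0, beta=1.0, gamma=1.2, winsize=3):
    """One greedy pass: move each snake point to the window position that
    minimizes alpha*E_cont + beta*E_curv + gamma*E_img (illustrative forms)."""
    n = len(snake)
    # average inter-point distance d_bar used by the continuity term
    d_bar = sum(math.dist(snake[i], snake[(i + 1) % n]) for i in range(n)) / n
    h, w = len(img), len(img[0])
    new_snake = []
    for i, (x, y) in enumerate(snake):
        px, py = snake[i - 1]           # previous snake point (wraps around)
        nx_, ny_ = snake[(i + 1) % n]   # next snake point
        best, best_pt = None, (x, y)
        for dx in range(-winsize, winsize + 1):
            for dy in range(-winsize, winsize + 1):
                cx, cy = x + dx, y + dy
                if not (0 <= cx < w and 0 <= cy < h):
                    continue
                # local gray range for the image-energy normalisation
                grays = [img[yy][xx]
                         for yy in range(max(0, cy - winsize), min(h, cy + winsize + 1))
                         for xx in range(max(0, cx - winsize), min(w, cx + winsize + 1))]
                g_min, g_max = min(grays), max(grays)
                e_cont = abs(d_bar - math.dist((cx, cy), (px, py)))
                e_curv = (px - 2 * cx + nx_) ** 2 + (py - 2 * cy + ny_) ** 2
                e_img = (g_min - img[cy][cx]) / max(g_max - g_min, 1)
                e = alpha * e_cont + beta * e_curv + gamma * e_img
                if best is None or e < best:
                    best, best_pt = e, (cx, cy)
        new_snake.append(best_pt)
    return new_snake
```

In practice the pass is repeated until the number of points that move falls below a threshold, which matches the convergence behaviour the text attributes to the greedy variant.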
Further, in step (1), the LK algorithm is as follows:
Let I and J be two consecutive frames whose gray values at point (x, y) are I(x, y) and J(x, y) respectively. Let u = [u_x, u_y]^T be a point on image I; find the point v = u + d = [u_x + d_x, u_y + d_y]^T on image J such that I(u) and J(v) are the same position. d = [d_x, d_y]^T is the displacement of point v relative to point u, obtained by minimizing a difference function;
Define the difference function:
e(d) = \sum_{x=u_x-w_x}^{u_x+w_x} \sum_{y=u_y-w_y}^{u_y+w_y} ( I(x,y) - J(x+d_x, y+d_y) )^2   (5)
In formula (5), w_x and w_y are the window extents of the point in the x and y directions;
To solve this optimization problem, take the partial derivative of the difference function e(d) with respect to d, set it equal to 0, and obtain:
d = G^{-1} b   (6)
In formula (6), G = \sum_{x,y} [ I_x^2, I_x I_y ; I_x I_y, I_y^2 ] and b = \sum_{x,y} [ \delta I \, I_x ; \delta I \, I_y ] with \delta I = I(x,y) - J(x,y), where I_x and I_y are the partial derivatives of I(x, y) with respect to x and y. G must be invertible; that is, the gradient of I(x, y) in the x and y directions must not be 0;
The above optimization is iterated several times to solve for the optimal displacement vector d.
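The d = G^{-1} b solve of formula (6) can be illustrated for a single window and a single iteration. This is a sketch under assumed central-difference gradients on a list-of-lists image; a practical LK tracker iterates this step and works over an image pyramid, and the function name and window size are illustrative.

```python
def lk_displacement(I, J, u, w=2):
    """Single LK step at integer point u = (ux, uy): accumulates the 2x2
    matrix G and vector b over a (2w+1)^2 window and returns d = G^-1 b."""
    ux, uy = u
    Gxx = Gxy = Gyy = bx = by = 0.0
    for y in range(uy - w, uy + w + 1):
        for x in range(ux - w, ux + w + 1):
            # central-difference spatial gradients of I
            Ix = (I[y][x + 1] - I[y][x - 1]) / 2.0
            Iy = (I[y + 1][x] - I[y - 1][x]) / 2.0
            dI = I[y][x] - J[y][x]          # temporal difference delta-I
            Gxx += Ix * Ix; Gxy += Ix * Iy; Gyy += Iy * Iy
            bx += dI * Ix;  by += dI * Iy
    det = Gxx * Gyy - Gxy * Gxy
    if abs(det) < 1e-12:                    # G not invertible: gradient ~ 0
        return (0.0, 0.0)
    # closed-form inverse of the 2x2 system G d = b
    return ((Gyy * bx - Gxy * by) / det, (Gxx * by - Gxy * bx) / det)
```

The invertibility guard reflects the aperture problem the text mentions: when the window's gradients all point one way, det(G) vanishes and no unique displacement exists.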
Further, in step (2), moving-target detection is carried out with the pulse-coupled neural network model. The two-dimensional pixel matrix M × N is regarded as M × N pulse-coupled neurons, and the gray value of each pixel is the input of the corresponding neuron;
For moving-target detection, when pixels of similar gray value exist within the neighbourhood defined by the connection weight matrix, the pulse output of one pixel will trigger the firing of the neurons corresponding to other pixels of similar gray value, producing a pulse output sequence. Pixels are processed as follows:
F_{ij}[n] = | x_{ij}(t) - \mu_{ij}(t-1) | / \sigma_{ij}(t-1)   (7)
In formula (7), x_{ij}(t) is the value of the pixel currently processed, \mu_{ij}(t-1) is the mean of the model of the corresponding pixel that has been decided to be background, \sigma_{ij}(t-1) is the variance of the corresponding model, and F_{ij}[n] is the input of each neuron;
The threshold \theta_{ij}[n] comprises two pairs of values, TH1/TL1 and TH2/TL2. The first decision uses TH1 and TL1; the second and later decisions use TH2 and TL2, i.e.:
\theta_{ijl}[1] = TL1, \theta_{ijh}[1] = TH1; \theta_{ijl}[n] = TL2, \theta_{ijh}[n] = TH2 (n > 1)   (8)
In formula (8), n is the segmentation number;
According to the basic principle of the pulse-coupled neural network model and the dual threshold, the system output is:
Y_{ij}[n] = 1 if U_{ij}[n] > \theta_{ijh}[n]; Y_{ij}[n] = RP if \theta_{ijl}[n] < U_{ij}[n] \le \theta_{ijh}[n]; Y_{ij}[n] = 0 otherwise   (9)
In formula (9), U_{ij}[n] is the internal activity of the neuron, U_{ij}[n] = F_{ij}[n](1 + \beta C_{ij}[n]), C_{ij}[n] is the external stimulus of the neuron, \beta is the synaptic link strength, and RP denotes processing by the region-correlation method;
The above process is iterated several times. The first iteration detects the motion points whose gray values differ most markedly from the background; taking these motion points as seeds, the iteration continues, and terminates when the number of newly detected motion points is zero.
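The seeded dual-threshold scheme described above can be sketched in pure Python. This is a simplified illustration, not the patent's exact network: the background mean/variance arrays, the threshold values, and the 8-neighbour region growing that stands in for pulse coupling are all assumptions made for the sketch.

```python
def pcnn_motion_detect(frame, bg_mean, bg_std, th_hi=3.0, th_lo=1.5):
    """Dual-threshold seeded detection: pixels far from the background model
    (> th_hi sigma) fire first; their neighbours then fire at the lower
    threshold th_lo, mimicking pulse coupling between similar pixels."""
    h, w = len(frame), len(frame[0])
    # normalised deviation from the background model, as in formula (7)
    ratio = [[abs(frame[y][x] - bg_mean[y][x]) / max(bg_std[y][x], 1e-6)
              for x in range(w)] for y in range(h)]
    fired = [[ratio[y][x] > th_hi for x in range(w)] for y in range(h)]
    changed = True
    while changed:                      # iterate until no new motion points
        changed = False
        for y in range(h):
            for x in range(w):
                if fired[y][x]:
                    continue
                # fire at the lower threshold only next to an already-fired neuron
                if ratio[y][x] > th_lo and any(
                        fired[yy][xx]
                        for yy in range(max(0, y - 1), min(h, y + 2))
                        for xx in range(max(0, x - 1), min(w, x + 2))):
                    fired[y][x] = True
                    changed = True
    return fired
```

The termination condition (loop exits when no pixel changed) corresponds to the text's rule of stopping when the number of newly detected motion points is zero.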
Further, in step (3), a track association test is first performed on the two sets of target position data obtained in steps (1) and (2), to confirm that the obtained target position data come from the same target. The spatial-location method is then applied to the two sets of position data to find the fusion set formed by corresponding objects, and the thresholded name-attribute similarity method is used to exclude erroneous objects found by the spatial-location method.
Further, the track association test proceeds as follows:
Let \hat{X}_i and \hat{X}_j be the state estimates from the two target detection methods, let P_i and P_j be their covariances, and let P_{ij} and P_{ji} be the cross-covariances of the target state estimation errors, with:
P_{ij} = P_{ji}^T   (10)
Compute the association matrix:
d_{ij} = (\hat{X}_i - \hat{X}_j)^T (P_i + P_j - P_{ij} - P_{ji})^{-1} (\hat{X}_i - \hat{X}_j)   (11)
Each element value of d_{ij} is compared with a set threshold to judge whether \hat{X}_i and \hat{X}_j come from the same target.
Further, the spatial-location method uses the closest-neighbour method:
Let A and B be the sets of target position data from the two target detection methods; a confidence c({a, b}) is then computed for each candidate pair (formula (14));
In formula (14), a ∈ A, b ∈ B, c({a, b}) is the confidence of the corresponding-object pair (a, b), a_2 is the second-nearest neighbour of b within A, and b_2 is the second-nearest neighbour of a within B;
If the confidence c({a, b}) exceeds the preset threshold, the pair is taken to be a correct corresponding-object pair, and these corresponding objects form the fusion set.
Beneficial effects of the above technical scheme:
Dynamic target detection must both cope with the influence of camera motion and meet real-time, accuracy, and reliability requirements. The present invention uses a target detection method that fuses optical flow with a neural network: the position information detected by the optical flow sensor and by the neural network method is processed and fused by estimation, so that the target detection result is more complete and accurate, and the influence of camera motion is eliminated to a certain extent, making the detected target more reasonable and reliable. This is a new advance in moving-target detection. Target detection and tracking based on neural networks has great advantages, because artificial neural networks possess self-learning, associative memory, computation, and intelligent control capabilities, and can obtain strong nonlinear processing ability through the composition of simple nonlinear units.
Brief description of the drawings
Fig. 1 is the overall flowchart of the present invention;
Fig. 2 is the structure of the pulse-coupled neural network used in the present invention;
Fig. 3 is a schematic diagram of the detection results of the embodiment;
Fig. 4 compares the results of the three detection methods;
Fig. 5 compares the detection rates of the three detection methods.
Detailed description of the embodiments
The technical scheme of the invention is described in detail below with reference to the accompanying drawings.
A moving-target capture method fusing an optical flow method and a neural network, as shown in Fig. 1, comprises the following steps.
Step 1: first, the motion vectors of the pixels on the target contour are detected by the optical flow sensor and the corresponding algorithms, yielding the position of each pixel.
The present invention performs optical flow target detection with an improved algorithm that combines the Snake greedy algorithm with the classical LK algorithm. The Snake greedy algorithm is based on boundary information; it is sensitive to the position of the contour curve and converges with high accuracy. After the initial contour of the target has been accurately selected by the Snake algorithm, the classical LK algorithm can detect the position of the target quickly and stably.
The traditional Snake algorithm is boundary-based; it therefore places high demands on the initial position of the contour curve and converges with low precision. A greedy algorithm is introduced as an improvement. The algorithm is as follows:
Let A_i be the i-th snake point. The energy equation is discretized as:
E = \sum_{i=1}^{n} \min_{j} [ \alpha E_{cont}(A_{i,j}) + \beta E_{curv}(A_{i,j}) + \gamma E_{img}(A_{i,j}) ]   (1)
In formula (1), A_{i,j} denotes candidate position j of the i-th snake point within the search window, and j ranges over the points of the window other than the snake point itself.
E_{cont}(A_{i,j}), E_{curv}(A_{i,j}), E_{img}(A_{i,j}) are the continuity energy, the curvature energy, and the image energy respectively, and α, β, γ are the corresponding weights. Their discrete values are:
E_{cont}(A_{i,j}) = | \bar{d} - \|A_{i,j} - A_{i-1}\| |   (2)
E_{curv}(A_{i,j}) = \|A_{i-1} - 2A_{i,j} + A_{i+1}\|^2   (3)
E_{img}(A_{i,j}) = (minGray - I(A_{i,j})) / (maxGray - minGray)   (4)
In formulas (2)-(4), \bar{d} is the average distance between snake points, \bar{d} = (1/n) \sum_{t=1}^{n} \|A_t - A_{t-1}\|, n is the number of snake points chosen, t is the index in the order the snake points were chosen, I(A_{i,j}) is the gray value at position j of the i-th snake point within the search window, and maxGray and minGray are the maximum and minimum gray values in the window, respectively.
On the basis of the Snake greedy algorithm, i.e. once the rough contour of the moving target has been circled, the LK algorithm obtains the motion vector by iteration and thereby the position of the moving target.
The LK algorithm is feature-point based, and each feature point here corresponds to a small window image patch. Suppose I and J are two consecutive frames whose gray values at point (x, y) are I(x, y) and J(x, y). Let u = [u_x, u_y]^T be a point on image I; the LK algorithm finds the point v = u + d = [u_x + d_x, u_y + d_y]^T on image J such that I(u) and J(v) are the same position. LK evaluates the similarity of the pixels within the two small windows. d = [d_x, d_y]^T, the displacement of point v relative to point u, is obtained by minimizing the difference function
e(d) = \sum_{x=u_x-w_x}^{u_x+w_x} \sum_{y=u_y-w_y}^{u_y+w_y} ( I(x,y) - J(x+d_x, y+d_y) )^2.   (5), (6)
For this optimization problem, the solution method is to take the partial derivative of e(d) with respect to the vector d and set it equal to 0:
\partial e(d) / \partial d = -2 \sum_{x,y} ( I(x,y) - J(x+d_x, y+d_y) ) [ \partial J/\partial x, \partial J/\partial y ] = 0.   (7)
Expand J(x + d_x, y + d_y) by its Taylor series:
J(x+d_x, y+d_y) \approx J(x,y) + d_x \, \partial J/\partial x + d_y \, \partial J/\partial y,   (8)
and substitute it into formula (7). The partial derivatives of image J(x, y) here can be approximated by the partial derivatives of I(x, y). Setting
\delta I = I(x, y) - J(x, y),   (9)
we thereby obtain
\partial e(d) / \partial d \approx -2 \sum_{x,y} ( \delta I - [ I_x, I_y ] \, d ) [ I_x, I_y ].   (10)
Transposing both sides gives:
(1/2) [ \partial e(d) / \partial d ]^T \approx \sum_{x,y} ( [ I_x^2, I_x I_y ; I_x I_y, I_y^2 ] d - [ \delta I \, I_x ; \delta I \, I_y ] ).   (11)
To simplify the computation, substitute simple symbols for the two parts, setting
G = \sum_{x,y} [ I_x^2, I_x I_y ; I_x I_y, I_y^2 ], \quad b = \sum_{x,y} [ \delta I \, I_x ; \delta I \, I_y ].   (12), (13)
The derivative condition after simplification becomes:
(1/2) [ \partial e(d) / \partial d ]^T \approx G d - b.   (14)
Setting the above equal to 0 yields the displacement d:
d = G^{-1} b.   (15)
In formula (11) and thereafter, G must be invertible; that is, the gradient of image I(x, y) in the x and y directions must not be 0. In a concrete implementation this derivation must be iterated several times to obtain an accurate displacement vector; the iterative process is a process of gradual convergence to the optimum.
When the moving target is in a complex natural environment, the interleaving of target and background greatly increases the difficulty of segmenting the target accurately, and neither the LK optical flow algorithm alone nor the Snake algorithm alone can solve this problem well. The combination of the Snake greedy algorithm and the classical LK algorithm proposed by the present invention solves it well. Moreover, because Snake correctly selects the initial contour of the target, the convergence speed of the algorithm is accelerated and its computational accuracy is improved to a certain extent.
Step 2: the position of the target is then detected based on the neural network, yielding the position of each pixel.
The present invention uses the pulse-coupled neural network model. This model exploits the neuron's characteristic linear additive and nonlinear multiplicative modulation coupling; it also takes into account the ion-channel characteristics of biological electrical pulse transmission, and the property that, when the receptive field of the mammalian visual nervous system is properly stimulated, adjacent connected neurons fire pulses simultaneously. A pulse-coupled neural network is a two-dimensional single-layer neuron array composed of pulse-coupled neurons, with the structure shown in Fig. 2:
The feedback input channel receives not only input information from adjacent neurons but also stimulus information directly from the outside. Its operation is:
F_{ij}[n] = F_{ij}[n-1] e^{-a_F} + I_{ij} + V_F \sum M_{ijkl} Y_{kl}[n-1]
In the formula above, F_{ij}[n] is the feedback input of the (i, j)-th neuron, and I_{ij} is the external stimulus signal (in the present invention, the gray value at position (i, j) of the image pixel matrix).
The nonlinear connection-modulation part, also called the internal activity of the neuron, is obtained by multiplying the linear connection part with the feedback input. Its operation is:
U_{ij}[n] = F_{ij}[n] (1 + \beta C_{ij}[n])
In the formula above, U_{ij}[n] is the internal activity of the neuron and \beta is the synaptic link strength.
Whether a pulse is produced depends on whether the internal activity exceeds the dynamic threshold, and this threshold changes with the output state of the neuron. Its operation is:
E_{ij}[n] = E_{ij}[n-1] e^{-a_E} + V_E Y_{ij}[n-1]
In the formula above, E_{ij} is the threshold function.
Moving-target detection is performed with the pulse-coupled neural network model: the two-dimensional pixel matrix M × N is regarded as M × N pulse-coupled neurons, and the gray value of each pixel is the input S of the corresponding neuron. The pulse-coupled neural network model is a single-layer neural network; unlike traditional multi-layer networks, it can perform pattern recognition, image segmentation, and target classification without a training process. It also has a sound theoretical foundation and good analytical performance, handles multi-modal backgrounds well, and can be used in the practical motion-detection field.
For moving-target detection, when pixels of similar gray value exist within the neighbourhood defined by the connection weight matrix, the pulse output of one pixel will trigger the firing of the neurons corresponding to other pixels of similar gray value, producing a pulse output sequence. Pixels are processed as follows:
F_{ij}[n] = | x_{ij}(t) - \mu_{ij}(t-1) | / \sigma_{ij}(t-1)   (16)
In formula (16), x_{ij}(t) is the value of the pixel currently processed, \mu_{ij}(t-1) is the mean of the model of the corresponding pixel that has been decided to be background, \sigma_{ij}(t-1) is the variance of the corresponding model, and F_{ij}[n] is the input of each neuron.
The threshold \theta_{ij}[n] consists of an upper and a lower value. The first decision uses the pair of thresholds TH1 and TL1; the second and all later segmentations use the other pair, TH2 and TL2, i.e.:
\theta_{ijl}[1] = TL1, \theta_{ijh}[1] = TH1   (17)
\theta_{ijl}[n] = TL2, \theta_{ijh}[n] = TH2, n > 1
where n denotes the n-th segmentation. According to the basic principle of the pulse-coupled neural network model and the dual threshold, the system output is:
Y_{ij}[n] = 1 if U_{ij}[n] > \theta_{ijh}[n]; Y_{ij}[n] = RP if \theta_{ijl}[n] < U_{ij}[n] \le \theta_{ijh}[n]; Y_{ij}[n] = 0 otherwise   (18)
In formula (18), RP denotes processing by the region-correlation method.
The first iteration detects the points whose gray values differ most markedly from the background; these detected motion points serve as seeds for further iterations. When an iteration completes with zero newly detected motion points, the iteration terminates.
When foreground targets are present, neural-network-based target detection can complete target detection against complex dynamic backgrounds well.
Step 3: finally, the target positions obtained by the two methods are fused and filtered to produce the exact position.
The present invention fuses the position information from optical flow and from the neural network by a track association method and a fusion algorithm. Track association, also called track pairing, determines whether one or more tracks obtained by target detection come from the same target. A track association algorithm computes an association matrix and selects the best association hypothesis, usually realized through some assignment algorithm. Fusing the position information of different tracks requires considering the correlation between state estimates; the fusion of different information can then obtain objective and fair target information even when the application purposes and construction backgrounds differ widely.
The technical route is as follows:
First, a track association test is performed on the two collected sets of position information and the association matrix is computed, to confirm that the obtained position information comes from the same target. The spatial-location method is then applied to the two sets of position data to find the fusion set formed by corresponding objects, and the thresholded name-attribute similarity method is used to exclude erroneous objects found by the spatial-location method. Finally, the thresholded name-attribute similarity method is used to find the corresponding objects that the previous methods failed to find.
When using the track association algorithm, the association matrix must be computed. Here it is computed with the following method:
Let \hat{X}_i and \hat{X}_j be the state estimates from the different sensors i and j, let P_i and P_j be their covariances, and let P_{ij} and P_{ji} be the cross-covariances of the target state estimation errors, with:
P_{ij} = P_{ji}^T   (19)
The association matrix is computed as follows:
d_{ij} = (\hat{X}_i - \hat{X}_j)^T (P_i + P_j - P_{ij} - P_{ji})^{-1} (\hat{X}_i - \hat{X}_j)   (20)
After computing the association matrix d_{ij}, it is judged against the known threshold relation, i.e. whether the two estimates come from the same target.
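The association test can be sketched for 2-D position estimates with the cross-covariance symmetry of formula (19). The gate value 9.21 (roughly the 99% chi-square point with 2 degrees of freedom) and the function name are illustrative assumptions, not values from the patent.

```python
def track_gate(xi, xj, Pi, Pj, Pij, threshold=9.21):
    """Mahalanobis-style association test between two 2-D position estimates.
    Returns True when the estimates may come from the same target."""
    # C = Pi + Pj - Pij - Pji, with Pji = Pij^T
    c00 = Pi[0][0] + Pj[0][0] - Pij[0][0] - Pij[0][0]
    c01 = Pi[0][1] + Pj[0][1] - Pij[0][1] - Pij[1][0]
    c10 = Pi[1][0] + Pj[1][0] - Pij[1][0] - Pij[0][1]
    c11 = Pi[1][1] + Pj[1][1] - Pij[1][1] - Pij[1][1]
    det = c00 * c11 - c01 * c10
    dx, dy = xi[0] - xj[0], xi[1] - xj[1]
    # d = (xi - xj)^T C^-1 (xi - xj), expanded with the 2x2 inverse
    d = (c11 * dx * dx - (c01 + c10) * dx * dy + c00 * dy * dy) / det
    return d < threshold
```

With independent unit-covariance estimates, nearby detections pass the gate while distant ones are rejected, which is the behaviour the threshold comparison in the text describes.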
Considering the correlation between state estimates, a position-information fusion algorithm is proposed. It must satisfy an assumption that is implicit in the fusion algorithm; K. C. Chang et al. stated and proved that this assumption holds. The assumption concerns a mathematical expectation, namely the conditional expectation of \hat{X}_j given D_i, where D_i is the information available to sensor i; the condition for the assumption to hold is that sensor i's information can be obtained from the other sensor.
The state estimation fusion formula is given as follows:
\hat{X} = \hat{X}_i + (P_i - P_{ij}) (P_i + P_j - P_{ij} - P_{ji})^{-1} (\hat{X}_j - \hat{X}_i)   (21)
This formula gives near-optimal fusion performance. In real fusion, process noise and measurement noise are always present and have non-zero mean; this fusion fully accounts for that situation without producing excessive performance degradation, and plays an important role in the fusion of position information.
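The state-estimation fusion step can be sketched in scalar form, assuming the standard Bar-Shalom/Campo track-fusion formula that the K. C. Chang line of work is associated with; the patent's own formula images are not reproduced here, so this is an assumed form, and the function name is hypothetical.

```python
def fuse_estimates(xi, xj, Pi, Pj, Pij):
    """Scalar track-to-track fusion (assumed Bar-Shalom/Campo-style form):
    x = xi + (Pi - Pij) / (Pi + Pj - 2*Pij) * (xj - xi)."""
    gain = (Pi - Pij) / (Pi + Pj - 2.0 * Pij)
    x_fused = xi + gain * (xj - xi)
    # fused variance, which drops below min(Pi, Pj) when the
    # cross-covariance between the two estimation errors is small
    P_fused = Pi - (Pi - Pij) ** 2 / (Pi + Pj - 2.0 * Pij)
    return x_fused, P_fused
```

With equal variances and zero cross-covariance the fused estimate is the midpoint with half the variance, illustrating why the fused position is more accurate than either source alone.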
The method based on spatial location finds the correct fusion set using only the different position information; the present invention uses the closest-neighbour method. Suppose there are two data sets A and B from different position-information sources, with A = {a_1, a_2, …, a_m} and B = {b_1, b_2, …, b_n}. The closest-neighbour method computes a confidence c({a, b}), where a ∈ A, b ∈ B, c({a, b}) is the confidence of the corresponding-object pair (a, b), a_2 is the second-nearest neighbour of b within A, and b_2 is the second-nearest neighbour of a within B. If the confidence exceeds the given threshold, the pair is taken as a correct corresponding pair.
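The exact form of the confidence c({a, b}) is not reproduced in the text, so the following is one plausible instantiation built from the quantities the text names: the pair distance and the second-nearest-neighbour distances a_2 and b_2. All names here are illustrative assumptions.

```python
import math

def match_confidence(a, b, A, B):
    """Confidence of the corresponding-object pair (a, b): high when a and b
    are much closer to each other than to their second-nearest candidates."""
    def second_nearest_dist(p, S):
        d = sorted(math.dist(p, q) for q in S)
        return d[1] if len(d) > 1 else float("inf")
    dab = math.dist(a, b)
    a2 = second_nearest_dist(b, A)   # b's second-neighbour distance within A
    b2 = second_nearest_dist(a, B)   # a's second-neighbour distance within B
    denom = min(a2, b2)
    return 1.0 - dab / denom if denom > 0 else 0.0
```

A well-separated correct pair scores near 1 and a mismatched pair near 0, so a single threshold cleanly separates the fusion set from spurious pairings, as the text requires.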
The fusion method proposed by the present invention is a fusion based on position data from different sources. The problem it mainly solves is to find corresponding objects accurately in position data sets from different sources; these corresponding objects then form the fusion set, finally realizing the fusion of position information from different sources.
After fusing the position information from different sources, the thresholded name-attribute similarity method is used to exclude the erroneous objects found by the spatial-location method, further improving the accuracy of the algorithm.
This embodiment is verified by simulation with Microsoft Visual Studio, performing target detection on moving vehicles in pre-recorded video. To demonstrate the advantage of the proposed detection method, moving targets are detected with three methods: the improved optical flow method, the neural network method, and the method fusing optical flow with the neural network; the detection results of the three methods are then compared. Detection proceeds in two steps: first the images captured by the camera are processed; then the moving target is detected based on the position-information fusion technique.
Detection of the moving targets in the video produces the results shown in Fig. 3. To show the advantage of target detection based on the fusion of optical flow and the neural network more intuitively, the moving target is also detected with the other two methods (the improved optical flow method and the neural network method) and compared with the method proposed here. Fig. 4 shows the comparison of the simulation results of the three methods.
The detection rates of the three methods are shown in Fig. 5. The detection rate is the percentage of the whole moving-target area that is detected. In Fig. 5, the curve "Fusion" represents the detection rate of the moving-target detection method based on the fusion of optical flow and the neural network; the curve "OF" represents the detection rate of the optical-flow-based method; the curve "NN" represents the detection rate of the neural-network-based method.
The simulation results in the figure show that neither target detection based on optical flow alone nor target detection based on the neural network alone can obtain an accurate detection result or eliminate the influence caused by camera motion. In Fig. 5, the detection rate of "Fusion", the method proposed here, is about 0.8, while the detection rates of the other two methods are both about 0.6. The method proposed by the invention, i.e. target detection based on the fusion of optical flow and the neural network, not only detects the complete target but also eliminates the influence of camera motion well. In addition, this detection method is quick and stable, and its result is clearly visible, making it a particularly practical method for moving-target detection.
The simulation results above show that neither the optical flow method alone nor the neural network alone can obtain an accurate detection result or eliminate the influence of camera motion, while the proposed detection method based on the fusion of the optical flow method and the neural network not only detects the complete target but also eliminates the influence of camera motion well.
The embodiments merely illustrate the technical idea of the invention and cannot limit its scope of protection; any change made on the basis of the technical scheme according to the technical idea proposed by the invention falls within the scope of protection of the invention.

Claims (7)

1. A moving-target capture method fusing an optical flow method and a neural network, characterized by comprising the following steps:
(1) detecting the position of the moving target in the image using an improved optical flow method: first, the rough contour of the moving target is selected by the Snake greedy algorithm, and then the position of the moving target is detected using the LK algorithm;
(2) detecting the position of the moving target in the image using a pulse-coupled neural network model;
(3) performing fused filtering on the target detection results of step (1) and step (2) to obtain an accurate moving-target position.
2. the moving target method for catching of optical flow method and neutral net is merged according to claim 1, it is characterised in that:In step Suddenly in (1), the Snake greedy algorithms are as follows:
Let Ai be the i-th snake point; the energy equation is then discretized as:

E(Ai) = min_j [ αEcont(Ai,j) + βEcurv(Ai,j) + γEimg(Ai,j) ]   (1)

In formula (1), winsize is the size of the search window, Ai,j denotes the candidate position of the i-th snake point at position j in the search window, and j ranges over the points in the window other than the current snake point;

Econt(Ai,j), Ecurv(Ai,j) and Eimg(Ai,j) are respectively the continuity energy, the curvature energy and the image energy, and α, β, γ denote the corresponding weights; their discrete values are:

Econt(Ai,j) = | d̄ − |Ai,j − Ai−1| |   (2)
Ecurv(Ai,j) = |Ai−1 − 2Ai,j + Ai+1|²   (3)
Eimg(Ai,j) = (minGray − I(Ai,j)) / (maxGray − minGray)   (4)

In formulas (2)-(4), d̄ = (1/n)·Σt |At − At−1| is the average distance between snake points, n is the number of snake points chosen, t is the index of a snake point in selection order, I(Ai,j) denotes the gray value of the i-th snake point at position j in the search window, and maxGray and minGray are respectively the maximum and minimum gray values in the window.
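The greedy update of formulas (1)-(4) can be sketched as follows; this is an illustrative reimplementation (function and variable names are our own), with the image, candidate window and weights chosen arbitrarily:

```python
import numpy as np

def snake_greedy_step(pts, i, candidates, gray, alpha=1.0, beta=1.0, gamma=1.0):
    """Move the i-th snake point to the search-window candidate that minimizes
    the discretized energy of formula (1)."""
    n = len(pts)
    # d_bar: average distance between consecutive snake points (contour wraps around)
    d_bar = np.mean([np.linalg.norm(pts[t] - pts[t - 1]) for t in range(n)])
    prev_pt, next_pt = pts[i - 1], pts[(i + 1) % n]
    gray_vals = [float(gray[c]) for c in candidates]
    min_g, max_g = min(gray_vals), max(gray_vals)
    best, best_e = None, np.inf
    for c, g in zip(candidates, gray_vals):
        c = np.asarray(c, dtype=float)
        e_cont = abs(d_bar - np.linalg.norm(c - prev_pt))        # formula (2)
        e_curv = np.linalg.norm(prev_pt - 2 * c + next_pt) ** 2  # formula (3)
        e_img = (min_g - g) / (max_g - min_g + 1e-9)             # formula (4)
        e = alpha * e_cont + beta * e_curv + gamma * e_img       # formula (1)
        if e < best_e:
            best_e, best = e, c
    return best
```

The image energy of formula (4) is negative for bright candidates, so with this sign convention the contour is attracted to high-gray-value pixels; one such step is applied to each snake point in turn until the contour stabilizes.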
3. The moving-target capturing method fusing an optical-flow method and a neural network according to claim 1, characterized in that in step (1) the LK algorithm is as follows:
Let I and J be two consecutive frames whose gray values at point (x, y) are I(x, y) and J(x, y) respectively. Let u = [ux, uy]^T be a point on image I; find on image J the point v = u + d = [ux + dx, uy + dy]^T such that I(u) and J(v) correspond to the same position, where d = [dx, dy]^T is the displacement of point v relative to point u, obtained by minimizing a difference function;
Define the difference function:

e(d) = Σ_{x = ux−wx .. ux+wx} Σ_{y = uy−wy .. uy+wy} ( I(x, y) − J(x + dx, y + dy) )²   (5)

In formula (5), wx and wy are the window extents of the point in the x and y directions;
To solve this optimization problem, take the partial derivative of the difference function e(d) with respect to d and set it equal to 0, which gives:

d = G^(−1) b   (6)

In formula (6), G = Σ [ Ix², IxIy; IxIy, Iy² ] and b = Σ [ δI·Ix; δI·Iy ], summed over the window, where δI = I(x, y) − J(x, y) and Ix, Iy are respectively the partial derivatives of I(x, y) with respect to x and y; G must be invertible, i.e. the gradients of I(x, y) in the x and y directions must not be 0;
The above optimization is iterated several times to solve for the optimal displacement vector d.
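A single iteration of formulas (5)-(6) reduces to solving a 2×2 linear system; a minimal sketch with central-difference gradients (the helper below is our own, not the patented code):

```python
import numpy as np

def lk_displacement(I, J, x, y, w=2):
    """One Lucas-Kanade step: build G and b over a (2w+1)x(2w+1) window
    around (x, y) and solve d = G^-1 b (formula (6))."""
    Ix = np.zeros_like(I, dtype=float)
    Iy = np.zeros_like(I, dtype=float)
    Ix[:, 1:-1] = (I[:, 2:] - I[:, :-2]) / 2.0  # central difference along x (columns)
    Iy[1:-1, :] = (I[2:, :] - I[:-2, :]) / 2.0  # central difference along y (rows)
    ys, xs = np.mgrid[y - w:y + w + 1, x - w:x + w + 1]
    ix, iy = Ix[ys, xs].ravel(), Iy[ys, xs].ravel()
    dI = (I[ys, xs] - J[ys, xs]).ravel()        # temporal difference delta-I
    G = np.array([[np.sum(ix * ix), np.sum(ix * iy)],
                  [np.sum(ix * iy), np.sum(iy * iy)]])
    b = np.array([np.sum(dI * ix), np.sum(dI * iy)])
    return np.linalg.solve(G, b)                # displacement d = [dx, dy]
```

As the claim notes, G is invertible only where the image has gradient in both directions, which is why trackable points are usually corner-like; in practice the step is iterated, warping J by the current estimate of d each time.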
4. The moving-target capturing method fusing an optical-flow method and a neural network according to claim 1, characterized in that in step (2) the detection of the moving target is carried out with a pulse-coupled neural network model: the two-dimensional M × N pixel matrix is regarded as M × N pulse-coupled neurons, the gray value of each pixel corresponding to the input of one neuron;
For moving-target detection, under the action of the connection weight matrix, pixels of similar gray value within a neighborhood of the image interact: the pulse output of one pixel causes the neurons corresponding to other pixels of similar gray value to fire, producing a pulse-train output. The pixels are processed according to formula (7), in which xij(t) denotes the value of the currently processed pixel, μij(t−1) denotes the model mean of the corresponding pixel judged by decision to be background, σij(t−1) denotes the variance of the corresponding model, and Fij[n] is the input of each neuron;
The threshold θij[n] comprises two pairs of values, TH1/TL1 and TH2/TL2: the first judgment uses TH1 and TL1, and the second and subsequent judgments use TH2 and TL2, i.e.:

θij[n] = (TH1, TL1) for n = 1; (TH2, TL2) for n ≥ 2   (8)

In formula (8), n is the number of segmentations;
According to the basic principle of the pulse-coupled neural network model and the dual threshold, the system output is given by formula (9), in which Uij[n] is the internal activity of the neuron, Uij[n] = Fij[n](1 + βCij[n]), Cij[n] is the external stimulus of the neuron, β is the synaptic connection modulation constant, and RP denotes processing with the area-correlation method;
This process requires multiple iterations: the first iteration detects the motion points at which the gray values of the moving target and the background differ most markedly; with these motion points as seeds, iteration continues, and terminates when the number of newly detected motion points is zero.
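The seed-and-grow iteration described above can be sketched as a plain dual-threshold region growing on a frame-difference image; this simplification drops the PCNN dynamics of formulas (7)-(9), and the thresholds and input are illustrative:

```python
import numpy as np

def dual_threshold_detect(diff, th1, th2):
    """Seed with the strict first threshold th1, then grow the seeds with the
    looser threshold th2 over 4-neighborhoods until no new motion points appear."""
    seeds = diff > th1                     # first judgement: strict threshold
    candidates = diff > th2                # looser threshold for later judgements
    motion = seeds.copy()
    while True:
        grown = motion.copy()
        # dilate the current motion set by one pixel in each of the 4 directions
        grown[1:, :] |= motion[:-1, :]
        grown[:-1, :] |= motion[1:, :]
        grown[:, 1:] |= motion[:, :-1]
        grown[:, :-1] |= motion[:, 1:]
        grown &= candidates                # keep only points above the loose threshold
        if np.array_equal(grown, motion):  # terminate: no new motion points
            return motion
        motion = grown
```

The strict threshold picks only the most salient motion pixels (the "seeds"), and the loose one recovers the rest of the target region around them, mirroring the two-pair threshold scheme of formula (8).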
5. The moving-target capturing method fusing an optical-flow method and a neural network according to claim 1, characterized in that in step (3), a track-association check is first performed on the two sets of target position data obtained in steps (1) and (2), to confirm that the obtained position data come from the same target; then the spatial-location method is applied to the two sets of position data to find corresponding objects, which form the fusion set; finally, the threshold-based name-attribute similarity method is used to exclude wrong corresponding objects found by the spatial-location method.
6. The moving-target capturing method fusing an optical-flow method and a neural network according to claim 5, characterized in that the track-association check proceeds as follows:
Let x̂i and x̂j be the state estimates from the different target detection methods, let Pi and Pj denote their respective covariances, and let Pij and Pji denote the cross-covariances of the target state-estimation errors, with:

Pij = Pji^T   (10)

Compute the association matrix:

x̃ij = x̂i − x̂j   (11)
P̃ij = Pi + Pj − Pij − Pji   (12)
dij = x̃ij^T P̃ij^(−1) x̃ij   (13)

Each element value of the association matrix dij is compared with the set threshold to judge whether x̂i and x̂j come from the same target.
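The check reduces to a Mahalanobis-style gate; a sketch under the reconstruction above (the function name and the chi-square-style threshold value in the usage note are our assumptions):

```python
import numpy as np

def same_target(x_i, x_j, P_i, P_j, P_ij, P_ji, threshold):
    """Track-association check: compare the association distance d_ij
    (formulas (11)-(13)) against a gate threshold."""
    dx = np.asarray(x_i, dtype=float) - np.asarray(x_j, dtype=float)  # formula (11)
    P = P_i + P_j - P_ij - P_ji                                       # formula (12)
    d_ij = float(dx @ np.linalg.inv(P) @ dx)                          # formula (13)
    return d_ij <= threshold, d_ij
```

With independent estimators (zero cross-covariance) and Gaussian errors, d_ij is chi-square distributed, so the gate threshold can be read from a chi-square table for the chosen confidence level.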
7. The moving-target capturing method fusing an optical-flow method and a neural network according to claim 5, characterized in that the spatial-location method uses the nearest-neighbor method:
Let A and B be the sets of target position data from the different target detection methods; the confidence c({a, b}) of each candidate pair is then computed by formula (14), in which a ∈ A, b ∈ B, c({a, b}) is the confidence that (a, b) are corresponding objects, a2 is the second-nearest neighbor of b in A, and b2 is the second-nearest neighbor of a in B;

If the confidence c({a, b}) exceeds the preset threshold, (a, b) is taken as a correct corresponding pair, and these corresponding pairs form the fusion set.
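Formula (14) itself is not reproduced in the text; purely as an illustration, the sketch below pairs the two position sets by mutual nearest neighbors and uses the classic first-to-second-neighbor distance ratio as a stand-in confidence (the names `nn_pairs` and `ratio_thresh`, and the ratio itself, are our own, not the patented formula):

```python
import numpy as np

def nn_pairs(A, B, ratio_thresh=0.8):
    """Pair points of A and B by mutual nearest neighbor; the stand-in
    confidence is d(a, b) / d(a, b2), b2 being a's second-nearest neighbor in B."""
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
    pairs = []
    for i in range(len(A)):
        order = np.argsort(D[i])
        j, j2 = int(order[0]), int(order[1])   # nearest and second-nearest in B
        if int(np.argmin(D[:, j])) != i:       # require the match to be mutual
            continue
        if D[i, j] / (D[i, j2] + 1e-12) < ratio_thresh:  # unambiguous match only
            pairs.append((i, j))
    return pairs
```

Involving the second-nearest neighbors, as the claim does, rejects ambiguous matches: a pair is only accepted when its distance is clearly smaller than the distance to the next-best alternative.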
CN201710236616.2A 2017-04-12 2017-04-12 Moving-target capturing method fusing an optical-flow method and a neural network Pending CN107038713A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710236616.2A CN107038713A (en) Moving-target capturing method fusing an optical-flow method and a neural network


Publications (1)

Publication Number Publication Date
CN107038713A true CN107038713A (en) 2017-08-11

Family

ID=59536419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710236616.2A Pending CN107038713A (en) Moving-target capturing method fusing an optical-flow method and a neural network

Country Status (1)

Country Link
CN (1) CN107038713A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104504362A (en) * 2014-11-19 2015-04-08 南京艾柯勒斯网络科技有限公司 Face detection method based on convolutional neural network
CN104933732A (en) * 2015-05-15 2015-09-23 南京立坤智能技术有限公司 Method for detecting and tracking movement target based on omnidirectional vision of robot
CN106354816A (en) * 2016-08-30 2017-01-25 东软集团股份有限公司 Video image processing method and video image processing device


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
QIN HAIQUN et al.: "Moving object detection based on optical flow and neural network fusion", International Journal of Intelligent Computing and Cybernetics *
刘丽 et al.: "Motion detection algorithm based on pulse-coupled neural networks", Information Technology (《信息技术》) *
张巍 et al.: "Multi-source POI data fusion based on spatial location information", Periodical of Ocean University of China (《中国海洋大学学报》) *
齐苏敏 et al.: "Human motion tracking algorithm based on an improved Snake model", Journal of Computer Applications (《计算机应用》) *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019114696A1 (en) * 2017-12-13 2019-06-20 腾讯科技(深圳)有限公司 Augmented reality processing method, object recognition method, and related apparatus
CN109918975B (en) * 2017-12-13 2022-10-21 腾讯科技(深圳)有限公司 Augmented reality processing method, object identification method and terminal
US10891799B2 (en) 2017-12-13 2021-01-12 Tencent Technology (Shenzhen) Company Limited Augmented reality processing method, object recognition method, and related device
CN109918975A (en) * 2017-12-13 2019-06-21 腾讯科技(深圳)有限公司 A kind of processing method of augmented reality, the method for Object identifying and terminal
CN107967695A (en) * 2017-12-25 2018-04-27 北京航空航天大学 A kind of moving target detecting method based on depth light stream and morphological method
CN107967695B (en) * 2017-12-25 2018-11-13 北京航空航天大学 A kind of moving target detecting method based on depth light stream and morphological method
CN108491807A (en) * 2018-03-28 2018-09-04 北京农业信息技术研究中心 A kind of cow oestrus behavior method of real-time and system
CN108491807B (en) * 2018-03-28 2020-08-28 北京农业信息技术研究中心 Real-time monitoring method and system for oestrus of dairy cows
CN109063549B (en) * 2018-06-19 2020-10-16 中国科学院自动化研究所 High-resolution aerial video moving target detection method based on deep neural network
CN109063549A (en) * 2018-06-19 2018-12-21 中国科学院自动化研究所 High-resolution based on deep neural network is taken photo by plane video moving object detection method
CN109063609A (en) * 2018-07-18 2018-12-21 电子科技大学 A kind of anomaly detection method based on Optical-flow Feature in conjunction with full convolution semantic segmentation feature
CN109447082A (en) * 2018-08-31 2019-03-08 武汉尺子科技有限公司 A kind of scene motion Target Segmentation method, system, storage medium and equipment
CN109447082B (en) * 2018-08-31 2020-09-15 武汉尺子科技有限公司 Scene moving object segmentation method, system, storage medium and equipment
CN111047908B (en) * 2018-10-12 2021-11-02 富士通株式会社 Detection device and method for cross-line vehicle and video monitoring equipment
CN111047908A (en) * 2018-10-12 2020-04-21 富士通株式会社 Detection device and method for cross-line vehicle and video monitoring equipment
CN109211122A (en) * 2018-10-30 2019-01-15 清华大学 Ultraprecise displacement measurement system and method based on optical neural network
CN109211122B (en) * 2018-10-30 2020-05-15 清华大学 Ultra-precise displacement measurement system and method based on optical neural network
CN109343363A (en) * 2018-10-30 2019-02-15 清华大学 Movement TT&C system based on optical oomputing
CN111186432A (en) * 2018-11-13 2020-05-22 杭州海康威视数字技术股份有限公司 Vehicle blind area early warning method and device
CN111998780A (en) * 2019-05-27 2020-11-27 杭州海康威视数字技术股份有限公司 Target ranging method, device and system
WO2021035807A1 (en) * 2019-08-23 2021-03-04 深圳大学 Target tracking method and device fusing optical flow information and siamese framework
CN111274914A (en) * 2020-01-13 2020-06-12 目骉资讯有限公司 Horse speed calculation system and method based on deep learning
CN111274914B (en) * 2020-01-13 2023-04-18 目骉资讯有限公司 Horse speed calculation system and method based on deep learning
WO2021174513A1 (en) * 2020-03-06 2021-09-10 华为技术有限公司 Image processing system and method, and autonomous vehicle comprising said system
CN111507235A (en) * 2020-04-13 2020-08-07 北京交通大学 Video-based railway perimeter foreign matter intrusion detection method
CN111507235B (en) * 2020-04-13 2024-05-28 北京交通大学 Railway perimeter foreign matter intrusion detection method based on video

Similar Documents

Publication Publication Date Title
CN107038713A (en) Moving-target capturing method fusing an optical-flow method and a neural network
Zhu Research on road traffic situation awareness system based on image big data
US20220197281A1 (en) Intelligent decision-making method and system for unmanned surface vehicle
CN107169435B (en) Convolutional neural network human body action classification method based on radar simulation image
CN109544613A (en) A kind of binocular solid matching process and system based on the study of dense network depth
CN110246181B (en) Anchor point-based attitude estimation model training method, attitude estimation method and system
CN111340868B (en) Unmanned underwater vehicle autonomous decision control method based on visual depth estimation
CN109460709A (en) The method of RTG dysopia analyte detection based on the fusion of RGB and D information
CN110672088B (en) Unmanned aerial vehicle autonomous navigation method imitating homing mechanism of landform perception of homing pigeons
RU2476825C2 (en) Method of controlling moving object and apparatus for realising said method
CN103426179B (en) A kind of method for tracking target based on mean shift multiple features fusion and device
CN113192124B (en) Image target positioning method based on twin network
CN111753667A (en) Intelligent automobile single-target tracking method based on twin network
Sun et al. IRDCLNet: Instance segmentation of ship images based on interference reduction and dynamic contour learning in foggy scenes
Pham et al. Pencilnet: Zero-shot sim-to-real transfer learning for robust gate perception in autonomous drone racing
Yu et al. Dual-branch framework: AUV-based target recognition method for marine survey
CN111259923A (en) Multi-target detection method based on improved three-dimensional R-CNN algorithm
Li et al. Vehicle object detection based on rgb-camera and radar sensor fusion
Wang et al. An underwater single target tracking method using SiamRPN++ based on inverted residual bottleneck block
CN114266805A (en) Twin region suggestion network model for unmanned aerial vehicle target tracking
Gelen et al. An artificial neural slam framework for event-based vision
Ni et al. Edge guidance network for semantic segmentation of high resolution remote sensing images
CN117576149A (en) Single-target tracking method based on attention mechanism
CN112560571A (en) Intelligent autonomous visual navigation method based on convolutional neural network
CN115880332A (en) Target tracking method for low-altitude aircraft visual angle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20170811)