CN103700118A - Moving target detection method on basis of pulse coupled neural network - Google Patents

Moving target detection method on basis of pulse coupled neural network

Info

Publication number
CN103700118A
Authority
CN
China
Prior art keywords
histogram
pixel
global characteristics
neural network
background model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310731768.1A
Other languages
Chinese (zh)
Other versions
CN103700118B (en)
Inventor
汪晋宽
才溪
韩光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeastern University China
Original Assignee
Northeastern University China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeastern University China filed Critical Northeastern University China
Priority to CN201310731768.1A
Publication of CN103700118A
Application granted
Publication of CN103700118B
Legal status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a moving target detection method based on a pulse coupled neural network, which comprises the following steps: a. perceiving a video image sequence with the pulse coupled neural network and extracting global features of the video images; b. establishing a global feature histogram for each pixel; c. for each pixel, establishing the initial background model of the pixel from the global feature histograms corresponding to the first K frames; d. for each pixel, computing the similarity between the global feature histogram of the current frame and the corresponding histograms in the background model and detecting whether the pixel is a moving target; e. for each pixel, updating its background model with the global feature histogram of the current frame. The method draws on the holistic character of human visual perception and uses a pulse coupled neural network to extract global features of the image, which helps suppress the adverse effect of dynamic background perturbation on moving target detection and thus improves the accuracy of moving target detection.

Description

Moving target detection method based on Pulse Coupled Neural Network
Technical field
The present invention relates to a moving target detection method based on a pulse coupled neural network, and belongs to the technical field of video image processing.
Background technology
In intelligent video surveillance systems, moving target detection is the basis of various other post-processing tasks (such as target tracking, target recognition and behavior analysis). In order to segment moving target regions accurately and effectively in various surveillance environments, it is of great significance to study moving target detection methods that are robust to complex dynamic scenes (such as illumination variation and background perturbation).
To cope with the difficulties that complex dynamic scenes cause for moving target detection, existing methods mainly describe the background model of a pixel with local neighborhood features; by extracting local features of the image, the robustness of the features to illumination variation and background perturbation within a region is improved. The limitation of these methods is that they fail to make good use of the global features of the image, so they suppress background perturbation effectively only when the motion of the dynamic background is confined to a local region, and they are not suited to dynamic background environments with intense motion. (Here, "confined to a local region" means confined to the region in which the local feature is computed. Qualitatively, when the dynamic background moves within the region used to compute the local feature, the computed local feature tends to remain stable; when the background motion exceeds that region, the local feature becomes unstable, and the background model built on it is also poorly robust to an intensely moving dynamic background.) There is therefore an urgent need for a moving target detection method that is applicable to dynamic background environments with intense motion and at the same time improves detection accuracy and robustness.
Summary of the invention
The object of the present invention is to provide a moving target detection method based on a pulse coupled neural network that can effectively solve the problems in the prior art, in particular the problem that background perturbation is suppressed effectively only when the motion of the dynamic background is confined to a local region, so that existing methods are not suited to dynamic background environments with intense motion.
To solve the above technical problems, the present invention adopts the following technical solution: a moving target detection method based on a pulse coupled neural network, comprising the following steps:
A. Use a pulse coupled neural network to perceive the video image sequence and extract the global features of the video images;
B. Establish the global feature histogram of each pixel;
C. For each pixel, use the global feature histograms corresponding to the first K frames as the initial background model of the pixel;
D. For each pixel, compute the similarity between the global feature histogram of the current frame and the corresponding histograms in the background model, and detect whether the pixel is a moving target;
E. For each pixel, update its background model with the global feature histogram of the current frame.
Preferably, step a specifically comprises: each neuron in the pulse coupled neural network corresponds to one pixel of the video image; for the neuron located at (i, j) in the pulse coupled neural network, at time n it is affected by the external stimulus S_ij and by the pulses {Y_kl} emitted at time n-1 by the other neurons in its adjacent k × l neighborhood, and its feeding input F_ij, linking input L_ij, internal activity U_ij, dynamic membrane-potential threshold θ_ij, pulse output Y_ij of the pulse generator, and the extracted global feature Q_ij of the video image are respectively:
F_ij(n) = S_ij;
L_ij(n) = 1 if Σ_kl Y_kl(n-1) > 0, and 0 otherwise;
U_ij(n) = F_ij(n) · (1 + β·L_ij(n));
θ_ij(n) = e^(-α_θ) · θ_ij(n-1) + V_θ · Y_ij(n);
Y_ij(n) = 1 if U_ij(n) > θ_ij(n-1), and 0 otherwise;
Q_ij(n) = Q_ij(n-1) + (iter − n) · Y_ij(n);
where iter is the total number of iterative perception steps of the pulse coupled neural network; S_ij is the gray value of the pixel of any one band image perceived by the neuron located at (i, j); α_θ is the decay time constant of the dynamic threshold θ_ij; V_θ is the threshold amplification coefficient; β is the linking strength.
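As an illustration only (not part of the claimed method), the iterative perception described by the above equations can be sketched in Python as follows; the 3 × 3 neighborhood, the use of scipy for the neighborhood sum, and the default parameter values are assumptions made for this sketch.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def pcnn_features(S, iters=16, beta=0.2, alpha_theta=0.5, V_theta=100.0):
    """S: one band image (H x W) of gray values; returns the global feature map Q."""
    F = S.astype(float)                   # feeding input F_ij(n) = S_ij
    Y = np.zeros_like(F)                  # pulses of the previous step
    theta = np.zeros_like(F)              # dynamic threshold
    Q = np.zeros_like(F)                  # accumulated global feature
    for n in range(1, iters + 1):
        # linking input: 1 if any neighbor fired at step n-1 (assumed 3 x 3 neighborhood)
        L = (uniform_filter(Y, size=3) > 0).astype(float)
        U = F * (1.0 + beta * L)          # internal activity
        Y = (U > theta).astype(float)     # fire if activity exceeds the previous threshold
        theta = np.exp(-alpha_theta) * theta + V_theta * Y
        Q += (iters - n) * Y              # earlier firing contributes more to the feature
    return Q
```

A pixel that fires early and often accumulates a larger feature value, which is what the term (iter − n)·Y_ij(n) expresses.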
Preferably, step b specifically comprises: according to the characteristics of the R, G and B band images of the color image, the R, G and B band images are each perceived with the pulse coupled neural network, and after iter iterations the features Q_R, Q_G, Q_B of the three bands are obtained; for each pixel, the feature histograms H_R, H_G, H_B of the three bands are computed statistically within its neighborhood and concatenated as the global feature histogram H of the pixel, so that moving target detection can be performed on color images. Moreover, because the feature histograms of the R, G and B bands are concatenated, the dimensionality and sparsity of the feature representation are reduced, which helps reduce the storage space of the background model.
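A minimal sketch of this histogram construction, assuming a square neighborhood window and a fixed number of bins per band (neither is specified above); the three band histograms are concatenated and normalized so that each per-pixel feature vector sums to 1, which keeps the histogram-intersection similarity of step e in the range [0, 1].

```python
import numpy as np

def feature_histograms(QR, QG, QB, win=7, bins=16):
    """QR, QG, QB: PCNN feature maps of the R, G, B bands (H x W).
    Returns an (H, W, 3*bins) array holding the concatenated histogram H of each pixel."""
    H, W = QR.shape
    q_max = max(QR.max(), QG.max(), QB.max()) + 1e-9
    out = np.zeros((H, W, 3 * bins))
    r = win // 2
    for i in range(H):
        for j in range(W):
            i0, i1 = max(0, i - r), min(H, i + r + 1)
            j0, j1 = max(0, j - r), min(W, j + r + 1)
            for b, Q in enumerate((QR, QG, QB)):
                h, _ = np.histogram(Q[i0:i1, j0:j1], bins=bins, range=(0.0, q_max))
                out[i, j, b * bins:(b + 1) * bins] = h / (3.0 * h.sum())
    return out
```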
Specifically, in step c the establishment of the initial background model comprises: for each pixel, the first K frames are processed by steps a and b to obtain K global feature histograms {H_1, ..., H_K}; each global feature histogram is assigned a weight w_k, with
w_k = 1/K (k = 1, ..., K), so that the K weights sum to 1.
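A minimal sketch of this initialization for one pixel, assuming the K histograms of the first K frames have already been extracted by steps a and b and that the model is kept as parallel lists of histograms and weights (an assumed data layout):

```python
def init_background_model(first_k_histograms):
    """first_k_histograms: the K global feature histograms of one pixel (numpy arrays,
    one per frame of the first K frames). Returns the model as (histograms, weights)."""
    K = len(first_k_histograms)
    hists = [h.copy() for h in first_k_histograms]
    weights = [1.0 / K] * K   # equal initial weights, as assumed above, summing to 1
    return hists, weights
```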
Step d of the present invention specifically comprises: for each pixel, if the similarity between the global feature histogram H_c of the current frame and any one of the first B histograms regarded as background in the background model is greater than the threshold T_s, the pixel is background; otherwise the pixel is foreground, i.e. a moving target.
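A sketch of this decision for a single pixel; the histogram-intersection similarity is the one defined in step e1 below, and the default values B = 2 and T_s = 0.5 are taken from the experimental example rather than being prescribed here.

```python
import numpy as np

def hist_intersection(hc, hi):
    # Sim(H_c, H_i) = sum_n min(H_cn, H_in), as defined in step e1
    return float(np.minimum(hc, hi).sum())

def is_background(hc, bg_hists, bg_weights, B=2, T_s=0.5):
    """hc: current-frame histogram of the pixel; bg_hists/bg_weights: its K background
    histograms and their weights. Returns True if the pixel is classified as background."""
    order = np.argsort(bg_weights)[::-1]     # sort the model by weight, descending
    return any(hist_intersection(hc, bg_hists[i]) > T_s for i in order[:B])
```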
The background model update described in step e of the present invention specifically comprises:
E1. For each pixel of the current frame to be detected, use steps a and b to extract its global feature histogram H_c; use histogram intersection to compute the similarity between the global feature histogram H_c of the current frame and each of the K histograms H_i in the background model:
Sim(H_c, H_i) = Σ_{n=0}^{N-1} min(H_cn, H_in);
where N is the number of histogram bins and i = 1, 2, ..., K;
E2. Let the similarity threshold be T_s. If the similarities between the global feature histogram H_c of the current frame and all K histograms {H_1, ..., H_K} are below T_s, the current histogram cannot be matched to the background model, and the global feature histogram H_c of the current frame replaces the histogram with the smallest weight in the background model. If the similarity between H_c and one or more histograms of the background model {H_1, ..., H_K} is above T_s, the background histogram H_m with the largest similarity is selected as the best-matching histogram, and the histogram H_c of the current frame is used to update the best-matching histogram H_m and its weight w_m, that is:
H_m = α_h·H_c + (1 − α_h)·H_m;
w_m = w_m + α_w·(1 − w_m);
where α_h is the histogram learning factor and α_w is the weight learning factor; at the same time, the weights w_i of the other histograms in the background model are adjusted accordingly, that is:
w_i = (1 − α_w)·w_i, for i ≠ m;
E3. Sort the K histograms in the background model in descending order of weight, and take the first B histograms as the ones corresponding to background, where B < K.
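A sketch of the update in steps e1 to e3 for one pixel, with the model kept as parallel lists of histograms and weights; the default values of T_s, α_h and α_w follow the experimental example below, and the weights are left untouched in the no-match branch because the text only specifies replacing the histogram there.

```python
import numpy as np

def update_background_model(hc, hists, weights, T_s=0.5, alpha_h=0.01, alpha_w=0.01):
    """hc: current-frame histogram of the pixel; hists/weights: its background model."""
    sims = [float(np.minimum(hc, h).sum()) for h in hists]   # histogram intersection (e1)
    if max(sims) < T_s:
        # e2, no match: replace the histogram with the smallest weight by H_c
        hists[int(np.argmin(weights))] = hc.copy()
    else:
        # e2, match: update the best-matching histogram H_m and its weight w_m
        m = int(np.argmax(sims))
        hists[m] = alpha_h * hc + (1.0 - alpha_h) * hists[m]
        for i in range(len(weights)):
            if i == m:
                weights[i] = weights[i] + alpha_w * (1.0 - weights[i])
            else:
                weights[i] = (1.0 - alpha_w) * weights[i]
    # e3: re-sort the model in descending order of weight; the first B entries are background
    order = np.argsort(weights)[::-1]
    hists[:] = [hists[i] for i in order]
    weights[:] = [weights[i] for i in order]
    return hists, weights
```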
In the present invention, the first K frames are used to initialize the background model, and moving target detection starts from frame K+1; the background model is also updated from frame K+1 onward, after each frame, and each update uses only the current frame.
Compared with the prior art, the present invention draws on the mature human visual perception system. According to the holistic character of human visual perception (the human visual system usually understands the observed object as a whole before it understands its parts, and this holistic understanding is not a simple addition of the understanding of the individual parts; each part of the observed object stimulates the visual perception system, and these stimuli interact with and influence one another), a pulse coupled neural network is used to simulate the human visual nervous system, the global feature information of the image is extracted from the pulse response of this network when perceiving a color image, and a background model based on these global features is established, which improves the robustness of moving target detection. In addition, because the present invention draws on the holistic character of human visual perception and uses the pulse coupled neural network to extract global features of the image, it helps suppress the adverse effect of dynamic background perturbation on moving target detection, thereby improving the accuracy and F-measure of detection; contrast experiments show that, compared with moving target detection methods based on local features, the detection result of the method of the present invention suppresses dynamic background perturbation better. Statistics on a large amount of data show that, with pre-processing and post-processing removed and under the same experimental conditions, the detection accuracy of the existing moving target detection method is 78.3% and its F-measure is 84.9%, while the detection accuracy of the present invention is 82.6% and its F-measure is 86.7%; the accuracy is improved by 4.3% and the F-measure by 1.8%. Furthermore, the present invention perceives the R, G and B band images of the color image separately with the pulse coupled neural network according to their characteristics, obtaining the features Q_R, Q_G, Q_B of the three bands after iter iterations, so that moving target detection can be performed on color images; the feature histograms of the R, G and B bands are concatenated, which reduces the dimensionality and sparsity of the feature representation and helps reduce the storage space of the background model.
Moving target detection using the method of the present invention was compared with the classical background subtraction method based on Gaussian mixture models (GMM), an improved background subtraction method based on Gaussian mixture models (IGMM), and a background subtraction method based on local binary pattern texture (LBP); the detection results are shown in Table 1:
Table 1

Method type adopted     Accuracy rate   F-measure
The inventive method    0.8260          0.8673
GMM                     0.6578          0.7733
IGMM                    0.7827          0.8485
LBP                     0.6957          0.7823
The difficulty addressed by the present invention lies in which approach and which method to adopt in order to extract information that reflects the global features of the image.
Brief description of the drawings
Fig. 1 is one frame of a group of video images;
Fig. 2 shows the moving target region detected from Fig. 1 using the method of the present invention;
Fig. 3 is the flow chart of the method of the present invention.
The present invention is further illustrated below with reference to the drawings and the specific embodiments.
Embodiments
Embodiment 1: a moving target detection method based on a pulse coupled neural network, as shown in Fig. 3, comprising the following steps:
A. Use a pulse coupled neural network to perceive the video image sequence and extract the global features of the video images; wherein each neuron of the pulse coupled neural network corresponds to one pixel of the video image; for the neuron located at (i, j) in the pulse coupled neural network, at time n it is affected by the external stimulus S_ij and by the pulses {Y_kl} emitted at time n-1 by the other neurons in its adjacent k × l neighborhood, and its feeding input F_ij, linking input L_ij, internal activity U_ij, dynamic membrane-potential threshold θ_ij, pulse output Y_ij of the pulse generator, and the extracted global feature Q_ij of the video image are respectively:
F_ij(n) = S_ij;
L_ij(n) = 1 if Σ_kl Y_kl(n-1) > 0, and 0 otherwise;
U_ij(n) = F_ij(n) · (1 + β·L_ij(n));
θ_ij(n) = e^(-α_θ) · θ_ij(n-1) + V_θ · Y_ij(n);
Y_ij(n) = 1 if U_ij(n) > θ_ij(n-1), and 0 otherwise;
Q_ij(n) = Q_ij(n-1) + (iter − n) · Y_ij(n);
where iter is the total number of iterative perception steps of the pulse coupled neural network; S_ij is the gray value of the pixel of any one band image perceived by the neuron located at (i, j); α_θ is the decay time constant of the dynamic threshold θ_ij; V_θ is the threshold amplification coefficient, which determines how much the threshold is raised when a neuron fires and plays an important role in regulating the firing period, and is therefore usually given a relatively large value; β is the linking strength, which determines the contribution of the linking input L_ij to the internal activity U_ij; when β ≠ 0 the neurons of the PCNN are coupled to one another, and the firing of one neuron contributes to the other neurons within its linking region; the linking strength is usually chosen empirically and set to a constant;
B. Establish the global feature histogram of each pixel according to the characteristics of the R, G and B band images of the color image: the R, G and B band images of the color image are each perceived with the pulse coupled neural network, and after iter iterations the features Q_R, Q_G, Q_B of the three bands are obtained; for each pixel, the feature histograms H_R, H_G, H_B of the three bands are computed statistically within its neighborhood and concatenated as the global feature histogram H of the pixel;
C. For each pixel, use the global feature histograms corresponding to the first K frames as the initial background model of the pixel; the establishment of the initial background model specifically comprises: for each pixel, the first K frames are processed by steps a and b to obtain K global feature histograms {H_1, ..., H_K}; each global feature histogram is assigned a weight w_k, with
w_k = 1/K (k = 1, ..., K);
D. For each pixel, compute the similarity between the global feature histogram of the current frame and the corresponding histograms in the background model, and detect whether the pixel is a moving target; specifically, for each pixel, if the similarity between the global feature histogram H_c of the current frame and any one of the first B histograms regarded as background in the background model is greater than the threshold T_s, the pixel is background; otherwise the pixel is foreground, i.e. a moving target;
E. For each pixel, update its background model with the global feature histogram of the current frame; the background model update specifically comprises:
E1. For each pixel of the current frame to be detected, use steps a and b to extract its global feature histogram H_c; use histogram intersection to compute the similarity between the global feature histogram H_c of the current frame and each of the K histograms H_i (i = 1, 2, ..., K) in the background model:
Sim(H_c, H_i) = Σ_{n=0}^{N-1} min(H_cn, H_in); where N is the number of histogram bins;
E2. Let the similarity threshold be T_s. If the similarities between the global feature histogram H_c of the current frame and all K histograms {H_1, ..., H_K} are below T_s, the current histogram cannot be matched to the background model, and the global feature histogram H_c of the current frame replaces the histogram with the smallest weight in the background model. If the similarity between H_c and one or more histograms of the background model {H_1, ..., H_K} is above T_s, the background histogram H_m with the largest similarity is selected as the best-matching histogram, and the histogram H_c of the current frame is used to update the best-matching histogram H_m and its weight w_m, that is:
H_m = α_h·H_c + (1 − α_h)·H_m;
w_m = w_m + α_w·(1 − w_m);
where α_h is the histogram learning factor and α_w is the weight learning factor; at the same time, the weights w_i of the other histograms in the background model are adjusted accordingly, that is:
w_i = (1 − α_w)·w_i, for i ≠ m;
E3. After the above update of the background model histograms and their weights is completed, sort the K histograms in the background model in descending order of weight, and take the first B histograms as the ones corresponding to background.
The above method achieves moving target detection in color images.
Embodiment 2: a moving target detection method based on a pulse coupled neural network, as shown in Fig. 3, comprising the following steps:
A. Use a pulse coupled neural network to perceive the video image sequence and extract the global features of the video images; wherein each neuron of the pulse coupled neural network corresponds to one pixel of the video image; for the neuron located at (i, j) in the pulse coupled neural network, at time n it is affected by the external stimulus S_ij and by the pulses {Y_kl} emitted at time n-1 by the other neurons in its adjacent k × l neighborhood, and its feeding input F_ij, linking input L_ij, internal activity U_ij, dynamic membrane-potential threshold θ_ij, pulse output Y_ij of the pulse generator, and the extracted global feature Q_ij of the video image are respectively:
F_ij(n) = S_ij;
L_ij(n) = 1 if Σ_kl Y_kl(n-1) > 0, and 0 otherwise;
U_ij(n) = F_ij(n) · (1 + β·L_ij(n));
θ_ij(n) = e^(-α_θ) · θ_ij(n-1) + V_θ · Y_ij(n);
Y_ij(n) = 1 if U_ij(n) > θ_ij(n-1), and 0 otherwise;
Q_ij(n) = Q_ij(n-1) + (iter − n) · Y_ij(n);
where iter is the total number of iterative perception steps of the pulse coupled neural network; S_ij is the gray value of the pixel of the grayscale image perceived by the neuron located at (i, j); α_θ is the decay time constant of the dynamic threshold θ_ij; V_θ is the threshold amplification coefficient; β is the linking strength;
B. Establish the global feature histogram H of each pixel;
C. For each pixel, use the global feature histograms corresponding to the first K frames as the initial background model of the pixel; wherein the establishment of the initial background model specifically comprises: for each pixel, the first K frames are processed by steps a and b to obtain K global feature histograms {H_1, ..., H_K}; each global feature histogram is assigned a weight w_k, with
w_k = 1/K (k = 1, ..., K);
D. For each pixel, compute the similarity between the global feature histogram of the current frame (starting from frame K+1) and the corresponding histograms in the background model, and detect whether the pixel is a moving target; if the similarity between the global feature histogram H_c of the current frame and any one of the first B histograms regarded as background in the background model is greater than the threshold T_s, the pixel is background; otherwise the pixel is foreground, i.e. a moving target;
E. For each pixel, update its background model with the global feature histogram of the current frame; the background model update specifically comprises:
E1. For each pixel of the current frame to be detected (from frame K+1 onward), use steps a and b to extract its global feature histogram H_c; use histogram intersection to compute the similarity between the global feature histogram H_c of the current frame and each of the K histograms H_i in the background model:
Sim(H_c, H_i) = Σ_{n=0}^{N-1} min(H_cn, H_in);
where N is the number of histogram bins and i = 1, 2, ..., K;
E2. Let the similarity threshold be T_s. If the similarities between the global feature histogram H_c of the current frame and all K histograms {H_1, ..., H_K} are below T_s, the current histogram cannot be matched to the background model, and the global feature histogram H_c of the current frame replaces the histogram with the smallest weight in the background model. If the similarity between H_c and one or more histograms of the background model {H_1, ..., H_K} is above T_s, the background histogram H_m with the largest similarity is selected as the best-matching histogram, and the histogram H_c of the current frame is used to update the best-matching histogram H_m and its weight w_m, that is:
H_m = α_h·H_c + (1 − α_h)·H_m;
w_m = w_m + α_w·(1 − w_m);
where α_h is the histogram learning factor and α_w is the weight learning factor; at the same time, the weights w_i of the other histograms in the background model are adjusted accordingly, that is:
w_i = (1 − α_w)·w_i, for i ≠ m;
E3. Sort the K histograms in the background model in descending order of weight, and take the first B histograms as the ones corresponding to background, where B < K; B can be set to a fixed value or varied adaptively according to the prior art.
Next, for each pixel, the similarity between the global feature histogram of frame K+2 and the corresponding histograms in the background model is computed to judge whether the pixel is a moving target, and the global feature histogram of frame K+2 is then used to update the background model; frame K+3 and subsequent frames are processed in the same way, and so on.
The above method achieves moving target detection in grayscale images.
Experimental example: Fig. 1 is one frame of a group of video images (in Fig. 1, the person's coat 1 is purple, the shirt 2 is green, the hair 3 is black-yellow, the branches 4 on the left appear green, the branches 5 on the right appear black, the wall bricks 6 of the building are khaki, sunlight 7 shines through the branches onto the wall bricks 6, and the sky 8 is blue). The monitoring background of this group of video images is dynamic because it contains branches swaying in the wind, which makes moving target detection difficult.
Moving target detection is performed on Fig. 1 using the method of the present invention, specifically comprising the following steps:
(1-1) Extract the global features of the image using a pulse coupled neural network: for the group of 120 × 160 color video images as in Fig. 1, a neural network consisting of 120 × 160 pulse coupled neurons is used for perception; for the neuron located at (i, j) in the network, at time n it is affected by the external stimulus S_ij and by the pulses {Y_kl} emitted at time n-1 by the other neurons in the adjacent k × l neighborhood, and its feeding input F_ij, linking input L_ij, internal activity U_ij, dynamic membrane-potential threshold θ_ij, pulse output Y_ij of the pulse generator, and the feature Q_ij extracted by the present invention can be described as:
F_ij(n) = S_ij    (1)
L_ij(n) = 1 if Σ_kl Y_kl(n-1) > 0, and 0 otherwise    (2)
U_ij(n) = F_ij(n) · (1 + β·L_ij(n))    (3)
Y_ij(n) = 1 if U_ij(n) > θ_ij(n-1), and 0 otherwise    (4)
θ_ij(n) = e^(-α_θ) · θ_ij(n-1) + V_θ · Y_ij(n)    (5)
Q_ij(n) = Q_ij(n-1) + (iter − n) · Y_ij(n)    (6)
where iter = 16 is the total number of iterative perception steps of the pulse coupled neural network; S_ij is set to the gray value of the pixel of any one band image perceived by the neuron located at (i, j); α_θ = 0.5 is the decay time constant of the dynamic threshold θ_ij; V_θ = 100 is the threshold amplification coefficient, which determines how much the threshold is raised when a neuron fires and plays an important role in regulating the firing period, and is therefore usually given a relatively large value; β = 0.2 is the linking strength, which determines the contribution of the linking input L_ij to the internal activity U_ij; when β ≠ 0 the neurons of the PCNN are coupled to one another, and the firing of one neuron contributes to the other neurons within its linking region; the linking strength is usually chosen empirically and set to a constant;
(1-2) Establish the global feature histogram of each pixel: the R, G and B band images of the color image are each perceived with the pulse coupled neural network, and after iter iterations the features Q_R, Q_G, Q_B of the three bands are obtained; for each pixel, the feature histograms H_R, H_G, H_B of the three bands are computed statistically within its neighborhood and concatenated as the global feature histogram H of the pixel. The global feature histograms counted within the region serve as the background model;
(1-3) Background model initialization: for each pixel, the first K = 4 frames are processed by steps (1-1) and (1-2) to extract global feature histograms, giving K global feature histograms {H_1, ..., H_K}, and each histogram is assigned a weight w_k = 1/K;
(1-4) Moving target detection: for each pixel, if the similarity between the global feature histogram H_c of the current frame and any one of the first B histograms regarded as background in the background model is greater than the threshold T_s, the pixel is background; otherwise the pixel is foreground, i.e. a moving target;
(1-5) Background model update: for each pixel of the current frame to be detected, use steps (1-1) and (1-2) to extract its global feature histogram H_c; use histogram intersection to compute the similarity between the current-frame histogram H_c and each of the K histograms H_i (i = 1, ..., K) in the background model:
Sim(H_c, H_i) = Σ_{n=0}^{N-1} min(H_cn, H_in)    (7)
where N is the number of histogram bins; the similarity threshold is set to T_s = 0.5. If the similarities between the global feature histogram H_c of the current frame and all K histograms {H_1, ..., H_K} are below T_s, the current histogram is considered not to match the background model, and the current histogram H_c replaces the histogram with the smallest weight in the background model; if the similarity between H_c and one or more histograms of the background model {H_1, ..., H_K} is above T_s, the background histogram with the largest similarity (say H_m) is selected as the best-matching histogram, and the current histogram H_c is used to update the best-matching histogram H_m and its corresponding weight w_m:
H_m = α_h·H_c + (1 − α_h)·H_m    (8)
w_m = w_m + α_w·(1 − w_m)    (9)
where α_h = 0.01 is the histogram learning factor and α_w = 0.01 is the weight learning factor; meanwhile, the weights w_i (i ≠ m) of the other histograms in the background model are also adjusted accordingly, that is:
w_i = (1 − α_w)·w_i    (10)
After the above update of the background model histograms and their weights is completed, the K histograms in the background model are re-sorted according to their weights, and the first B = 2 histograms are taken as the ones corresponding to background.
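For reference, the parameter values used in this experimental example are collected below in a single configuration sketch; the dictionary layout and key names are illustrative only, while the values are the ones stated above.

```python
EXAMPLE_PARAMS = {
    "image_size": (120, 160),  # one pulse coupled neuron per pixel
    "iter": 16,                # total number of PCNN iterations
    "alpha_theta": 0.5,        # decay time constant of the dynamic threshold
    "V_theta": 100,            # threshold amplification coefficient
    "beta": 0.2,               # linking strength
    "K": 4,                    # frames used to initialize the background model
    "T_s": 0.5,                # similarity threshold for histogram intersection
    "alpha_h": 0.01,           # histogram learning factor
    "alpha_w": 0.01,           # weight learning factor
    "B": 2,                    # number of histograms regarded as background
}
```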
Fig. 2 shows the moving target detection result obtained through the above steps without any post-processing; compared with other existing detection methods, the detection result of the present invention contains fewer false targets, and the moving target detection accuracy is higher.

Claims (6)

1. A moving target detection method based on a pulse coupled neural network, characterized in that it comprises the following steps:
A. Use a pulse coupled neural network to perceive the video image sequence and extract the global features of the video images;
B. Establish the global feature histogram of each pixel;
C. For each pixel, use the global feature histograms corresponding to the first K frames as the initial background model of the pixel;
D. For each pixel, compute the similarity between the global feature histogram of the current frame and the corresponding histograms in the background model, and detect whether the pixel is a moving target;
E. For each pixel, update its background model with the global feature histogram of the current frame.
2. The moving target detection method based on a pulse coupled neural network according to claim 1, characterized in that step a specifically comprises: each neuron in the pulse coupled neural network corresponds to one pixel of the video image; for the neuron located at (i, j) in the pulse coupled neural network, at time n it is affected by the external stimulus S_ij and by the pulses {Y_kl} emitted at time n-1 by the other neurons in its adjacent k × l neighborhood, and its feeding input F_ij, linking input L_ij, internal activity U_ij, dynamic membrane-potential threshold θ_ij, pulse output Y_ij of the pulse generator, and the extracted global feature Q_ij of the video image are respectively:
F_ij(n) = S_ij;
L_ij(n) = 1 if Σ_kl Y_kl(n-1) > 0, and 0 otherwise;
U_ij(n) = F_ij(n) · (1 + β·L_ij(n));
θ_ij(n) = e^(-α_θ) · θ_ij(n-1) + V_θ · Y_ij(n);
Y_ij(n) = 1 if U_ij(n) > θ_ij(n-1), and 0 otherwise;
Q_ij(n) = Q_ij(n-1) + (iter − n) · Y_ij(n);
where iter is the total number of iterative perception steps of the pulse coupled neural network; S_ij is the gray value of the pixel of any one band image perceived by the neuron located at (i, j); α_θ is the decay time constant of the dynamic threshold θ_ij; V_θ is the threshold amplification coefficient; β is the linking strength.
3. The moving target detection method based on a pulse coupled neural network according to claim 2, characterized in that step b comprises: according to the characteristics of the R, G and B band images of the color image, the R, G and B band images of the color image are each perceived with the pulse coupled neural network, and after iter iterations the features Q_R, Q_G, Q_B of the three bands are obtained; for each pixel, the feature histograms H_R, H_G, H_B of the three bands are computed statistically within its neighborhood and concatenated as the global feature histogram H of the pixel.
4. The moving target detection method based on a pulse coupled neural network according to any one of claims 1 to 3, characterized in that, in step c, the establishment of the initial background model specifically comprises: for each pixel, the first K frames are processed by steps a and b to obtain K global feature histograms {H_1, ..., H_K}; each global feature histogram is assigned a weight w_k, with w_k = 1/K (k = 1, ..., K).
5. The moving target detection method based on a pulse coupled neural network according to claim 4, characterized in that step d specifically comprises: for each pixel, if the similarity between the global feature histogram H_c of the current frame and any one of the first B histograms regarded as background in the background model is greater than the threshold T_s, the pixel is background; otherwise the pixel is foreground, i.e. a moving target.
6. The moving target detection method based on a pulse coupled neural network according to claim 5, characterized in that the background model update described in step e specifically comprises:
E1. For each pixel of the current frame to be detected, use steps a and b to extract its global feature histogram H_c; use histogram intersection to compute the similarity between the global feature histogram H_c of the current frame and each of the K histograms H_i in the background model:
Sim(H_c, H_i) = Σ_{n=0}^{N-1} min(H_cn, H_in);
where N is the number of histogram bins and i = 1, 2, ..., K;
E2. Let the similarity threshold be T_s. If the similarities between the global feature histogram H_c of the current frame and all K histograms {H_1, ..., H_K} are below T_s, the current histogram cannot be matched to the background model, and the global feature histogram H_c of the current frame replaces the histogram with the smallest weight in the background model. If the similarity between H_c and one or more histograms of the background model {H_1, ..., H_K} is above T_s, the background histogram H_m with the largest similarity is selected as the best-matching histogram, and the histogram H_c of the current frame is used to update the best-matching histogram H_m and its weight w_m, that is:
H_m = α_h·H_c + (1 − α_h)·H_m;
w_m = w_m + α_w·(1 − w_m);
where α_h is the histogram learning factor and α_w is the weight learning factor; at the same time, the weights w_i of the other histograms in the background model are adjusted accordingly, that is:
w_i = (1 − α_w)·w_i, for i ≠ m;
E3. Sort the K histograms in the background model in descending order of weight, and take the first B histograms as the ones corresponding to background.
CN201310731768.1A 2013-12-27 2013-12-27 Based on the moving target detection method of pulse coupled neural network Expired - Fee Related CN103700118B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310731768.1A CN103700118B (en) 2013-12-27 2013-12-27 Based on the moving target detection method of pulse coupled neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310731768.1A CN103700118B (en) 2013-12-27 2013-12-27 Based on the moving target detection method of pulse coupled neural network

Publications (2)

Publication Number Publication Date
CN103700118A true CN103700118A (en) 2014-04-02
CN103700118B CN103700118B (en) 2016-06-01

Family

ID=50361636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310731768.1A Expired - Fee Related CN103700118B (en) 2013-12-27 2013-12-27 Based on the moving target detection method of pulse coupled neural network

Country Status (1)

Country Link
CN (1) CN103700118B (en)


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
DANSONG CHENG ET AL: "Multi-object Segmentation Based on Improved Pulse Coupled Neural Network", 《COMPUTER AND INFORMATION SCIENCE》 *
HAIQING WANG ET AL: "A Simplified Pulse-coupled Neural Network for Cucumber Image Segmentation", 《2010 INTERNATIONAL CONFERENCE ON COMPUTATIONAL AND INFORMATION SCIENCES》 *
MEIHONG SHI ET AL: "A Simplified pulse-coupled neural network for adaptive segmentation of fabric defects", 《MACHINE VISION AND APPLICATIONS》 *
刘映杰 等: "基于多阈值PCNN的运动目标检测算法", 《计算机应用》 *
惠飞 等: "基于脉冲耦合神经网络的目标特征抽取方法", 《吉林大学学报(信息科学版)》 *
王慧斌 等: "基于脉冲耦合神经网络融合的压缩域运动目标分割方法", 《光子学报》 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018072066A1 (en) * 2016-10-18 2018-04-26 中国科学院深圳先进技术研究院 Pulse-based neural circuit
US10198655B2 (en) 2017-01-24 2019-02-05 Ford Global Technologies, Llc Object detection using recurrent neural network and concatenated feature map
US10452946B2 (en) 2017-01-24 2019-10-22 Ford Global Technologies, Llc Object detection using recurrent neural network and concatenated feature map
US11062167B2 (en) 2017-01-24 2021-07-13 Ford Global Technologies, Llc Object detection using recurrent neural network and concatenated feature map
CN108629254A (en) * 2017-03-24 2018-10-09 杭州海康威视数字技术股份有限公司 A kind of detection method and device of moving target
CN111209771A (en) * 2018-11-21 2020-05-29 晶睿通讯股份有限公司 Neural network identification efficiency improving method and relevant identification efficiency improving device thereof
CN113723594A (en) * 2021-08-31 2021-11-30 绍兴市北大信息技术科创中心 Impulse neural network target identification method
CN113723594B (en) * 2021-08-31 2023-12-05 绍兴市北大信息技术科创中心 Pulse neural network target identification method

Also Published As

Publication number Publication date
CN103700118B (en) 2016-06-01


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160601