CN107463898A - Stage performance abnormal behavior monitoring method based on a visual sensing network - Google Patents

Stage performance abnormal behavior monitoring method based on a visual sensing network

Info

Publication number
CN107463898A
CN107463898A (application CN201710644855.1A)
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710644855.1A
Other languages
Chinese (zh)
Inventor
Zhang Fuquan (张福泉)
Current Assignee
Minjiang University
Original Assignee
Minjiang University
Priority date
Filing date
Publication date
Application filed by Minjiang University
Priority application: CN201710644855.1A
Publication: CN107463898A
Legal status: Pending


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V40/20 — Recognition of biometric, human-related or animal-related patterns in image or video data: movements or behaviour, e.g. gesture recognition
    • G06T7/251 — Image analysis: analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
    • G06T7/254 — Image analysis: analysis of motion involving subtraction of images
    • G06V10/40 — Arrangements for image or video recognition or understanding: extraction of image or video features
    • G06V10/513 — Extraction of image or video features: sparse representations
    • G06T2207/10016 — Image acquisition modality: video; image sequence
    • G06T2207/20224 — Special algorithmic details: image subtraction
    • G06T2207/30232 — Subject of image: surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Other Investigation Or Analysis Of Materials By Electrical Means (AREA)

Abstract

The present invention relates to a stage performance abnormal behavior monitoring method based on a visual sensing network. A weighted-threshold background subtraction method segments the target image from the background image; the target is detected with a particle swarm optimization algorithm based on chaos search; target tracking is realized with the Mean Shift tracking algorithm; and abnormal behavior is detected with a method based on locally linear embedding and sparse representation, which fully analyzes the local manifold structure of the sample set and improves the detection efficiency and precision for stage performance abnormal behavior. When the method of the invention monitors stage performance abnormal behavior, it achieves high detection efficiency and precision with strong robustness.

Description

Stage performance abnormal behavior monitoring method based on a visual sensing network
Technical field
The present invention relates to a stage performance abnormal behavior monitoring method based on a visual sensing network.
Background technology
With the rapid development of computer vision and sensor network technology, visual sensing networks have broad application prospects in many monitoring fields. Applying visual sensing technology during stage performances in particular, using computer vision techniques such as image processing and pattern recognition to detect the targets and behaviors in the performance scene, realizes intelligent monitoring and analysis of target states and is significant for improving the performance quality of performers. Traditional monitoring methods for stage performance abnormal behavior based on sparse representation models do not analyze the local manifold structure of the performance process, so the sparse decomposition of behavior features fluctuates strongly and the recognition precision for abnormal behavior is low [1,2]. To solve this problem, the present application proposes a stage performance abnormal behavior monitoring method based on a visual sensing network, which applies an abnormal behavior detection method based on locally linear embedding and sparse representation on the visual sensing network platform, realizing efficient, high-precision monitoring of stage performance abnormal behavior.
The content of the invention
The object of the present invention is to provide a stage performance abnormal behavior monitoring method based on a visual sensing network which, when monitoring stage performance abnormal behavior, achieves high detection efficiency and precision with strong robustness.
To achieve the above object, the technical scheme of the present invention is: a stage performance abnormal behavior monitoring method based on a visual sensing network, comprising the following steps:
S1, realizing tracking shooting of the stage performance moving target by a CCD camera to obtain a captured stage performance image sequence;
S2, carrying out stage performance abnormal behavior recognition using a recognition method based on locally linear embedding and sparse representation.
In an embodiment of the present invention, the specific implementation process of step S1 is as follows:
S11, realizing the update of the background image based on the weighted-threshold background subtraction method:

Let the resolution of the stage performance image be M×N, the background image sequence be f'(x, y) = {f'_i(x, y), i = 1, 2, ..., n}, and the stage performance image sequence be f(x, y) = {f_i(x, y), i = 1, 2, ..., n}; the threshold method is then computed by the following process.

1) The average of the background images in the stage performance image sequence is obtained by averaging over the sequence; the average of the original stage video background images is expressed by formula (1):

f̄'(x, y) = (1/n) Σ_{i=1}^{n} f'_i(x, y)   (1)

2) Let the background image of the frame following the current stage performance image be f'_{i+1}(x, y), and let D(x, y) be the deviation that decides the update of the background image. On the basis of the average f̄'(x, y) of the original stage video background images, a threshold σ is set, and D(x, y) is expressed by formula (2):

D(x, y) = |f'_{i+1}(x, y) − f̄'(x, y)|   (2)

If D(x, y) > σ, the background of the stage performance image is changing frequently and the background image must be updated in time; if 0 ≤ D(x, y) ≤ σ, noise may affect the definition of the background image, which is handled by means of a weight coefficient as follows.

Let the weight coefficient be λ and g(x, y) the pixel sum of the background image of the (i+1)-th frame update; the threshold σ is then expressed by formula (3):

σ = λ · g(x, y) / (M × N)   (3)

By immediately comparing the background pixels of each frame with the original background image, and deriving the image adjustment threshold from the weighted sum, the problem of pixel-value fluctuation caused by noise is solved, and the definition of the sample background image is retained to the greatest extent;
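The update rule above can be sketched as follows; a minimal NumPy illustration assuming formula (2) is an absolute per-pixel deviation and formula (3) a weighted, resolution-normalized pixel sum (the function name `update_background` and the value of λ are hypothetical, not from the source):

```python
import numpy as np

def update_background(background_mean, new_frame, lam=0.5):
    """Weighted-threshold background update (sketch).

    background_mean -- running mean of the background sequence, formula (1)
    new_frame       -- next background frame f'_{i+1}(x, y)
    lam             -- weight coefficient lambda (assumed value)
    """
    M, N = new_frame.shape
    # Per-pixel deviation from the background mean (one reading of formula (2)).
    D = np.abs(new_frame - background_mean)
    # Threshold from the weighted, resolution-normalized pixel sum (formula (3)).
    sigma = lam * new_frame.sum() / (M * N)
    # Pixels whose deviation exceeds sigma are treated as genuine background
    # change and replaced; the rest are kept, suppressing noise.
    changed = D > sigma
    updated = np.where(changed, new_frame, background_mean)
    return updated, changed
```

A pixel that jumps far from the running mean is accepted as a real background change, while small deviations are discarded as noise, matching the two branches of the D(x, y) test above.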
S12, realizing moving target detection using the particle swarm optimization algorithm based on chaos search:

1) Initialize all parameters: c_1 and c_2 denote the learning factors, T_max the maximum number of iterations, U the control parameter, and T the initial number of chaos-search iterations; initialize the particle swarm, including random positions and velocities;

2) Evaluate the fitness of all particles with respect to the target, and record the best position P_j found so far by the j-th particle and the best position P_g found so far by the whole swarm;

3) Combining steps 1) and 2), apply one iteration to the particles in the swarm; if the current best individual satisfies the convergence condition or the maximum number of iterations is reached, go to step 5);

4) If the historical best particle position P_g of the whole swarm has not changed, or has changed only slightly, after T iterations of the swarm, set X'_j = P_g, obtain the optimal value X'_j by searching with the chaos search algorithm, set P_g = X'_j, and return to step 3) to carry out the subsequent particle swarm operations; otherwise go to step 5);

5) The evolution ends and the global optimal solution is returned. A fitness function based on the uniformity measure is introduced to obtain the optimal image segmentation threshold t. The stage performance image is divided by the threshold t into two regions R_I, I = 1, 2; the fitness function of the uniformity measure is expressed by formula (4):

UM(t) = 1 − (1/C) Σ_{I=1,2} Σ_{(x,y)∈R_I} (f_i(x, y) − f̄_I)^2   (4)

where R_I is region I after segmentation, f_i(x, y) is the gray value of the image at (x, y), f̄_I = (1/A_I) Σ_{(x,y)∈R_I} f_i(x, y), A_I denotes the number of pixels in region R_I, and C is a normalization constant; the optimal parameter λ then appears at λ* = argmax UM(t*(λ)).
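Steps 1) through 5) can be sketched as follows; a toy illustration in which a uniformity fitness in the spirit of formula (4) scores a gray-level threshold, and a logistic map perturbs a stalled global best (the function names, the inertia weight 0.7, the stall count, and the perturbation scale are all assumptions, not values from the source):

```python
import numpy as np

rng = np.random.default_rng(0)

def uniformity_fitness(t, pixels):
    # Uniformity measure in the spirit of formula (4): penalize the
    # within-region variance of the two regions split by threshold t.
    lo, hi = pixels[pixels <= t], pixels[pixels > t]
    var = sum(((r - r.mean()) ** 2).sum() for r in (lo, hi) if r.size)
    return 1.0 - var / (pixels.size * 255.0 ** 2)  # normalization constant C

def chaotic_pso(pixels, n=20, t_max=60, c1=2.0, c2=2.0, stall=10):
    x = rng.uniform(0, 255, n)
    v = np.zeros(n)
    p = x.copy()
    p_fit = np.array([uniformity_fitness(t, pixels) for t in x])
    g, g_fit, since = p[p_fit.argmax()], p_fit.max(), 0
    for _ in range(t_max):
        r1, r2 = rng.random(n), rng.random(n)
        v = 0.7 * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, 0, 255)
        f = np.array([uniformity_fitness(t, pixels) for t in x])
        better = f > p_fit
        p[better], p_fit[better] = x[better], f[better]
        if p_fit.max() > g_fit:
            g, g_fit, since = p[p_fit.argmax()], p_fit.max(), 0
        else:
            since += 1
        if since >= stall:  # chaos search: perturb the stalled global best
            z = rng.random()
            for _ in range(5):
                z = 4.0 * z * (1.0 - z)  # logistic map
                cand = np.clip(g + (z - 0.5) * 50.0, 0, 255)
                cf = uniformity_fitness(cand, pixels)
                if cf > g_fit:
                    g, g_fit = cand, cf
            since = 0
    return g, g_fit
```

The chaotic perturbation plays the role of step 4): when P_g has not improved for several generations, candidates generated by the logistic map pull the swarm away from a local optimum.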
S13, stage performance moving target tracking based on the Mean Shift algorithm:

Suppose the d-dimensional space R^d contains m sample points x_J, J = 1, 2, ..., m; the Mean Shift vector at point x is then:

M_h(x) = (1/k) Σ_{x_J∈S_h} (x_J − x)   (5)

where (x_J − x) denotes the offset vector of sample point x_J relative to point x, S_h denotes the high-dimensional spherical region of radius h, and the vector M_h(x) is the average of the offset vectors, relative to point x, of the k sample points that fall into the region S_h. If y denotes a point falling into the region S_h, the set of such points is:

S_h = {y : (y − x)^T (y − x) ≤ h^2}   (6)

and k denotes the number of the m sample points x_J that enter S_h. If the sample points x_J are drawn from a probability density function f(x), then, because nonzero probability density gradients point in the direction of fastest density growth, most of the sample points in S_h are distributed along the direction of the probability density gradient. The mean offset vector M_h(x) therefore points toward the region where the samples are densest, which is also the gradient direction of the probability density function. (In the usual illustration, the large circle represents the region S_h, the small circles the sample points x_J ∈ S_h that enter it, the central dot the Mean Shift reference point X, and the arrows the offset vectors of the sample points from X.)

The Mean Shift algorithm then implements the tracking of the moving target by first drawing the color histograms of the target model and the candidate model, and then comparing the similarity of the two models.
In an embodiment of the present invention, the Mean Shift algorithm first draws the color histograms of the target model and the candidate model, and then implements the tracking of the moving target by comparing the similarity of the two models, specifically as follows:

S131, establishing the target model

Using stage performance detection, the moving target is divided into several regions, and the target region is chosen according to the gray or color histogram of the moving target. Let the target center be at point x_o, let x_J denote each pixel, and let the feature values be u = 1, 2, ..., m'. The probability q̂_u of feature value u in the target region is described by formula (7):

q̂_u = C Σ_{J=1}^{m} k(||(x_o − x_J)/h||^2) δ[b(x_J) − u]   (7)

where k(·) is the kernel profile, b(x_J) maps pixel x_J to its feature bin, δ is the Kronecker delta function, and C is a normalization constant.

S132, modeling the candidate target

After moving target detection is completed, the moving target region covered in each subsequent frame is treated as the candidate target. Let the center coordinate of the kernel function of the target region be y and the number of pixels of the target region be m_h; the probability density function of the u-th feature value of the candidate model is then:

p̂_u(y) = C_h Σ_{J=1}^{m_h} k(||(y − x_J)/h||^2) δ[b(x_J) − u]   (8)

where C_h is a normalization constant. Following the target is then treated as the process of retrieving the optimal y while ensuring that p̂(y) has maximum similarity with q̂.

S133, likelihood function

The target model q̂ and the candidate model p̂(y) are computed, and the distribution of the target model is analyzed through the Bhattacharyya coefficient likelihood function:

ρ̂(y) = Σ_{u=1}^{m'} √(p̂_u(y) q̂_u)   (9)

Over its range [0, 1], the higher its value, the higher the similarity of the models p̂ and q̂. A Taylor series expansion of formula (9) at the point p̂_u(y_0) gives:

ρ̂(y) ≈ (1/2) Σ_{u=1}^{m'} √(p̂_u(y_0) q̂_u) + (1/2) Σ_{u=1}^{m'} p̂_u(y) √(q̂_u / p̂_u(y_0))   (10)

Substituting formula (8) into formula (10) gives:

ρ̂(y) ≈ (1/2) Σ_{u=1}^{m'} √(p̂_u(y_0) q̂_u) + (C_h/2) Σ_{J=1}^{m_h} w_J k(||(y − x_J)/h||^2),  w_J = Σ_{u=1}^{m'} √(q̂_u / p̂_u(y_0)) δ[b(x_J) − u]   (11)

S134, computing the offset

The optimal Mean Shift vector value is obtained from the computation of the Bhattacharyya coefficient; this value is the vector by which the stage performance moving target moves from its original position. Through loop iteration, the optimal matching position y_1 of the target in the subsequent frame is obtained.
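The steps S131 through S134 can be sketched as follows; a minimal grayscale illustration in which `hist_model` stands in for the models of formulas (7)/(8), `bhattacharyya` for formula (9), and one `track_step` moves the window to the centroid weighted by √(q_u/p_u) as in formula (11). The function names, bin count, and Epanechnikov kernel profile are assumptions, not specifics from the source:

```python
import numpy as np

def hist_model(patch, bins=16):
    # Kernel-weighted gray-level histogram: a stand-in for the target and
    # candidate models of formulas (7)/(8).
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = ((yy - h / 2) / (h / 2)) ** 2 + ((xx - w / 2) / (w / 2)) ** 2
    k = np.clip(1.0 - r2, 0.0, None)          # Epanechnikov profile
    idx = patch.astype(int) * bins // 256
    q = np.bincount(idx.ravel(), weights=k.ravel(), minlength=bins)
    return q / q.sum()

def bhattacharyya(p, q):
    # Similarity of formula (9).
    return np.sqrt(p * q).sum()

def track_step(frame, centre, size, q, bins=16):
    # One Mean Shift iteration: weight candidate pixels by sqrt(q_u / p_u)
    # (the w_J of formula (11)) and move the window to their centroid.
    cy, cx = centre
    h, w = size
    patch = frame[cy - h // 2:cy + h // 2, cx - w // 2:cx + w // 2]
    p = hist_model(patch, bins)
    idx = patch.astype(int) * bins // 256
    wgt = np.sqrt(q[idx] / np.maximum(p[idx], 1e-12))
    yy, xx = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    dy = (wgt * (yy - h // 2)).sum() / wgt.sum()
    dx = (wgt * (xx - w // 2)).sum() / wgt.sum()
    return int(round(cy + dy)), int(round(cx + dx))
```

Iterating `track_step` from an off-target start drifts the window onto the object, which is the loop iteration that yields the optimal matching position y_1.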
In an embodiment of the present invention, the specific implementation process of step S2 is as follows:
S21, sparse representation of stage performance behavior features:

1) Sparse representation algorithm based on the l_1 norm

Suppose there are M distinct classes and the number of training samples in the i-th class is N_i; features of the same class lie approximately in a low-dimensional subspace. Treat the test sample y as d-dimensional; if y belongs to the i-th class, y can be described by a linear combination of the i-th class training samples D_i, that is, y ≈ D_i α_i, where α_i is the representation of y concentrated on D_i. More generally, y can be sparsely described over the dictionary D = [D_1, ..., D_M] = [d_1, ..., d_N] ∈ R^{d×N} built from all M classes; the sparse representation of y is obtained by formula (15):

a = min_a ||y − Da||_2 + λ'||a||_1   (15)

where a is the sparse representation coefficient of y in the dictionary D and λ' constrains the sparsity of the coefficients;
2) Sparse representation based on locally linear embedding

LLE is a manifold learning algorithm. It assumes that the sample set is drawn from a smooth manifold: in a low-dimensional region every sample can be described linearly by its nearest neighbors, while the local linearity of the high-dimensional samples remains stable, so that the intrinsic structure of the data set is preserved. Let the stage performance behavior data set be Y = [y_1, y_2, ..., y_n]. Based on LLE it is assumed that y_i can be described linearly by its nearest neighbors in the test sample set on the same manifold, and that the sparse representation coefficients of y_i are described linearly by the corresponding neighbors with the same weights. The quadratic constraint of LLE is described by formula (16):

Σ_i ||α_i − Σ_j v_ij α_j||^2   (16)

where v_ij is the reconstruction weight of α_j and N(y_i) is the set of nearest neighbors of y_i; v_ij is obtained by formula (17):

min_V Σ_i ||y_i − Σ_{y_j∈N(y_i)} v_ij y_j||^2,  subject to Σ_j v_ij = 1   (17)

Formula (16) can then be transformed into:

Σ_i ||α_i − Σ_j v_ij α_j||^2 = Tr(A M A^T)   (18)

where I is the identity matrix, M = (I − V)(I − V)^T is the matrix constructed according to LLE, and V = [v_ij]. Incorporating formula (18) into the sparse representation formula, the sparse representation based on locally linear embedding is expressed by formula (19):

min_A ||Y − DA||_2 + λ_0 ||A||_1 + λ_1 Tr(A M A^T)   (19)

where λ_0 and λ_1 are regularization parameters;

3) Obtaining the sparse representation coefficients

Each α_i in formula (19) can be described as:

g(α_i) = ||y_i − D α_i||^2 + λ_0 ||α_i||_1 + λ_1 Σ_j M_ij α_i^T α_j   (20)

If the remaining vectors {α_j}_{j≠i} are held fixed, the solution process can be described by the following formula:

min_{α_i} ||y_i − D α_i||^2 + λ_0 ||α_i||_1 + λ_1 (M_ii α_i^T α_i + 2 α_i^T Σ_{j≠i} M_ij α_j)   (21)

where α_i^j denotes the j-th component of α_i. Formula (21) is solved by the feature-sign search algorithm, defining at the same time:

f(α_i) = ||y_i − D α_i||^2 + λ_1 (M_ii α_i^T α_i + 2 α_i^T Σ_{j≠i} M_ij α_j)   (22)

so that g(α_i) = f(α_i) + λ_0 ||α_i||_1;
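The objective of formula (19) can also be minimized with a plain proximal-gradient (ISTA-style) iteration instead of the feature-sign search named in the source; the sketch below makes that substitution, and the helper names, neighbor count k, and parameter values are all assumptions:

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lle_weights(Y, k=2):
    # Reconstruction weights v_ij of formula (17): each column of Y is
    # approximated by an affine combination of its k nearest neighbors.
    n = Y.shape[1]
    V = np.zeros((n, n))
    for i in range(n):
        d = np.linalg.norm(Y - Y[:, [i]], axis=0)
        d[i] = np.inf
        nb = np.argsort(d)[:k]
        G = Y[:, nb] - Y[:, [i]]
        C = G.T @ G + 1e-6 * np.eye(k)        # regularized local Gram matrix
        w = np.linalg.solve(C, np.ones(k))
        V[i, nb] = w / w.sum()                # enforce the sum-to-one constraint
    return V

def lle_sparse_code(Y, D, lam0=0.01, lam1=0.01, n_iter=500):
    # ISTA-style minimization of formula (19):
    #   ||Y - DA||^2 + lam0*||A||_1 + lam1*Tr(A M A^T)
    I_n = np.eye(Y.shape[1])
    V = lle_weights(Y)
    M = (I_n - V) @ (I_n - V).T               # the LLE matrix of formula (18)
    A = np.zeros((D.shape[1], Y.shape[1]))
    # Lipschitz bound on the smooth part's gradient sets the step size.
    L = 2.0 * (np.linalg.norm(D, 2) ** 2 + lam1 * np.linalg.norm(M, 2)) + 1e-3
    for _ in range(n_iter):
        grad = 2.0 * D.T @ (D @ A - Y) + 2.0 * lam1 * A @ M
        A = soft_threshold(A - grad / L, lam0 / L)
    return A
```

The soft-threshold step handles the l_1 term, while the gradient step handles both the reconstruction term and the manifold regularizer Tr(A M A^T).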
S22, stage performance abnormal behavior recognition:

Based on the sparse representation coefficients α_i obtained above, the class of the test sample can be determined. Let the i-th reconstruction error, which describes the difference between the test sample and its reconstruction by the i-th class, be:

r_i(y) = ||y − D_i α_i||,  i = 1, ..., M   (23)

where the nonzero coefficients in α_i describe the atoms in the corresponding class; the type of y is obtained from the minimum residual in formula (24):

class(y) = argmin_{i=1,...,M} r_i(y)   (24)

Because of the interference of the high dimensionality of stage performance behavior features, marking the performance behavior training samples in a real detection process is complex, and formula (23) is then an overdetermined system; it should be converted into an underdetermined one by raising the dimension of D and incorporating a d×d identity matrix, transforming formula (23) into:

a = min_α ||y − Bα||_2 + λ||α||_1   (25)

where B = [D I] ∈ R^{d×(N+d)}, a = [α^T η^T]^T ∈ R^{N+d}, and η ∈ R^d denotes the error vector; formula (25) ensures that formula (23) is converted into an underdetermined system while allowing for the error that may be present in formula (23).

The objective function incorporating the error vector is then:

R_i(y) = ||y − D_i α_i − η||,  i = 1, ..., M   (26)

and applying the minimum residual to formula (27) gives the classification of y:

class(y) = argmin_{i=1,...,M} R_i(y)   (27).
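The minimum-residual rule of formulas (23)/(24) can be sketched as follows; for illustration the per-class coefficients are obtained by least squares rather than the sparse solver of the source, and the function name is hypothetical:

```python
import numpy as np

def classify_by_residual(y, dictionaries):
    # Minimum-residual rule of formulas (23)/(24): assign y to the class
    # whose sub-dictionary D_i reconstructs it with the smallest error.
    # For illustration the coefficients are found by least squares rather
    # than the sparse solver described in the source.
    residuals = []
    for D_i in dictionaries:
        a_i, *_ = np.linalg.lstsq(D_i, y, rcond=None)
        residuals.append(np.linalg.norm(y - D_i @ a_i))
    return int(np.argmin(residuals)), residuals
```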
Compared with the prior art, the invention has the following advantages: when the method of the invention monitors stage performance abnormal behavior, it achieves high detection efficiency and precision with strong robustness.
Brief description of the drawings
Fig. 1 is the overall design strategy of the stage performance abnormal behavior monitoring system based on a visual sensing network of the present invention.
Fig. 2 is the hardware composition diagram of the system of the present invention.
Fig. 3 is the functional module structure diagram of the system software of the present invention.
Fig. 4 is the stage performance abnormal behavior monitoring flow chart of the present invention.
Fig. 5 is the flow chart of the abnormal behavior detection method based on locally linear embedding and sparse representation of the present invention.
Fig. 6 shows the stage personnel feature extraction results of the present invention.
Fig. 7 is the training result of the conventional method (the horizontal line is the expected error).
Fig. 8 is the training result of the method of the present invention (the horizontal line is the expected error).
Embodiment
The technical scheme of the present invention is specifically described below with reference to the accompanying drawings.
As shown in Fig. 1, the overall layout strategy of the system for the stage performance abnormal behavior monitoring method based on a visual sensing network of the present invention is: when an image that does not match the system's predefined patterns appears during stage performance video monitoring, the system automatically raises an alarm to attract the attention of the monitoring personnel, who immediately collect the predefined abnormal performance sample and take corresponding measures.
1. Stage performance abnormal behavior monitoring system based on a visual sensing network
Analyzing Fig. 1, it can be seen that the system consists of a video acquisition module, an image processing module, and a results processing module. The video acquisition module consists of a CCD camera and a video encoder and collects stage performance behavior images; the network transmits the collected stage performance behavior information to the central processor of the main monitoring station for processing and analysis [3], which together with the stage performance abnormal behavior recognition algorithm forms the image processing module; the result display and warning system form the results processing module, which outputs the stage performance abnormal behavior. The video encoder realizes the compression and decompression of the video data; this article uses an H.264 digital video encoder, and the stage performance image data is transmitted through the sensing network.
1.1 System hardware composition
As shown in Fig. 2, the system hardware of the present invention mainly consists of a PC, a CCD camera, a motor, and other control modules. After receiving the image acquisition command from the PC server, the CCD camera collects stage performance behavior images [4]; meanwhile the acquired image information is fed back into computer memory through the transmission equipment, the collected data is presented and analyzed by the image processing software, which judges whether stage performance abnormal behavior exists, and the warning device acts on the stage performance target according to the computer detection result.
1.2 System software design
The key of the system of the present invention is the video analysis algorithm, which consists of image preprocessing, motion detection, target tracking, and abnormal behavior recognition [7]; these functions are given a complete, operable control interface by the system software, realizing operations such as analysis, acquisition, and processing of stage performance behavior.
1.2.1 Functional modules of the system software
This article uses the Visual Studio 2008 development tool under Windows to design the system software; the module structure of the software is described by Fig. 3. Each module is designed according to the system functions, realizing functions such as stage performance visual image acquisition, image processing, and results processing. The modules interact through the image information stream. The video receiving and sending module collects stage performance behavior video information and transmits it outward by multicast; the image acquisition module collects the set abnormal performance behavior samples; the image processing module performs pre-processing and abnormal recognition detection on the images [5] and is the key component of the system; the results processing module stores the output results.
1.2.2 Stage performance abnormal behavior monitoring flow
To monitor stage performance abnormal behavior efficiently and accurately in video monitoring, the detection and recognition of performance behavior is the most critical process; the stage performance abnormal behavior monitoring flow designed for the system is described by Fig. 4.

Stage performance moving target detection monitors the target range from a fixed viewpoint, and the original stage performance image sequence is captured against the background image. After the moving target image is detected, it is pre-processed and denoised to enhance the definition of the stage performance moving target and reduce image noise, and then the image is segmented. The features of the segmented stage performance moving target are collected to complete abnormal behavior recognition.
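The detection step of the flow above can be sketched at its simplest as frame differencing; a minimal illustration in which the function name and threshold value are assumptions, and which differences against the previous frame rather than the maintained background model of the source:

```python
import numpy as np

def detect_motion(prev_frame, frame, thresh=25):
    # Difference against the previous frame, threshold, and return the
    # bounding box (ymin, xmin, ymax, xmax) of the changed pixels, or
    # None when the scene is static. thresh is an assumed value.
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    mask = diff > thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())
```

The returned box would then feed the preprocessing, segmentation, and feature collection stages described above.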
2. Stage performance abnormal behavior monitoring method
2.1 Realizing the update of the background image based on the weighted-threshold background subtraction method
The stage monitoring scene is dynamic; if the background cannot change in time, the detection of stage performance targets is affected, so the background must be updated in real time. Here the background subtraction method is combined with a weighted threshold factor [6], and the chaos-search particle swarm can be used to screen for and search the parameters of the weighting factor.
Let the resolution of the stage performance image be M×N, the background image sequence be f'(x, y) = {f'_i(x, y), i = 1, 2, ..., n}, and the stage performance image sequence be f(x, y) = {f_i(x, y), i = 1, 2, ..., n}; the threshold method is then computed by the following process.

1) The average of the background images in the stage performance image sequence is obtained by averaging over the sequence; the average of the original stage video background images is expressed by formula (1):

f̄'(x, y) = (1/n) Σ_{i=1}^{n} f'_i(x, y)   (1)

2) Let the background image of the frame following the current stage performance image be f'_{i+1}(x, y), and let D(x, y) be the deviation that decides the update of the background image. On the basis of the average f̄'(x, y) of the original stage video background images, a threshold σ is set, and D(x, y) is expressed by formula (2):

D(x, y) = |f'_{i+1}(x, y) − f̄'(x, y)|   (2)

If D(x, y) > σ, the background of the stage performance image is changing frequently and the background image must be updated in time; if 0 ≤ D(x, y) ≤ σ, noise may affect the definition of the background image, and the problem of noise in the updated image is handled by means of a weight coefficient [7].

The size of the threshold σ directly affects whether the background is updated. If σ is too small, more noise is admitted, and noise is mistaken for image content that needs updating, making the stage performance background image blurred. If σ is too large, stable pixels that should be part of the background update are lost. The threshold method can judge the partially updated background image: images above the threshold need updating, but the updated image may still contain noise, which shows that the threshold method is limited by other factors [8]. This article therefore combines a weighting factor and, by regulating it, realizes the update of the background picture [9].

Let the weight coefficient be λ and g(x, y) the pixel sum of the background image of the (i+1)-th frame update; the threshold σ is then expressed by formula (3):

σ = λ · g(x, y) / (M × N)   (3)

By immediately comparing the background pixels of each frame with the original background image, and deriving the image adjustment threshold from the weighted sum, the problem of pixel-value fluctuation caused by noise is solved, and the definition of the sample background image is retained to the greatest extent.
2.2 Realizing moving target detection using the particle swarm optimization algorithm based on chaos search

When the present application detects stage performance targets, the chaos search method is incorporated into the particle swarm optimization algorithm. If the particle swarm algorithm runs into the problem of local optimal solutions [10], the global optimal position is obtained by the chaos search method and particles stuck at local optimal positions are filtered out, preventing the particle swarm algorithm from settling on local optimal solutions and strengthening the convergence efficiency of the algorithm. The specific process is:

1) Initialize all parameters: c_1 and c_2 denote the learning factors, T_max the maximum number of iterations, U the control parameter, and T the initial number of chaos-search iterations; initialize the particle swarm, including random positions and velocities;

2) Evaluate the fitness of all particles with respect to the target, and record the best position P_j found so far by the j-th particle and the best position P_g found so far by the whole swarm;

3) Combining steps 1) and 2), apply one iteration to the particles in the swarm; if the current best individual satisfies the convergence condition or the maximum number of iterations is reached, go to step 5);

4) If the historical best particle position P_g of the whole swarm has not changed, or has changed only slightly, after T iterations of the swarm, set X'_j = P_g, obtain the optimal value X'_j by searching with the chaos search algorithm, set P_g = X'_j, and return to step 3) to carry out the subsequent particle swarm operations; otherwise go to step 5);

5) The evolution ends and the global optimal solution is returned. A fitness function based on the uniformity measure is introduced [11] to obtain the optimal image segmentation threshold t. The stage performance image is divided by the threshold t into two regions R_I, I = 1, 2; the fitness function of the uniformity measure is expressed by formula (4):

UM(t) = 1 − (1/C) Σ_{I=1,2} Σ_{(x,y)∈R_I} (f_i(x, y) − f̄_I)^2   (4)

where R_I is region I after segmentation, f_i(x, y) is the gray value of the image at (x, y), f̄_I = (1/A_I) Σ_{(x,y)∈R_I} f_i(x, y), A_I denotes the number of pixels in region R_I, and C is a normalization constant; the optimal parameter λ then appears at λ* = argmax UM(t*(λ)).
2.3 Stage performance moving target tracking based on the Mean Shift algorithm
The present application uses the Mean Shift algorithm to realize the tracking of stage performance targets, handling occlusion and information loss of the moving target. Edge features of the moving target are collected by the LOG operator detection method, these edge features are tracked with the Mean Shift algorithm, and the subsequent position of the moving target is predicted based on Kalman filtering. Mean Shift is a non-parametric algorithm based on density gradients; when tracking the moving target frame, it performs efficient iterative computation through an optimal search of the pixel feature point probability density function to obtain the local maximum of the density function [12], completing the efficient localization of the moving target, and it can adapt to compound motion states of the stage performance moving target such as deformation and rotation.
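The Kalman prediction used to bridge occlusions can be sketched with a constant-velocity model; a minimal illustration whose class name and noise levels are assumptions, not values from the source:

```python
import numpy as np

class ConstantVelocityKalman:
    # Minimal constant-velocity Kalman filter for a 2-D target position:
    # state is (x, y, vx, vy); predict() alone bridges frames in which the
    # target is occluded and no measurement is available.
    def __init__(self, q=1e-2, r=1.0):
        self.F = np.array([[1., 0., 1., 0.],
                           [0., 1., 0., 1.],
                           [0., 0., 1., 0.],
                           [0., 0., 0., 1.]])   # constant-velocity transition
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])   # we observe position only
        self.Q = q * np.eye(4)                  # process noise (assumed)
        self.R = r * np.eye(2)                  # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 10.0

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In use, each Mean Shift match would call `update` with the matched position, and during occlusion only `predict` is called to carry the window forward.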
It is space R to set ddM sample point x be presentJ, J=1,2..., m, then the Mean Shift vectors on x points be:
Wherein, with (xJ- x) description sample point xJRelative to point x offset vector, S is usedhDescription radius is that h higher-dimension is spherical Region, vector Mh(x) it is to falling into region ShIn k sample point relative to point x offset vector average vector value;If y It is to fall into region ShIn point set, then the set of y points is obtained by formula below is:
Sh={ y:(y-x)T(y-x)≤h2} (6)
Described with k in m sample point xJIn there is k point to enter ShIn;If sample point is obtained in probability density function f (x) xJ, it is most of using average value computing because the maximum direction of probability density gradient trend and the probability density growth of non-zero is consistent ShIn sample point along probability density gradient trend be distributed[13];Therefore, average offset vector M is dissected based on mathematicsh(x) with The most wide region trend of sample distribution is consistent, while is also the gradient direction of probability density function, wherein, the area that great circle represents Domain is exactly Sh, small circle, which represents, enters ShSample point x in regionJ∈Sh, Mean Shift datum mark X retouches with center stain State, sample point offsets datum mark X vector is described with arrow;
The Mean Shift tracking algorithm then first draws the color histograms of the target model and the candidate model, and implements tracking of the moving target by contrasting the similarity of the two models. The specific implementation is as follows:
S131, establish the target model
Stage-performance detection divides the moving target into several regions, and the target area is chosen according to the gray-level or color histogram of the moving target. Let the target center be at point x_o, let x_J describe each pixel, and let the feature values be u = 1, 2, ..., m'; the feature-value probability \(\hat q_u\) of the corresponding target range is described by:

$$\hat q_u=C\sum_{J=1}^{m}k\left(\left\|\frac{x_J^s-x_o}{h}\right\|^2\right)\delta[b(x_J^s)-u],\qquad C=\frac{1}{\sum_{J=1}^{m}k\left(\left\|\frac{x_o-x_J}{h}\right\|^2\right)}\qquad(7)$$
S132, establish the candidate model

After moving-object detection is completed, the moving-target range covered in each follow-up frame is treated as the candidate target. Let the center coordinate of the kernel function over the target range be y, and let the number of pixels in the target range be m_h; the probability prediction density function of the u-th feature value of the candidate model is then:

$$\hat p_u(y)=C_h\sum_{J=1}^{m_h}k\left(\left\|\frac{y-x_J}{h}\right\|^2\right)\delta[b(x_J)-u]\qquad(8)$$

where C_h is the normalization constant; target tracking is then regarded as the process of retrieving the optimal y, while ensuring that \(\hat p_u(y)\) has maximum similarity with \(\hat q_u\);
S133, likelihood function

Given the target model \(\hat q_u\) and the candidate model \(\hat p_u(y)\), the Bhattacharyya coefficient \(\hat\rho(y)\) is used as the likelihood function to analyze the distribution of the target model:

$$\hat\rho(y)\equiv\rho[\hat p(y),\hat q]=\sum_{u=1}^{m'}\sqrt{\hat p_u(y)\hat q_u}\qquad(9)$$

The value lies in the span [0, 1]; the higher the value, the higher the similarity of the models \(\hat p\) and \(\hat q\). A Taylor series expansion of formula (9) at the point \(\hat p_u(y_0)\) gives:

$$\rho[\hat p(y),\hat q]\approx\frac{1}{2}\sum_{u=1}^{m'}\sqrt{\hat p_u(y_0)\hat q_u}+\frac{1}{2}\sum_{u=1}^{m'}\hat p_u(y)\sqrt{\frac{\hat q_u}{\hat p_u(y_0)}}\qquad(10)$$

Substituting formula (8) into formula (10) gives:

$$\rho[\hat p(y),\hat q]\approx\frac{1}{2}\sum_{u=1}^{m'}\sqrt{\hat p_u(y_0)\hat q_u}+\frac{C_h}{2}\sum_{J=1}^{m}w_Jk\left(\left\|\frac{y-x_J}{h}\right\|^2\right)\qquad(11)$$

$$w_J=\sum_{u=1}^{m'}\delta[b(x_J)-u]\sqrt{\frac{\hat q_u}{\hat p_u(y_0)}}\qquad(12)$$
S134, compute the offset

The optimal Mean Shift vector value, which is the vector by which the stage-performance moving target shifts from its original position, is obtained from the computation of the Bhattacharyya coefficient; through loop iteration, the optimal matching position y_1 of the target in the subsequent frame is obtained:

$$y_1=\frac{\sum_{J=1}^{m}x_Jw_Jg\left(\left\|\frac{y-x_J}{h}\right\|^2\right)}{\sum_{J=1}^{m}w_Jg\left(\left\|\frac{y-x_J}{h}\right\|^2\right)}\qquad(13)$$
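The tracking loop of steps S131-S134 can be sketched as follows. This is a minimal sketch with several assumptions not in the original: gray-level histograms replace the color histograms, the kernel profile is taken as Epanechnikov (whose derivative g is constant, so formula (13) reduces to a weighted centroid of the window), and the two toy frames are hypothetical:

```python
import numpy as np

def kernel_profile(r2):
    # Epanechnikov profile k(r^2); assumed choice of kernel
    return np.where(r2 < 1.0, 1.0 - r2, 0.0)

def hist_model(gray, center, h, nbins=16):
    """Kernel-weighted gray-level histogram, in the spirit of formula (7)."""
    ys, xs = np.mgrid[0:gray.shape[0], 0:gray.shape[1]]
    r2 = ((ys - center[0]) ** 2 + (xs - center[1]) ** 2) / h ** 2
    w = kernel_profile(r2)
    bins = (gray * nbins // 256).astype(int)
    q = np.bincount(bins.ravel(), weights=w.ravel(), minlength=nbins)
    return q / q.sum()

def mean_shift_track(gray, q, y0, h, nbins=16, iters=20):
    """One-frame tracking: candidate model, pixel weights as in formula (12),
    and the centroid update of formula (13) with constant g."""
    y = np.asarray(y0, dtype=float)
    for _ in range(iters):
        p = hist_model(gray, y, h, nbins)
        ratio = np.sqrt(np.divide(q, p, out=np.zeros_like(q), where=p > 0))
        ys, xs = np.mgrid[0:gray.shape[0], 0:gray.shape[1]]
        r2 = ((ys - y[0]) ** 2 + (xs - y[1]) ** 2) / h ** 2
        inside = r2 < 1.0
        wJ = ratio[(gray * nbins // 256).astype(int)] * inside  # per-pixel w_J
        denom = wJ.sum()
        if denom == 0:
            break
        y1 = np.array([(wJ * ys).sum(), (wJ * xs).sum()]) / denom  # formula (13)
        step = np.linalg.norm(y1 - y)
        y = y1
        if step < 0.1:
            break
    return y

# toy frames: a bright 10x10 square moves 6 pixels to the right
f1 = np.zeros((60, 60)); f1[20:30, 20:30] = 200
f2 = np.zeros((60, 60)); f2[20:30, 26:36] = 200
q = hist_model(f1, (24.5, 24.5), h=8)
y = mean_shift_track(f2, q, (24.5, 24.5), h=8)
# y moves toward the new square centre near column 30
```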
2.4, Stage-performance abnormal behavior recognition method based on locally linear embedding and sparse description
2.4.1, Method flow
The recognition of stage-performance abnormal behavior can be seen as a classification process over time-varying data, in which an input sequence is contrasted with set canonical behavior sequences. Stage-performance abnormal behavior is a complex abnormal-behavior pattern, and an auxiliary model is needed to describe stage-performance behavior and complete accurate identification of stage-performance abnormal behavior. Because stage-performance behavioral features have large dimensionality, high data volume, and dynamic fluctuation of the local manifold structure of the features, the abnormal behavior detection method based on locally linear embedding and sparse description is adopted to realize the identification of stage-performance abnormal behavior. This method gathers crowd-motion features by an optimized flow-partition algorithm, partitions the different frame images of the video sequence into blocks, and collects low-dimensional behavioral features; classification of the acquired low-dimensional behavioral features is implemented by locally-linear-embedding sparse description [14]. An LLE regular term is incorporated into the sparse classification model to handle the dynamic fluctuation phenomenon of the local manifold structure. The locally-linear-embedding regular term preserves the local manifold structure of the detection samples to the utmost extent, greatly enhancing the identification performance on the samples. The flow chart of this algorithm is described by Fig. 5.
2.4.2, Sparse description of stage-performance behavioral features
1) Sparse description algorithm based on the l1 norm

Suppose there exist M markedly different classes, with N_i training samples in the i-th class; the training samples are then D^i = [d_1^i, ..., d_{N_i}^i], and features of the same type approximate a low-dimensional subspace. Treat the detection sample y as m-dimensional; if y belongs to the i-th class, y can be described by a linear combination of the i-th-class training samples D^i:

$$y=d_1^i\alpha_1^i+\cdots+d_{N_i}^i\alpha_{N_i}^i=D^i\alpha^i\qquad(14)$$

where α^i is the concentrated description of y in D^i, and y can be sparsely described over the dictionary constructed from the M classes, D = [D^1, ..., D^M] = [d_1, ..., d_N] ∈ R^{d×N}; the sparse description of y is obtained by formula (15):

$$\hat a=\arg\min_a\|y-Da\|_2+\lambda'\|a\|_1\qquad(15)$$

where a is the sparse description coefficient vector of y over the dictionary D, and λ' constrains the sparse coefficients;
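A minimal sketch of solving the l1 problem of formula (15): the document's solver is the feature-sign search algorithm; iterative shrinkage-thresholding (ISTA) is used here only as a simple stand-in, and the random dictionary and the 3-sparse code are hypothetical test data:

```python
import numpy as np

def ista(D, y, lam, iters=2000):
    """Solve min_a 0.5*||y - D a||_2^2 + lam*||a||_1 (the Lasso form of
    formula (15)) by iterative shrinkage-thresholding; a stand-in for the
    feature-sign search solver named in the text."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(iters):
        grad = D.T @ (D @ a - y)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary atoms
a_true = np.zeros(50); a_true[[3, 17, 41]] = [1.0, -0.8, 0.5]
y = D @ a_true
a = ista(D, y, lam=0.01)
# the recovered code is sparse and concentrated on atoms 3, 17 and 41
```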
2) Sparse description based on locally linear embedding

LLE is a manifold learning algorithm which assumes that the sample set is obtained from a smooth manifold: a sample in the low-dimensional region can be described linearly by its neighbor points, while the local linearity of the high-dimensional samples is kept stationary, so that the intrinsic structure of the data set remains stable. Let the stage-performance behavioral data set be Y = [y_1, y_2, ..., y_n]; based on LLE it is assumed that y_i can be described linearly by its nearest neighbors in the detection sample set on the same manifold, and that the sparse description coefficient of y_i is described linearly by its respective neighbors with the same weights. The quadratic constraint of LLE can then be described by formula (16):

$$\min_V\sum_{i=1}^{n}\Big\|\alpha_i-\sum_{y_j\in N(y_i)}v_{ij}\alpha_j\Big\|^2,\quad\text{s.t.}\ \sum_j v_{ij}=1\qquad(16)$$

where v_ij is the reconstruction weight of α_j and N(y_i) is the set of nearest neighbors of y_i; v_ij is obtained using formula (17). Formula (17) can then be transformed into matrix form:

$$\min_A\ \mathrm{Tr}(AMA^T),\qquad M=(I-V)(I-V)^T\qquad(18)$$

where I is the identity matrix, M = (I − V)(I − V)^T is the LLE matrix, and V = [v_ij] is the weight matrix. Incorporating formula (18) into the sparse description formula, the sparse description based on locally linear embedding is represented by formula (19):

$$\min_A\|Y-DA\|^2+\lambda_0\|A\|_1+\lambda_1\mathrm{Tr}(AMA^T)\qquad(19)$$

In formula (19), λ_0 and λ_1 are regularization parameters;
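The construction of the LLE weight matrix V and of the matrix M = (I − V)(I − V)^T can be sketched as follows; the closed-form local least-squares weights and the Gram-matrix regularization follow the standard LLE recipe and are assumptions of this sketch, as is the random sample set:

```python
import numpy as np

def lle_weights(Y, n_neighbors=3, reg=1e-3):
    """Standard LLE reconstruction weights: for each sample y_i, weights
    v_ij over its nearest neighbours minimising ||y_i - sum_j v_ij y_j||^2
    subject to sum_j v_ij = 1 (the step the text solves via formula (17))."""
    n = Y.shape[0]
    V = np.zeros((n, n))
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:n_neighbors + 1]   # skip the point itself
        Z = Y[nbrs] - Y[i]                            # centred neighbours
        G = Z @ Z.T
        G += reg * np.trace(G) * np.eye(len(nbrs))    # regularise the Gram matrix
        w = np.linalg.solve(G, np.ones(len(nbrs)))
        V[i, nbrs] = w / w.sum()                      # enforce sum-to-one
    return V

Y = np.random.default_rng(1).standard_normal((30, 5))
V = lle_weights(Y)
M = (np.eye(30) - V) @ (np.eye(30) - V).T   # the LLE matrix of the text
# M is symmetric positive semi-definite, so Tr(A M A^T) penalises codes
# that break the local linear structure of the samples
```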
3) Obtaining the sparse description coefficients

Each α_i in formula (19) can be described by formula (20). If there exist stable vectors {a_j}_{j≠i}, the solving process can be described by formula (21), in which the j-th component of α_i appears explicitly; formula (21) is solved by the feature-sign search algorithm, while defining formula (22), so that g(α_i) = f(α_i) + λ_0‖α_i‖_1.
2.4.3, Stage-performance abnormal behavior identification

Based on the sparse description coefficient α_i obtained above, the class of the detection sample can be determined. Let r_i(y) be the i-th reconstruction error, which describes the difference between the detection sample and its reconstruction by the i-th class:

$$r_i(y)=\|y-D^i\alpha^i\|,\quad i=1,\ldots,M\qquad(23)$$

where the non-zero coefficients in α_i describe the atoms in the corresponding class; the type of y is obtained using the minimum residual in formula (24):

$$\mathrm{Class}(y)=\arg\min_{i=1,\ldots,M}r_i(y)\qquad(24)$$
Because of the strong interference from the high dimensionality of stage-performance behavioral features, labeling the behavior training samples during true detection is complex, and formula (23) is then an overdetermined equation. It should be converted into an underdetermined equation by lifting the dimension of D: incorporating a d × d identity matrix, formula (23) is transformed into:

$$\hat a=\arg\min_\alpha\|y-B\alpha\|_2+\lambda\|a\|_1\qquad(25)$$

where B = [D I] ∈ R^{d×(N+d)}, a = [α^T η^T]^T ∈ R^{N+d}, and η represents the error vector; formula (25) ensures that formula (23) is transformed into an underdetermined equation while allowing for the error that may be present in formula (23);

The objective function incorporating the error vector is then given by formula (26); applying the minimum-residual decision to it yields the classification of y:

$$\mathrm{Class}(y)=\arg\min_{i=1,\ldots,M}R_i(y)\qquad(27)$$
The sparse description classification algorithm analyzed above cannot accurately describe the local manifold structure of stage-performance behavior samples [15]. In order to strengthen the classification precision of the algorithm, this article uses the sparse description method based on locally linear embedding to enhance the accuracy of stage-performance abnormal behavior identification. The detailed process is:
Input: training sample matrix D = [D_1, D_2, ..., D_M] ∈ IR^{m×n}, in which M classes co-exist; test sample y ∈ [y_1, y_2, ..., y_n];
For 1 ≤ i ≤ M
Step 1: implement normalization on the training sample set and the test sample;
Step 2: solve the sparse description problem based on locally linear embedding, and obtain the sparse description coefficients a_i;
Step 3: compute the error R_i(y);
end for
Output: Class(y) = arg min_{i=1,...,M} R_i(y).
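The recognition loop above (Step 1 to Step 3 and the minimum-residual output of formulas (23)-(24)) can be sketched as follows; per-class least squares stands in for the locally-linear-embedding sparse solver, and the two synthetic subspace classes are hypothetical:

```python
import numpy as np

def classify_min_residual(D_classes, y):
    """Classification by minimum reconstruction error:
    r_i(y) = ||y - D_i a_i||, Class(y) = argmin_i r_i(y).
    Per-class least squares is a stand-in for the LLE-sparse solver."""
    y = y / np.linalg.norm(y)                            # Step 1: normalisation
    residuals = []
    for D_i in D_classes:
        D_i = D_i / np.linalg.norm(D_i, axis=0)
        a_i, *_ = np.linalg.lstsq(D_i, y, rcond=None)    # Step 2 (stand-in)
        residuals.append(np.linalg.norm(y - D_i @ a_i))  # Step 3: r_i(y)
    return int(np.argmin(residuals)), residuals

rng = np.random.default_rng(2)
# two classes whose samples lie near different 2-D subspaces of R^10
B0, B1 = rng.standard_normal((10, 2)), rng.standard_normal((10, 2))
D0 = B0 @ rng.standard_normal((2, 8))
D1 = B1 @ rng.standard_normal((2, 8))
y = B1 @ np.array([0.7, -0.4])                # a sample drawn from class 1
label, res = classify_min_residual([D0, D1], y)
# label == 1: the class-1 dictionary reconstructs y with near-zero residual
```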
3, Experimental analysis

Experiment 1: the experimental data are stage-performance videos of the national song-and-dance center of a certain city, shot with a Sony HVR-V1C video camera; each frame image in the video has size 360*240. A substantial amount of stage-performance abnormal behavior is present in the videos, mainly falling, pausing, trampling, moving in a wrong direction, insufficient jump height, and so on. Fig. 6 shows the stage-personnel feature extraction results.
In order to detect the validity of the method of this invention, video sequences in which dancers arbitrarily fall, pause, trample, move in a wrong direction, and jump with insufficient height under the same scene are recorded, and the performance of the present method is tested. The 6 kinds of performance behavior samples in Table 1 are obtained from the 10 shot videos; 65% of the sample feature set is treated as the training set and 35% as the test set, and the training dictionary is obtained by sparse-model training. After the sparse description dictionary is collected and the stage-performance behavioral features of the video sequences are detected, behavior-feature classification is completed by the sparse classification method based on locally linear embedding; 300-frame video sequences are tested, with 150-frame video sequences of each of the 6 kinds of behavior used as training samples.

The experiment contrasts the method of this invention with the traditional stage-performance abnormal behavior monitoring method based on the sparse description algorithm; the accurate recognition rates on the 6 detection videos are described by Table 2. It can be derived that the correct recognition rate of the method of this invention for stage-performance abnormal behavior is higher than that of the conventional method.
Table 1 Training and detection samples in the video sequences
Table 2 Recognition rates of the conventional method and the present method
The experimental data in Table 1 include single-person performance behaviors and multi-person interaction behaviors: the single behaviors of 25 people were shot, and every two people performed each interaction behavior once or twice. Fig. 7 is the error training curve of the conventional method on the dancers' behavior: a preferable error value is obtained when the number of training iterations reaches 140. Fig. 8 shows that the method of this invention obtains a preferable recognition effect when training approaches 60 iterations, illustrating that the method of this invention has higher recognition efficiency.
Experiment 2: the experiment treats D-Harris interest points as dance-behavior feature points, and applies the present method on a stage-performance-behavior YouTube database to carry out stage-performance abnormal behavior detection experiments. The recognition accuracy of the present method on different stage-performance features is described by Table 3, where the HNF stage-performance abnormal-behavior feature is the fusion of the HOG and HOF features. Analyzing Table 3, the recognition accuracy of the HNF feature is the highest, at 83.36%; the recognition accuracy of the HOG feature falls to 77.91%, and the recognition rates of the MBH, HOF, and IMHCD stage-performance abnormal-behavior features are all below 75%.

Table 3 Recognition accuracy (%) of the present method on different abnormal features
In order to detect the interference of sample size with the abnormal-behavior recognition rate, the experiment analyzes the recognition rate under different sample data. The recognition accuracy of the HNF stage-performance abnormal-behavior feature under the present method for different sample numbers is described by Table 4.

Table 4 Recognition accuracy (%) of the HNF abnormal feature under different sample numbers

Analyzing Table 4, sample size interferes with the accuracy of the recognition rate: as the sample size is lifted, the recognition rate is also gradually lifted; when the sample number is 150 the average recognition rate is 96.3%, and when the sample number is 100 the average recognition rate is 92.36%, illustrating that the present method has higher stage-performance abnormal behavior recognition efficiency and robustness.
4, Conclusion

The present invention proposes a stage-performance abnormal behavior monitoring method based on a visual sensing network, analyzes the general structure of the stage-performance abnormal behavior monitoring system based on the visual sensing network, designs the hardware configuration and software composition of the system, and adopts the abnormal behavior detection method based on locally linear embedding and sparse description, comprehensively analyzing the local manifold structure of the samples and lifting detection efficiency and precision.
Bibliography:
[1] Wang Tiantao, Zhao Yongguo, Chang Faliang. Obstacle detection based on vision sensors [J]. Computer Engineering and Applications, 2015, 51(4): 180-183.
[2] Kuang Zhenchun, Wu Liuping. Research and simulation of a garden vision monitoring method based on the Internet of Things [J]. Computer Simulation, 2016, 33(4): 447-450.
[3] Li Wei. Design and analysis of a wireless sensing network energy consumption model based on semi-Markov chains [J]. Electronic Design Engineering, 2016, 24(11): 95-98.
[4] Tong Y, Li H, Chen J, et al. Dual-band stereo vision based on heterogeneous sensor networks [J]. Signal Processing, 2016, 126(C): 87-95.
[5] Si Yachao, et al. Research and design of a power quality monitoring information system based on a wireless sensing network [J]. Chinese Journal of Power Sources, 2014, 38(2): 373-374.
[6] Frontoni E, Mancini A, Zingaretti P. Embedded Vision Sensor Network for Planogram Maintenance in Retail Environments [J]. Sensors, 2015, 15(9): 21114-21133.
[7] Lin Chuan, Lu Haiyan. Sensorless non-contact collision estimation for manipulators based on the disturbance idea [J]. Bulletin of Science and Technology, 2016, 32(6): 104-108.
[8] Wang Qiang. Design of an object tracking system based on a visual sensing network [J]. Modern Electronics Technique, 2016, 39(8): 88-91.
[9] Chen Dan, Yang Fei, Ye Xiaojun. Research on multi-angle database activity monitoring technology [J]. Journal of University of Electronic Science and Technology of China, 2015, 44(2): 266-271.
[10] Feng M Q, Fukuda Y, et al. Nontarget Vision Sensor for Remote Measurement of Bridge Dynamic Response [J]. Journal of Bridge Engineering, 2015, 20(12).
[11] Dai Peng, Wang Xue, Tan Yuqi, et al. Adaptive calibration of heterogeneous visual sensing networks for pedestrian detection [J]. Chinese Journal of Scientific Instrument, 2016, 37(3): 683-689.
[12] Wang Q L, Li J Y, Shen H K, et al. Research of Multi-Sensor Data Fusion Based on Binocular Vision Sensor and Laser Range Sensor [J]. Key Engineering Materials, 2016, 693: 1397-1404.
[13] Wang Tian, Li Qingwu, Liu Yan, et al. Human abnormal behavior recognition using pose estimation [J]. Chinese Journal of Scientific Instrument, 2016, 37(10): 2366-2372.
[14] Xu Weiwei, Zhang Qun, et al. Network topology structure design method in wireless sensing network coverage [J]. Science Technology and Engineering, 2016, 16(25): 126-130.
[15] Luo Jian, Tang Jin, Zhao Peng, et al. Abnormal behavior detection method for the elderly based on 3D structured light sensors [J]. Optical Technique, 2016, 42(2): 146-151.
The above are preferred embodiments of the present invention; all changes made according to the technical solution of the present invention, where the resulting functional effect does not go beyond the scope of the technical solution of the present invention, belong to the protection scope of the present invention.

Claims (4)

  1. A stage performance abnormal behavior monitoring method based on a visual sensing network, characterized by comprising the following steps:
    S1, realizing track shooting of the stage-performance moving target through a CCD camera, and obtaining the shot stage-performance image sequence;
    S2, carrying out stage-performance abnormal behavior recognition using the stage-performance abnormal behavior recognition method based on locally linear embedding and sparse description.
  2. The stage performance abnormal behavior monitoring method based on a visual sensing network according to claim 1, characterized in that the specific implementation process of step S1 is as follows:
    S11, realizing the renewal of the background image based on the weighted-threshold background subtraction method:
    Set the pixels of the stage performance image as M*N, the background image sequence as f′(x, y) = {f′_i(x, y), i = 1, 2, ..., n}, and the stage performance image sequence as f(x, y) = {f_i(x, y), i = 1, 2, ..., n}; the computation of the threshold method is then realized by the following process:
    1) the average value \(\overline{f'(x,y)}\) of the background images in the stage performance image sequence is obtained by frame averaging; the average value of the background images of the primary stage video is represented by formula (1):
    $$\overline{f'(x,y)}=\frac{1}{n}\sum_{i=1}^{n}f'_i(x,y)\qquad(1)$$
    2) the background image of the frame following the current stage performance image is f′_{i+1}(x, y), and the background renewal quantity is denoted D(x, y); on the basis of the average value \(\overline{f'(x,y)}\) of the primary stage video background images, a threshold σ is set, and D(x, y) is represented by formula (2):
    $$D(x,y)=\left|f'_{i+1}(x,y)-\overline{f'(x,y)}\right|\qquad(2)$$
    If D(x, y) > σ, the background image of the stage performance image is changing with high frequency, and the background image needs to be upgraded in time; if 0 ≤ D(x, y) ≤ σ, noise may impact the definition of the background image, which can be solved by combining a weight coefficient in the way below:
    Let the weight coefficient be λ, and let g(x, y) be the pixel sum of the renewed background image of frame i+1; the calculation formula of the threshold σ can then be expressed as:
    $$\sigma=\lambda\cdot\frac{g(x,y)}{\overline{f'(x,y)}},\quad 0<\lambda<1\qquad(3)$$
    By contrasting the background pixels of the frame image with the original background image at once, and acquiring the image adjustment threshold through the weighted blend, the pixel-value fluctuation caused by noise fluctuation is solved, and the definition of the sample background image can be retained to the utmost extent;
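A minimal sketch of the weighted-threshold background update of step S11, formulas (1)-(3); the pixel-wise reading of g(x, y) and the toy frames are assumptions of this sketch:

```python
import numpy as np

def update_background(bg_frames, next_bg, lam=0.5):
    """Weighted-threshold background update of step S11: mean_bg is the
    frame average of formula (1), D the absolute difference of formula (2),
    sigma the weighted threshold of formula (3).  g(x, y) is read here
    pixel-wise from the renewed frame (an assumption)."""
    mean_bg = np.mean(bg_frames, axis=0)                 # formula (1)
    D = np.abs(next_bg - mean_bg)                        # formula (2)
    g = next_bg                                          # renewed-frame pixel values
    sigma = lam * g / np.maximum(mean_bg, 1e-6)          # formula (3), 0 < lam < 1
    # where the change exceeds the threshold, refresh the background;
    # where it does not, keep the noise-suppressed frame average
    return np.where(D > sigma, next_bg, mean_bg)

frames = np.stack([np.full((4, 4), 100.0) + i for i in range(5)])  # slow drift
nxt = np.full((4, 4), 102.0)
nxt[0, 0] = 220.0                                        # a strongly changed pixel
bg = update_background(frames, nxt)
# the changed pixel is refreshed to 220; stable pixels keep the mean (102)
```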
    S12, realizing moving object detection using the particle swarm optimization algorithm based on chaos search:
    1) initialize all parameters: c_1 and c_2 describe the learning factors, T_max describes the maximum number of iterations, U describes the control parameter, and T describes the initial number of chaos-search iterations; initialize the group particles, including random positions and velocities;
    2) assess the fitness of all particles to the target, and lock the optimum position P_j searched up to now by the j-th particle, as well as the optimum position P_g searched up to now by the whole population;
    3) combining process 1) and process 2), implement one iteration operation on the particles in the population; if the current optimum individual is within the convergence condition range or the maximum number of iterations is reached, run 5);
    4) if the history optimum particle position P_g of the whole population has not changed, or has changed only feebly, after undergoing T population iteration operations, set X′_j = P_g, obtain the optimal value X′_j by optimizing with the chaos search algorithm, set P_g = X′_j, and run 3) to implement the follow-up particle group operations; otherwise run 5);
    5) the evolution process terminates, and the globally optimal solution is returned. The fitness function of the uniformity-measure optimization is introduced to obtain the optimal image segmentation threshold t; the stage performance image is divided into two parts by the threshold t, represented by R_I, I = 1, 2; the fitness function of the uniformity-measure optimization can then be represented by formula (4):
    $$UM(t)=1-\frac{\sigma_1^2+\sigma_2^2}{C_{01}}\qquad(4)$$
    where σ_I² is the gray-level variance of the segmented region R_I; f_i(x, y) is set as the gray value of image point (x, y), A_I describes the number of pixel values in region R_I, and C_{01} describes the constant through which UM(t) is normalized; the optimal parameter λ then appears at λ* = arg max UM(t*(λ));
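A simplified sketch of step S12: a basic particle swarm search for the segmentation threshold, with a logistic-map perturbation standing in for the chaos search of step 4); the fitness normalizer C_01 := 255² and all PSO constants are assumed values, not the document's:

```python
import numpy as np

def fitness_um(t, gray):
    """Uniformity-measure fitness in the spirit of formula (4): one minus
    the normalised sum of the two within-region gray variances."""
    t = int(t)
    r1, r2 = gray[gray <= t], gray[gray > t]
    if r1.size == 0 or r2.size == 0:
        return 0.0
    return 1.0 - (r1.var() + r2.var()) / (255.0 ** 2)   # C_01 := 255^2 (assumed)

def pso_threshold(gray, n_particles=10, iters=30, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(1, 254, n_particles)                # positions = thresholds
    v = np.zeros(n_particles)
    pbest = x.copy()
    pfit = np.array([fitness_um(t, gray) for t in x])
    g = pbest[np.argmax(pfit)]
    stall, z = 0, 0.37                                  # chaos state (logistic map)
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, 1, 254)
        fit = np.array([fitness_um(t, gray) for t in x])
        improved = fit > pfit
        pbest[improved], pfit[improved] = x[improved], fit[improved]
        g_new = pbest[np.argmax(pfit)]
        stall = stall + 1 if abs(g_new - g) < 1e-6 else 0
        g = g_new
        if stall >= 5:                                  # chaos search around the best
            z = 4.0 * z * (1.0 - z)
            cand = np.clip(g + (z - 0.5) * 50, 1, 254)
            if fitness_um(cand, gray) > fitness_um(g, gray):
                g = cand
            stall = 0
    return int(g)

gray = np.concatenate([np.random.default_rng(3).normal(60, 5, 500),
                       np.random.default_rng(4).normal(180, 5, 500)]).clip(0, 255)
t = pso_threshold(gray)
# t lands between the two gray-level modes of the bimodal histogram
```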
    S13, stage-performance moving target tracking based on the Mean Shift algorithm:
    Let m sample points x_J, J = 1, 2, ..., m, exist in the space R^d; the Mean Shift vector at point x is then:
    $$M_h(x)=\frac{1}{k}\sum_{x_J\in S_h}(x_J-x)\qquad(5)$$
    where (x_J − x) describes the offset vector of sample point x_J relative to point x, S_h describes a high-dimensional spherical region of radius h, and the vector M_h(x) is the average of the offset vectors, relative to point x, of the k sample points falling into region S_h; if y is a point falling into region S_h, the set of such points y is obtained by the formula below:

    $$S_h=\{y:(y-x)^T(y-x)\le h^2\}\qquad(6)$$
    k describes how many of the m sample points x_J enter S_h. Suppose the sample points x_J are obtained from a probability density function f(x); since the non-zero probability density gradient is consistent with the direction of maximum probability density growth, averaging implies that most of the sample points in S_h are distributed along the probability density gradient. Mathematical analysis therefore shows that the mean offset vector M_h(x) is consistent with the densest direction of the sample distribution and is also the gradient direction of the probability density function; in the schematic, the large circle represents the region S_h, the small circles represent the sample points x_J ∈ S_h entering the region, the Mean Shift reference point x is described by the central dot, and the offset vector of the samples relative to the reference point x is described by the arrow;
    The Mean Shift algorithm then first draws the color histograms of the target model and the candidate model, and implements the tracking of the moving target by contrasting the similarity of the two models.
  3. The stage performance abnormal behavior monitoring method based on a visual sensing network according to claim 2, characterized in that the specific implementation by which the Mean Shift algorithm first draws the color histograms of the target model and the candidate model, and then implements the tracking of the moving target by contrasting the similarity of the two models, is as follows:
    S131, establish the target model
    Stage-performance detection divides the moving target into several regions, and the target area is chosen according to the gray-level or color histogram of the moving target. Let the target center be at point x_o, let x_J describe each pixel, and let the feature values be u = 1, 2, ..., m'; the feature-value probability q_u of the corresponding target range is described by:
    $$q_u=C\sum_{J=1}^{m}k\left(\left\|\frac{x_J^s-x_o}{h}\right\|^2\right)\delta[b(x_J^s)-u]$$
    $$C=\frac{1}{\sum_{J=1}^{m}k\left(\left\|\frac{x_o-x_J}{h}\right\|^2\right)}$$
    $$p_u(y)=C_h\sum_{J=1}^{m_h}k\left(\left\|\frac{y-x_J}{h}\right\|^2\right)\delta[b(x_J)-u]\qquad(7)$$
    S132, establish the candidate model
    After moving object detection is completed, the moving-target range covered in each follow-up frame is treated as the candidate target; the center coordinate of the kernel function over the target range is y, and the number of pixels in the target range is m_h; the probability prediction density function p_u(y) of the u-th feature value of the candidate model is then as given in formula (7);
    where C_h is the normalization constant; target tracking is then regarded as the process of retrieving the optimal y, while ensuring that p_u(y) has maximum similarity with q_u;
    S133, likelihood function
    Given the target model q_u and the candidate model p_u(y), the Bhattacharyya coefficient is used as the likelihood function to analyze the distribution of the target model:

    $$\rho[p(y),q]=\sum_{u=1}^{m'}\sqrt{p_u(y)q_u}\qquad(9)$$

    The value lies in the span [0, 1]; the higher the value, the higher the similarity of the models p and q. A Taylor series expansion of formula (9) at the point p_u(y_0) gives:
    $$\rho[p(y),q]\approx\frac{1}{2}\sum_{u=1}^{m'}\sqrt{p_u(y_0)q_u}+\frac{1}{2}\sum_{u=1}^{m'}p_u(y)\sqrt{\frac{q_u}{p_u(y_0)}}\qquad(10)$$
    Substituting the candidate model of formula (7) into formula (10) gives:
    $$\rho[p(y),q]\approx\frac{1}{2}\sum_{u=1}^{m'}\sqrt{p_u(y_0)q_u}+\frac{C_h}{2}\sum_{J=1}^{m}w_Jk\left(\left\|\frac{y-x_J}{h}\right\|^2\right)\qquad(11)$$
    $$w_J=\sum_{u=1}^{m'}\delta[b(x_J)-u]\sqrt{\frac{q_u}{p_u(y_0)}}\qquad(12)$$
    S134, compute the offset
    The optimal Mean Shift vector value, which is the vector by which the stage-performance moving target shifts from its original position, is obtained from the computation of the Bhattacharyya coefficient; through loop iteration operations, the optimal matching position y_1 of the target in the subsequent frame is obtained:
    $$y_1=\frac{\sum_{J=1}^{m}x_Jw_Jg\left(\left\|\frac{y-x_J}{h}\right\|^2\right)}{\sum_{J=1}^{m}w_Jg\left(\left\|\frac{y-x_J}{h}\right\|^2\right)}\qquad(13)$$
  4. The stage performance abnormal behavior monitoring method based on a visual sensing network according to claim 1, characterized in that the specific implementation process of step S2 is as follows:
    S21, sparse description of stage performance behavior features:
    1) Sparse description algorithm based on the l1 norm
    Suppose there are M labeled classes, and the i-th class contains Ni training samples, which form the sub-dictionary Di; features of the same class lie approximately in a low-dimensional subspace. The test sample y is m-dimensional; if y belongs to the i-th class, y can be described by a linear combination of the i-th-class training samples Di:
    <mrow> <mi>y</mi> <mo>=</mo> <msubsup> <mi>d</mi> <mi>a</mi> <mi>i</mi> </msubsup> <msubsup> <mi>a</mi> <mn>1</mn> <mi>i</mi> </msubsup> <mo>+</mo> <mo>...</mo> <mo>+</mo> <msubsup> <mi>d</mi> <msub> <mi>N</mi> <mi>i</mi> </msub> <mi>i</mi> </msubsup> <msubsup> <mi>&amp;alpha;</mi> <msub> <mi>N</mi> <mi>i</mi> </msub> <mi>i</mi> </msubsup> <mo>=</mo> <msup> <mi>D</mi> <mi>i</mi> </msup> <msup> <mi>&amp;alpha;</mi> <mi>i</mi> </msup> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>14</mn> <mo>)</mo> </mrow> </mrow>
    In the formula, αi is the sparse description of y over Di; y can be sparsely described over the dictionary D=[D1,...,DM]=[d1,...,dN]∈R^(d×N) built from all M classes. The sparse description of y is obtained by formula (15):
    a = argmin_a ||y - Da||^2 + λ'||a||_1    (15)
    In the formula, a is the sparse description coefficient of y over the dictionary D, and λ' is the sparsity-constraint parameter;
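    Formula (15) is an l1-regularized least-squares problem. The patent does not name a solver; as one hedged sketch (NumPy, illustrative function names), it can be solved by iterative soft-thresholding (ISTA):

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding: the proximal map of t*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_code_ista(y, D, lam, n_iter=500):
    """Approximately solve min_a ||y - Da||^2 + lam*||a||_1 (formula 15)."""
    L = 2.0 * np.linalg.norm(D, 2) ** 2   # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * D.T @ (D @ a - y)    # gradient of the quadratic term
        a = soft_threshold(a - grad / L, lam / L)
    return a
```

    Each iteration takes a gradient step on ||y - Da||^2 and then applies the soft-threshold operator of the l1 term.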
    2) Sparse description based on locally linear embedding (LLE)
    LLE is a manifold learning algorithm. It assumes that the sample set is drawn from a smooth manifold, so a sample in a low-dimensional region can be linearly described by its neighboring points, while the local linearity of the high-dimensional samples is preserved and the intrinsic structure of the data set stays stable. Let the stage performance behavior data set be Y=[y1,y2,...,yn]; under the LLE assumption, yi can be linearly described by its nearest neighbors in the test sample set on the same manifold, and the sparse description coefficient of yi is described by the corresponding neighbors with the same weights. The quadratic constraint of LLE is then given by formula (16):
    <mrow> <munderover> <mo>&amp;Sigma;</mo> <mrow> <mi>i</mi> <mo>=</mo> <mn>1</mn> </mrow> <mi>n</mi> </munderover> <mo>|</mo> <mo>|</mo> <msub> <mi>a</mi> <mi>i</mi> </msub> <mo>-</mo> <munder> <mo>&amp;Sigma;</mo> <mrow> <mi>j</mi> <mo>&amp;Element;</mo> <mi>N</mi> <mrow> <mo>(</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>)</mo> </mrow> </mrow> </munder> <msub> <mi>v</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> <msub> <mi>&amp;alpha;</mi> <mi>j</mi> </msub> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>16</mn> <mo>)</mo> </mrow> </mrow>
    In the formula, vij is the reconstruction weight of αj and N(yi) is the set of nearest neighbors of yi; vij is obtained from formula (17):
    <mrow> <mtable> <mtr> <mtd> <mrow> <msub> <mi>v</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> <mo>=</mo> <mi>arg</mi> <mi> </mi> <msub> <mi>min</mi> <msub> <mi>v</mi> <mi>j</mi> </msub> </msub> <mo>|</mo> <mo>|</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>-</mo> <munder> <mi>&amp;Sigma;</mi> <mrow> <mi>j</mi> <mo>&amp;Element;</mo> <mi>N</mi> <mrow> <mo>(</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>)</mo> </mrow> </mrow> </munder> <msub> <mi>v</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> <msub> <mi>y</mi> <mi>j</mi> </msub> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> </mrow> </mtd> <mtd> <mrow> <mi>s</mi> <mo>.</mo> <mi>t</mi> <mo>.</mo> </mrow> </mtd> <mtd> <mrow> <munder> <mi>&amp;Sigma;</mi> <mrow> <mi>j</mi> <mo>&amp;Element;</mo> <mi>N</mi> <mrow> <mo>(</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>)</mo> </mrow> </mrow> </munder> <msub> <mi>v</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> <mo>=</mo> <mn>1</mn> </mrow> </mtd> </mtr> </mtable> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>17</mn> <mo>)</mo> </mrow> </mrow>
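    The constrained problem of formula (17) has a closed-form solution through the local Gram matrix of the neighbors; a minimal NumPy sketch (the function name and the small regularization of the Gram matrix are illustrative assumptions, not from the patent):

```python
import numpy as np

def lle_weights(yi, neighbors, reg=1e-9):
    """Reconstruction weights of formula (17): minimize
    ||y_i - sum_j v_ij y_j||^2 subject to sum_j v_ij = 1."""
    Z = neighbors - yi                      # shift neighbors to the query point
    G = Z @ Z.T                             # local Gram matrix
    G = G + reg * (np.trace(G) + 1.0) * np.eye(len(Z))  # keep G invertible
    w = np.linalg.solve(G, np.ones(len(Z)))
    return w / w.sum()                      # enforce the sum-to-one constraint
```

    Solving G w = 1 and normalizing enforces the sum-to-one constraint; the regularization keeps G invertible when the neighbors are nearly collinear.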
    Formula (16) can then be transformed into:
    <mrow> <mtable> <mtr> <mtd> <mrow> <munderover> <mi>&amp;Sigma;</mi> <mrow> <mi>i</mi> <mo>=</mo> <mn>1</mn> </mrow> <mi>n</mi> </munderover> <mo>|</mo> <mo>|</mo> <msub> <mi>a</mi> <mi>i</mi> </msub> <mo>-</mo> <munder> <mi>&amp;Sigma;</mi> <mrow> <mi>j</mi> <mo>&amp;Element;</mo> <mi>N</mi> <mrow> <mo>(</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>)</mo> </mrow> </mrow> </munder> <msub> <mi>v</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> <msub> <mi>&amp;alpha;</mi> <mi>j</mi> </msub> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> <mo>=</mo> <mo>|</mo> <mo>|</mo> <mi>A</mi> <mo>-</mo> <mi>A</mi> <mi>V</mi> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> <mo>=</mo> <mo>|</mo> <mo>|</mo> <msup> <mrow> <mo>(</mo> <mrow> <mi>A</mi> <mo>-</mo> <mi>A</mi> <mi>V</mi> </mrow> <mo>)</mo> </mrow> <mi>T</mi> </msup> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> </mrow> </mtd> </mtr> <mtr> <mtd> <mrow> <mo>=</mo> <mi>T</mi> <mi>r</mi> <mrow> <mo>(</mo> <mrow> <mi>A</mi> <mrow> <mo>(</mo> <mrow> <mi>I</mi> <mo>-</mo> <mi>V</mi> </mrow> <mo>)</mo> </mrow> <msup> <mrow> <mo>(</mo> <mrow> <mi>I</mi> <mo>-</mo> <mi>V</mi> </mrow> <mo>)</mo> </mrow> <mi>T</mi> </msup> <msup> <mi>A</mi> <mi>T</mi> </msup> </mrow> <mo>)</mo> </mrow> <mo>=</mo> <mi>T</mi> <mi>r</mi> <mrow> <mo>(</mo> <mrow> <msup> <mi>AMA</mi> <mi>T</mi> </msup> </mrow> <mo>)</mo> </mrow> </mrow> </mtd> </mtr> </mtable> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>18</mn> <mo>)</mo> </mrow> </mrow>
    In the formula, I is the identity matrix and M=(I-V)(I-V)^T is the matrix built from the LLE weights, V being the matrix of reconstruction weights vij. Incorporating formula (18) into the sparse description formula, the sparse description based on locally linear embedding is represented by formula (19):
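    The identity of formula (18) can be verified numerically on random data (the data and sizes below are illustrative, not from the method); the coefficients ai form the columns of A:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3                       # n samples, k-dimensional sparse codes
A = rng.standard_normal((k, n))   # columns are the coefficients a_i
V = rng.standard_normal((n, n))   # reconstruction-weight matrix

# Left-hand side: squared Frobenius norm of the reconstruction error
lhs = np.linalg.norm(A - A @ V, 'fro') ** 2
# Right-hand side: trace form with M = (I - V)(I - V)^T, as in formula (18)
M = (np.eye(n) - V) @ (np.eye(n) - V).T
rhs = np.trace(A @ M @ A.T)
```

    The two quantities agree for any A and V, since ||A(I-V)||^2 = Tr(A(I-V)(I-V)^T A^T).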
    min_A ||Y - DA||^2 + λ0||A||_1 + λ1Tr(AMA^T)    (19)
    In formula (19), λ0 and λ1 are regularization parameters;
    3) Obtaining the sparse description coefficients
    Each αi in formula (19) can be described as:
    <mrow> <msub> <mi>min</mi> <mrow> <msub> <mi>a</mi> <mn>1</mn> </msub> <mo>,</mo> <msub> <mi>a</mi> <mn>2</mn> </msub> <mo>,</mo> <mo>...</mo> <mo>,</mo> <msub> <mi>a</mi> <mi>n</mi> </msub> </mrow> </msub> <munderover> <mo>&amp;Sigma;</mo> <mrow> <mi>i</mi> <mo>=</mo> <mn>1</mn> </mrow> <mi>n</mi> </munderover> <mo>|</mo> <mo>|</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>-</mo> <msub> <mi>Da</mi> <mi>i</mi> </msub> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> <mo>+</mo> <msub> <mi>&amp;lambda;</mi> <mn>1</mn> </msub> <munderover> <mo>&amp;Sigma;</mo> <mrow> <mi>i</mi> <mo>,</mo> <mi>j</mi> <mo>=</mo> <mn>1</mn> </mrow> <mi>n</mi> </munderover> <msub> <mi>M</mi> <mrow> <mi>i</mi> <mo>,</mo> <mi>j</mi> </mrow> </msub> <msubsup> <mi>a</mi> <mi>i</mi> <mi>T</mi> </msubsup> <msub> <mi>a</mi> <mi>j</mi> </msub> <mo>+</mo> <munderover> <mo>&amp;Sigma;</mo> <mrow> <mi>i</mi> <mo>=</mo> <mn>1</mn> </mrow> <mi>n</mi> </munderover> <msub> <mi>&amp;lambda;</mi> <mn>0</mn> </msub> <mo>|</mo> <mo>|</mo> <msub> <mi>a</mi> <mi>i</mi> </msub> <mo>|</mo> <msub> <mo>|</mo> <mn>1</mn> </msub> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>20</mn> <mo>)</mo> </mrow> </mrow>
    If the vectors {aj}j≠i are held fixed, the solution process can be described by the following formula:
    <mrow> <msub> <mi>min</mi> <msub> <mi>a</mi> <mi>i</mi> </msub> </msub> <mi>g</mi> <mrow> <mo>(</mo> <msub> <mi>&amp;alpha;</mi> <mi>i</mi> </msub> <mo>)</mo> </mrow> <mo>=</mo> <mo>|</mo> <mo>|</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>-</mo> <msub> <mi>Da</mi> <mi>i</mi> </msub> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> <mo>+</mo> <msub> <mi>&amp;lambda;</mi> <mn>1</mn> </msub> <msub> <mi>M</mi> <mrow> <mi>i</mi> <mi>i</mi> </mrow> </msub> <msubsup> <mi>&amp;alpha;</mi> <mi>i</mi> <mi>T</mi> </msubsup> <msub> <mi>&amp;alpha;</mi> <mi>j</mi> </msub> <mo>+</mo> <msubsup> <mi>&amp;alpha;</mi> <mi>i</mi> <mi>T</mi> </msubsup> <msub> <mi>h</mi> <mi>i</mi> </msub> <mo>+</mo> <msub> <mi>&amp;lambda;</mi> <mn>0</mn> </msub> <mo>|</mo> <mo>|</mo> <msub> <mi>a</mi> <mi>i</mi> </msub> <mo>|</mo> <msub> <mo>|</mo> <mn>1</mn> </msub> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>21</mn> <mo>)</mo> </mrow> </mrow>
    In the formula, hi collects the terms involving the fixed vectors {αj}j≠i, and αi(j) denotes the j-th component of αi; formula (21) is solved by the feature-sign search algorithm, together with the definition:
    <mrow> <mi>f</mi> <mrow> <mo>(</mo> <msub> <mi>&amp;alpha;</mi> <mi>i</mi> </msub> <mo>)</mo> </mrow> <mo>=</mo> <mo>|</mo> <mo>|</mo> <msub> <mi>y</mi> <mi>i</mi> </msub> <mo>-</mo> <msub> <mi>Da</mi> <mi>i</mi> </msub> <mo>|</mo> <msup> <mo>|</mo> <mn>2</mn> </msup> <mo>+</mo> <msub> <mi>&amp;lambda;</mi> <mn>1</mn> </msub> <msub> <mi>M</mi> <mrow> <mi>i</mi> <mi>i</mi> </mrow> </msub> <msubsup> <mi>&amp;alpha;</mi> <mi>i</mi> <mi>T</mi> </msubsup> <msub> <mi>&amp;alpha;</mi> <mi>i</mi> </msub> <mo>+</mo> <msubsup> <mi>&amp;alpha;</mi> <mi>i</mi> <mi>T</mi> </msubsup> <msub> <mi>h</mi> <mi>i</mi> </msub> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>22</mn> <mo>)</mo> </mrow> </mrow>
    Then g(αi) = f(αi) + λ0||αi||_1.
    S22, identification of stage performance abnormal behavior:
    From the sparse description coefficient αi obtained above, the class of the test sample can be determined. The i-th reconstruction error is defined to describe the difference between the test sample and its reconstruction from the i-th class:
    ri(y) = ||y - Diai||,  i = 1, ..., M    (23)
    In the formula, the non-zero coefficients of αi describe the atoms of the corresponding class; the class of y is obtained from the minimum residual in formula (24):
    Class(y) = argmin_{i=1,...,M} ri(y)    (24)
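    Formulas (23) and (24) together form a minimum-residual classifier; a minimal sketch with illustrative names, assuming the per-class sub-dictionaries Di and coefficients ai are already available:

```python
import numpy as np

def classify_by_residual(y, dictionaries, coeffs):
    """Formulas (23)-(24): assign y to the class whose sub-dictionary D_i,
    with coefficients a_i, reconstructs it with the smallest residual."""
    residuals = [np.linalg.norm(y - D @ a)
                 for D, a in zip(dictionaries, coeffs)]
    return int(np.argmin(residuals)), residuals
```

    The returned index is the class label of y; the residual list can be inspected to gauge how confident the assignment is.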
    Because stage performance behavior features are high-dimensional and subject to strong interference, labeling the performance-behavior training samples during actual detection is complex, and formula (23) is then an overdetermined equation. It should be converted into an underdetermined one by raising the dimension of D: a d×d identity matrix is appended, and formula (23) is transformed into:
    a = argmin_α ||y - Bα||^2 + λ||a||_1    (25)
    In the formula, B = [D I] ∈ R^(d×(N+d)), a = [α^T η^T]^T ∈ R^(N+d), and η ∈ R^d represents the error vector; formula (25) thus turns formula (23) into an underdetermined equation while accounting for the error that may be present in formula (23);
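    The construction B = [D I] can be sketched as follows (function name illustrative); appending the d×d identity guarantees that any y is exactly representable, because the tail η of a absorbs the residual y - Dα:

```python
import numpy as np

def extended_dictionary(D):
    """Build B = [D I] of formula (25): appending a d-by-d identity turns
    the overdetermined system (23) into an underdetermined one, with the
    tail of a = [alpha^T eta^T]^T acting as the error vector eta."""
    d = D.shape[0]
    return np.hstack([D, np.eye(d)])
```

    For any coefficient vector α, stacking it with η = y - Dα gives an exact solution B a = y, which is why the extended system always admits a (sparse-regularized) solution.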
    The objective function incorporating the error vector is then:
    Applying the minimum-residual operation of formula (27) yields the classification of y:
    Class(y) = argmin_{i=1,...,M} Ri(y)    (27).
CN201710644855.1A 2017-08-01 2017-08-01 The stage performance abnormal behavior monitoring method of view-based access control model sensing network Pending CN107463898A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710644855.1A CN107463898A (en) 2017-08-01 2017-08-01 The stage performance abnormal behavior monitoring method of view-based access control model sensing network

Publications (1)

Publication Number Publication Date
CN107463898A true CN107463898A (en) 2017-12-12

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103699874A (en) * 2013-10-28 2014-04-02 中国计量学院 Crowd abnormal behavior identification method based on SURF (Speed-Up Robust Feature) stream and LLE (Locally Linear Embedding) sparse representation

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
XU Jiao: "Research on High-Density Crowd Segmentation and Behavior Recognition Technology", China Master's Theses Full-text Database, Information Science and Technology *
WANG Dong et al.: "A Supervised Locally Linear Embedding Algorithm Incorporating Clustering", Semiconductor Optoelectronics *
LUO Fang: "Research on a Human Abnormal Behavior Recognition System Based on Video Surveillance", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108447076A (en) * 2018-03-16 2018-08-24 清华大学 Multi-object tracking method based on depth enhancing study
CN108447076B (en) * 2018-03-16 2021-04-06 清华大学 Multi-target tracking method based on deep reinforcement learning
CN108597578A (en) * 2018-04-27 2018-09-28 广东省智能制造研究所 A kind of human motion appraisal procedure based on two-dimensional framework sequence
CN108597578B (en) * 2018-04-27 2021-11-05 广东省智能制造研究所 Human motion assessment method based on two-dimensional skeleton sequence
CN109389031B (en) * 2018-08-27 2021-12-03 浙江大丰实业股份有限公司 Automatic positioning mechanism for performance personnel
CN109389031A (en) * 2018-08-27 2019-02-26 浙江大丰实业股份有限公司 Performance personnel's automatic positioning mechanism
CN109255796A (en) * 2018-09-07 2019-01-22 浙江大丰实业股份有限公司 Stage equipment security solution platform
CN109255796B (en) * 2018-09-07 2022-01-28 浙江大丰实业股份有限公司 Safety analysis platform for stage equipment
CN110753297A (en) * 2019-09-27 2020-02-04 广州励丰文化科技股份有限公司 Mixing processing method and processing device for audio signals
CN110711374A (en) * 2019-10-15 2020-01-21 石家庄铁道大学 Multi-modal dance action evaluation method
CN110711374B (en) * 2019-10-15 2021-05-04 石家庄铁道大学 Multi-modal dance action evaluation method
CN113804470A (en) * 2021-04-14 2021-12-17 山东省计算中心(国家超级计算济南中心) Fault detection feedback method for plug seedling assembly line
CN113804470B (en) * 2021-04-14 2023-12-01 山东省计算中心(国家超级计算济南中心) Fault detection feedback method for plug seedling production line
CN114240913A (en) * 2021-12-21 2022-03-25 歌尔股份有限公司 Semiconductor abnormality analysis method, semiconductor abnormality analysis device, terminal device, and storage medium
CN117854014A (en) * 2024-03-08 2024-04-09 国网福建省电力有限公司 Automatic capturing and analyzing method for comprehensive abnormal phenomenon
CN117854014B (en) * 2024-03-08 2024-05-31 国网福建省电力有限公司 Automatic capturing and analyzing method for comprehensive abnormal phenomenon
CN118092526A (en) * 2024-04-18 2024-05-28 黑河学院 Safety regulation control system for music performance stage equipment
CN118092526B (en) * 2024-04-18 2024-06-28 黑河学院 Safety regulation control system for music performance stage equipment

Similar Documents

Publication Publication Date Title
CN107463898A (en) The stage performance abnormal behavior monitoring method of view-based access control model sensing network
US10451712B1 (en) Radar data collection and labeling for machine learning
Tsintotas et al. Assigning visual words to places for loop closure detection
CN112836640B (en) Single-camera multi-target pedestrian tracking method
CN112418117A (en) Small target detection method based on unmanned aerial vehicle image
CN109818798A (en) A kind of wireless sensor network intruding detection system and method merging KPCA and ELM
CN110084165A (en) The intelligent recognition and method for early warning of anomalous event under the open scene of power domain based on edge calculations
WO2021043126A1 (en) System and method for event recognition
EP3938806A1 (en) Radar data collection and labeling for machine-learning
US20220130109A1 (en) Centralized tracking system with distributed fixed sensors
CN112541424A (en) Real-time detection method for pedestrian falling under complex environment
Cao et al. Correlation-based tracking of multiple targets with hierarchical layered structure
CN112616023A (en) Multi-camera video target tracking method in complex environment
CN115205891A (en) Personnel behavior recognition model training method, behavior recognition method and device
CN117114913A (en) Intelligent agricultural data acquisition system based on big data
Salimpour et al. Self-calibrating anomaly and change detection for autonomous inspection robots
Lu et al. An efficient network for multi-scale and overlapped wildlife detection
Sudha et al. Real time riped fruit detection using faster R-CNN deep neural network models
Xu et al. Detection method of wheat rust based on transfer learning and sharpness‐aware minimization
CN114048546A (en) Graph convolution network and unsupervised domain self-adaptive prediction method for residual service life of aircraft engine
Nikpour et al. Deep reinforcement learning in human activity recognition: A survey
Bhushan et al. Incremental principal component analysis based outlier detection methods for spatiotemporal data streams
Celik et al. Change detection without difference image computation based on multiobjective cost function optimization
Demir et al. Drone-assisted automated plant diseases identification using spiking deep conventional neural learning
Singh et al. Chaotic whale-atom search optimization-based deep stacked auto encoder for crowd behaviour recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171212