CN104700405B - A kind of foreground detection method and system - Google Patents
- Publication number: CN104700405B (application CN201510098306.XA)
- Authority: CN (China)
- Prior art keywords: current frame, frame image, image
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Image Analysis (AREA)
Abstract
A foreground detection method and system. The method includes: obtaining the current frame image; calculating the local contrast of the current frame image; building a multi-Gaussian background model based on the local contrast; detecting part of the foreground in the current frame image as foreground image samples according to the multi-Gaussian background model; learning a background image from the foreground image samples and the current frame image; and performing foreground target detection on the current frame image according to the background image. This solves the technical problem that the limited computing power of camera processors makes most foreground detection methods impractical. With limited computing power, the foreground detection method and system provided by the invention can completely detect, in real time, foreground targets that are small in size and weak in contrast.
Description
Technical field
The present invention relates to the fields of video surveillance and image/video processing, and in particular to a real-time foreground detection method and system adapted to small, weak targets.
Background art
In visual surveillance systems, moving targets generally need to be detected, tracked, classified and analyzed, and the accuracy of moving-target detection directly affects all subsequent processing. To adapt to complex and changeable scenes, the most common approach is to model the background and then detect foreground targets with the background model. Existing background modeling methods mainly include the median method, the averaging method, kernel density estimation, the codebook model, and the mixture-of-Gaussians model.
A Gaussian model quantifies an object precisely with a Gaussian probability density function (the normal distribution curve), decomposing it into several components each based on such a function. The multi-Gaussian model (i.e. the mixture-of-Gaussians model) characterizes each pixel of an image with K Gaussian components (typically 3 to 5). After a new frame is acquired, the mixture model is updated and each pixel of the current image is matched against it: if the match succeeds, the pixel is judged to be a background point; otherwise it is a foreground point.
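The matching step just described can be illustrated with a single-pixel sketch (the 2.5σ match rule, the component count, and all parameter values are illustrative, not taken from the invention):

```python
# Minimal mixture-of-Gaussians match for one pixel (illustrative values).
# A pixel matches a component if it lies within k standard deviations of
# that component's mean; matched -> background point, unmatched -> foreground.

def classify_pixel(value, components, k=2.5):
    """components: list of (weight, mean, stddev) tuples for one pixel."""
    for weight, mean, std in components:
        if abs(value - mean) <= k * std:
            return "background"
    return "foreground"

# Three hypothetical background components, e.g. for a flickering outdoor pixel.
comps = [(0.6, 120.0, 5.0), (0.3, 90.0, 8.0), (0.1, 200.0, 10.0)]
print(classify_pixel(118, comps))  # close to the first mean -> background
print(classify_pixel(160, comps))  # far from every mean -> foreground
```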
However, the mixture-of-Gaussians model updates the background model recursively from the pixels of the current frame, so a modeling mistake in an earlier frame affects the background image for a long time. Moreover, traditional mixture-of-Gaussians background modeling cannot eliminate the false alarms caused by rapid illumination changes, cannot resist the noise of low-light imaging, and cannot completely detect targets that are small in size and weak in contrast. In addition, intelligent front-end cameras need to detect moving targets around the clock, which places high demands on the algorithm, while the computing power of a front-end camera's processor is limited, so most conventional background modeling algorithms are difficult to run in real time.
Content of the invention
Therefore, the technical problem to be solved by the invention is that the limited computing power of camera processors makes most foreground target detection methods difficult to run in real time. The invention accordingly proposes a foreground detection method and system that can not only complete foreground target detection in real time under limited computing power, but can also completely detect targets that are small in size and weak in contrast.
To solve the above technical problems, the invention provides the following technical scheme:
A foreground detection method, comprising the following steps:
obtaining the current frame image;
calculating the local contrast of the current frame image;
building a multi-Gaussian background model based on the local contrast;
detecting part of the foreground in the current frame image as foreground image samples according to the multi-Gaussian background model;
learning a background image from the foreground image samples and the current frame image;
performing foreground target detection on the current frame image according to the background image.
Preferably, the step of calculating the local contrast of the current frame image includes:
dividing the current frame image into several m×n pixel blocks, where m and n are positive integers;
counting the gray-level mean and gray-level variance of each pixel block;
obtaining the local contrast of each pixel block, the local contrast being the quotient of the gray-level variance divided by the gray-level mean.
Preferably, the step of performing foreground target detection on the current frame image according to the background image includes:
obtaining the gradient vector of each pixel in the background image and in the current frame image;
determining, from those gradient vectors, whether the texture at each pixel of the current frame image is rich and whether the texture of the background image and the current frame image is consistent;
judging a pixel to be a foreground point when the texture of the current frame image at that pixel is rich and the textures of the current frame image and the background image at that pixel are inconsistent.
Preferably, whether the texture at each pixel of the current frame image is rich, and whether the texture of the background image and the current frame image is consistent, are calculated by the following formulas:

Flat(x, y) = 0 if sqrt(v0² + v1²) ≥ Tg, else 1
Diff(x, y) = 1 if (u0·v0 + u1·v1) / (sqrt(u0² + u1²) · sqrt(v0² + v1²)) < Ts, else 0

where Flat(x, y) indicates the texture richness of the current frame image at pixel (x, y), Diff(x, y) indicates the texture consistency of the background image and the current frame image at pixel (x, y), Tg and Ts are preset thresholds, and (u0, u1) and (v0, v1) are the gradient vectors of the background image and of the current frame image at pixel (x, y), respectively. Flat(x, y) = 0 means the texture of the current frame image at pixel (x, y) is rich, and Diff(x, y) = 1 means the textures of the current frame image and the background image at pixel (x, y) are inconsistent.
Preferably, before the step of calculating the local contrast of the current frame image, the method further includes performing adaptive noise processing on the current frame image, specifically:
obtaining the noise intensity of the current frame image;
when the noise intensity is greater than a preset threshold, performing noise reduction on the current frame image.
Preferably, the step of obtaining the noise intensity of the current frame image includes:
dividing the current frame image into several image blocks with the same number of pixels;
counting the number of noise points in each image block of the current frame image;
obtaining the initial noise intensity of the current frame image, the initial noise intensity being the median of the per-block noise-point counts divided by the number of pixels in an image block;
obtaining the noise intensity of the current frame image from its initial noise intensity.
Preferably, the step of counting the number of noise points in each image block of the current frame image includes:
calculating the absolute value of the gray-level difference between the current frame image and the previous frame image;
counting the noise points in each image block of the current frame image according to those absolute values, a pixel being judged a noise point when the absolute value of its gray-level difference lies within a preset threshold range.
Preferably, the initial noise intensity of the current frame image is smoothed by the following formula to obtain the noise intensity of the current frame image:

N_i = α · N_{i−1} + (1 − α) · N

where α is the smoothing coefficient, N is the initial noise intensity of the current frame image, N_i is the noise intensity of the current frame image, N_{i−1} is the noise intensity of the previous frame image, 0 < N < 1, 0 < N_i < 1, 0 < N_{i−1} < 1, and i = 0 indicates that the current frame image is the second frame of the video.
Preferably, the background image is learned by the following formula:

B(x, y) = (1 − β) · B(x, y) + β · I(x, y), if F1(x, y) = 0
B(x, y) = B(x, y), if F1(x, y) > 0

where B(x, y) is the background image, I(x, y) is the current frame image, F1(x, y) is the foreground image sample detected by the multi-Gaussian background model, F1(x, y) = 0 means pixel (x, y) is a background point, F1(x, y) > 0 means pixel (x, y) is a foreground point, and β is the background learning rate.
A foreground detection system, including:
an acquisition module, which obtains the current frame image;
a computing module, which calculates the local contrast of the current frame image;
a building module, which builds a multi-Gaussian background model based on the local contrast;
a sample detection module, which detects part of the foreground in the current frame image as foreground image samples according to the multi-Gaussian background model;
a background image learning module, which learns a background image from the foreground image samples and the current frame image;
a foreground target detection module, which performs foreground target detection on the current frame image according to the background image.
Preferably, the computing module includes:
a blocking submodule, which divides the current frame image into several m×n pixel blocks, where m and n are positive integers;
a statistics submodule, which counts the gray-level mean and gray-level variance of each pixel block;
a local contrast acquisition submodule, which obtains the local contrast of each pixel block as the quotient of the gray-level variance divided by the gray-level mean.
Preferably, the foreground target detection module includes:
a gradient vector acquisition submodule, which obtains the gradient vector of each pixel in the background image and in the current frame image;
a texture acquisition submodule, which determines from those gradient vectors whether the texture at each pixel of the current frame image is rich and whether the texture of the background image and the current frame image is consistent;
a foreground target judging submodule, which judges a pixel to be a foreground point when the texture of the current frame image at that pixel is rich and the textures of the current frame image and the background image at that pixel are inconsistent.
Preferably, the system further includes an adaptive noise processing module, including:
a noise intensity acquisition submodule, which obtains the noise intensity of the current frame image before its local contrast is calculated;
a noise reduction submodule, which performs noise reduction on the current frame image when the noise intensity is greater than a preset threshold.
Preferably, the noise intensity acquisition submodule includes:
an image block division unit, which divides the current frame image into several image blocks with the same number of pixels;
a noise point statistics unit, which counts the number of noise points in each image block of the current frame image;
an initial noise intensity acquisition unit, which obtains the initial noise intensity of the current frame image as the median of the per-block noise-point counts divided by the number of pixels in an image block;
a noise intensity acquisition unit, which obtains the noise intensity of the current frame image from its initial noise intensity.
Preferably, the noise point statistics unit includes:
a gray-level difference computation subunit, which calculates the absolute value of the gray-level difference between the current frame image and the previous frame image;
a noise point judging and statistics subunit, which counts the noise points in each image block of the current frame image according to those absolute values, a pixel being judged a noise point when the absolute value of its gray-level difference lies within a preset threshold range.
Compared with the prior art, the above technical scheme of the invention has the following advantages:
1. With the multi-Gaussian background modeling based on local contrast and the background image learning step, the foreground detection method and system provided by the invention can prevent background contamination caused by sudden illumination changes and improve the accuracy of foreground detection. Moreover, the Gaussian background modeling can be carried out on a reduced image, effectively improving the efficiency of the algorithm.
2. The foreground detection method and system characterize image texture with gradient vectors and perform foreground target detection with a texture comparison algorithm, so they can completely detect targets that are small in size and weak in contrast, while also eliminating false alarms caused by rapid lighting changes.
3. The foreground detection method and system can perform adaptive noise reduction according to the noise intensity of the current frame image. When the image noise is strong, noise reduction reduces its interference with foreground detection; when the noise is weak, no noise reduction is performed, which reduces the computation load of the camera processor.
4. As verified in actual deployments, the foreground detection method and system can run stably in real time over long periods at outdoor monitoring points, effectively resisting weather and illumination changes.
Brief description of the drawings
Fig. 1 is a flow chart of a foreground detection method according to Embodiment 1 of the present invention;
Fig. 2 is a flow chart of the noise processing performed on the current frame image in the foreground detection method according to Embodiment 2 of the present invention;
Fig. 3 is a flow chart of a foreground detection method according to Embodiment 3 of the present invention;
Fig. 4 is a block diagram of a foreground detection system according to the present invention.
Embodiments
To help those skilled in the art better understand the present disclosure, the technical schemes provided by the invention are described in further detail below with reference to the accompanying drawings and embodiments.
Embodiment 1
As shown in Fig. 1, this embodiment provides a foreground detection method, specifically including the following steps:
S11:Obtain current frame image.
S12: Calculate the local contrast of the current frame image. Specifically, the local contrast can be calculated as follows:
First, divide the current frame image into several m×n pixel blocks, where m and n are positive integers;
then, count the gray-level mean and gray-level variance of each pixel block;
finally, obtain the local contrast of each pixel block as the quotient of the gray-level variance divided by the gray-level mean.
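The three steps above can be sketched in NumPy as follows (a minimal sketch, not the patent's implementation: the 4×4 block size, the epsilon guard on the division, and cropping to a whole number of blocks are all assumptions):

```python
import numpy as np

def local_contrast(gray, bh=4, bw=4, eps=1e-6):
    """Per-block contrast = gray-level variance / gray-level mean over
    non-overlapping bh x bw blocks. Block size is illustrative; eps avoids
    division by zero for all-dark blocks (the patent does not specify this)."""
    h, w = gray.shape
    h, w = h - h % bh, w - w % bw          # crop to a whole number of blocks
    blocks = gray[:h, :w].reshape(h // bh, bh, w // bw, bw).astype(np.float64)
    mean = blocks.mean(axis=(1, 3))
    var = blocks.var(axis=(1, 3))
    return var / (mean + eps)

img = np.zeros((8, 8), dtype=np.uint8)
img[:4, :4] = 100                               # flat block -> contrast ~0
img[4:, 4:] = np.arange(16).reshape(4, 4) * 10  # textured block -> high contrast
c = local_contrast(img)
print(c.shape)  # (2, 2)
```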
Besides the above method, the local contrast can also be calculated by other methods known in the prior art.
S13: Build a multi-Gaussian background model based on the local contrast, with reference to the relevant information of the previous frame image. The local contrast calculated in step S12 is used as the block attribute for the multi-Gaussian background modeling.
S14: Detect part of the foreground in the current frame image as foreground image samples according to the multi-Gaussian background model. The partial foreground detected in this step mainly consists of targets that are large in size and significant in contrast.
S15: Learn a background image from the foreground image samples and the current frame image. Specifically, the background image can be learned by the following formula:

B(x, y) = (1 − β) · B(x, y) + β · I(x, y), if F1(x, y) = 0
B(x, y) = B(x, y), if F1(x, y) > 0

where B(x, y) is the background image, I(x, y) is the current frame image, F1(x, y) is the foreground image sample detected by the multi-Gaussian background model, F1(x, y) = 0 means pixel (x, y) is a background point, F1(x, y) > 0 means pixel (x, y) is a foreground point, and β is the background learning rate, which can be chosen reasonably according to actual needs.
S16: Perform foreground target detection on the current frame image according to the background image. Preferably, foreground target detection is carried out with a texture comparison algorithm, the detailed process comprising the following steps:
First, obtain the gradient vector of each pixel in the background image and in the current frame image; in this embodiment the Sobel gradient operator is used.
Then, determine from those gradient vectors whether the texture at each pixel of the current frame image is rich and whether the texture of the background image and the current frame image is consistent, calculated by the following formulas:

Flat(x, y) = 0 if sqrt(v0² + v1²) ≥ Tg, else 1
Diff(x, y) = 1 if (u0·v0 + u1·v1) / (sqrt(u0² + u1²) · sqrt(v0² + v1²)) < Ts, else 0

where Flat(x, y) indicates the texture richness of the current frame image at pixel (x, y), Diff(x, y) indicates the texture consistency of the background image and the current frame image at pixel (x, y), Tg and Ts are preset thresholds, and (u0, u1) and (v0, v1) are the gradient vectors of the background image and of the current frame image at pixel (x, y), respectively; Flat(x, y) = 0 means the texture of the current frame image at pixel (x, y) is rich, and Diff(x, y) = 1 means the textures of the current frame image and the background image at pixel (x, y) are inconsistent.
Finally, when the texture of the current frame image at a pixel is rich and the textures of the current frame image and the background image at that pixel are inconsistent, that pixel is judged to be a foreground point. The image formed by all foreground points is the foreground image.
The foreground detection method provided by this embodiment can prevent background contamination caused by sudden illumination changes, i.e. it prevents targets and other non-background elements from being updated into the background, improving the accuracy of foreground detection. Moreover, the Gaussian background modeling can be carried out on a reduced image, effectively improving the efficiency of the algorithm, so the method is suitable for front-end cameras with limited processor computing power and can complete the foreground detection of every frame in real time.
In addition, performing foreground target detection with the texture comparison algorithm accurately detects targets that are small in size and weak in contrast, and also removes false alarms caused by, for example, rapid lighting changes. According to statistics over a large number of actual videos, targets generally have rich texture of their own (otherwise the human eye could not identify them), while the texture of most of the ground is relatively flat, so texture richness is used to exclude low-texture ground. Texture consistency then effectively distinguishes targets from richly textured background; and since a sudden illumination change usually alters image brightness without altering image texture, foreground detection based on texture consistency is also highly resistant to sudden illumination changes.
Embodiment 2
As shown in Fig. 2, this embodiment provides another foreground detection method. Compared with Embodiment 1, after the step of obtaining the current frame image and before its local contrast is calculated, it further includes adaptive noise processing of the current frame image, to eliminate the influence of imaging noise on foreground detection. The steps are as follows:
S101: Obtain the noise intensity of the current frame image.
S102: When the noise intensity is greater than a preset threshold, perform noise reduction on the current frame image. Specifically, a low-pass filter can be used, and a mean filtering algorithm is further preferred.
In the foreground detection method of this embodiment, noise reduction is performed only when the noise of the current frame image is strong, i.e. when its noise intensity exceeds the preset threshold; this eliminates the interference of image noise with foreground detection and improves detection accuracy. When the noise intensity of the current image is small, no noise reduction is performed, which reduces the workload of the camera processor.
Specifically, the process of obtaining the noise intensity of the current image in step S101 is:
S1011: Divide the current frame image into several image blocks with the same number of pixels, for example 8×8 image blocks.
S1012: Count the number of noise points in each image block of the current frame image.
S1013: Obtain the initial noise intensity of the current frame image, which is the median of the per-block noise-point counts divided by the number of pixels in an image block; the median is taken to reduce the workload of the camera processor.
S1014: Obtain the noise intensity of the current frame image from its initial noise intensity.
Compared with other image noise intensity estimation methods in the prior art, the method of this embodiment prevents large fluctuations of the estimated noise intensity from repeatedly opening and closing the denoising module; the algorithm is simple, practical, and very efficient.
Specifically, the process of counting the noise points in each image block of the current frame image in step S1012 is:
First, calculate the absolute value of the gray-level difference between the current frame image and the previous frame image;
then, count the noise points in each image block according to those absolute values, a pixel being judged a noise point when the absolute value of its gray-level difference lies within a preset threshold range.
In this method, a pixel is judged a noise point only when the absolute value of its gray-level difference lies within the preset threshold range: when the absolute value is too small, the gray-level fluctuation is normal and does not harm foreground detection; when it is too large, the pixel is likely a pixel of the foreground image.
Because the change of image noise is a gradual process, in step S1014 the initial noise intensity of the current frame image can be smoothed by the following formula to obtain its noise intensity:

N_i = α · N_{i−1} + (1 − α) · N

where α is the smoothing coefficient, N is the initial noise intensity of the current frame image, N_i is the noise intensity of the current frame image, N_{i−1} is the noise intensity of the previous frame image, 0 < N < 1, 0 < N_i < 1, 0 < N_{i−1} < 1, and i = 0 indicates that the current frame image is the second frame of the video. Since each frame's gray-level difference is taken against the previous frame and there is no image before the first frame, the noise intensity can only be computed from the second frame onward; when i = 0, N_i = N, i.e. when the current frame is the second frame of the video sequence its initial noise intensity is taken as its noise intensity.
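Putting the block-wise counting, the median, and the smoothing together, a minimal sketch might look like this (the thresholds T0 = 8, T1 = 16 and α = 0.9 follow the values given in Embodiment 3; the smoothing formula is reconstructed from the variable definitions, and the `first` flag mirrors the i = 0 case):

```python
import numpy as np

def initial_noise(cur, prev, block=8, t0=8, t1=16):
    """Median per-block noise-point count divided by pixels per block."""
    d = np.abs(cur.astype(np.int16) - prev.astype(np.int16))
    noisy = (d >= t0) & (d <= t1)          # noise point: t0 <= |diff| <= t1
    h, w = noisy.shape
    h, w = h - h % block, w - w % block    # crop to whole blocks (assumption)
    counts = (noisy[:h, :w]
              .reshape(h // block, block, w // block, block)
              .sum(axis=(1, 3)))
    return float(np.median(counts)) / (block * block)

def smooth_noise(n_init, n_prev, alpha=0.9, first=False):
    """Exponential smoothing of the initial noise intensity."""
    return n_init if first else alpha * n_prev + (1 - alpha) * n_init

rng = np.random.default_rng(0)
prev = rng.integers(0, 256, (16, 16), dtype=np.uint8)
cur = prev.copy()                # identical frames -> no noise points
n0 = initial_noise(cur, prev)
print(n0)  # 0.0
```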
Embodiment 3
As shown in Fig. 3, this embodiment provides a foreground detection method comprising the following steps:
S21: Obtain the current frame image. Because the camera performs foreground target detection in real time, every frame captured in real time is obtained, and foreground detection is performed on each frame to track targets in real time.
S22: Calculate the noise intensity of the current frame image. Whenever a video frame is obtained, its noise intensity is calculated, as follows:
First, calculate the absolute value of the gray-level difference between this frame and the previous frame at each pixel, denoted D(x, y). If T0 ≤ D(x, y) ≤ T1, pixel (x, y) is a noise point, otherwise it is not, where T0 and T1 are preset thresholds with T0 < T1, taken as 8 and 16 respectively in this embodiment.
Then, divide the frame into K non-overlapping 8×8 image blocks and count the noise points in each block, the count in the i-th block being denoted Mi. The initial noise intensity N of the current frame image is the median of Mi (i = 1, 2, 3, ..., K) divided by the number of pixels in a block (here 64).
The median of Mi (i = 1, 2, 3, ..., K) can be found quickly with a histogram algorithm. Moreover, to further save computation, the initial noise intensity can in practice be computed after down-sampling the image row by row and column by column.
Finally, because the change of image noise is a gradual process, the initial noise intensity can be smoothed with the following formula to obtain the noise intensity of the current frame image:

N_i = α · N_{i−1} + (1 − α) · N

where α is the smoothing coefficient, taken as 0.9 in this embodiment, N_i is the final noise intensity of the current frame image, N_{i−1} is the noise intensity of the previous frame image, and N is the initial noise intensity of the current frame image. The noise intensity is a value between 0 and 1; the larger it is, the stronger the noise in the image. i = 0 indicates that the current frame is the second frame of the video: because the noise intensity is computed from the gray-level difference between the current and previous frames and there is no image before the first frame, noise intensities can only be computed from the second frame onward, and when i = 0, N_i = N, i.e. when the current frame is the second frame of the video sequence its initial noise intensity is taken as its noise intensity.
Compared with other image noise intensity estimation methods in the prior art, the method of this embodiment prevents large fluctuations of the estimated noise intensity from repeatedly opening and closing the denoising module.
S23: Perform adaptive noise processing according to the noise intensity of the current frame image. When the noise intensity of the current frame image exceeds a preset threshold L1, the noise reduction module is switched on and the current frame image is denoised. Because noise changes gradually, the noise reduction module is switched off only when several consecutive frames fall below a threshold L0; this avoids repeatedly switching the module on and off within a short time. Thresholds L1 and L0 are 0.9 and 0.5 respectively. The noise reduction module can be a low-pass filter; to further improve operating efficiency, a 3×3 mean filtering algorithm is preferred in this embodiment.
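The on/off hysteresis and the 3×3 mean filter can be sketched as follows (the number of consecutive low-noise frames required to switch off is an assumption, since the text only says "several", and the edge-replicated padding of the filter is likewise assumed):

```python
import numpy as np

class DenoiseGate:
    """Switch denoising on above on_th (L1) and off only after off_frames
    consecutive frames below off_th (L0). off_frames=5 is an assumption."""
    def __init__(self, on_th=0.9, off_th=0.5, off_frames=5):
        self.on_th, self.off_th, self.off_frames = on_th, off_th, off_frames
        self.active, self.low_run = False, 0

    def update(self, noise):
        if noise > self.on_th:
            self.active, self.low_run = True, 0
        elif self.active and noise < self.off_th:
            self.low_run += 1
            if self.low_run >= self.off_frames:
                self.active, self.low_run = False, 0
        else:
            self.low_run = 0
        return self.active

def mean_filter3(img):
    """3x3 mean filter with edge-replicated padding (padding is an assumption)."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    return sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

gate = DenoiseGate()
print([gate.update(n) for n in [0.95, 0.6, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4]])
# [True, True, True, True, True, True, False, False]
```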
In the foreground detection method of this embodiment, noise reduction is performed only when the noise of the current frame image is strong, i.e. when its noise intensity exceeds the preset threshold; this eliminates the interference of image noise with foreground detection and improves detection accuracy. When the noise intensity of the current image is small, no noise reduction is performed, which reduces the workload of the camera processor.
S24: Calculate the local contrast of the current frame image. First, divide the current frame image, after the adaptive noise reduction of step S23, into non-overlapping 16×16 pixel blocks; then count the gray-level mean and gray-level variance of each block (to reduce computation, the variance can be computed from the sum of squared pixel values minus the square of the mean, i.e. Var = E[x²] − (E[x])²); finally, divide the gray-level variance of each block by its gray-level mean, the quotient being the local contrast.
S25: Build a multi-Gaussian background model based on the local contrast. The local contrast computed in step S24 is used as the block attribute in the multi-Gaussian background modeling to obtain the multi-Gaussian background model.
S26: Detect part of the foreground as foreground image samples with the multi-Gaussian background model. By reasonable selection of the multi-Gaussian parameters, targets in the current frame image that are large in size and significant in contrast can be detected as foreground image samples, denoted F1.
S27: Learn the background image. The foreground image sample F1 detected in step S26 is used to learn the background image from the current frame image, with the following learning formula:

B(x, y) = (1 − β) · B(x, y) + β · I(x, y), if F1(x, y) = 0
B(x, y) = B(x, y), if F1(x, y) > 0

where B(x, y) is the background image, I(x, y) is the current frame image, F1(x, y) is the foreground image sample detected by the multi-Gaussian background model, F1(x, y) = 0 means pixel (x, y) belongs to the background image, F1(x, y) > 0 means pixel (x, y) belongs to the foreground image, and β is the background learning rate, taken as 0.008 in this embodiment.
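The selective update of S27 can be sketched as follows (the blend form B ← (1 − β)·B + β·I for background pixels is reconstructed from the variable definitions, since the patent's formula image is not reproduced here; β = 0.008 as in this embodiment):

```python
import numpy as np

def learn_background(B, I, F1, beta=0.008):
    """Blend the current frame into the background only where the
    multi-Gaussian stage labelled the pixel as background (F1 == 0);
    pixels flagged as foreground samples are left untouched."""
    B = B.astype(np.float64)          # copy; the caller's array is unchanged
    bg = (F1 == 0)
    B[bg] = (1 - beta) * B[bg] + beta * I[bg].astype(np.float64)
    return B

B = np.full((2, 2), 100.0)
I = np.full((2, 2), 200.0)
F1 = np.array([[0, 0], [1, 0]])       # one pixel flagged as foreground sample
B2 = learn_background(B, I, F1)
print(B2)
```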
With the local-contrast-based multi-Gaussian background modeling and the background image learning step, the foreground detection method of this embodiment prevents background contamination caused by sudden illumination changes and improves the accuracy of foreground detection.
S28: Perform foreground target detection on the current frame image. In this embodiment, foreground detection is carried out with a texture comparison algorithm, i.e. a method that, on the basis of a reasonable representation of image texture, assesses the similarity of two image textures with a certain measure; in this embodiment image texture is represented with gradient vectors. The process is as follows:
First, use the Sobel gradient operator to compute the gradient vector at each pixel of the background image B(x, y) learned in step S27 and of the current frame image; the gradient vector of the background image at pixel (x, y) is (u0, u1) and that of the current frame image is (v0, v1).
Then, determine whether the texture at each pixel of the current frame image is rich and whether the texture of the background image and the current frame image is consistent, calculated by the following formulas:

Flat(x, y) = 0 if sqrt(v0² + v1²) ≥ Tg, else 1
Diff(x, y) = 1 if (u0·v0 + u1·v1) / (sqrt(u0² + u1²) · sqrt(v0² + v1²)) < Ts, else 0

where Flat(x, y) indicates the texture richness of the current frame image at pixel (x, y) and Diff(x, y) indicates the texture consistency of the background image and the current frame image at pixel (x, y); Flat(x, y) = 0 means the texture of the current frame image at pixel (x, y) is rich, and Diff(x, y) = 1 means the textures of the current frame image and the background image at pixel (x, y) are inconsistent. Tg and Ts are preset thresholds, determined by tuning over many different scenes containing weak targets and noise; in this embodiment they are taken as 25 and 0.4 respectively.
Finally, the foreground image E of the current frame image is obtained:
E(x, y) = Diff(x, y) AND (NOT Flat(x, y))
That is, a pixel (x, y) is a foreground point when the texture of the current frame image at that pixel is rich and the textures of the current frame image and the background image at that pixel are inconsistent. The image formed by all foreground points is the foreground image.
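The Flat/Diff/E computation above can be sketched as a vectorized NumPy function. The gradient fields are taken as inputs here (the patent computes them with the Sobel operator); the function follows the formulas as printed, including E = Diff AND (NOT Flat):

```python
import numpy as np

def texture_foreground(u0, u1, v0, v1, Tg=25.0, Ts=0.4):
    """Per-pixel foreground mask from gradient fields.
    (u0, u1): background-image Sobel gradients; (v0, v1): current-frame
    Sobel gradients.  Implements, element-wise:
        Flat = 1 where v0^2 + v1^2 > Tg^2
        Diff = 1 where v0*u0 + v1*u1 < (v0+v1+u0+u1)^2 * Ts / 2
        E    = Diff AND (NOT Flat)
    Tg=25 and Ts=0.4 are the embodiment's preset thresholds."""
    flat = (v0 ** 2 + v1 ** 2) > Tg ** 2
    diff = (v0 * u0 + v1 * u1) < ((v0 + v1 + u0 + u1) ** 2) * Ts / 2.0
    return diff & ~flat
```

The result is a boolean mask in which True marks foreground points.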
The foreground detection method provided by this embodiment represents image texture with gradient vectors and performs foreground target detection with a texture comparison algorithm. It can completely detect targets that are smaller in size and weaker in contrast, while also eliminating the false alarms caused by rapid changes in lighting.
Embodiment 4
As shown in figure 4, this embodiment provides a foreground detection system, including:
an acquisition module M1, which obtains the current frame image;
a computing module M2, which calculates the local contrast of the current frame image;
an establishing module M3, which establishes a multi-Gaussian background model based on the local contrast;
a pattern detection module M4, which detects a partial foreground image in the current frame image as a foreground image sample according to the multi-Gaussian background model;
a background image learning module M5, which learns the background image according to the foreground image sample and the current frame image;
a foreground target detection module M6, which performs foreground target detection on the current frame image according to the background image.
The foreground detection system provided by this embodiment prevents background contamination caused by sudden illumination changes and improves the accuracy of foreground detection. Meanwhile, the Gaussian background modeling can be carried out on a reduced image, which effectively improves the efficiency of the algorithm.
Specifically, the computing module M2 includes:
a blocking submodule, which divides the current frame image into several m*n pixel blocks, wherein m and n are positive integers;
a statistics submodule, which counts the gray mean and gray variance of each pixel block;
a local contrast acquisition submodule, which obtains the local contrast of each pixel block, the local contrast being the quotient of the gray variance of each pixel block divided by its gray mean.
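A minimal sketch of the local contrast computation described by the computing module M2. The block tiling via `reshape`/`swapaxes` and the zero-mean guard are implementation choices; the variance-to-mean definition is the patent's:

```python
import numpy as np

def local_contrast(img, m=8, n=8):
    """Divide a grayscale image into m*n pixel blocks and return each
    block's local contrast: gray variance divided by gray mean.
    Assumes the image dimensions are multiples of m and n."""
    h, w = img.shape
    # Tile the image into (h//m, w//n) blocks of shape (m, n).
    blocks = img.reshape(h // m, m, w // n, n).swapaxes(1, 2).astype(np.float64)
    mean = blocks.mean(axis=(2, 3))
    var = blocks.var(axis=(2, 3))
    return var / np.maximum(mean, 1e-6)   # guard against all-black blocks
```

The resulting per-block contrast map is what step S25 feeds into the multi-Gaussian background model.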
Preferably, the foreground target detection module M6 includes:
a gradient vector acquisition submodule, which obtains respectively the gradient vector of each pixel in the background image and in the current frame image;
a foreground target detection basis acquisition submodule, which determines, according to the gradient vectors of the pixels in the background image and the current frame image, whether the texture of the current frame image at each pixel is rich and whether the textures of the background image and the current frame image are consistent;
a foreground target judging submodule, which judges a pixel to be a foreground point when the texture of the current frame image at that pixel is rich and the textures of the current frame image and the background image at that pixel are inconsistent.
The foreground detection system provided by this embodiment represents image texture with gradient vectors and performs foreground target detection with a texture comparison algorithm. It can completely detect targets that are smaller in size and weaker in contrast, while also eliminating the false alarms caused by rapid changes in lighting.
Preferably, the system also includes an adaptive noise processing module, including:
a noise intensity acquisition submodule M01, which obtains the noise intensity of the current frame image before the local contrast of the current frame image is calculated;
a noise reduction submodule M02, which performs noise reduction on the current frame image when the noise intensity is greater than a preset threshold.
In the foreground detection system provided by this embodiment, when the noise intensity of the current frame image is strong, i.e. greater than the noise processing threshold, noise reduction is performed, eliminating the interference of image noise with foreground detection and improving detection accuracy. When the noise intensity of the current image is small, noise reduction need not be performed on it, which reduces the workload of the camera processor.
Specifically, the noise intensity acquisition submodule M01 includes:
an image block division unit, which divides the current frame image into several image blocks with the same number of pixels;
a noise point statistics unit, which counts the number of noise points in each image block of the current frame image;
an initial noise intensity acquisition unit, which obtains the initial noise intensity of the current frame image, the initial noise intensity being the median of the noise point counts of the image blocks in the current frame image divided by the number of pixels per image block;
a noise intensity acquisition unit, which obtains the noise intensity of the current frame image according to the initial noise intensity of the current frame image.
Specifically, the noise point statistics unit includes:
a gray-difference absolute value calculation subunit, which calculates the absolute value of the gray difference between the current frame image and the previous frame image;
a noise point judging and statistics subunit, which counts the number of noise points in each image block of the current frame image according to the absolute value of the gray difference, wherein a pixel in an image block is judged to be a noise point when the absolute value of its gray difference is within a preset threshold range.
In the foreground detection system provided by this embodiment, a pixel is judged to be a noise point only when the absolute value of its gray difference lies within the preset threshold range: when the absolute value is too small, the fluctuation of the pixel's gray value is normal and will not negatively affect foreground detection; when it is too large, the pixel may be a pixel of the foreground image.
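A sketch of the initial noise intensity estimate assembled from the units above. The gray-difference threshold range [t_lo, t_hi] is left "preset" by the patent, so the values used here are illustrative assumptions:

```python
import numpy as np

def noise_intensity(prev, cur, m=8, n=8, t_lo=2, t_hi=20):
    """Initial noise intensity of the current frame: count, in each m*n
    image block, the pixels whose inter-frame gray-difference magnitude
    falls inside [t_lo, t_hi] (too small = normal fluctuation, too large
    = likely foreground), then take the median block count divided by
    the number of pixels per block.  Assumes image dimensions are
    multiples of m and n; t_lo/t_hi are illustrative presets."""
    d = np.abs(cur.astype(np.int32) - prev.astype(np.int32))
    h, w = d.shape
    blocks = d.reshape(h // m, m, w // n, n).swapaxes(1, 2)
    noisy = (blocks >= t_lo) & (blocks <= t_hi)     # noise-point mask
    counts = noisy.sum(axis=(2, 3))                 # per-block counts
    return float(np.median(counts)) / (m * n)
```

The result lies in [0, 1] and can then be smoothed across frames (as in claim 6) before being compared with the noise processing threshold.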
Obviously, the above embodiments are merely examples given for clarity of illustration and are not intended to limit the embodiments. Those of ordinary skill in the art can make changes or variations in other various forms on the basis of the above description. It is neither necessary nor possible to exhaustively list all embodiments here, and the obvious changes or variations derived therefrom remain within the protection scope of the invention.
Claims (12)
1. A foreground detection method, characterized by comprising the following steps:
obtaining a current frame image;
calculating the local contrast of the current frame image;
establishing a multi-Gaussian background model based on the local contrast;
detecting a partial foreground image in the current frame image as a foreground image sample according to the multi-Gaussian background model;
learning a background image according to the foreground image sample and the current frame image;
performing foreground target detection on the current frame image according to the background image;
wherein the step of performing foreground target detection on the current frame image according to the background image comprises:
obtaining respectively the gradient vector of each pixel in the background image and in the current frame image;
determining, according to the gradient vectors of the pixels in the background image and the current frame image, whether the texture of the current frame image at each pixel is rich and whether the textures of the background image and the current frame image are consistent;
judging a pixel to be a foreground point when the texture of the current frame image at that pixel is rich and the textures of the current frame image and the background image at that pixel are inconsistent;
wherein whether the texture of the current frame image at each pixel is rich and whether the textures of the background image and the current frame image are consistent are calculated by the following formulas:
Flat(x, y) = 1, if v0² + v1² > Tg²; Flat(x, y) = 0, if v0² + v1² ≤ Tg²
Diff(x, y) = 1, if v0·u0 + v1·u1 < (v0 + v1 + u0 + u1)²·Ts/2; Diff(x, y) = 0, if v0·u0 + v1·u1 ≥ (v0 + v1 + u0 + u1)²·Ts/2
wherein Flat(x, y) indicates the texture richness of the current frame image at pixel (x, y), Diff(x, y) indicates the texture consistency of the background image and the current frame image at pixel (x, y), Tg and Ts are preset thresholds, (u0, u1) and (v0, v1) respectively denote the gradient vector of the background image at pixel (x, y) and the gradient vector of the current frame image at pixel (x, y), Flat(x, y) = 0 indicates that the texture of the current frame image at pixel (x, y) is rich, and Diff(x, y) = 1 indicates that the textures of the current frame image and the background image at pixel (x, y) are inconsistent.
2. The method as described in claim 1, characterized in that the step of calculating the local contrast of the current frame image comprises:
dividing the current frame image into several m*n pixel blocks, wherein m and n are positive integers;
counting the gray mean and gray variance of each pixel block;
obtaining the local contrast of each pixel block, the local contrast being the quotient of the gray variance of each pixel block divided by its gray mean.
3. The method as described in claim 1 or 2, characterized in that, before the step of calculating the local contrast of the current frame image, the method further comprises performing adaptive noise processing on the current frame image, specifically comprising:
obtaining the noise intensity of the current frame image;
performing noise reduction on the current frame image when the noise intensity is greater than a preset threshold.
4. The method as claimed in claim 3, characterized in that the step of obtaining the noise intensity of the current frame image comprises:
dividing the current frame image into several image blocks with the same number of pixels;
counting the number of noise points in each image block of the current frame image;
obtaining the initial noise intensity of the current frame image, the initial noise intensity being the median of the noise point counts of the image blocks in the current frame image divided by the number of pixels per image block;
obtaining the noise intensity of the current frame image according to the initial noise intensity of the current frame image.
5. The method as claimed in claim 4, characterized in that the step of counting the number of noise points in each image block of the current frame image comprises:
calculating the absolute value of the gray difference between the current frame image and the previous frame image;
counting the number of noise points in each image block of the current frame image according to the absolute value of the gray difference, wherein a pixel in an image block is judged to be a noise point when the absolute value of its gray difference is within a preset threshold range.
6. The method as described in claim 4 or 5, characterized in that the initial noise intensity of the current frame image is smoothed according to the following formula to obtain the noise intensity of the current frame image:
Ni = N, if i = 0; Ni = α·Ni-1 + (1 − α)·N, if i > 0,
wherein α is the smoothing coefficient, N is the initial noise intensity of the current frame image, Ni is the noise intensity of the current frame image, Ni-1 is the noise intensity of the previous frame image, 0 < N < 1, 0 < Ni < 1, 0 < Ni-1 < 1, and i = 0 indicates that the current frame image is the second frame image of the video.
7. The method as described in claim 1, 2, 4 or 5, characterized in that the background image is learned by the following formula:
B(x, y) = B(x, y)·(1 − β) + I(x, y)·β, if F1(x, y) = 0; B(x, y) = B(x, y), if F1(x, y) > 0,
wherein B(x, y) is the background image, I(x, y) is the current frame image, F1(x, y) is the foreground image sample detected according to the multi-Gaussian background model, F1(x, y) = 0 indicates that pixel (x, y) is a background point, F1(x, y) > 0 indicates that pixel (x, y) is a foreground point, and β denotes the background learning rate.
8. A foreground detection system, characterized by comprising:
an acquisition module, which obtains a current frame image;
a computing module, which calculates the local contrast of the current frame image;
an establishing module, which establishes a multi-Gaussian background model based on the local contrast;
a pattern detection module, which detects a partial foreground image in the current frame image as a foreground image sample according to the multi-Gaussian background model;
a background image learning module, which learns a background image according to the foreground image sample and the current frame image;
a foreground target detection module, which performs foreground target detection on the current frame image according to the background image;
wherein the foreground target detection module comprises:
a gradient vector acquisition submodule, which obtains respectively the gradient vector of each pixel in the background image and in the current frame image;
a foreground target detection basis acquisition submodule, which determines, according to the gradient vectors of the pixels in the background image and the current frame image, whether the texture of the current frame image at each pixel is rich and whether the textures of the background image and the current frame image are consistent;
a foreground target judging submodule, which judges a pixel to be a foreground point when the texture of the current frame image at that pixel is rich and the textures of the current frame image and the background image at that pixel are inconsistent;
wherein whether the texture of the current frame image at each pixel is rich and whether the textures of the background image and the current frame image are consistent are calculated by the following formulas:
Flat(x, y) = 1, if v0² + v1² > Tg²; Flat(x, y) = 0, if v0² + v1² ≤ Tg²
Diff(x, y) = 1, if v0·u0 + v1·u1 < (v0 + v1 + u0 + u1)²·Ts/2; Diff(x, y) = 0, if v0·u0 + v1·u1 ≥ (v0 + v1 + u0 + u1)²·Ts/2
wherein Flat(x, y) indicates the texture richness of the current frame image at pixel (x, y), Diff(x, y) indicates the texture consistency of the background image and the current frame image at pixel (x, y), Tg and Ts are preset thresholds, (u0, u1) and (v0, v1) respectively denote the gradient vector of the background image at pixel (x, y) and the gradient vector of the current frame image at pixel (x, y), Flat(x, y) = 0 indicates that the texture of the current frame image at pixel (x, y) is rich, and Diff(x, y) = 1 indicates that the textures of the current frame image and the background image at pixel (x, y) are inconsistent.
9. The system as claimed in claim 8, characterized in that the computing module comprises:
a blocking submodule, which divides the current frame image into several m*n pixel blocks, wherein m and n are positive integers;
a statistics submodule, which counts the gray mean and gray variance of each pixel block;
a local contrast acquisition submodule, which obtains the local contrast of each pixel block, the local contrast being the quotient of the gray variance of each pixel block divided by its gray mean.
10. The system as claimed in claim 8, characterized by further comprising an adaptive noise processing module, which comprises:
a noise intensity acquisition submodule, which obtains the noise intensity of the current frame image before the local contrast of the current frame image is calculated;
a noise reduction submodule, which performs noise reduction on the current frame image when the noise intensity is greater than a preset threshold.
11. The system as claimed in claim 10, characterized in that the noise intensity acquisition submodule comprises:
an image block division unit, which divides the current frame image into several image blocks with the same number of pixels;
a noise point statistics unit, which counts the number of noise points in each image block of the current frame image;
an initial noise intensity acquisition unit, which obtains the initial noise intensity of the current frame image, the initial noise intensity being the median of the noise point counts of the image blocks in the current frame image divided by the number of pixels per image block;
a noise intensity acquisition unit, which obtains the noise intensity of the current frame image according to the initial noise intensity of the current frame image.
12. The system as claimed in claim 11, characterized in that the noise point statistics unit comprises:
a gray-difference absolute value calculation subunit, which calculates the absolute value of the gray difference between the current frame image and the previous frame image;
a noise point judging and statistics subunit, which counts the number of noise points in each image block of the current frame image according to the absolute value of the gray difference, wherein a pixel in an image block is judged to be a noise point when the absolute value of its gray difference is within a preset threshold range.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510098306.XA CN104700405B (en) | 2015-03-05 | 2015-03-05 | A kind of foreground detection method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104700405A CN104700405A (en) | 2015-06-10 |
CN104700405B true CN104700405B (en) | 2017-11-28 |
Family
ID=53347487
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510098306.XA Active CN104700405B (en) | 2015-03-05 | 2015-03-05 | A kind of foreground detection method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104700405B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105405138B (en) * | 2015-11-10 | 2018-05-29 | 上海交通大学 | Waterborne target tracking based on conspicuousness detection |
CN106327488B (en) * | 2016-08-19 | 2020-04-21 | 云赛智联股份有限公司 | Self-adaptive foreground detection method and detection device thereof |
CN108961304B (en) * | 2017-05-23 | 2022-04-26 | 阿里巴巴集团控股有限公司 | Method for identifying moving foreground in video and method for determining target position in video |
CN107992873A (en) * | 2017-10-12 | 2018-05-04 | 西安天和防务技术股份有限公司 | Object detection method and device, storage medium, electronic equipment |
CN109907555B (en) * | 2018-07-26 | 2020-11-17 | 苏州斯莱斯食品有限公司 | Solar power supply type cabinet |
CN109697725B (en) * | 2018-12-03 | 2020-10-02 | 浙江大华技术股份有限公司 | Background filtering method and device and computer readable storage medium |
CN111127347A (en) * | 2019-12-09 | 2020-05-08 | Oppo广东移动通信有限公司 | Noise reduction method, terminal and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103208123A (en) * | 2013-04-19 | 2013-07-17 | 广东图图搜网络科技有限公司 | Image segmentation method and system |
CN103530886A (en) * | 2013-10-22 | 2014-01-22 | 上海安奎拉信息技术有限公司 | Low-calculation background removing method for video analysis |
TWI425446B (en) * | 2010-10-29 | 2014-02-01 | Univ Nat Chiao Tung | A method for object detection system in day-and-night environment |
KR20140045854A (en) * | 2012-10-09 | 2014-04-17 | 에스케이텔레콤 주식회사 | Method and apparatus for monitoring video for estimating gradient of single object |
CN103810722A (en) * | 2014-02-27 | 2014-05-21 | 云南大学 | Moving target detection method combining improved LBP (Local Binary Pattern) texture and chrominance information |
CN103903278A (en) * | 2012-12-28 | 2014-07-02 | 重庆凯泽科技有限公司 | Moving target detection and tracking system |
Non-Patent Citations (4)
Title |
---|
Adaptive background mixture models for real-time tracking; Chris Stauffer et al.; Computer Vision and Pattern Recognition; 1999-06-25; 246-252 *
Real-time foreground–background segmentation using codebook model; Kyungnam Kim et al.; Real-Time Imaging; 2005-06-30; vol. 11, no. 3; 172-185 *
Video background extraction based on texture and statistical features; Jiang Yonglin et al.; Optics and Precision Engineering; 2008-01-15; vol. 16, no. 1; 172-177 *
Moving object detection using an improved Gaussian mixture model; Zhang Yanping et al.; Computer Engineering and Applications; 2010-12-01; vol. 46, no. 34; 155-223 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||