CN101980300B - 3G smart phone-based motion detection method - Google Patents

3G smart phone-based motion detection method

Info

Publication number
CN101980300B
Authority
CN
China
Prior art keywords
value
mean
array
background model
average gray
Prior art date
Legal status
Expired - Fee Related
Application number
CN2010105243603A
Other languages
Chinese (zh)
Other versions
CN101980300A (en)
Inventor
胡维华
崔扬
汤利平
李原洲
贾琳
郁伟炜
Current Assignee
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN2010105243603A
Publication of CN101980300A
Application granted
Publication of CN101980300B

Landscapes

  • Studio Devices (AREA)

Abstract

The invention relates to a 3G smart phone-based motion detection method. Conventional motion detection algorithms place strict requirements on hardware, storage and transmission conditions. The method comprises the following steps: first, creating arrays for storing pixel information; second, calculating the average gray value and mean square difference of a background model and determining from the captured image whether an abnormality exists; third, deciding from the difference values whether to update the background model, the update rate, the mean square difference and the average gray value, and processing the difference values; and finally, calculating the percentage of elements in the Adiff array whose value is 255 and deciding whether to send alarm information to the user. The method has a small computational load and readily avoids the influence of illumination changes in the monitored area on the detected image changes.

Description

Motion detection method based on a 3G smart phone
Technical field
The invention belongs to the technical field of computer vision and relates to a motion detection algorithm based on a 3G smart phone.
Background technology
With the rapid development of society, video surveillance has become an important component of security systems. Because video surveillance places demanding requirements on hardware, transmission and storage, surveillance technology has so far been deployed only on fixed equipment or high-performance mobile devices and has not been applied successfully on mobile phones.
As a portable terminal, a mobile phone cannot match professional surveillance equipment in storage capacity, computing power or communication capability. However, current phones are almost universally equipped with a camera and mass storage, so a monitoring phone, used as a portable general-purpose surveillance solution, can satisfy most monitoring tasks; developing a surveillance algorithm suited to smart phones is therefore both feasible and necessary. Many phone-based monitoring schemes have appeared in industry, but these schemes use the phone only as a viewing terminal that connects to surveillance equipment over the Internet to retrieve monitoring information. They do not use the phone as the monitoring device itself, and therefore cannot exploit the portability of the phone in that role.
Motion detection algorithms that run well do exist on professional surveillance equipment, but they impose demanding requirements on hardware, storage and transmission that current phone hardware cannot meet at all. These algorithms therefore cannot be applied directly on current phone platforms; instead, a fast and accurate processing algorithm should be designed around the specific characteristics of the phone platform.
Summary of the invention
To address the deficiencies of the prior art, the present invention proposes a motion detection method based on a 3G smart phone.
The concrete steps of the method are as follows:
Step (1): initialize a two-dimensional array ArefFrames for storing the grayscale data refFrames of the 4 images used for initialization; the size of ArefFrames is 4 × n.
Initialize a one-dimensional array Amean for storing the average gray values mean of the background model; its size is n.
Initialize a one-dimensional array AcurFrame for storing the gray values curFrame of the current image; its size is n.
Initialize a one-dimensional array Asig for storing the mean square differences sig of the background model; its size is n.
Initialize a one-dimensional array Adiff for storing the difference values diff between the gray values of the currently captured image and the average gray values of the background model; its size is n.
Here n is the number of bytes of pixel data in an image captured by the phone.
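For illustration only, the buffers of step (1) can be laid out as below. This is a minimal sketch assuming numpy; the value 1024 matches the embodiment later in the text but is otherwise just an example.

```python
import numpy as np

n = 1024  # bytes of grayscale pixel data per captured frame (example value)

ArefFrames = np.zeros((4, n))              # grayscale data of the 4 initialization frames
Amean      = np.zeros(n)                   # average gray values of the background model
AcurFrame  = np.zeros(n)                   # gray values of the current frame
Asig       = np.zeros(n)                   # mean square differences of the background model
Adiff      = np.zeros(n, dtype=np.uint8)   # binarized difference values (0 or 255)
```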
Step (2): calculate the average gray values of the background model, then store the resulting average gray values mean in the one-dimensional array Amean. The computation of mean is as follows:
mean_1 = refFrames_1    (1)
mean_2 = (mean_1 + refFrames_2) / 2    (2)
mean_3 = (mean_2 × 2 + refFrames_3) / 3    (3)
mean_4 = (mean_3 × 3 + refFrames_4) / 4    (4)
mean = mean_4    (5)
In the above formulas, mean_1, mean_2, mean_3 and mean_4 are the average gray values of the initial background model obtained from the first, second, third and fourth calculation, respectively; refFrames_1, refFrames_2, refFrames_3 and refFrames_4 are the grayscale data of the first, second, third and fourth of the 4 images used for initialization, respectively.
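Equations (1)-(5) form a running average over the 4 initialization frames. A minimal sketch, continuing the step (1) buffers above (numpy assumed):

```python
# Running average of equations (1)-(5): fold each initialization frame into the mean.
mean = ArefFrames[0].copy()                  # mean_1 = refFrames_1
for i in range(1, 4):                        # equations (2)-(4)
    mean = (mean * i + ArefFrames[i]) / (i + 1)
Amean[:] = mean                              # equation (5): final background mean
```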
Step (3): calculate the mean square differences of the background model, then store the resulting mean square differences sig in the one-dimensional array Asig. sig is computed as shown in formula (6):
sig = Σ_{i=1}^{4} (mean - refFrames_i)² / i    (6)
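A minimal sketch of formula (6), reading the printed formula literally (each squared deviation divided by its frame index i before summing; that reading is an assumption, since the layout of the original formula is garbled):

```python
# Formula (6): per-pixel mean square difference of the background model.
sig = np.zeros(n)
for i in range(1, 5):
    sig += (Amean - ArefFrames[i - 1]) ** 2 / i
Asig[:] = sig
```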
Step (4): for each point, calculate the difference value between the pixel gray value in the image captured by the phone and the average gray value of the corresponding point in the background model. If the absolute value of the difference is less than 64, the point in the captured image and the corresponding point in the background model are regarded as the same point; if the absolute value of the difference is greater than or equal to 64, the point in the captured image is regarded as an abnormal point.
The difference value is the difference between the pixel gray value of a point in the captured image and the average gray value of the corresponding point in the background model.
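A minimal sketch of step (4), continuing the buffers above. The captured frame is assumed to have already been placed in AcurFrame as grayscale values, and the result is stored in Adiff in the binarized (0/255) form described in the embodiment:

```python
# Step (4): per-pixel difference against the background mean, binarized at threshold 64.
diff = AcurFrame - Amean
Adiff[:] = np.where(np.abs(diff) >= 64, 255, 0)   # 255 = abnormal point, 0 = same point
```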
Step (5): decide from the difference value whether to update the background model. Specifically, if the absolute value of the difference is less than 3 times the mean square difference of the current background model, the point of the captured image is updated into the background model; if the absolute value of the difference is greater than or equal to 3 times the mean square difference of the current background model, the point of the captured image is not updated into the background model.
Step (6): calculate the new update rate alpha′,
alpha′ = alpha × (1.0 / e^(|curFrame - mean| / 255))    (7)
where the initial value of alpha is 0.5.
Step (7): update the mean square difference sig′ of the background model using alpha′,
sig′ = (1 - alpha′) × sig + alpha′ × (curFrame - mean)²    (8)
Step (8): update the average gray value mean′ of the background model using alpha′,
mean′ = (1 - alpha′) × mean + alpha′ × curFrame    (9)
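A minimal sketch of steps (5)-(8), continuing the step (4) sketch above. The text does not state explicitly whether formulas (8) and (9) are applied to every pixel or only to the pixels selected for update in step (5); applying them only to the selected pixels is an assumption.

```python
# Steps (5)-(8): fold stable pixels back into the background with an adaptive rate.
alpha = 0.5                                                   # initial update rate
stable = np.abs(diff) < 3 * Asig                              # step (5): pixels to update
alpha_new = alpha * np.exp(-np.abs(AcurFrame - Amean) / 255.0)             # equation (7)
sig_new  = (1 - alpha_new) * Asig  + alpha_new * (AcurFrame - Amean) ** 2  # equation (8)
mean_new = (1 - alpha_new) * Amean + alpha_new * AcurFrame                 # equation (9)
Asig[stable]  = sig_new[stable]
Amean[stable] = mean_new[stable]
```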
Step (9): process the difference values in the Adiff array as follows: examine the value of each pixel in the Adiff array in turn; if the values of the pixels adjacent to it are all 255, set the value of this pixel to 255, otherwise set it to 0.
Step (10): process the difference values in the Adiff array as follows: examine the value of each pixel in the Adiff array in turn; if the values of the pixels adjacent to it are all 0, set the value of this pixel to 0, otherwise set it to 255.
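Steps (9) and (10) amount to an erosion pass followed by a dilation pass on the binarized difference image, which removes isolated noise. A minimal sketch follows; treating the flat Adiff array as a width × height image with 8-connected neighbours, and the 32 × 32 geometry, are assumptions (the text does not specify the neighbourhood or the frame dimensions).

```python
# Steps (9)-(10): erosion then dilation on the binarized difference image.
width, height = 32, 32                         # hypothetical geometry with width * height == n
img = Adiff.reshape(height, width)

def neighbours_all_equal(a, value):
    # True where every 8-connected neighbour of a pixel equals `value` (borders padded with `value`).
    p = np.pad(a, 1, constant_values=value)
    ok = np.ones(a.shape, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            ok &= p[1 + dy:1 + dy + a.shape[0], 1 + dx:1 + dx + a.shape[1]] == value
    return ok

eroded  = np.where(neighbours_all_equal(img, 255), 255, 0)   # step (9)
dilated = np.where(neighbours_all_equal(eroded, 0), 0, 255)  # step (10)
Adiff[:] = dilated.ravel()
```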
Step (11): calculate the percentage of elements in the Adiff array whose value is 255. If this percentage is greater than or equal to 15%, send an alarm message to the user indicating that a foreign object has appeared in the monitored area, then jump to step (4); if it is less than 15%, jump directly to step (4).
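A minimal sketch of step (11); how the alarm message is actually delivered (SMS, notification, etc.) is not specified in the text, so the print call is only a placeholder.

```python
# Step (11): alarm when at least 15% of the pixels are marked abnormal.
ratio = np.count_nonzero(Adiff == 255) / Adiff.size
if ratio >= 0.15:
    print("alarm: foreign object detected in the monitored area")  # placeholder for the user alert
# in either case the method returns to step (4) with the next captured frame
```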
Compared with the prior art, the method has the following beneficial effects:
1) the computational load of the algorithm is small, so it is suitable for low-speed phone hardware;
2) the algorithm largely avoids the influence of illumination changes in the monitored area on the detected image changes;
3) the algorithm removes most of the noise in the difference image.
Embodiment
The invention is further described below with reference to an example.
Because phone platforms have limited processing speed, the method queries the camera information of the phone platform before it runs in order to determine the image data formats supported by the camera. If the phone camera supports YUV output, the method uses the Y component of that format directly and no longer needs to convert the pixel data into grayscale data, which yields a large speed gain on cameras that support YUV output. For phone cameras that output RGB data, the method converts the RGB data into grayscale data, as sketched below.
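A minimal sketch of this pre-processing, assuming numpy; the exact luminance weights used by the method are not stated, so the ITU-R BT.601 coefficients here are an assumption, as is the packed 24-bit RGB layout.

```python
def to_gray(frame_bytes, fmt, width, height):
    # frame_bytes: raw camera buffer; fmt: "yuv" or "rgb".
    data = np.frombuffer(frame_bytes, dtype=np.uint8)
    if fmt == "yuv":
        # in planar YUV formats the first width*height bytes are the Y (luma) plane
        return data[:width * height].astype(np.float64)
    r = data[0::3].astype(np.float64)   # packed RGB, one byte per channel
    g = data[1::3].astype(np.float64)
    b = data[2::3].astype(np.float64)
    return 0.299 * r + 0.587 * g + 0.114 * b
```

With every captured frame reduced to grayscale in this way, the concrete steps of the method are as follows: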
Step (1): initialize a two-dimensional array ArefFrames for storing the grayscale data refFrames of the 4 images used for initialization; the size of ArefFrames is 4 × 1024.
Initialize a one-dimensional array Amean for storing the average gray values mean of the background model; its size is 1024.
Initialize a one-dimensional array AcurFrame for storing the gray values curFrame of the current image; its size is 1024.
Initialize a one-dimensional array Asig for storing the mean square differences sig of the background model; its size is 1024.
Initialize a one-dimensional array Adiff for storing the difference values diff between the gray values of the currently captured image and the average gray values of the background model; its size is 1024.
Step (2): calculate the average gray values of the background model, then store the resulting average gray values mean of the initial background model in the one-dimensional array Amean. The computation of mean for the initial background model is as follows:
mean_1 = refFrames_1    (1)
mean_2 = (mean_1 + refFrames_2) / 2    (2)
mean_3 = (mean_2 × 2 + refFrames_3) / 3    (3)
mean_4 = (mean_3 × 3 + refFrames_4) / 4    (4)
mean = mean_4    (5)
In the above formulas, mean_1, mean_2, mean_3 and mean_4 are the average gray values of the initial background model obtained from the first, second, third and fourth calculation, respectively; refFrames_1, refFrames_2, refFrames_3 and refFrames_4 are the grayscale data of the first, second, third and fourth of the 4 images used for initialization, respectively.
Step (3): calculate the mean square differences of the background model, then store the resulting mean square differences sig in the one-dimensional array Asig. sig is computed as shown in formula (6):
sig = Σ_{i=1}^{4} (mean - refFrames_i)² / i    (6)
Steps (1) to (3) constitute the construction of the background model; this part is computed only once in the whole algorithm.
Step (4): for each point, calculate the difference value between the pixel gray value in the captured image and the average gray value of the corresponding point in the background model. The difference value is the difference between the pixel gray value of a point in the captured image and the average gray value of the corresponding point in the background model. If the absolute value of the difference is less than 64, the point in the captured image and the corresponding point in the background model are regarded as the same point, and the difference value is set to 0; if the absolute value of the difference is greater than or equal to 64, the point in the captured image is regarded as an abnormal point, and the difference value is set to 255.
As step (4) shows, the threshold used to judge whether a pixel has changed is 64. The difference values stored in the Adiff array are in fact the binarized values of the abnormal pixels: binarization means that every pixel value stored in Adiff is either 0 or 255, where 0 represents a pure black pixel value and 255 represents a pure white pixel value.
Step (5): check the difference values. If the absolute value of a difference is less than 3 times the mean square difference of the current background model, the corresponding point of the captured image is updated into the background model; if it is greater than or equal to 3 times the mean square difference of the current background model, the point is not updated into the background model. The update of the background model is constrained by the mean square difference: if the mean square difference is computed inaccurately, the background model becomes inaccurate, and the abnormality detection output in turn becomes inaccurate. The mean square difference must therefore be updated continually from the currently captured image; formulas (7) and (8) below are the steps that update it.
Step (6): calculate the new update rate alpha′,
alpha′ = alpha × (1.0 / e^(|curFrame - mean| / 255))    (7)
where the initial value of alpha is 0.5.
Step (7): update the mean square difference sig′ of the background model using alpha′,
sig′ = (1 - alpha′) × sig + alpha′ × (curFrame - mean)²    (8)
To prevent normal changes in the captured picture caused by ambient light and similar factors from being reported as abnormal, the algorithm must update the background average gray value in real time. While updating the background average gray value it must also distinguish abnormal changes from normal ones, so the algorithm has to be robust without sacrificing accuracy. The algorithm distinguishes abnormal from normal phenomena based on the value of the background model's mean square difference.
Step (8): update the average gray value mean′ of the background model using alpha′,
mean′ = (1 - alpha′) × mean + alpha′ × curFrame    (9)
Step (9): process the difference values in the Adiff array as follows: examine the value of each pixel in the Adiff array in turn; if the values of the pixels adjacent to it are all 255, set the value of this pixel to 255, otherwise set it to 0.
Step (10): process the difference values in the Adiff array as follows: examine the value of each pixel in the Adiff array in turn; if the values of the pixels adjacent to it are all 0, set the value of this pixel to 0, otherwise set it to 255.
Step (11): calculate the percentage of elements in the Adiff array whose value is 255. If this percentage is greater than or equal to 15%, send an alarm message to the user indicating that a foreign object has appeared in the monitored area, then jump to step (4); if it is less than 15%, jump directly to step (4), thereby maintaining real-time motion detection on the smart phone.
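For illustration, the per-step sketches above can be consolidated into one detection loop for this embodiment. This is a sketch under the same assumptions as before: capture_frame stands for whatever camera callback supplies a grayscale frame of n pixels, and the noise-removal passes of steps (9)-(10) are left to the sketch given after step (10).

```python
def run_detection(capture_frame, n=1024):
    # Steps (1)-(3): build the background model once from 4 initial frames.
    ref = np.stack([capture_frame() for _ in range(4)]).astype(np.float64)
    mean = ref[0].copy()
    for i in range(1, 4):
        mean = (mean * i + ref[i]) / (i + 1)
    sig = sum((mean - ref[i]) ** 2 / (i + 1) for i in range(4))
    alpha = 0.5
    while True:                                        # steps (4)-(11) for every captured frame
        cur = capture_frame().astype(np.float64)
        d = cur - mean
        adiff = np.where(np.abs(d) >= 64, 255, 0)      # step (4): binarized difference
        a = alpha * np.exp(-np.abs(d) / 255.0)         # equation (7): adaptive update rate
        stable = np.abs(d) < 3 * sig                   # step (5): pixels folded into background
        sig = np.where(stable, (1 - a) * sig + a * d ** 2, sig)      # equation (8)
        mean = np.where(stable, (1 - a) * mean + a * cur, mean)      # equation (9)
        if np.count_nonzero(adiff == 255) / n >= 0.15:               # step (11)
            print("alarm: foreign object detected in the monitored area")
```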

Claims (1)

1. A motion detection method based on a 3G smart phone, characterized in that the method comprises the following steps:
Step (1): initialize a two-dimensional array ArefFrames for storing the grayscale data refFrames of the 4 images used for initialization; the size of ArefFrames is 4 × n;
initialize a one-dimensional array Amean for storing the average gray values mean of the background model; its size is n;
initialize a one-dimensional array AcurFrame for storing the gray values curFrame of the current image; its size is n;
initialize a one-dimensional array Asig for storing the mean square differences sig of the background model; its size is n;
initialize a one-dimensional array Adiff for storing the difference values diff between the gray values of the currently captured image and the average gray values of the background model; its size is n;
where n is the number of bytes of pixel data in an image captured by the phone;
Step (2): calculate the average gray values of the background model, then store the resulting average gray values mean in the one-dimensional array Amean; the computation of mean is as follows:
mean_1 = refFrames_1
mean_2 = (mean_1 + refFrames_2) / 2
mean_3 = (mean_2 × 2 + refFrames_3) / 3
mean_4 = (mean_3 × 3 + refFrames_4) / 4
mean = mean_4
where mean_1, mean_2, mean_3 and mean_4 are the average gray values of the initial background model obtained from the first, second, third and fourth calculation, respectively, and refFrames_1, refFrames_2, refFrames_3 and refFrames_4 are the grayscale data of the first, second, third and fourth of the 4 images used for initialization, respectively;
Step (3): calculate the mean square differences of the background model, then store the resulting mean square differences sig in the one-dimensional array Asig; sig is computed as follows:
sig = Σ_{i=1}^{4} (mean - refFrames_i)² / i
Step (4): for each point, calculate the difference value between the pixel gray value in the image captured by the phone and the average gray value of the corresponding point in the background model; if the absolute value of the difference is less than 64, regard the point in the captured image and the corresponding point in the background model as the same point and set the difference value to 0; if the absolute value of the difference is greater than or equal to 64, regard the point in the captured image as an abnormal point and set the difference value to 255;
the difference value being the difference between the pixel gray value of a point in the captured image and the average gray value of the corresponding point in the background model;
Step (5): decide from the difference value whether to update the background model, specifically: if the absolute value of the difference is less than 3 times the mean square difference of the current background model, update the point of the captured image into the background model; if the absolute value of the difference is greater than or equal to 3 times the mean square difference of the current background model, do not update the point of the captured image into the background model;
Step (6): calculate the new update rate alpha′,
alpha′ = alpha × (1.0 / e^(|curFrame - mean| / 255))
where the initial value of alpha is 0.5;
Step (7): update the mean square difference sig′ of the background model using alpha′,
sig′ = (1 - alpha′) × sig + alpha′ × (curFrame - mean)²
Step (8): update the average gray value mean′ of the background model using alpha′,
mean′ = (1 - alpha′) × mean + alpha′ × curFrame
Step (9): process the difference values in the Adiff array as follows: examine the value of each pixel in the Adiff array in turn; if the values of the pixels adjacent to it are all 255, set the value of this pixel to 255, otherwise set it to 0;
Step (10): process the difference values in the Adiff array as follows: examine the value of each pixel in the Adiff array in turn; if the values of the pixels adjacent to it are all 0, set the value of this pixel to 0, otherwise set it to 255;
Step (11): calculate the percentage of elements in the Adiff array whose value is 255; if this percentage is greater than or equal to 15%, send an alarm message to the user indicating that a foreign object has appeared in the monitored area, then jump to step (4); if it is less than 15%, jump directly to step (4).
CN2010105243603A 2010-10-29 2010-10-29 3G smart phone-based motion detection method Expired - Fee Related CN101980300B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105243603A CN101980300B (en) 2010-10-29 2010-10-29 3G smart phone-based motion detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010105243603A CN101980300B (en) 2010-10-29 2010-10-29 3G smart phone-based motion detection method

Publications (2)

Publication Number Publication Date
CN101980300A CN101980300A (en) 2011-02-23
CN101980300B true CN101980300B (en) 2012-07-04

Family

ID=43600801

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010105243603A Expired - Fee Related CN101980300B (en) 2010-10-29 2010-10-29 3G smart phone-based motion detection method

Country Status (1)

Country Link
CN (1) CN101980300B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3228638B2 (en) * 1994-06-08 2001-11-12 松下電器産業株式会社 Moving object detection method using background difference method
CN100502463C (en) * 2005-12-14 2009-06-17 浙江工业大学 Method for collecting characteristics in telecommunication flow information video detection
JP2008257626A (en) * 2007-04-09 2008-10-23 Victor Co Of Japan Ltd Motion detector
CN101221663A (en) * 2008-01-18 2008-07-16 电子科技大学中山学院 Intelligent monitoring and alarming method based on movement object detection

Also Published As

Publication number Publication date
CN101980300A (en) 2011-02-23

Similar Documents

Publication Publication Date Title
CN110136183B (en) Image processing method and device and camera device
WO2021036636A1 (en) Vibration detection method and apparatus for lifting device, server and storage medium
CN102395043B (en) Video quality diagnosing method
CN107621932B (en) Local amplification method and device for display image
CN105227843A (en) The filming control method of terminal, the imaging control device of terminal and terminal
CN104077785A (en) Moving object detecting device, moving object detecting method, and computer program
CN110602488A (en) Day and night type camera device switching abnormity detection method and device and camera device
US11107237B2 (en) Image foreground detection apparatus and method and electronic device
CN112312001B (en) Image detection method, device, equipment and computer storage medium
CN105828065A (en) Method and device for detecting video picture overexposure
CN101179725A (en) Motion detecting method and apparatus
US20150187051A1 (en) Method and apparatus for estimating image noise
KR20220151583A (en) Image processing method and device, electronic equipment and medium
CN113344906B (en) Camera evaluation method and device in vehicle-road cooperation, road side equipment and cloud control platform
CN111145151A (en) Motion area determination method and electronic equipment
Bohr et al. A no reference image blur detection using cumulative probability blur detection (cpbd) metric
CN114445663A (en) Method, apparatus and computer program product for detecting challenge samples
CN113888509A (en) Method, device and equipment for evaluating image definition and storage medium
CN114037087A (en) Model training method and device, depth prediction method and device, equipment and medium
CN104243967A (en) Image detection method and device
CN101980300B (en) 3G smart phone-based motion detection method
KR20220151130A (en) Image processing method and device, electronic equipment and medium
CN113807209B (en) Parking space detection method and device, electronic equipment and storage medium
CN113628192B (en) Image blur detection method, apparatus, device, storage medium, and program product
CN112949490A (en) Device action detection method and device, electronic device and readable storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120704

Termination date: 20151029

EXPY Termination of patent right or utility model