CN102737225A - Image monitoring system and person number calculating method - Google Patents


Info

Publication number
CN102737225A
CN102737225A (application number CN201210034919A)
Authority
CN
China
Prior art keywords
time
space
histogram
estimate
calculate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012100349193A
Other languages
Chinese (zh)
Inventor
伊藤诚也
李媛
须田安博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd
Publication of CN102737225A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Alarm Systems (AREA)

Abstract

The invention discloses an image monitoring system capable of calculating the number of persons in a captured image, and a corresponding person number calculating method. The method comprises: calculating direction-related features from the image data of image signals acquired from an imaging device, and computing a histogram of those features within a space-time volume of specified size to generate a space-time histogram; calculating from the space-time histogram a space-time evaluation value that expresses the complexity of motion, and computing the change of this evaluation value over time; and judging from that temporal change whether the number of persons in the image, or in a specified region, is at least a specified number of two or more.

Description

Image monitoring system and person number calculating method
Technical field
The present invention relates to an image monitoring system that records image data from images obtained from imaging devices such as video cameras, and that can realize functions such as detecting intruders through the image recognition functions of a monitoring apparatus or image recording apparatus, or detecting approaching persons when mounted on a mobile robot or the like. In particular, the present invention relates to an image monitoring system and person number calculating method suited to estimating the number of persons appearing in the camera view.
Background technology
Image monitoring systems are known that apply image processing to images obtained from imaging devices such as video cameras in order to detect moving objects, such as people and vehicles, appearing in a monitored area. Such systems use the detection result to, for example, record only the images in which a moving object appears, display a warning icon on a display device, or sound a buzzer or other alarm to attract the attention of monitoring personnel. They help lighten the burden of monitoring work that previously required constant visual confirmation. In addition, when a theft or other criminal or deviant act has occurred, the images recorded by such a system can assist in the subsequent investigation.
In recent years, factors such as the diversification of crime, the increase in the number of cases and the decline in the case-solving rate have raised crime-prevention awareness at retail stores, financial institutions, office buildings and similar places, and the introduction of image monitoring systems has advanced further. With the spread of large-capacity image recording devices, network cameras and the like, cameras are installed in a wide variety of locations and their number continues to grow. As noted above, because monitoring personnel must expend enormous labor visually searching recorded images for criminal acts and the like, functions that can support monitoring work are strongly desired. For example, Patent Document 1 discloses a technique that uses space-time richness to detect moving bodies in a camera image.
In addition, as safety awareness rises, demand has also grown for combined systems of access management and image monitoring. Such systems, however, suffer from the problem of so-called tailgating (Tail Gating), in which a person without authority enters by following behind a person with authority. Administrators who have introduced such systems are eager for applied technology that can prevent unauthorized persons from entering a security zone.
To address this problem, the prior art includes methods that detect multiple persons from the size of the detected person region. In addition, Patent Document 2 discloses a method that detects faces by image recognition and calculates the number of persons from the number of faces.
Prior art documents
Patent documents
Patent Document 1: Japanese Laid-Open Patent Publication No. 2010-204860
Patent Document 2: Japanese Laid-Open Patent Publication No. 2008-40828
Summary of the invention
However, the technique disclosed in Patent Document 1 can only detect the presence or absence of a moving body and cannot estimate the number of persons. Patent Document 2 can estimate the number of persons, but because the image processing apparatus disclosed there presupposes that the camera captures faces, it depends heavily on how the camera is installed. If the camera is mounted to shoot a person downward from above, the face detection function cannot be expected to work at all.
To solve this, one could consider, for example, measuring distance with a stereo camera or using other sensors, but this would raise the cost of the entire system. Most users want the counting function realized on the basis of their existing cameras, so it is necessary to satisfy this requirement.
An object of the present invention is therefore to realize an image monitoring system and person number calculating method that can estimate, from the camera image alone, the number of persons present in the captured image, with these functions built into the camera.
Problems other than the above will be explained in the present specification and drawings.
Solution
To solve the above problems, the present invention provides an image monitoring system and person number calculating method that: calculates direction-related features from image data based on image signals obtained from an imaging device; computes, within a space-time volume of prescribed size, a histogram of those direction-related features to generate a space-time histogram; calculates from the space-time histogram a space-time evaluation value expressing the complexity of motion, together with the change of that evaluation value over time; and estimates from that temporal change whether the number of persons present in the image, or in a specific region, is at least a prescribed number of two or more.
The above structure is only an example, and the present invention can be suitably modified within a scope that does not depart from its technical idea. Structures other than the above will be explained in the present specification and drawings.
Effects of the invention
According to the present invention, an image monitoring system and person number calculating method can be realized that obtain from the camera image a space-time evaluation value, for example the space-time richness, and, by judging the change of this evaluation value over time, estimate the number of persons present in the image, with these functions built into the camera.
Other effects of the present invention will be explained in the specification.
Description of drawings
Fig. 1 is a block diagram of an image monitoring system according to an embodiment of the present invention.
Fig. 2 is a conceptual diagram of the processing of the present invention.
Fig. 3 is a processing block diagram of the motion information calculating section.
Fig. 4 is an explanatory diagram of orientation coding: Fig. 4(a) shows the original image; Fig. 4(b) shows the image after edge enhancement filtering; Fig. 4(c) shows the brightness gradient directions of the image of Fig. 4(b); Fig. 4(d) shows the state after direction codes have been assigned to the gradient directions of Fig. 4(c).
Fig. 5 is a flowchart showing an example of the processing in the space-time histogram generating section.
Fig. 6 is an explanatory diagram of the space-time volume: Fig. 6(a) is a conceptual diagram of the volume, and Fig. 6(b) is an example space-time histogram for the volume shown in Fig. 6(a).
Fig. 7 shows an example of the processing flow of the temporal and spatial change calculating section.
Fig. 8 shows the relation between the space-time histogram and the number of persons.
Fig. 9 shows an example of the processing flow of the person counting section.
Fig. 10 is an example settings screen of the present embodiment.
Fig. 11 is an example screen at alarm output in the present embodiment.
Embodiment
Embodiments of the invention are described below with reference to the drawings. In the drawings, identical or similar structures are denoted by the same reference symbols and duplicate explanation is omitted.
Fig. 1 is a block diagram of an image monitoring system according to an embodiment of the present invention.
This image monitoring system has a camera 10, an image recognition section 20, a judgment section 30 and an output section 40. It is implemented as an electronic computer system whose hardware comprises a CPU, memory, I/O and so on; the various functional sections shown in the block diagrams are realized by software installed in executable form.
The camera 10 is an imaging device comprising a lens with zoom function and an imaging element (not shown) such as a complementary metal oxide semiconductor (CMOS) sensor or charge-coupled device (CCD). The camera 10 outputs the obtained image signal to the image recognition section 20 described later. The camera 10 may also be a pan-tilt-zoom camera mounted on a platform capable of panning and tilting motions.
The judgment section 30 decides, from the result output by the image recognition section 20, what the output section 40 should output. In the judgment section 30, judgment processing is carried out, for example judging from the result output by the image recognition section 20 whether an alarm should be output or an operation signal should be sent to peripheral equipment. The judgment section 30 then performs control according to the judgment result, for example outputting an alarm to peripheral equipment.
The output section 40 is a display device such as a liquid crystal display or cathode-ray tube (CRT) display. Its RGB (Red-Green-Blue) monitor output may be replaced by, for example, data output via a network.
Parameters used in the image recognition section 20, the judgment section 30 and so on are set through a user interface (not shown). The user interface of the image monitoring system of the present embodiment comprises input devices (not shown) such as a mouse and keyboard, and accepts parameters and the like input by the user.
The image recognition section 20 is now described in detail.
The image recognition section 20 has: a motion information calculating section 100, which calculates moving areas in the image from the image signal (or image data) obtained by the camera 10; a space-time histogram generating section 101, which likewise generates a space-time histogram from the image signal (or image data) obtained by the camera 10; a temporal and spatial change calculating section 102, which calculates from the space-time histogram generated by section 101 a space-time evaluation value expressing the complexity of motion, together with the change of this evaluation value over time; and a person counting section 103, which estimates the number of persons from the temporal change of the space-time evaluation value within the moving area. The output of the person counting section 103 is input to the judgment section 30.
In this embodiment, the counting performed in the person counting section 103 is not a simple judgment of whether a person is present (0 persons, or 1 or more); the section 103 has the function of estimating whether the number of persons present in the image, or in a prescribed region, is at least a prescribed number of two or more. The motion information calculating section 100 is not essential: the person counting section 103 may also estimate the number of persons without using moving-area information, for example from the temporal change of the space-time evaluation value over the whole screen.
Direction-related features are calculated from the image data, and a histogram of those features is computed within a space-time volume of prescribed size, yielding the space-time histogram. As the direction-related feature, for example a direction code or the direction of motion may be used; in this embodiment the case of using direction codes (orientation codes) is described as an example.
As the space-time evaluation value, for example the dispersion of the space-time histogram or the space-time richness may be used; in the present embodiment the case of using the space-time richness is described as an example.
Followingly the flow process of image recognition part 20 is described with reference to Fig. 2.Fig. 2 (A) representes input picture, from the input picture of Fig. 2 (A), extracts personage's moving area through mobile message calculating section 100.The result is extracted in Fig. 2 (B) expression, and the C1 among Fig. 2 (B) representes people's object area, and the C2 among Fig. 2 (B) representes personage's moving area, and this zone is the zone circumscribed with personage's zone C 1.Fig. 2 (C) expression space-time abundant intensity result of calculation over time, C3 representes the distribution over time of space-time abundant intensity.The extraction result of the space-time abundant intensity distribution C3 over time of Fig. 2 (C) in the zone of the moving area C2 of Fig. 2 (D) presentation graphs 2 (B) calculates that in number this information of use is come the reckoning number in the part 103.
Below the processing of the various piece in the image recognition part 20 is elaborated.
Fig. 3 representes the inter-process of mobile message calculating section 100.
The motion information calculating section 100 has a frame difference section 200, a reference image generating section 201, a reference image 202 and a labeling section 203. It computes, from the image data, the moving-body region (moving area) appearing in the image. Because the purpose of the section 100 is to compute moving areas, the computation method is not limited to the frame difference method described below; moving areas may also be computed by image processing methods such as histogram matching or optical flow.
First, the image signal obtained from the camera 10 is converted into image data suited to image recognition processing and image recording; the image data is obtained as a one-dimensional or two-dimensional array. To reduce the influence of noise, vibration and the like, preprocessing such as smoothing filtering, edge enhancement filtering and density conversion may be applied to the image data in advance. A data format such as RGB color or monochrome may be selected according to the purpose, and the image data may be compressed to a prescribed size to reduce processing cost.
The processing in the frame difference section 200 is described next. In this embodiment the frame difference method, a commonly used method, is used to calculate motion information. The frame difference method obtains the difference between the image data of a certain frame and that of a temporally earlier frame, thereby detecting image changes occurring within a short time.
Before the frame difference section 200 operates, the reference image generating section 201 first generates a reference image 202 from the image data up to that point (or the reference image 202 is generated in advance). Here, to lighten the processing burden, the image of the previous frame is used as the reference image 202. Since candidate regions of moving objects, noise and the like can be detected by the frame difference, this also helps exclude regions that are clearly background. With the input image denoted I_xy and the reference image 202 denoted B_xy, the frame difference Sub_xy is expressed as Sub_xy = |B_xy − I_xy|.
Here x and y denote the pixel position in the image. The frame difference Sub_xy is expressed, for example, as a difference of luminance values; Sub_xy may also be thresholded with a threshold Γ_sub and expressed as a binary image.
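As a rough illustration, the frame difference Sub_xy = |B_xy − I_xy| and its optional binarization with the threshold Γ_sub can be sketched in a few lines of NumPy; the function name and the toy frames are assumptions for illustration, not from the patent:

```python
import numpy as np

def frame_difference(current, reference, threshold=None):
    """Per-pixel absolute difference Sub_xy = |B_xy - I_xy|.

    If a threshold (Gamma_sub in the text) is given, a binary change
    mask is returned instead of the raw difference."""
    # Cast to a signed type first so uint8 subtraction cannot wrap around.
    sub = np.abs(reference.astype(np.int32) - current.astype(np.int32))
    if threshold is not None:
        return (sub > threshold).astype(np.uint8)
    return sub

# Toy 4x4 frames: a single pixel brightens between frames.
prev_frame = np.zeros((4, 4), dtype=np.uint8)
curr_frame = prev_frame.copy()
curr_frame[1, 2] = 200
mask = frame_difference(curr_frame, prev_frame, threshold=50)
```

Using the previous frame as the reference image, as the text suggests, keeps memory and computation minimal at the cost of missing very slow motion.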
The labeling section 203 is described next. The labeling section 203 performs labeling of the frame difference Sub_xy on a per-object basis, and computes the circumscribed rectangular region, its area and so on. This circumscribed rectangular region is the moving area denoted C2 in Fig. 2(B). Before labeling, performance may be improved by connecting the moving area with filtering, that is, dilation and erosion of the binary image.
The motion information including the moving-area information thus calculated is output to the person counting section 103, which uses the moving area in its processing and outputs the number of persons within it.
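The labeling step (per-object connected components plus circumscribed rectangles and areas) can be sketched with a simple 4-connected breadth-first search; a real system would more likely use a library routine, and all names here are illustrative:

```python
import numpy as np
from collections import deque

def label_regions(binary):
    """4-connected component labeling of a binary change mask.

    Returns the label map and, per label, the circumscribed rectangle
    (x0, y0, x1, y1), inclusive, together with the pixel area."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=np.int32)
    boxes = {}
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and labels[sy, sx] == 0:
                next_label += 1
                labels[sy, sx] = next_label
                q = deque([(sy, sx)])
                ys, xs = [sy], [sx]
                while q:  # flood-fill one component
                    y, x = q.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = next_label
                            q.append((ny, nx))
                            ys.append(ny)
                            xs.append(nx)
                boxes[next_label] = (min(xs), min(ys), max(xs), max(ys), len(xs))
    return labels, boxes

mask = np.array([[1, 1, 0, 0],
                 [1, 0, 0, 1],
                 [0, 0, 0, 1]], dtype=np.uint8)
labels, boxes = label_regions(mask)
```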
The space-time histogram generating section 101 and the temporal and spatial change calculating section 102 shown in Fig. 1 are described next. These processes have the function of calculating a space-time evaluation value, for example the space-time richness, and its change over time.
Fig. 4 is an explanatory diagram of orientation coding: Fig. 4(a) shows the original image; Fig. 4(b) shows the image after edge enhancement filtering; Fig. 4(c) shows the brightness gradient directions of the image of Fig. 4(b); Fig. 4(d) shows the state after direction codes have been assigned to the gradient directions of Fig. 4(c).
As shown in Fig. 4, orientation coding first computes the brightness gradient of the image and then encodes the gradient direction by quantizing it into prescribed directions. The method is detailed below.
Fig. 5 is a flowchart showing an example of the processing in the space-time histogram generating section 101.
First, for the input image I_xy at each pixel p(x, y), the edge gradients ΔIu in the horizontal direction and ΔIv in the vertical direction are calculated (step S1). An edge enhancement filter is used in calculating the edge gradients ΔIu and ΔIv. When a Sobel filter is used as the edge enhancement filter, its horizontal-direction coefficients FLTu and vertical-direction coefficients FLTv are as shown in formula (1). Another edge enhancement filter, such as a Prewitt filter, may be used instead of the Sobel filter.
FLTu = | -1  0  1 |        FLTv = | -1 -2 -1 |
       | -2  0  2 |               |  0  0  0 |
       | -1  0  1 |               |  1  2  1 |   ... formula (1)
Next, from the edge gradients ΔIu and ΔIv calculated with the filters of formula (1), the edge strength ρ_xy is calculated using formula (2) (step S2).
ρ_xy = √(ΔIu² + ΔIv²)   ... formula (2)
It is then judged whether ρ_xy > Γ_ρ (step S3), where Γ_ρ denotes a preset threshold.
When ρ_xy > Γ_ρ (when the result of step S3 is Yes), that is, when the edge strength ρ_xy exceeds the prescribed threshold Γ_ρ, the edge direction θ_xy is calculated as described later (step S4), and the process proceeds to the next step (step S5).
When ρ_xy > Γ_ρ does not hold (when the result of step S3 is No), that is, when the edge strength ρ_xy is at or below the prescribed threshold Γ_ρ, the edge direction θ_xy is not calculated and the process proceeds to the next step (step S5).
In other words, where the edge strength ρ_xy is low, the influence of noise and the like can be large, so the preset threshold Γ_ρ is used and, by skipping step S4, no direction code is given to pixels at or below the threshold. For a pixel p(x, y) exceeding the threshold, the edge direction θ_xy is calculated in step S4 according to formula (3).
θ_xy = tan⁻¹(ΔIv / ΔIu)   ... formula (3)
Thereafter, from the calculated edge direction θ_xy, the direction code C_xy is calculated using formula (4) (step S5). When the process has passed through step S4, that is, when ρ_xy > Γ_ρ, the direction code C_xy is obtained as θ_xy / Δθ; when step S4 was not executed, that is, when ρ_xy > Γ_ρ does not hold, C_xy is N = 2π/Δθ.
According to a predetermined quantization number N, the full gradient-direction range of 2π is divided into segments of angle Δθ, and a direction code C_xy is assigned to each segment. For example, when the quantization number N is set to 16, the direction codes C_xy are {0, 1, …, 15}, as shown in Fig. 4(d). A pixel p(x, y) judged in step S3 to have an edge strength at or below the threshold Γ_ρ receives a direction code equal to the quantization number N; for N = 16, as above, C_xy = 16. A pixel given this code is a pixel with an invalid direction code. This completes the orientation coding of the image data.
C_xy = ⌊θ_xy / Δθ⌋  if ρ_xy > Γ_ρ;  C_xy = N = 2π/Δθ  otherwise   ... formula (4)
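Steps S1 to S5 above (Sobel gradients, edge strength, thresholding, quantization into N direction codes) can be sketched as follows. The helper names, the toy step-edge image and the threshold value are illustrative assumptions, and the hand-rolled correlation stands in for a real filtering routine:

```python
import numpy as np

# Formula (1): Sobel coefficients for the horizontal and vertical directions.
SOBEL_U = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_V = SOBEL_U.T

def correlate2d_valid(img, k):
    """Minimal 'valid' 2-D correlation, enough for a 3x3 kernel."""
    h, w = img.shape
    kh, kw = k.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(img[y:y + kh, x:x + kw] * k)
    return out

def orientation_codes(img, n_codes=16, gamma_rho=10.0):
    """Quantize the brightness-gradient direction into N codes (formula (4)).

    Pixels whose edge strength rho is at or below gamma_rho receive the
    invalid code N = 2*pi / delta_theta."""
    du = correlate2d_valid(img, SOBEL_U)
    dv = correlate2d_valid(img, SOBEL_V)
    rho = np.sqrt(du ** 2 + dv ** 2)            # formula (2)
    theta = np.arctan2(dv, du) % (2 * np.pi)    # formula (3), mapped to [0, 2pi)
    delta_theta = 2 * np.pi / n_codes
    codes = np.floor(theta / delta_theta).astype(np.int32)
    codes[rho <= gamma_rho] = n_codes           # invalid direction code
    return codes

img = np.zeros((5, 5))
img[:, 3:] = 100.0          # a vertical step edge
codes = orientation_codes(img)
```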
In step S6 it is judged whether steps S1 to S5 have been carried out for the whole image. When processing has not yet finished (result of step S6 is No), the process returns to step S1 and repeats for the remaining pixels while changing x and y; when processing has finished (result of step S6 is Yes), the process proceeds to step S7.
The method of calculating the space-time histogram P_xyt of step S7 is described below.
Fig. 6 is an explanatory diagram of the space-time volume: Fig. 6(a) is its conceptual diagram, and Fig. 6(b) is an example space-time histogram P_xyt for the volume shown in Fig. 6(a). As shown in Fig. 6(a), along the time direction t there exist M consecutive images, each consisting of an xy plane, from time point T−T_M up to time point T. In each image a planar region of size L × L is set. A rectangular region of prescribed size (L × L in the xy plane, M in the time direction) is thus set in space-time; this region is denoted the space-time volume S (not shown).
The space-time histogram P_xyt shown in Fig. 6(b) is obtained by processing this space-time volume S of prescribed size by the method described later: its vertical axis represents the occurrence frequency of the direction codes C_xy and its horizontal axis represents their values. In Fig. 6, i denotes the value of C_xy.
In step S7, from the direction code group C_xyt ∈ S already calculated within the space-time volume S of prescribed size (xy region L × L, time direction M), the space-time histogram P_xyt of direction-code occurrence frequencies is calculated. Here C_xyt denotes the direction code of pixel p(x, y) at time point t. In calculating P_xyt, first h_xyt, which expresses the occurrence frequency of the direction codes C_xyt, is calculated according to formula (5). In the formula, δ denotes the Kronecker delta: δ equals 1 when i and C_xyt are equal, and 0 otherwise.
h_xyt(i) = Σ_{(x,y,t)∈S} δ(i − C_xyt)   ... formula (5)
In formula (5), with the pixel p(x, y) at the center of the xy region, the occurrence frequency of the direction codes C_xyt is counted over the surrounding L × L region and, along the time direction, over the space-time volume S from the current time point T back to time point T−T_M.
Then, taking into account h_xyt(N), the count of invalid direction codes held by pixels whose edge strength ρ_xy is at or below the threshold Γ_ρ, as well as the size of the space-time volume S, the space-time histogram P_xyt expressed as relative frequencies is calculated according to formula (6).
P_xyt(i) = h_xyt(i) / (L² × M − h_xyt(N))   ... formula (6)
Since the space-time histogram P_xyt obtained by formula (6) is the histogram for pixel p(x, y) at time point t, P_xyt is calculated for the whole image by varying the pixel p(x, y) (this loop is not shown in step S7).
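Under the same assumptions (a stack of per-frame direction-code maps, with code N marking invalid weak-edge pixels), formulas (5) and (6) amount to a relative-frequency count over the L × L × M block; a minimal sketch with invented names:

```python
import numpy as np

def spacetime_histogram(code_volume, x, y, L, n_codes):
    """Relative-frequency histogram P_xyt of direction codes in the
    L x L x M space-time block centred at pixel (x, y), per formulas
    (5) and (6). code_volume has shape (M, H, W); the value n_codes
    marks pixels with an invalid direction code."""
    r = L // 2
    block = code_volume[:, max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
    h = np.bincount(block.ravel(), minlength=n_codes + 1)   # formula (5)
    valid = block.size - h[n_codes]        # L^2 * M - h_xyt(N) in formula (6)
    if valid == 0:
        return np.zeros(n_codes)
    return h[:n_codes] / valid             # formula (6)

# Toy volume: M = 2 frames of 5x5 codes, mostly invalid (code 16).
vol = np.full((2, 5, 5), 16, dtype=np.int64)
vol[0, 2, 2] = 3
vol[1, 2, 2] = 3
vol[0, 2, 3] = 5
P = spacetime_histogram(vol, x=2, y=2, L=3, n_codes=16)
```

Because the invalid count is subtracted from the denominator, the histogram still sums to 1 over the valid codes, as the relative-frequency form of formula (6) requires.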
The above explanation used direction codes as the direction-related feature for generating the space-time histogram. The present invention is not limited to this method: as the direction-related feature, the motion direction obtained by optical flow or the like may also be used, and a space-time histogram generated for that motion direction. Such a histogram is also called a histogram of flow (HOF). In that case the subsequent processing can be carried out by the same method.
The temporal and spatial change calculating section 102 of Fig. 1 is described next.
Fig. 7 shows an example of the processing flow of the temporal and spatial change calculating section 102. The section 102 uses the space-time histogram P_xyt obtained by the above procedure to obtain the space-time richness R_xyt, as an example of the space-time evaluation value, together with its change over time.
First, the space-time richness R_xyt is calculated from the space-time histogram P_xyt (step S11). Here the space-time entropy E_xyt is taken as the evaluation value of the histogram, and the space-time richness R_xyt is calculated taking the maximum entropy E_max into account. Formula (7) expresses the maximum entropy E_max; formula (8) expresses the space-time entropy E_xyt and the space-time richness R_xyt. In the formulas, α_e denotes a threshold weighting coefficient, set appropriately according to the characteristics of the image. The maximum entropy E_max corresponds to the case in which, in the space-time entropy E_xyt of formula (8), the space-time histogram P_xyt has probability 1/N for all i (the histogram is flat).
E_max = −Σ_{i=0}^{N−1} (1/N) log₂(1/N)   ... formula (7)
E_xyt = −Σ_{i=0}^{N−1} P_xyt(i) log₂ P_xyt(i)
R_xyt = (E_xyt − α_e·E_max) / (E_max − α_e·E_max)  if E_xyt ≥ α_e·E_max;  R_xyt = 0  otherwise   ... formula (8)
As shown in Fig. 8(a), in a space-time volume S containing only a few persons, the occurrence probability of specific direction codes is high in the space-time richness R_xyt at pixel position p(x, y); that is, the entropy value becomes small. This indicates a tendency for the person's edges to be biased toward one direction. On the other hand, as shown in Fig. 8(b), when more persons are present than in the few-person case, various direction codes are detected as the persons move, so the space-time richness becomes large; that is, the entropy value becomes large.
As for the change of the space-time richness over time (the temporal change of the space-time evaluation value), motion is more complex when many persons are present than when few are, so the temporal change of the space-time richness becomes large. Therefore, by examining this temporal change of the space-time richness, the number of persons can be estimated more accurately.
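Formulas (7) and (8), and the peaked-versus-flat intuition of Fig. 8, can be illustrated with a small sketch; the function name and the α_e value are assumptions:

```python
import numpy as np

def richness(P, n_codes=16, alpha_e=0.5):
    """Space-time richness R_xyt from a histogram P, per formulas (7), (8).

    alpha_e is the threshold weighting coefficient from the text; 0.5 is an
    arbitrary illustrative choice."""
    e_max = np.log2(n_codes)          # formula (7): -sum(1/N * log2(1/N)) = log2 N
    nz = P[P > 0]                     # skip zero bins (0 * log2 0 -> 0)
    e = -np.sum(nz * np.log2(nz))     # formula (8): space-time entropy
    if e >= alpha_e * e_max:
        return float((e - alpha_e * e_max) / (e_max - alpha_e * e_max))
    return 0.0

# Peaked histogram: one dominant edge direction (few persons) -> low entropy.
peaked = np.zeros(16)
peaked[3] = 1.0
# Flat histogram: many motion directions (many persons) -> maximum entropy.
flat = np.full(16, 1 / 16)
```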
Therefore, then obtain expression space-time abundant intensity space-time abundant intensity over time and change R SubxyAt first, (x obtains the space-time abundant intensity R of the previous time point (T-1) of calculating in advance in y) in pixel p Xyt(T-1) with the space-time abundant intensity R of current (time point T) Xyt(T) poor (the step S12) between.
Judge whether whole image have been carried out above-mentioned a series of processing (step S13).When being judged as also completion processing (when the judged result of step S13 is No), return step S1, repeatedly untreated pixel is handled, so that accomplish the processing in the whole xy space of view data.Be judged as (when the judged result of step S13 is Yes) when whole image having been carried out processing, carrying out next step Filtering Processing (step S14).
As shown in Formula (9), the filtering applies a Gaussian filter G to the difference calculated in step S12; that is, it is the convolution of a Gaussian filter function with the change in space-time richness. The filter window is preferably adjusted according to the imaging conditions of the image, to match a person's height. For example, by setting it to an elongated shape such as 7 × 15, information that better matches the person region can be obtained. Through the above processing, the temporal change of the space-time richness (the temporal change of the space-time evaluation value) is calculated.
R_subxy = G ∗ (R_xyt(T) − R_xyt(T−1)) … Formula (9)
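Steps S12 and S14 — the frame difference of Formula (9) and its Gaussian filtering — can be sketched with NumPy as below. The separable implementation, the sigma choice (window/4), and the 'same' border handling are assumptions of this sketch, while the elongated 7 × 15 window follows the example in the text:

```python
import numpy as np

def gaussian_kernel1d(size, sigma):
    """Sampled 1-D Gaussian, normalized so the taps sum to 1."""
    x = np.arange(size) - (size - 1) / 2.0
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

def richness_change(r_now, r_prev, win_w=7, win_h=15):
    """Formula (9): R_subxy = G * (R_xyt(T) - R_xyt(T-1)).

    The 7 x 15 window mirrors the elongated-window example in the text;
    sigma and border handling are assumptions of this sketch.
    """
    diff = r_now - r_prev  # step S12: difference between time points
    kx = gaussian_kernel1d(win_w, win_w / 4.0)  # horizontal taps
    ky = gaussian_kernel1d(win_h, win_h / 4.0)  # vertical taps
    # Separable 2-D Gaussian: filter each row, then each column.
    rows = np.apply_along_axis(lambda r: np.convolve(r, kx, mode="same"), 1, diff)
    return np.apply_along_axis(lambda c: np.convolve(c, ky, mode="same"), 0, rows)
```

The caller keeps R_xyt(T−1) from the previous pass over the image, exactly as step S12 assumes.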
In the above explanation, the space-time richness was described as one example of a space-time evaluation value used to calculate the temporal change. However, the present invention is not limited to this; statistics of the space-time histogram may also be used as the space-time evaluation value. As such statistics, besides the entropy-based measure, the variance, standard deviation, skewness, and kurtosis obtained from the moments of the histogram can be cited. One or more of these statistics may be used as the space-time evaluation value, and the temporal change of these histogram statistics may be calculated as the temporal change of the space-time evaluation value. In that case, apart from the content of the temporal change of the space-time evaluation value, the processing in the person number estimating part 103 described below remains essentially the same.
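The moment-based alternatives mentioned above (variance, standard deviation, skewness, kurtosis) can be computed from a space-time histogram as in this sketch. Treating the bin index as the variable and the counts as frequencies is an assumed encoding, since the excerpt does not fix one:

```python
import math

def histogram_statistics(hist):
    """Moment-based statistics of a space-time histogram, usable as
    alternative space-time evaluation values.

    Assumes the bin index is the variable and the counts are frequencies.
    """
    total = sum(hist)
    if total == 0:
        return {"variance": 0.0, "std": 0.0, "skewness": 0.0, "kurtosis": 0.0}
    probs = [c / total for c in hist]
    mean = sum(i * p for i, p in enumerate(probs))
    var = sum(p * (i - mean) ** 2 for i, p in enumerate(probs))
    std = math.sqrt(var)
    if std == 0.0:
        skew = kurt = 0.0  # degenerate one-bin distribution
    else:
        skew = sum(p * ((i - mean) / std) ** 3 for i, p in enumerate(probs))
        # Excess kurtosis (Fisher convention): 0 for a normal distribution.
        kurt = sum(p * ((i - mean) / std) ** 4 for i, p in enumerate(probs)) - 3.0
    return {"variance": var, "std": std, "skewness": skew, "kurtosis": kurt}
```

A symmetric histogram gives zero skewness; a histogram concentrated in one bin gives zero variance, matching the low-entropy case of Fig. 8(a).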
Next, the person number estimating part 103 shown in Fig. 1 will be described. In this processing, Fig. 2(D) is obtained from Fig. 2(B) and Fig. 2(C), and the number of persons is thereby estimated. Fig. 9 shows an example of the processing flow of the person number estimating part 103.
From the moving area obtained in the movement information calculating section 100 (corresponding to C2 in Fig. 2(B)) and the temporal change of the space-time evaluation value obtained in the time and spatial variation calculating section 102 (the case of using the space-time richness change R_subxy is described here as an example), the mean value R_ave of the space-time richness change R_subxy within the moving area is obtained (step S21).
Then, the obtained mean value R_ave is compared with a threshold thr. If R_ave is at or above the threshold thr (step S22), it is judged that the number of persons is at or above the prescribed number, and data on the estimated number of persons is output; alternatively, whether the count is at or above the prescribed number is output according to the comparison result. A plurality of thresholds may also be set, and the number of persons estimated in a plurality of stages. Here, the prescribed number and the threshold thr can be decided through user settings; when judging whether one person or two or more persons are present, the threshold thr can be set accordingly. The setting method is explained in a later part.
When there are a plurality of moving areas, the above processing is repeated for all of them. Whether all moving areas have been processed is judged (step S33), and when they have, the processing ends.
In Fig. 9, the case of using moving areas was described as an example, but it is also possible to process the entire area of the picture without using moving areas.
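Steps S21 and S22 of the Fig. 9 flow reduce to a mean-and-threshold test, sketched below. The function name, the placeholder threshold value, and the `None` convention for whole-frame processing are illustrative assumptions:

```python
import numpy as np

def estimate_headcount(r_subxy, moving_mask=None, thr=0.15):
    """Steps S21-S22 of the Fig. 9 flow, as a sketch.

    r_subxy     -- temporal change of the space-time evaluation value
    moving_mask -- boolean array marking one moving area; None evaluates
                   the whole frame, as the text also permits
    thr         -- threshold thr; 0.15 is a placeholder, since the patent
                   leaves the value to user configuration
    Returns True when the estimate is at or above the prescribed number.
    """
    region = r_subxy.ravel() if moving_mask is None else r_subxy[moving_mask]
    if region.size == 0:
        return False              # no pixels to judge
    r_ave = float(region.mean())  # step S21: mean within the moving area
    return r_ave >= thr           # step S22: compare with threshold
```

With a plurality of moving areas, the caller would loop over the masks, matching the repetition described for step S33.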
Next, the judgment part 30, which judges the result of the image recognition part 20, will be described. Figure 10 shows an example of the settings screen for the conditions used in the judgment part 30. For example, when an alarm is to be output when the count is at or above a prescribed number, it suffices to set the prescribed number in C4 of Figure 10. In that case, the above threshold thr is changed according to the set number. In addition, as alarm-action settings for when an alarm is raised, settings such as whether to output an alarm and whether to lock a door can be made in C5 of Figure 10. Furthermore, when linked with a room entry/exit management system, operation such as locking the door when two or more persons are present becomes possible, which can prevent tailgating.
In addition, by setting the alarm-output number of the present embodiment to, for example, 10 persons, the system can also be used as a device that outputs an alarm when a large number of persons appear.
When the alarm setting in C5 of Figure 10 is ON, a warning screen can be output on the display, as shown in Figure 11.
According to the present invention, an image monitoring system and a person number estimating method can be realized that obtain a space-time evaluation value, such as the space-time richness, from a camera image and, by judging the temporal change of this space-time evaluation value, estimate the number of persons present in the image; these functions can also be built into the camera.
The present invention has been described above on the basis of embodiments, but the structures illustrated in the embodiments are merely examples, and the present invention can be suitably modified within a scope that does not depart from its technical concept. In addition, the structures illustrated in the embodiments can be used in combination as long as they do not contradict one another.
Symbol description
10 video camera (imaging device)
20 image recognition part
30 judgment part
40 output part
100 movement information calculating section
101 space-time histogram generating part
102 time and spatial variation calculating section
103 person number estimating part
200 frame difference part
201 reference image generating part
202 reference image
203 labeling part

Claims (9)

1. An image monitoring system, characterized by comprising:
a space-time histogram generating part that calculates a direction-related feature from image data based on an image signal obtained from an imaging device, calculates a histogram of the direction-related feature within a space-time region of a prescribed size, and thereby generates a space-time histogram;
a time and spatial variation calculating part that calculates, from the space-time histogram, a space-time evaluation value representing the complexity of movement, and calculates the temporal change of the space-time evaluation value; and
a person number estimating part that estimates, from the temporal change of the space-time evaluation value, whether the number of persons present in the image or in a specific region is at or above a prescribed number of two or more persons.
2. The image monitoring system according to claim 1, characterized in that
it further comprises a movement information calculating section that calculates a moving area from the image data, and
the person number estimating part estimates, from the temporal change of the space-time evaluation value within the moving area, whether the number of persons present in the moving area is at or above a prescribed number of two or more persons.
3. The image monitoring system according to claim 1 or 2, characterized in that
the time and spatial variation calculating part calculates a space-time richness from the entropy of the space-time histogram, and uses this space-time richness as the space-time evaluation value.
4. The image monitoring system according to claim 1 or 2, characterized in that
the time and spatial variation calculating part calculates one or more of the variance, standard deviation, skewness, and kurtosis of the space-time histogram, and uses it as the space-time evaluation value.
5. The image monitoring system according to claim 1 or 2, characterized in that
the space-time histogram generating part calculates, from the image data, a direction code obtained by quantizing the brightness gradient direction as the direction-related feature, and generates the space-time histogram representing the occurrence frequency of the direction code within the space-time region.
6. The image monitoring system according to claim 1 or 2, characterized in that
the space-time histogram generating part calculates a movement direction from the image data as the direction-related feature, and generates the space-time histogram representing the occurrence frequency of the movement direction within the space-time region.
7. The image monitoring system according to claim 1 or 2, characterized in that
the person number estimating part estimates, from the temporal change of the space-time evaluation value, whether the number of persons present in the image or in a specific region is one person or two or more persons.
8. The image monitoring system according to claim 1 or 2, characterized in that
it further comprises a judgment part that outputs an alarm when the number of persons estimated by the person number estimating part is at or above the prescribed number.
9. A person number estimating method, characterized by:
calculating a direction-related feature from image data based on an image signal obtained from an imaging device, calculating a histogram of the direction-related feature within a space-time region of a prescribed size, and thereby generating a space-time histogram;
calculating, from the space-time histogram, a space-time evaluation value representing the complexity of movement, and calculating the temporal change of the space-time evaluation value; and
estimating, from the temporal change of the space-time evaluation value, whether the number of persons present in the image or in a specific region is at or above a prescribed number of two or more persons.
CN2012100349193A 2011-04-12 2012-02-16 Image monitoring system and person number calculating method Pending CN102737225A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011087852A JP2012221331A (en) 2011-04-12 2011-04-12 Video monitoring system and number of persons estimating method
JP2011-087852 2011-04-12

Publications (1)

Publication Number Publication Date
CN102737225A true CN102737225A (en) 2012-10-17

Family

ID=46992698

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012100349193A Pending CN102737225A (en) 2011-04-12 2012-02-16 Image monitoring system and person number calculating method

Country Status (2)

Country Link
JP (1) JP2012221331A (en)
CN (1) CN102737225A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106339769A (en) * 2015-07-08 2017-01-18 北京大学 User travel forecasting method for mobile social network

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6885665B2 (en) * 2015-08-18 2021-06-16 株式会社ユニバーサルエンターテインメント Information processing device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101303735A (en) * 2007-05-03 2008-11-12 索尼德国有限责任公司 Method for detecting moving objects in a blind spot region of a vehicle and blind spot detection device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101303735A (en) * 2007-05-03 2008-11-12 索尼德国有限责任公司 Method for detecting moving objects in a blind spot region of a vehicle and blind spot detection device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SEIYA ITO: "person detection based on [outside] three persons and the degree of abundant [between space-time] and measurement", 《INSTITUTE OF ELECTRICAL ENGINEERS OF JAPAN》 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106339769A (en) * 2015-07-08 2017-01-18 北京大学 User travel forecasting method for mobile social network

Also Published As

Publication number Publication date
JP2012221331A (en) 2012-11-12

Similar Documents

Publication Publication Date Title
US11341669B2 (en) People flow analysis apparatus, people flow analysis system, people flow analysis method, and non-transitory computer readable medium
US8351662B2 (en) System and method for face verification using video sequence
US8712149B2 (en) Apparatus and method for foreground detection
JP4764487B2 (en) Video surveillance system
CN105354563A (en) Depth and color image combined human face shielding detection early-warning device and implementation method
CN108304816B (en) Identity recognition method and device, storage medium and electronic equipment
CN111654700B (en) Privacy mask processing method and device, electronic equipment and monitoring system
CN111783665A (en) Action recognition method and device, storage medium and electronic equipment
JP3486229B2 (en) Image change detection device
US20230049656A1 (en) Method of processing image, electronic device, and medium
Wang et al. Early smoke detection in video using swaying and diffusion feature
CN112560683A (en) Method and device for identifying copied image, computer equipment and storage medium
CN108460319B (en) Abnormal face detection method and device
CN107301373B (en) Data processing method, device and storage medium
CN102737225A (en) Image monitoring system and person number calculating method
US10783365B2 (en) Image processing device and image processing system
CN115984973B (en) Human body abnormal behavior monitoring method for peeping-preventing screen
US10916016B2 (en) Image processing apparatus and method and monitoring system
CN115423795A (en) Static frame detection method, electronic device and storage medium
CN110363192A (en) Object image identification system and object image discrimination method
JP2012048691A (en) Image monitoring apparatus
Lausser et al. Detecting zebra crossings utilizing AdaBoost.
CN114898181A (en) Hidden danger violation identification method and device for explosion-related video
Rhee Gaussian filtering detection using band pass residual and contrast of forgery image
US12002195B2 (en) Computer vision-based anomaly detection method, device and electronic apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20121017