CN102760230B - Flame detection method based on multi-dimensional time domain characteristics

Flame detection method based on multi-dimensional time domain characteristics

Info

Publication number
CN102760230B
CN102760230B
Authority
CN
China
Prior art keywords
flame
frame
region
object region
match
Prior art date
Legal status
Expired - Fee Related
Application number
CN201210203504.4A
Other languages
Chinese (zh)
Other versions
CN102760230A (en)
Inventor
王岳环
宋萌萌
桑农
王军
顾舒航
江曼
朱秀峰
Current Assignee
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201210203504.4A priority Critical patent/CN102760230B/en
Publication of CN102760230A publication Critical patent/CN102760230A/en
Application granted granted Critical
Publication of CN102760230B publication Critical patent/CN102760230B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a flame detection method for near-infrared video images, belonging to the technical fields of pattern recognition and image processing. The method first tracks target regions by shortest-distance inter-frame matching to remove unstable targets such as noise. It then extracts multi-dimensional time-domain features from the stably tracked target regions, determines a confidence value for each target region iteratively from the extracted feature values, and uses this confidence to achieve rapid detection and identification of incipient flames. Finally, false alarms are removed based on the brightness-discontinuity characteristics of the flame edge and on its motion characteristics. The flame detection method disclosed by the invention has high sensitivity and a low false-alarm rate, and can rapidly and accurately discover incipient flames in a monitored scene from near-infrared images.

Description

Flame detection method based on multi-dimensional time-domain features
Technical field
The invention belongs to the field of computer vision and specifically relates to flame detection and identification by computer vision methods; it can provide reliable monitoring and early warning for fire-fighting systems.
Background technology
Traditional fire alarm systems based on smoke detectors have been widely applied in fire prevention and control because of their high sensitivity to smoke and low cost. However, owing to their working principle, they cannot raise an alarm directly on the flame itself; they can only give an indirect early warning through smoke, so the time needed before an alarm is long, which is unfavorable for discovering a fire early.
Computer vision mainly studies methods for obtaining information from image data. In a flame detection system based on video surveillance, computer vision methods can be used to analyze the video content, obtain a preliminary understanding of the monitored scene, and make a judgment by extracting visual information about the flame. At the same time, a fire alarm system based on video surveillance can obtain rich scene image data, provide the fire location and a preliminary estimate of the fire size in time, and deliver fire information at the earliest moment, reducing fire losses.
Flame detection belongs to the problem of detecting and identifying a specific target in the field of computer vision, and researchers have proposed detection algorithms based on different flame characteristics. The flame detection algorithms in practical use at present mainly include the following:
1) Flame detection based on color information
Color is an important piece of image information. By finding regions of a particular color in a color image, potential target regions can be located and flame detection realized; the distinctive color of flames has made this approach widely used. However, flame detection using color information also has some obvious shortcomings: it is, for example, susceptible to interference from targets of similar color. In addition, building a suitable color model for flames of different colors is another key limitation that restricts the application of color information to flame detection.
2) Flame detection based on shape information
Because the chemical reactions that produce flames are varied, and the flame is easily affected by air convection around the flame region, the shape of a flame is usually irregular. A common characteristic reflected in the flame outline, however, is its sharp corners. Many methods therefore use this characteristic to decide whether a moving region is a flame, but they are easily affected by imaging conditions and by the quality of moving-target extraction.
3) Flame detection based on flame flicker frequency
Because flame combustion is affected by the oxygen content of the air and by the burning material, a flame flickers. Extensive research has found the flame flicker frequency to be around 25 Hz, which differs considerably from other natural interference targets and from artificial light sources (50 Hz), so detecting flames by computing the flame frequency has become a widely accepted method.
Although researchers have proposed different flame detection algorithms, the shape of a flame varies widely, the color and brightness of flames produced by different combustibles differ greatly, and the detection backgrounds also differ, so it is currently difficult to find features that describe flames in images well. In particular, under near-infrared imaging conditions the image contains no color information, which makes most methods that rely on detecting regions of interest by color infeasible and increases the difficulty of flame detection.
Summary of the invention
The object of the invention is to propose a flame detection method for near-infrared scenes that overcomes the detection difficulty caused by the absence of color information in near-infrared scenes, and to reduce false alarms through false-alarm filtering.
A flame detection method based on multi-dimensional time-domain features comprises the following steps:
(1) extract a flame candidate region chain C from the current frame image t according to gray values;
(2) if the current frame t is the first frame of the video image sequence, the flame target region chain D_1 of the current frame t is the flame candidate region chain C; set the successful-match count of each flame target region in D_1 to 1 and go to the next frame; if the current frame t is not the first frame of the video image sequence, go to step (3);
(3) use the shortest-distance inter-frame matching algorithm to match the flame candidate region chain of the current frame t against the flame target region chain D_{t-1} of the previous frame t-1; according to the matching result, add 1 to the successful-match count of each flame target region of frame t-1 that matches successfully, add 1 to the successful-match count of each flame target region of frame t-1 that fails to match but whose confidence is greater than the first confidence threshold T_2, and set the successful-match count of each candidate flame region of frame t that is not matched successfully to 1; then add the successfully matched flame target regions of frame t-1, the flame target regions of frame t-1 that failed to match but whose confidence exceeds the first confidence threshold T_2, and the unmatched candidate flame regions of frame t to the flame target region chain D_t of the current frame t;
(4) compute the multi-dimensional time-domain features of the target regions:
(4.1) for each flame target region of the current frame t whose successful-match count is greater than 1, initialize the multi-dimensional feature mulDimen_i^t = 0;
(4.2) compute the inter-frame brightness change |A_i^t - A_i^{t-1}| of each flame target region, where A_i^t and A_i^{t-1} are the mean gray values of the flame pixels of target region i in the current frame t and the previous frame t-1 respectively; if |A_i^t - A_i^{t-1}| is greater than 3.5, the flame target region is considered to have undergone a brightness change from frame t-1 to frame t and mulDimen_i^t = mulDimen_i^t + 1 is updated; otherwise the value remains unchanged;
(4.3) compute the inter-frame area change ξ_i^t = S_∩ / S_∪ of each flame target region, where S_∩ and S_∪ are the intersection and the union of the flame foreground areas of the target region in frame t and frame t-1; if ξ_i^t indicates an area change (the overlap ratio falls below a preset value), the flame target region is considered to have undergone an area change from frame t-1 to frame t and mulDimen_i^t = mulDimen_i^t + 1 is updated; otherwise the value remains unchanged;
(4.4) compute the inter-frame displacement L_i^t = sqrt((x_t - x_{t-1})^2 + (y_t - y_{t-1})^2) of each flame target region, where (x_t, y_t) and (x_{t-1}, y_{t-1}) are the center-point coordinates of the flame target region in the current frame t and the previous frame t-1 respectively; if L_i^t is greater than or equal to 2, the flame target region is considered to have been displaced from frame t-1 to frame t and mulDimen_i^t = mulDimen_i^t + 1; otherwise the value remains unchanged;
(4.5) for each flame target region whose successful-match count is greater than or equal to 20, compute its multi-dimensional time-domain feature Eigen_i^t = Σ_{r=t-m}^{t} mulDimen_i^r, where mulDimen_i^r is the multi-dimensional feature of the flame target region at frame r and 20 ≤ m < the successful-match count of the flame target region;
(5) if the multi-dimensional time-domain feature value Eigen_i^t is greater than the feature threshold T_3, update the confidence of the flame target region as E_i^t = a × E_i^{t-1} + 1; otherwise update it as E_i^t = b × E_i^{t-1}, with 0.8 ≤ a ≤ 1 and 0.6 ≤ b ≤ 0.8;
(6) if the confidence E_i^t of a flame target region is greater than or equal to the second confidence threshold T_4, the flame target region is judged to be a flame region.
Further, the flame detection method of the invention also comprises the step of computing the luminance-difference change rate C_i^t = (A_i^t - a_i^t) / A_i^t of each flame region, where A_i^t and a_i^t are the mean gray value of the flame pixels and the mean gray value of the background pixels inside the bounding rectangle of the flame region respectively; if C_i^t is less than the brightness change-rate threshold T_5, the flame region is a false-alarm region.
Further, the method also comprises the step of using the frame-difference method to obtain the moving-target foreground region of the current frame t and then computing Cover_i^t = S_move / S, where S_move is the number of pixels belonging to both the flame region and the moving-target foreground region and S is the number of pixels of the flame region; if Cover_i^t is less than the motion-foreground coverage threshold T_7, the flame region is a false-alarm region.
The present invention has the following beneficial effects:
The invention applies the shortest-distance inter-frame matching algorithm before computing the multi-dimensional time-domain features, so the temporal and spatial information of targets can be obtained stably and unstable targets such as noise are removed. The multi-dimensional time-domain features of the target regions are extracted using criteria such as the brightness, area and displacement changes of the flame target, with low computational complexity; the confidence of each target region is determined iteratively from the extracted feature values, and rapid detection and identification of incipient flames is then achieved according to the target-region confidence.
Further, the method also proposes a flame false-alarm decision procedure. The spatial brightness discontinuity of the flame edge region and the motion characteristics of the flame edge are used to remove false alarms caused by projected bright spots from lights and similar sources. Since interference in near-infrared scenes comes mainly from lights, this false-alarm removal gives the method strong robustness in near-infrared scenes.
Brief description of the drawings
Fig. 1 is a flow chart of the method of the invention;
Fig. 2 is a flow chart of the method for obtaining the multi-dimensional time-domain features of a target region;
Fig. 3 is a flow chart of the flame-region false-alarm removal method.
Embodiment
The present invention is described below with reference to a specific embodiment.
As shown in Fig. 1, the specific steps of the method are as follows:
(1) Extract the flame candidate region chain
This step uses the gray value as the criterion for extracting flame: a gray threshold is determined from prior knowledge, pixels whose gray value is greater than this threshold are preliminarily regarded as flame candidate pixels (foreground points), and the flame candidate region chain is then obtained by connected component labeling.
For the t-th frame of the input video, the detection steps are as follows:
(1.1) Generate the flame region-of-interest foreground image
Traverse all pixels in the image and compare the gray value pixel(x, y) of each pixel with the gray threshold P. If it is greater than the threshold, the point is regarded as a foreground point of interest, i.e. a flame foreground point, and the corresponding pixel fore_t(x, y) in the foreground image of interest is set to 255; otherwise it is set to 0. The foreground image of interest at time t is thus obtained:
fore_t(x, y) = 255 if pixel(x, y) ≥ P, and fore_t(x, y) = 0 otherwise
P is the preset gray threshold, chosen manually according to the scene and the brightness of the flames to be detected; in a typical near-infrared scene inside a large enclosed warehouse, a gray threshold of 180~230 is selected.
(1.2) Filter the foreground image of interest
To eliminate isolated noise in the foreground image of interest fore_t(x, y) at time t and to smooth the target regions, a median filter is applied to fore_t in this example.
(1.3) Connected component labeling:
After filtering, the pixels of the binary image fore_t whose value is 255 and that lie in each other's 8-neighborhoods are marked with the same label; all pixels with the same label belong to the same connected component. The N_t connected components obtained in this way are saved as the flame candidate region chain C;
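As an illustration of steps (1.1)-(1.3), the following Python sketch (using OpenCV and NumPy; the threshold P, the median-filter kernel size and the returned record layout are illustrative assumptions, not values fixed by the patent) extracts the candidate region chain from one grayscale near-infrared frame:

import cv2
import numpy as np

def extract_candidate_regions(gray_frame, P=200, ksize=3):
    # (1.1) foreground of interest: pixels whose gray value is >= P become 255
    fore = np.where(gray_frame >= P, 255, 0).astype(np.uint8)
    # (1.2) median filtering removes isolated noise and smooths the regions
    fore = cv2.medianBlur(fore, ksize)
    # (1.3) 8-connected component labeling; each component is one candidate region
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(fore, connectivity=8)
    candidates = []
    for j in range(1, num):                       # label 0 is the background
        x, y, w, h, area = stats[j]
        candidates.append({"bbox": (int(x), int(y), int(w), int(h)),
                           "centroid": (float(centroids[j][0]), float(centroids[j][1]))})
    return fore, candidates                       # fore_t and the chain C

A typical call would be fore_t, C = extract_candidate_regions(gray, P=200), with P taken from the 180~230 range quoted above.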
(2) If the current frame t is the first frame of the video image sequence, the flame candidate region chain becomes the flame target region chain D_1 of the current frame, the matching count of each of these targets is set to 1, and detection proceeds to the next frame; otherwise, go to step (3);
(3) Use the shortest-distance inter-frame matching algorithm to match the candidate flame region chain C of the current frame t against the target region chain D_{t-1} of the previous frame t-1, and add the flame target regions of frame t-1 that match successfully, the flame target regions of frame t-1 that fail to match but whose confidence is greater than the confidence threshold T_2, and the unmatched candidate flame regions of frame t to the flame target region chain D_t of the current frame t;
For the i-th target region in the target region chain D_{t-1} of frame t-1, matching and updating are performed as follows:
(3.1) Predict the position of the target-region center point at frame t
Using the center-point coordinates X stored for the target region in previous frames, predict the center X'_t of the target region at time t (match is the matching count of the target region):
X'_t = X_{t-1} + (1 - 0.9) / (1 - 0.9^(match-1)) × Σ_{k=1}^{match-2} 0.9^(k-1) (X_{k+1} - X_k)
where X_{t-1} is the stored center-point coordinate of the target region at frame t-1, and X_{k+1} and X_k are the center-point coordinates of the target region at frames k+1 and k respectively; match is the stored matching count of the target region and does not exceed 100, since only the center-point coordinates of the most recent 100 frames are stored for each target region.
(3.2) Find the candidate region in the candidate region chain C whose center point is closest to the prediction
Traverse the candidate region chain C and find the candidate region C_j whose center point is at the minimum distance from the predicted value X'_t; the minimum distance is L:
L = min_j sqrt((X_j.x - X'_t.x)^2 + (X_j.y - X'_t.y)^2)
where X_j.x and X_j.y are the x and y coordinates of the center point X_j of candidate region C_j, and X'_t.x and X'_t.y are the x and y coordinates of the center point of the target region predicted for frame t.
Compare the minimum distance L with the distance threshold T_1. If L is less than the threshold, the information of C_j is used to update the target region: the center-point coordinates and the flame-foreground bounding-rectangle coordinates of the target region at frame t are set to the center point and the foreground bounding rectangle of C_j, and the matching count of the target region is increased by 1. If L is greater than the threshold, a confidence check is performed on the target region. The distance threshold T_1 is set to 5~20 pixels.
Compare the confidence of the target region with the first confidence threshold T_2. If it is greater than the threshold, the information of the target at frame t-1 is kept: the center-point coordinates and the foreground bounding-rectangle coordinates of the target region at frame t are set to its center-point coordinates and flame-foreground bounding-rectangle coordinates at frame t-1, and the matching count of the target region is increased by 1. If it is less than the threshold, the target region is deleted. The first confidence threshold T_2 is set to 0.5~4.
(3.3) Process the unmatched candidate regions
After operations (3.1) and (3.2) have been performed on all target regions of D_{t-1}, the candidate regions in the candidate region chain C that did not match any target region are added to the target region chain D_{t-1} as targets newly appearing at frame t, and the matching count of these newly added target regions is set to 1.
After the above operations are completed, the update of the target region chain D_{t-1} of frame t-1 is finished and it becomes the target region chain D_t of frame t.
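A minimal Python sketch of the shortest-distance inter-frame matching of step (3) follows. The target record fields (history, bbox, match_count, confidence) and the default values of T_1 and T_2 are illustrative assumptions within the ranges quoted above, and the weighted prediction gives the largest weights to the most recent displacements, which is one reading of the formula in (3.1):

import math

def predict_center(history, decay=0.9):
    # (3.1) predict X'_t from the stored center points (oldest first, newest last)
    match = len(history)
    if match < 3:
        return history[-1]
    norm = (1 - decay) / (1 - decay ** (match - 1))
    dx = dy = 0.0
    for k in range(1, match - 1):                 # k = 1 .. match-2, newest displacement first
        (x0, y0), (x1, y1) = history[-(k + 1)], history[-k]
        w = norm * decay ** (k - 1)
        dx += w * (x1 - x0)
        dy += w * (y1 - y0)
    xl, yl = history[-1]
    return (xl + dx, yl + dy)

def match_regions(prev_targets, candidates, T1=10.0, T2=2.0):
    matched = [False] * len(candidates)
    updated = []
    for tgt in prev_targets:
        px, py = predict_center(tgt["history"])
        # (3.2) candidate whose center point is nearest to the prediction
        dists = [math.hypot(c["centroid"][0] - px, c["centroid"][1] - py)
                 for c in candidates]
        j = min(range(len(dists)), key=dists.__getitem__) if dists else -1
        if j >= 0 and dists[j] < T1:              # successful match: update with C_j
            matched[j] = True
            tgt["history"] = (tgt["history"] + [candidates[j]["centroid"]])[-100:]
            tgt["bbox"] = candidates[j]["bbox"]
            tgt["match_count"] += 1
            updated.append(tgt)
        elif tgt["confidence"] > T2:              # keep a high-confidence target as-is
            tgt["history"] = (tgt["history"] + [tgt["history"][-1]])[-100:]
            tgt["match_count"] += 1
            updated.append(tgt)
        # otherwise the target region is deleted
    # (3.3) unmatched candidates become new targets with match_count = 1
    for j, c in enumerate(candidates):
        if not matched[j]:
            updated.append({"history": [c["centroid"]], "bbox": c["bbox"],
                            "match_count": 1, "confidence": 0.0, "mulDimen_hist": []})
    return updated                                # this is the chain D_t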
(4) Compute the multi-dimensional time-domain features of the target regions. For each target region of frame t: if its matching count equals 1, the multi-dimensional time-domain feature value is not computed for this target region; if its matching count is greater than 1, the multi-dimensional time-domain features of the target region are computed by the following feature extraction method (see Fig. 2):
(4.1) Initialize the multi-dimensional feature of the target region at frame t: mulDimen_i^t = 0
(4.2) Compute the inter-frame brightness change of the target region
For target region i, traverse its flame foreground points inside the target-region rectangles of frames t and t-1 and, using the video images of frames t and t-1, compute the mean gray value A_i^t of the foreground pixels at frame t and the mean gray value A_i^{t-1} of the foreground pixels at frame t-1:
A_i^t = Σ_{fore_t(x,y)=255, (x,y)∈B_i^t} pixel(x, y) / num(B_i^t),   A_i^{t-1} = Σ_{fore_{t-1}(x,y)=255, (x,y)∈B_i^{t-1}} pixel(x, y) / num(B_i^{t-1})
where pixel(x, y) is the gray value of the pixel at coordinates (x, y); fore_t(x, y) and fore_{t-1}(x, y) are the pixel values at coordinates (x, y) of the foreground images obtained by step (1) at frames t and t-1; B_i^t and B_i^{t-1} are the bounding-rectangle coordinates stored for target region i at frames t and t-1; num(B_i^t) is the number of flame foreground pixels inside the bounding rectangle B_i^t; and the condition fore_t(x, y) = 255, (x, y) ∈ B_i^t means that the pixel at (x, y) in the frame-t foreground image lies inside the bounding rectangle of the target region and is a foreground pixel. The inter-frame brightness change is |A_i^t - A_i^{t-1}|.
If |A_i^t - A_i^{t-1}| is greater than 3.5, the target is considered to have undergone a brightness change at frame t and mulDimen_i^t = mulDimen_i^t + 1; otherwise the value remains unchanged.
(4.3) Compute the inter-frame area change of the target region
For target region i, compute the foreground-area overlap ratio ξ_i^t between frames t and t-1, i.e. the ratio of the intersection of the foreground areas of frames t and t-1 to their union:
ξ_i^t = S_∩ / S_∪
where S_∩ is the intersection area of the foreground of the target region in frames t and t-1, and S_∪ is the union area of the foreground of the target region in frames t and t-1.
Expanding further gives the formula
ξ_i^t = Σ_{(x,y)∈(B_i^t ∪ B_i^{t-1})} [fore_t(x,y) = 255 and fore_{t-1}(x,y) = 255] / Σ_{(x,y)∈(B_i^t ∪ B_i^{t-1})} [fore_t(x,y) = 255 or fore_{t-1}(x,y) = 255]
where B_i^t and B_i^{t-1} are the stored bounding-rectangle coordinates of the foreground of target region i at frames t and t-1; B_i^t ∪ B_i^{t-1} is the region covering the foreground flame regions of the target in both frames; and fore_t(x, y) and fore_{t-1}(x, y) are the pixel values at coordinates (x, y) of the foreground images of frames t and t-1.
If ξ_i^t indicates that the area of the target changed at frame t (the overlap ratio falls below a preset value), then mulDimen_i^t = mulDimen_i^t + 1; otherwise the value remains unchanged.
(4.4) Compute the inter-frame displacement of the target region
For target i, the distance L_i^t between its center-point coordinates (x_t, y_t) at frame t and its center-point coordinates (x_{t-1}, y_{t-1}) at frame t-1 represents the displacement of the target region and is computed as follows:
L_i^t = sqrt((x_t - x_{t-1})^2 + (y_t - y_{t-1})^2)
If L_i^t is greater than or equal to 2, the target is considered to have moved at frame t and mulDimen_i^t = mulDimen_i^t + 1; otherwise the value remains unchanged.
(4.5) From the multi-dimensional feature values of the target region at each frame along the time axis, compute the multi-dimensional time-domain feature of the target region:
Eigen_i^t = Σ_{r=t-m}^{t} mulDimen_i^r
where mulDimen_i^r is the multi-dimensional feature of target region i at frame r computed by (4.2), (4.3) and (4.4), and m satisfies 20 ≤ m < the successful-match count of the flame target region.
To ensure the detection effect, m should be chosen greater than the number of video frames in one second. In this example m is chosen as 25~30 frames (the video stream runs at 25 frames per second).
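The per-frame feature of step (4) and its temporal sum can be sketched as below (Python with NumPy; the record field names, the 0.9 overlap threshold used to flag an area change, and the handling of empty regions are assumptions, since the text does not fix the overlap threshold explicitly):

import math
import numpy as np

def multidim_feature(tgt, gray_t, gray_t1, fore_t, fore_t1, m=25):
    mulDimen = 0
    (x, y, w, h), (xp, yp, wp, hp) = tgt["bbox"], tgt["prev_bbox"]
    rt  = (slice(y, y + h),    slice(x, x + w))
    rt1 = (slice(yp, yp + hp), slice(xp, xp + wp))
    # (4.2) inter-frame brightness change of the flame foreground pixels
    fg_t, fg_t1 = gray_t[rt][fore_t[rt] == 255], gray_t1[rt1][fore_t1[rt1] == 255]
    if fg_t.size and fg_t1.size and abs(float(fg_t.mean()) - float(fg_t1.mean())) > 3.5:
        mulDimen += 1
    # (4.3) inter-frame area change: overlap ratio inside the union of the two boxes
    x0, y0 = min(x, xp), min(y, yp)
    x1, y1 = max(x + w, xp + wp), max(y + h, yp + hp)
    win = (slice(y0, y1), slice(x0, x1))
    inter = np.logical_and(fore_t[win] == 255, fore_t1[win] == 255).sum()
    union = np.logical_or(fore_t[win] == 255, fore_t1[win] == 255).sum()
    if union > 0 and inter / union < 0.9:         # assumed overlap threshold
        mulDimen += 1
    # (4.4) inter-frame displacement of the region center point
    (cx, cy), (cxp, cyp) = tgt["center"], tgt["prev_center"]
    if math.hypot(cx - cxp, cy - cyp) >= 2:
        mulDimen += 1
    # (4.5) Eigen_i^t: sum of the per-frame features over the last m frames
    tgt["mulDimen_hist"].append(mulDimen)
    return sum(tgt["mulDimen_hist"][-m:])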
(5) Compute the target-region confidence
At frame t, compare the multi-dimensional time-domain feature value Eigen_i^t of the target with the feature threshold T_3. If it is greater than the threshold, the flame-target-region confidence is updated as E_i^t = a × E_i^{t-1} + 1; otherwise it is updated as E_i^t = b × E_i^{t-1}. T_3 ranges from m × 1 to m × 1.2, where m is the number of frames chosen in step (4.5) when computing the multi-dimensional time-domain feature; a ranges from 0.8 to 1 and b from 0.6 to 0.8. The confidence update rule is:
E_i^t = a × E_i^{t-1} + 1 if Eigen_i^t ≥ T_3;   E_i^t = b × E_i^{t-1} otherwise
(6) Judge whether the target region is a flame region according to its confidence
At frame t, compare the confidence E_i^t of the target region with the second confidence threshold T_4. If it is greater than or equal to the second confidence threshold T_4, the target region is judged to be a flame region; otherwise it is not a flame region. T_4 ranges from 7 to 13.
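Steps (5) and (6) reduce to the following few lines; a, b, T_3 and T_4 below are example values taken from the ranges quoted in the text (a = 0.9, b = 0.7, T_4 = 10, and T_3 = 1.1 × m for m = 25), not prescribed constants:

def update_confidence_and_decide(tgt, eigen, m=25, a=0.9, b=0.7, T4=10.0):
    T3 = 1.1 * m                                  # T_3 lies between m*1.0 and m*1.2
    if eigen >= T3:                               # step (5): iterative confidence update
        tgt["confidence"] = a * tgt["confidence"] + 1
    else:
        tgt["confidence"] = b * tgt["confidence"]
    return tgt["confidence"] >= T4                # step (6): True means "flame region"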
(7) Remove false-alarm targets and make the flame detection decision
Referring to Fig. 3, for each flame region determined in step (6), the difference between the mean gray value of the foreground points and the mean gray value of the background points inside the foreground bounding rectangle B_i^t of the target region is used to judge whether the target region is a false alarm. Compute the mean gray value A_i^t of the foreground pixels and the mean gray value a_i^t of the background pixels inside the bounding rectangle of the target region, and the luminance-difference change rate C_i^t:
A_i^t = Σ_{fore_t(x,y)=255, (x,y)∈B_i^t} pixel(x, y) / num_fore(x, y);   a_i^t = Σ_{fore_t(x,y)=0, (x,y)∈B_i^t} pixel(x, y) / num_back(x, y);   C_i^t = (A_i^t - a_i^t) / A_i^t
where pixel(x, y) is the gray value of the frame-t video image at coordinates (x, y); B_i^t is the stored bounding-rectangle coordinates of target region i at frame t; fore_t(x, y) is the pixel value at coordinates (x, y) of the frame-t foreground image; and num_fore(x, y) and num_back(x, y) are the numbers of foreground and background pixels inside the bounding rectangle of target region i.
Compare C_i^t with the brightness change-rate threshold T_5: if it is greater than the threshold, the target region is a flame region; otherwise it is a false-alarm region. T_5 ranges from 0.1 to 0.3.
For each flame region determined in step (6), the proportion of motion-foreground points among the flame foreground points of the target region is also used to decide whether it is a flame region.
The moving-target foreground region is obtained by the frame-difference method: the frame-t image and the frame-(t-1) image are differenced by traversing the pixels at the same coordinates in the two frames, Δpixel(x, y) = |pixel_t(x, y) - pixel_{t-1}(x, y)|, where pixel_t(x, y) and pixel_{t-1}(x, y) are the gray values of the pixel at coordinates (x, y) in frames t and t-1 of the input video stream. Δpixel(x, y) is compared with the threshold T_6: if it is greater than the threshold, the point is a motion foreground point and the pixel value of the frame-t motion foreground image is set to move_t(x, y) = 255; otherwise it is a background point and move_t(x, y) = 0. T_6 ranges from 10 to 20.
Traverse all foreground pixels inside the bounding rectangle of the target region, count the pixels that are simultaneously moving-target foreground points, count the pixels that are flame foreground points, and compute their ratio Cover_i^t:
Cover_i^t = Σ_{(x,y)∈B_i^t} [fore_t(x,y) = 255 and move_t(x,y) = 255] / Σ_{(x,y)∈B_i^t} [fore_t(x,y) = 255]
where fore_t(x, y) is the pixel value at coordinates (x, y) of the flame foreground image computed by step (1) for the current frame t; move_t(x, y) is the pixel value at coordinates (x, y) of the motion foreground image of the current frame t; and B_i^t is the stored bounding-rectangle coordinates of the target region at frame t.
Compare Cover_i^t with the motion-foreground coverage threshold T_7: if it is greater than the threshold, the target region is a flame region; otherwise it is a false-alarm region. T_7 ranges from 0.5% to 20%.
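The two false-alarm tests of step (7) can be combined into one routine, sketched below; the threshold defaults T_5 = 0.2, T_6 = 15 and T_7 = 0.05 are example values inside the ranges given above, and the bounding box is assumed to lie inside the image:

import cv2
import numpy as np

def is_false_alarm(gray_t, gray_t1, fore_t, bbox, T5=0.2, T6=15, T7=0.05):
    x, y, w, h = bbox
    win = (slice(y, y + h), slice(x, x + w))
    roi_gray = gray_t[win].astype(np.float64)
    roi_fore = fore_t[win]
    fg, bg = roi_gray[roi_fore == 255], roi_gray[roi_fore == 0]
    if fg.size == 0 or bg.size == 0:
        return True
    # luminance-difference change rate C_i^t between foreground and background
    if (fg.mean() - bg.mean()) / fg.mean() < T5:
        return True                               # projected bright spot, not a flame
    # motion foreground by frame differencing, then coverage of the flame pixels
    move = cv2.absdiff(gray_t, gray_t1) > T6
    cover = move[win][roi_fore == 255].mean()     # fraction of flame pixels that moved
    return cover < T7                             # static bright target -> false alarm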
After step (7), if a flame target region has been detected, the relevant information of the flame region is fed back into the result image and the alarm mechanism is triggered to produce an alarm signal.
The value of t is then increased by one, t = t + 1, and the process jumps back to step (1) to detect the next frame.
Each threshold used in the detection process of the invention is determined from prior knowledge; after understanding the technical idea of the invention, a person skilled in the art can determine the specific value range of each threshold through a limited number of tests.
Those skilled in the art will readily understand that the above is only a preferred embodiment of the present invention and is not intended to limit the present invention; any modification, equivalent substitution or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (3)

1. A flame detection method based on multi-dimensional time-domain features, for the detection and identification of flames, comprising the following steps:
(1) extract a flame candidate region chain C from the current frame image t according to gray values;
(2) if the current frame t is the first frame of the video image sequence, the flame target region chain D_1 of the current frame t is the flame candidate region chain C; set the successful-match count of each flame target region in D_1 to 1 and go to the next frame; if the current frame t is not the first frame of the video image sequence, go to step (3);
(3) use the shortest-distance inter-frame matching algorithm to match the flame candidate region chain of the current frame t against the flame target region chain D_{t-1} of the previous frame t-1; according to the matching result, add 1 to the successful-match count of each flame target region of frame t-1 that matches successfully, add 1 to the successful-match count of each flame target region of frame t-1 that fails to match but whose confidence is greater than the first confidence threshold T_2, and set the successful-match count of each candidate flame region of frame t that is not matched successfully to 1; then add the successfully matched flame target regions of frame t-1, the flame target regions of frame t-1 that failed to match but whose confidence exceeds the first confidence threshold T_2, and the unmatched candidate flame regions of frame t to the flame target region chain D_t of the current frame t;
(4) compute the multi-dimensional time-domain features of the target regions:
(4.1) for each flame target region of the current frame t whose successful-match count is greater than 1, initialize the multi-dimensional feature mulDimen_i^t = 0;
(4.2) compute the inter-frame brightness change |A_i^t - A_i^{t-1}| of each flame target region, where A_i^t and A_i^{t-1} are the mean gray values of the flame pixels of target region i in the current frame t and the previous frame t-1 respectively; if |A_i^t - A_i^{t-1}| is greater than 3.5, the flame target region is considered to have undergone a brightness change from frame t-1 to frame t and mulDimen_i^t = mulDimen_i^t + 1 is updated; otherwise the value remains unchanged;
(4.3) compute the inter-frame area change ξ_i^t = S_∩ / S_∪ of each flame target region, where S_∩ and S_∪ are the intersection and the union of the flame foreground areas of the target region in frame t and frame t-1; if ξ_i^t indicates an area change (the overlap ratio falls below a preset value), the flame target region is considered to have undergone an area change from frame t-1 to frame t and mulDimen_i^t = mulDimen_i^t + 1 is updated; otherwise the value remains unchanged;
(4.4) compute the inter-frame displacement L_i^t = sqrt((x_t - x_{t-1})^2 + (y_t - y_{t-1})^2) of each flame target region, where (x_t, y_t) and (x_{t-1}, y_{t-1}) are the center-point coordinates of the flame target region in the current frame t and the previous frame t-1 respectively; if L_i^t is greater than or equal to 2, the flame target region is considered to have been displaced from frame t-1 to frame t and mulDimen_i^t = mulDimen_i^t + 1; otherwise the value remains unchanged;
(4.5) for each flame target region whose successful-match count is greater than or equal to 20, compute its multi-dimensional time-domain feature Eigen_i^t = Σ_{r=t-m}^{t} mulDimen_i^r, where mulDimen_i^r is the multi-dimensional feature of the flame target region at frame r and 20 ≤ m < the successful-match count of the flame target region;
(5) if the multi-dimensional time-domain feature value Eigen_i^t is greater than the feature threshold T_3, update the confidence of the flame target region as E_i^t = a × E_i^{t-1} + 1; otherwise update it as E_i^t = b × E_i^{t-1}, with 0.8 ≤ a ≤ 1 and 0.6 ≤ b ≤ 0.8;
(6) if the confidence E_i^t of a flame target region is greater than or equal to the second confidence threshold T_4, the flame target region is judged to be a flame region.
2. The flame detection method based on multi-dimensional time-domain features according to claim 1, characterized by further comprising the step of: computing the luminance-difference change rate C_i^t = (A_i^t - a_i^t) / A_i^t of a flame region, where A_i^t and a_i^t are the mean gray value of the flame pixels and the mean gray value of the background pixels inside the bounding rectangle of the flame region respectively; if C_i^t is less than the brightness change-rate threshold T_5, the flame region is a false-alarm region.
3. The flame detection method based on multi-dimensional time-domain features according to claim 1 or 2, characterized by further comprising the step of: using the frame-difference method to obtain the moving-target foreground region of the current frame t and then computing Cover_i^t = S_move / S, where S_move is the number of pixels belonging to both the flame region and the moving-target foreground region and S is the number of pixels of the flame region; if Cover_i^t is less than the motion-foreground coverage threshold T_7, the flame region is a false-alarm region.
CN201210203504.4A 2012-06-19 2012-06-19 Flame detection method based on multi-dimensional time domain characteristics Expired - Fee Related CN102760230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210203504.4A CN102760230B (en) 2012-06-19 2012-06-19 Flame detection method based on multi-dimensional time domain characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210203504.4A CN102760230B (en) 2012-06-19 2012-06-19 Flame detection method based on multi-dimensional time domain characteristics

Publications (2)

Publication Number Publication Date
CN102760230A CN102760230A (en) 2012-10-31
CN102760230B true CN102760230B (en) 2014-07-23

Family

ID=47054685

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210203504.4A Expired - Fee Related CN102760230B (en) 2012-06-19 2012-06-19 Flame detection method based on multi-dimensional time domain characteristics

Country Status (1)

Country Link
CN (1) CN102760230B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102819735B (en) * 2012-08-17 2015-07-15 深圳辉锐天眼科技有限公司 Flame detection method based on video frame image
CN106952438A (en) * 2017-04-19 2017-07-14 天津安平易视智能影像科技有限公司 A kind of fire alarm method based on video image
CN108108695B (en) * 2017-12-22 2019-11-19 湖南源信光电科技股份有限公司 Fire defector recognition methods based on Infrared video image
CN110069961A (en) * 2018-01-24 2019-07-30 北京京东尚科信息技术有限公司 A kind of object detecting method and device
CN108830834B (en) * 2018-05-23 2022-03-11 重庆交通大学 Automatic extraction method for video defect information of cable climbing robot
CN109919120B (en) * 2019-03-15 2023-06-30 江苏鼎集智能科技股份有限公司 Flame detection method based on near infrared spectrum imaging
WO2021011300A1 (en) 2019-07-18 2021-01-21 Carrier Corporation Flame detection device and method
CA3098859A1 (en) 2019-11-22 2021-05-22 Carrier Corporation Systems and methods of detecting flame or gas

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101393603A (en) * 2008-10-09 2009-03-25 浙江大学 Method for recognizing and detecting tunnel fire disaster flame
CN101441712A (en) * 2008-12-25 2009-05-27 北京中星微电子有限公司 Flame video recognition method and fire hazard monitoring method and system
CN101739686A (en) * 2009-02-11 2010-06-16 北京智安邦科技有限公司 Moving object tracking method and system thereof

Also Published As

Publication number Publication date
CN102760230A (en) 2012-10-31

Similar Documents

Publication Publication Date Title
CN102760230B (en) Flame detection method based on multi-dimensional time domain characteristics
CN110516609B (en) Fire disaster video detection and early warning method based on image multi-feature fusion
CN104392468B (en) Based on the moving target detecting method for improving visual background extraction
CN105404847B (en) A kind of residue real-time detection method
CN108416968B (en) Fire early warning method and device
Chen et al. Multi-feature fusion based fast video flame detection
US7574039B2 (en) Video based fire detection system
Zhao et al. SVM based forest fire detection using static and dynamic features
CN101493980B (en) Rapid video flame detection method based on multi-characteristic fusion
CN103632158B (en) Forest fire prevention monitor method and forest fire prevention monitor system
CN102982313B (en) The method of Smoke Detection
CN104809463A (en) High-precision fire flame detection method based on dense-scale invariant feature transform dictionary learning
Sengar et al. Detection of moving objects based on enhancement of optical flow
CN109460764A (en) A kind of satellite video ship monitoring method of combination brightness and improvement frame differential method
CN105469427B (en) One kind is for method for tracking target in video
CN110874592A (en) Forest fire smoke image detection method based on total bounded variation
CN106910204A (en) A kind of method and system to the automatic Tracking Recognition of sea ship
CN110598570A (en) Pedestrian abnormal behavior detection method and system, storage medium and computer equipment
CN102509414B (en) Smog detection method based on computer vision
CN107729811B (en) Night flame detection method based on scene modeling
Lai et al. Robust little flame detection on real-time video surveillance system
Wang Research and implementation of intrusion detection algorithm in video surveillance
CN103020587B (en) Based on video image analysis flame regarding figure analysis method
CN102609710B (en) Smoke and fire object segmentation method aiming at smog covering scene in fire disaster image video
Abidha et al. Reducing false alarms in vision based fire detection with nb classifier in eadf framework

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140723

Termination date: 20200619

CF01 Termination of patent right due to non-payment of annual fee