CN105261030A - Method and device for detecting flame from infrared video - Google Patents

Method and device for detecting flame from infrared video

Info

Publication number
CN105261030A
Authority
CN
China
Prior art keywords
video
image frame
frame
flame
infrared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510848008.8A
Other languages
Chinese (zh)
Other versions
CN105261030B (en)
Inventor
罗锐
唐惠娟
宋昌林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HUIYUAN OPTICAL COMMUNICATION CO Ltd SICHUAN
Original Assignee
HUIYUAN OPTICAL COMMUNICATION CO Ltd SICHUAN
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HUIYUAN OPTICAL COMMUNICATION CO Ltd SICHUAN filed Critical HUIYUAN OPTICAL COMMUNICATION CO Ltd SICHUAN
Priority to CN201510848008.8A priority Critical patent/CN105261030B/en
Publication of CN105261030A publication Critical patent/CN105261030A/en
Application granted granted Critical
Publication of CN105261030B publication Critical patent/CN105261030B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20084: Artificial neural networks [ANN]

Abstract

The invention provides a method for detecting flame from an infrared video. The method comprises the steps of: performing brightness segmentation on each pixel in the image frames of a target infrared video, and preserving the pixels whose brightness exceeds a preset brightness threshold in the image frames of the target infrared video; filtering out noise in the image frames of the target infrared video; extracting static features of a single image frame of the target infrared video; extracting dynamic features of consecutive image frames of the target infrared video; inputting the static features and the dynamic features into a neural network for comprehensive judgment, and obtaining the presence of flame in the target infrared video; and outputting the presence of flame in the target infrared video. The invention further provides a device for detecting flame from an infrared video. The method and device solve the technical problems of the prior art that fire monitoring is costly, has low coverage, and depends heavily on ambient lighting.

Description

Method and device for detecting flame from an infrared video
Technical field
The present invention relates to the technical field of image processing, and in particular to a method and device for detecting flame from an infrared video.
Background technology
Fire not only destroys property but also severely disrupts people's lives and production. To minimize the losses caused by fire, one of the most effective measures is to detect a fire promptly at its early stage and extinguish it before it spreads over a large area. Current fire-detection technology mainly comprises sensor-based monitoring and visible-light video monitoring. Sensor-based monitoring can only be used in limited indoor spaces and cannot be deployed in the wild, for example over the vast areas involved in forest fire prevention. Visible-light fire monitoring can cover a very large area and thus overcomes the high cost and low coverage of sensor-based monitoring. However, visible-light monitoring images poorly at night, in rainy weather, and in other unfavorable ambient-lighting conditions, which significantly limits its applicability. Therefore, providing a fire-detection method with low cost, high coverage, and little dependence on ambient lighting is an urgent technical problem to be solved.
Summary of the invention
In view of this, the present invention provides a method and a device for detecting flame from an infrared video. The method and device are not only low in cost and high in coverage, but also depend little on ambient lighting.
To achieve these goals, the embodiments of the present invention adopt the following technical solutions.
In a first aspect, an embodiment of the present invention provides a method for detecting flame from an infrared video, comprising: performing brightness segmentation on each pixel in the image frames of a target infrared video, and retaining the pixels whose brightness exceeds a predetermined brightness threshold; filtering out noise in the image frames of the target infrared video; performing static feature extraction on a single image frame of the target infrared video, the static features comprising the circularity feature, the centre-of-gravity height coefficient feature, and the corner density feature of the single image frame; performing dynamic feature extraction on consecutive image frames of the target infrared video, the dynamic features comprising the gray-level change frequency, the area change frequency, and the centre-of-gravity shift frequency between the consecutive image frames; inputting the static features and the dynamic features into a neural network for comprehensive judgment to obtain the presence of flame in the target infrared video; and outputting the presence of flame in the target infrared video.
In a second aspect, an embodiment of the present invention further provides a device for detecting flame from an infrared video, comprising: a brightness segmentation module, configured to perform brightness segmentation on each pixel in the image frames of a target infrared video and retain the pixels whose brightness exceeds a predetermined brightness threshold; a filtering module, configured to filter out noise in the image frames of the target infrared video; a static feature extraction module, configured to perform static feature extraction on a single image frame of the target infrared video, the static features comprising the circularity feature, the centre-of-gravity height coefficient feature, and the corner density feature of the single image frame; a dynamic feature extraction module, configured to perform dynamic feature extraction on consecutive image frames of the target infrared video, the dynamic features comprising the gray-level change frequency, the area change frequency, and the centre-of-gravity shift frequency between the consecutive image frames; a neural network judgment module, configured to input the static features and the dynamic features into a neural network for comprehensive judgment to obtain the presence of flame in the target infrared video; and an output module, configured to output the presence of flame in the target infrared video.
With the above method and device for detecting flame from an infrared video, noise is filtered out by performing connected-component detection on the infrared video frame images, static and dynamic features are extracted from the frame images, and the extracted static and dynamic features are input into a neural network for classification to obtain the presence of flame in the infrared video frame images. Compared with the existing sensor-based monitoring technique and visible-light fire monitoring technology, the method and device have the advantages of low cost, high coverage, and little dependence on ambient lighting.
To make the above objects, features, and advantages of the present invention more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Accompanying drawing explanation
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can derive other relevant drawings from them without creative effort.
Fig. 1 shows a kind of structured flowchart of the terminal device that can be applicable in the embodiment of the present invention.
Fig. 2 shows a flow chart of the method for detecting flame from an infrared video provided by the first specific embodiment of the present invention.
Fig. 3 shows a schematic diagram of the gray-level change E(u, v) in the corner feature extraction process of the method of Fig. 2.
Fig. 4-A shows a schematic diagram of the structure of the neural network used to judge the presence of flame in the method of Fig. 2.
Fig. 4-B shows a schematic diagram of the structure of the neural network after the static features and dynamic features are input, in the method of Fig. 2.
Fig. 5 is a structural block diagram of the device for detecting flame from an infrared video provided by the second specific embodiment of the present invention.
Embodiment
Fig. 1 shows a structural block diagram of a terminal device applicable to embodiments of the present invention. As shown in Fig. 1, the terminal device 100 comprises a memory 102, a memory controller 104, one or more processors 106 (only one is shown in the figure), a peripheral interface 108, a radio-frequency module 110, an audio module 112, a touch screen 114, and so on. These components communicate with one another through one or more communication buses/signal lines 116.
The memory 102 may be used to store software programs and modules, such as the program instructions/modules corresponding to the method for detecting flame from an infrared video in the embodiments of the present invention. By running the software programs and modules stored in the memory 102, the processor 106 performs various functional applications and data processing, such as the method for detecting flame from an infrared video provided by the embodiments of the present invention.
The memory 102 may comprise high-speed random access memory, and may also comprise non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. The processor 106 and other possible components access the memory 102 under the control of the memory controller 104.
The peripheral interface 108 couples various input/output devices to the processor 106 and the memory 102. In some embodiments, the peripheral interface 108, the processor 106, and the memory controller 104 may be implemented in a single chip. In some other examples, they may each be implemented by an independent chip.
The radio-frequency module 110 receives and sends electromagnetic waves and converts between electromagnetic waves and electric signals, so as to communicate with a communication network or other devices.
The audio module 112 provides an audio interface to the user, and may comprise one or more microphones, one or more loudspeakers, and audio circuitry.
The touch screen 114 provides both an output and an input interface between the terminal device 100 and the user. In particular, the touch screen 114 displays video output to the user, and the content of this video output may comprise text, graphics, video, and any combination thereof.
It should be understood that the structure shown in Fig. 1 is only illustrative; the terminal device 100 may comprise more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1. Each component shown in Fig. 1 may be implemented in hardware, software, or a combination thereof.
In the embodiments of the present invention, a client is installed in the terminal device 100. The client may be a browser or third-party application software corresponding to a server side, and provides services to the user, for example detecting the presence of flame from an infrared video in infrared flame-detection software.
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated in the drawings herein, may be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments of the present invention provided in the drawings is not intended to limit the claimed scope of the present invention, but merely represents selected embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
First specific embodiment
Fig. 2 is a flow chart of the method for detecting flame from an infrared video provided by the first specific embodiment of the present invention. As shown in Fig. 2, the method in this embodiment may comprise the following steps.
Step S210: perform brightness segmentation on each pixel in the image frames of a target infrared video, retaining the pixels in the image frames of the target infrared video whose brightness exceeds a predetermined brightness threshold.
An infrared video usually contains only brightness information and no color information. Therefore, the infrared image is binarized here: pixels that are too dark, such as background or noise, are filtered out by a threshold, and only the pixels with higher brightness (temperature) are retained.
The pixel filtering expression is as follows:
I'(x, y) = 1, if I(x, y) > T_luma; I'(x, y) = 0, if I(x, y) ≤ T_luma
where I(x, y) is a video frame image in the infrared video and I'(x, y) is the image after binarizing I(x, y). I'(x, y) can be understood as a mask (template) image of the suspected flame regions, and all subsequent operations are performed only in the regions where the value of the mask image I'(x, y) is 1. T_luma is the binarization threshold; this parameter is obtained through extensive tests under actual conditions, or may be set based on experience, for example T_luma = 150.
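As an illustrative sketch (not the patent's implementation), the brightness segmentation of step S210 can be written in Python as follows. The nested-list frame representation and the function name are assumptions for illustration; T_luma = 150 follows the example value above.

```python
def binarize(frame, t_luma=150):
    """Brightness segmentation of one grayscale infrared frame.

    Returns the mask image I'(x, y): pixels brighter than t_luma are
    kept as 1 (suspected flame); background and dim noise become 0.
    """
    return [[1 if pixel > t_luma else 0 for pixel in row] for row in frame]
```

For example, binarize([[200, 100], [150, 151]]) yields [[1, 0], [0, 1]]: only pixels strictly above the threshold survive.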
Step S220: filter out the noise in the image frames of the target infrared video.
The frame images of an infrared video contain a lot of noise, and some of the noise pixels are also quite bright, which may cause false detections in the subsequent fire judgment. Because noise presents an irregular random distribution in the image, connected-component analysis can be used to reject this random interference. Common connected-component detection methods include the seed filling algorithm, the flood fill method, and so on.
Suppose the detected connected regions are C_i, where 0 < i < N and N is the total number of connected regions; C_i is the i-th detected connected-region component. The area A_i of each connected-region component is judged, and the connected regions whose area is less than a threshold T_area are excluded.
The expression for excluding connected regions whose area is less than the threshold T_area is:
I'_i = 1, if A_i > T_area; I'_i = 0, if A_i ≤ T_area
If the area of a connected region is less than the threshold, the mask of that connected-region component is set to 0, that is, it is excluded as noise; a connected region whose area is greater than the threshold is treated as a suspected flame region (its mask remains 1) for subsequent processing.
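A minimal sketch of the area-based noise rejection, assuming the seed filling (flood fill) approach mentioned above, 4-connectivity, and a nested-list mask; the function name and in-place update are illustrative assumptions:

```python
def filter_small_regions(mask, t_area):
    """Zero out 4-connected regions whose area A_i <= t_area.

    Each region of 1-pixels is collected by an explicit-stack flood
    fill; regions at or below the area threshold are reset to 0 as
    noise, while larger regions keep mask value 1 as suspected flame.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] == 1 and not seen[sy][sx]:
                seen[sy][sx] = True
                stack, comp = [(sy, sx)], []
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(comp) <= t_area:  # A_i <= T_area: reject as noise
                    for y, x in comp:
                        mask[y][x] = 0
    return mask
```

With t_area = 1, an isolated bright pixel is removed while a two-pixel region survives.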
Step S230: perform static feature extraction on a single image frame of the target infrared video. The static features comprise the circularity feature, the centre-of-gravity height coefficient feature, and the corner density feature of the single image frame.
Static flame features are extracted from, and classified on, a single image frame; this involves only the spatial relations between the image pixels, not the temporal relations between preceding and following frames. This specific embodiment extracts three kinds of static features to describe the flame region: the circularity feature, the centre-of-gravity height coefficient feature, and the corner density feature.
A1. Circularity feature extraction
Circularity refers to the degree to which an object approximates a circle; it is an index of the irregularity of the object's shape. Suspected-flame interference sources such as incandescent lamps, candles, flashlights, and car lights have highly regular shapes, so circularity can be considered as a criterion for fire-detection identification.
The circularity is computed as follows:
C_k = 4πA_k / P_k², k = 1, 2, …, n
where k indexes the k-th contour in the image frame, n is the total number of contours, C_k is the corresponding circularity, A_k is the area enclosed by contour k, and P_k is the perimeter of the contour. The circularity takes values in [0, 1]: the closer the shape of the contour is to a circle, the closer its circularity is to 1; the more irregular the shape, the closer its circularity is to 0.
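The circularity can be sketched directly from a contour's perimeter and area. The 4πA/P² form below matches the stated [0, 1] range (1 for a perfect circle); the function name is an illustrative assumption:

```python
import math

def circularity(perimeter, area):
    """C = 4*pi*A / P**2: 1.0 for a perfect circle, approaching 0
    for highly irregular (flame-like) contours."""
    return 4.0 * math.pi * area / (perimeter ** 2)
```

A unit circle (P = 2π, A = π) gives exactly 1.0, while a unit square (P = 4, A = 1) gives π/4 ≈ 0.785, reflecting its less circular shape.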
A2. Centre-of-gravity height coefficient extraction
A flame in the frame images of an infrared video is constantly drifting and flickering. During this process, the centre of gravity, width, height, and other properties of the flame region keep changing. This specific embodiment uses the centre-of-gravity height coefficient feature to describe this drifting characteristic of flame.
The centre-of-gravity height coefficient is computed as follows:
Coeff_i = G_i / H_i
where Coeff_i is the centre-of-gravity height coefficient of the i-th frame, G_i is the height of the centre of gravity of the suspected flame region in the i-th frame, and H_i is the height of the suspected flame region in the i-th frame.
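The coefficient can be sketched from a binary mask of the suspected flame region. The patent does not spell out how G_i is measured; the sketch below assumes G_i is the height of the centre of gravity above the bottom edge of the region (image rows grow downward), which is an interpretation, not the patent's definition:

```python
def cg_height_coefficient(mask):
    """Coeff_i = G_i / H_i for one frame's binary region mask.

    H_i: total height (row span) of the region.
    G_i: assumed height of the centre of gravity above the region's
    bottom row (image y grows downward, so g = bottom - mean_y).
    """
    ys = [y for y, row in enumerate(mask) for v in row if v]
    top, bottom = min(ys), max(ys)
    height = bottom - top + 1
    g = bottom - sum(ys) / len(ys)
    return g / height
```

A one-pixel-wide column spanning three rows has its centroid in the middle row, giving a coefficient of 1/3 under this convention.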
A3. Corner density feature extraction
A corner is a point where the image changes sharply in both the horizontal and the vertical direction. If flame exists in an infrared video, many burr-like corners necessarily appear at the tip, edges, and other parts of the flame. The corner density feature is therefore used here as a criterion for fire detection. The flame corner detection here adopts the Harris corner detection method.
The calculation proceeds as follows. First, the gray-level change E(u, v) produced by translating the image window by [u, v] is computed:
E(u, v) = Σ_{x,y} w(x, y) [I(x + u, y + v) − I(x, y)]²
where I(x, y) denotes the pixel value at coordinate (x, y) and w(x, y) is a weight coefficient; the weights at different positions should differ.
Using the first-order expansion I(x + u, y + v) = I(x, y) + I_x u + I_y v + O(u², v²), we obtain:
E(u, v) ≈ [u, v] [ I_x²  I_x I_y ; I_x I_y  I_y² ] [u, v]^T
For a small local displacement [u, v], the approximate expression is:
E(u, v) ≅ [u, v] M [u, v]^T
where M is a 2×2 matrix that can be computed from the derivatives of the image:
M = Σ_{x,y} w(x, y) [ I_x²  I_x I_y ; I_x I_y  I_y² ]
With reference to Fig. 3, E(u, v) usually takes the form of an ellipse.
The corner response function R is defined as:
R = λ₁λ₂ − k(λ₁ + λ₂)²
where λ₁ and λ₂ are the eigenvalues of M.
The final corner detection simply applies a threshold to the corner response function R, i.e. R > Thr: the local maxima of R are extracted, and the threshold Thr is determined by tests under actual conditions.
After the corner features are computed by the above steps, we define the corner density as:
D_i = N_{C_i} / L_i
where N_{C_i} is the number of corners on the i-th contour and L_i is the perimeter of the i-th contour. The corner density D_i represents the number of corners per unit length; for a suspected flame region, a larger D_i is clearly a stronger indication of flame.
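The response function and the density measure can be sketched as below, assuming the 2×2 structure matrix M has already been accumulated as (ΣI_x², ΣI_xI_y, ΣI_y²); k = 0.04 is a conventional Harris constant, not a value given by the patent:

```python
def harris_response(m, k=0.04):
    """R = det(M) - k * trace(M)**2, which equals l1*l2 - k*(l1 + l2)**2
    for the eigenvalues l1, l2 of the 2x2 structure matrix M.

    m is the tuple (Sxx, Sxy, Syy) of accumulated gradient products."""
    sxx, sxy, syy = m
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace

def corner_density(num_corners, contour_length):
    """D_i = N_Ci / L_i: corners detected per unit contour length."""
    return num_corners / contour_length
```

For M = [[2, 0], [0, 2]] both eigenvalues are 2, so R = 4 − 0.04·16 = 3.36; ten corners on a contour of perimeter 50 give a density of 0.2.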
Step S240: perform dynamic feature extraction on consecutive image frames of the target infrared video. The dynamic features comprise the gray-level change frequency, the area change frequency, and the centre-of-gravity shift frequency between the consecutive image frames.
This specific embodiment extracts three kinds of dynamic features to describe the flame region: the gray-level change frequency, the area change frequency, and the centre-of-gravity shift frequency.
B1. Gray-level change frequency
The gray-level change frequency is computed as follows:
Diff(x, y) = ( Σ_{i=1}^{n} |I(P_i(x, y)) − I(P_{i−1}(x, y))| ) / (n − 1)
where I(P_i(x, y)) is the gray value at coordinate (x, y) in the i-th frame of the video, n is the number of video frames, and Diff(x, y) represents the change frequency of the image gray level over the n video frames. Here n is chosen according to actual conditions, for example n = 25 or 30.
B2. Area change frequency
The area change frequency is computed as follows:
F_area = ( Σ_{i=1}^{n} |A_i − A_{i−1}| ) / (n − 1)
where A_i is the area of the suspected flame region in the i-th frame of the video, and F_area represents the area change frequency of the suspected flame region over n consecutive video frames.
B3. Centre-of-gravity shift frequency
The centre-of-gravity shift frequency is computed as follows:
F_g = ( Σ_{i=1}^{n} |g_i − g_{i−1}| ) / (n − 1)
where g_i is the centre-of-gravity position of the suspected flame region in the i-th frame, n is the number of video frames, and F_g represents the centre-of-gravity shift frequency.
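All three dynamic features share the same form: the mean absolute frame-to-frame change of a per-frame quantity over n frames. A sketch under that observation (the helper name is an assumption; the per-frame scalar sequence can be the gray level at one pixel, the region area, or a centre-of-gravity displacement magnitude):

```python
def change_frequency(values):
    """Mean absolute change (sum of |v_i - v_{i-1}|) / (n - 1) over
    n per-frame values: applied to per-pixel gray levels this gives
    Diff(x, y), to region areas F_area, and to centre-of-gravity
    positions F_g."""
    n = len(values)
    return sum(abs(values[i] - values[i - 1]) for i in range(1, n)) / (n - 1)
```

For example, region areas [10, 14, 13] over n = 3 frames give (4 + 1) / 2 = 2.5; a static interference source like a lamp would score near 0.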
In this specific embodiment, step S230 and step S240 may be performed in any order.
Step S250: input the static features and the dynamic features into a neural network for comprehensive judgment, obtaining the presence of flame in the target infrared video.
The static features and dynamic features extracted in steps S230 and S240 are input into a neural network classifier for classification.
A neural network is a massively parallel network organized from a number of simple processing units (neurons). A neuron processes a signal by taking the inner product of the input signal and the neuron's weight coefficients, passing the result through an activation function (also called an excitation or transfer function), and finally applying a threshold function to decide whether the neuron is activated or suppressed.
With reference to Fig. 4-A, a neural network structure is usually divided into three layers: an input layer, a hidden layer, and an output layer.
The number of input-layer neurons is determined by the dimensionality of the sample attributes, and the number of output-layer neurons by the number of sample classes; the number of hidden layers and the number of neurons in each layer are specified by the user. With reference to Fig. 4-B, in this specific embodiment the neural network takes 3 static features and 3 dynamic features as input, 6 inputs in total, and its output has only the two states "flame" and "no flame". That is, the input layer of the neural network classifier of the present invention has 6 nodes and the output layer has 2 nodes.
Each layer of the neural network comprises several neurons, and each neuron has a threshold θ_j and weight coefficients ω_ij, where θ_j is the threshold of unit j and ω_ij is the weight between a neuron of the previous layer and a neuron of the next layer. The input to a hidden-layer or output-layer node is:
Y_j = Σ_i ω_ij I_i + θ_j
where Y_j is the weighted input of node j in the current layer, I_i is the output of node i in the previous layer, and ω_ij is the weight from node i of the previous layer to node j of the current layer.
The output of a neuron in the neural network is calculated by an activation function. Common activation (transfer) functions include the linear function, the ramp function, the threshold function, the sigmoid function, the logistic function, and so on. Preferably, the output-layer activation function of this specific embodiment adopts the sigmoid function, whose expression is:
f(x) = 1 / (1 + e^(−αx)), 0 < f(x) < 1
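A single unit of such a classifier can be sketched by combining the weighted sum Y_j with the sigmoid activation; the α parameter defaulting to 1 and the function names are illustrative assumptions:

```python
import math

def sigmoid(x, alpha=1.0):
    """f(x) = 1 / (1 + e**(-alpha * x)), bounded in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-alpha * x))

def neuron(inputs, weights, theta):
    """One unit: Y_j = sum_i w_ij * I_i + theta_j, then sigmoid."""
    total = sum(w * x for w, x in zip(weights, inputs)) + theta
    return sigmoid(total)
```

With zero weights and zero threshold the unit outputs exactly 0.5. In the embodiment above, six such inputs (three static and three dynamic features) would feed the 6-node input layer, and two output units would encode "flame" versus "no flame".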
Step S260: output the presence of flame in the target infrared video.
With the above method for detecting flame from an infrared video, noise is filtered out by performing connected-component detection on the infrared video frame images, static and dynamic features are extracted from the frame images, and the extracted static and dynamic features are input into a neural network for classification to obtain the presence of flame in the infrared video frame images. Compared with the existing sensor-based monitoring technique and visible-light fire monitoring technology, the method has the advantages of low cost, high coverage, and little dependence on ambient lighting.
Second specific embodiment
Fig. 5 is a structural block diagram of a device for detecting flame from an infrared video provided by the second specific embodiment of the present invention. The device 300 for detecting flame from an infrared video is used to implement the method for detecting flame from an infrared video of the first specific embodiment. Referring to Fig. 5, the device 300 for detecting flame from an infrared video comprises:
A brightness segmentation module 310, configured to perform brightness segmentation on each pixel in the image frames of a target infrared video and retain the pixels whose brightness exceeds a predetermined brightness threshold.
A filtering module 320, configured to filter out noise in the image frames of the target infrared video.
A static feature extraction module 330, configured to perform static feature extraction on a single image frame of the target infrared video, the static features comprising the circularity feature, the centre-of-gravity height coefficient feature, and the corner density feature of the single image frame.
A dynamic feature extraction module 340, configured to perform dynamic feature extraction on consecutive image frames of the target infrared video, the dynamic features comprising the gray-level change frequency, the area change frequency, and the centre-of-gravity shift frequency between the consecutive image frames.
A neural network judgment module 350, configured to input the static features and the dynamic features into a neural network for comprehensive judgment to obtain the presence of flame in the target infrared video.
An output module 360, configured to output the presence of flame in the target infrared video.
It should be noted that the embodiments in this specification are described progressively: each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts the embodiments may be referred to one another.
In addition, the flow charts and block diagrams in the accompanying drawings show the possible architectures, functions, and operations of systems, methods, and computer program products according to multiple embodiments of the present invention. Each block in a flow chart or block diagram may represent a module, a program segment, or a part of code, which comprises one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings; for example, two consecutive blocks may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should further be noted that each block of the block diagrams and/or flow charts, and combinations of blocks in the block diagrams and/or flow charts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relation or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or device that comprises the element.
The foregoing is only the preferred embodiments of the present invention and is not intended to limit the present invention; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention. It should be noted that similar labels and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further defined or explained in subsequent drawings.

Claims (10)

1. A method for detecting flame from an infrared video, characterized by comprising:
performing brightness segmentation on each pixel in the image frames of a target infrared video, retaining the pixels in the image frames of the target infrared video whose brightness exceeds a predetermined brightness threshold;
filtering out noise in the image frames of the target infrared video;
performing static feature extraction on a single image frame in the target infrared video, the static features comprising a circularity feature of the single image frame, a centroid height coefficient feature of the single image frame and a corner density feature of the single image frame;
performing dynamic feature extraction on successive image frames in the target infrared video, the dynamic features comprising a gray-level change frequency between the successive image frames, an area change frequency between the successive image frames and a centroid shift frequency between the successive image frames;
inputting the static features and the dynamic features into a neural network for comprehensive judgment to obtain the presence or absence of flame in the target infrared video; and
outputting the presence or absence of flame in the target infrared video.
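As an illustration only (the claim fixes no particular implementation), the brightness-segmentation step of claim 1 might be sketched in NumPy as follows; the threshold value of 200 is an arbitrary assumption, not taken from the patent:

```python
import numpy as np

def brightness_segment(frame, brightness_threshold=200):
    """Keep only the pixels whose brightness exceeds the threshold
    (first step of claim 1); all other pixels are zeroed out."""
    mask = frame > brightness_threshold
    return np.where(mask, frame, 0), mask
```

In practice the threshold would be tuned to the dynamic range of the particular infrared camera.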
2. The method for detecting flame from an infrared video according to claim 1, characterized in that the step of filtering out noise in the image frames of the target infrared video comprises:
calculating the area of each connected region in the image frames of the target infrared video and, when the area of a connected region is smaller than a preset area threshold, filtering out the connected region whose area is smaller than the preset area threshold.
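The area-based connected-region filtering of claim 2 can be sketched without any imaging library; the breadth-first flood fill and the 4-connectivity below are implementation assumptions, since the claim fixes neither:

```python
import numpy as np
from collections import deque

def filter_small_regions(mask, min_area):
    """Drop connected regions of a binary mask whose pixel count is
    below min_area (the noise-filtering step of claim 2)."""
    mask = np.asarray(mask, dtype=bool)
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    out = np.zeros((h, w), dtype=bool)
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                # Flood-fill one connected region (4-neighbourhood).
                queue = deque([(sy, sx)])
                seen[sy, sx] = True
                region = []
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                # Keep the region only if it reaches the area threshold.
                if len(region) >= min_area:
                    for y, x in region:
                        out[y, x] = True
    return out
```

A production implementation would more likely use a library connected-components routine, but the behavior claimed is the same: regions below the area threshold are discarded as noise.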
3. The method for detecting flame from an infrared video according to claim 1, characterized in that the step of performing static feature extraction on a single image frame in the target infrared video comprises:
obtaining the circularity feature of the single image frame according to the formula C_k = 4*pi*A_k / P_k^2, k = 1, 2, ..., n, wherein k denotes the k-th contour in the image, n is the total number of contours, C_k is the corresponding circularity, A_k is the area enclosed by contour k, and P_k is the perimeter of the contour;
obtaining the centroid height coefficient feature of the single image frame according to the formula, wherein Coeff_i denotes the centroid height coefficient of the i-th frame and H_i denotes the height of the suspected flame region in the i-th frame;
obtaining the corner density feature of the single image frame as the ratio N_ci / L_i, wherein N_ci denotes the number of corner points of the i-th contour and L_i denotes the perimeter of the i-th contour.
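Once a contour's area, perimeter and corner count are known (contour and corner extraction are outside the claim text), the circularity and corner-density features of claim 3 reduce to simple ratios. A sketch under those assumptions:

```python
import math

def circularity(area, perimeter):
    """C_k = 4*pi*A_k / P_k**2: equals 1 for a perfect circle and
    decreases as the contour grows more ragged, as flame edges do."""
    return 4.0 * math.pi * area / perimeter ** 2

def corner_density(num_corners, perimeter):
    """Corner density of a contour: corner count N_ci per unit of
    contour perimeter L_i."""
    return num_corners / perimeter
```

For a circle of radius 1 (area pi, perimeter 2*pi) the circularity is exactly 1; jagged flame contours score well below that, which is what makes the feature discriminative.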
4. The method for detecting flame from an infrared video according to claim 1, characterized in that the step of performing dynamic feature extraction on successive image frames in the target infrared video comprises:
obtaining the gray-level change frequency between the successive image frames according to the formula Diff(x, y) = (1/(n-1)) * sum_{i=1}^{n-1} |I(P_i(x, y)) - I(P_{i-1}(x, y))|, wherein I(P_i(x, y)) denotes the gray value at coordinate (x, y) in the i-th frame of the target infrared video, n denotes the number of video frames, and Diff(x, y) denotes the change frequency of the image gray level over the n video frames.
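The gray-level change frequency of claim 4 is a per-pixel mean absolute frame-to-frame difference; with frames indexed from 0 there are n-1 consecutive differences, which a NumPy sketch captures directly:

```python
import numpy as np

def gray_change_frequency(frames):
    """Diff(x, y) = sum_i |I_i(x, y) - I_(i-1)(x, y)| / (n - 1) over
    n frames: flame pixels flicker, so their value comes out high."""
    frames = np.asarray(frames, dtype=float)  # shape (n, H, W)
    n = frames.shape[0]
    diffs = np.abs(np.diff(frames, axis=0))   # the n-1 consecutive differences
    return diffs.sum(axis=0) / (n - 1)
```

The area change frequency and centroid shift frequency named in the claim would follow the same pattern, differencing the per-frame region area and centroid position instead of pixel gray values.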
5. The method for detecting flame from an infrared video according to claim 1, characterized in that the step of inputting the static features and the dynamic features into a neural network for comprehensive judgment to obtain the presence or absence of flame in the target infrared video comprises:
the neural network performing an inner product operation between the static features and the dynamic features and the weight coefficients of the neurons in the neural network to obtain an operation result, passing the operation result through an activation function and then through a threshold function to determine whether the neuron is activated or suppressed, thereby obtaining the presence or absence of flame in the target infrared video.
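The neuron operation in claim 5 (inner product with the weight coefficients, then an activation function, then a threshold function) is standard; a minimal sketch with a sigmoid activation and a 0.5 firing threshold, both of which are assumptions since the claim names neither:

```python
import numpy as np

def neuron_decision(features, weights, bias=0.0, fire_threshold=0.5):
    """Inner product of the feature vector with the neuron's weight
    coefficients, passed through an activation function and then a
    threshold function: True = activated (flame), False = suppressed."""
    z = float(np.dot(features, weights)) + bias  # inner product step
    activation = 1.0 / (1.0 + np.exp(-z))        # sigmoid activation function
    return bool(activation >= fire_threshold)    # threshold function
```

In the claimed system the feature vector would concatenate the static features (circularity, centroid height coefficient, corner density) with the dynamic features (gray-level, area and centroid change frequencies), and the weights would come from training.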
6. A device for detecting flame from an infrared video, characterized by comprising:
a brightness segmentation module, configured to perform brightness segmentation on each pixel in the image frames of a target infrared video, retaining the pixels in the image frames of the target infrared video whose brightness exceeds a predetermined brightness threshold;
a filtering module, configured to filter out noise in the image frames of the target infrared video;
a static feature extraction module, configured to perform static feature extraction on a single image frame in the target infrared video, the static features comprising a circularity feature of the single image frame, a centroid height coefficient feature of the single image frame and a corner density feature of the single image frame;
a dynamic feature extraction module, configured to perform dynamic feature extraction on successive image frames in the target infrared video, the dynamic features comprising a gray-level change frequency between the successive image frames, an area change frequency between the successive image frames and a centroid shift frequency between the successive image frames;
a neural network judgment module, configured to input the static features and the dynamic features into a neural network for comprehensive judgment to obtain the presence or absence of flame in the target infrared video; and
an output module, configured to output the presence or absence of flame in the target infrared video.
7. The device for detecting flame from an infrared video according to claim 6, characterized in that:
the filtering module calculates the area of each connected region in the image frames of the target infrared video and, when the area of a connected region is smaller than a preset area threshold, filters out the connected region whose area is smaller than the preset area threshold.
8. The device for detecting flame from an infrared video according to claim 6, characterized in that:
the static feature extraction module obtains the circularity feature of the single image frame according to the formula C_k = 4*pi*A_k / P_k^2, k = 1, 2, ..., n, wherein k denotes the k-th contour in the image, n is the total number of contours, C_k is the corresponding circularity, A_k is the area enclosed by contour k, and P_k is the perimeter of the contour;
obtains the centroid height coefficient feature of the single image frame according to the formula, wherein Coeff_i denotes the centroid height coefficient of the i-th frame and H_i denotes the height of the suspected flame region in the i-th frame; and
obtains the corner density feature of the single image frame as the ratio N_ci / L_i, wherein N_ci denotes the number of corner points of the i-th contour and L_i denotes the perimeter of the i-th contour.
9. The device for detecting flame from an infrared video according to claim 6, characterized in that:
the dynamic feature extraction module obtains the gray-level change frequency between the successive image frames according to the formula Diff(x, y) = (1/(n-1)) * sum_{i=1}^{n-1} |I(P_i(x, y)) - I(P_{i-1}(x, y))|, wherein I(P_i(x, y)) denotes the gray value at coordinate (x, y) in the i-th frame of the target infrared video, n denotes the number of video frames, and Diff(x, y) denotes the change frequency of the image gray level over the n video frames.
10. The device for detecting flame from an infrared video according to claim 6, characterized in that:
the neural network judgment module performs an inner product operation between the static features and the dynamic features and the weight coefficients of the neurons in the neural network to obtain an operation result, passes the operation result through an activation function and then through a threshold function to determine whether the neuron is activated or suppressed, thereby obtaining the presence or absence of flame in the target infrared video.
CN201510848008.8A 2015-11-26 2015-11-26 Method and device for detecting flame from infrared video Active CN105261030B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510848008.8A CN105261030B (en) Method and device for detecting flame from infrared video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510848008.8A CN105261030B (en) Method and device for detecting flame from infrared video

Publications (2)

Publication Number Publication Date
CN105261030A true CN105261030A (en) 2016-01-20
CN105261030B CN105261030B (en) 2019-01-15

Family

ID=55100702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510848008.8A Active CN105261030B (en) Method and device for detecting flame from infrared video

Country Status (1)

Country Link
CN (1) CN105261030B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101625789A (en) * 2008-07-07 2010-01-13 北京东方泰坦科技有限公司 Method for monitoring forest fire in real time based on intelligent identification of smoke and fire
CN102743830A (en) * 2012-07-10 2012-10-24 西安交通大学 Automatic electric switch cabinet fire extinguishing system and fire recognition method
CN103150856A (en) * 2013-02-28 2013-06-12 江苏润仪仪表有限公司 Fire flame video monitoring and early warning system and fire flame detection method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Wu Xiyin et al., "Flame Detection Algorithm Based on Multi-Feature Fusion", CAAI Transactions on Intelligent Systems *
Wu Xiyin et al., "A Survey of Video Flame Detection", Computer Science and Application *
Xiong Guoliang et al., "Matlab Implementation Method for Flame Characteristic Recognition", Computer Engineering & Science *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106693261A (en) * 2016-11-29 2017-05-24 广东工业大学 Fire water monitor system based on autonomous searching of machine vision and control method
CN107256566A (en) * 2017-05-26 2017-10-17 北京环境特性研究所 Forest fires detection method based on emittance
CN107256566B (en) * 2017-05-26 2019-10-29 北京环境特性研究所 Forest fires detection method based on radiation energy
CN107246913A (en) * 2017-06-05 2017-10-13 山东神戎电子股份有限公司 Based on the multiple forest fire protection detection method for differentiating mechanism
CN107246913B (en) * 2017-06-05 2019-11-08 山东神戎电子股份有限公司 Based on the multiple forest fire protection detection method for differentiating mechanism
CN109145689A (en) * 2017-06-28 2019-01-04 南京理工大学 A kind of robot fire detection method
CN110853077A (en) * 2019-10-17 2020-02-28 广西电网有限责任公司电力科学研究院 Self-adaptive infrared dynamic frame feature extraction method based on morphological change estimation
CN112972972A (en) * 2021-03-31 2021-06-18 湖南现代德雷工程有限公司 Intelligent fire fighting method and system for building safety construction site
CN112972972B (en) * 2021-03-31 2021-12-21 湖南现代德雷工程有限公司 Intelligent fire fighting method and system for building safety construction site
CN115359617A (en) * 2022-08-26 2022-11-18 新创碳谷控股有限公司 Oxidation furnace flame detection method, computer equipment and storage medium

Also Published As

Publication number Publication date
CN105261030B (en) 2019-01-15

Similar Documents

Publication Publication Date Title
CN105261030A (en) Method and device for detecting flame from infrared video
CN109978893B (en) Training method, device, equipment and storage medium of image semantic segmentation network
CN107220618B (en) Face detection method and device, computer readable storage medium and equipment
CN109154976B (en) System and method for training object classifier through machine learning
US11455805B2 (en) Method and apparatus for detecting parking space usage condition, electronic device, and storage medium
CN110728330A (en) Object identification method, device, equipment and storage medium based on artificial intelligence
CN107871122A (en) Safety check detection method, device, system and electronic equipment
CN108460328A (en) A kind of fake-licensed car detection method based on multitask convolutional neural networks
CN109740639B (en) Wind cloud satellite remote sensing image cloud detection method and system and electronic equipment
CN110491132B (en) Vehicle illegal parking detection method and device based on video frame picture analysis
US20220004818A1 (en) Systems and Methods for Evaluating Perception System Quality
CN110826429A (en) Scenic spot video-based method and system for automatically monitoring travel emergency
CN111325319B (en) Neural network model detection method, device, equipment and storage medium
CN111709265A (en) Camera monitoring state classification method based on attention mechanism residual error network
CN109472193A (en) Method for detecting human face and device
CN111209858B (en) Real-time license plate detection method based on deep convolutional neural network
CN110059666A (en) A kind of attention detection method and device
CN111145475A (en) Intrusion alarm system, method and medium based on vibration optical fiber and deep learning
CN113841161A (en) Extensible architecture for automatically generating content distribution images
CN112149476A (en) Target detection method, device, equipment and storage medium
CN108615401B (en) Deep learning-based indoor non-uniform light parking space condition identification method
CN109460705A (en) Oil pipeline monitoring method based on machine vision
CN114241358A (en) Equipment state display method, device and equipment based on digital twin transformer substation
CN104361366B (en) A kind of licence plate recognition method and car license recognition equipment
CN109583396A (en) A kind of region prevention method, system and terminal based on CNN two stages human testing

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant