TWI313848B - A fire image detection method - Google Patents

A fire image detection method

Info

Publication number
TWI313848B
TWI313848B TW96105521A
Authority
TW
Taiwan
Prior art keywords
fire
image
sample
pixel
distance
Prior art date
Application number
TW96105521A
Other languages
Chinese (zh)
Other versions
TW200834476A (en)
Inventor
Chung Ning Huang
Shih Chieh Chen
Original Assignee
Ind Tech Res Inst
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ind Tech Res Inst filed Critical Ind Tech Res Inst
Priority to TW96105521A priority Critical patent/TWI313848B/en
Publication of TW200834476A publication Critical patent/TW200834476A/en
Application granted granted Critical
Publication of TWI313848B publication Critical patent/TWI313848B/en

Links

Description

1313848

IX. Description of the Invention:

[Technical Field of the Invention]

A fire image detection method, in particular a method that performs a similarity analysis between image pixels and a fire image sample to determine how closely the image pixels resemble the sample, and thereby to judge whether a fire appears in the image.

[Prior Art]

Among techniques that capture an image and then judge whether a fire is present, U.S. Patent No. 5,153,722, "Fire detection system," first senses the monitored area with two infrared/ultraviolet sensors. Only when the sensed energy exceeds a set threshold is the image captured by a camera, after which operations such as bright-area analysis, edge detection, and image subtraction are performed to determine whether a real fire exists. Under this architecture, fire detection requires additional sensors before the monitored area can be judged, with reasonable accuracy, to contain fire.

In addition, the paper "A new image-based real-time flame detection method using color analysis" by Wen-Bing Horng, Jian-Wen Peng, and Chih-Yuan Chen (IEEE, March 19-22, 2005) divides the detection process into a static part and a dynamic part. In the static part, fire samples are collected in advance and converted from the red-green-blue color space (RGB domain) into the HSI color space (hue, saturation, intensity); the ranges of the HSI values of the sampled fire and smoke are then tallied and used in the fire and smoke recognition flow. In the dynamic part, an image-difference method removes background objects that resemble fire and smoke, in order to judge whether fire and smoke have occurred.

Next, "An Early Fire-Detection Method Based on Image Processing" by T. H. Chen, P. H. Wu, and Y. C. Chiou, published at the 2004 IEEE International Conference on Image Processing (ICIP, October 24-27, 2004), likewise divides the process into a static part and a dynamic part. The static part classifies the features of fire in the RGB color space (such as R ≥ G > B) and then performs a preliminary classification of fire by hue; the dynamic part uses image differencing to remove backgrounds similar to fire.

In their static parts, both papers rely on a large collection of fire sources, statistically analyzing their features or ranges in a specific color space (such as RGB or HSI), and require several thresholds to be set to achieve the best result before the fire region can be preliminarily separated. Such an approach is time-consuming in sample collection, and its computation is complicated.

[Summary of the Invention]

The present invention proposes a fire image detection method that uses a single fire image sample as its basis. By classifying the pixels of a captured image frame against this sample with the classification scheme of the invention, it can be determined whether the pixels of the captured image frame exhibit fire characteristics, without additional sensors and without analyzing a large number of fire image samples, thereby solving the above problems.
The fire image detection method of the present invention compares a fire image sample with at least one captured image frame to judge whether the captured image frame contains a fire pixel. The fire image detection method comprises:

converting each sample pixel of the fire image sample into a color model (color space), the color model having a first coordinate and a second coordinate, each sample pixel having a first coordinate value and a second coordinate value corresponding to the first and second coordinates;

calculating a center point of the first coordinate values and the second coordinate values of the sample pixels;

calculating the Mahalanobis distance between each sample pixel of the fire image sample and the center point to obtain a plurality of sample distance values;

defining a threshold value according to the sample distance values;

taking one of the plurality of image pixels of the captured image frame as a judgment pixel;

calculating the Mahalanobis distance between the judgment pixel and the center point to obtain a pixel distance; and

comparing the pixel distance with the threshold value, and judging that the captured image frame contains the fire pixel when the pixel distance is smaller than the threshold value.

By the foregoing method, a single fire image sample and a captured image frame suffice to decide whether a fire pixel is present, achieving the aforementioned objective.

For a further understanding of the objectives, implementation, and functions of the present invention, a detailed description with reference to the drawings follows:

[Embodiments]

Please refer to "Fig. 1," a flowchart of the fire image detection method of the present invention. The method compares a fire image sample 30 (see "Fig. 2") with at least one captured image frame to judge whether the captured image frame contains a fire pixel. The fire image detection method comprises:

Step S12: converting each sample pixel of the fire image sample 30 into a color model (color space), the color model having a first coordinate and a second coordinate, each sample pixel having a first coordinate value and a second coordinate value corresponding to the first and second coordinates;

Step S14: calculating a center point of the first coordinate values and the second coordinate values of the sample pixels;

Step S16: calculating the Mahalanobis distance between each sample pixel of the fire image sample and the center point to obtain a plurality of sample distance values;

Step S18: defining a threshold value according to the sample distance values;

Step S20: taking one of the plurality of image pixels of the captured image frame 40 as a judgment pixel;

Step S22: calculating the Mahalanobis distance between the judgment pixel and the center point to obtain a pixel distance; and

Step S24: comparing the pixel distance with the threshold value, and judging that the captured image frame contains the fire pixel when the pixel distance is smaller than the threshold value.

The color model of step S12 may be, for example, RGB, YIQ, CMY (cyan-magenta-yellow), HSV, or HSI. The present invention takes the luminance-chrominance color model YIQ as an example, in which the first coordinate (Y) is the luminance coordinate, the second coordinate (I) is the in-phase chrominance coordinate, and the Q coordinate is the quadrature chrominance component; YIQ was defined by the National Television System Committee (NTSC). The following embodiments use the luminance coordinate and the in-phase chrominance coordinate as an example. Among the various color models, this pair was chosen because with it the threshold value is comparatively easy to set when practicing the invention, and a fairly accurate judgment result can be obtained.

The color models listed above are briefly defined as follows. The CMY (cyan-magenta-yellow) model is the color model most commonly used in the color-image printing industry; in the color cube these three colors are the complements of red, green, and blue, and the CMY model expresses the various colors through them. The HSV (hue, saturation, value) model corresponds mainly to a painter's color-mixing model and is consistent with the visual perception of the human eye, making it well suited to human discrimination: H defines the wavelength of the color, called hue; S defines the depth of the color, called saturation; and V defines the lightness or darkness of the color, called value, usually expressed as a percentage.

Next, among methods of converting sample pixels into the luminance-chrominance color model, the most common is to convert the red-green-blue color model (RGB domain) into the luminance-chrominance color model (YIQ domain), for which the conversion formula is:

    | Y |   | 0.299     0.587     0.114    | | R |
    | I | = | 0.595716 -0.274453 -0.321263 | | G |
    | Q |   | 0.211456 -0.522591  0.311135 | | B |

From the above formula it can be seen that the YIQ color model is highly correlated with the RGB color model; the difference is that the Y, I, and Q coordinates represent luminance, the in-phase chrominance component, and the quadrature chrominance component respectively, while R, G, and B represent the red, green, and blue coordinates respectively.

After the sample pixels are converted, each sample pixel has a first coordinate value and a second coordinate value corresponding to the Y and I coordinates, that is, a luminance value and an in-phase chrominance value. Each sample pixel of course also has a quadrature chrominance value on the Q coordinate, but since that coordinate is not used as an example in this embodiment, it is not described further.

Next, step S14 calculates the center point of the first and second coordinate values of the sample pixels: the arithmetic mean of the first coordinate values and of the second coordinate values is computed separately. That is, all first coordinate values are summed and divided by the number of sample pixels, and all second coordinate values are summed and divided by the number of sample pixels, yielding the coordinate values of the center point.

The Mahalanobis distance of step S16 is a distance measure used in cluster analysis in statistics, and its formula is:

    D_M(X_i) = sqrt( (X_i - μ)^T Σ^{-1} (X_i - μ) )

where X_i (i = 1, 2, 3, ..., n) is a sample point whose coordinates are the first and second coordinate values, μ is the coordinate value of the center point, Σ^{-1} is the inverse of the pooled within-group covariance matrix, and D_M(X_i) is the Mahalanobis distance of X_i; a detailed numerical example is given later.

This step mainly calculates the Mahalanobis distance between each sample pixel of the fire image sample and the center point, forming a plurality of sample distance values, which serve as the basis for defining the threshold value in step S18.

The definition of the threshold value varies with the distribution of the sample distances, and may be any of the following: setting the threshold to a value greater than 95% of the sample distance values; defining the threshold as the median of the sample distance values; setting the threshold to the maximum of the sample distance values; setting the threshold to the mean of the sample distance values; setting the threshold to the mean of the sample distance values multiplied by a safety factor; or setting the threshold to the mean of the sample distance values plus three times their standard deviation. The safety factor may be adjusted to the actual situation, for example 1.2.

Which of these definitions to adopt depends on the histogram of the sample distances: if they follow a normal distribution, the three-standard-deviations form is preferable; if the distribution is not one known to statistics, the maximum, a value greater than a certain proportion (such as 95%) of the sample distance values, or a value chosen according to the proportion of fire patterns in the image sample may be used to define the threshold.

Furthermore, the image frames of step S20 are live frames captured by a camera. Such video consists of consecutive frames, and to appear smooth to the human eye at least thirty frames are usually captured per second.

The selection of judgment pixels in step S20 may proceed in two ways. The first is to take every image pixel of the image frame 40 in turn as the judgment pixel and then perform the subsequent steps; this way requires more computation and more fire-pixel judgments, placing a heavier load on the system, and the computation may fail to keep pace with the frame rate when system performance is limited and the frames contain many pixels.

The second way is to take, as the judgment pixels, the plurality of image pixels of the captured image frame that differ from the previous image frame; that is, the currently captured frame is compared pixel by pixel with the frame captured at the previous instant to find the differing image pixels, which are then set one by one as the judgment pixel before the subsequent steps are performed.

Steps S22 and S24 calculate the Mahalanobis distance between the judgment pixel and the center point to obtain a pixel distance, compare the pixel distance with the threshold value, and judge that the captured image frame contains a fire pixel when the pixel distance is smaller than the threshold value. Because the threshold is defined from the distances between the pixels of the fire image sample and the center point, a pixel distance smaller than the threshold means that the judgment pixel belongs to the same cluster as the fire pixels of the fire image sample, i.e., the judgment pixel is a fire pixel.

For an example calculation of the present invention, please refer to "Fig. 2" and "Fig. 3." "Fig. 2" shows a first embodiment of the fire image sample, here a flame image sample, which may be taken from any image of an actual fire.
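The sample-analysis side of the flow (steps S12 through S18) and the per-pixel judgment of steps S22/S24 can be sketched as follows. This is a minimal sketch assuming NumPy; the function names and the choice of the 95%-of-sample-distances rule for the threshold are illustrative, not taken from the patent text.

```python
import numpy as np

# NTSC RGB -> YIQ conversion matrix, as given in the description.
RGB2YIQ = np.array([[0.299,     0.587,     0.114],
                    [0.595716, -0.274453, -0.321263],
                    [0.211456, -0.522591,  0.311135]])

def train_fire_model(sample_rgb):
    """Steps S12-S18: analyze one fire image sample and set the threshold."""
    yiq = sample_rgb.reshape(-1, 3) @ RGB2YIQ.T      # S12: convert every pixel
    yi = yiq[:, :2]                                  # keep the Y and I coordinates
    center = yi.mean(axis=0)                         # S14: arithmetic-mean center
    d = yi - center
    cov = d.T @ d / (len(yi) - 1) + np.eye(2)        # covariance + identity matrix
    inv_cov = np.linalg.inv(cov)
    dists = np.sqrt(np.einsum('ij,jk,ik->i', d, inv_cov, d))  # S16: sample distances
    threshold = np.percentile(dists, 95)             # S18: 95%-rule threshold
    return center, inv_cov, threshold

def is_fire_pixel(rgb, center, inv_cov, threshold):
    """Steps S22/S24: judge one pixel against the trained sample."""
    yi = (RGB2YIQ @ np.asarray(rgb, dtype=float))[:2]
    v = yi - center
    return float(np.sqrt(v @ inv_cov @ v)) < threshold
```

A pixel whose Mahalanobis distance to the sample's center falls below the threshold is taken to lie in the same cluster as the fire sample.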
If the fire image sample has 69 × 104 sample pixels, i.e., 7,176 pixels in all, they are converted from RGB coordinates to YIQ coordinates, after which the Y coordinates and the I coordinates each form a [69 × 104] matrix. The two matrices are rearranged as sample points X_i = [Y_i, I_i]^T, i = 1, ..., 7176; since the data are numerous, this embodiment quotes only specific values by way of illustration, with Y_1 = 79.1697 and I_1 = 81.8977 used in the example below. After step S14 the center point is obtained; by actual calculation its values are 225.0835 and 40.8868 for Y_μ and I_μ respectively. The pooled within-group covariance matrix of Y and I is then

    Σ = (1 / (7176 - 1)) · Σ_{i=1}^{7176} (X_i - Mean)(X_i - Mean)^T

and taking X_1 as an example:

    Σ_1 = ([79.1697, 81.8977]^T - [225.0835, 40.8868]^T) · ([79.1697, 81.8977]^T - [225.0835, 40.8868]^T)^T

        = 10^4 × |  2.1291  -0.5984 |
                 | -0.5984   0.1682 |

The per-pixel matrices are then summed, Σ_1 + Σ_2 + ... + Σ_7176, and divided by (7176 - 1), giving

    10^3 × |  1.2086  -0.7754 |
           | -0.7754   0.6762 |

To keep the whole computation from producing a division-by-zero error, an identity matrix may also be added, after which the pooled within-group covariance matrix is:

    10^3 × |  1.2096  -0.7754 |
           | -0.7754   0.6772 |

The foregoing steps S16 and S18 are then carried out. In this embodiment of the invention, the distribution of the sample distance values is not normal, and 95% of the sample distance values are smaller than 6, so the threshold value is set to 6.

Thereafter, one of the plurality of image pixels of the captured image frame 40 (shown in "Fig. 3") is taken as the judgment pixel. Supposing this judgment pixel has Y and I coordinate values of 0 and 0, the Mahalanobis distance between the judgment pixel and the center point is calculated as follows:

    D_1 = sqrt( v^T Σ^{-1} v ),  where v = [0 - 225.0835, 0 - 40.8868]^T  and  Σ = 10^3 × |  1.2096  -0.7754 |
                                                                                          | -0.7754   0.6772 |
        = 15.23

Since this value is greater than the threshold value, the pixel is judged not to be a fire pixel, which completes the judgment of one image pixel.

Regarding the selection of judgment pixels, please refer to "Fig. 4," which shows another captured image frame 41 containing a flame pixel 42; no such flame pixel exists in the preceding image frame 40. Therefore, when the judgment pixels are chosen by comparing the two successive image frames 40 and 41 and the Mahalanobis distance is computed only for the changed image pixels, some computation time can be saved. After its Mahalanobis distance is calculated, this flame pixel has a value of 3.25, smaller than the threshold value, and is therefore judged to be a fire pixel.

As for the physical meaning of the method of the present invention, it applies cluster analysis (classification) from statistics, using the Mahalanobis distance to compute directly the distance from every fire pixel of the fire image sample to the center value. Experiments showed that when computing with the Mahalanobis distance, the coordinates of the color space are best chosen as luminance paired with the in-phase chrominance component, or luminance paired with the quadrature chrominance component, for better results. Hence, once the threshold is defined, each judgment pixel of a newly captured image frame whose distance to the center point of the sample is smaller than the threshold belongs to the same cluster as the fire pixels of the fire image sample, and fire pixels are judged accordingly.
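The arithmetic of the worked example can be checked numerically. A sketch assuming NumPy, using only the figures quoted above (the center point (225.0835, 40.8868), the first sample pixel (79.1697, 81.8977), and the pooled covariance matrix):

```python
import numpy as np

center = np.array([225.0835, 40.8868])      # (Y, I) center point from step S14
x1     = np.array([79.1697, 81.8977])       # first sample pixel X_1

d = x1 - center
sigma1 = np.outer(d, d)                     # per-pixel term (X_1 - Mean)(X_1 - Mean)^T
# sigma1 reproduces 10^4 x [[2.1291, -0.5984], [-0.5984, 0.1682]]

cov = 1e3 * np.array([[ 1.2096, -0.7754],
                      [-0.7754,  0.6772]])  # pooled covariance with identity added

v = np.array([0.0, 0.0]) - center           # judgment pixel with Y = I = 0
dist = np.sqrt(v @ np.linalg.inv(cov) @ v)  # near 15.23 > threshold 6: not fire
```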
Please refer to "Fig. 5," a schematic view of a second embodiment of the fire image sample 31 of the present invention, here a smoke image sample. A fire is usually accompanied by smoke, so a smoke image sample is also an important basis for fire-pixel judgment; its image detection method is the same as described above and is not repeated.

The aforementioned fire image samples 30, 31 may also be a flame image sample and a smoke image sample used together. In that case the center point comprises a flame center point and a smoke center point, the sample distances comprise a flame sample distance and a smoke sample distance, the pixel distance comprises a flame pixel distance and a smoke pixel distance, and the threshold value comprises a flame threshold value and a smoke threshold value, so that the captured image frame is judged to contain a fire pixel only when the flame pixel distance is smaller than the flame threshold value and the smoke pixel distance is smaller than the smoke threshold value. The judgment is thus stricter, requiring both flame and smoke before a fire pixel is declared.

The fire image sample 30 and the captured image frames 40, 41 are preferably captured by the same image capture system and hardware, so as to eliminate differences between hardware.

For an application of the present invention, please refer to "Fig. 6," which combines the foregoing steps S12, S14, S16, and S18 into step S50, sample analysis, to set the threshold value; steps S20 and S22 become step S52, image capture, to calculate the pixel distance of the judgment pixel. Step S54 then compares the pixel distance with the threshold value: if the pixel distance is smaller than the threshold value, the image frame is recorded as a fire frame and counting begins (S55). After one image frame has been judged, step S56 determines whether thirty consecutive image frames all contain a fire frame; if so, an alarm is issued (S58); if not, step S59 is executed, and if the current image frame is not a fire frame the fire-frame count is reset to zero. In this way false alarms can be avoided. Of course, the design of counting thirty consecutive fire frames may be adjusted to different needs and may be set to twenty or more or fewer, to meet actual requirements.

Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Those skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

[Brief Description of the Drawings]

Fig. 1 is a flowchart of the fire image detection method of the present invention;
Fig. 2 is a schematic view of a first embodiment of the fire image sample of the present invention;
Fig. 3 is a schematic view of a captured image frame;
Fig. 4 is a schematic view of another captured image frame;
Fig. 5 is a schematic view of a second embodiment of the fire image sample of the present invention; and
Fig. 6 is a flowchart of an application example of the present invention.

[Description of Main Reference Numerals]

30, 31: fire image samples
40, 41: image frames
42: flame pixel
Step S12: convert each sample pixel of the fire image sample into a color model (color space)
Step S14: calculate the center point of the sample pixels
Step S16: calculate the Mahalanobis distance between each sample pixel and the center point to obtain sample distance values
Step S18: define a threshold value according to the sample distance values
Step S20: take one of the image pixels of the image frame as a judgment pixel
Step S22: calculate the Mahalanobis distance between the judgment pixel and the center point to obtain a pixel distance
Step S24: compare the pixel distance with the threshold value and judge that the image frame contains a fire pixel when the pixel distance is smaller than the threshold value
Step S50: sample analysis
Step S52: image capture
Step S54: fire-pixel judgment
Step S55: fire-frame counting
Step S56: fire-frame count reaches thirty
Step S58: issue an alarm
Step S59: if the current image frame is not a fire frame, reset the fire-frame count to zero
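The alarm logic of Fig. 6 (steps S55 through S59) can be sketched as a frame loop. This is a minimal sketch in which each boolean in `frame_judgments` stands for the per-frame result of steps S52-S54 and is assumed to be supplied by the caller:

```python
def monitor(frame_judgments, required=30):
    """Steps S55-S59: count consecutive fire frames and alarm at `required`.

    frame_judgments: iterable of booleans, one per captured frame, True when
    the frame was judged (steps S52-S54) to contain a fire pixel.
    """
    count = 0
    for frame_has_fire in frame_judgments:
        if frame_has_fire:
            count += 1                 # S55: extend the fire-frame count
            if count >= required:
                return True            # S56/S58: thirty in a row -> alarm
        else:
            count = 0                  # S59: a non-fire frame resets the count
    return False
```

The required count of thirty matches the embodiment; as the description notes, it may equally be set to twenty or another value.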

Claims (1)

1313848
X. Patent claims:

1. A fire image detection method, which compares a fire image sample with at least one captured image frame to determine whether the captured image frame has a fire pixel, the fire image detection method comprising:
   converting each sample pixel of the fire image sample into a color model (color space), the color model having a first coordinate and a second coordinate, each sample pixel having a first coordinate value and a second coordinate value corresponding to the first coordinate and the second coordinate;
   calculating a center point of the first coordinate values and the second coordinate values of the sample pixels;
   calculating the Mahalanobis distance between each sample pixel of the fire image sample and the center point to obtain a plurality of sample distance values;
   defining a threshold value according to the sample distance values;
   taking one of a plurality of image pixels of the captured image frame as a determination pixel;
   calculating the Mahalanobis distance between the determination pixel and the center point to obtain a pixel distance; and
   comparing the pixel distance with the sample distance values, and determining that the captured image frame has the fire pixel when the pixel distance is less than the threshold value.

2. The fire image detection method of claim 1, wherein the color model is a luminance/chrominance color model (YIQ domain), the first coordinate is a luminance (Y) coordinate, and the second coordinate is an in-phase chrominance (I) coordinate.

3. The fire image detection method of claim 2, wherein the first coordinate value of each sample pixel is a luminance value and the second coordinate value of each sample pixel is an in-phase chrominance value.

4. The fire image detection method of claim 1, wherein the Mahalanobis distance is D(xᵢ) = (xᵢ − μ)ᵀ Σ⁻¹ (xᵢ − μ), where xᵢ is the vector of the first coordinate value and the second coordinate value, μ is the vector of the first coordinate value and the second coordinate value of the center point, Σ⁻¹ is the inverse of the pooled within-group covariance matrix, and D(xᵢ) is the Mahalanobis distance of xᵢ.

5. The fire image detection method of claim 1, wherein the threshold value is greater than 95% of the sample distance values.

6. The fire image detection method of claim 1, wherein the threshold value is the median of the sample distance values.

7. The fire image detection method of claim 1, wherein the threshold value is the maximum of the sample distance values.

8. The fire image detection method of claim 1, wherein the threshold value is the mean of the sample distance values.

9. The fire image detection method of claim 1, wherein the threshold value is the mean of the sample distance values multiplied by a safety factor.

10. The fire image detection method of claim 9, wherein the safety factor is 1.2.

11. The fire image detection method of claim 1, wherein the threshold value is the mean of the sample distance values plus three standard deviations of the sample distance values.

12. The fire image detection method of claim 1, wherein the step of taking one of the image pixels of the captured image frame as the determination pixel takes, as the determination pixel, one of the image pixels of the captured image frame that differ from a preceding image frame.
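Claims 1, 4 and 6–11 together describe fitting a center point and a covariance matrix to fire-sample pixels and then deriving a threshold from the sample Mahalanobis distances. A minimal NumPy sketch of that sample-analysis stage follows; the (Y, I) values are invented placeholders, not the patent's data:

```python
import numpy as np

# Hypothetical (Y, I) values standing in for fire-sample pixels.
samples = np.array([[180.0, 60.0], [190.0, 55.0], [170.0, 65.0],
                    [185.0, 58.0], [175.0, 62.0]])

mu = samples.mean(axis=0)                               # center point: arithmetic means (claim 19)
cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))  # inverse covariance matrix

def mahalanobis_sq(x, mu, cov_inv):
    """Claim 4's D(x) = (x - mu)^T Sigma^-1 (x - mu) (the squared form)."""
    d = np.asarray(x) - mu
    return float(d @ cov_inv @ d)

sample_d = np.array([mahalanobis_sq(s, mu, cov_inv) for s in samples])

# Several of the claimed threshold choices (claims 6-11):
thr_median = float(np.median(sample_d))              # claim 6
thr_max    = float(sample_d.max())                   # claim 7
thr_mean   = float(sample_d.mean())                  # claim 8
thr_safety = thr_mean * 1.2                          # claims 9-10
thr_3sigma = thr_mean + 3.0 * float(sample_d.std())  # claim 11
```

Each threshold choice trades sensitivity for false-alarm rate: the maximum (claim 7) accepts everything as close as the farthest training sample, while the mean (claim 8) is stricter, and the safety factor or three-sigma margin (claims 9–11) sit in between.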
13. The fire image detection method of claim 1, wherein the step of calculating the Mahalanobis distance between the determination pixel and the center point to obtain the pixel distance first converts the determination pixel into the first coordinate value and the second coordinate value, and then calculates the Mahalanobis distance between the determination pixel and the center point.

14. The fire image detection method of claim 1, wherein the fire image sample is a flame image sample.

15. The fire image detection method of claim 1, wherein the fire image sample is a smoke image sample.

16. The fire image detection method of claim 1, wherein the fire image sample is a flame image sample and a smoke image sample, the center point comprises a flame center point and a smoke center point, the sample distance values comprise flame sample distances and smoke sample distances, the pixel distance comprises a flame pixel distance and a smoke pixel distance, and the threshold value comprises a flame threshold value and a smoke threshold value, so that the captured image frame is determined to have a fire pixel when the flame pixel distance is less than the flame threshold value and the smoke pixel distance is less than the smoke threshold value.

17. The fire image detection method of claim 1, further comprising the step of issuing an alarm when thirty consecutive captured image frames all have the fire pixel.

18. The fire image detection method of claim 1, further comprising, before the step of calculating the Mahalanobis distance between the determination pixel and the center point to obtain the pixel distance, the step of converting the determination pixel into the color model.

19. The fire image detection method of claim 1, wherein the center point is the arithmetic mean of the first coordinate values of the sample pixels and the arithmetic mean of the second coordinate values of the sample pixels.
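Claims 2–3, 13 and 18 require converting the determination pixel into the luminance/in-phase (Y, I) pair before the distance test. A sketch using the standard NTSC RGB→YIQ coefficients — an assumption on my part, since the patent only names the coordinates — with invented placeholder statistics standing in for a real sample analysis:

```python
import numpy as np

def rgb_to_yi(rgb):
    """Project an RGB pixel onto (Y, I) using the standard NTSC YIQ
    coefficients -- an assumption; the patent only names the coordinates."""
    r, g, b = rgb
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    return np.array([y, i])

def is_fire_pixel(rgb, mu, cov_inv, threshold):
    """Claims 13/18: convert the determination pixel into the color model
    first, then compare its (squared) Mahalanobis distance with the
    threshold (claim 1)."""
    x = rgb_to_yi(rgb)
    return float((x - mu) @ cov_inv @ (x - mu)) < threshold

# Invented placeholder statistics standing in for a real sample analysis:
mu = rgb_to_yi((255.0, 120.0, 30.0))     # pretend center point (an orange tone)
cov_inv = np.linalg.inv(np.array([[400.0, 0.0], [0.0, 100.0]]))
assumed_threshold = 2.0

near_fire = is_fire_pixel((250.0, 125.0, 35.0), mu, cov_inv, assumed_threshold)
far_blue  = is_fire_pixel((0.0, 0.0, 255.0), mu, cov_inv, assumed_threshold)
```

A pixel close to the sample center in (Y, I) passes the test while a distant one (e.g. pure blue) does not, which is the whole of the per-pixel decision in claim 1.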
TW96105521A 2007-02-14 2007-02-14 A fire image detection method TWI313848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW96105521A TWI313848B (en) 2007-02-14 2007-02-14 A fire image detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW96105521A TWI313848B (en) 2007-02-14 2007-02-14 A fire image detection method

Publications (2)

Publication Number Publication Date
TW200834476A TW200834476A (en) 2008-08-16
TWI313848B true TWI313848B (en) 2009-08-21

Family

ID=44819490

Family Applications (1)

Application Number Title Priority Date Filing Date
TW96105521A TWI313848B (en) 2007-02-14 2007-02-14 A fire image detection method

Country Status (1)

Country Link
TW (1) TWI313848B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI453679B (en) * 2009-01-16 2014-09-21 Hon Hai Prec Ind Co Ltd Image matching system and method
TWI420423B (en) * 2011-01-27 2013-12-21 Chang Jung Christian University Machine vision flame identification system and method
TWI486883B (en) * 2011-11-28 2015-06-01 Ind Tech Res Inst Method for extractive features of flame image

Also Published As

Publication number Publication date
TW200834476A (en) 2008-08-16

Similar Documents

Publication Publication Date Title
US8774503B2 (en) Method for color feature extraction
Kumar et al. A comparative study of different color spaces for foreground and shadow detection for traffic monitoring system
US7606414B2 (en) Fusion of color space data to extract dominant color
TWI419082B (en) Moving object detection method and image processing system for moving object detection
JP2006350557A5 (en)
JPH11110559A (en) Method for detecting object and removing background, and record medium recorded with program
WO2016037422A1 (en) Method for detecting change of video scene
WO2016115829A1 (en) Banknote classification and identification method and device based on lab color space
WO2019076326A1 (en) Shadow detection method and system for surveillance video image, and shadow removing method
CN107180439B (en) Color cast characteristic extraction and color cast detection method based on Lab chromaticity space
CN106651966B (en) Picture color identification method and system
CN106067177A (en) HDR scene method for detecting and device
CN106599880A (en) Discrimination method of the same person facing examination without monitor
Li et al. A color cast detection algorithm of robust performance
CN111766248A (en) Steel seal on-line detection system and method based on color CCD
CN103841295B (en) Image processing apparatus
TWI313848B (en) A fire image detection method
TWI586954B (en) An apparatus for detecting cells being infected by human papillomavirus (hpv) and an detection method therefor
JP2010008159A (en) Visual inspection processing method
CN113744326A (en) Fire detection method based on seed region growth rule in YCRCB color space
CN101056350A (en) Digital image evidence collecting method for detecting the multiple tampering based on the tone mode
US9165218B2 (en) Distance-based image analysis
CN105761282B (en) The detection method and device of image color cast
JP4459945B2 (en) Extraction color range setting method
CN112949367A (en) Method and device for detecting color of work clothes based on video stream data

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees