CN113744326B - Fire detection method based on seed region growth rule in YCRCB color space - Google Patents

Fire detection method based on seed region growth rule in YCRCB color space

Info

Publication number
CN113744326B
CN113744326B
Authority
CN
China
Prior art keywords
fire
image
pixel
flame
channel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110982257.1A
Other languages
Chinese (zh)
Other versions
CN113744326A (en)
Inventor
杨雄飞
管一弘
王端生
杨建菊
余天兵
李志强
廖礼玲
杨亚飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kunming University of Science and Technology
Original Assignee
Kunming University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kunming University of Science and Technology filed Critical Kunming University of Science and Technology
Priority to CN202110982257.1A
Publication of CN113744326A
Application granted
Publication of CN113744326B
Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/60 - Analysis of geometric attributes
    • G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/11 - Region-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/136 - Segmentation; Edge detection involving thresholding
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10024 - Color image
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 40/00 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A 40/10 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A 40/28 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Fire-Detection Mechanisms (AREA)

Abstract

The invention relates to a fire detection method based on a seed region growth rule in the YCRCB color space, and belongs to the technical field of fire detection. The method comprises: collecting images, preprocessing the frame images, selecting image seeds in the YCrCb channels by using the color characteristics of fire, growing the obtained seeds into regions according to a gray-value and chroma-difference similarity criterion, and judging the dynamic characteristics of the flame image on the grown regions to obtain the final fire judgment result. The invention solves the problems of low accuracy and high computational cost in existing fire recognition, and can effectively improve the speed and accuracy of fire detection.

Description

Fire detection method based on seed region growth rule in YCRCB color space
Technical Field
The invention relates to a fire detection method based on a seed region growth rule in a YCRCB color space, and belongs to the technical field of fire detection.
Background
Fire is a disaster event that rapidly causes significant injury and property damage. To reduce such disasters, early fire detection free of false alarms is critical. Accordingly, various automatic fire detection techniques have been developed and are widely used in real life.
Most existing fire-region detection algorithms are based on RGB color images and on simple static and dynamic characteristics of fire. However, fire images are strongly affected by background illumination, and traditional RGB-channel detection methods perform poorly when the background is similar in color to the fire. In addition, the traditional methods are computationally expensive, requiring several pixel-by-pixel passes over the whole image.
Disclosure of Invention
The invention provides a fire detection method based on a seed region growing rule in the YCRCB color space, which is used for solving the problems of low detection efficiency, low accuracy, high computational cost and slow detection of fires against bright backgrounds in existing fire detection algorithms.
The technical scheme of the invention is as follows: a fire detection method based on a seed region growth rule in the YCRCB color space comprises: collecting images, preprocessing the frame images, selecting image pixel seeds in the YCrCb channels by using the color characteristics of the fire, growing the obtained seeds into regions according to a gray-value and chroma-difference similarity criterion, and judging the dynamic characteristics of the flame image on the grown regions to obtain the final fire judgment result.
As a further aspect of the present invention, the specific steps of the method are as follows:
step1, acquiring suspected fire images through image acquisition equipment, and then performing image preprocessing;
step2, converting the processed color image from RGB to YCrCb space;
step3, selecting an image pixel seed in the YCrCb space based on a pixel similarity criterion;
step4, carrying out region growth on the selected seed region in the YCrCb space according to the growth criterion of the seeds; the growth criterion of the seeds takes the gray level difference and the chromaticity difference of the images as judging conditions;
step5, performing region segmentation on the YCrCb channel based on the flame image;
step6, after the image area is segmented, the problems of over-segmentation and under-segmentation are solved by dynamically adjusting the gray level difference threshold value and the chroma difference threshold value in the area growth criterion;
step7, screening the characteristics of the YCrCb channels of the image, and further removing image regions that merely resemble the fire image;
step8, analyzing the dynamic characteristics of the suspected flame area to obtain a final fire discrimination result.
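For illustration only, the color-based front end of this pipeline (Step1, Step2, and the color-channel part of Step3) can be sketched in Python/OpenCV as below. This is a minimal sketch under stated assumptions: the Gaussian blur used as preprocessing, the function name and the use of OpenCV are choices made for the example, not details taken from the patent.

```python
import cv2
import numpy as np

def flame_color_candidates(frame_bgr):
    """Steps 1-3 (simplified): preprocess, convert to YCrCb and apply the
    flame color constraint Y > Cb and Cr > Cb to mark seed candidates."""
    # Step1: preprocessing -- Gaussian filtering as a stand-in for the
    # patent's image filtering / enhancement (illustrative choice)
    smoothed = cv2.GaussianBlur(frame_bgr, (5, 5), 0)
    # Step2: convert to YCrCb (OpenCV frames are BGR)
    ycrcb = cv2.cvtColor(smoothed, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)
    # Step3 color constraint from the method: Y(x,y) > Cb(x,y) and Cr(x,y) > Cb(x,y)
    candidates = (y > cb) & (cr > cb)
    return candidates  # boolean mask of seed candidates
```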
As a further aspect of the present invention, the image preprocessing in Step1 includes image filtering and image enhancement.
As a further scheme of the present invention, in Step3, in addition to selecting the image pixel seeds with a pixel-based similarity criterion, a constraint is added according to the color channels of the flame image. The color-channel constraint is:
Y(x,y) > Cb(x,y)
Cr(x,y) > Cb(x,y)
where Y(x,y), Cr(x,y) and Cb(x,y) are the luminance, red-difference chroma and blue-difference chroma values, respectively, at spatial location (x,y). According to the statistical rule of flame images, in the YCrCb color channels the Y channel value is larger than the Cb channel value, and the Cr channel value is larger than the Cb channel value.
Second, the pixel-based similarity criterion includes the following:
(1) Similarity criterion: for each pixel, the standard deviation of its neighborhood is computed on each channel,
σ_x = sqrt( (1/N) · Σ_{i=1..N} (x_i - x̄)² ),  x ∈ {Y, Cr, Cb}
where x_i are the values of the neighborhood pixels around the pixel on the Y, Cr and Cb channels, x̄ is the corresponding neighborhood mean on each channel, and σ_x is the standard deviation on each channel, denoted σ_Y, σ_Cr and σ_Cb; the total standard deviation is denoted σ.
The normalized similarity is:
σ_N = σ/σ_max
H = 1 - σ_N
where σ_max is the maximum standard deviation in the image and H represents the similarity between the pixel and its neighborhood.
If H < H_threshold, the pixel is taken as a seed region; the threshold H_threshold is obtained from statistical experimental data and can be adjusted dynamically for different fire backgrounds.
(2) The maximum relative Euclidean color distance between a region and its neighborhood is smaller than a threshold. The relative Euclidean distance is:
d_i = sqrt( (Y - Y_i)² + (Cb - Cb_i)² + (Cr - Cr_i)² ),  i = 1, 2, …, 8
where d_i is the Euclidean distance on the YCbCr channels between a pixel and its i-th 8-neighborhood pixel, Y, Cb and Cr are the values of the pixel on the Y, Cb and Cr channels, Y_i, Cb_i and Cr_i are the values of the i-th neighborhood pixel on those channels, and d_max is the maximum of the Euclidean distances between the pixel and its eight neighborhood pixels, which must satisfy:
d_max <= d_threshold
where the Euclidean color threshold d_threshold is determined by statistical experimental rules.
As a further aspect of the present invention, in Step4 the judging condition is specifically:
Condition 1: if the gray-level difference and the chroma difference between two adjacent regions are both smaller than the corresponding thresholds, the two regions are merged.
The method comprises the following steps:
4.1, scanning the image line by line to find a pixel that has not yet been assigned to a region;
4.2, taking this pixel as the center, examining its neighborhood pixels one by one and comparing each with the center pixel, and merging the neighborhood pixel with the center pixel if the gray-level difference is smaller than the predetermined threshold;
4.3, taking the newly merged pixels as new centers and repeating the check of step 4.2 until the region cannot be expanded further;
4.4, returning to step 4.1 and continuing the scan until no unassigned pixel can be found, at which point the whole growth process ends.
As a further aspect of the present invention, step7 specifically includes the following screening criteria:
(1) The Y channel value of a fire pixel is larger than its K-neighborhood mean, while its Cr and Cb channel values are smaller than their K-neighborhood means;
(2) When the difference between the Cb and Cr channel values at a point is larger than a certain threshold, the point is classified as a fire region.
In Step8, the dynamic characteristics of the suspected flame region are analyzed, specifically the flame disorder, the flame motion characteristics, the randomness of the fire-region area, the sharp corners of the fire edge and the circularity of the flame, and the final feature analysis is used to determine whether a real fire is present.
As a further aspect of the present invention, step8 specifically includes:
8.1 detection of motion regions in images
The frame-difference formula is:
d(i,j) = g_n(i,j) - g_0(i,j)
where g_n(i,j) is the n-th frame of the input image and g_0(i,j) is the background image updated during monitoring, both represented as gray-scale images; the difference between the two gray-scale images is computed, and when the difference is greater than a threshold T1 the pixel is judged to be moving, otherwise it is judged to be non-moving. The final motion decision for the latest image, g_result, is defined as:
g_result(i,j) = 1 if d(i,j) > T1, and g_result(i,j) = 0 otherwise
where T1 is a threshold selected from statistical and empirical rules; T1 determines the sensitivity of the algorithm to single-pixel changes.
8.2 detecting randomness of the size of the fire region image area
In image processing, the region size is the number of pixels of an object. As the flame flickers, the edges jump and the area of the suspicious region changes from frame to frame while the contours remain similar, so the randomness of the region size is calculated. Let A_i be the area of the potential fire region in the current frame and A_{i-1} the area of the potential fire region in the previous frame; a randomness measure a_i is computed from these consecutive areas. If the hard decision rule a_i > λ holds, a fire is assumed, where λ is a decision threshold obtained from statistical experimental rules.
8.3 randomness of edges
From a geometric standpoint, the edges of adjacent frames of a flame sequence are unstable, but the overall edges have a stable similarity. The Sobel operator is used to obtain the edges of the suspected fire region, and an edge-similarity measure is then computed between b_i(x,y), the flame edge pixels in the current frame, and b_{i+1}(x,y), the flame edge pixels in the following frame. In the fire detection process a threshold θ is selected according to statistical experimental rules; when the edge similarity is greater than θ, a fire is judged to be occurring. The edge similarity reflects the similarity of the flame shape, its spatial variation and the variation of its spatial distribution, and is used to distinguish common interference targets such as fixed bright highlights, fast-moving flame-colored objects and large-area illumination changes.
8.4 detecting fire image acute angle corner
The flame shape in a video image appears as multiple layers of closed contours with one or more sharp corners on the contours. A sharp corner must satisfy two conditions: it must have a vertex, and the values on both of its sides must be greater than a threshold L. In the event of a fire there must be many sharp corners on the outline of the potential fire area; the minimum required number N_min is also obtained from experimental rules.
8.5 circularity of target
Circularity is a parameter used to measure the roundness or area complexity of an object. It is calculated from the object's area and perimeter as:
C = 4π·So / P²
where So is the area of the target and P is its perimeter. Flames generally have complex, irregular shapes, while other fire-like disturbances have regular shapes and therefore high circularity. If the circularity of a potential target is smaller than a threshold Ct, a fire is judged to exist in the image. The extraction of sharp corners and the calculation of the target's circularity are both criteria based on the polygonal, irregular character of the flame.
The beneficial effects of the invention are as follows: the invention removes a large number of fire-irrelevant pixels before growing the fire pixel seed regions with the seed-region growing method, which reduces the computational load; the method can effectively improve the speed and accuracy of fire detection.
Drawings
FIG. 1 is a flow chart of a seed region selection criterion in the present invention;
FIG. 2 is a flow chart of a region growing rule in the present invention;
FIG. 3 is a flow chart of image region segmentation in the present invention;
FIG. 4 is a flow chart of dynamic analysis according to the present invention;
FIG. 5 is a flow chart of the overall analysis of fire images according to the present invention.
Detailed Description
Example 1: As shown in FIGS. 1-5, a fire detection method based on a seed region growth rule in the YCRCB color space comprises: collecting images, preprocessing the frame images, selecting image pixel seeds in the YCrCb channels by using the color characteristics of the fire, growing the obtained seeds into regions according to a gray-value and chroma-difference similarity criterion, and judging the dynamic characteristics of the flame image on the grown regions to obtain the final fire judgment result.
As a further aspect of the present invention, the specific steps of the method are as follows:
step1, acquiring suspected fire images through image acquisition equipment, and then performing image preprocessing, ensuring that noise is removed before the subsequent processing steps and providing a high-quality image for later operations;
step2, converting the processed color image from RGB to the YCrCb color space, since YCbCr separates luminance and chrominance more effectively; the conversion from RGB to YCrCb can follow the standard definition (for example, ITU-R BT.601).
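As an illustration of this conversion, the sketch below uses the standard ITU-R BT.601 full-range coefficients; the patent text does not reproduce its exact matrix here, so treating BT.601 as the conversion is an assumption based on the common definition (it is also what OpenCV's cv2.COLOR_RGB2YCrCb implements).

```python
import numpy as np

def rgb_to_ycrcb(rgb):
    """Convert an RGB image (uint8, HxWx3) to YCrCb using ITU-R BT.601
    full-range coefficients; Y, Cr, Cb each lie in [0, 255]."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cr = 0.5 * r - 0.4187 * g - 0.0813 * b + 128.0
    cb = -0.1687 * r - 0.3313 * g + 0.5 * b + 128.0
    return np.clip(np.dstack([y, cr, cb]), 0, 255).astype(np.uint8)
```

In practice the whole step reduces to a single call such as cv2.cvtColor(img_rgb, cv2.COLOR_RGB2YCrCb).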
step3, selecting image pixel seeds in the YCrCb color space based on a pixel similarity criterion; the criterion can be summarized as Y >= Cr >= Cb;
step4, carrying out region growth on the selected seed region in the YCrCb space according to the growth criterion of the seeds; the growth criterion of the seeds takes the gray level difference and the chromaticity difference of the images as judging conditions;
step5, performing region segmentation on the YCrCb channel based on the flame image;
step6, after the image area is segmented, the problems of over-segmentation and under-segmentation are solved by dynamically adjusting the gray level difference threshold value and the chroma difference threshold value in the area growth criterion;
step7, screening the characteristics of the YCrCb channels of the image, and further removing image regions that merely resemble the fire image;
step8, analyzing the dynamic characteristics of the suspected flame area to obtain a final fire discrimination result.
As a further aspect of the present invention, the image preprocessing in Step1 includes image filtering and image enhancement.
As a further scheme of the present invention, in Step3, in addition to selecting the image pixel seeds with a pixel-based similarity criterion, a constraint is added according to the color channels of the flame image. The color-channel constraint is:
Y(x,y) > Cb(x,y)
Cr(x,y) > Cb(x,y)
where Y(x,y), Cr(x,y) and Cb(x,y) are the luminance, red-difference chroma and blue-difference chroma values, respectively, at spatial location (x,y). According to the statistical rule of flame images, in the YCrCb color channels the Y channel value is larger than the Cb channel value, and the Cr channel value is larger than the Cb channel value.
Second, the pixel-based similarity criterion includes the following:
(1) Similarity criterion: for each pixel, the standard deviation of its neighborhood is computed on each channel,
σ_x = sqrt( (1/N) · Σ_{i=1..N} (x_i - x̄)² ),  x ∈ {Y, Cr, Cb}
where x_i are the values of the neighborhood pixels around the pixel on the Y, Cr and Cb channels, x̄ is the corresponding neighborhood mean on each channel, and σ_x is the standard deviation on each channel, denoted σ_Y, σ_Cr and σ_Cb; the total standard deviation is denoted σ.
The normalized similarity is:
σ_N = σ/σ_max
H = 1 - σ_N
where σ_max is the maximum standard deviation in the image and H represents the similarity between the pixel and its neighborhood.
If H < H_threshold, the pixel is taken as a seed region; the threshold H_threshold is obtained from statistical experimental data and can be adjusted dynamically for different fire backgrounds.
(2) The maximum relative Euclidean color distance between a region and its neighborhood is smaller than a threshold. The relative Euclidean distance is:
d_i = sqrt( (Y - Y_i)² + (Cb - Cb_i)² + (Cr - Cr_i)² ),  i = 1, 2, …, 8
where d_i is the Euclidean distance on the YCbCr channels between a pixel and its i-th 8-neighborhood pixel, Y, Cb and Cr are the values of the pixel on the Y, Cb and Cr channels, Y_i, Cb_i and Cr_i are the values of the i-th neighborhood pixel on those channels, and d_max is the maximum of the Euclidean distances between the pixel and its eight neighborhood pixels, which must satisfy:
d_max <= d_threshold
where the Euclidean color threshold d_threshold is determined by statistical experimental rules. When the threshold is large, the image may be divided into only a few large blocks; when it is small, over-segmentation may occur. The main purpose of this condition is to ensure that the seed regions do not fall on boundaries. At this point, the seeds in the image can be classified into two types: fire-like seeds and non-fire-like seeds.
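The seed-selection rules above can be sketched as follows. The window size, the thresholds h_thr and d_thr, and the choice of combining the three per-channel standard deviations into the total σ by a Euclidean sum are assumptions made for the example; the patent only states that such thresholds are obtained from statistical experiments.

```python
import cv2
import numpy as np

def select_seeds(ycrcb, h_thr=0.2, d_thr=40.0, win=3):
    """Sketch of the seed-selection rules: flame color constraint,
    neighborhood similarity H, and maximum color distance to the
    8-neighborhood. Thresholds h_thr and d_thr are illustrative."""
    y, cr, cb = [c.astype(np.float32) for c in cv2.split(ycrcb)]
    chans = [y, cr, cb]

    def local_std(c):
        # per-channel standard deviation over a win x win neighborhood
        mean = cv2.blur(c, (win, win))
        mean_sq = cv2.blur(c * c, (win, win))
        return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

    # Total standard deviation (assumed Euclidean combination of the three
    # per-channel deviations) and normalized similarity H = 1 - sigma/sigma_max
    sigma = np.sqrt(sum(local_std(c) ** 2 for c in chans))
    h = 1.0 - sigma / (sigma.max() + 1e-6)

    # Maximum Euclidean color distance to the eight neighborhood pixels
    d_max = np.zeros_like(y)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            sy = np.roll(np.roll(y, dy, axis=0), dx, axis=1)
            scr = np.roll(np.roll(cr, dy, axis=0), dx, axis=1)
            scb = np.roll(np.roll(cb, dy, axis=0), dx, axis=1)
            d = np.sqrt((y - sy) ** 2 + (cr - scr) ** 2 + (cb - scb) ** 2)
            d_max = np.maximum(d_max, d)

    color_ok = (y > cb) & (cr > cb)            # flame color constraint
    return color_ok & (h < h_thr) & (d_max <= d_thr)
```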
as a further aspect of the present invention, in Step4, the determination condition is specifically:
condition 1: if the gray level difference threshold value and the chromaticity difference threshold value of the two adjacent areas are smaller than a certain threshold value, merging the two areas;
the method comprises the following steps:
4.1, scanning the image line by line to find out the pixel which has not belonged;
4.2, taking the pixel as a center, checking adjacent pixels, namely comparing the pixels in the adjacent pixels with the pixels one by one, and merging the center pixel and the field pixel if the gray level difference is smaller than a predetermined threshold value;
4.3, taking the newly combined pixels as the center, and then carrying out detection in the step 4.2 until the area cannot be further expanded;
4.4, returning to the step 4.1 again, continuing scanning until no belonging pixels can be found, and ending the whole growth process.
After the growth is finished, the region segmentation is finished, the obtained image has the problem of over segmentation or under segmentation, and the gray level difference threshold value and the chroma difference threshold value of the region growth criterion can be adjusted.
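A minimal sketch of the growth loop in steps 4.1-4.4 is given below, assuming 8-connectivity and a queue-based traversal; the gray-level and chroma thresholds are illustrative placeholders.

```python
from collections import deque
import numpy as np

def grow_regions(ycrcb, seed_mask, gray_thr=15.0, chroma_thr=10.0):
    """Grow labeled regions from seed pixels: a neighbor joins the region of
    its center pixel when the Y difference and the Cr/Cb differences are
    below the thresholds (steps 4.1-4.4)."""
    h, w, _ = ycrcb.shape
    img = ycrcb.astype(np.float32)
    labels = np.zeros((h, w), dtype=np.int32)   # 0 = not yet assigned
    next_label = 1
    neighbors = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                 (0, 1), (1, -1), (1, 0), (1, 1)]

    for sy, sx in zip(*np.nonzero(seed_mask)):   # 4.1 scan for unassigned seeds
        if labels[sy, sx]:
            continue
        labels[sy, sx] = next_label
        queue = deque([(sy, sx)])
        while queue:                             # 4.2-4.3 grow around new members
            cy, cx = queue.popleft()
            for dy, dx in neighbors:
                ny, nx = cy + dy, cx + dx
                if not (0 <= ny < h and 0 <= nx < w) or labels[ny, nx]:
                    continue
                d_gray = abs(img[ny, nx, 0] - img[cy, cx, 0])          # Y difference
                d_chroma = max(abs(img[ny, nx, 1] - img[cy, cx, 1]),   # Cr difference
                               abs(img[ny, nx, 2] - img[cy, cx, 2]))   # Cb difference
                if d_gray < gray_thr and d_chroma < chroma_thr:
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
        next_label += 1                          # 4.4 continue with the next seed
    return labels
```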
As a further aspect of the present invention, step7 specifically includes the following screening criteria:
(1) The Y channel value of a fire pixel is larger than its K-neighborhood mean, while its Cr and Cb channel values are smaller than their K-neighborhood means;
(2) When the difference between the Cb and Cr channel values at a point is larger than a certain threshold, the point is classified as a fire region.
Specifically, several coarse features of the image on the YCrCb channels are analyzed to remove part of the irrelevant regions.
Statistical analysis of flame images shows that, on the YCrCb channels, flame images differ from other classes of images as follows:
Feature 1:
Y(x,y) > Y_mean, Cr(x,y) < Cr_mean, Cb(x,y) < Cb_mean
where Y(x,y), Cr(x,y) and Cb(x,y) are the values of a pixel on the Y, Cr and Cb channels, and Y_mean, Cr_mean and Cb_mean are the corresponding K-neighborhood means of the image on those channels. That is, the Y channel value of a fire pixel is generally larger than its K-neighborhood mean, while the Cr and Cb channel values are smaller than their K-neighborhood means.
Feature 2:
As can readily be observed from a representative fire image (FIG. 4), there is a significant difference between the Cb and Cr components of flame pixels: the Cb component appears mainly "black" while the Cr component appears mainly "white". A rule can therefore be established based on this feature: a point is defined as a fire region only when its Cb and Cr values differ by more than a certain threshold.
In Step8, the dynamic characteristics of the suspected flame region are analyzed, specifically the flame disorder, the flame motion characteristics, the randomness of the fire-region area, the sharp corners of the fire edge and the circularity of the flame, and the final feature analysis is used to determine whether a real fire is present.
As a further aspect of the present invention, step8 specifically includes:
8.1 detection of motion regions in images
The frame-difference formula is:
d(i,j) = g_n(i,j) - g_0(i,j)
where g_n(i,j) is the n-th frame of the input image and g_0(i,j) is the background image updated during monitoring, both represented as gray-scale images; the difference between the two gray-scale images is computed, and when the difference is greater than a threshold T1 the pixel is judged to be moving, otherwise it is judged to be non-moving. The final motion decision for the latest image, g_result, is defined as:
g_result(i,j) = 1 if d(i,j) > T1, and g_result(i,j) = 0 otherwise
where T1 is a threshold selected from statistical and empirical rules; T1 determines the sensitivity of the algorithm to single-pixel changes.
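A sketch of this frame-difference test follows; the threshold T1 and the running-average background update are illustrative choices (the patent only states that the background image is updated during monitoring).

```python
import cv2
import numpy as np

def motion_mask(frame_gray, background_gray, t1=25, alpha=0.05):
    """Frame difference d = g_n - g_0; pixels whose difference exceeds T1 are
    marked as moving (g_result = 1). The background is refreshed with a
    simple running average as an illustrative update rule."""
    d = cv2.absdiff(frame_gray, background_gray)
    g_result = (d > t1).astype(np.uint8)          # 1 = moving, 0 = static
    updated_bg = cv2.addWeighted(background_gray, 1.0 - alpha, frame_gray, alpha, 0)
    return g_result, updated_bg
```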
8.2 detecting randomness of the size of the fire region image area
In image processing, the region size is the number of pixels of an object. As the flame flickers, the edges jump and the area of the suspicious region changes from frame to frame while the contours remain similar, so the randomness of the region size is calculated. Let A_i be the area of the potential fire region in the current frame and A_{i-1} the area of the potential fire region in the previous frame; a randomness measure a_i is computed from these consecutive areas. If the hard decision rule a_i > λ holds, a fire is assumed, where λ is a decision threshold obtained from statistical experimental rules.
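Since the exact area-randomness formula is not reproduced in this text, the sketch below uses one plausible measure, the relative area change between consecutive frames, purely as an assumption; λ plays the role of the decision threshold.

```python
def area_randomness(area_curr, area_prev, lam=0.2):
    """Assumed measure a_i = |A_i - A_{i-1}| / A_{i-1}; a fire is suspected
    when a_i exceeds the decision threshold lambda (illustrative value)."""
    if area_prev == 0:
        return False
    a_i = abs(area_curr - area_prev) / float(area_prev)
    return a_i > lam
```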
8.3 randomness of edges
From a geometric standpoint, the edges of adjacent frames of a flame sequence are unstable, but the overall edges have a stable similarity. The Sobel operator is used to obtain the edges of the suspected fire region, and an edge-similarity measure is then computed between b_i(x,y), the flame edge pixels in the current frame, and b_{i+1}(x,y), the flame edge pixels in the following frame. In the fire detection process a threshold θ is selected according to statistical experimental rules; when the edge similarity is greater than θ, a fire is judged to be occurring. The edge similarity reflects the similarity of the flame shape, its spatial variation and the variation of its spatial distribution, and is used to distinguish common interference targets such as fixed bright highlights, fast-moving flame-colored objects and large-area illumination changes.
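A sketch of this edge test: Sobel edge maps are extracted for two consecutive frames and compared. The patent's exact similarity formula is not reproduced in this text, so the Dice-style overlap score, the edge binarization rule and the threshold θ below are assumptions.

```python
import cv2
import numpy as np

def edge_similarity(gray_curr, gray_next, theta=0.6):
    """Extract Sobel edge maps b_i and b_{i+1} for two consecutive frames and
    compare them with an overlap score; values above theta are treated as
    flame-like edge behaviour."""
    def sobel_edges(gray):
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        mag = cv2.magnitude(gx, gy)
        return mag > 0.5 * mag.max()              # binary edge map

    b_i, b_next = sobel_edges(gray_curr), sobel_edges(gray_next)
    overlap = 2.0 * np.logical_and(b_i, b_next).sum()
    denom = b_i.sum() + b_next.sum() + 1e-6
    return (overlap / denom) > theta
```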
8.4 detecting fire image acute angle corner
The flame shape in a video image appears as multiple layers of closed contours with one or more sharp corners on the contours. A sharp corner must satisfy two conditions: it must have a vertex, and the values on both of its sides must be greater than a threshold L. In the event of a fire there must be many sharp corners on the outline of the potential fire area; the minimum required number N_min is also obtained from experimental rules.
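One way to realize the sharp-corner count is sketched below, approximating each contour with a polygon and counting vertices whose interior angle is acute; the use of cv2.approxPolyDP, the angle limit and n_min are illustrative stand-ins for the patent's corner conditions.

```python
import cv2
import numpy as np

def count_sharp_corners(mask, angle_deg=60.0, n_min=3):
    """Count acute polygon vertices on the contours of a binary fire mask and
    report whether at least n_min sharp corners are present."""
    contours, _ = cv2.findContours(mask.astype(np.uint8), cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    sharp = 0
    for cnt in contours:
        poly = cv2.approxPolyDP(cnt, 0.01 * cv2.arcLength(cnt, True), True)
        pts = poly.reshape(-1, 2).astype(np.float32)
        n = len(pts)
        for i in range(n):
            v1 = pts[i - 1] - pts[i]
            v2 = pts[(i + 1) % n] - pts[i]
            denom = np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-6
            ang = np.degrees(np.arccos(np.clip(np.dot(v1, v2) / denom, -1.0, 1.0)))
            if ang < angle_deg:
                sharp += 1
    return sharp >= n_min
```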
8.5 circularity of target
Circularity is a parameter used to measure the roundness or area complexity of an object. It is calculated from the object's area and perimeter as:
C = 4π·So / P²
where So is the area of the target and P is its perimeter. Flames generally have complex, irregular shapes, while other fire-like disturbances have regular shapes and therefore high circularity. If the circularity of a potential target is smaller than a threshold Ct, a fire is judged to exist in the image. The extraction of sharp corners and the calculation of the target's circularity are both criteria based on the polygonal, irregular character of the flame.
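A sketch of the circularity test, computing C = 4π·So/P² from the contour area and perimeter; the threshold Ct and the choice of evaluating only the largest contour are illustrative.

```python
import cv2
import numpy as np

def is_irregular_enough(mask, ct=0.5):
    """Compute circularity C = 4*pi*So/P^2 for the largest contour of a binary
    mask; flame-like (complex, irregular) targets have C below the threshold Ct."""
    contours, _ = cv2.findContours(mask.astype(np.uint8), cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    cnt = max(contours, key=cv2.contourArea)
    so = cv2.contourArea(cnt)                 # region area So
    p = cv2.arcLength(cnt, True)              # perimeter P
    if p == 0:
        return False
    circularity = 4.0 * np.pi * so / (p * p)
    return circularity < ct
```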
While the present invention has been described in detail with reference to the drawings, the present invention is not limited to the above embodiments, and various changes can be made by those skilled in the art without departing from the spirit of the present invention.

Claims (5)

1. A fire detection method based on a seed region growing rule in YCRCB color space, characterized by: collecting images, preprocessing the frame images, selecting image pixel seeds in the YCrCb channels by using the color characteristics of fire, growing the obtained seeds into regions according to a gray-value and chroma-difference similarity criterion, and judging the dynamic characteristics of the flame image on the grown regions to obtain the final fire judgment result;
the method comprises the following specific steps:
step1, acquiring suspected fire images through image acquisition equipment, and then performing image preprocessing;
step2, converting the processed color image from RGB to YCrCb space;
step3, selecting an image pixel seed in the YCrCb space based on a pixel similarity criterion;
step4, carrying out region growth on the selected seed region in the YCrCb space according to the growth criterion of the seeds; the growth criterion of the seeds takes the gray level difference and the chromaticity difference of the images as judging conditions;
step5, performing region segmentation on the YCrCb channel based on the flame image;
step6, after the image area is segmented, the problems of over-segmentation and under-segmentation are solved by dynamically adjusting the gray level difference threshold value and the chroma difference threshold value in the area growth criterion;
step7, screening the characteristics of the YCrCb channels of the image, and further removing image regions that merely resemble the fire image;
step8, analyzing the dynamic characteristics of the suspected flame area to obtain a final fire discrimination result;
in Step3, in addition to selecting the image pixel seeds with a pixel-based similarity criterion, a constraint is added according to the color channels of the flame image, the color-channel constraint being:
Y(x,y) > Cb(x,y)
Cr(x,y) > Cb(x,y)
where Y(x,y), Cr(x,y) and Cb(x,y) are the luminance, red-difference chroma and blue-difference chroma values, respectively, at spatial location (x,y); according to the statistical rule of flame images, in the YCrCb color channels the Y channel value is larger than the Cb channel value, and the Cr channel value is larger than the Cb channel value;
second, the pixel-based similarity criterion includes the following:
(1) Similarity criterion: for each pixel, the standard deviation of its neighborhood is computed on each channel,
σ_x = sqrt( (1/N) · Σ_{i=1..N} (x_i - x̄)² ),  x ∈ {Y, Cr, Cb}
where x_i are the values of the neighborhood pixels around the pixel on the Y, Cr and Cb channels, x̄ is the corresponding neighborhood mean on each channel, and σ_x is the standard deviation on each channel, denoted σ_Y, σ_Cr and σ_Cb; the total standard deviation is denoted σ;
the normalized similarity is:
σ_N = σ/σ_max
H = 1 - σ_N
where σ_max is the maximum standard deviation in the image and H represents the similarity between the pixel and its neighborhood;
if H < H_threshold, the pixel is taken as a seed region, where the threshold H_threshold is obtained from statistical experimental data and can be adjusted dynamically for different fire backgrounds;
(2) The maximum relative Euclidean color distance between a region and its neighborhood is smaller than a threshold; the relative Euclidean distance is:
d_i = sqrt( (Y - Y_i)² + (Cb - Cb_i)² + (Cr - Cr_i)² ),  i = 1, 2, …, 8
where d_i is the Euclidean distance on the YCbCr channels between a pixel and its i-th 8-neighborhood pixel, Y, Cb and Cr are the values of the pixel on the Y, Cb and Cr channels, Y_i, Cb_i and Cr_i are the values of the i-th neighborhood pixel on those channels, and d_max is the maximum of the Euclidean distances between the pixel and its eight neighborhood pixels, which must satisfy:
d_max <= d_threshold
where the Euclidean color threshold d_threshold is determined by statistical experimental rules;
step8 specifically includes:
8.1 detection of motion regions in images
The frame-difference formula is:
d(i,j) = g_n(i,j) - g_0(i,j)
where g_n(i,j) is the n-th frame of the input image and g_0(i,j) is the background image updated during monitoring, both represented as gray-scale images; the difference between the two gray-scale images is computed, and when the difference is greater than a threshold T1 the pixel is judged to be moving, otherwise it is judged to be non-moving; the final motion decision for the latest image, g_result, is defined as:
g_result(i,j) = 1 if d(i,j) > T1, and g_result(i,j) = 0 otherwise
where T1 is a threshold selected from statistical and empirical rules; T1 determines the sensitivity of the algorithm to single-pixel changes;
8.2 detecting randomness of the size of the fire region image area
In image processing, the region size is the number of pixels of an object; as the flame flickers, the edges jump and the area of the suspicious region changes from frame to frame while the contours remain similar, so the randomness of the region size is calculated; A_i is the area of the potential fire region in the current frame and A_{i-1} is the area of the potential fire region in the previous frame, and a randomness measure a_i is computed from these consecutive areas; if the hard decision rule a_i > λ holds, a fire is assumed, where λ is a decision threshold obtained from statistical experimental rules;
8.3 detection of randomness of edges
From a geometric standpoint, the edges of adjacent frames of a flame sequence are unstable, but the overall edges have a stable similarity; the Sobel operator is used to obtain the edges of the suspected fire region, and an edge-similarity measure is then computed between b_i(x,y), the flame edge pixels in the current frame, and b_{i+1}(x,y), the flame edge pixels in the following frame; in the fire detection process a threshold θ is selected according to statistical experimental rules, and when the edge similarity is greater than θ a fire is judged to be occurring; the edge similarity reflects the similarity of the flame shape, its spatial variation and the variation of its spatial distribution, and is used to distinguish common interference targets, including fixed bright highlights, fast-moving flame-colored objects and large-area illumination changes;
8.4 detecting fire image acute angle corner
The flame shape in a video image appears as multiple layers of closed contours with one or more sharp corners on the contours; a sharp corner must satisfy two conditions: it has a vertex, and the values on both of its sides are greater than a threshold L; in the event of a fire there must be many sharp corners on the outline of the potential fire area, where the minimum required number N_min is also obtained from experimental rules;
8.5 detection of the circularity of the target
Circularity is a parameter used to measure the roundness or area complexity of an object; it is calculated from the object's area and perimeter as:
C = 4π·So / P²
where So is the area of the target and P is its perimeter; flames have complex, irregular shapes, while other fire-like disturbances have regular shapes and therefore high circularity; if the circularity of a potential target is smaller than the threshold Ct, a fire is judged to exist in the image; the extraction of sharp corners and the calculation of the target's circularity are both criteria based on the polygonal, irregular character of the flame.
2. The fire detection method based on the seed region growing rule in YCRCB color space according to claim 1, wherein: the image preprocessing in Step1 comprises image filtering and image enhancement.
3. The fire detection method based on the seed region growing rule in YCRCB color space according to claim 1, wherein: in Step4, the judgment conditions are specifically:
condition 1: if the gray-level difference and the chroma difference between two adjacent regions are both smaller than the corresponding thresholds, merging the two regions;
the method comprises the following steps:
4.1, scanning the image line by line to find a pixel that has not yet been assigned to a region;
4.2, taking this pixel as the center, examining its neighborhood pixels one by one and comparing each with the center pixel, and merging the neighborhood pixel with the center pixel if the gray-level difference is smaller than the predetermined threshold;
4.3, taking the newly merged pixels as new centers and repeating the check of step 4.2 until the region cannot be expanded further;
4.4, returning to step 4.1 and continuing the scan until no unassigned pixel can be found, at which point the whole growth process ends.
4. The fire detection method based on the seed region growing rule in YCRCB color space according to claim 1, wherein: step7 specifically includes the following screening criteria:
(1) The Y channel value of a fire pixel is larger than its K-neighborhood mean, while its Cr and Cb channel values are smaller than their K-neighborhood means;
(2) When the difference between the Cb and Cr channel values at a point is larger than a certain threshold, the point is classified as a fire region.
5. The fire detection method based on the seed region growing rule in YCRCB color space according to claim 1, wherein: in Step8, the dynamic characteristics of the suspected flame region are analyzed, specifically the flame disorder, the flame motion characteristics, the randomness of the fire-region area, the sharp corners of the fire edge and the circularity of the flame, and whether a real fire is present is determined from the final feature analysis.
CN202110982257.1A 2021-08-25 2021-08-25 Fire detection method based on seed region growth rule in YCRCB color space Active CN113744326B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110982257.1A CN113744326B (en) 2021-08-25 2021-08-25 Fire detection method based on seed region growth rule in YCRCB color space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110982257.1A CN113744326B (en) 2021-08-25 2021-08-25 Fire detection method based on seed region growth rule in YCRCB color space

Publications (2)

Publication Number Publication Date
CN113744326A CN113744326A (en) 2021-12-03
CN113744326B true CN113744326B (en) 2023-08-22

Family

ID=78732769

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110982257.1A Active CN113744326B (en) 2021-08-25 2021-08-25 Fire detection method based on seed region growth rule in YCRCB color space

Country Status (1)

Country Link
CN (1) CN113744326B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797372B (en) * 2022-11-11 2023-09-22 中国消防救援学院 Flame target extraction method based on combined image segmentation
CN116051543B (en) * 2023-03-06 2023-06-16 山东锦霖钢材加工有限公司 Defect identification method for peeling steel


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106997461B (en) * 2017-03-28 2019-09-17 浙江大华技术股份有限公司 A kind of firework detecting method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012088796A (en) * 2010-10-15 2012-05-10 Kddi Corp Image area division device, image area division method, and image area division program
CN105096323A (en) * 2015-07-28 2015-11-25 中国石油天然气股份有限公司 Pool fire flame height measurement method based on visible image processing
CN109145689A (en) * 2017-06-28 2019-01-04 南京理工大学 A kind of robot fire detection method
CN109493361A (en) * 2018-11-06 2019-03-19 中南大学 A kind of fire hazard aerosol fog image partition method
CN110516609A (en) * 2019-08-28 2019-11-29 南京邮电大学 A kind of fire video detection and method for early warning based on image multiple features fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mine fire image recognition based on improved FOA-SVM; 苗续芝; Computer Engineering; Vol. 45, No. 4; full text *

Also Published As

Publication number Publication date
CN113744326A (en) 2021-12-03

Similar Documents

Publication Publication Date Title
CN114937055B (en) Image self-adaptive segmentation method and system based on artificial intelligence
CN107085714B (en) Forest fire detection method based on video
CN111260616A (en) Insulator crack detection method based on Canny operator two-dimensional threshold segmentation optimization
WO2022099598A1 (en) Video dynamic target detection method based on relative statistical features of image pixels
CN105404847B (en) A kind of residue real-time detection method
CN113744326B (en) Fire detection method based on seed region growth rule in YCRCB color space
US20200250840A1 (en) Shadow detection method and system for surveillance video image, and shadow removing method
CN108181316B (en) Bamboo strip defect detection method based on machine vision
CN108563979B (en) Method for judging rice blast disease conditions based on aerial farmland images
CN112149543B (en) Building dust recognition system and method based on computer vision
CN109472788B (en) Method for detecting flaw on surface of airplane rivet
WO2021098163A1 (en) Corner-based aerial target detection method
CN113935666B (en) Building decoration wall tile abnormity evaluation method based on image processing
CN108921857A (en) A kind of video image focus area dividing method towards monitoring scene
CN112862832B (en) Dirt detection method based on concentric circle segmentation positioning
CN109523524A (en) A kind of eye fundus image hard exudate detection method based on integrated study
CN110288618B (en) Multi-target segmentation method for uneven-illumination image
CN112291551A (en) Video quality detection method based on image processing, storage device and mobile terminal
CN110223344A (en) A kind of infrared small target detection method based on morphology and vision noticing mechanism
CN115272350A (en) Method for detecting production quality of computer PCB mainboard
CN114821158A (en) Dried jujube quality classification method and system based on image processing
CN117011291B (en) Watch shell quality visual detection method
Islami Implementation of HSV-based Thresholding Method for Iris Detection
CN112364884A (en) Method for detecting moving object
CN111724375A (en) Screen detection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant