CN115619802A: Fire image segmentation method for improving density peak clustering


Info

Publication number
CN115619802A
CN115619802A
Authority
CN
China
Prior art keywords: image, density, distance, points, color
Prior art date
Legal status (an assumption, not a legal conclusion): Pending
Application number
CN202210703536.4A
Other languages
Chinese (zh)
Inventor
Ma Zongfang (马宗方)
Zhao Jiaxing (赵佳星)
Cao Yonggen (曹永根)
Song Lin (宋琳)
Current Assignee (the listed assignee may be inaccurate): Xi'an University of Architecture and Technology
Original Assignee: Xi'an University of Architecture and Technology
Priority date: 2022-06-21
Filing date: 2022-06-21
Publication date: 2023-01-17
Application filed by Xi'an University of Architecture and Technology
Priority to CN202210703536.4A
Publication of CN115619802A
Status: Pending

Classifications

    • G06T7/11 Image analysis; Segmentation; Region-based segmentation
    • G06T7/136 Image analysis; Segmentation; Edge detection involving thresholding
    • G06T7/90 Image analysis; Determination of colour characteristics
    • G06V10/762 Image or video recognition using pattern recognition or machine learning using clustering
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06T2207/10024 Image acquisition modality; Color image


Abstract

The invention discloses a fire image segmentation method based on improved density peak clustering (DPC), which comprises the following steps. Step 1: preprocess the image and extract features to obtain its RGB space and the initial superpixel count N. Step 2: convert the RGB space obtained in step 1 into the CIE-Lab space. Step 3: compute the density and distance of the Lab-space sample points. Step 4: normalize the density and distance obtained in step 3 so that the fire region can be segmented correctly, and compute the γ_k value of the cluster center of the real flame region. Step 5: take the sample point corresponding to γ_k as the cluster center, and assign the remaining sample points according to conventional DPC to complete the flame-image segmentation, finally obtaining an accurate segmentation map of the flame image. The invention improves the efficiency of image segmentation, requires no supervision, and further improves the timeliness and accuracy of building fire monitoring.

Description

Fire image segmentation method for improving density peak clustering
Technical Field
The invention relates to the technical field of fire image processing, and in particular to a fire image segmentation method based on improved density peak clustering.
Background
Conventional sensors such as smoke, temperature, and light sensors are most often used to monitor key fire characteristics such as heat, gas, flame, and smoke. However, most such designs require installing dedicated hardware or software, for example to measure temperature differences, resulting in unacceptable cost. In addition, conventional fire detection techniques struggle to detect fires accurately under the various disturbances of complex environments. Compared with such sensors, image-based fire detection can effectively reduce interference from the external environment.
Image-based fire detection mainly comprises key technologies such as fire image segmentation, feature extraction, fire judgment, and fire-fighting linkage. Fire image segmentation is a prerequisite for fire feature extraction and identification, and the segmentation result directly affects the accuracy of fire identification.
Research on fire image segmentation therefore has important significance. Classical image segmentation algorithms include threshold-based, region-based, and boundary-based methods. Clustering segmentation can find natural groups in the data based on its internal structure. However, traditional image segmentation algorithms need supervision, are not very efficient, and often give unsatisfactory results.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a fire image segmentation method based on improved density peak clustering, which improves image segmentation efficiency, requires no supervision, and further improves the timeliness and accuracy of building fire monitoring.
In order to achieve the purpose, the invention adopts the technical scheme that:
a fire disaster image segmentation method for improving density peak value clustering comprises the following steps;
step 1: firstly, preprocessing and feature extraction are carried out on an image to obtain an RGB space of the image and the number N of initial super pixels, in order to reduce the complexity of the image, similar pixels in a small area are aggregated to form an irregular block by adopting an SLIC super pixel segmentation algorithm, and image blocks are used for replacing pixels as basic units in clustering analysis;
and 2, step: converting the RGB space obtained in the step 1 into CIE-Lab space;
and 3, step 3: calculating the density and distance of Lab space sampling points;
and 4, step 4: normalizing the density and distance obtained in the third step for correctly dividing the fire area, and calculating gamma of the clustering center point of the real flame area k A value;
and 5: taking gamma k And (4) taking the sample points corresponding to the values as clustering center points, and completing the segmentation of the flame image by the rest sample points according to the traditional DPC distribution to finally obtain an accurate segmentation map of the flame image.
In step 1, SLIC considers both color and position similarity. Suppose the image contains S pixels and the number of superpixels is set to N, so that each superpixel covers S/N pixels. One pixel is randomly selected as the initial cluster centroid C_N of each region; the pixel gradient is then computed in the nearby t × t region (t is usually 3), and the pixel with the minimum gradient becomes the new cluster centroid. Similar pixels are then searched within a neighborhood of side 2√(S/N) around each centroid, and the feature vectors are iterated until the result converges.
In step 2, the CIE-Lab color space has a distinctive channel arrangement: the luminance obtained from the RGB conversion is stored only in the L channel, while the color features are stored in the a and b channels. In the CIE-Lab space the image is represented by a 5-element feature vector V = [l a b x y], where [l a b] retains the color information and [x y] retains the pixel position information. Since the pixels within a superpixel block are similar in color and brightness, the mean color and position of each superpixel block is used as a sample point for the cluster segmentation.
The point-density calculation in step 3 proceeds as follows.
The density of each sample point is computed with equation (1). In a building fire image, sample points with similar color and brightness characteristics have similar densities; hence the larger the density of a sample point, the more similar the color and brightness of its neighborhood.

ρ_i = Σ_{j≠i} exp(−(d_ij/d_c)²) · exp(−(d_Lab_ij/τ)²)   (1)

d_Lab_ij = √((l_i − l_j)² + (a_i − a_j)² + (b_i − b_j)²)   (2)

The input is the N sample points of the algorithm, denoted X = {x_1, x_2, …, x_m, …, x_N}, where x_m is the m-th sample point. d_ij denotes the position distance between the i-th and j-th sample points, and d_Lab_ij the luminance-and-color distance between them. The cutoff d_c is taken at the 2% point of the inter-sample position distances sorted in ascending order, and τ at the 20% point of the sorted inter-sample Lab distances.
The more similar the brightness and color of two superpixels in an image, the more similar their densities, and the more easily they are clustered into the same class. Two superpixels may be similar in position and density while differing greatly in color and brightness, in which case they must not be grouped into one class; equation (2) is therefore used to compute the distance between input sample points.
The distance δ_i describes, in the CIE-Lab space, the gap between a sample point and the points denser than it. It is computed by equation (3):

δ_i = min over {j : ρ_j > ρ_i} of d_Lab_ij; if ρ_i is the maximum density, δ_i = max over j of d_Lab_ij   (3)

where ρ_i and ρ_j are the local densities of the i-th and j-th sample points from equation (1), and d_Lab is computed by equation (2). Only when a sample point has the maximum local density ρ_i does its δ_i take the maximum of its distances to all other sample points; otherwise δ_i is the distance to the nearest sample point that is denser than it.
In step 4 the cluster center is selected. To segment the fire region correctly, the local density ρ and the distance δ of each sample point are computed by equations (1) and (3), and then normalized into the range [0, 1]. The normalization formula is

ρ′_i = (ρ_i − ρ_min) / (ρ_max − ρ_min),   δ′_i = (δ_i − δ_min) / (δ_max − δ_min)   (4)

where ρ′_i and δ′_i are the normalized values, ρ_max and ρ_min are the maximum and minimum of ρ, and δ_max and δ_min are the maximum and minimum of δ.
In step 5, the cluster center of the flame region is located in the building fire image with the help of the HSV color space model. For fires occurring inside or outside a building, the flame region is usually brighter than the background environment because of occlusion by the building, and the flame colors are red and yellow. From this relationship between color and brightness, the value ranges of the H and V components of the flame region are obtained from prior knowledge. The γ value of each sample point in the extracted region is computed by equation (5) and sorted in descending order; the γ_k value of the cluster center of the real flame region is the maximum within the extracted region, so the sample point corresponding to γ_k is taken as the cluster center, and the remaining sample points are assigned according to conventional DPC to complete the segmentation, finally obtaining an accurate segmentation map of the flame image.

γ_i = ρ′_i × δ′_i   (5)
The invention has the beneficial effects that:
in order to find the flame area in the building fire image, the invention combines the priori knowledge to find the clustering center of the real fire area. Redefining the density of the corresponding sample points by using the position information and the color information of the super pixels in the image; in the process of distributing the residual sample points, the position information and the color information of the sample points are considered, and the problem of mismatching of the sample points is solved to a certain extent. Therefore, the segmentation accuracy of the target region is improved.
The invention improves detection and segmentation precision, demonstrating its effectiveness and superiority for building fire image detection.
The method applies superpixel segmentation to the fire image, grouping adjacent pixels of similar color and brightness into meaningful irregular pixel blocks, and further denoises the image with Gaussian filtering to complete the preprocessing. Under the density peak clustering framework, the density of the input sample points is redefined, the cluster center of the fire region is found using prior knowledge, and the automatic segmentation of the fire image is completed by assigning the remaining sample points.
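The Gaussian-filter denoising mentioned above can be sketched as follows; this is a minimal illustration with a synthetic image and an assumed sigma value, not the patent's own code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic stand-in for a fire image (no data ships with the patent).
rng = np.random.default_rng(0)
noisy = rng.random((64, 64, 3))

# Gaussian denoising over the spatial axes only; sigma=1.0 is an assumed
# value, and sigma=0 on the last axis keeps the color channels unmixed.
smoothed = gaussian_filter(noisy, sigma=(1.0, 1.0, 0.0))
```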
Drawings
Fig. 1 is a schematic diagram of the fire image segmentation result of the present invention.
Fig. 2 is a schematic diagram of the segmentation effect on an outdoor gas fire image.
Fig. 3 is a schematic diagram of the segmentation effect on a street garbage incineration image.
Fig. 4 is a schematic diagram of the segmentation effect on a burning house image.
Fig. 5 is a schematic diagram of the segmentation effect on an indoor burning gasoline image.
Fig. 6 is a schematic diagram of the segmentation effect on an outdoor burning gasoline image.
Detailed Description
The present invention will be described in further detail with reference to examples.
As shown in fig. 1-6:
First, the image must be preprocessed and its features extracted. To reduce image complexity, the SLIC superpixel segmentation algorithm aggregates similar pixels in small areas into irregular blocks, and image blocks replace individual pixels as the basic unit of the cluster analysis.
SLIC considers both color and position similarity. Suppose the image contains S pixels and the number of superpixels is set to N, so that each superpixel covers S/N pixels. One pixel is randomly selected as the initial cluster centroid C_N of each region. The pixel gradient is then computed in the nearby t × t region (t is usually 3), and the pixel with the minimum gradient becomes the new cluster centroid. Similar pixels are then searched within a neighborhood of side 2√(S/N) around each centroid, and the feature vectors are iterated until the result converges.
The CIE-Lab color space has a distinctive channel arrangement: the luminance is stored only in the L channel, while the color features are stored in the a and b channels. In the CIE-Lab space an image can be represented by a 5-element feature vector V = [l a b x y], where [l a b] retains the color information and [x y] retains the pixel position information. Since the pixels within a superpixel block are similar in color and brightness, the mean color and position of each superpixel block is used as a sample point for the cluster segmentation.
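Building one clustering sample point per superpixel, i.e. the block mean of V = [l a b x y], can be sketched as follows; the synthetic image and superpixel labels are stand-ins.

```python
import numpy as np
from skimage import color
from skimage.segmentation import slic

rng = np.random.default_rng(0)
rgb = rng.random((32, 32, 3))
labels = slic(rgb, n_segments=20, compactness=10.0, start_label=0)

lab = color.rgb2lab(rgb)                     # l in channel 0, a/b in 1 and 2
ys, xs = np.mgrid[0:32, 0:32].astype(float)
feats = np.concatenate([lab, xs[..., None], ys[..., None]], axis=-1)

flat = feats.reshape(-1, 5)                  # rows are [l a b x y]
ids = labels.ravel()
n = ids.max() + 1
# Mean color and position of each superpixel block = one sample point.
samples = np.stack([flat[ids == k].mean(axis=0) for k in range(n)])
```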
The density of each sample point is computed with equation (1); in a building fire image, sample points with similar color and brightness characteristics have similar densities. Thus, the greater the density of a sample point, the more similar the color and brightness of its neighborhood.

ρ_i = Σ_{j≠i} exp(−(d_ij/d_c)²) · exp(−(d_Lab_ij/τ)²)   (1)

d_Lab_ij = √((l_i − l_j)² + (a_i − a_j)² + (b_i − b_j)²)   (2)

The input is the N sample points of the algorithm, denoted X = {x_1, x_2, …, x_m, …, x_N}, where x_m is the m-th sample point. d_ij denotes the position distance between the i-th and j-th sample points, and d_Lab_ij the luminance-and-color distance between them. The cutoff d_c is taken at the 2% point of the inter-sample position distances sorted in ascending order, and τ at the 20% point of the sorted inter-sample Lab distances.
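The density computation can be sketched as below. The patent shows the formula only as an image, so the Gaussian product of a position kernel (cutoff d_c at the 2% point of sorted position distances) and a Lab kernel (τ at the 20% point) is an assumed reading of the surrounding text, not a confirmed reproduction.

```python
import numpy as np

def densities(samples):
    """samples: (n, 5) rows [l, a, b, x, y]. Returns (rho, d_lab).
    Assumed kernel: product of Gaussian position and Lab-distance terms."""
    lab, pos = samples[:, :3], samples[:, 3:]
    d_pos = np.linalg.norm(pos[:, None] - pos[None], axis=-1)
    d_lab = np.linalg.norm(lab[:, None] - lab[None], axis=-1)
    iu = np.triu_indices(len(samples), k=1)
    d_c = np.percentile(d_pos[iu], 2)    # 2% point of sorted position distances
    tau = np.percentile(d_lab[iu], 20)   # 20% point of sorted Lab distances
    kernel = np.exp(-(d_pos / d_c) ** 2) * np.exp(-(d_lab / tau) ** 2)
    return kernel.sum(axis=1) - 1.0, d_lab   # minus the self term exp(0)=1

rng = np.random.default_rng(0)
samples = rng.random((30, 5))
rho, d_lab = densities(samples)
```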
The more similar the brightness and color of two superpixels in an image, the more similar their densities, and the more likely they are to be clustered into the same class. Two superpixels may be similar in position and density while differing greatly in color and brightness, in which case they cannot be grouped into one class; the distance between input sample points is therefore computed with equation (2).
The distance δ_i describes, in the CIE-Lab space, the gap between a sample point and the points denser than it, and is computed by equation (3):

δ_i = min over {j : ρ_j > ρ_i} of d_Lab_ij; if ρ_i is the maximum density, δ_i = max over j of d_Lab_ij   (3)

where ρ_i and ρ_j are the local densities of the i-th and j-th sample points from equation (1), and d_Lab is computed by equation (2). Only when a sample point has the maximum local density does its δ_i take the maximum of its distances to all other sample points; otherwise δ_i is the distance to the nearest sample point denser than it.
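Equation (3), the Lab-space distance from each point to its nearest denser point (or the maximum distance, for the densest point), can be sketched as:

```python
import numpy as np

def deltas(rho, d_lab):
    """rho: (n,) local densities; d_lab: (n, n) Lab distances, equation (2)."""
    n = len(rho)
    delta = np.empty(n)
    for i in range(n):
        denser = np.where(rho > rho[i])[0]
        if denser.size:
            delta[i] = d_lab[i, denser].min()   # nearest denser point
        else:
            delta[i] = d_lab[i].max()           # the globally densest point
    return delta

rho = np.array([1.0, 3.0, 2.0])                 # point 1 is densest
d_lab = np.array([[0.0, 2.0, 1.0],
                  [2.0, 0.0, 4.0],
                  [1.0, 4.0, 0.0]])
delta = deltas(rho, d_lab)
```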
Next, the cluster center is selected. To segment the fire region correctly, the local density ρ and distance δ of each sample point are computed by equations (1) and (3), and then normalized into the range [0, 1]. The normalization formula is

ρ′_i = (ρ_i − ρ_min) / (ρ_max − ρ_min),   δ′_i = (δ_i − δ_min) / (δ_max − δ_min)   (4)

where ρ′_i and δ′_i are the normalized values, ρ_max and ρ_min are the maximum and minimum of ρ, and δ_max and δ_min are the maximum and minimum of δ.
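The min-max normalization of equation (4) in code:

```python
import numpy as np

def minmax(v):
    """Equation (4): map a vector into the range [0, 1]."""
    return (v - v.min()) / (v.max() - v.min())

rho = np.array([2.0, 6.0, 4.0])
delta = np.array([1.0, 5.0, 3.0])
rho_n, delta_n = minmax(rho), minmax(delta)
```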
In the building fire image, the cluster center of the flame region is found with the help of the HSV color space model. For a fire occurring inside or outside a building, the flame region is usually brighter than the background environment because of occlusion by the building, and the flame colors are red and yellow. From this relationship between color and brightness, the value ranges of the H and V components of the flame region are obtained from prior knowledge.
The γ value of each sample point in the extracted region is computed by equation (5) and sorted in descending order. The γ_k value of the cluster center of the real flame region is the maximum within the extracted region, so the sample point corresponding to γ_k is taken as the cluster center, and the remaining sample points are assigned according to conventional DPC to complete the flame-image segmentation.

γ_i = ρ′_i × δ′_i   (5)
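The center selection and assignment step can be sketched as follows. The boolean `candidate` mask stands in for the prior-knowledge H/V flame ranges, whose numeric values the text does not give; a background center is also chosen so that the conventional DPC assignment, where each point inherits the label of its nearest denser neighbour, separates flame from background. This is an illustrative reading, not the patent's exact procedure.

```python
import numpy as np

def dpc_segment(rho, rho_n, delta_n, d_lab, candidate):
    """candidate: boolean mask of points inside the (unspecified) H/V flame
    ranges. Returns labels: 1 = flame cluster, 0 = background cluster."""
    gamma = rho_n * delta_n                                   # equation (5)
    flame_c = np.flatnonzero(candidate)[np.argmax(gamma[candidate])]
    bg_c = np.flatnonzero(~candidate)[np.argmax(gamma[~candidate])]
    labels = np.full(len(rho), -1)
    labels[flame_c], labels[bg_c] = 1, 0
    # Conventional DPC assignment: visit points densest-first so every
    # strictly denser point is already labeled when it is consulted.
    for i in np.argsort(-rho):
        if labels[i] < 0:
            denser = np.where(rho > rho[i])[0]
            if denser.size == 0:        # densest point but not a center
                denser = np.array([flame_c, bg_c])
            labels[i] = labels[denser[np.argmin(d_lab[i, denser])]]
    return labels

rho = np.array([5.0, 4.0, 1.0, 0.5])
rho_n = np.array([1.0, 0.8, 0.2, 0.0])
delta_n = np.array([1.0, 0.9, 0.1, 0.05])
d_lab = np.array([[0.0, 1.0, 0.5, 3.0],
                  [1.0, 0.0, 2.0, 0.4],
                  [0.5, 2.0, 0.0, 5.0],
                  [3.0, 0.4, 5.0, 0.0]])
candidate = np.array([True, False, True, False])
labels = dpc_segment(rho, rho_n, delta_n, d_lab, candidate)
```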
FIG. 1 shows the superpixel segmentation effect of the present application on a fire image. Under the density peak clustering framework, the density of the input sample points is redefined, the cluster center of the fire region is found using prior knowledge, and the automatic segmentation of the fire image is completed by assigning the remaining sample points. As the figure shows, the fire region is segmented completely and the edges fit tightly.
FIGS. 2-6 each show, in order: the original image; HCM; FLCM; DPC; SFFCM; the proposed algorithm; and the manually marked image. The segmentation result of the proposed method is good and close to that of DPC, while the HCM and FLCM algorithms lose accuracy under the influence of the complex image background, and the SFFCM algorithm has difficulty segmenting the flame region completely.

Claims (6)

1. A fire image segmentation method based on improved density peak clustering, characterized by comprising the following steps:
Step 1: first, preprocess the image and extract features to obtain its RGB space and the initial superpixel count N; to reduce image complexity, the SLIC superpixel segmentation algorithm aggregates similar pixels in small areas into irregular blocks, and image blocks replace individual pixels as the basic unit of the cluster analysis;
Step 2: convert the RGB space obtained in step 1 into the CIE-Lab space;
Step 3: compute the density and distance of the Lab-space sample points;
Step 4: normalize the density and distance obtained in step 3 so that the fire region can be segmented correctly, and compute the γ_k value of the cluster center of the real flame region;
Step 5: take the sample point corresponding to γ_k as the cluster center, and assign the remaining sample points according to conventional DPC to complete the flame-image segmentation, finally obtaining an accurate segmentation map of the flame image.
2. The fire image segmentation method based on improved density peak clustering according to claim 1, characterized in that the SLIC of step 1 considers both color and position similarity: assuming the image contains S pixels and the number of superpixels is set to N, each superpixel covers S/N pixels; one pixel is randomly selected as the initial cluster centroid C_N of each region, the pixel gradient is then computed in the nearby t × t region (t is usually 3), and the pixel with the minimum gradient becomes the new cluster centroid; similar pixels are then searched within a neighborhood of side 2√(S/N) around each centroid, and the feature vectors are iterated until the result converges.
3. The fire image segmentation method based on improved density peak clustering according to claim 1, characterized in that in step 2 the CIE-Lab color space has a distinctive channel arrangement: the luminance obtained from the RGB conversion is stored only in the L channel and the color features are stored in the a and b channels; in the CIE-Lab space the image is represented by a 5-element feature vector V = [l a b x y], where [l a b] retains the color information and [x y] retains the pixel position information; since the pixels within a superpixel block are similar in color and brightness, the mean color and position of each superpixel block is used as the sample point of the cluster segmentation.
4. The fire image segmentation method based on improved density peak clustering according to claim 1, characterized in that the point-density calculation of step 3 comprises the following steps:
the density of each sample point is computed with equation (1); in a building fire image, sample points with similar color and brightness characteristics have similar densities, so the larger the density of a sample point, the more similar the color and brightness of its neighborhood;

ρ_i = Σ_{j≠i} exp(−(d_ij/d_c)²) · exp(−(d_Lab_ij/τ)²)   (1)

d_Lab_ij = √((l_i − l_j)² + (a_i − a_j)² + (b_i − b_j)²)   (2)

the input is the N sample points of the algorithm, denoted X = {x_1, x_2, …, x_m, …, x_N}, where x_m is the m-th sample point; d_ij denotes the position distance between the i-th and j-th sample points, and d_Lab_ij the luminance-and-color distance between them; d_c is taken at the 2% point of the inter-sample position distances sorted in ascending order, and τ at the 20% point of the sorted inter-sample Lab distances;
the more similar the brightness and color of two superpixels, the more similar their densities and the more easily they are clustered into the same class; two superpixels may be similar in position and density while differing greatly in color and brightness, in which case they must not be grouped into one class, so equation (2) is used to compute the distance between input sample points;
the distance δ_i describes, in the CIE-Lab space, the gap between a sample point and the points denser than it, and is computed by equation (3):

δ_i = min over {j : ρ_j > ρ_i} of d_Lab_ij; if ρ_i is the maximum density, δ_i = max over j of d_Lab_ij   (3)

where ρ_i and ρ_j are the local densities of the i-th and j-th sample points from equation (1), and d_Lab is computed by equation (2); only when a sample point has the maximum local density does its δ_i take the maximum of its distances to all other sample points, otherwise δ_i is the distance to the nearest sample point denser than it.
5. The fire image segmentation method based on improved density peak clustering according to claim 1, characterized in that in step 4 the cluster center is selected: to segment the fire region correctly, the local density ρ and distance δ of each sample point are computed by equations (1) and (3) and then normalized into the range [0, 1] as follows:

ρ′_i = (ρ_i − ρ_min) / (ρ_max − ρ_min),   δ′_i = (δ_i − δ_min) / (δ_max − δ_min)   (4)

where ρ′_i and δ′_i are the normalized values, ρ_max and ρ_min are the maximum and minimum of ρ, and δ_max and δ_min are the maximum and minimum of δ.
6. The fire image segmentation method based on improved density peak clustering according to claim 1, characterized in that in step 5 the cluster center of the flame region is located in the building fire image with the help of the HSV color space model; for fires occurring inside or outside a building, the flame region is usually brighter than the background environment because of occlusion by the building, and the flame colors are red and yellow; from this relationship between color and brightness, the value ranges of the H and V components of the flame region are obtained from prior knowledge; the γ value of each sample point in the extracted region is computed by equation (5) and sorted in descending order; the γ_k value of the cluster center of the real flame region is the maximum within the extracted region, so the sample point corresponding to γ_k is taken as the cluster center and the remaining sample points are assigned according to conventional DPC to complete the segmentation, finally obtaining an accurate segmentation map of the flame image;

γ_i = ρ′_i × δ′_i   (5).
Application CN202210703536.4A, filed 2022-06-21, priority date 2022-06-21: Fire image segmentation method for improving density peak clustering. Status: Pending. Publication: CN115619802A.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202210703536.4A | 2022-06-21 | 2022-06-21 | Fire image segmentation method for improving density peak value clustering


Publications (1)

Publication Number: CN115619802A; Publication Date: 2023-01-17

Family

ID=84856646

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202210703536.4A | Fire image segmentation method for improving density peak value clustering (Pending, published as CN115619802A) | 2022-06-21 | 2022-06-21

Country Status (1)

Country Link
CN (1) CN115619802A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117152474A (en) * 2023-07-25 2023-12-01 Huaneng Nuclear Energy Technology Research Institute Co., Ltd. High-temperature gas cooled reactor flame identification method based on K-means clustering algorithm



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination