CN115311811A - Electrical fire remote alarm processing method and device based on Internet of things - Google Patents


Info

Publication number: CN115311811A (application CN202211237719.8A)
Authority: CN (China)
Prior art keywords: value, pixel, image, preset, pixel points
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN115311811B (English)
Inventors: 仲崇涛, 陈礼贵, 王睿, 周健
Current Assignee: Jiangsu Anshilang Intelligent Technology Co ltd
Original Assignee: Jiangsu Anshilang Intelligent Technology Co ltd
Application filed by Jiangsu Anshilang Intelligent Technology Co ltd
Priority to CN202211237719.8A; publication of CN115311811A; application granted; publication of CN115311811B

Classifications

    • G08B 17/00 Fire alarms; alarms responsive to explosion
    • G08B 17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B 17/125 Actuation by using a video camera to detect fire or smoke
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06V 10/40 Extraction of image or video features
    • G06V 10/74 Image or video pattern matching; proximity measures in feature spaces
    • G06V 10/761 Proximity, similarity or dissimilarity measures
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Abstract

The invention provides an electrical fire remote alarm processing method and device based on the Internet of Things. An image difference degree is obtained according to the same number of pixel points and the different number of pixel points; if the image difference degree is greater than a preset difference degree, a plurality of acquisition moments are obtained according to the current moment and a preset time period, and the area image of the area where each power transformation equipment is located is extracted from the infrared image based on the acquisition moments; the pixel values of the area images corresponding to adjacent time points in each image set are processed to obtain a plurality of pixel value variation trends, and a trend change difference value corresponding to each image set is obtained based on these trends; a preset trend threshold is adjusted according to the environment information, the usage duration and the average value of the total number of target pixel points to obtain a current trend threshold; and if the trend change difference value is greater than the current trend threshold, the power transformation equipment corresponding to the image set is taken as early-warning power transformation equipment and reported to a fire handling end.

Description

Electrical fire remote alarm processing method and device based on Internet of things
Technical Field
The invention relates to a data processing technology, in particular to an electric fire remote alarm processing method and device based on the Internet of things.
Background
An electrical fire is a fire in which electric energy serves as the ignition source. It easily develops into a serious fire accident, carries electric shock and explosion risks during firefighting, and is more hazardous than other types of fire. Especially in a place with a large amount of power transformation equipment, such as a substation, once an electrical fire occurs, the consequences are unthinkable.
In the prior art, an infrared thermal imager is usually used to detect whether an electrical fire occurs; since an infrared thermal imager is an imaging device that reflects the surface temperature of an object, it can serve as an effective fire detection device. However, for power transformation equipment that continuously generates heat during normal operation, it is difficult to detect whether a fire has occurred with this method alone, so how to use the infrared thermal imager to sense and detect a fire in the power transformation equipment in time is a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the invention provides an electric fire remote alarm processing method based on the Internet of things, which can sense the fire occurrence condition of a power transformation device in time by using an infrared thermal imager.
In a first aspect of the embodiments of the present invention, an electrical fire remote alarm processing method based on the internet of things is provided, including:
acquiring a first infrared image of a power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference according to the same number of pixel points and the different numbers of pixel points;
if the image difference is greater than the preset difference, obtaining a plurality of acquisition moments according to the current moment and a preset time period, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate image sets corresponding to the power transformation equipment;
processing pixel values of the images in the corresponding regions of the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend variation difference values corresponding to the image sets based on the pixel value variation trends;
acquiring environment information and current use time corresponding to power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and if the trend change difference value corresponding to the image set is greater than the current trend threshold value, the transformer equipment corresponding to the image set is used as early warning transformer equipment, and the early warning transformer equipment is sent to a fire disaster processing end.
Optionally, in a possible implementation manner of the first aspect, obtaining a first infrared image of a substation at a first time point and a second infrared image at a second time point, obtaining the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and obtaining an image difference according to the same number of pixel points and the different numbers of pixel points includes:
performing coordinate processing on the first infrared image and the second infrared image to obtain a coordinate value of each pixel point;
acquiring coordinates of pixel points in the first infrared image and the second infrared image to generate a first coordinate set and a second coordinate set;
calculating, for each of the first coordinate set and the second coordinate set, the difference value between the pixel value of the pixel point located at the central coordinate and the pixel values of the pixel points located outside the preset range;
and obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference values corresponding to the pixel points in the first coordinate set and the second coordinate set, and obtaining the image difference according to the same number of the pixel points and the different numbers of the pixel points.
Optionally, in a possible implementation manner of the first aspect, obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference values corresponding to each pixel point in the first coordinate set and the second coordinate set, and obtaining the image difference degree according to the same number of pixel points and the different numbers of pixel points includes:
collecting pixel values of pixel points positioned at the center coordinate and pixel points positioned at all coordinates outside a preset range in the first coordinate set, calculating the difference value of the pixel values to obtain a first difference value, and calculating the difference value of the pixel values of pixel points positioned at the center coordinate and pixel values of pixel points positioned at all coordinates outside the preset range in the second coordinate set to obtain a second difference value;
acquiring a first difference value and a second difference value corresponding to pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set, and obtaining a difference value ratio corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set based on the first difference value and the second difference value;
if the difference ratio is smaller than a preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as the same pixel points, and if the difference ratio is larger than or equal to the preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as different pixel points;
and counting the number of the same pixel points to obtain the same number of the pixel points, counting the number of the different pixel points to obtain different numbers of the pixel points, obtaining the total number of the pixel points according to the sum of the same number of the pixel points and the different numbers of the pixel points, and obtaining the image difference degree based on the ratio of the same number of the pixel points to the total number of the pixel points.
Optionally, in a possible implementation manner of the first aspect, the preset range is constructed by:
generating a preset range by taking a pixel point positioned at a central coordinate in the first coordinate set and the second coordinate set as a circle center and a preset radius as a radius;
acquiring the pixel quantity of all pixel points in the first coordinate set and the second coordinate set, and adjusting the preset range based on the pixel quantity to obtain an adjustment range;
the number of pixel points of the adjustment radius corresponding to the adjustment range is calculated by a formula (rendered as an image in the published document) involving the following quantities: the number of pixel points of the adjustment radius corresponding to the adjustment range, the number of pixels of all the pixel points, the preset number of pixels, the weight value of the adjustment range, and the number of pixels corresponding to the preset radius.
Optionally, in a possible implementation manner of the first aspect, obtaining multiple acquisition moments according to a current moment and a preset time period, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing the area images according to time to generate an image set corresponding to each power transformation equipment, where the method includes:
determining a first acquisition time and a second acquisition time according to the current time and a preset acquisition duration, and extracting area images of the areas where each power transformation equipment is located in the infrared images based on the first acquisition time and the second acquisition time;
calculating based on the preset acquisition duration and the pixel values of the two region images to obtain a pixel value change trend;
obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting a preset acquisition duration based on the duration offset coefficient to obtain a current acquisition duration;
determining the next acquisition time according to the second acquisition time and the current acquisition time, deleting the first acquisition time, updating the second acquisition time to the first acquisition time, and determining the next acquisition time to be the second acquisition time;
and repeatedly executing the steps, and stopping executing after the preset requirements are met.
Optionally, in a possible implementation manner of the first aspect, obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting a preset acquisition duration based on the duration offset coefficient to obtain a current acquisition duration includes:
adjusting the preset acquisition time length according to the time length offset coefficient to obtain an adjusted acquisition time length after adjustment, and obtaining the current acquisition time length based on the difference value between the preset acquisition time length and the adjusted acquisition time length;
the current acquisition duration is calculated by a formula (rendered as an image in the published document) involving the following quantities: the current acquisition duration, the preset acquisition duration, the pixel value variation trend, the preset pixel value variation trend, the duration offset coefficient, and the weight value for adjusting the acquisition duration.
Optionally, in a possible implementation manner of the first aspect, processing pixel values of an image in an area corresponding to each neighboring time point in the image set to obtain a plurality of pixel value variation trends, and obtaining a trend variation difference value corresponding to each image set based on the plurality of pixel value variation trends includes:
acquiring target pixel values of target pixel points in each regional image, and acquiring pixel values corresponding to each regional image according to the average value of all the target pixel values;
acquiring acquisition time length between adjacent region images, and calculating pixel values of a previous region image and a next region image between the adjacent region images in the image set based on the acquisition time length to obtain a plurality of pixel value variation trends;
determining a former pixel value variation trend and a latter pixel value variation trend in the adjacent pixel value variation trends, and obtaining a plurality of trend variation difference values according to the difference value of the latter pixel value variation trend and the former pixel value variation trend;
the trend change difference value is calculated by a formula (rendered as an image in the published document) involving the following quantities: the trend change difference value; the pixel value variation trends at adjacent indices; the target pixel values of the target pixel points in the former area image, the total number of target pixel points in the former area image, the number value of the target pixel points in the former area image, and the pixel average value corresponding to the former area image; the target pixel values of the target pixel points in the latter area image, the total number of target pixel points in the latter area image, the number value of the target pixel points in the latter area image, and the pixel average value corresponding to the latter area image; the acquisition duration between the adjacent area images; and the trend calculation weight value.
Optionally, in a possible implementation manner of the first aspect, the acquiring environmental information and current usage time corresponding to a power transformation device, and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environmental information, the usage time, and the average value of the total number of target pixels to obtain an adjusted current trend threshold includes:
acquiring an environment temperature value corresponding to the environment information according to a temperature acquisition device, and calculating according to the environment temperature value and a preset temperature value to obtain a temperature offset coefficient;
acquiring a preset service life corresponding to each power transformation device, and calculating according to the preset service life and the current service life to generate a service life deviation value;
collecting the total number of target pixel points of all regional images in an image set, and calculating based on the average value of the total number of the target pixel points and the number of preset pixel points to obtain a number deviation value;
adjusting a preset trend threshold value according to the temperature offset coefficient, the service life offset value and the quantity offset value to obtain an adjusted current trend threshold value;
the current trend threshold is calculated by a formula (rendered as an image in the published document) involving the following quantities: the current trend threshold, the environment temperature value, the preset temperature value, the preset usage duration, the current usage duration, the average value of the total number of target pixel points, the preset number of pixel points, the preset trend threshold, and the weight value of the current trend threshold.
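Purely as an illustration of how these quantities could be combined, a minimal Python sketch is given below. The published formula is rendered as an image, so the multiplicative form, the direction of each offset and every name used here are assumptions rather than the patented calculation:

def adjust_trend_threshold(preset_threshold, env_temp, preset_temp,
                           preset_usage_h, current_usage_h,
                           avg_target_px, preset_px, weight=1.0):
    # Temperature offset coefficient: measured ambient temperature vs. preset value.
    temp_offset = env_temp / preset_temp
    # Usage offset: how much of the preset usage duration has already elapsed.
    usage_offset = current_usage_h / preset_usage_h
    # Quantity offset: average number of target (hot) pixels vs. a preset count.
    count_offset = avg_target_px / preset_px
    # Assumed reading: a hotter environment, a longer-used device and more hot
    # pixels make strong heat signatures more expected, so the threshold is
    # relaxed upward; the actual direction in the patent is not reproduced here.
    return weight * preset_threshold * temp_offset * usage_offset * count_offset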
Optionally, in a possible implementation manner of the first aspect, if correction information input by a worker through an input device is received, the correction information includes correction information that an alarm was raised although no fire occurred and correction information that a fire occurred but no alarm was raised;
if the correction information is correction information that an alarm was raised although no fire occurred, calculating an increase value from the current trend threshold and a preset up-adjustment correction value, and correcting the weight value of the current trend threshold based on the increase value to obtain a correction weight value;
if the correction information is correction information that a fire occurred but no alarm was raised, calculating a decrease value from the current trend threshold and a preset down-adjustment correction value, and correcting the weight value of the current trend threshold based on the decrease value to obtain a correction weight value;
the correction weight value is calculated by a formula (rendered as an image in the published document) involving the following quantities: the correction weight value, the preset up-adjustment correction value, the up-adjustment correction weight, the down-adjustment correction weight, and the preset down-adjustment correction value.
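The feedback behaviour described above (raise the weight after an alarm without a fire, lower it after a fire without an alarm) can be sketched as follows; the additive update and the parameter names are assumptions, since the published formula is an image:

def correct_weight(current_weight, false_alarm=False, missed_alarm=False,
                   up_correction=0.05, down_correction=0.05):
    # Alarm raised but no fire occurred: raise the threshold weight so that the
    # current trend threshold becomes harder to exceed next time.
    if false_alarm:
        return current_weight + up_correction
    # Fire occurred but no alarm was raised: lower the threshold weight so that
    # the current trend threshold becomes easier to exceed next time.
    if missed_alarm:
        return current_weight - down_correction
    return current_weight

print(correct_weight(1.0, false_alarm=True))   # 1.05
print(correct_weight(1.0, missed_alarm=True))  # 0.95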
In a second aspect of the embodiments of the present invention, an electrical fire remote alarm processing device based on the Internet of Things is provided. Optionally, in a possible implementation manner of the second aspect, the device includes:
the difference module is used for acquiring a first infrared image of the power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same number of pixel points and the different numbers of pixel points;
the image module is used for obtaining a plurality of acquisition moments according to the current moment and a preset time period if the difference degree is greater than the preset difference degree, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate an image set corresponding to the power transformation equipment;
the difference module is used for processing pixel values of the area images corresponding to each pair of adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend change difference values corresponding to each image set based on the plurality of pixel value variation trends;
the threshold module is used for acquiring environment information and current use time corresponding to the power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and the early warning module is used for taking the transformer equipment corresponding to the image set as early warning transformer equipment and sending the early warning transformer equipment to a fire disaster processing end if the trend change difference value corresponding to the image set is greater than the current trend threshold value.
The invention has the following beneficial effects:
1. The method first obtains the pixel value variation trends of the infrared images corresponding to each power transformation equipment at different moments from the pixel values of the infrared images, then obtains the trend change difference value from adjacent pixel value variation trends, and judges whether a fire occurs by comparing the trend change difference value with a preset threshold. Compared with the prior art, this allows the infrared equipment to monitor the fire occurrence condition of the power transformation equipment more accurately. In addition, the scheme adjusts the preset threshold using factors that easily influence it, such as environmental factors and usage duration, and further adjusts it afterwards, so that the preset threshold remains accurate. Compared with the prior art, the preset threshold can thus be adjusted dynamically and better fits the practical application scenario of the invention.
2. According to the method, the infrared images of the power transformation room at different moments are subjected to coordinate processing, and in the infrared image corresponding to each moment the difference values between the pixel points at coordinates outside a preset range and the pixel point at the central coordinate are calculated, so that the difference values corresponding to the infrared images at different moments are obtained and it can be judged from them whether the infrared images of the power transformation equipment have changed between the moments. In this way, it can be determined whether the condition of the power transformation room changes between different moments, and therefore whether a fire may occur; in addition, judging by the ratio of the difference values reduces the error and makes the calculation result better fit the scene.
3. The method comprises the steps of firstly obtaining a pixel value variation trend through the ratio of the difference value of the pixel values of the infrared images corresponding to each power transformation equipment at adjacent moments to the corresponding duration of the infrared images, and then obtaining a trend variation difference value through the difference value of the adjacent pixel value variation trends. By the method, the change condition of the infrared image of the power transformation equipment within a certain time can be acquired, and whether the power transformation equipment is abnormal in operation can be determined according to the change condition. In addition, the preset acquisition duration is adjusted according to the pixel value variation trend, so that the variation condition of the power transformation equipment can be acquired in time when the pixel value variation trend is large, and the condition deterioration caused by improper acquisition time can be prevented.
4. The preset threshold is adjusted through the environment information corresponding to the power transformation equipment, the current usage duration and the average value of the total number of target pixel points in the image set, and is further adjusted afterwards when an early-warning error occurs. In this way, the preset threshold can be adjusted dynamically so that it better fits the practical application scenario of the invention.
Drawings
Fig. 1 is a schematic diagram of a remote alarm processing method for an electrical fire based on the internet of things according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a remote alarm processing method for an electrical fire based on the internet of things according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Referring to fig. 1, which is a schematic diagram of a remote alarm processing method for an electrical fire based on the internet of things according to an embodiment of the present invention, an execution subject of the method shown in fig. 1 may be a software and/or hardware device. The execution subject of the present application may include, but is not limited to, at least one of: user equipment, network equipment, etc. The user equipment may include, but is not limited to, a computer, a smart phone, a Personal Digital Assistant (PDA), and the electronic devices mentioned above. The network device may include, but is not limited to, a single network server, a server group of multiple network servers, or a cloud of numerous computers or network servers based on cloud computing, wherein cloud computing is one type of distributed computing, a super virtual computer consisting of a cluster of loosely coupled computers. The present embodiment does not limit this. The electric fire remote alarm processing method based on the Internet of things comprises the following steps of S1 to S5:
the method comprises the steps of S1, obtaining a first infrared image of a power transformation room at a first time point and a second infrared image of the power transformation room at a second time point, obtaining the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and obtaining image difference degrees according to the same number of pixel points and the different numbers of the pixel points.
The first time point is a first time point for acquiring an infrared image, the first infrared image is an infrared image of the power transformation room corresponding to the first time point, the second time point is a second time point for acquiring the infrared image after a preset time period, and the second infrared image is an infrared image of the power transformation room corresponding to the second time point.
For example, if the first time point is ten o'clock and the preset time period is 10 minutes, the second time point is ten past ten.
The same number of pixel points is the number of pixel points judged to have the same pixel value in the first infrared image and the second infrared image, the different number of pixel points is the number of pixel points judged to have different pixel values in the two images, and the image difference degree is the degree of difference between the first infrared image and the second infrared image.
In some embodiments, the image difference may be obtained through steps S11 to S14, specifically as follows:
and S11, carrying out coordinate processing on the first infrared image and the second infrared image to obtain a coordinate value of each pixel point.
The first infrared image and the second infrared image are assigned coordinates with the same coordinate origin; for example, both take the image center point as the origin of coordinates.
And S12, acquiring coordinates of pixel points in the first infrared image and the second infrared image to generate a first coordinate set and a second coordinate set.
The first coordinate set is a set formed by coordinates of all pixel points of the first infrared image after the first infrared image is subjected to the coordinate processing, and the second coordinate set is a set formed by coordinates of all pixel points of the second infrared image after the second infrared image is subjected to the coordinate processing.
And S13, calculating the difference value between the pixel value of the pixel point positioned in the central coordinate and the pixel value of the pixel point positioned outside the preset range in the first coordinate set and the second coordinate set.
S14, obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference value corresponding to each pixel point in the first coordinate set and the second coordinate set, and obtaining the image difference degree according to the same number of the pixel points and the different numbers of the pixel points.
It will be appreciated that infrared images taken at different points in time are rarely identical due to factors such as light. Therefore, the pixel values of the pixel points located in the same coordinate may have some deviation, so that in order to reduce errors, the judgment is performed according to the difference between the pixel values of the pixel points of the coordinates outside the preset range and the pixel values of the pixel points of the central coordinate when the same pixel point and different pixel points are judged.
Specifically, the same pixel point and the different pixel points may be obtained through steps S141 to S144, and the image difference may be obtained.
And S141, collecting pixel values of the pixel point positioned at the central coordinate and the pixel points positioned at the coordinates outside the preset range in the first coordinate set, calculating a difference value of the pixel values to obtain a first difference value, and calculating a difference value of the pixel values of the pixel point positioned at the central coordinate and the pixel values of the pixel points positioned at the coordinates outside the preset range in the second coordinate set to obtain a second difference value.
The first difference value is the difference value between the pixel value of the pixel point positioned at the central coordinate in the first coordinate set and the pixel value of the pixel point positioned at each coordinate outside the preset range, and the second difference value is the difference value between the pixel value of the pixel point positioned at the central coordinate in the second coordinate set and the pixel value of the pixel point positioned at each coordinate outside the preset range.
In some embodiments, the preset range is constructed by:
and generating a preset range by taking the pixel point positioned at the central coordinate in the first coordinate set and the second coordinate set as a circle center and a preset radius as a radius.
In practical application, the preset radius can be preset by a worker according to practical situations.
It can be understood that, in order to reduce errors caused by the influence of factors such as light on the pixel values of pixel points located at the same coordinate, the present solution judges whether pixel points are the same by comparing the differences between the pixel values of pixel points outside the preset range and the pixel value of the pixel point at the central coordinate. Moreover, because the pixel value of the pixel point at the central coordinate is very similar to the pixel values of the pixel points at the surrounding coordinates, the pixel points within the preset range are excluded when the difference for each coordinate is calculated, which further reduces the error.
And acquiring the pixel quantity of all pixel points in the first coordinate set and the second coordinate set, and adjusting the preset range based on the pixel quantity to obtain an adjustment range. The method can adjust the preset range by combining the total number of the pixel points in the first infrared image and the second infrared image to obtain the corresponding adjustment range, and if the total number is larger, the corresponding adjustment range is larger.
The number of pixel points of the adjustment radius corresponding to the adjustment range is calculated by a formula (rendered as an image in the published document) involving the following quantities: the number of pixel points of the adjustment radius corresponding to the adjustment range, the number of pixels of all the pixel points, the preset number of pixels, the weight value of the adjustment range, and the number of pixels corresponding to the preset radius.
It can be seen from the formula that the number of pixels of all the pixel points is proportional to the number of pixel points of the adjustment radius corresponding to the adjustment range: the larger the number of pixels of all the pixel points, the larger the number of pixel points of the adjustment radius, because a larger total pixel count corresponds to a larger image and therefore to a larger adjustment range, which makes the subsequently calculated difference values relatively more accurate.
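As one possible reading of the proportional relationship just described, the following Python sketch scales the preset radius with the total pixel count; because the published formula is an image, the linear form and the names used here are assumptions:

def adjust_radius_px(total_px, preset_px, preset_radius_px, range_weight=1.0):
    # A larger total pixel count means a larger image, so the radius of the
    # excluded range around the centre pixel is scaled up in proportion.
    scale = 1.0 + range_weight * (total_px - preset_px) / preset_px
    return max(1, round(preset_radius_px * scale))

print(adjust_radius_px(total_px=640 * 480, preset_px=320 * 240, preset_radius_px=10))  # 40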
And S142, acquiring a first difference value and a second difference value corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set, and obtaining a difference value ratio corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set based on the first difference value and the second difference value.
And the difference ratio is the ratio of a first difference and a second difference corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set.
In some embodiments, the difference value ratio may be calculated by a formula (rendered as an image in the published document) involving the following quantities: the difference value ratio, the pixel value of the pixel point located at the central coordinate in the first coordinate set, the pixel values of the pixel points located at coordinates outside the preset range in the first coordinate set, the pixel value of the pixel point located at the central coordinate in the second coordinate set, the pixel values of the pixel points located at coordinates outside the preset range in the second coordinate set, and the weight value of the difference value ratio.
The larger the difference ratio is, the larger the difference between the pixel values of the pixels located at the same coordinate in the first coordinate set and the second coordinate set is, and based on this, the pixels located at the same coordinate in the first coordinate set and the second coordinate set can be divided into the same pixels or different pixels.
And S143, if the difference ratio is smaller than a preset difference ratio, taking the pixel points in the first coordinate set and the second coordinate set which are located at the same coordinate as the same pixel points, and if the difference ratio is larger than or equal to the preset difference ratio, taking the pixel points in the first coordinate set and the second coordinate set which are located at the same coordinate as different pixel points.
The same pixel points are the pixel points judged to have the same pixel values in the first coordinate set and the second coordinate set, and the different pixel points are the pixel points judged to have different pixel values in the first coordinate set and the second coordinate set.
It can be understood that if the difference ratio is smaller than the preset difference ratio, the pixel value similarity of the pixel points located at the same coordinate in the first coordinate set and the second coordinate set meets the preset requirement, and they can be judged to be the same pixel points; if the difference ratio is larger than or equal to the preset difference ratio, the pixel value similarity of the pixel points located at the same coordinate in the first coordinate set and the second coordinate set does not meet the preset requirement, and they can be judged to be different pixel points.
S144, counting the number of the same pixels to obtain the same number of the pixels, counting the number of the different pixels to obtain different numbers of the pixels, obtaining the total number of the pixels according to the sum of the same number of the pixels and the different numbers of the pixels, and obtaining the image difference degree based on the ratio of the same number of the pixels to the total number of the pixels.
Specifically, after all the same pixel points and all the different pixel points are obtained, the same number of the pixel points and the different number of the pixel points can be obtained through statistics. After summing the same number of pixel points and different numbers of pixel points, the image difference can be obtained according to the ratio of the same number of pixel points to the total number of the pixel points.
The higher the image difference degree (i.e., the ratio), the more similar the two images are and the less likely the power transformation room is abnormal at that time; conversely, the lower the image difference degree, the more dissimilar the two images are and the more likely the power transformation room is abnormal at that time.
The image difference degree obtained by the method can be used for judging whether the situation of the power transformation room is abnormal or not in real time, so that a worker can find problems in time.
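A minimal Python sketch of steps S141 to S144 on two equally sized infrared frames is given below. The exact difference-ratio formula is published as an image, so the absolute ratio of centre-minus-outside differences used here is an assumption; the classification into same/different pixel points and the final ratio follow the steps above:

import numpy as np

def image_difference_degree(img1, img2, radius_px, preset_ratio=1.2):
    """Sketch of S141-S144 on two 2-D infrared images of equal size."""
    h, w = img1.shape
    cy, cx = h // 2, w // 2                                  # centre coordinate
    yy, xx = np.ogrid[:h, :w]
    outside = (yy - cy) ** 2 + (xx - cx) ** 2 > radius_px ** 2
    # First/second difference: centre pixel value minus the pixel values at the
    # coordinates outside the preset range, one value per coordinate (S141).
    d1 = float(img1[cy, cx]) - img1[outside]
    d2 = float(img2[cy, cx]) - img2[outside]
    # Difference ratio per coordinate (assumed form: |d1 / d2|), S142.
    ratio = np.abs(d1 / np.where(d2 == 0, 1e-6, d2))
    same = np.count_nonzero(ratio < preset_ratio)            # same pixel points (S143)
    return same / ratio.size                                 # same / total (S144)

frame1 = np.random.randint(0, 256, (240, 320))
frame2 = frame1 + np.random.randint(-3, 3, (240, 320))
print(image_difference_degree(frame1, frame2, radius_px=10))  # close to 1: similar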
And S2, if the image difference degree is larger than the preset difference degree, obtaining a plurality of acquisition moments according to the current moment and a preset time period, extracting area images of the areas where each power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing them by time to generate an image set corresponding to each power transformation equipment.
The acquisition time is the time for acquiring the area images of the areas of the transformer equipment, and the image set is a set formed by the area images acquired by the transformer equipment at a plurality of acquisition times.
In some embodiments, the region image may be obtained by:
firstly, an area extraction frame corresponding to each power transformation device can be manually preset in an acquisition area of the infrared camera, and then an area image corresponding to each power transformation device is determined by the area extraction frame corresponding to each power transformation device.
Generally, the position of the power transformation equipment is not changed, so that the corresponding area image can be acquired by manually setting the area extraction frame corresponding to each power transformation equipment in advance.
In addition, when the position of the power transformation equipment is changed, the corresponding area extraction frame can be changed accordingly.
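A minimal sketch of cropping the area image of each power transformation equipment with such manually preset extraction frames might look as follows; the frame format (top, left, height, width) and the equipment names are illustrative assumptions:

import numpy as np

# Manually preset extraction frames: equipment id -> (top, left, height, width).
EXTRACTION_FRAMES = {
    "transformer_1": (20, 30, 80, 120),
    "transformer_2": (20, 180, 80, 120),
}

def extract_region_images(infrared_frame, frames=EXTRACTION_FRAMES):
    """Crop the area image of each power transformation equipment."""
    regions = {}
    for equipment_id, (top, left, height, width) in frames.items():
        regions[equipment_id] = infrared_frame[top:top + height, left:left + width]
    return regions

frame = np.zeros((240, 320))
print({name: region.shape for name, region in extract_region_images(frame).items()})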
It can be understood that if the image difference is greater than the preset difference, it indicates that there may be an abnormality in the power transformation equipment in the area where the power transformation room is located, and at this time, it may be determined which power transformation equipment has an abnormality by acquiring the area images corresponding to the power transformation equipment at different times.
In some embodiments, the above-described region image and the image set may be obtained by the following steps S21 to S25.
S21, determining a first acquisition time and a second acquisition time according to the current time and a preset acquisition time, and extracting area images of areas where the power transformation equipment is located in the infrared images based on the first acquisition time and the second acquisition time.
The first acquisition time is the time when the regional image corresponding to each power transformation device is acquired in the previous time, and the second acquisition time is the time when the regional image corresponding to each power transformation device is acquired in the next time.
For example, if the current time is ten o'clock and the preset acquisition duration is 1 minute, the first acquisition time is ten o'clock and the second acquisition time is one minute past ten.
Specifically, after the first acquisition time and the second acquisition time are acquired, the area images corresponding to the first acquisition time and the second acquisition time can be respectively extracted at the first acquisition time and the second acquisition time.
And S22, calculating based on the preset acquisition time and the pixel values of the two area images to obtain a pixel value change trend.
Wherein, the pixel value variation trend is the pixel value variation trend between two adjacent area images.
In some embodiments, the pixel value variation trend may be calculated by a ratio of a difference between pixel values corresponding to the two area images to a preset acquisition time.
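Under that ratio definition, the pixel value variation trend between two adjacent area images can be sketched as follows; the mean target pixel value used as the per-image pixel value anticipates step S31 below, and the names are illustrative:

def pixel_value_trend(prev_region_value, next_region_value, duration_s):
    # Ratio of the pixel value difference between the two area images to the
    # acquisition duration separating them (S22).
    return (next_region_value - prev_region_value) / duration_s

# Example: the mean region pixel value rose from 120 to 150 over 60 seconds.
print(pixel_value_trend(120.0, 150.0, 60.0))  # 0.5 per second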
S23, obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting preset acquisition duration based on the duration offset coefficient to obtain the current acquisition duration.
The time length offset coefficient is the ratio of the pixel value variation trend to the preset pixel value variation trend, and the current acquisition time length is the acquisition time length after the preset acquisition time length is adjusted.
It can be understood that, since the preset acquisition duration is fixed in advance, it does not change on its own; however, when a fire actually occurs the flame of the power transformation equipment rises very quickly, and the fire may spread rapidly within the preset acquisition duration. The preset acquisition duration is therefore dynamically adjusted so that a worker can be prompted of the fire in time.
In some embodiments, the current acquisition duration may be obtained by:
and adjusting the preset acquisition time length according to the time length offset coefficient to obtain an adjusted acquisition time length after adjustment, and obtaining the current acquisition time length based on the difference value of the preset acquisition time length and the adjusted acquisition time length.
The current acquisition duration is calculated by a formula (rendered as an image in the published document) involving the following quantities: the current acquisition duration, the preset acquisition duration, the pixel value variation trend, the preset pixel value variation trend, the duration offset coefficient, and the weight value for adjusting the acquisition duration.
The specific idea of the formula is as follows:
When the pixel value variation trend is larger, the difference between the pixel values of the two area images is larger, and such a larger difference is more likely to be caused by a fire; the corresponding current acquisition duration is therefore shortened accordingly, so that the area image at the next moment can be acquired in time and whether a fire occurs can be further determined according to the pixel value variation trends among a plurality of area images.
When the pixel value variation trend is smaller, the difference between the pixel values of the two area images is smaller, or the pixel value of the latter area image is smaller than that of the former area image; in this case the probability of a fire is very small, so the current acquisition duration can be set correspondingly longer while still ensuring monitoring of the power transformation equipment.
In practical application, the preset pixel value variation trend and the preset acquisition duration can be set correspondingly. The weight value for adjusting the acquisition duration can be set according to the actual situation: when the current acquisition duration is too small, it can be increased through this weight value, and when the current acquisition duration is too large, it can be decreased through this weight value.
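Since the published formula is an image, the Python sketch below only encodes the behaviour described in the text (a larger-than-expected trend shortens the interval, a smaller one lengthens it); the subtraction-based form and the clamping bounds are assumptions:

def current_acquisition_duration(preset_duration_s, trend, preset_trend,
                                 weight=0.5, min_s=5.0, max_s=300.0):
    # Duration offset coefficient: ratio of the observed pixel value variation
    # trend to the preset pixel value variation trend (S23).
    offset = trend / preset_trend
    # A larger-than-expected trend shortens the interval, a smaller one lengthens it.
    adjusted = weight * (offset - 1.0) * preset_duration_s
    return min(max_s, max(min_s, preset_duration_s - adjusted))

print(current_acquisition_duration(60.0, trend=1.5, preset_trend=0.5))  # shortened
print(current_acquisition_duration(60.0, trend=0.1, preset_trend=0.5))  # lengthened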
And S24, determining the next acquisition time according to the second acquisition time and the current acquisition time, deleting the first acquisition time, updating the second acquisition time to the first acquisition time, and determining the next acquisition time to be the second acquisition time.
And S25, repeatedly executing the steps and stopping execution after the preset requirements are met.
Specifically, the next acquisition time after the second acquisition time can be obtained by adding the current acquisition time to the second acquisition time, at this time, the original second acquisition time can be updated to the first acquisition time, the original first acquisition time is deleted, the next acquisition time is updated to the second acquisition time, and the steps are repeated to realize the acquisition of the regional image.
In practical applications, the preset requirement may be that the step is stopped after the preset number of area images are acquired, and the preset number may be preset by a worker according to an actual situation.
After the area images at the respective times are acquired, an image set corresponding to each power transformation device can be generated from all the area images.
By obtaining the area images and the image sets in the above manner, and by adjusting the preset acquisition duration to obtain the current acquisition duration, the area image corresponding to each power transformation equipment can be acquired in a timely manner, so that the staff can discover in time whether a fire has occurred in the power transformation equipment.
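Putting steps S21 to S25 together, one plausible acquisition loop is sketched below; capture_infrared_frame, extract_region, region_pixel_value and adjust_duration are hypothetical stand-ins for the thermal camera interface and the helpers sketched elsewhere in this description:

import time

def collect_image_set(equipment_id, preset_duration_s, preset_trend,
                      required_images, capture_infrared_frame,
                      extract_region, region_pixel_value, adjust_duration):
    """Roll the acquisition times forward (S21, S24, S25) until enough area
    images have been collected for one power transformation equipment."""
    image_set, values = [], []
    duration = preset_duration_s
    while len(image_set) < required_images:                  # preset requirement (S25)
        region = extract_region(capture_infrared_frame(), equipment_id)
        image_set.append(region)
        values.append(region_pixel_value(region))
        if len(values) >= 2:
            trend = (values[-1] - values[-2]) / duration                        # S22
            duration = adjust_duration(preset_duration_s, trend, preset_trend)  # S23
        if len(image_set) < required_images:
            time.sleep(duration)          # wait until the next acquisition time (S24)
    return image_set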
And S3, processing the pixel values of the images in the area corresponding to each adjacent time point in the image set to obtain a plurality of pixel value change trends, and obtaining trend change difference values corresponding to each image set based on the plurality of pixel value change trends.
Wherein, the trend change difference is the difference of the change trends of two adjacent pixel values.
In some embodiments, the above-described trend change difference value may be obtained through steps S31 to S33.
And S31, acquiring target pixel values of target pixel points in each regional image, and acquiring pixel values corresponding to each regional image according to the average value of all the target pixel values.
In practical application, a worker may preset a pixel value range, and take a pixel point falling in the pixel value range as the target pixel point, and take a pixel value corresponding to the target pixel point as the target pixel value. The pixel value range is, for example, the pixel value range corresponding to red, and the scheme can extract the pixel points corresponding to the heating area through the pixel value range.
It can be understood that the temperature of the power transformation equipment can rise when a fire breaks out, and the temperature of the power transformation equipment which continuously generates heat during normal work can also be very high, so that whether the power transformation equipment which continuously generates heat during normal work breaks out a fire can be judged by the pixel value of the target pixel point, and if the power transformation equipment which continuously generates heat during work continuously rises in temperature, the corresponding target pixel point and the target pixel value can also be correspondingly increased.
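The selection of target pixel points by a preset pixel value range, for example the range associated with the hottest (red) areas of the thermal image, can be sketched as follows; the concrete bounds are illustrative:

import numpy as np

def region_pixel_value(region, lo=200, hi=255):
    """Mean of the target pixel values, i.e. of the pixels whose values fall in
    the preset pixel value range corresponding to the heating area (S31)."""
    target = region[(region >= lo) & (region <= hi)]
    if target.size == 0:
        return 0.0                # no heating area visible in this area image
    return float(target.mean())

region = np.random.randint(0, 256, (80, 120))
print(region_pixel_value(region))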
S32, acquiring the acquisition time length between adjacent area images, and calculating the pixel values of the previous area image and the next area image between the adjacent area images in the image set based on the acquisition time length to obtain a plurality of pixel value variation trends.
It should be noted that the acquisition duration between adjacent region images may be a preset acquisition duration, and may also be a current acquisition duration. If the adjacent area images are the first group of adjacent area images, namely the collected first area image and the collected second area image, the collection time length is the preset collection time length, and if the adjacent area images are not the first group of adjacent area images, the collection time length is the current collection time length.
And S33, determining a former pixel value change trend and a latter pixel value change trend in the adjacent pixel value change trends, and obtaining a plurality of trend change difference values according to the difference value of the latter pixel value change trend and the former pixel value change trend.
The trend change difference is calculated by a formula (published as an image in the original document and not reproduced here), in which the quantities are: the trend change difference value; the pixel value variation trends of the adjacent region-image pairs; the target pixel value of each target pixel point in the previous region image; the total number of target pixel points in the previous region image; the quantity value of the target pixel points in the previous region image; the average pixel value corresponding to the previous region image; the target pixel value of each target pixel point in the subsequent region image; the total number of target pixel points in the subsequent region image; the quantity value of the target pixel points in the subsequent region image; the average pixel value corresponding to the subsequent region image; the acquisition duration between the adjacent region images; and the trend calculation weight value.
As can be seen from the above formula, the larger the difference between the latter pixel value variation trend and the former pixel value variation trend, the larger the change between adjacent pixel value variation trends, the larger the trend change difference, and the more likely it is that the temperature of the power transformation equipment keeps rising and that a fire is occurring.
The larger the difference between the average pixel value corresponding to the previous region image and the average pixel value corresponding to the subsequent region image, the larger the pixel value change between the adjacent region images, the larger the corresponding pixel value variation trend, and the more likely it is that the pixel value change between the adjacent region images is caused by a temperature rise of the power transformation equipment.
In practical applications, the trend calculation weight value can be preset according to the actual situation: when a pixel value variation trend is too large, it can be scaled down through the trend calculation weight value, and when it is too small, it can be scaled up through the trend calculation weight value.
The trend change difference obtained in the above manner reflects the pixel value change trend of each power transformation device in a timely manner, so that an early warning about a fire in the power transformation equipment can be issued in time.
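Because the formula itself is published only as an image, the sketch below assumes a simple reading of steps S32 and S33: each pixel value variation trend is the change in average target pixel value between adjacent region images divided by the acquisition duration and scaled by the trend calculation weight, and each trend change difference is the latter trend minus the former trend. The functional form and all names are assumptions, not the patented formula.

```python
def pixel_value_trends(averages, durations, weight=1.0):
    """Pixel value variation trends between adjacent region images (step S32).

    averages: average target pixel value of each region image, in time order.
    durations: acquisition duration between each pair of adjacent images
               (len(durations) == len(averages) - 1).
    weight: stand-in for the trend calculation weight value.
    """
    return [
        weight * (averages[i + 1] - averages[i]) / durations[i]
        for i in range(len(averages) - 1)
    ]

def trend_change_differences(trends):
    """Latter trend minus former trend for each adjacent pair (step S33)."""
    return [trends[i + 1] - trends[i] for i in range(len(trends) - 1)]
```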
And S4, acquiring environment information and current use time corresponding to the power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold.
The environment information is information on the surroundings of the power transformation equipment; the current use duration is the length of time the power transformation equipment has been in service, counted from when it was installed or last replaced up to the current time; and the current trend threshold is the threshold for the trend change difference at which the early-warning information is issued.
In practical application, a threshold value is generally set in advance, and corresponding early warning information is generated after various acquired data reach the threshold value. However, due to the influence of various factors, the preset threshold value is likely to be inaccurate, and therefore, in order to make the set threshold value more accurate, the scheme collects various factors which may influence the change of the threshold value to adjust the threshold value.
In some embodiments, the current trend threshold may be obtained by the following steps.
S41, obtaining an environment temperature value corresponding to the environment information according to a temperature collecting device, and calculating according to the environment temperature value and a preset temperature value to obtain a temperature offset coefficient.
The environment temperature value is the environment temperature collected by the temperature collecting device, and the temperature offset coefficient is the ratio of the environment temperature value to a preset temperature value.
And S42, acquiring the preset service life corresponding to each power transformation device, and calculating according to the preset service life and the current use duration to generate a service life offset value.
The service life offset value is the ratio of the preset service life to the current use duration.
S43, collecting the total number of target pixel points of all regional images in the image set, and calculating based on the average value of the total number of the target pixel points and the number of preset pixel points to obtain a number deviation value.
Specifically, the number of target pixel points corresponding to each regional image is obtained, then the number of the target pixel points in all the regional images is added to obtain the total number of all the target pixel points, and finally the total number of the target pixel points is averaged. The quantity deviation value is the ratio of the average value of the total quantity of the target pixel points to the preset pixel point quantity.
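Steps S41 to S43 each reduce to a ratio; the following sketch computes the three offset values under that reading, with illustrative names.

```python
def offset_values(ambient_temperature, preset_temperature,
                  preset_service_life, current_use_duration,
                  average_target_pixel_count, preset_pixel_count):
    """Temperature offset coefficient, service life offset and quantity offset."""
    temperature_offset = ambient_temperature / preset_temperature      # S41
    service_life_offset = preset_service_life / current_use_duration   # S42
    quantity_offset = average_target_pixel_count / preset_pixel_count  # S43
    return temperature_offset, service_life_offset, quantity_offset
```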
And S44, adjusting a preset trend threshold value according to the temperature offset coefficient, the service life offset value and the quantity offset value to obtain an adjusted current trend threshold value.
The current trend threshold is calculated by a formula (published as an image in the original document and not reproduced here), in which the quantities are: the current trend threshold; the environment temperature value; the preset temperature value; the preset service life; the current use duration; the average value of the total number of target pixel points; the preset number of pixel points; the preset trend threshold; and the weight value of the current trend threshold.
The general idea of the above formula is as follows.
The higher the environment temperature value, the higher the ambient temperature; a high ambient temperature may affect the temperature change of the power transformation equipment and cause it to heat up as well, so the current trend threshold can be appropriately adjusted upward to reduce the influence of the ambient temperature, and vice versa.
The larger the current use duration, the longer the power transformation equipment has been in service and the more easily a fault can lead to a fire, so the current trend threshold is also appropriately adjusted to reduce the influence of the current use duration, and vice versa.
The larger the average value of the total number of target pixel points, the closer the infrared collector is to the power transformation equipment; if the threshold were set low, it would be reached too easily and cause misjudgment, so the current trend threshold can be set larger to reduce the influence of the average value of the total number of target pixel points, and vice versa.
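The exact combination of these quantities is published only as an image, so the sketch below assumes a simple multiplicative adjustment of the preset trend threshold by the three offset values and the threshold weight; it is meant only to make the directions described above concrete and is not the patented formula.

```python
def current_trend_threshold(preset_threshold, weight,
                            temperature_offset, service_life_offset,
                            quantity_offset):
    """Illustrative multiplicative adjustment of the preset trend threshold.

    A hotter environment (temperature_offset > 1) or a closer infrared
    collector (quantity_offset > 1) raises the threshold, while a longer
    use duration (service_life_offset < 1) lowers it.
    """
    return (weight * preset_threshold * temperature_offset
            * service_life_offset * quantity_offset)
```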
In practical application, threshold misjudgment is likely to occur, so that the scheme adjusts the current trend threshold in the following manner on the basis of the manner described above:
S441, if correction information input by the staff through an input device is received, the correction information includes correction information indicating that an alarm was raised although no fire occurred and correction information indicating that a fire occurred but no alarm was raised.
It can be understood that a misjudgment occurs either when an alarm is raised although no fire has occurred or when a fire has occurred but no alarm has been raised; therefore, in order to reduce such threshold misjudgments, the threshold can be adjusted through the correction information input by the staff.
S442, if the correction information indicates that an alarm was raised although no fire occurred, an increase value is calculated from the current trend threshold and a preset up-regulation correction value, and the weight value of the current trend threshold is corrected based on the increase value to obtain a correction weight value.
S443, if the correction information indicates that a fire occurred but no alarm was raised, a decrease value is calculated from the current trend threshold and a preset down-regulation correction value, and the weight value of the current trend threshold is corrected based on the decrease value to obtain a correction weight value.
It can be understood that, if the correction information indicates that an alarm was raised although no fire occurred, the current trend threshold is too low, which causes an early warning to be issued when no fire has occurred; therefore, the current trend threshold needs to be increased to reduce such misjudgments.
If the correction information indicates that a fire occurred but no alarm was raised, the current trend threshold is too high, which prevents an early warning from being issued when a fire occurs; therefore, the threshold needs to be reduced to correct the misjudgment.
The preset down-regulation correction value and the preset up-regulation correction value can be set in advance by a worker according to actual conditions.
The correction weight value is calculated by a formula (published as an image in the original document and not reproduced here), in which the quantities are: the correction weight value; the preset up-regulation correction value; the increase correction weight value; the decrease correction weight value; and the preset down-regulation correction value.
The general idea of the above formula is as follows.
When an alarm has been raised although no fire occurred, the weight value of the current trend threshold can be increased, so that the adjusted correction weight value is used to adjust the current trend threshold at the next calculation. The magnitude of the increase can be adjusted through the current trend threshold and the preset up-regulation correction value.
When a fire has occurred but no alarm was raised, the weight value of the current trend threshold can be decreased, so that the adjusted correction weight value is used to adjust the current trend threshold at the next calculation. The magnitude of the decrease can be adjusted through the current trend threshold and the preset down-regulation correction value.
In practical applications, the preset up-regulation correction value, the increase correction weight value, the decrease correction weight value and the preset down-regulation correction value can all be set in advance by the staff according to the actual situation.
The current trend threshold and the correction weight value obtained in the above manner allow the threshold to be adjusted dynamically according to the actual situation, so that the adjusted threshold better matches the application scenario of the invention, and readjusting the threshold through the correction weight value makes it more accurate.
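To make the feedback loop of steps S441 to S443 concrete, the following sketch assumes the simplest reading: a false alarm nudges the threshold weight up by an amount derived from the current trend threshold and the preset up-regulation correction value, and a missed alarm nudges it down by an amount derived from the preset down-regulation correction value. The proportional form and all names are assumptions.

```python
def corrected_weight(weight, current_threshold, correction,
                     up_correction=0.1, down_correction=0.1):
    """Adjust the current-trend-threshold weight from operator feedback.

    correction: 'false_alarm'  -> an alarm was raised although no fire occurred;
                'missed_alarm' -> a fire occurred but no alarm was raised.
    up_correction / down_correction: stand-ins for the preset up- and
    down-regulation correction values.
    """
    if correction == "false_alarm":
        # Threshold was too low: raise the weight so the next current trend
        # threshold comes out higher.
        return weight + up_correction * current_threshold
    if correction == "missed_alarm":
        # Threshold was too high: lower the weight.
        return weight - down_correction * current_threshold
    return weight
```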
And S5, if the trend change difference value corresponding to the image set is larger than the current trend threshold value, the transformer equipment corresponding to the image set is used as early warning transformer equipment, and the early warning transformer equipment is sent to a fire disaster processing end.
It can be understood that if the trend change difference value corresponding to the image set is greater than the current trend threshold value, it indicates that a fire disaster is likely to occur in the power transformation equipment corresponding to the image set at this time, and therefore the power transformation equipment corresponding to the image set can be uploaded to the processing end as early warning power transformation equipment, so that a worker can process the early warning power transformation equipment in time.
In this way, the staff can handle the situation of the power transformation equipment in time, which helps prevent the enormous damage that a power transformation equipment fire could cause.
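Step S5 itself is a straightforward comparison; in the sketch below an image set is flagged when any of its trend change differences exceeds the device's current trend threshold, which is one plausible reading (names illustrative).

```python
def early_warning_devices(trend_differences_by_device, thresholds_by_device):
    """Return the devices whose trend change difference exceeds the threshold."""
    return [
        device_id
        for device_id, differences in trend_differences_by_device.items()
        if any(d > thresholds_by_device[device_id] for d in differences)
    ]
```

The returned device identifiers would then be sent to the fire processing end as the early-warning power transformation equipment.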
Referring to fig. 2, a schematic structural diagram of an electrical fire remote alarm processing device based on the internet of things according to an embodiment of the present invention is provided, where the electrical fire remote alarm processing device based on the internet of things includes:
the difference module is used for acquiring a first infrared image of the power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same number of pixel points and the different numbers of pixel points.
And the image module is used for obtaining a plurality of acquisition moments according to the current moment and a preset time period if the difference degree is greater than a preset difference degree, extracting area images of the areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sorting them by time to generate an image set corresponding to each power transformation device.
And the difference value module is used for processing the pixel values of the area images corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend change difference values corresponding to the image sets based on the pixel value variation trends.
And the threshold module is used for acquiring environment information and current use time corresponding to the power transformation equipment and the average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain the adjusted current trend threshold.
And the early warning module is used for taking the transformer equipment corresponding to the image set as early warning transformer equipment and sending the early warning transformer equipment to a fire disaster processing end if the trend change difference value corresponding to the image set is greater than the current trend threshold value.
The apparatus in the embodiment shown in fig. 2 may be correspondingly used to perform the steps in the method embodiment shown in fig. 1, and the implementation principle and technical effects are similar, which are not described herein again.
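The module split of fig. 2 can be mirrored directly in code; the sketch below only illustrates one possible composition of the five modules into a single pass, with every attribute and method name assumed for illustration.

```python
class ElectricalFireAlarmDevice:
    """Illustrative composition of the difference, image, difference value,
    threshold and early warning modules described above."""

    def __init__(self, difference_module, image_module, difference_value_module,
                 threshold_module, warning_module):
        self.difference_module = difference_module
        self.image_module = image_module
        self.difference_value_module = difference_value_module
        self.threshold_module = threshold_module
        self.warning_module = warning_module

    def run_once(self):
        # Compare the two infrared images; continue only when the image
        # difference degree exceeds the preset difference degree.
        degree = self.difference_module.compute()
        if degree <= self.difference_module.preset_degree:
            return []
        image_sets = self.image_module.collect()
        differences = self.difference_value_module.compute(image_sets)
        thresholds = self.threshold_module.compute(image_sets)
        return self.warning_module.dispatch(differences, thresholds)
```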
Referring to fig. 3, which is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention, the electronic device 30 includes: a processor 31, a memory 32 and a computer program; wherein
A memory 32 for storing the computer program, which may also be a flash memory (flash). The computer program is, for example, an application program, a functional module, or the like that implements the above-described method.
A processor 31 for executing the computer program stored in the memory to implement the steps performed by the apparatus in the above method. Reference may be made in particular to the description relating to the preceding method embodiment.
Alternatively, the memory 32 may be separate or integrated with the processor 31.
When the memory 32 is a device independent of the processor 31, the apparatus may further include:
a bus 33 for connecting the memory 32 and the processor 31.
The present invention also provides a readable storage medium, in which a computer program is stored, which, when being executed by a processor, is adapted to implement the methods provided by the various embodiments described above.
The readable storage medium may be a computer storage medium or a communication medium. Communication media include any medium that facilitates transfer of a computer program from one place to another. Computer storage media may be any available media that can be accessed by a general purpose or special purpose computer. For example, a readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC). Additionally, the ASIC may reside in user equipment. Of course, the processor and the readable storage medium may also reside as discrete components in a communication device. The readable storage medium may be a read-only memory (ROM), a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The present invention also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the device may read the execution instructions from the readable storage medium, and the execution of the execution instructions by the at least one processor causes the device to implement the methods provided by the various embodiments described above.
In the above embodiments of the apparatus, it should be understood that the Processor may be a Central Processing Unit (CPU), other general purpose processors, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of the hardware and software modules within the processor.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. An electric fire remote alarm processing method based on the Internet of things is characterized by comprising the following steps:
acquiring a first infrared image of a power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference according to the same number of pixel points and the different numbers of pixel points;
if the image difference is greater than the preset difference, obtaining a plurality of acquisition moments according to the current moment and a preset time period, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate image sets corresponding to the power transformation equipment;
processing pixel values of the images of the regions corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend variation difference values corresponding to the image sets based on the pixel value variation trends;
acquiring environment information and current use time corresponding to power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and if the trend change difference value corresponding to the image set is larger than the current trend threshold value, the transformer equipment corresponding to the image set is used as early warning transformer equipment, and the early warning transformer equipment is sent to a fire disaster processing end.
2. The method of claim 1,
acquiring a first infrared image of a power transformation room at a first time point and a second infrared image at a second time point, acquiring the same quantity of pixel points and different quantities of pixel points according to pixel values of pixel points at the same positions in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same quantity of the pixel points and the different quantities of the pixel points, wherein the image difference degrees comprise:
performing coordinate processing on the first infrared image and the second infrared image to obtain a coordinate value of each pixel point;
acquiring coordinates of pixel points in the first infrared image and the second infrared image to generate a first coordinate set and a second coordinate set;
calculating the difference value between the pixel value of the pixel point positioned in the central coordinate in the first coordinate set and the pixel value of the pixel point positioned outside the preset range in the second coordinate set;
and obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference values corresponding to the pixel points in the first coordinate set and the second coordinate set, and obtaining the image difference degree according to the same number of the pixel points and the different numbers of the pixel points.
3. The method of claim 2,
obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference value corresponding to each pixel point in the first coordinate set and the second coordinate set, and obtaining the image difference degree according to the same number of pixel points and the different numbers of pixel points, wherein the method comprises the following steps:
collecting pixel values of a pixel point positioned at the central coordinate and pixel points positioned at coordinates outside a preset range in the first coordinate set, calculating a difference value of the pixel values to obtain a first difference value, and calculating a difference value of the pixel values of a pixel point positioned at the central coordinate and pixel values of pixel points positioned at coordinates outside the preset range in the second coordinate set to obtain a second difference value;
acquiring a first difference value and a second difference value corresponding to pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set, and obtaining a difference value ratio corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set based on the first difference value and the second difference value;
if the difference ratio is smaller than a preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as the same pixel points, and if the difference ratio is larger than or equal to the preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as different pixel points;
and counting the number of the same pixel points to obtain the same number of the pixel points, counting the number of the different pixel points to obtain different numbers of the pixel points, obtaining the total number of the pixel points according to the sum of the same number of the pixel points and the different numbers of the pixel points, and obtaining the image difference degree based on the ratio of the same number of the pixel points to the total number of the pixel points.
4. The method of claim 3, wherein the preset range is constructed by:
generating a preset range by taking a pixel point positioned at the central coordinate in the first coordinate set and the second coordinate set as a circle center and a preset radius as a radius;
acquiring the pixel quantity of all pixel points in the first coordinate set and the second coordinate set, and adjusting the preset range based on the pixel quantity to obtain an adjustment range;
the number of pixel points of the adjustment radius corresponding to the adjustment range is calculated by a formula (published as an image in the original document), the quantities in the formula being: the number of pixel points of the adjustment radius corresponding to the adjustment range; the pixel quantity of all pixel points; the preset pixel quantity; the weight value of the adjustment range; and the number of pixel points corresponding to the preset radius.
5. The method of claim 1,
obtaining a plurality of acquisition moments according to the current moment and a preset time period, extracting area images of areas where all the power transformation equipment are located in the infrared image based on the acquisition moments, and sequencing according to time to generate image sets corresponding to all the power transformation equipment, wherein the image sets comprise:
determining a first acquisition time and a second acquisition time according to the current time and a preset acquisition time, and extracting area images of areas of transformer equipment in the infrared images based on the first acquisition time and the second acquisition time;
calculating based on the preset acquisition duration and the pixel values of the two region images to obtain a pixel value change trend;
obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting a preset acquisition duration based on the duration offset coefficient to obtain a current acquisition duration;
determining the next acquisition time according to the second acquisition time and the current acquisition time, deleting the first acquisition time, updating the second acquisition time to the first acquisition time, and determining the next acquisition time to be the second acquisition time;
and repeatedly executing the steps, and stopping executing after the preset requirements are met.
6. The method of claim 5,
obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting preset acquisition duration based on the duration offset coefficient to obtain current acquisition duration, wherein the method comprises the following steps:
adjusting the preset acquisition time length according to the time length offset coefficient to obtain an adjusted acquisition time length after adjustment, and obtaining the current acquisition time length based on the difference value between the preset acquisition time length and the adjusted acquisition time length;
the current acquisition duration is calculated by a formula (published as an image in the original document), the quantities in the formula being: the current acquisition duration; the preset acquisition duration; the pixel value variation trend; the preset pixel value variation trend; the duration offset coefficient; and the weight value for adjusting the acquisition duration.
7. The method of claim 6,
processing pixel values of the images in the regions corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend variation difference values corresponding to the image sets based on the pixel value variation trends, wherein the method comprises the following steps:
acquiring target pixel values of target pixel points in each regional image, and acquiring pixel values corresponding to each regional image according to the average value of all the target pixel values;
acquiring acquisition time length between adjacent area images, and calculating pixel values of a previous area image and a next area image between the adjacent area images in the image set based on the acquisition time length to obtain a plurality of pixel value change trends;
determining a former pixel value variation trend and a latter pixel value variation trend in the adjacent pixel value variation trends, and obtaining a plurality of trend variation difference values according to the difference value of the latter pixel value variation trend and the former pixel value variation trend;
the trend change difference is calculated by a formula (published as an image in the original document), the quantities in the formula being: the trend change difference value; the pixel value variation trends of the adjacent region-image pairs; the target pixel value of each target pixel point in the previous region image; the total number of target pixel points in the previous region image; the quantity value of the target pixel points in the previous region image; the average pixel value corresponding to the previous region image; the target pixel value of each target pixel point in the subsequent region image; the total number of target pixel points in the subsequent region image; the quantity value of the target pixel points in the subsequent region image; the average pixel value corresponding to the subsequent region image; the acquisition duration between the adjacent region images; and the trend calculation weight value.
8. The method of claim 7,
acquiring environmental information and current use time corresponding to the power transformation equipment, and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environmental information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold, wherein the method comprises the following steps:
acquiring an environment temperature value corresponding to the environment information according to a temperature acquisition device, and calculating according to the environment temperature value and a preset temperature value to obtain a temperature offset coefficient;
acquiring a preset service life corresponding to each transformer device, and calculating according to the preset service life and the current service life to generate a service life offset value;
collecting the total number of target pixel points of all regional images in an image set, and calculating based on the average value of the total number of the target pixel points and the number of preset pixel points to obtain a number deviation value;
adjusting a preset trend threshold value according to the temperature offset coefficient, the service life offset value and the quantity offset value to obtain an adjusted current trend threshold value;
the current trend threshold is calculated by a formula (published as an image in the original document), the quantities in the formula being: the current trend threshold; the environment temperature value; the preset temperature value; the preset service life; the current use duration; the average value of the total number of target pixel points; the preset number of pixel points; the preset trend threshold; and the weight value of the current trend threshold.
9. The method of claim 8, further comprising:
if correction information input by a worker through an input device is received, the correction information comprising correction information indicating that an alarm was raised although no fire occurred and correction information indicating that a fire occurred but no alarm was raised;
if the correction information indicates that an alarm was raised although no fire occurred, calculating an increase value from the current trend threshold and a preset up-regulation correction value, and correcting a weight value of the current trend threshold based on the increase value to obtain a correction weight value;
if the correction information indicates that a fire occurred but no alarm was raised, calculating a decrease value from the current trend threshold and a preset down-regulation correction value, and correcting the weight value of the current trend threshold based on the decrease value to obtain the correction weight value;
the correction weight value is calculated by a formula (published as an image in the original document), the quantities in the formula being: the correction weight value; the preset up-regulation correction value; the increase correction weight value; the decrease correction weight value; and the preset down-regulation correction value.
10. The utility model provides an electric fire remote alarm processing apparatus based on thing networking which characterized in that includes:
the difference module is used for acquiring a first infrared image of the power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same number of pixel points and the different numbers of pixel points;
the image module is used for obtaining a plurality of acquisition moments according to the current moment and a preset time period if the difference degree is greater than the preset difference degree, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate an image set corresponding to the power transformation equipment;
the difference module is used for processing pixel values of the images in the areas corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and trend variation differences corresponding to the image sets are obtained based on the pixel value trends;
the threshold module is used for acquiring environment information and current use duration corresponding to the power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use duration and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and the early warning module is used for taking the transformer equipment corresponding to the image set as early warning transformer equipment and sending the early warning transformer equipment to a fire disaster processing end if the trend change difference value corresponding to the image set is greater than the current trend threshold value.
CN202211237719.8A 2022-10-11 2022-10-11 Electrical fire remote alarm processing method and device based on Internet of things Active CN115311811B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211237719.8A CN115311811B (en) 2022-10-11 2022-10-11 Electrical fire remote alarm processing method and device based on Internet of things

Publications (2)

Publication Number Publication Date
CN115311811A true CN115311811A (en) 2022-11-08
CN115311811B CN115311811B (en) 2022-12-06

Family

ID=83868195

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211237719.8A Active CN115311811B (en) 2022-10-11 2022-10-11 Electrical fire remote alarm processing method and device based on Internet of things

Country Status (1)

Country Link
CN (1) CN115311811B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001067566A (en) * 1999-08-30 2001-03-16 Fujitsu Ltd Fire detecting device
US20030132847A1 (en) * 2002-01-14 2003-07-17 Anderson Kaare J. Method of detecting a fire by IR image processing
US20080136934A1 (en) * 2006-12-12 2008-06-12 Industrial Technology Research Institute Flame Detecting Method And Device
CN101840571A (en) * 2010-03-30 2010-09-22 杭州电子科技大学 Flame detection method based on video image
CN102208018A (en) * 2011-06-01 2011-10-05 西安工程大学 Method for recognizing fire disaster of power transmission line based on video variance analysis
CN102236947A (en) * 2010-04-29 2011-11-09 中国建筑科学研究院 Flame monitoring method and system based on video camera
CN108537202A (en) * 2018-04-19 2018-09-14 广州林邦信息科技有限公司 Forest fire identification device and system
CN109643482A (en) * 2016-06-28 2019-04-16 烟雾检测器有限责任公司 Use the smoke detection system and method for camera
CN110634261A (en) * 2019-08-27 2019-12-31 国网山东省电力公司泗水县供电公司 Fire early warning system and method for underground power distribution network
CN113551775A (en) * 2021-06-23 2021-10-26 国网福建省电力有限公司 Equipment fault on-line monitoring and alarming method and system based on infrared thermal imaging

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116189367A (en) * 2022-12-09 2023-05-30 嘉应学院 Building fire alarm system based on Internet of things
CN116189367B (en) * 2022-12-09 2023-09-26 嘉应学院 Building fire alarm system based on Internet of things
CN115841489A (en) * 2023-02-21 2023-03-24 华至云链科技(苏州)有限公司 Intelligent point inspection method and platform
CN116758079A (en) * 2023-08-18 2023-09-15 杭州浩联智能科技有限公司 Harm early warning method based on spark pixels
CN116758079B (en) * 2023-08-18 2023-12-05 杭州浩联智能科技有限公司 Harm early warning method based on spark pixels
CN116912780A (en) * 2023-09-12 2023-10-20 杭州慕皓新能源技术有限公司 Charging monitoring protection method and system based on mode dynamic switching
CN116912780B (en) * 2023-09-12 2023-11-24 国网浙江省电力有限公司杭州供电公司 Charging monitoring protection method and system based on mode dynamic switching

Also Published As

Publication number Publication date
CN115311811B (en) 2022-12-06

Similar Documents

Publication Publication Date Title
CN115311811B (en) Electrical fire remote alarm processing method and device based on Internet of things
CN111426393B (en) Temperature correction method, device and system
CN112232279B (en) Personnel interval detection method and device
CN106895791B (en) Board deformation monitoring and early warning system
CN113551775A (en) Equipment fault on-line monitoring and alarming method and system based on infrared thermal imaging
CN108989463B (en) Data processing method and device
CN114666473A (en) Video monitoring method, system, terminal and storage medium for farmland protection
WO2018149322A1 (en) Image identification method, device, apparatus, and data storage medium
CN108268076B (en) Big data-based machine room operation safety evaluation system
CN114401621A (en) Method, device, equipment and medium for determining air inlet temperature of server
CN116298737B (en) Switch cabinet discharge monitoring system, method and equipment
WO2024045428A1 (en) Detection method and apparatus for electrolytic polar plate, and electronic device and storage medium
JP2007310464A (en) Monitoring camera system and control method for the same
CN114204680B (en) Multi-type automatic detection equipment fusion remote diagnosis system and method
CN108345575A (en) A kind of steel tower thunder resisting equipment probability of malfunction computational methods and system
CN114636883A (en) Alternating current based power system fault determination method and device and storage medium
CN115479676A (en) Method and device for judging equipment state based on transformer substation temperature field
CN112037176A (en) Human presence detection device
CN204740690U (en) Infraredly prevent external damage early warning circuit
CN112729610A (en) Power equipment temperature data remote monitoring system and method based on Internet of things
CN111813983A (en) Abnormal body temperature early warning method, device and system
CN215575554U (en) Storage battery temperature monitoring system
CN111668838B (en) Method, device and equipment for measuring frequency response coefficient of power system
CN116957373A (en) Transformer equipment reliability evaluation method and system based on image processing
WO2021256170A1 (en) Abnormality determination system and power generation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant