CN115311811B - Electrical fire remote alarm processing method and device based on Internet of things - Google Patents
- Publication number
- CN115311811B (application CN202211237719.8A)
- Authority
- CN
- China
- Prior art keywords
- value
- pixel
- preset
- image
- pixel points
- Prior art date
- Legal status: Active
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B17/00—Fire alarms; Alarms responsive to explosion
- G08B17/12—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
- G08B17/125—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
Abstract
The invention provides an electrical fire remote alarm processing method and device based on the Internet of things. An image difference degree is obtained from the numbers of same and different pixel points; if the image difference degree is greater than a preset difference degree, a plurality of acquisition moments are obtained from the current moment and a preset time period, and a region image of the area where each power transformation equipment is located is extracted from the infrared image at each acquisition moment; the pixel values of the region images corresponding to adjacent time points in each image set are processed to obtain a plurality of pixel value variation trends, and a trend change difference value corresponding to each image set is obtained from these trends; a preset trend threshold is adjusted according to environment information, use duration, and the average of the total number of target pixel points to obtain a current trend threshold; and if the trend change difference value is greater than the current trend threshold, the power transformation equipment corresponding to the image set is taken as early-warning power transformation equipment and reported to a fire handling end.
Description
Technical Field
The invention relates to a data processing technology, in particular to an electric fire remote alarm processing method and device based on the Internet of things.
Background
An electrical fire is a fire ignited by electrical energy. It easily develops into a serious fire accident, carries electric-shock and explosion risks during firefighting, and therefore poses a greater hazard than other fires. In a substation housing a large amount of power transformation equipment in particular, the consequences of an electrical fire are unthinkable.
In the prior art, an infrared thermal imager is usually used to detect whether an electrical fire occurs. Since a thermal imager images the surface temperature of objects, it can serve as an effective fire-detection device. However, for power transformation equipment that continuously generates heat during normal operation, this approach has difficulty detecting whether a fire has occurred. How to use an infrared thermal imager to sense and detect a fire at power transformation equipment in time is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the invention provides an electric fire remote alarm processing method based on the Internet of things, which can sense the fire occurrence condition of a power transformation device in time by using an infrared thermal imager.
In a first aspect of the embodiments of the present invention, an electrical fire remote alarm processing method based on the internet of things is provided, including:
acquiring a first infrared image of a power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference according to the same number of pixel points and the different numbers of pixel points;
if the image difference is greater than the preset difference, obtaining a plurality of acquisition moments according to the current moment and a preset time period, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate image sets corresponding to the power transformation equipment;
processing pixel values of the images of the regions corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend variation difference values corresponding to the image sets based on the pixel value variation trends;
acquiring environment information and current use time corresponding to power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and if the trend change difference value corresponding to the image set is greater than the current trend threshold value, the transformer equipment corresponding to the image set is used as early warning transformer equipment, and the early warning transformer equipment is sent to a fire disaster processing end.
Optionally, in a possible implementation manner of the first aspect, obtaining a first infrared image of a substation at a first time point and a second infrared image at a second time point, obtaining the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and obtaining an image difference according to the same number of pixel points and the different numbers of pixel points includes:
performing coordinate processing on the first infrared image and the second infrared image to obtain a coordinate value of each pixel point;
acquiring coordinates of pixel points in the first infrared image and the second infrared image to generate a first coordinate set and a second coordinate set;
calculating, in each of the first coordinate set and the second coordinate set, the difference value between the pixel value of the pixel point located at the central coordinate and the pixel values of the pixel points located at coordinates outside a preset range;
and obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference values corresponding to the pixel points in the first coordinate set and the second coordinate set, and obtaining the image difference according to the same number of the pixel points and the different numbers of the pixel points.
Optionally, in a possible implementation manner of the first aspect, obtaining the same number of pixels and different numbers of pixels in the first coordinate set and the second coordinate set based on a ratio of differences corresponding to each pixel in the first coordinate set and the second coordinate set, and obtaining the image difference according to the same number of pixels and the different numbers of pixels includes:
collecting pixel values of a pixel point positioned at the central coordinate and pixel points positioned at coordinates outside a preset range in the first coordinate set, calculating a difference value of the pixel values to obtain a first difference value, and calculating a difference value of the pixel values of a pixel point positioned at the central coordinate and pixel values of pixel points positioned at coordinates outside the preset range in the second coordinate set to obtain a second difference value;
acquiring a first difference value and a second difference value corresponding to pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set, and obtaining a difference value ratio corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set based on the first difference value and the second difference value;
if the difference ratio is smaller than a preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as the same pixel points, and if the difference ratio is larger than or equal to the preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as different pixel points;
and counting the number of the same pixel points to obtain the same number of the pixel points, counting the number of the different pixel points to obtain different numbers of the pixel points, obtaining the total number of the pixel points according to the sum of the same number of the pixel points and the different numbers of the pixel points, and obtaining the image difference degree based on the ratio of the same number of the pixel points to the total number of the pixel points.
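The counting steps above can be sketched as follows. The claims do not fix numeric details, so the function names, the tolerance, and the reading of the difference-ratio test (a ratio of first to second difference close to 1 marks a "same" pixel) are illustrative assumptions, not the patent's definitions.

```python
import numpy as np

def image_difference_degree(img1, img2, tolerance=0.2, preset_radius=5):
    """Sketch of steps S13-S14 for two equally sized infrared images.

    In each image, the pixel at the central coordinate is compared with
    the pixels outside a circular preset range; a pixel whose ratio of
    first difference to second difference stays close to 1 (within
    `tolerance`) is counted as a "same" pixel. The image difference
    degree is, as in the claim, the ratio of same pixels to the total.
    """
    h, w = img1.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.mgrid[0:h, 0:w]
    outside = (yy - cy) ** 2 + (xx - cx) ** 2 > preset_radius ** 2

    # First/second difference: center pixel value minus each outside pixel value.
    d1 = img1[cy, cx].astype(float) - img1[outside].astype(float)
    d2 = img2[cy, cx].astype(float) - img2[outside].astype(float)

    # Difference ratio per pixel (guard against division by zero).
    ratio = np.abs(d1) / np.maximum(np.abs(d2), 1e-6)
    same = int(np.count_nonzero(np.abs(ratio - 1.0) < tolerance))
    return same / int(d1.size)
```

With this reading, two identical (non-flat) images score 1.0, and a localized hot spot in the second image lowers the score.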
Optionally, in a possible implementation manner of the first aspect, the preset range is constructed by:
generating a preset range by taking a pixel point positioned at a central coordinate in the first coordinate set and the second coordinate set as a circle center and a preset radius as a radius;
acquiring the pixel quantity of all pixel points in the first coordinate set and the second coordinate set, and adjusting the preset range based on the pixel quantity to obtain an adjustment range;
the number of pixel points of the adjustment radius corresponding to the adjustment range is calculated by the following formula,
wherein,The number of the pixel points of the adjustment radius corresponding to the adjustment range,for the number of pixels of all the pixel points,in order to preset the number of pixels,in order to adjust the weight value of the range,the number of pixels corresponding to the preset radius.
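Since the radius-adjustment formula image is not reproduced in the text, the sketch below is a hypothetical stand-in built only from the listed quantities: it scales the preset radius by the deviation of the actual pixel count from the preset pixel count, damped by a weight. The proportional form and all names are assumptions, not the patent's formula.

```python
def adjusted_radius_pixels(n_all, n_preset, r_preset, weight=0.5):
    """Hypothetical radius adjustment: more pixels than the preset count
    enlarges the preset radius proportionally; fewer pixels shrinks it.

    n_all    -- pixel quantity of all pixel points in the image
    n_preset -- preset pixel quantity
    r_preset -- number of pixels corresponding to the preset radius
    weight   -- weight value of the adjustment range (assumed damping)
    """
    return r_preset * (1.0 + weight * (n_all - n_preset) / n_preset)
```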
Optionally, in a possible implementation manner of the first aspect, obtaining multiple acquisition moments according to a current moment and a preset time period, extracting a region image of a region where each power transformation equipment is located in an infrared image based on the acquisition moments, and generating an image set corresponding to each power transformation equipment by sorting according to time includes:
determining a first acquisition time and a second acquisition time according to the current time and a preset acquisition duration, and extracting area images of the areas where each power transformation equipment is located in the infrared images based on the first acquisition time and the second acquisition time;
calculating based on the preset acquisition duration and the pixel values of the two region images to obtain a pixel value variation trend;
obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting a preset acquisition duration based on the duration offset coefficient to obtain a current acquisition duration;
determining the next acquisition time according to the second acquisition time and the current acquisition duration, deleting the first acquisition time, updating the second acquisition time to be the first acquisition time, and taking the next acquisition time as the second acquisition time;
and repeatedly executing the steps, and stopping executing after the preset requirements are met.
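The sliding two-point acquisition loop above can be sketched as follows. `trend_fn(t1, t2)` stands in for the image-based pixel value variation trend between the two acquisition times, and the duration-rescaling rule (shrink the duration when the trend exceeds the preset trend) is an assumption, since the patent's formula image is not reproduced.

```python
def collect_acquisition_times(t0, preset_duration, trend_fn, preset_trend,
                              n_steps=3, weight=0.5):
    """Sketch of the adaptive acquisition loop: keep a sliding pair of
    acquisition times; after each step, the observed trend adjusts the
    duration used to place the next acquisition time."""
    times = []
    duration = preset_duration
    t1 = t0
    t2 = t1 + duration
    for _ in range(n_steps):
        times.append(t1)
        trend = trend_fn(t1, t2)
        offset = trend / preset_trend  # duration offset coefficient (assumed form)
        duration = max(1.0, preset_duration - weight * preset_duration * (offset - 1.0))
        t1, t2 = t2, t2 + duration      # drop first time, slide the window
    times.append(t1)
    return times
```

When the observed trend equals the preset trend the spacing stays at the preset duration; a steeper trend packs acquisition times closer together.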
Optionally, in a possible implementation manner of the first aspect, obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting a preset acquisition duration based on the duration offset coefficient to obtain a current acquisition duration includes:
adjusting the preset acquisition duration according to the duration offset coefficient to obtain an adjusted acquisition duration, and obtaining the current acquisition duration based on the difference between the preset acquisition duration and the adjusted acquisition duration;
the current acquisition duration is calculated by a formula (the formula image is not reproduced in this text) over the following quantities: the current acquisition duration; the preset acquisition duration; the pixel value variation trend; the preset pixel value variation trend; the duration offset coefficient; and the weight value for adjusting the acquisition duration.
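Because the duration formula image is not reproduced, the sketch below gives one plausible reading of the step described in words: the preset duration is reduced in proportion to how much the observed trend exceeds the preset trend, scaled by a weight. The additive form and names are assumptions.

```python
def current_acquisition_duration(preset_duration, trend, preset_trend, weight=0.5):
    """Hypothetical duration formula: current duration = preset duration
    minus a weighted adjustment driven by the duration offset coefficient
    (observed trend relative to preset trend)."""
    offset = trend / preset_trend  # duration offset coefficient (assumed)
    return preset_duration - weight * preset_duration * (offset - 1.0)
```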
Optionally, in a possible implementation manner of the first aspect, processing pixel values of images in regions corresponding to adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining a trend variation difference value corresponding to each image set based on the plurality of pixel value variation trends includes:
obtaining target pixel values of target pixel points in each regional image, and obtaining pixel values corresponding to each regional image according to the average value of all the target pixel values;
acquiring acquisition time length between adjacent region images, and calculating pixel values of a previous region image and a next region image between the adjacent region images in the image set based on the acquisition time length to obtain a plurality of pixel value variation trends;
determining a former pixel value variation trend and a latter pixel value variation trend in the adjacent pixel value variation trends, and obtaining a plurality of trend variation difference values according to the difference value of the latter pixel value variation trend and the former pixel value variation trend;
the trend change difference value is calculated by a formula (the formula image is not reproduced in this text) over the following quantities: the trend change difference value; the adjacent pixel value variation trends; the target pixel values, the total number of target pixel points, and the target pixel point index in the previous region image, together with the pixel average of the previous region image; the corresponding quantities for the latter region image; the acquisition duration between the adjacent region images; and the trend calculation weight value.
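The chain described in S3 can be sketched as follows: average the target pixel values per region image, form a trend between each adjacent pair (change per unit acquisition duration), then take differences of adjacent trends. The use of a plain mean over target pixels and the single weight factor are assumptions, since the formula image is not reproduced.

```python
import numpy as np

def trend_change_differences(region_images, durations, weight=1.0):
    """Per-image target-pixel averages -> pixel value variation trends
    between adjacent images -> weighted differences of adjacent trends.

    region_images -- list of 2-D arrays (one per acquisition moment)
    durations     -- acquisition duration between each adjacent pair
    """
    means = [float(np.mean(img)) for img in region_images]
    trends = [(b - a) / d for a, b, d in zip(means, means[1:], durations)]
    return [weight * (t2 - t1) for t1, t2 in zip(trends, trends[1:])]
```

A steadily heating device yields small trend differences; an accelerating temperature rise yields large ones, which is what the alarm test keys on.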
Optionally, in a possible implementation manner of the first aspect, acquiring environment information and current use time corresponding to a power transformation device, and an average value of a total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time, and the average value of the total number of the target pixels to obtain an adjusted current trend threshold includes:
acquiring an environment temperature value corresponding to the environment information according to a temperature acquisition device, and calculating according to the environment temperature value and a preset temperature value to obtain a temperature offset coefficient;
acquiring a preset service life corresponding to each power transformation device, and calculating according to the preset service life and the current service life to generate a service life deviation value;
collecting the total number of target pixel points of all regional images in an image set, and calculating based on the average value of the total number of the target pixel points and the number of preset pixel points to obtain a number deviation value;
adjusting a preset trend threshold value according to the temperature offset coefficient, the service life offset value and the quantity offset value to obtain an adjusted current trend threshold value;
the current trend threshold is calculated by a formula (the formula image is not reproduced in this text) over the following quantities: the current trend threshold; the environment temperature value; the preset temperature value; the preset service life; the current use duration; the average of the total number of target pixel points; the preset pixel-point quantity; the preset trend threshold; and the weight value of the current trend threshold.
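Since the threshold formula image is not reproduced, the sketch below combines the three offsets named in S4 (temperature, service life, target-pixel count) into one hypothetical additive adjustment of the preset threshold, damped by a weight. The additive form and all names are assumptions.

```python
def current_trend_threshold(preset_threshold, env_temp, preset_temp,
                            used_hours, preset_life_hours,
                            avg_target_pixels, preset_pixels, weight=0.1):
    """Hypothetical threshold adjustment: each offset measures relative
    deviation from its preset value; their sum nudges the preset trend
    threshold up or down."""
    temp_offset = (env_temp - preset_temp) / preset_temp       # temperature offset coefficient
    life_offset = used_hours / preset_life_hours               # service life offset value
    count_offset = (avg_target_pixels - preset_pixels) / preset_pixels  # quantity offset value
    return preset_threshold * (1.0 + weight * (temp_offset + life_offset + count_offset))
```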
Optionally, in a possible implementation manner of the first aspect, if correction information input by a worker through an input device is received, the correction information includes false-alarm correction information (an alarm was raised but no fire occurred) and missed-alarm correction information (a fire occurred but no alarm was raised);
if the correction information is false-alarm correction information, calculating an up-regulation value from the current trend threshold and a preset up-regulation correction value, and correcting the weight value of the current trend threshold based on the up-regulation value to obtain a corrected weight value;
if the correction information is missed-alarm correction information, calculating a down-regulation value from the current trend threshold and a preset down-regulation correction value, and correcting the weight value of the current trend threshold based on the down-regulation value to obtain a corrected weight value;
the corrected weight value is calculated by a formula (the formula image is not reproduced in this text) over the following quantities: the corrected weight value; the preset up-regulation correction value; the up-regulation correction weight; the down-regulation correction weight; and the preset down-regulation correction value.
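The after-the-fact correction described in the claims can be sketched as follows; the additive rule is a hypothetical reading (the formula image is not reproduced): a false alarm raises the threshold weight so the system becomes less sensitive, a missed fire lowers it so the system becomes more sensitive.

```python
def corrected_weight(weight, up_correction=0.0, down_correction=0.0):
    """Hypothetical correction rule for the current trend threshold's
    weight value: add the preset up-regulation correction value on a
    false alarm, subtract the preset down-regulation correction value
    on a missed fire."""
    return weight + up_correction - down_correction
```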
In a second aspect of the embodiments of the present invention, an electrical fire remote alarm processing device based on the internet of things is provided,
optionally, in a possible implementation manner of the second aspect, the method includes:
the difference module is used for acquiring a first infrared image of the power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same number of pixel points and the different numbers of pixel points;
the image module is used for obtaining a plurality of acquisition moments according to the current moment and a preset time period if the difference degree is greater than the preset difference degree, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate an image set corresponding to the power transformation equipment;
the trend module is used for processing the pixel values of the region images corresponding to adjacent time points in each image set to obtain a plurality of pixel value variation trends, and obtaining a trend change difference value corresponding to each image set based on the pixel value variation trends;
the threshold module is used for acquiring environment information and current use time corresponding to the power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and the early warning module is used for taking the transformer equipment corresponding to the image set as early warning transformer equipment and sending the early warning transformer equipment to a fire disaster processing end if the trend change difference value corresponding to the image set is greater than the current trend threshold value.
The invention has the following beneficial effects:
1. The method first obtains, from the pixel values of the infrared images, the pixel value variation trends of the infrared image corresponding to each power transformation equipment at different moments, then obtains trend change difference values from adjacent pixel value variation trends, and compares them with a preset threshold to judge whether a fire has occurred. Compared with the prior art, this lets the infrared equipment monitor the fire situation of the power transformation equipment more accurately. In addition, the preset threshold is adjusted using factors that readily influence it, such as environmental factors and use duration, and is further adjusted after the fact, ensuring its accuracy. Compared with the prior art, the preset threshold can thus be dynamically adjusted to match the practical application scenario of the invention.
2. The method performs coordinate processing on infrared images of the power transformation room at different moments, calculates the difference values between pixel points at coordinates outside a preset range and the pixel point at the central coordinate in the infrared image at each moment, and judges from the resulting difference values whether the infrared images of the power transformation equipment have changed. In this way, it can be judged whether the situation in the power transformation room has changed between moments, and hence whether a fire may occur; computing ratios of difference values also reduces error, making the result fit the scene better.
3. The method first obtains a pixel value variation trend as the ratio of the difference between the pixel values of the infrared images of each power transformation equipment at adjacent moments to the corresponding duration, and then obtains trend change difference values from adjacent pixel value variation trends. In this way, the change in the infrared image of the power transformation equipment over a certain time can be captured, and whether the equipment is operating abnormally can be determined from it. In addition, the preset acquisition duration is adjusted according to the pixel value variation trend, so that when the trend is large, the change in the equipment is captured in time, preventing the situation from deteriorating because of an ill-timed acquisition.
4. The preset threshold value is adjusted through the environmental information corresponding to the power transformation equipment, the current use time and the average value of the total number of the target pixel points in the image set, and the preset threshold value is further adjusted when the error occurs in the post early warning. Through the method, the preset threshold value can be dynamically adjusted, so that the preset threshold value is more consistent with the practical application scene of the invention.
Drawings
Fig. 1 is a schematic diagram of a remote alarm processing method for an electrical fire based on the internet of things according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a remote alarm processing method for an electrical fire based on the internet of things according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, which is a schematic diagram of a method for processing an electrical fire remote alarm based on the internet of things according to an embodiment of the present invention, an execution subject of the method shown in fig. 1 may be a software and/or hardware device. The execution subject of the present application may include, but is not limited to, at least one of: user equipment, network equipment, etc. The user equipment may include, but is not limited to, a computer, a smart phone, a Personal Digital Assistant (PDA), the above mentioned electronic equipment, and the like. The network device may include, but is not limited to, a single network server, a server group of multiple network servers, or a cloud of numerous computers or network servers based on cloud computing, wherein cloud computing is one type of distributed computing, a super virtual computer consisting of a cluster of loosely coupled computers. The present embodiment does not limit this. The electric fire remote alarm processing method based on the Internet of things comprises the following steps of S1 to S5:
the method comprises the steps of S1, obtaining a first infrared image of a power transformation room at a first time point and a second infrared image of the power transformation room at a second time point, obtaining the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and obtaining image difference degrees according to the same number of pixel points and the different numbers of the pixel points.
The first time point is a first time point for acquiring an infrared image, the first infrared image is an infrared image of the power transformation room corresponding to the first time point, the second time point is a second time point for acquiring the infrared image after a preset time period, and the second infrared image is an infrared image of the power transformation room corresponding to the second time point.
For example, if the first time point is ten o'clock and the preset time period is 10 minutes, the second time point is ten past ten (10:10).
The number of identical pixel points is the number of pixel points classified in the first infrared image and the second infrared image as having the same pixel value, the number of different pixel points is the number of pixel points classified as having different pixel values, and the image difference degree characterizes the difference between the first infrared image and the second infrared image.
In some embodiments, the image difference may be obtained through steps S11 to S14, specifically as follows:
and S11, carrying out coordinate processing on the first infrared image and the second infrared image to obtain a coordinate value of each pixel point.
The first infrared image and the second infrared image are coordinated with the same origin of coordinates, for example, both are coordinated with the center point as the origin of coordinates.
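The coordinate processing of step S11 can be sketched as follows (an illustrative sketch; the function name and the row/column representation of the image are assumptions, not part of the patent):

```python
def to_center_coords(height, width):
    """Step S11: map each pixel index (row, col) to a coordinate whose
    origin is the image center, so that the first and second infrared
    images share the same origin of coordinates."""
    cy, cx = height // 2, width // 2
    return {(row, col): (col - cx, cy - row)  # x grows rightward, y upward
            for row in range(height) for col in range(width)}
```

For a 3×3 image the center pixel (row 1, column 1) maps to the origin, and the top-left pixel maps to (−1, 1).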
And S12, acquiring coordinates of pixel points in the first infrared image and the second infrared image to generate a first coordinate set and a second coordinate set.
The first coordinate set is a set formed by coordinates of all pixel points of the first infrared image after the first infrared image is subjected to the coordinate processing, and the second coordinate set is a set formed by coordinates of all pixel points of the second infrared image after the second infrared image is subjected to the coordinate processing.
And S13, for each of the first coordinate set and the second coordinate set, calculating the difference value between the pixel value of the pixel point located at the central coordinate and the pixel values of the pixel points located at coordinates outside a preset range.
S14, obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference values corresponding to the pixel points in the first coordinate set and the second coordinate set, and obtaining the image difference degree according to the same number of the pixel points and the different numbers of the pixel points.
It will be appreciated that infrared images taken at different points in time are rarely identical, due to the effects of light and the like, so the pixel values of pixel points located at the same coordinate may deviate somewhat. To reduce this error, identical and different pixel points are therefore judged according to the difference between the pixel values of the pixel points at coordinates outside the preset range and the pixel value of the pixel point at the central coordinate.
Specifically, the same pixel point and the different pixel points may be obtained through steps S141 to S144, and the image difference may be obtained.
And S141, collecting pixel values of the pixel point positioned at the central coordinate and the pixel points positioned at the coordinates outside the preset range in the first coordinate set, calculating a difference value of the pixel values to obtain a first difference value, and calculating a difference value of the pixel values of the pixel point positioned at the central coordinate and the pixel values of the pixel points positioned at the coordinates outside the preset range in the second coordinate set to obtain a second difference value.
The first difference value is the difference value between the pixel value of the pixel point positioned at the central coordinate in the first coordinate set and the pixel value of the pixel point positioned at each coordinate outside the preset range, and the second difference value is the difference value between the pixel value of the pixel point positioned at the central coordinate in the second coordinate set and the pixel value of the pixel point positioned at each coordinate outside the preset range.
In some embodiments, the preset range is constructed by:
and generating a preset range by taking the pixel point positioned at the central coordinate in the first coordinate set and the second coordinate set as a circle center and a preset radius as a radius.
In practical applications, the preset radius can be preset by a worker according to practical situations.
It can be understood that, to reduce errors caused by the influence of light and similar factors on the pixel values of pixels located at the same coordinate, the present solution judges whether two pixel points are identical from the difference between the pixel values outside the preset range and the pixel value at the center coordinate. Furthermore, since the pixel value of the pixel point at the central coordinate is very close to the pixel values of the pixel points at the surrounding coordinates, the pixel points inside the preset range are excluded when the differences are calculated, which further reduces the error.
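The first (or second) difference values of step S141 can be sketched as follows, assuming the preset range is a circle of a preset radius around the center pixel (function and variable names are illustrative):

```python
import math

def center_differences(image, preset_radius):
    """Step S141: difference between the pixel value at the central
    coordinate and the pixel value of every pixel point lying OUTSIDE
    the preset range (a circle of `preset_radius` around the center).
    `image` is a list of rows of pixel values."""
    h, w = len(image), len(image[0])
    cy, cx = h // 2, w // 2
    center_value = image[cy][cx]
    differences = {}
    for row in range(h):
        for col in range(w):
            # pixels inside the preset range are excluded to reduce error
            if math.hypot(row - cy, col - cx) > preset_radius:
                differences[(row, col)] = center_value - image[row][col]
    return differences
```

With a radius of 1 on a 3×3 image, only the four corner pixels lie outside the preset range and contribute difference values.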
And acquiring the pixel quantity of all pixel points in the first coordinate set and the second coordinate set, and adjusting the preset range based on the pixel quantity to obtain an adjustment range. The method can adjust the preset range by combining the total number of the pixel points in the first infrared image and the second infrared image to obtain the corresponding adjustment range, and if the total number is larger, the corresponding adjustment range is larger.
The number of pixel points of the adjustment radius corresponding to the adjustment range can be written as follows (the original formula image is not rendered in this text, so this is a reconstruction consistent with the variable definitions; the exact published form may differ):

R_a = α × (N / N_0) × R_0

where R_a is the number of pixel points of the adjustment radius corresponding to the adjustment range, N is the pixel number of all the pixel points, N_0 is the preset pixel number, α is the weight value of the adjustment range, and R_0 is the pixel number corresponding to the preset radius.

It can be seen from the formula that the pixel number N of all the pixel points is proportional to the number R_a of pixel points of the adjustment radius: the larger N is, the larger R_a and hence the adjustment range become, because a larger pixel count corresponds to a larger image. The subsequently calculated difference values are therefore relatively more accurate.
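Since the original formula image is not rendered in this text, the following sketch implements one reconstruction that matches the stated proportionality (the adjusted radius grows with the total pixel count); the exact published formula may differ:

```python
def adjusted_radius_pixels(total_pixels, preset_pixels,
                           preset_radius_pixels, weight):
    """Reconstruction of the adjustment-radius relation (the original
    formula image is not rendered in the text): the adjusted radius is
    proportional to the total pixel count, scaled against the preset
    pixel number and a weight value. The published form may differ."""
    return weight * (total_pixels / preset_pixels) * preset_radius_pixels
```

With twice the preset pixel count and unit weight, the adjusted radius doubles, reflecting the proportionality described above.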
And S142, acquiring a first difference value and a second difference value corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set, and obtaining a difference value ratio corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set based on the first difference value and the second difference value.
And the difference ratio is the ratio of a first difference and a second difference corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set.
In some embodiments, the difference ratio may be written as follows (the original formula image is not rendered in this text, so this is a reconstruction consistent with the variable definitions; the exact published form may differ):

P = β × |A_1 − B_1| / |A_2 − B_2|

where P is the difference ratio, A_1 is the pixel value of the pixel point located at the center coordinate in the first coordinate set, B_1 is the pixel value of a pixel point of the first coordinate set located at a coordinate outside the preset range, A_2 is the pixel value of the pixel point located at the center coordinate in the second coordinate set, B_2 is the pixel value of the corresponding pixel point of the second coordinate set located at the same coordinate outside the preset range, and β is the weight value of the difference ratio.
The larger the difference ratio is, the larger the difference between the pixel values of the pixels located at the same coordinate in the first coordinate set and the second coordinate set is, and based on this, the pixels located at the same coordinate in the first coordinate set and the second coordinate set can be divided into the same pixels or different pixels.
And S143, if the difference ratio is smaller than a preset difference ratio, taking the pixel points in the first coordinate set and the second coordinate set which are located at the same coordinate as the same pixel points, and if the difference ratio is larger than or equal to the preset difference ratio, taking the pixel points in the first coordinate set and the second coordinate set which are located at the same coordinate as different pixel points.
The same pixel points are the pixel points judged to have the same pixel values in the first coordinate set and the second coordinate set, and the different pixel points are the pixel points judged to have different pixel values in the first coordinate set and the second coordinate set.
It can be understood that if the difference ratio is smaller than the preset difference ratio, the pixel value similarity of the pixel points located at the same coordinate in the first coordinate set and the second coordinate set meets the preset requirement, and they can be judged to be identical pixel points; if the difference ratio is larger than or equal to the preset difference ratio, the pixel value similarity does not meet the preset requirement, and they can be judged to be different pixel points.
S144, counting the number of the same pixels to obtain the same number of the pixels, counting the number of the different pixels to obtain different numbers of the pixels, obtaining the total number of the pixels according to the sum of the same number of the pixels and the different numbers of the pixels, and obtaining the image difference degree based on the ratio of the same number of the pixels to the total number of the pixels.
Specifically, after all the same pixel points and all the different pixel points are obtained, the same number of the pixel points and the different number of the pixel points can be obtained through statistics on the same pixel points and the different pixel points. After summing the same number of pixel points and different numbers of pixel points, the image difference can be obtained according to the ratio of the same number of pixel points to the total number of the pixel points.
The higher the image difference degree (the ratio of identical pixel points to the total), the more similar the two images are and the less likely the power transformation room is abnormal at that time; conversely, the lower the image difference degree, the more dissimilar the two images are and the more likely the power transformation room is abnormal at that time.
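Steps S141 to S144 can be sketched end to end as follows; the per-coordinate difference ratio used here is a hedged reconstruction (the ratio of the center-minus-pixel difference in the first image to that in the second image, scaled by a weight), since the original formula image is not rendered:

```python
def image_difference_degree(img1, img2, preset_ratio, beta=1.0):
    """Sketch of steps S141-S144. The per-coordinate difference ratio is
    a reconstruction (original formula image not rendered): the ratio of
    (center - pixel) in the first image to (center - pixel) in the second,
    scaled by the weight `beta`. Ratios below `preset_ratio` mark identical
    pixel points. Returns (same_count, different_count, difference_degree)."""
    h, w = len(img1), len(img1[0])
    cy, cx = h // 2, w // 2
    c1, c2 = img1[cy][cx], img2[cy][cx]
    same = different = 0
    for row in range(h):
        for col in range(w):
            if (row, col) == (cy, cx):
                continue  # the center pixel is the reference, not compared
            d1 = c1 - img1[row][col]
            d2 = c2 - img2[row][col]
            if d2 != 0:
                ratio = beta * abs(d1 / d2)
            elif d1 == 0:
                ratio = beta            # both differences vanish: identical
            else:
                ratio = float('inf')    # difference in one image only
            if ratio < preset_ratio:
                same += 1
            else:
                different += 1
    total = same + different
    return same, different, same / total
```

For two identical images every difference ratio equals the weight, so all pixel points count as identical and the difference degree is 1.0.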
The image difference degree obtained by the method can be used for judging whether the situation of the power transformation room is abnormal or not in real time, so that a worker can find problems in time.
And S2, if the image difference is larger than the preset difference, obtaining a plurality of acquisition moments according to the current moment and the preset time period, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate an image set corresponding to the power transformation equipment.
The acquisition times are the times at which the area images of the areas where the power transformation equipment is located are acquired, and the image set is the set of area images acquired for each power transformation device at the multiple acquisition times.
In some embodiments, the region image may be obtained by:
firstly, an area extraction frame corresponding to each power transformation device can be manually preset in an acquisition area of the infrared camera, and then an area image corresponding to each power transformation device is determined by the area extraction frame corresponding to each power transformation device.
Generally, since the position of the power transformation device does not change, the region image corresponding to each power transformation device can be acquired by manually setting the region extraction frame corresponding to the power transformation device in advance.
In addition, when the position of the power transformation equipment is changed, the corresponding area extraction frame can be changed accordingly.
It can be understood that if the image difference is greater than the preset difference, it indicates that there may be an abnormality in the power transformation equipment in the area where the power transformation room is located, and at this time, it may be determined which power transformation equipment has an abnormality by acquiring the area images corresponding to the power transformation equipment at different times.
In some embodiments, the above-described region image and the image set may be obtained by the following steps S21 to S25.
S21, determining a first acquisition time and a second acquisition time according to the current time and a preset acquisition time, and extracting area images of areas where the power transformation equipment is located in the infrared images based on the first acquisition time and the second acquisition time.
The first acquisition time is the time when the regional image corresponding to each power transformation device is acquired in the previous time, and the second acquisition time is the time when the regional image corresponding to each power transformation device is acquired in the next time.
For example, if the current time is ten o'clock and the preset collection time is 1 minute, the first collection time is ten o'clock and the second collection time is one minute past ten (10:01).
Specifically, after the first acquisition time and the second acquisition time are acquired, the area images corresponding to the first acquisition time and the second acquisition time can be respectively extracted at the first acquisition time and the second acquisition time.
And S22, calculating based on the preset acquisition time and the pixel values of the two area images to obtain a pixel value variation trend.
Wherein, the pixel value variation trend is the pixel value variation trend between two adjacent area images.
In some embodiments, the pixel value variation trend may be calculated by a ratio of a difference between pixel values corresponding to the two area images to a preset acquisition time.
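The pixel value variation trend of step S22 — the ratio of the pixel value difference between the two area images to the acquisition duration — can be sketched as (names are illustrative):

```python
def pixel_value_trend(earlier_value, later_value, acquisition_duration):
    """Step S22: the pixel value variation trend is the ratio of the pixel
    value difference between two area images to the acquisition duration
    separating them; positive when the later image is brighter (hotter)."""
    return (later_value - earlier_value) / acquisition_duration
```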
S23, obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting the preset acquisition duration based on the duration offset coefficient to obtain the current acquisition duration.
The time length offset coefficient is the ratio of the pixel value variation trend to the preset pixel value variation trend, and the current acquisition time length is the acquisition time length after the preset acquisition time length is adjusted.
It can be understood that the preset acquisition duration, being preset, does not change, whereas flames spread very quickly when a fire actually breaks out at the power transformation equipment, so a fire may extend rapidly within the preset acquisition duration. The preset acquisition duration is therefore adjusted dynamically so that workers can be reminded of a fire in time.
In some embodiments, the current acquisition duration may be obtained by:
and adjusting the preset acquisition time length according to the time length offset coefficient to obtain an adjusted acquisition time length after adjustment, and obtaining the current acquisition time length based on the difference value of the preset acquisition time length and the adjusted acquisition time length.
The current acquisition duration can be written as follows (the original formula image is not rendered in this text, so this is a reconstruction consistent with the variable definitions; the exact published form may differ):

T = T_0 × (1 − γ × k / k_0)

where T is the current acquisition duration, T_0 is the preset acquisition duration, k is the pixel value variation trend, k_0 is the preset pixel value variation trend, k / k_0 is the duration offset coefficient, and γ is the weight value for adjusting the acquisition duration.
The idea behind the formula is as follows:

The larger the pixel value variation trend is, the more likely it is that the difference between the pixel values of the two area images has grown because of a fire, so the current acquisition duration is shortened correspondingly; the area image at the next moment can then be acquired in time, and whether a fire has occurred is further determined from the pixel value variation trend across multiple area images.

The smaller the pixel value variation trend is — the difference between the pixel values of the two area images is small, or the pixel value of the latter area image is even smaller than that of the former — the smaller the probability of a fire, so the current acquisition duration is set correspondingly longer while still ensuring that the power transformation equipment is monitored.

In practical applications, the preset pixel value variation trend and the preset acquisition duration can be set correspondingly. The weight value for adjusting the acquisition duration can also be set according to actual conditions: when the current acquisition duration is too small, the weight value can be adjusted to increase it, and when the current acquisition duration is too large, the weight value can be adjusted to reduce it.
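Since the original duration formula image is not rendered in this text, the following sketch implements one reconstruction consistent with the behavior described above (a rising trend shortens the interval; a flat or falling trend lengthens it); the published form may differ:

```python
def current_acquisition_duration(preset_duration, trend, preset_trend, weight):
    """Reconstruction of the duration update (original formula image not
    rendered): the preset duration minus an adjustment proportional to the
    duration offset coefficient trend / preset_trend. A strongly rising
    trend shortens the interval; a flat or falling trend lengthens it."""
    offset_coefficient = trend / preset_trend
    return preset_duration * (1 - weight * offset_coefficient)
```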
And S24, determining the next acquisition time according to the second acquisition time and the current acquisition time, deleting the first acquisition time, updating the second acquisition time to the first acquisition time, and determining the next acquisition time to be the second acquisition time.
And S25, repeatedly executing the steps and stopping execution after the preset requirements are met.
Specifically, the next acquisition time after the second acquisition time can be obtained by adding the current acquisition time to the second acquisition time, at this time, the original second acquisition time can be updated to the first acquisition time, the original first acquisition time is deleted, the next acquisition time is updated to the second acquisition time, and the steps are repeated, so that the acquisition of the area image is realized.
In practical applications, the preset requirement may be that the step is stopped after the preset number of area images are acquired, and the preset number may be preset by a worker according to an actual situation.
After the area images at the respective times are acquired, an image set corresponding to each power transformation device can be generated from all the area images.
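Steps S21 to S25 can be sketched as a rolling schedule in which each observed trend sets the next acquisition interval (the duration update is a hedged reconstruction, since the original formula image is not rendered; names are illustrative):

```python
def acquisition_times(start, preset_duration, observed_trends,
                      preset_trend=1.0, weight=0.5):
    """Sketch of steps S21-S25: the first two capture times are separated
    by the preset duration; each later interval is the preset duration
    scaled down when the observed pixel value trend is rising and
    stretched when it is flat or falling (a reconstruction)."""
    times = [start, start + preset_duration]
    for trend in observed_trends:
        interval = preset_duration * (1 - weight * trend / preset_trend)
        times.append(times[-1] + interval)
    return times
```

Starting at time 0 with a 60-second preset duration, a rising trend pulls the third capture in to 30 seconds after the second, while a falling trend pushes the fourth capture out to 90 seconds later.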
With the area images and image sets obtained in this manner, and with the current acquisition duration obtained by adjusting the preset acquisition duration, the area images corresponding to each power transformation device can be acquired in time, so that workers can promptly discover whether a power transformation device has caught fire.
And S3, processing the pixel values of the images in the area corresponding to each adjacent time point in the image set to obtain a plurality of pixel value change trends, and obtaining trend change difference values corresponding to each image set based on the plurality of pixel value change trends.
Wherein, the trend change difference is the difference between two adjacent pixel value variation trends.
In some embodiments, the above-described trend change difference value may be obtained through steps S31 to S33.
And S31, acquiring target pixel values of target pixel points in each regional image, and acquiring pixel values corresponding to each regional image according to the average value of all the target pixel values.
In practical application, a worker may preset a pixel value range, take a pixel point falling in the pixel value range as the target pixel point, and take a pixel value corresponding to the target pixel point as the target pixel value. The pixel value range is, for example, the pixel value range corresponding to red, and the scheme can extract the pixel points corresponding to the heating area through the pixel value range.
It can be understood that the temperature of the power transformation equipment rises when a fire breaks out, but equipment that continuously generates heat during normal work can also run very hot. Whether such equipment has caught fire can therefore be judged from the pixel values of the target pixel points: if the temperature of equipment that generates heat during work keeps rising, the corresponding target pixel points and target pixel values also increase accordingly.
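Step S31 can be sketched as follows, assuming the pixel value range is given as an inclusive (low, high) pair (names are illustrative):

```python
def region_pixel_average(region_image, value_range):
    """Step S31: pixel points whose value falls inside `value_range`
    (e.g. the range rendered as red in the infrared palette) are the
    target pixel points; the region's pixel value is the mean of their
    target pixel values. Returns (average, target_count)."""
    low, high = value_range
    targets = [v for row in region_image for v in row if low <= v <= high]
    if not targets:
        return 0.0, 0
    return sum(targets) / len(targets), len(targets)
```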
S32, acquiring the acquisition time length between adjacent area images, and calculating the pixel values of the previous area image and the next area image between the adjacent area images in the image set based on the acquisition time length to obtain a plurality of pixel value variation trends.
It should be noted that the acquisition time duration between adjacent area images may be a preset acquisition time duration or a current acquisition time duration. If the adjacent area images are the first group of adjacent area images, namely the collected first area image and the collected second area image, the collection time length is the preset collection time length, and if the adjacent area images are not the first group of adjacent area images, the collection time length is the current collection time length.
And S33, determining a former pixel value change trend and a latter pixel value change trend in the adjacent pixel value change trends, and obtaining a plurality of trend change difference values according to the difference value of the latter pixel value change trend and the former pixel value change trend.
The trend change difference can be written as follows (the original formula image is not rendered in this text, so this is a reconstruction consistent with the variable definitions; the exact published form may differ):

D = δ × (k_{i+1} − k_i), with k_i = (V_{i+1} − V_i) / t_i and V_i = (Σ_j x_{i,j}) / n_i

where D is the trend change difference, k_i and k_{i+1} are adjacent pixel value variation trends, x_{i,j} is the j-th target pixel value of the target pixel points in the i-th area image, n_i is the total number of target pixel points in the i-th area image, V_i is the pixel average value corresponding to the i-th area image, t_i is the acquisition duration between the i-th and (i+1)-th adjacent area images, and δ is the trend calculation weight value.
As can be seen from the formula, the larger the difference between two adjacent pixel value variation trends, the larger the change between them and the larger the trend change difference, and the more likely it is that the temperature of the power transformation equipment is rising continuously and a fire is occurring.

Likewise, the larger the difference between the pixel average value of the previous area image and that of the latter area image, the larger the pixel value change between the adjacent area images and the larger the corresponding pixel value variation trend; such a change is most likely caused by a temperature rise of the power transformation equipment.

In practical applications, the trend calculation weight value can be preset according to actual conditions: when a pixel value variation trend is too large, it can be scaled down through the trend calculation weight value, and when it is too small, it can be scaled up through the trend calculation weight value.
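Steps S32 and S33 can be sketched as follows, computing consecutive trends from the per-image averages and then their weighted differences (a hedged sketch; the original formula image is not rendered, so the exact published form may differ):

```python
def trend_change_differences(region_averages, durations, weight=1.0):
    """Sketch of steps S32-S33: consecutive pixel value variation trends
    are the change between successive region-image averages divided by the
    acquisition duration between them; the trend change difference is the
    weighted difference of consecutive trends (a reconstruction)."""
    trends = [(later - earlier) / t
              for earlier, later, t in zip(region_averages,
                                           region_averages[1:], durations)]
    return [weight * (k2 - k1) for k1, k2 in zip(trends, trends[1:])]
```

Averages that climb at an accelerating rate yield a positive trend change difference; a steady climb yields zero.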
The trend change difference obtained in the above manner can timely obtain the pixel value change trend corresponding to each power transformation device, so as to timely give an early warning to the fire occurrence condition of the power transformation device.
And S4, acquiring environment information and current use time corresponding to the power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold.
The environmental information is information about the surrounding environment of the power transformation equipment, the current use duration is the length of time the power transformation equipment has been in service since it was installed or last replaced, and the current trend threshold is the threshold on the trend change difference at which early warning information is issued.
In practical application, a threshold value is generally set in advance, and corresponding early warning information is generated after various acquired data reach the threshold value. However, due to the influence of various factors, the preset threshold value is likely to be inaccurate, and therefore, in order to make the set threshold value more accurate, the scheme collects various factors which may influence the change of the threshold value to adjust the threshold value.
In some embodiments, the current trend threshold may be obtained by the following steps.
S41, obtaining an environment temperature value corresponding to the environment information according to a temperature collecting device, and calculating according to the environment temperature value and a preset temperature value to obtain a temperature offset coefficient.
The environment temperature value is the environment temperature collected by the temperature collecting device, and the temperature offset coefficient is the ratio of the environment temperature value to a preset temperature value.
And S42, acquiring the preset service life corresponding to each transformer device, calculating according to the preset service life and the current service time, and generating a service life deviation value.
The service life deviant is the ratio of the preset service life to the current service life.
S43, collecting the total number of target pixel points of all regional images in the image set, and calculating based on the average value of the total number of the target pixel points and the number of preset pixel points to obtain a number deviation value.
Specifically, the number of target pixel points corresponding to each regional image is obtained, then the number of the target pixel points in all the regional images is added to obtain the total number of all the target pixel points, and finally the total number of the target pixel points is averaged. The quantity deviation value is the ratio of the average value of the total quantity of the target pixel points to the preset pixel point quantity.
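The three ratios of steps S41 to S43 can be sketched directly, since each coefficient is defined in the text as a plain ratio (names are illustrative):

```python
def offset_coefficients(ambient_temp, preset_temp,
                        preset_life, current_use,
                        avg_target_pixels, preset_pixels):
    """Steps S41-S43, using the ratios stated in the text: temperature
    offset = ambient / preset temperature, service-life offset = preset
    life / current use duration, quantity offset = average target pixel
    total / preset pixel number."""
    return (ambient_temp / preset_temp,
            preset_life / current_use,
            avg_target_pixels / preset_pixels)
```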
And S44, adjusting a preset trend threshold value according to the temperature offset coefficient, the service life offset value and the quantity offset value to obtain an adjusted current trend threshold value.
The current trend threshold can be written as follows (the original formula image is not rendered in this text, so this is a reconstruction consistent with the variable definitions; the exact published form may differ):

Y = ω × (T_e / T_p) × (L_0 / L) × (M / M_0) × Y_0

where Y is the current trend threshold, T_e is the environment temperature value, T_p is the preset temperature value, L_0 is the preset service life, L is the current use duration, M is the average value of the total number of target pixel points, M_0 is the preset pixel point number, Y_0 is the preset trend threshold, and ω is the weight value of the current trend threshold.
The general concept of the above formula is:
ambient temperature valueThe higher the ambient temperature is, and when the ambient temperature is higher, the temperature change of the power transformation equipment may be affected, so that the power transformation equipment is also heated, and therefore the current trend threshold value can be setAnd is also adjusted to be larger appropriately to reduce the influence of the ambient temperature, and vice versa.
Length of current useThe larger the current trend threshold value, the longer the service time of the power transformation equipment is, and the more easily the fire disaster is caused by the fault correspondingly, so that the current trend threshold value can be setAlso properly reduced to reduce the current use timeThe resulting effect and vice versa.
When the average value of the total number of the target pixelsThe larger the distance, the closer the infrared collector is to the power transformation device, and if the threshold is set to be low, the threshold is likely to be reached to cause misjudgment, so the method can be used asFront trend thresholdLarger in setting to reduce the average of the total number of target pixelsThe resulting effect and vice versa.
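Since the original threshold formula image is not rendered in this text, the following sketch implements one reconstruction consistent with the three tendencies described above; the published form may differ:

```python
def current_trend_threshold(preset_threshold, temp_coeff,
                            life_coeff, qty_coeff, weight=1.0):
    """Reconstruction of the threshold update (original formula image not
    rendered): the preset threshold scaled by the three coefficients, so a
    hotter environment, a shorter accumulated use (larger life coefficient),
    and a closer camera all raise the threshold, as the text describes."""
    return weight * temp_coeff * life_coeff * qty_coeff * preset_threshold
```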
In practical application, threshold misjudgment is likely to occur, so that the scheme adjusts the current trend threshold in the following manner on the basis of the manner described above:
and S441, if correction information input by the staff through the input device is received, the correction information comprises correction information of no fire alarm and correction information of no fire alarm.
It can be understood that an alarm may be issued when no fire has occurred, or a fire may occur without an alarm being issued. To reduce such threshold misjudgments, the threshold is adjusted through the correction information input by the workers.
And S442, if the correction information is the 'alarm issued but no fire occurred' correction information, calculating an increase value from the current trend threshold and a preset up-adjustment correction value, and correcting the weight value of the current trend threshold based on the increase value to obtain a corrected weight value.

And S443, if the correction information is the 'fire occurred but no alarm issued' correction information, calculating a decrease value from the current trend threshold and a preset down-adjustment correction value, and correcting the weight value of the current trend threshold based on the decrease value to obtain a corrected weight value.
It can be understood that if the correction information is the 'alarm issued but no fire occurred' correction information, the current trend threshold is too low, causing an early warning to be issued when no fire has occurred; the trend threshold therefore needs to be increased to improve the threshold misjudgment.

If the correction information is the 'fire occurred but no alarm issued' correction information, the current trend threshold is too high, so no early warning was issued when a fire occurred; the current trend threshold therefore needs to be reduced to improve the threshold misjudgment.
The preset down-regulation correction value and the preset up-regulation correction value can be set in advance by a worker according to actual conditions.
The correction weight value is calculated by the following formula,
wherein the symbols denote, respectively: the corrected weight value, the preset up-adjustment correction value, the up-adjustment correction weight, the down-adjustment correction weight, and the preset down-adjustment correction value.
The overall concept of the above formula is:
When an alarm is raised although no fire has occurred, the weight value of the current trend threshold is increased, so that at the next calculation the adjusted corrected weight value is used to adjust the current trend threshold; the magnitude of the increase is determined by the current trend threshold and the preset up-adjustment correction value.
When a fire occurs but no alarm is raised, the weight value of the current trend threshold is decreased, so that at the next calculation the adjusted corrected weight value is used to adjust the current trend threshold; the magnitude of the decrease is determined by the current trend threshold and the preset down-adjustment correction value.
In practical application, the preset up-adjustment correction value, the up-adjustment correction weight, the down-adjustment correction weight, and the preset down-adjustment correction value can all be set in advance by the worker according to the actual situation.
The current trend threshold and corrected weight value obtained in this way allow the threshold to be adjusted dynamically according to the actual situation, so that the adjusted threshold better fits the applicable scenario of the invention, and readjusting the threshold through the corrected weight value makes it more accurate.
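The correction mechanism of steps S441–S443 can be sketched as below. The additive update form, function name, and parameter names are assumptions: the patent gives the corrected-weight formula only symbolically, stating only that the increase depends on the current trend threshold and the preset up-adjustment correction value, and the decrease on the threshold and the preset down-adjustment correction value.

```python
def correct_weight(weight, threshold, up_corr, down_corr, up_w, down_w,
                   correction):
    # Hedged sketch of S441-S443: nudge the threshold's weight value up after
    # a false alarm (alarm raised, no fire) and down after a missed alarm
    # (fire occurred, no alarm). The update magnitudes are illustrative.
    if correction == "false_alarm":      # threshold was too low -> raise it
        return weight + up_w * (threshold + up_corr)
    elif correction == "missed_alarm":   # threshold was too high -> lower it
        return weight - down_w * (threshold + down_corr)
    return weight                        # no correction information received
```

On the next threshold calculation the returned value would replace the weight value of the current trend threshold, closing the feedback loop the text describes.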
And S5, if the trend change difference value corresponding to the image set is larger than the current trend threshold value, the transformer equipment corresponding to the image set is used as early warning transformer equipment, and the early warning transformer equipment is sent to a fire disaster processing end.
It can be understood that if the trend change difference value corresponding to the image set is greater than the current trend threshold value, it indicates that the power transformation equipment corresponding to the image set is likely to have a fire, and therefore, the power transformation equipment corresponding to the image set can be used as early warning power transformation equipment to be uploaded to a processing end, so that a worker can process the early warning power transformation equipment in time.
In this way, workers can respond to the condition of the power transformation equipment in a timely manner, preventing the enormous harm that a fire in the power transformation equipment would cause.
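The early-warning decision of step S5 amounts to a per-image-set comparison; the function and field names below are illustrative assumptions, not identifiers from the patent.

```python
def select_warning_equipment(image_sets, current_threshold):
    # Hedged sketch of S5: flag every power transformation device whose
    # trend change difference exceeds the current trend threshold, so the
    # flagged list can be sent on to the fire disaster processing end.
    return [s["device_id"] for s in image_sets
            if s["trend_change_diff"] > current_threshold]
```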
Referring to fig. 2, which is a schematic structural diagram of an electrical fire remote alarm processing apparatus based on the internet of things according to an embodiment of the present invention, the electrical fire remote alarm processing apparatus based on the internet of things includes:
the difference module is used for acquiring a first infrared image of the power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same number of pixel points and the different numbers of pixel points.
And the image module is used for obtaining a plurality of acquisition moments according to the current moment and a preset time period if the difference is greater than the preset difference, extracting the area images of the areas of the transformer substations in the infrared image based on the acquisition moments, and sequencing according to time to generate the image sets corresponding to the transformer substations.
And the difference value module is used for processing the pixel values of the area images corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend change difference values corresponding to the image sets based on the pixel value variation trends.
And the threshold module is used for acquiring environment information and current use time corresponding to the power transformation equipment and the average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain the adjusted current trend threshold.
And the early warning module is used for taking the transformer equipment corresponding to the image set as early warning transformer equipment and sending the early warning transformer equipment to a fire disaster processing end if the trend change difference value corresponding to the image set is greater than the current trend threshold value.
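Taken together, the five modules form a linear pipeline. The sketch below chains simple stand-ins for each stage to show the data flow of Fig. 2; every helper body is an illustrative assumption, not the patented computation.

```python
# Illustrative stand-ins for each module's computation (assumptions):
def image_difference(a, b):
    # difference module: share of pixel positions whose values differ
    same = sum(1 for x, y in zip(a, b) if x == y)
    return 1 - same / len(a)

def trend_change(image_set):
    # difference value module: crude trend change over the stored trends
    trends = image_set["pixel_trends"]
    return max(trends) - min(trends)

def adjust_threshold(preset, factor=1.0):
    # threshold module: placeholder real-time adjustment
    return preset * factor

def pipeline(first_img, second_img, image_sets, diff_threshold,
             trend_threshold):
    # difference -> image -> difference value -> threshold -> early warning
    if image_difference(first_img, second_img) <= diff_threshold:
        return []                      # no abnormality between the two images
    current = adjust_threshold(trend_threshold)
    return [s["device_id"] for s in image_sets
            if trend_change(s) > current]
```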
The apparatus in the embodiment shown in fig. 2 can be correspondingly used to perform the steps in the method embodiment shown in fig. 1, and the implementation principle and technical effect are similar, which are not described herein again.
Referring to fig. 3, which is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention, the electronic device 30 includes: a processor 31, a memory 32 and a computer program; wherein
A memory 32 for storing the computer program, which may also be a flash memory (flash). The computer program is, for example, an application program, a functional module, or the like that implements the above-described method.
A processor 31 for executing the computer program stored in the memory to implement the steps performed by the apparatus in the above method. Reference may be made in particular to the description relating to the preceding method embodiment.
Alternatively, the memory 32 may be separate or integrated with the processor 31.
When the memory 32 is a device independent of the processor 31, the apparatus may further include:
a bus 33 for connecting the memory 32 and the processor 31.
The present invention also provides a readable storage medium, in which a computer program is stored, which, when being executed by a processor, is adapted to implement the methods provided by the various embodiments described above.
The readable storage medium may be a computer storage medium or a communication medium. Communication media includes any medium that facilitates transfer of a computer program from one place to another. Computer storage media may be any available media that can be accessed by a general purpose or special purpose computer. For example, a readable storage medium is coupled to a processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC). Additionally, the ASIC may reside in user equipment. Of course, the processor and the readable storage medium may also reside as discrete components in a communication device. The readable storage medium may be a read-only memory (ROM), a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The present invention also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the device may read the execution instructions from the readable storage medium, and the execution of the execution instructions by the at least one processor causes the device to implement the methods provided by the various embodiments described above.
In the above embodiments of the apparatus, it should be understood that the processor may be a central processing unit (CPU), another general purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of hardware and software modules within the processor.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. An electric fire remote alarm processing method based on the Internet of things is characterized by comprising the following steps:
acquiring a first infrared image of a power transformation room at a first time point and a second infrared image of the power transformation room at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same number of pixel points and the different numbers of pixel points;
if the image difference is greater than the preset difference, obtaining a plurality of acquisition moments according to the current moment and a preset time period, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate image sets corresponding to the power transformation equipment;
processing pixel values of the images of the regions corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend variation difference values corresponding to the image sets based on the pixel value variation trends;
acquiring environment information and current use time corresponding to power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and if the trend change difference value corresponding to the image set is larger than the current trend threshold value, the transformer equipment corresponding to the image set is used as early warning transformer equipment, and the early warning transformer equipment is sent to a fire disaster processing end.
2. The method of claim 1,
the method comprises the steps of obtaining a first infrared image of a power transformation room at a first time point and a second infrared image at a second time point, obtaining the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and obtaining image difference degrees according to the same number of pixel points and the different numbers of pixel points, and comprises the following steps:
performing coordinate processing on the first infrared image and the second infrared image to obtain a coordinate value of each pixel point;
acquiring coordinates of pixel points in the first infrared image and the second infrared image to generate a first coordinate set and a second coordinate set;
calculating the difference value between the pixel value of the pixel point positioned in the central coordinate in the first coordinate set and the pixel value of the pixel point positioned outside the preset range in the second coordinate set;
and obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the difference values corresponding to the pixel points in the first coordinate set and the second coordinate set, and obtaining the image difference according to the same number of the pixel points and the different numbers of the pixel points.
3. The method of claim 2,
obtaining the same number of pixel points and different numbers of pixel points in the first coordinate set and the second coordinate set based on the ratio of the corresponding difference values of the pixel points in the first coordinate set and the second coordinate set, and obtaining the image difference according to the same number of the pixel points and the different numbers of the pixel points, wherein the method comprises the following steps:
collecting pixel values of a pixel point positioned at the central coordinate and pixel points positioned at coordinates outside a preset range in the first coordinate set, calculating a difference value of the pixel values to obtain a first difference value, and calculating a difference value of the pixel values of a pixel point positioned at the central coordinate and pixel values of pixel points positioned at coordinates outside the preset range in the second coordinate set to obtain a second difference value;
acquiring a first difference value and a second difference value corresponding to pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set, and obtaining a difference value ratio corresponding to the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set based on the first difference value and the second difference value;
if the difference ratio is smaller than a preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as the same pixel points, and if the difference ratio is larger than or equal to the preset difference ratio, taking the pixel points positioned at the same coordinate in the first coordinate set and the second coordinate set as different pixel points;
and counting the number of the same pixel points to obtain the same number of the pixel points, counting the number of the different pixel points to obtain different numbers of the pixel points, obtaining the total number of the pixel points according to the sum of the same number of the pixel points and the different numbers of the pixel points, and obtaining the image difference degree based on the ratio of the same number of the pixel points to the total number of the pixel points.
4. The method of claim 3, wherein the preset range is constructed by:
generating a preset range by taking a pixel point positioned at the central coordinate in the first coordinate set and the second coordinate set as a circle center and a preset radius as a radius;
acquiring the pixel quantity of all pixel points in the first coordinate set and the second coordinate set, and adjusting the preset range based on the pixel quantity to obtain an adjustment range;
the number of pixel points of the adjustment radius corresponding to the adjustment range is calculated by the following formula,
wherein the symbols denote, respectively: the number of pixels of the adjustment radius corresponding to the adjustment range, the number of pixels of all the pixel points, the preset number of pixels, the weight value of the adjustment range, and the number of pixels corresponding to the preset radius.
5. The method of claim 1,
obtaining a plurality of acquisition moments according to the current moment and a preset time period, extracting area images of areas where all the power transformation equipment are located in the infrared image based on the acquisition moments, and sequencing according to time to generate image sets corresponding to all the power transformation equipment, wherein the image sets comprise:
determining a first acquisition time and a second acquisition time according to the current time and a preset acquisition duration, and extracting area images of areas of the transformer equipment substations in the infrared images based on the first acquisition time and the second acquisition time;
calculating based on the preset acquisition duration and the pixel values of the two region images to obtain a pixel value variation trend;
obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting preset acquisition duration based on the duration offset coefficient to obtain current acquisition duration;
determining the next acquisition time according to the second acquisition time and the current acquisition time, deleting the first acquisition time, updating the second acquisition time to the first acquisition time, and determining the next acquisition time to be the second acquisition time;
and repeatedly executing the steps, and stopping executing after the preset requirements are met.
6. The method of claim 5,
obtaining a duration offset coefficient according to the pixel value variation trend and a preset pixel value variation trend, and adjusting the preset acquisition duration based on the duration offset coefficient to obtain the current acquisition duration, wherein the duration offset coefficient comprises the following steps:
adjusting the preset acquisition time length according to the time length offset coefficient to obtain an adjusted acquisition time length after adjustment, and obtaining the current acquisition time length based on the difference value between the preset acquisition time length and the adjusted acquisition time length;
the current acquisition duration is calculated by the following formula,
wherein the symbols denote, respectively: the current acquisition duration, the preset acquisition duration, the pixel value variation trend, the preset pixel value variation trend, the duration offset coefficient, and the weight value for adjusting the acquisition duration.
7. The method of claim 6,
processing pixel values of the images in the regions corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend variation difference values corresponding to the image sets based on the pixel value variation trends, wherein the method comprises the following steps:
acquiring target pixel values of target pixel points in each regional image, and acquiring pixel values corresponding to each regional image according to the average value of all the target pixel values;
acquiring acquisition time length between adjacent region images, and calculating pixel values of a previous region image and a next region image between the adjacent region images in the image set based on the acquisition time length to obtain a plurality of pixel value variation trends;
determining a former pixel value variation trend and a latter pixel value variation trend in the adjacent pixel value variation trends, and obtaining a plurality of trend variation difference values according to the difference value of the latter pixel value variation trend and the former pixel value variation trend;
the trend change difference value is calculated by the following formula,
wherein the symbols denote, respectively: the trend change difference value; the indexed pixel value variation trends appearing in the formula; the target pixel value of a target pixel point in the previous area image; the total number of target pixel points in the previous area image; the index of the target pixel point in the previous area image; the pixel average value corresponding to the previous area image; the target pixel value of a target pixel point in the subsequent area image; the total number of target pixel points in the subsequent area image; the index of the target pixel point in the subsequent area image; the pixel average value corresponding to the subsequent area image; the acquisition duration between the adjacent area images; and the trend calculation weight value.
8. The method of claim 7,
acquiring environmental information and current use time corresponding to power transformation equipment, and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold value in real time according to the environmental information, the use time and the average value of the total number of the target pixels to obtain an adjusted current trend threshold value, wherein the method comprises the following steps:
acquiring an environment temperature value corresponding to the environment information according to a temperature acquisition device, and calculating according to the environment temperature value and a preset temperature value to obtain a temperature offset coefficient;
acquiring a preset service life corresponding to each power transformation device, and calculating according to the preset service life and the current service life to generate a service life deviation value;
collecting the total number of target pixel points of all regional images in an image set, and calculating based on the average value of the total number of the target pixel points and the number of preset pixel points to obtain a number deviation value;
adjusting a preset trend threshold value according to the temperature offset coefficient, the service life offset value and the quantity offset value to obtain an adjusted current trend threshold value;
the current trend threshold is calculated by the following formula,
wherein the symbols denote, respectively: the current trend threshold, the environmental temperature value, the preset temperature value, the preset service duration, the current service duration, the average value of the total number of target pixel points, the preset number of pixel points, the preset trend threshold, and the weight value of the current trend threshold.
9. The method of claim 8, further comprising:
receiving correction information input by a worker through an input device, the correction information comprising false-alarm correction information that an alarm was raised although no fire occurred, and missed-alarm correction information that a fire occurred but no alarm was raised;
if the correction information is the false-alarm correction information, calculating an increase value from the current trend threshold and a preset up-adjustment correction value, and increasing the weight value of the current trend threshold based on the increase value to obtain a corrected weight value;
if the correction information is the missed-alarm correction information, calculating a decrease value from the current trend threshold and a preset down-adjustment correction value, and decreasing the weight value of the current trend threshold based on the decrease value to obtain a corrected weight value;
the correction weight value is calculated by the following formula,
wherein the symbols denote, respectively: the corrected weight value, the preset up-adjustment correction value, the up-adjustment correction weight, the down-adjustment correction weight, and the preset down-adjustment correction value.
10. An electrical fire remote alarm processing device based on the Internet of Things, characterized by comprising:
the difference module is used for acquiring a first infrared image of the power transformation room at a first time point and a second infrared image at a second time point, acquiring the same number of pixel points and different numbers of pixel points according to pixel values of pixel points at the same position in the first infrared image and the second infrared image, and acquiring image difference degrees according to the same number of pixel points and the different numbers of pixel points;
the image module is used for obtaining a plurality of acquisition moments according to the current moment and a preset time period if the difference degree is greater than the preset difference degree, extracting area images of areas where the power transformation equipment is located in the infrared image based on the acquisition moments, and sequencing according to time to generate an image set corresponding to the power transformation equipment;
the difference value module is used for processing pixel values of the area images corresponding to the adjacent time points in the image sets to obtain a plurality of pixel value variation trends, and obtaining trend change difference values corresponding to the image sets based on the pixel value variation trends;
the threshold module is used for acquiring environment information and current use duration corresponding to the power transformation equipment and an average value of the total number of target pixels in the image set, and adjusting a preset trend threshold in real time according to the environment information, the use duration and the average value of the total number of the target pixels to obtain an adjusted current trend threshold;
and the early warning module is used for taking the transformer equipment corresponding to the image set as early warning transformer equipment and sending the early warning transformer equipment to a fire disaster processing end if the trend change difference value corresponding to the image set is greater than the current trend threshold value.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211237719.8A CN115311811B (en) | 2022-10-11 | 2022-10-11 | Electrical fire remote alarm processing method and device based on Internet of things |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115311811A CN115311811A (en) | 2022-11-08 |
CN115311811B true CN115311811B (en) | 2022-12-06 |
Family
ID=83868195
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211237719.8A Active CN115311811B (en) | 2022-10-11 | 2022-10-11 | Electrical fire remote alarm processing method and device based on Internet of things |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115311811B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116189367B (en) * | 2022-12-09 | 2023-09-26 | 嘉应学院 | Building fire alarm system based on Internet of things |
CN115841489B (en) * | 2023-02-21 | 2023-04-28 | 华至云链科技(苏州)有限公司 | Intelligent spot inspection method and platform |
CN116758079B (en) * | 2023-08-18 | 2023-12-05 | 杭州浩联智能科技有限公司 | Harm early warning method based on spark pixels |
CN116912780B (en) * | 2023-09-12 | 2023-11-24 | 国网浙江省电力有限公司杭州供电公司 | Charging monitoring protection method and system based on mode dynamic switching |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001067566A (en) * | 1999-08-30 | 2001-03-16 | Fujitsu Ltd | Fire detecting device |
CN101840571A (en) * | 2010-03-30 | 2010-09-22 | 杭州电子科技大学 | Flame detection method based on video image |
CN102208018A (en) * | 2011-06-01 | 2011-10-05 | 西安工程大学 | Method for recognizing fire disaster of power transmission line based on video variance analysis |
CN102236947A (en) * | 2010-04-29 | 2011-11-09 | 中国建筑科学研究院 | Flame monitoring method and system based on video camera |
CN108537202A (en) * | 2018-04-19 | 2018-09-14 | 广州林邦信息科技有限公司 | Forest fire identification device and system |
CN109643482A (en) * | 2016-06-28 | 2019-04-16 | 烟雾检测器有限责任公司 | Use the smoke detection system and method for camera |
CN110634261A (en) * | 2019-08-27 | 2019-12-31 | 国网山东省电力公司泗水县供电公司 | Fire early warning system and method for underground power distribution network |
CN113551775A (en) * | 2021-06-23 | 2021-10-26 | 国网福建省电力有限公司 | Equipment fault on-line monitoring and alarming method and system based on infrared thermal imaging |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6696958B2 (en) * | 2002-01-14 | 2004-02-24 | Rosemount Aerospace Inc. | Method of detecting a fire by IR image processing |
US20080136934A1 (en) * | 2006-12-12 | 2008-06-12 | Industrial Technology Research Institute | Flame Detecting Method And Device |
Similar Documents
Publication | Title |
---|---|
CN115311811B (en) | Electrical fire remote alarm processing method and device based on Internet of things
CN111426393B (en) | Temperature correction method, device and system
WO2018149322A1 (en) | Image identification method, device, apparatus, and data storage medium
CN114666473A (en) | Video monitoring method, system, terminal and storage medium for farmland protection
CN116298737B (en) | Switch cabinet discharge monitoring system, method and equipment
CN113970382B (en) | Temperature detection method, device, medium and electronic equipment
CN107782954A (en) | Transformer overvoltage early warning method based on large amounts of overvoltage data
JP2007310464A (en) | Monitoring camera system and control method for the same
CN115311624B (en) | Slope displacement monitoring method and device, electronic equipment and storage medium
CN112163448A (en) | Forehead temperature detection method and system based on risk grade classification and storage medium
CN114204680B (en) | Multi-type automatic detection equipment fusion remote diagnosis system and method
CN116798205A (en) | Big data-based dynamic processing method and system for electrical fire monitoring threshold
CN112559947B (en) | Real-time power flow data detection method based on equipment and related device
CN114550256A (en) | Method, device and equipment for detecting tiny target and computer readable medium
CN111813983A (en) | Abnormal body temperature early warning method, device and system
CN204740690U (en) | Infrared early warning circuit for preventing external damage
CN112037176A (en) | Human presence detection device
TW202036371A (en) | Field analysis apparatus and method thereof
CN113820018A (en) | Temperature measurement method, device, system and medium based on infrared imaging
CN115841489A (en) | Intelligent point inspection method and platform
CN114061512B (en) | Method, system and equipment for detecting center point deviation of high-voltage transmission tower
KR102470097B1 (en) | Real-time risk measurement system for respiratory infections that can automatically distribute computational load
CN117579766A (en) | Video frame extraction method and related equipment, abnormal scene early warning method and related equipment
CN114739536A (en) | Health monitoring system and terminal equipment
CN117765701A (en) | Information detection method and device, electronic equipment and storage medium
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||