CN115512506B - End cloud linkage firefighting map detection method and system based on two buffer pools - Google Patents

End cloud linkage firefighting map detection method and system based on two buffer pools

Info

Publication number
CN115512506B
Authority
CN
China
Prior art keywords
time
buffer pool
image
real
long
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211224520.1A
Other languages
Chinese (zh)
Other versions
CN115512506A (en)
Inventor
聂晖
罗朝会
陈黎
杨小波
李军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Eastwit Technology Co ltd
Wuhan Qingniao Zhi'an Technology Co ltd
Jade Bird Fire Co Ltd
Original Assignee
Wuhan Eastwit Technology Co ltd
Wuhan Qingniao Zhi'an Technology Co ltd
Jade Bird Fire Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Eastwit Technology Co ltd, Wuhan Qingniao Zhi'an Technology Co ltd, Jade Bird Fire Co Ltd
Priority to CN202211224520.1A
Publication of CN115512506A
Application granted
Publication of CN115512506B
Legal status: Active

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00 - Fire alarms; Alarms responsive to explosion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 - Proximity, similarity or dissimilarity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/40 - Scenes; Scene-specific elements in video content
    • G06V20/48 - Matching video sequences

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Alarm Systems (AREA)

Abstract

The invention provides an end cloud linkage firefighting map detection method and system based on two buffer pools. The system comprises a front-end sensing device, video monitoring equipment and a cloud center server. The cloud center server is provided with: a buffer module, comprising a long-time buffer pool and a short-time buffer pool, for storing and updating the acquired long-time and short-time image data respectively; and a map detection module, for calculating the historical image difference between the long-time and short-time buffer pools during the window period of real-time video stream acquisition, and, after the real-time video stream is acquired, calculating the real-time image difference between the long-time buffer pool and the real-time video. When the cloud center server receives an alarm signal from the front-end sensing device, the historical and/or real-time image difference calculations provide auxiliary analysis for manual image inspection. By adopting two buffer pools and the dual analysis of the historical and real-time map detection differences, the invention effectively improves the timeliness and accuracy of fire alarm reporting.

Description

End cloud linkage firefighting map detection method and system based on two buffer pools
Technical Field
The invention relates to the technical field of fire prevention and control automation platforms, in particular to an end cloud linkage firefighting map detection method and system based on two buffer pools.
Background
Fire safety is an important component of urban public safety. It concerns thousands of households and every industry, and bears directly on municipal safety, traffic safety, production and operation safety, sanitation safety, and the safety of people's lives and property. In intelligent fire prevention and control, existing traditional fire control systems generally stop at the informatization level of service management: they mainly collect workplace sensing data, but the collected data are neither timely nor comprehensive enough and are of limited practical use, their analytical value goes unexploited, and the service application mode still depends on manual analysis, offering limited help for fire prevention and firefighting rescue. The development of machine vision provides powerful support for integrating image information processing deeply into fire service work and for building a modern fire service mechanism that meets actual combat requirements.
Modern fire prevention and control automation platforms take electrical fire detectors, combustible gas detectors, fire door controls and the like as the main front-end sensing devices, and video imaging equipment together with encoding, decoding and forwarding equipment as the visual equipment of a fire scene. Returning video pictures to the cloud platform at the first moment of a fire alarm, and performing manual picture inspection and confirmation of the fire source in the cloud, wins precious time during the golden three minutes for extinguishing an initial fire and self-rescue. Fire alarm prevention and control, where every minute and second counts, requires analysis and judgment of the fire and its development trend as early as possible. In real applications, however, video picture transmission suffers short delays or blocking due to data packet loss and distribution switching of video streams caused by limited network bandwidth or communication abnormalities, delaying the confirmation of manual picture inspection. Meanwhile, manual picture-based fire alarm screening requires staff to maintain long periods of concentration, and locating the fire source target also takes a certain amount of search and judgment time.
Disclosure of Invention
In view of the problems in the prior art, the invention provides an end cloud linkage firefighting map detection method and system based on two buffer pools, which effectively fill the blank window in which map detection pictures are missing during transmission and improve the timeliness and accuracy of fire alarm reporting.
The technical scheme of the invention is as follows:
an end cloud linkage firefighting map detection system based on two buffer pools is characterized by comprising a front end sensing device, video monitoring equipment and a cloud center server; the front-end sensing device and the video monitoring equipment are respectively in communication connection with the cloud center server; the cloud center server is provided with:
the buffer module comprises a long-time buffer pool and a short-time buffer pool and is used for respectively storing and updating the long-time image data and the short-time image data of the acquired video monitoring equipment;
the diagram detection module is used for calculating the difference of the historical diagrams of the long-time buffer pool and the short-time buffer pool in a window period of real-time video stream acquisition; after the real-time video stream is acquired, calculating the real-time image difference between the long-time buffer pool and the real-time video;
when the cloud center server receives an alarm signal of the front-end sensing device, auxiliary analysis is provided for artificial image detection through historical image aberration and/or real-time image aberration calculation.
Further, an address matching module is further arranged on the cloud center server and used for setting a geographic coordinate-IP address matching table of the front-end sensing device and the video monitoring equipment and providing a communication address of the video monitoring equipment corresponding to the front-end sensing device.
Further, the cloud center server is respectively in communication connection with the front-end sensing device and the video monitoring equipment through the communication module.
An end cloud linkage firefighting map detection method based on two buffer pools is characterized by comprising the following steps:
s1, setting a geographic coordinate-IP address matching table of a front-end sensing device and video monitoring equipment;
s2, setting two buffer pools, wherein the first buffer pool is a long-time buffer pool, and the second buffer pool is a short-time buffer pool, and respectively storing and updating the long-time image data and the short-time image data of the acquired video monitoring equipment;
s3, when the cloud center server receives an alarm signal of the front-end sensing device, inquiring a communication address of the corresponding video monitoring equipment through a matching table, and remotely acquiring a video stream;
s4, performing historical image difference calculation through image data of a long-time buffer pool and a short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is acquired, performing real-time image difference calculation through the long-time buffer pool and image data of the real-time video stream;
s5, using historical image aberration and/or real-time image aberration calculation results to provide auxiliary analysis for artificial image detection.
The step S1 specifically includes:
s11, front end sensing device x i The front-end sensing device x is stored in a gridding mode for prevention i Is defined by the geographic coordinates of (a);
s12, video monitoring equipment y j 1-N front-end sensing devices are covered in the visual field of the video monitoring equipment y j Is set to the IP address of (a);
s13, front end sensing device x i With video monitoring device y j The i:j matching table of the geographic coordinates-IP addresses is an N:1 linear table, namely N front-end sensing devices correspond to video monitoring equipment.
The step S2 specifically comprises the following steps:
s21, setting a long-time buffer pool data updating time t1, and receiving a remote image of the video monitoring equipment at intervals of t1 to update long-time buffer pool image data;
s22, setting short-time buffer pool data updating time t2, and receiving remote images of video monitoring equipment at intervals of t2, and updating short-time buffer pool image data; wherein t2< t1.
Further, in step S21, the long-time buffer pool image data are updated by a Gaussian background modeling method, specifically:
for the background image B, let the mapping p(x) of each pixel point (x, y) to the Gaussian background model satisfy

$$p(x)=\frac{1}{\sqrt{2\pi}\,d}\exp\left(-\frac{(x-\mu)^{2}}{2d^{2}}\right)$$

where x is the pixel value of a given pixel point, and μ and d are the mean and standard deviation of the Gaussian distribution; the mean μ and deviation d of each point in the long-time buffer pool image sequence are computed as the background model. For an arbitrary image G containing a foreground, each point (x, y) on the image is considered a background point if

$$\exp\left(-\frac{\left(G(x,y)-B(x,y)\right)^{2}}{2d^{2}}\right)>T$$

and a foreground point otherwise; where T is a constant threshold, G(x, y) is the pixel value of a point on the foreground, namely the current image, and B(x, y) is the pixel value of the corresponding point on the background, namely the background image of the Gaussian background modeling;
the long-time buffer pool updates each frame of image as follows:

$$B_{t}(x,y)=p\cdot B_{t-1}(x,y)+(1-p)\cdot G_{t}(x,y)$$

where p is an update coefficient, a constant that reflects the background update rate; the larger p is, the slower the background updates.
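The Gaussian background test and the weighted background update above can be sketched in NumPy as follows; the exponential form of the test, the toy 4x4 arrays, and the small epsilon guarding against a zero deviation are assumptions, while T = 0.75 and p = 0.2 follow the values given in the embodiment.

```python
import numpy as np

# Sketch of the long-time pool's Gaussian background test and weighted update.
# The exponential threshold form is a reconstruction of the patent's test;
# T = 0.75 and p = 0.2 follow the embodiment. Arrays are toy data.

def foreground_mask(G, B, d, T=0.75):
    """True where a pixel fails the Gaussian background test (foreground)."""
    likeness = np.exp(-((G - B) ** 2) / (2.0 * d ** 2 + 1e-12))
    return likeness <= T

def update_background(B_prev, G_t, p=0.2):
    """B_t = p * B_{t-1} + (1 - p) * G_t; larger p means slower updates."""
    return p * B_prev + (1.0 - p) * G_t

B = np.full((4, 4), 0.2)   # background model: per-pixel mean of the pool
d = np.full((4, 4), 0.05)  # per-pixel deviation from the image sequence
G = B.copy()
G[1:3, 1:3] = 0.9          # a bright fire-like patch appears in the scene

mask = foreground_mask(G, B, d)
print(int(mask.sum()))     # 4  (only the changed patch is foreground)
B_new = update_background(B, G)
```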
In step S22, the short-time buffer pool image data are updated by substitution, specifically:
after the short-time buffer pool acquires an image, the newly acquired image pixels M_t(x, y) completely overwrite the previous pixels M_{t-1}(x, y):

$$M_{t-1}(x,y)\leftarrow M_{t}(x,y)$$
In step S4, the historical image difference and the real-time image difference are calculated by a structural similarity difference method, with the following steps:
input the image data of the long-time and short-time buffer pools, or of the long-time buffer pool and the real-time video, and denote each pair of pictures x and y;
compute the average gray levels μ_x and μ_y as the brightness measure;
compute the gray-level standard deviations σ_x and σ_y as the contrast measure;
from the computed μ_x, μ_y, σ_x and σ_y, calculate the structural difference score D(x, y) between the pictures:

$$D(x,y)=\frac{(2\mu_{x}\mu_{y}+C_{1})(2\sigma_{x}\sigma_{y}+C_{2})}{(\mu_{x}^{2}+\mu_{y}^{2}+C_{1})(\sigma_{x}^{2}+\sigma_{y}^{2}+C_{2})}$$

where C_1 and C_2 are constants that avoid the instability caused by a denominator close to 0.
The larger D is, the smaller the difference between the pictures; the smaller D is, the larger the difference.
In step S5, providing auxiliary analysis for manual image inspection means: after the front-end sensing device alarms, and before the cloud center server has acquired the on-site real-time video stream, if the historical image difference D1 exceeds a certain threshold, the alarm is most probably not a false alarm of the front-end sensing device; staff can obtain the probable place and time of the fire from the IP address and the video time, issue a fire early warning, and prepare firefighting operations in advance. Once the on-site real-time video stream reaches the cloud center server, if the real-time image difference D2 exceeds a certain threshold, it is judged with high probability that a fire has broken out at that location; staff can determine the place and time of the fire from the IP address and the video time and issue a fire alarm.
The invention has the technical effects that:
according to the end cloud linkage firefighting map detection method and system based on the two buffer pools, the short-time buffer pool and the long-time buffer pool are arranged, and the acquired long-time image data and short-time image data are respectively stored and updated; when waiting for real-time picture transmission, the information of the short-time buffer pool and the long-time buffer pool is utilized to calculate the history map detection difference, so that a blank window missing in picture transmission is effectively made up, and a potential fire point is found in time; after receiving the real-time picture, calculating real-time picture detection difference by utilizing the information of the real-time video and the long-time buffer pool; according to the dual analysis of the detecting difference of the historical diagram and the detecting difference of the real-time diagram, powerful technical support is provided for the final determination of the alarm result for the staff, the timeliness and the accuracy of the fire alarm report are effectively improved, and the method is worthy of popularization and application.
Drawings
FIG. 1 is a schematic diagram of an end-cloud linked fire detection system based on two buffer pools.
Fig. 2 is a flow chart of an embodiment of a method for detecting an end cloud linked fire map based on two buffer pools.
Detailed Description
For a further understanding of the present invention, preferred embodiments are described below with reference to the drawings; it is to be understood that the description is intended only to further illustrate the features and advantages of the invention, not to limit the scope of the claims.
FIG. 1 is a schematic diagram of an end-cloud linked fire detection system based on two buffer pools.
An end cloud linkage firefighting map detection system based on two buffer pools comprises a front-end sensing device, video monitoring equipment and a cloud center server; the front-end sensing device and the video monitoring equipment are each in communication connection with the cloud center server, in this embodiment through the communication module. The cloud center server is provided with a buffer module, a map detection module and an address matching module. The buffer module comprises a long-time buffer pool and a short-time buffer pool, used for storing and updating the acquired long-time and short-time image data of the video monitoring equipment respectively. The map detection module calculates the historical image difference between the long-time and short-time buffer pools during the window period of real-time video stream acquisition, and, after the real-time video stream is acquired, calculates the real-time image difference between the long-time buffer pool and the real-time video. When the cloud center server receives an alarm signal from the front-end sensing device, the historical and/or real-time image difference calculations provide auxiliary analysis for manual image inspection. The address matching module holds the geographic coordinate-IP address matching table of the front-end sensing devices and video monitoring equipment, and provides the communication address of the video monitoring equipment corresponding to a given front-end sensing device.
Fig. 2 is a flow chart of an embodiment of an end cloud linked fire detection method based on two buffer pools.
An end cloud linkage firefighting map detection method based on two buffer pools comprises the following steps:
s1, setting a geographic coordinate-IP address matching table of a front-end sensing device and video monitoring equipment;
s2, setting two buffer pools, wherein the first buffer pool is a long-time buffer pool, and the second buffer pool is a short-time buffer pool, and respectively storing and updating the long-time image data and the short-time image data of the acquired video monitoring equipment;
s3, when the cloud center server receives an alarm signal of the front-end sensing device, inquiring a communication address of the corresponding video monitoring equipment through a matching table, and remotely acquiring a video stream;
s4, performing historical image difference calculation through image data of a long-time buffer pool and a short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is acquired, performing real-time image difference calculation through the long-time buffer pool and image data of the real-time video stream;
s5, using historical image aberration and/or real-time image aberration calculation results to provide auxiliary analysis for artificial image detection.
The step S1 specifically includes:
s11, front end sensing device x i The front-end sensing device x is stored in a gridding mode for prevention i Is defined by the geographic coordinates of (a);
s12, video monitoring equipment y j 1-N front-end sensing devices are covered in the visual field of the video monitoring equipment y j Is set to the IP address of (a);
s13, front end sensing device x i With video monitoring device y j The i:j matching table of the geographic coordinates-IP addresses is an N:1 linear table, namely N front-end sensing devices correspond to video monitoring equipment. In this embodiment, every 3-5 front-end aware devices corresponds to a video monitoring device.
The step S2 specifically comprises the following steps:
s21, setting a long-time buffer pool data updating time t1, and at intervals of t1, (in a state of 'no alarm' or 'alarm eliminated'), receiving a remote image of video monitoring equipment, and updating long-time buffer pool image data; in this embodiment, t1 is set to 12h, that is, every 12 hours, long-term buffer pool image data is updated;
s22, setting short-time buffer pool data updating time t2, and at intervals of t2, (in a state of 'no alarm' or 'alarm eliminated'), receiving a remote image of video monitoring equipment, and updating short-time buffer pool image data; wherein t2< t1; in this embodiment, t2 is set to 15s, that is, buffer pool image data is updated every 15 seconds when the time is short;
furthermore, the image data updating mode of the long-time buffer pool is a Gaussian background modeling method, which specifically comprises the following steps:
let the mapping p (x) of each pixel point (x, y) to the gaussian background model for the background image B satisfy
Figure GDA0004235719140000061
Wherein: x is the pixel value of a certain pixel point, and mu and d are the mean value and variance of Gaussian distribution; calculating the mean mu and the variance d of each point in the long-time buffer pool image sequence as a background model; for an arbitrary image G containing a foreground, for each point (x, y) on the image, if:
Figure GDA0004235719140000062
then the point is considered to be a background point, otherwise the point is considered to be a foreground point; wherein, G (x, y) represents the pixel value of a certain point on the foreground, namely the current image, and B (x, y) represents the pixel value of a corresponding point on the background, namely the background image of Gaussian background modeling; t is a constant threshold, typically 0.25 or 0.75, and in this embodiment, T is set to 0.75 for optimal results.
The long-time buffer pool updates each frame of image as follows:

$$B_{t}(x,y)=p\cdot B_{t-1}(x,y)+(1-p)\cdot G_{t}(x,y)$$

where p is an update coefficient, a constant that reflects the background update rate; the larger p is, the slower the background updates. In this embodiment, p is set to 0.2.
The short-time buffer pool image data are updated by substitution, specifically: after the short-time buffer pool acquires an image, the newly acquired image pixels M_t(x, y) completely overwrite the previous pixels M_{t-1}(x, y):

$$M_{t-1}(x,y)\leftarrow M_{t}(x,y)$$
In step S3, when the cloud center server receives the alarm signal of front-end sensing device x_i, the IP address of the corresponding video monitoring equipment y_j is queried through the matching table to issue a real-time video stream remote acquisition request.
The video stream remote acquisition request operates in two modes: a static image mode, which acquires picture data at intervals, and a video stream mode, which continuously acquires the real-time stream signal.
In step S4, the historical image difference and the real-time image difference are calculated by a structural similarity difference method, with the following steps:
input the image data of the long-time and short-time buffer pools, or of the long-time buffer pool and the real-time video, and denote each pair of pictures x and y;
compute the average gray levels μ_x and μ_y as the brightness measure;
compute the gray-level standard deviations σ_x and σ_y as the contrast measure;
from the computed μ_x, μ_y, σ_x and σ_y, calculate the structural difference score D(x, y) between the pictures:

$$D(x,y)=\frac{(2\mu_{x}\mu_{y}+C_{1})(2\sigma_{x}\sigma_{y}+C_{2})}{(\mu_{x}^{2}+\mu_{y}^{2}+C_{1})(\sigma_{x}^{2}+\sigma_{y}^{2}+C_{2})}$$

where C_1 and C_2 are constants that avoid the instability caused by a denominator close to 0; in this embodiment, C_1 is set to 0.0001 and C_2 to 0.0009, preventing the denominator from approaching 0 while keeping the error as small as possible.
The larger the calculated D is, the smaller the difference between the pictures; the smaller D is, the larger the difference.
In this embodiment, first, during the window period of the video stream remote acquisition request, the image data of the long-time and short-time buffer pools are compared to obtain D1, the historical map detection difference; then, after the acquisition request succeeds, the long-time buffer pool and the real-time video are compared to obtain D2, the real-time map detection difference.
In step S5, providing auxiliary analysis for manual image inspection means: after the front-end sensing device alarms, and before the cloud center server has acquired the on-site real-time video stream, if the historical image difference D1 exceeds a certain threshold (in this embodiment, 0.75), the alarm is most probably not a false alarm of the front-end sensing device; staff can query the IP address and the video time to obtain the probable place and time of the fire, issue a fire early warning, and prepare firefighting operations in advance. Once the on-site real-time video stream reaches the cloud center server, if the real-time image difference D2 exceeds a certain threshold (in this embodiment, 0.5), it is judged with high probability that a fire has broken out at that location; staff can determine the place and time of the fire from the IP address and the video time and issue a fire alarm.
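The step-S5 assist logic with the embodiment's thresholds can be sketched as below. Since the text defines a larger D as a smaller difference, this sketch assumes the "image aberration" compared against the 0.75 and 0.5 thresholds is 1 - D; that reconciliation, and the advisory label names, are assumptions.

```python
# Sketch of the step-S5 advisory logic with the embodiment's thresholds.
# D is a similarity score (larger D = smaller difference), so the aberration
# compared against the thresholds is taken here as 1 - D; this reconciliation
# and the label strings are assumptions.

def assess_alarm(D1_similarity, D2_similarity=None,
                 pre_stream_threshold=0.75, live_threshold=0.5):
    """Return an advisory label for the human operator."""
    aberration_1 = 1.0 - D1_similarity       # history-map aberration D1
    if D2_similarity is None:
        # Live stream not yet available: only the buffer-pool comparison exists.
        if aberration_1 > pre_stream_threshold:
            return "early-warning"           # likely not a sensor false alarm
        return "watch"
    aberration_2 = 1.0 - D2_similarity       # real-time-map aberration D2
    if aberration_2 > live_threshold:
        return "fire-alarm"                  # high probability of fire on site
    return "likely-false-alarm"

print(assess_alarm(0.1))       # early-warning: large pre-stream scene change
print(assess_alarm(0.1, 0.2))  # fire-alarm: live frame differs strongly
```

The two-stage structure mirrors the text: the history-map comparison fills the transmission window, and the real-time comparison confirms or clears the alarm once the stream arrives.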
It should be noted that the above-described embodiments are intended to give those skilled in the art a fuller understanding of the invention, not to limit it in any way. All technical solutions and modifications thereof that do not depart from the spirit and scope of the invention fall within its protection scope.

Claims (10)

1. An end cloud linkage firefighting map detection system based on two buffer pools is characterized by comprising a front end sensing device, video monitoring equipment and a cloud center server; the front-end sensing device and the video monitoring equipment are respectively in communication connection with the cloud center server; the cloud center server is provided with:
the buffer module comprises a long-time buffer pool and a short-time buffer pool and is used for respectively storing and updating the long-time image data and the short-time image data of the acquired video monitoring equipment;
the diagram detection module is used for calculating the difference of the historical diagrams of the long-time buffer pool and the short-time buffer pool in a window period of real-time video stream acquisition; after the real-time video stream is acquired, calculating the real-time image difference between the long-time buffer pool and the real-time video;
when the cloud center server receives an alarm signal of the front-end sensing device, auxiliary analysis is provided for artificial image detection through historical image aberration and/or real-time image aberration calculation.
2. The end cloud linked fire map detection system based on the two buffer pools according to claim 1, wherein the cloud center server is further provided with an address matching module, and the address matching module is used for setting a geographic coordinate-Internet Protocol (IP) address matching table of the front-end sensing device and the video monitoring device, and providing a communication address of the video monitoring device corresponding to the front-end sensing device.
3. The end cloud linkage firefighting map detection system based on two buffer pools according to claim 1, wherein the cloud center server is in communication connection with the front-end sensing device and the video monitoring equipment respectively through the communication module.
4. An end cloud linkage firefighting map detection method based on two buffer pools is characterized by comprising the following steps:
s1, setting a geographic coordinate-IP address matching table of a front-end sensing device and video monitoring equipment;
s2, setting two buffer pools, wherein the first buffer pool is a long-time buffer pool, and the second buffer pool is a short-time buffer pool, and respectively storing and updating the long-time image data and the short-time image data of the acquired video monitoring equipment;
s3, when the cloud center server receives an alarm signal of the front-end sensing device, inquiring a communication address of the corresponding video monitoring equipment through a matching table, and remotely acquiring a video stream;
s4, performing historical image difference calculation through image data of a long-time buffer pool and a short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is acquired, performing real-time image difference calculation through the long-time buffer pool and image data of the real-time video stream;
s5, using historical image aberration and/or real-time image aberration calculation results to provide auxiliary analysis for artificial image detection.
5. The method for detecting an end cloud linked fire map based on two buffer pools according to claim 4, wherein the step S1 specifically includes:
s11, front end sensing device x i The front-end sensing device x is stored in a gridding mode for prevention i Is defined by the geographic coordinates of (a);
s12, video monitoring equipment y j 1-N front-end sensing devices are covered in the visual field of the video monitoring equipment y j Is set to the IP address of (a);
s13, front end sensing device x i With video monitoring device y j The i:j matching table of the geographic coordinates-IP addresses is an N:1 linear table, namely N front-end sensing devices correspond to video monitoring equipment.
6. The method for detecting the end cloud linked fire map based on the two buffer pools according to claim 4, wherein the step S2 specifically includes:
s21, setting a long-time buffer pool data updating time t1, and receiving a remote image of the video monitoring equipment at intervals of t1 to update long-time buffer pool image data;
s22, setting short-time buffer pool data updating time t2, and receiving remote images of video monitoring equipment at intervals of t2, and updating short-time buffer pool image data; wherein t2< t1.
7. The method for detecting the end cloud linked fire map based on the two buffer pools according to claim 6, wherein in the step S21, the long-time buffer pool image data is updated by a Gaussian background modeling method, specifically:
For the background image B, the mapping p(x) of each pixel point (x, y) to the Gaussian background model satisfies

p(x) = (1 / √(2π·d)) · exp(−(x − μ)² / (2d))

wherein x is the pixel value of a pixel point, and μ and d are the mean and variance of the Gaussian distribution; the mean μ and variance d of each point over the long-time buffer pool image sequence are computed as the background model.
For an arbitrary image G containing a foreground, for each point (x, y) on the image, if

|G(x, y) − B(x, y)| < T

then the point is considered a background point; otherwise it is a foreground point. Wherein T is a constant threshold, G(x, y) denotes the pixel value of a point on the foreground, i.e. the current image, and B(x, y) denotes the pixel value of the corresponding point on the background, i.e. the background image of the Gaussian background modeling.
The long-time buffer pool updates each frame of image as follows:

B_t(x, y) = p · B_{t−1}(x, y) + (1 − p) · G_t(x, y)

where the update threshold p is a constant reflecting the background update rate: the larger p is, the slower the background updates.
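The modeling, classification, and running update of claim 7 can be sketched in a few NumPy lines. This is an illustrative sketch: the threshold T, the update rate p, the frame sizes, and the synthetic image data are all assumed values, not taken from the patent.

```python
import numpy as np

# Sketch of the long-time pool's Gaussian background model (claim 7).
# T, p, image sizes and the synthetic data are assumptions for illustration.

rng = np.random.default_rng(0)

# Background model: per-pixel mean mu and variance d over the
# long-time buffer pool's image sequence (50 frames of 4x4 pixels).
history = rng.normal(loc=100.0, scale=2.0, size=(50, 4, 4))
mu = history.mean(axis=0)
d = history.var(axis=0)

# Classify a new frame G: a pixel is background if |G - B| < T,
# foreground otherwise (here the modelled background B is mu).
T = 25.0
G = mu.copy()
G[0, 0] += 80.0                      # inject one bright "fire-like" pixel
foreground = np.abs(G - mu) >= T     # True where the scene changed

# Running update, B_t = p*B_{t-1} + (1-p)*G_t:
p = 0.95                             # larger p -> slower background update
B = p * mu + (1.0 - p) * G

print(foreground.sum())  # -> 1 (only the injected pixel is foreground)
```

Note how the update rule lets the injected pixel leak into the background only slowly (by a factor of 1 − p per frame), so a persistent bright region stays detectable as foreground for many frames.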
8. The method for detecting the end cloud linked fire map based on the two buffer pools according to claim 6, wherein in the step S22, the short-time buffer pool image data is updated by a substitution method, specifically:
after the short-time buffer pool acquires an image, each newly acquired image pixel M_t(x, y) completely overwrites the previous pixel M_{t−1}(x, y):

M_{t−1}(x, y) ← M_t(x, y).
9. The end cloud linked fire map detection method based on two buffer pools according to claim 4, wherein in the step S4, the historical image difference and the real-time image difference are computed by a structural-similarity difference calculation method, the specific steps including:
inputting the image data of the long-time and short-time buffer pools, or the image data of the long-time buffer pool and the real-time video, and denoting each pair of pictures x and y;
using the average gray level as the luminance measure, computing μ_x and μ_y respectively;
using the gray-level standard deviation as the contrast measure, computing σ_x and σ_y respectively;
from the computed μ_x, μ_y, σ_x and σ_y, calculating the structured difference D(x, y) between the pictures:
D(x, y) = 1 − [(2·μ_x·μ_y + C_1)(2·σ_x·σ_y + C_2)] / [(μ_x² + μ_y² + C_1)(σ_x² + σ_y² + C_2)]
wherein C_1 and C_2 are constants that avoid instability when a denominator approaches 0; the larger D is, the larger the difference between the two pictures, and the smaller D is, the smaller the difference.
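The mean/standard-deviation-based difference of claim 9 can be sketched as follows. D is read here as a difference score (larger D = bigger change), matching how D1 and D2 are thresholded in claim 10; the constants C1 and C2 and the synthetic test frames are assumed values.

```python
import numpy as np

# Sketch of the structured difference D(x, y) of claim 9, built from the
# luminance (mean) and contrast (standard deviation) terms named in the
# claim. C1/C2 values and the test frames are assumptions.

def structured_difference(x, y, C1=6.5025, C2=58.5225):
    mu_x, mu_y = x.mean(), y.mean()
    s_x, s_y = x.std(), y.std()
    similarity = ((2 * mu_x * mu_y + C1) * (2 * s_x * s_y + C2)) / (
        (mu_x**2 + mu_y**2 + C1) * (s_x**2 + s_y**2 + C2)
    )
    return 1.0 - similarity          # larger D -> larger difference

rng = np.random.default_rng(1)
a = rng.uniform(0, 255, size=(8, 8))   # reference frame
b = a + 2.0                            # nearly identical frame
c = 0.3 * a                            # darkened, low-contrast frame

print(structured_difference(a, a))     # difference of a frame with itself is ~0
print(structured_difference(a, b) < structured_difference(a, c))  # -> True
```

Because only per-image mean and standard deviation enter the formula, this score is cheap enough to run on every buffer-pool comparison, which suits the window-period use in step S4.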
10. The method for detecting end cloud linked fire maps based on two buffer pools according to claim 9, wherein in the step S5, providing auxiliary analysis for manual image inspection means: after a front-end sensing device raises an alarm, and before the cloud center server has acquired the on-site real-time video stream, if the historical image difference D1 exceeds a certain threshold, the alarm is most likely not a false alarm of the front-end sensing device; staff can derive the probable place and time of the fire from the IP address and the video timestamp, issue a fire early warning, and prepare firefighting operations in advance. Once the on-site real-time video stream reaches the cloud center server, if the real-time image difference D2 exceeds a certain threshold, a fire at that location is judged highly probable; staff can confirm the place and time of the fire from the IP address and the video timestamp and raise a fire alarm.
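The two-stage assisted judgement of claim 10 amounts to a small decision rule. The sketch below is illustrative only: the threshold values, function name, and advisory labels are assumptions; the patent requires only that D1 and D2 each be compared against "a certain threshold".

```python
# Illustrative sketch of claim 10's two-stage assisted judgement.
# Threshold values, names, and labels are assumed, not from the patent.

D1_THRESHOLD = 0.4   # historical-difference threshold (assumed value)
D2_THRESHOLD = 0.4   # real-time-difference threshold (assumed value)

def assess(d1, d2=None):
    """Return an advisory level for staff.

    d1: historical difference (long-time pool vs short-time pool),
        available before the live stream arrives.
    d2: real-time difference (long-time pool vs live stream), or None
        while the stream is still being established.
    """
    if d2 is not None and d2 > D2_THRESHOLD:
        return "fire-alarm"       # live stream confirms a large scene change
    if d1 > D1_THRESHOLD:
        return "early-warning"    # likely not a sensor false alarm
    return "observe"              # no supporting image evidence yet

print(assess(0.7))          # stream not yet arrived -> early-warning
print(assess(0.7, 0.8))     # live stream confirms    -> fire-alarm
print(assess(0.1, 0.05))    # no image evidence       -> observe
```

The point of the staging is latency: the early warning can go out during the window period before the stream is pulled, while the definitive alarm waits for real-time confirmation.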
CN202211224520.1A 2022-10-09 2022-10-09 End cloud linkage firefighting map detection method and system based on two buffer pools Active CN115512506B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211224520.1A CN115512506B (en) 2022-10-09 2022-10-09 End cloud linkage firefighting map detection method and system based on two buffer pools


Publications (2)

Publication Number Publication Date
CN115512506A CN115512506A (en) 2022-12-23
CN115512506B true CN115512506B (en) 2023-06-20

Family

ID=84508910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211224520.1A Active CN115512506B (en) 2022-10-09 2022-10-09 End cloud linkage firefighting map detection method and system based on two buffer pools

Country Status (1)

Country Link
CN (1) CN115512506B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2183878B (en) * 1985-10-11 1989-09-20 Matsushita Electric Works Ltd Abnormality supervising system
CN103593936B (en) * 2013-09-29 2017-04-26 西安祥泰软件设备系统有限责任公司 Fire alarm remote monitoring method and embedded motherboard
JP2017076304A (en) * 2015-10-16 2017-04-20 アズビル株式会社 Fire detection system
CN112562255B (en) * 2020-12-03 2022-06-28 国家电网有限公司 Intelligent image detection method for cable channel smoke and fire conditions in low-light-level environment
CN114913663A (en) * 2021-02-08 2022-08-16 腾讯科技(深圳)有限公司 Anomaly detection method and device, computer equipment and storage medium


Similar Documents

Publication Publication Date Title
JP5021171B2 (en) River information server
CN108965825B (en) Video linkage scheduling method based on holographic position map
US7991187B2 (en) Intelligent image smoke/flame sensor and detection system
CN105913600B (en) A kind of building intelligent fire alarm system
US7242295B1 (en) Security data management system
CN101334924B (en) Fire hazard probe system and its fire hazard detection method
CN107729850B (en) Internet of things outdoor advertisement monitoring and broadcasting system
CN113872328B (en) Remote intelligent substation inspection method and system based on neural network
CN105569733B (en) Underground coal mine driving face floods alarm method based on image
CN111667089A (en) Intelligent disaster prevention system and intelligent disaster prevention method
CN107167114A (en) Dilapidated house automatic monitoring system
KR101461184B1 (en) Wether condition data extraction system using cctv image
CN110636281B (en) Real-time monitoring camera shielding detection method based on background model
CN109410497B (en) Bridge opening space safety monitoring and alarming system based on deep learning
CN201091014Y (en) Fire detecting device
CN114463948A (en) Geological disaster monitoring and early warning method and system
CN206162876U (en) Road speed limit prison bat system based on visibility detection
CN110703760A (en) Newly-increased suspicious object detection method for security inspection robot
CN110928305B (en) Patrol method and system for patrol robot of railway passenger station
CN115512506B (en) End cloud linkage firefighting map detection method and system based on two buffer pools
CN110674543B (en) Three-dimensional visual gas-related operation safety control system and implementation method
CN113673406A (en) Curtain wall glass burst detection method and system, electronic equipment and storage medium
CN111330185A (en) Ancient building decoration fire prevention method and system
CN114998789A (en) Landslide geological disaster deformation monitoring system and method based on video identification
US11900470B1 (en) Systems and methods for acquiring insurance related informatics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant