CN115512506A - Terminal cloud linkage fire-fighting diagram detection method and system based on two buffer pools - Google Patents

Terminal cloud linkage fire-fighting diagram detection method and system based on two buffer pools

Info

Publication number
CN115512506A
CN115512506A (application number CN202211224520.1A)
Authority
CN
China
Prior art keywords
time
image
buffer pool
real
long
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211224520.1A
Other languages
Chinese (zh)
Other versions
CN115512506B (en
Inventor
聂晖 (Nie Hui)
罗朝会 (Luo Chaohui)
陈黎 (Chen Li)
杨小波 (Yang Xiaobo)
李军 (Li Jun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Eastwit Technology Co ltd
Wuhan Qingniao Zhi'an Technology Co ltd
Jade Bird Fire Co Ltd
Original Assignee
Wuhan Eastwit Technology Co ltd
Wuhan Qingniao Zhi'an Technology Co ltd
Jade Bird Fire Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Eastwit Technology Co ltd, Wuhan Qingniao Zhi'an Technology Co ltd, Jade Bird Fire Co Ltd filed Critical Wuhan Eastwit Technology Co ltd
Priority to CN202211224520.1A priority Critical patent/CN115512506B/en
Publication of CN115512506A publication Critical patent/CN115512506A/en
Application granted granted Critical
Publication of CN115512506B publication Critical patent/CN115512506B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00: Fire alarms; Alarms responsive to explosion
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761: Proximity, similarity or dissimilarity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/40: Scenes; Scene-specific elements in video content
    • G06V20/48: Matching video sequences

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Alarm Systems (AREA)

Abstract

The invention provides a method and a system for terminal-cloud linked fire-fighting image detection based on two buffer pools. The system comprises a front-end sensing device, video monitoring equipment and a cloud center server. The cloud center server is provided with: a buffer module, which comprises a long-time buffer pool and a short-time buffer pool and is used for storing and updating the acquired long-time and short-time image data respectively; and an image detection module, which is used for calculating the historical image difference between the long-time buffer pool and the short-time buffer pool during the window period in which the real-time video stream is being acquired, and for calculating the real-time image difference between the long-time buffer pool and the real-time video after the real-time video stream is obtained. When the cloud center server receives an alarm signal from the front-end sensing device, the historical image difference and/or real-time image difference calculations provide auxiliary analysis for manual image detection. By adopting the two-buffer-pool technique and the dual analysis of historical and real-time image detection differences, the invention effectively improves the timeliness and accuracy of fire alarm reporting.

Description

Terminal cloud linkage fire-fighting diagram detection method and system based on two buffer pools
Technical Field
The invention relates to the technical field of fire prevention and control automation platforms, in particular to a method and a system for detecting a terminal cloud linkage fire-fighting diagram based on two buffer pools.
Background
Fire safety is an important component of urban public safety. It concerns every household and every industry, and bears directly on municipal safety, traffic safety, production and operation safety, sanitation safety, and the safety of people's lives and property. In terms of intelligent fire prevention and control, existing traditional fire-fighting systems generally stop at the information level of business management: they mainly collect sensor data from the workplace, but the collected data are neither timely nor comprehensive enough, nor practical enough, the analytical value of the data is not exploited, and analysis still relies on manual work. Such an application model offers only limited help for fire prevention and fire-fighting rescue. The development of machine vision technology provides powerful support for integrating image information processing deeply into fire-fighting work and for building a modern fire-fighting mechanism that meets actual combat requirements.
A modern fire prevention and control automation platform takes electrical fire detectors, combustible gas detectors, fire door controllers and the like as its main front-end sensing devices, and takes video imaging equipment together with encoding, decoding and forwarding equipment as the visualization equipment of the fire alarm scene. At the first moment of a fire alarm, video pictures are transmitted back to the cloud platform, where manual image detection confirms the fire source, gaining valuable time for early fire extinguishing and for the golden three minutes of self-rescue. Fire alarm prevention and control is a race against minutes and seconds: the trend of the fire and the development of the disaster must be analyzed and judged as fast as possible. In practical applications, however, data packet loss caused by limited network bandwidth or communication anomalies, together with the distribution and switching of video streams, produces short delays or stalls in video picture transmission, which in turn delays the confirmation work of manual image detection. Moreover, manual fire-alarm screening of pictures requires workers to stay highly focused for long periods, and locating the fire source target also takes a certain amount of search and judgment time.
Disclosure of Invention
In view of the problems in the prior art, the invention provides a terminal-cloud linked fire-fighting image detection method and system based on two buffer pools, which effectively fills the blank window in which image transmission for manual image detection is missing, and improves the timeliness and accuracy of fire alarm reporting.
The technical scheme of the invention is as follows:
a terminal cloud linkage fire-fighting image detection system based on two buffer pools is characterized by comprising a front-end sensing device, video monitoring equipment and a cloud center server; the front end sensing device and the video monitoring equipment are respectively in communication connection with the cloud center server; the cloud center server is provided with:
the buffer module comprises a long-time buffer pool and a short-time buffer pool and is used for respectively storing and updating the acquired long-time image data and short-time image data of the video monitoring equipment;
the image detection module is used for calculating the difference of historical images of the long-time buffer pool and the short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is obtained, calculating the real-time image difference between the long-time buffer pool and the real-time video;
when the cloud center server receives an alarm signal of the front-end sensing device, auxiliary analysis is provided for manual image detection through historical image difference and/or real-time image difference calculation.
Furthermore, the cloud center server is further provided with an address matching module which is used for setting a geographic coordinate-IP address matching table of the front-end sensing device and the video monitoring equipment and providing a communication address of the video monitoring equipment corresponding to the front-end sensing device.
Further, the cloud center server is in communication connection with the front-end sensing device and the video monitoring equipment through the communication module.
A terminal cloud linkage fire-fighting diagram detection method based on two buffer pools is characterized by comprising the following steps:
s1, setting a geographic coordinate-IP address matching table of a front-end sensing device and video monitoring equipment;
s2, setting two buffer pools, wherein the first buffer pool is a long-time buffer pool, and the second buffer pool is a short-time buffer pool, and respectively storing and updating the acquired long-time image data and short-time image data of the video monitoring equipment;
s3, when the cloud center server receives an alarm signal of the front-end sensing device, the cloud center server inquires a communication address of corresponding video monitoring equipment through a matching table to remotely acquire a video stream;
s4, performing historical image difference calculation through image data of the long-time buffer pool and the short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is obtained, performing real-time image difference calculation through the long-time buffer pool and the image data of the real-time video stream;
and S5, providing auxiliary analysis for manual image detection by using the historical image difference calculation results and/or the real-time image difference calculation results.
Wherein, step S1 specifically includes:
s11, front end sensing device x i Deploying defense in a gridding mode and storing a front-end sensing device x i (ii) geographic coordinates of;
s12, video monitoring equipment y j Covering 1 to N front end sensing devices by the field of view, and storing the video monitoring equipment y j The IP address of (2);
s13, front end sensing device x i And video monitoring equipment y j The matching table of the geographic coordinates and the IP addresses is an N:1 linear table, namely N front-end sensing devices correspond to one video monitoring device.
The step S2 specifically includes:
s21, setting long-term buffer pool data updating time t1, receiving remote images of the video monitoring equipment at intervals of t1, and updating image data of the long-term buffer pool;
s22, setting short-time buffer pool data updating time t2, receiving remote images of the video monitoring equipment at intervals of t2, and updating image data of the short-time buffer pool; wherein t2 < t1.
Further, in step S21, the long-term buffer pool image data update method is a gaussian background modeling method, which specifically includes:
let the mapping p (x) of each pixel point (x, y) to the Gaussian background model satisfy for the background image B
p(x) = (1 / (√(2π)·d)) · exp(−(x − u)² / (2d²))
wherein x is the pixel value of a given pixel point, and u and d are the mean and variance of the Gaussian distribution; the mean u and variance d of each point in the long-time buffer pool image sequence are calculated as the background model; for an arbitrary image G containing the foreground, for each point (x, y) on the image, if:
|G(x, y) − B(x, y)| < T
then the point is a background point; otherwise it is a foreground point; wherein G(x, y) represents the pixel value of a point on the foreground image, namely the current image, B(x, y) represents the pixel value of the corresponding point on the background image, namely the Gaussian background model, and T is a constant threshold;
the long-term buffer pool updates each frame of image by the following method:
B_t(x, y) = p · B_{t-1}(x, y) + (1 − p) · G(x, y)
wherein p is an update threshold, which is a constant and is used to reflect the background update rate, and the larger p is, the slower the background update is.
In step S22, the image data of the short-time buffer pool is updated by replacement, which specifically comprises:
after the short-time buffer pool acquires an image, all pixels M_t(x, y) of the newly acquired image completely overwrite the previous pixels M_{t-1}(x, y):
M_{t-1}(x, y) ← M_t(x, y), for every pixel (x, y)
In step S4, the calculation method of the difference between the historical images and the difference between the real-time images is a structural similarity difference calculation method, and the specific steps include:
inputting the image data of the long-time and short-time buffer pools, or of the long-time buffer pool and the real-time video, and denoting each pair of pictures as x and y;
taking the mean gray level μ as the brightness measure, and computing the mean gray levels u_x and u_y;
taking the gray-level standard deviation σ as the contrast measure, and computing the standard deviations σ_x and σ_y;
computing the structural difference D(x, y) between the pictures from u_x, u_y, σ_x and σ_y:
D(x, y) = [(2·u_x·u_y + C1)(2·σ_x·σ_y + C2)] / [(u_x² + u_y² + C1)(σ_x² + σ_y² + C2)]
wherein C1 and C2 are constants introduced to avoid instability when the denominator approaches 0;
the larger D is, the smaller the difference between the pictures; the smaller D is, the larger the difference.
In step S5, providing auxiliary analysis for manual image detection comprises: after the front-end sensing device raises an alarm and before the cloud center server has acquired the live real-time video stream, if the historical image difference D1 is greater than a certain threshold, the alarm is most likely not a false alarm of the front-end sensing device; the staff can derive the probable place and time of the fire from the IP address and the video time, issue a fire early warning, and prepare fire-fighting operations in advance. After the live real-time video stream reaches the cloud center server, if the real-time image difference D2 is greater than a certain threshold, a fire at that location is judged highly probable; the staff can determine the place and time of the fire from the IP address and the video time and issue a fire alarm.
The invention has the technical effects that:
according to the method and the system for detecting the end cloud linkage fire fighting diagram based on the two buffer pools, the short-time buffer pool and the long-time buffer pool are arranged, and the obtained long-time image data and the obtained short-time image data are stored and updated respectively; when waiting for real-time picture transmission, the information of the short-time buffer pool and the long-time buffer pool is utilized to calculate the detection difference of the historical map, thereby effectively making up a blank window lacking in picture transmission and finding out a potential fire point in time; after receiving the real-time picture, calculating the real-time picture detection difference by using the information of the real-time video and the long-time buffer pool; according to the dual analysis of the detection difference of the historical images and the detection difference of the real-time images, a powerful technical guarantee is provided for the staff to finally determine the alarm result, the timeliness and the accuracy of the fire alarm report are effectively improved, and the method is worthy of popularization and application.
Drawings
Fig. 1 is a schematic structural view of a terminal cloud linkage fire-fighting image detection system based on two buffer pools.
Fig. 2 is a schematic flow chart of an embodiment of a terminal cloud-linked fire-fighting diagram detection method based on two buffer pools according to the present invention.
Detailed Description
For a further understanding of the invention, reference is made to the following description of the preferred embodiments of the invention taken in conjunction with the accompanying drawings, but it is understood that the description is intended to illustrate further features and advantages of the invention, and not to limit the scope of the appended claims.
Fig. 1 is a schematic structural view of a terminal cloud linkage fire-fighting image detection system based on two buffer pools.
A terminal cloud linkage fire-fighting image detection system based on two buffer pools comprises a front-end sensing device, video monitoring equipment and a cloud center server; the front-end sensing device and the video monitoring equipment are each in communication connection with the cloud center server. In this embodiment, the front-end sensing device and the video monitoring equipment are connected to the cloud center server through communication modules. The cloud center server is provided with a buffer module, an image detection module and an address matching module. The buffer module comprises a long-time buffer pool and a short-time buffer pool and is used for storing and updating the acquired long-time and short-time image data of the video monitoring equipment respectively. The image detection module is used for calculating the historical image difference between the long-time and short-time buffer pools during the window period in which the real-time video stream is being acquired, and, after the real-time video stream is obtained, calculating the real-time image difference between the long-time buffer pool and the real-time video. When the cloud center server receives an alarm signal from the front-end sensing device, the historical image difference and/or real-time image difference calculations provide auxiliary analysis for manual image detection. The address matching module is used for setting the geographic coordinate-IP address matching table of the front-end sensing devices and the video monitoring equipment and for providing the communication address of the video monitoring equipment corresponding to a front-end sensing device.
Fig. 2 is a schematic flow chart of an embodiment of a terminal cloud linked fire fighting diagram detection method based on two buffer pools according to the present invention.
A terminal cloud linkage fire-fighting diagram detection method based on two buffer pools comprises the following steps:
s1, setting a geographic coordinate-IP address matching table of a front-end sensing device and video monitoring equipment;
s2, setting two buffer pools, wherein the first buffer pool is a long-time buffer pool, and the second buffer pool is a short-time buffer pool, and respectively storing and updating the acquired long-time image data and short-time image data of the video monitoring equipment;
s3, when the cloud center server receives an alarm signal of the front-end sensing device, the cloud center server inquires a communication address of corresponding video monitoring equipment through a matching table to remotely acquire a video stream;
s4, performing historical image difference calculation through image data of the long-time buffer pool and the short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is obtained, performing real-time image difference calculation through the long-time buffer pool and the image data of the real-time video stream;
and S5, providing auxiliary analysis for manual image detection by using the historical image difference calculation results and/or the real-time image difference calculation results.
Wherein, step S1 specifically includes:
s11, front end sensing device x i Deploying defense in a gridding manner and storing front-end sensing device x i (ii) geographic coordinates of;
s12, video monitoring equipment y j Covering 1 to N front end sensing devices by the field of view, and storing the video monitoring equipment y j The IP address of (2);
s13, front end sensing device x i And video monitoring equipment y j The matching table of the geographic coordinates and the IP addresses is an N:1 linear table, namely N front-end sensing devices correspond to one video monitoring device. In this embodiment, every 3-5 front-end sensing devices correspond to a video monitoring apparatus.
The step S2 specifically includes:
s21, setting long-term buffer pool data updating time t1, receiving a remote image of the video monitoring equipment at intervals of t1 (in a 'no alarm' or 'alarm eliminated' state), and updating the long-term buffer pool image data; in this embodiment, t1 is set to 12h, that is, the long-term buffer pool image data is updated every 12 hours;
s22, setting short-time buffer pool data updating time t2, receiving a remote image of the video monitoring equipment at the time of every t2 (in a 'non-alarm' or 'alarm eliminated' state), and updating the short-time buffer pool image data; wherein t2 < t1; in this embodiment, t2 is set to 15s, that is, the image data of the buffer pool is updated every 15 seconds in a short time;
further, the long-term buffer pool image data updating method is a gaussian background modeling method, and specifically includes:
let the mapping p (x) of each pixel point (x, y) to the Gaussian background model satisfy for the background image B
p(x) = (1 / (√(2π)·d)) · exp(−(x − u)² / (2d²))
wherein x is the pixel value of a given pixel point, and u and d are the mean and variance of the Gaussian distribution; the mean u and variance d of each point in the long-time buffer pool image sequence are calculated as the background model; for an arbitrary image G containing the foreground, for each point (x, y) on the image, if:
|G(x, y) − B(x, y)| < T
then the point is a background point; otherwise it is a foreground point; wherein G(x, y) represents the pixel value of a point on the foreground, namely the current image, and B(x, y) represents the pixel value of the corresponding point on the background, namely the Gaussian background modeling image; T is a constant threshold, generally set to 0.25 or 0.75; in this embodiment, T is set to 0.75, which gives the best effect.
The long-term buffer pool updates each frame of image by the following method:
B_t(x, y) = p · B_{t-1}(x, y) + (1 − p) · G(x, y)
wherein p is an update threshold, a constant reflecting the background update rate; the larger p is, the slower the background updates. In this embodiment, p is set to 0.2.
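The per-pixel classification and background blending described above can be sketched with numpy (a minimal sketch assuming grayscale images normalized to [0, 1]; T = 0.75 and p = 0.2 follow this embodiment):

```python
import numpy as np

def is_background(G, B, T=0.75):
    """Per-pixel test |G - B| < T: True marks background points."""
    return np.abs(G - B) < T

def update_background(B, G, p=0.2):
    """Blend the current frame into the background model; a larger p
    keeps more of the old background, i.e. a slower update."""
    return p * B + (1.0 - p) * G

B = np.zeros((3, 3))              # all-dark background model
G = np.full((3, 3), 0.9)          # bright current frame (e.g. a flame region)
fg_mask = ~is_background(G, B)    # |0.9 - 0| >= 0.75 -> all foreground
B_next = update_background(B, G)  # 0.2*0.0 + 0.8*0.9 = 0.72 everywhere
```

With p = 0.2 the bright frame pulls the background most of the way toward itself in one step; a p close to 1 would instead preserve the old background almost unchanged.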
The image data updating method of the short-time buffer pool is a replacement method, and specifically comprises the following steps:
after the short-time buffer pool acquires an image, all pixels M_t(x, y) of the newly acquired image completely overwrite the previous pixels M_{t-1}(x, y):
M_{t-1}(x, y) ← M_t(x, y), for every pixel (x, y)
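The replacement update is a single overwrite per frame; a minimal numpy sketch (array shapes and variable names are illustrative assumptions):

```python
import numpy as np

def update_short_pool(prev_frame, new_frame):
    """Short-time pool replacement update: every stored pixel
    M_{t-1}(x, y) is overwritten by the new frame's M_t(x, y)."""
    return new_frame.copy()

prev = np.zeros((2, 2))
cur = np.ones((2, 2))
stored = update_short_pool(prev, cur)
```

Returning a copy keeps the pool's stored frame independent of later in-place edits to the camera frame buffer.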
In step S3, when the cloud center server receives an alarm signal from front-end sensing device x_i, it queries the matching table for the IP address of the corresponding video monitoring device y_j and issues a remote acquisition request for the real-time video stream.
The remote acquisition request can be made in two modes: a static image mode, which acquires picture data at fixed intervals, and a video stream mode, which continuously acquires the real-time streaming signal.
In step S4, the calculation method of the difference between the historical images and the difference between the real-time images is a structural similarity difference calculation method, and the specific steps include:
inputting the image data of the long-time and short-time buffer pools, or of the long-time buffer pool and the real-time video, and denoting each pair of pictures as x and y;
taking the mean gray level μ as the brightness measure, and computing the mean gray levels u_x and u_y;
taking the gray-level standard deviation σ as the contrast measure, and computing the standard deviations σ_x and σ_y;
computing the structural difference D(x, y) between the pictures from u_x, u_y, σ_x and σ_y:
D(x, y) = [(2·u_x·u_y + C1)(2·σ_x·σ_y + C2)] / [(u_x² + u_y² + C1)(σ_x² + σ_y² + C2)]
wherein C1 and C2 are constants introduced to avoid instability when the denominator approaches 0; in this embodiment, C1 is set to 0.0001 and C2 to 0.0009, which keeps the denominator away from 0 while minimizing the error introduced.
The larger the computed D is, the smaller the difference between the pictures; the smaller D is, the larger the difference.
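The structural difference D can be computed directly from the formula above (a sketch assuming whole-image means and standard deviations rather than windowed statistics; C1 = 0.0001 and C2 = 0.0009 follow this embodiment):

```python
import numpy as np

def structural_difference(x, y, C1=0.0001, C2=0.0009):
    """Structural difference D between two gray pictures; a larger D
    means the pictures are more similar."""
    ux, uy = x.mean(), y.mean()
    sx, sy = x.std(), y.std()
    return ((2 * ux * uy + C1) * (2 * sx * sy + C2)) / (
        (ux ** 2 + uy ** 2 + C1) * (sx ** 2 + sy ** 2 + C2)
    )

# A smooth gray ramp compared against itself and against a darkened copy.
a = np.linspace(0.0, 1.0, 64).reshape(8, 8)
d_same = structural_difference(a, a)        # identical pictures
d_dark = structural_difference(a, 0.2 * a)  # darker picture differs more
```

Identical pictures give the maximum score D = 1; the darkened copy scores lower in both the brightness and contrast terms.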
In this embodiment, first, during the window period of the remote video stream acquisition request, the difference between the image data of the long-time buffer pool and that of the short-time buffer pool is calculated to obtain D1, the historical image detection difference; then, after the acquisition request succeeds, the difference between the long-time buffer pool and the real-time video is calculated to obtain D2, the real-time image detection difference.
In step S5, providing auxiliary analysis for manual image detection comprises: after the front-end sensing device raises an alarm and before the cloud center server has acquired the live real-time video stream, if the historical image difference D1 exceeds a certain threshold (in this embodiment, D1 > 0.75), the alarm is most likely not a false alarm of the front-end sensing device; the staff can query the IP address and the video time to obtain the probable place and time of the fire, issue a fire early warning, and prepare fire-fighting operations in advance. After the live real-time video stream reaches the cloud center server, if the real-time image difference D2 exceeds a certain threshold (in this embodiment, D2 > 0.5), a fire at that location is judged highly probable, and the staff can determine the place and time of the fire from the IP address and the video time and issue a fire alarm.
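The two-stage decision described above can be summarized in a small helper (the function and stage names are invented for the sketch; the thresholds 0.75 and 0.5 follow this embodiment):

```python
def alarm_stage(d1=None, d2=None, hist_thr=0.75, rt_thr=0.5):
    """Map the historical difference D1 and real-time difference D2 to
    the embodiment's two alarm stages."""
    if d2 is not None and d2 > rt_thr:
        return "fire alarm"            # live stream confirms the fire
    if d1 is not None and d1 > hist_thr:
        return "fire early warning"    # likely not a sensor false alarm
    return "no action"
```

Before the live stream arrives only D1 is available, so at most an early warning is raised; once D2 can be computed it takes precedence for the final alarm.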
It should be noted that the above-mentioned embodiments enable a person skilled in the art to more fully understand the invention, without restricting it in any way. All technical solutions and modifications thereof without departing from the spirit and scope of the present invention are covered by the protection scope of the present invention.

Claims (10)

1. A terminal cloud linkage fire-fighting image detection system based on two buffer pools is characterized by comprising a front-end sensing device, video monitoring equipment and a cloud center server; the front end sensing device and the video monitoring equipment are respectively in communication connection with the cloud center server; the cloud center server is provided with:
the buffer module comprises a long-time buffer pool and a short-time buffer pool and is used for respectively storing and updating the acquired long-time image data and short-time image data of the video monitoring equipment;
the image detection module is used for calculating the difference of historical images of the long-time buffer pool and the short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is obtained, calculating the real-time image difference between the long-time buffer pool and the real-time video;
when the cloud center server receives an alarm signal of the front-end sensing device, auxiliary analysis is provided for manual image detection through historical image difference and/or real-time image difference calculation.
2. The system according to claim 1, wherein the cloud center server is further provided with an address matching module for setting a geographic coordinate-IP address matching table of the front-end sensing device and the video monitoring equipment and providing a communication address of the video monitoring equipment corresponding to the front-end sensing device.
3. The two-buffer-pool-based end cloud-linked fire fighting image detection system according to claim 1, wherein the cloud center server is in communication connection with the front end sensing device and the video monitoring device through communication modules respectively.
4. A terminal cloud linkage fire-fighting diagram detection method based on two buffer pools is characterized by comprising the following steps:
s1, setting a geographic coordinate-IP address matching table of a front-end sensing device and video monitoring equipment;
s2, setting two buffer pools, wherein the first buffer pool is a long-term buffer pool, and the second buffer pool is a short-term buffer pool, and respectively storing and updating the obtained long-term image data and short-term image data of the video monitoring equipment;
s3, when the cloud center server receives an alarm signal of the front-end sensing device, the cloud center server inquires a communication address of corresponding video monitoring equipment through a matching table to remotely acquire a video stream;
s4, performing historical image difference calculation through image data of the long-time buffer pool and the short-time buffer pool in a window period acquired by the real-time video stream; after the real-time video stream is obtained, performing real-time image difference calculation through the long-time buffer pool and the image data of the real-time video stream;
and S5, providing auxiliary analysis for manual image detection by using the historical image difference calculation results and/or the real-time image difference calculation results.
5. The two-buffer-pool-based terminal-cloud linkage fire-fighting image detection method according to claim 4, wherein step S1 specifically comprises:
S11, deploying the front-end sensing devices x_i in a grid pattern and storing the geographic coordinates of each x_i;
S12, covering 1 to N front-end sensing devices with the field of view of each video monitoring device y_j, and storing the IP address of y_j;
S13, building the matching table between the geographic coordinates of the front-end sensing devices x_i and the IP addresses of the video monitoring devices y_j as an N:1 linear table, i.e., N front-end sensing devices correspond to one video monitoring device.
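The N:1 matching table of steps S11-S13 can be sketched as a plain lookup structure. This is an illustrative sketch only; the device identifiers, coordinates, and IP addresses below are hypothetical and not taken from the patent.

```python
# Hypothetical sensor deployment: sensor id -> (latitude, longitude)
sensors = {
    "x1": (30.52, 114.31),
    "x2": (30.53, 114.32),
    "x3": (30.60, 114.40),
}

# N:1 linear table: each front-end sensing device maps to the IP address
# of the one video monitoring device whose field of view covers it.
match_table = {
    "x1": "192.168.1.10",  # camera y1 covers sensors x1 and x2
    "x2": "192.168.1.10",
    "x3": "192.168.1.11",  # camera y2 covers sensor x3
}

def camera_for(sensor_id: str) -> str:
    """Return the communication address of the camera covering this sensor."""
    return match_table[sensor_id]

print(camera_for("x2"))  # -> 192.168.1.10
```

When a sensor raises an alarm, the cloud center server would resolve the camera address with a single table lookup, as in `camera_for` above.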
6. The two-buffer-pool-based terminal-cloud linkage fire-fighting image detection method according to claim 4, wherein step S2 specifically comprises:
S21, setting a long-term buffer pool update interval t1, receiving a remote image from the video monitoring equipment every t1, and updating the image data of the long-term buffer pool;
S22, setting a short-term buffer pool update interval t2, receiving a remote image from the video monitoring equipment every t2, and updating the image data of the short-term buffer pool; wherein t2 < t1.
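The two update intervals can be sketched as follows. The interval values (60 s and 5 s) and the simulated clock are assumptions for illustration; the patent only requires t2 < t1.

```python
class BufferPool:
    """Holds the latest frame and refreshes it only after `interval` seconds."""
    def __init__(self, interval: float):
        self.interval = interval
        self.frame = None
        self.last_update = float("-inf")

    def maybe_update(self, now: float, frame) -> bool:
        if now - self.last_update >= self.interval:
            self.frame = frame
            self.last_update = now
            return True
        return False

# t2 < t1: the short-term pool refreshes far more often than the long-term pool
long_pool = BufferPool(interval=60.0)   # t1, hypothetical value
short_pool = BufferPool(interval=5.0)   # t2, hypothetical value

updates = []
for now in range(0, 130, 5):            # simulated clock: one frame every 5 s
    frame = f"frame@{now}"
    updates.append((now,
                    long_pool.maybe_update(now, frame),
                    short_pool.maybe_update(now, frame)))

long_updates = sum(1 for _, l, _ in updates if l)
short_updates = sum(1 for _, _, s in updates if s)
print(long_updates, short_updates)
```

Over the 125-second simulation the long-term pool accepts a frame only at 0 s, 60 s, and 120 s, while the short-term pool accepts every frame, which is the intended t2 < t1 behavior.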
7. The two-buffer-pool-based terminal-cloud linkage fire-fighting image detection method according to claim 6, wherein in step S21 the image data of the long-term buffer pool is updated by Gaussian background modeling, specifically:

For the background image B, let the mapping p(x) of each pixel (x, y) onto the Gaussian background model satisfy

p(x) = (1 / (√(2π)·d)) · exp(−(x − u)² / (2d²))

where x is the pixel value of a given pixel, and u and d are the mean and standard deviation of the Gaussian distribution. The mean u and standard deviation d of each point over the long-term buffer pool image sequence are computed as the background model. For an arbitrary image G containing the foreground, each point (x, y) on the image is a background point if

|G(x, y) − B(x, y)| < T

and a foreground point otherwise; here T is a constant threshold, G(x, y) is the pixel value of a point on the foreground (i.e., current) image, and B(x, y) is the pixel value of the corresponding point on the background image built by the Gaussian background modeling.

The long-term buffer pool updates each frame as follows:

B_t(x, y) = p·B_{t−1}(x, y) + (1 − p)·G_t(x, y)

where p is an update constant reflecting the background update rate: the larger p is, the slower the background updates.
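A minimal NumPy sketch of the single-Gaussian background model described in claim 7. The threshold T = 25 and update rate p = 0.95 are hypothetical values, not taken from the patent.

```python
import numpy as np

def fit_background(frames):
    """Per-pixel mean u and std d over the long-term buffer pool sequence."""
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0), stack.std(axis=0)

def foreground_mask(G, B, T=25.0):
    """A pixel is foreground when |G(x,y) - B(x,y)| reaches the threshold T."""
    return np.abs(G.astype(np.float64) - B) >= T

def update_background(B, G, p=0.95):
    """B_t = p*B_{t-1} + (1-p)*G_t: larger p means a slower background update."""
    return p * B + (1.0 - p) * G.astype(np.float64)

# Tiny demo on 2x2 "images"
frames = [np.full((2, 2), v) for v in (100, 102, 98)]
u, d = fit_background(frames)            # per-pixel mean is 100 everywhere
G = np.array([[100, 100], [200, 100]])   # one bright "foreground" pixel
mask = foreground_mask(G, u)
print(mask.sum())                        # -> 1 (only the bright pixel)
B_next = update_background(u, G)         # background drifts slowly toward G
```

Note that the patent reuses the symbol p both for the Gaussian density p(x) and for the update constant; the code separates these roles into distinct functions.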
8. The two-buffer-pool-based terminal-cloud linkage fire-fighting image detection method according to claim 6, wherein in step S22 the image data of the short-time buffer pool is updated by replacement, specifically: when a new image is acquired by the short-time buffer pool, every newly acquired pixel M_t(x, y) completely overwrites the previous pixel M_{t−1}(x, y):

M_t(x, y) = G_t(x, y) for every pixel (x, y)
9. The two-buffer-pool-based terminal-cloud linkage fire-fighting image detection method according to claim 4, wherein in step S4 the historical-image difference and the real-time image difference are computed with a structural-similarity difference method, the specific steps comprising:

inputting the image data of the long-term and short-term buffer pools, or of the long-term buffer pool and the real-time video, and denoting each pair of pictures as x and y;

taking the mean gray u as the brightness measure and computing the mean grays ux and uy;

taking the gray standard deviation σ as the contrast measure and computing the gray standard deviations σx and σy;

and computing from ux, uy, σx and σy the structural difference D(x, y) between the pictures:

D(x, y) = ((2·ux·uy + C1) · (2·σx·σy + C2)) / ((ux² + uy² + C1) · (σx² + σy² + C2))

where C1 and C2 are constants that avoid instability when a denominator approaches 0. The larger D is, the smaller the difference between the pictures; the smaller D is, the larger the difference.
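The structural difference D of claim 9 can be sketched directly from the four statistics it names. The constants C1 = 6.5025 and C2 = 58.5225 follow the common SSIM convention for 8-bit images and are assumptions here, not values from the patent.

```python
import numpy as np

def structural_difference(x, y, C1=6.5025, C2=58.5225):
    """D(x, y) from mean grays (brightness) and gray std devs (contrast)."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    ux, uy = x.mean(), y.mean()      # brightness measure: mean gray
    sx, sy = x.std(), y.std()        # contrast measure: gray standard deviation
    return ((2 * ux * uy + C1) * (2 * sx * sy + C2)) / \
           ((ux**2 + uy**2 + C1) * (sx**2 + sy**2 + C2))

a = np.array([[10, 20], [30, 40]])
print(round(structural_difference(a, a), 3))  # identical pictures -> 1.0
```

For identical pictures the numerator and denominator coincide, so D = 1.0 (no difference); dissimilar brightness or contrast drives D below 1, matching the rule that a smaller D means a larger picture difference.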
10. The two-buffer-pool-based terminal-cloud linkage fire-fighting image detection method according to claim 9, wherein in step S5 providing auxiliary analysis for manual image inspection comprises: after a front-end sensing device raises an alarm and before the cloud center server has acquired the live real-time video stream, if the historical-image difference D1 is greater than a certain threshold, the front-end sensing device has most likely raised a false alarm; the staff can obtain the probable place and time of the fire from the IP address and video time, issue a fire early warning, and prepare fire-fighting operations in advance. After the live real-time video stream reaches the cloud center server, if the real-time image difference D2 is greater than a certain threshold, a fire at that location is judged highly probable, and the staff can determine the fire place and time from the IP address and video time and issue a fire alarm.
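The two-stage advisory logic of claim 10 can be sketched as a small decision function. This mirrors the claim's wording; the threshold values 0.9 and the label strings are assumptions for illustration only.

```python
def advise(d1=None, d2=None, thr1=0.9, thr2=0.9):
    """Advisory label from the difference results of step S4.

    d1: historical-image difference (available before the live stream arrives);
    d2: real-time image difference (available after the live stream arrives).
    Per the claim, d1 above its threshold suggests a front-end false alarm,
    and d2 above its threshold triggers a fire alarm.
    """
    if d2 is not None:                       # live stream has arrived
        return "fire alarm" if d2 > thr2 else "no alarm"
    if d1 is not None:                       # only historical data so far
        return "likely false alarm" if d1 > thr1 else "fire early warning"
    return "awaiting data"

print(advise(d1=0.95))           # before live stream -> likely false alarm
print(advise(d1=0.95, d2=0.3))   # live stream arrived, d2 below threshold
```

Once d2 is available it supersedes d1, matching the claim's before/after structure around the arrival of the live stream.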
CN202211224520.1A 2022-10-09 2022-10-09 End cloud linkage firefighting map detection method and system based on two buffer pools Active CN115512506B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211224520.1A CN115512506B (en) 2022-10-09 2022-10-09 End cloud linkage firefighting map detection method and system based on two buffer pools

Publications (2)

Publication Number Publication Date
CN115512506A true CN115512506A (en) 2022-12-23
CN115512506B CN115512506B (en) 2023-06-20

Family

ID=84508910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211224520.1A Active CN115512506B (en) 2022-10-09 2022-10-09 End cloud linkage firefighting map detection method and system based on two buffer pools

Country Status (1)

Country Link
CN (1) CN115512506B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4737847A (en) * 1985-10-11 1988-04-12 Matsushita Electric Works, Ltd. Abnormality supervising system
CN103593936A (en) * 2013-09-29 2014-02-19 西安祥泰软件设备系统有限责任公司 Fire alarm remote monitoring method and embedded motherboard
JP2017076304A (en) * 2015-10-16 2017-04-20 アズビル株式会社 Fire detection system
CN112562255A (en) * 2020-12-03 2021-03-26 国家电网有限公司 Intelligent image detection method for cable channel smoke and fire condition in low-light-level environment
CN114913663A (en) * 2021-02-08 2022-08-16 腾讯科技(深圳)有限公司 Anomaly detection method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN115512506B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
CN108965825B (en) Video linkage scheduling method based on holographic position map
CN107729850B (en) Internet of things outdoor advertisement monitoring and broadcasting system
JP5021171B2 (en) River information server
US7342489B1 (en) Surveillance system control unit
US7242295B1 (en) Security data management system
CN105913600B (en) A kind of building intelligent fire alarm system
CN108040221A (en) A kind of intelligent video analysis and monitoring system
CN113778034B (en) Intelligent manufacturing monitoring industrial Internet platform based on edge calculation
CN111667089A (en) Intelligent disaster prevention system and intelligent disaster prevention method
CN111695541A (en) Unmanned aerial vehicle forest fire prevention system and method based on machine vision
CN116545122B (en) Power transmission line external damage prevention monitoring device and external damage prevention monitoring method
CN110896462A (en) Control method, device and equipment of video monitoring cluster and storage medium
CN115082813A (en) Detection method, unmanned aerial vehicle, detection system and medium
CN108731682A (en) A kind of path planning system and method applied to underground mine rescue
CN115512506A (en) Terminal cloud linkage fire-fighting diagram detection method and system based on two buffer pools
CN114973564A (en) Remote personnel intrusion detection method and device under non-illumination condition
CN112966552B (en) Routine inspection method and system based on intelligent identification
CN113673406A (en) Curtain wall glass burst detection method and system, electronic equipment and storage medium
CN110928305B (en) Patrol method and system for patrol robot of railway passenger station
CN116582693B (en) Camera calling control method based on video resource pool
CN112001810A (en) Intelligent forestry patrolling system and method based on machine vision
CN108520615B (en) Fire identification system and method based on image
CN104378629A (en) Camera fault detection method
KR102191349B1 (en) Fire detection system providing a digital map for fire detection based on the image of the fire receiver
CN114581598A (en) Service integration framework based on three-dimensional live-action and on-site video fusion management

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant