CN109118702B - Fire detection method, device and equipment - Google Patents

Fire detection method, device and equipment Download PDF

Info

Publication number
CN109118702B
CN109118702B (Application CN201811151379.0A)
Authority
CN
China
Prior art keywords
image
depth
infrared
block
comparison result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811151379.0A
Other languages
Chinese (zh)
Other versions
CN109118702A (en)
Inventor
宋林东
王倩
吕思豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Optical Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Optical Technology Co Ltd filed Critical Goertek Optical Technology Co Ltd
Priority to CN201811151379.0A priority Critical patent/CN109118702B/en
Publication of CN109118702A publication Critical patent/CN109118702A/en
Application granted granted Critical
Publication of CN109118702B publication Critical patent/CN109118702B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G08B 17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/13 Satellite images
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G08B 17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B 17/125 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Multimedia (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Fire-Detection Mechanisms (AREA)

Abstract

The embodiment of the invention provides a fire detection method, a fire detection device and fire detection equipment, wherein the method comprises the following steps: acquiring a depth image and an infrared image through a TOF depth module; comparing the depth image with a preset reference depth image to obtain a first comparison result; comparing the infrared image with a preset reference infrared image to obtain a second comparison result; and determining whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results. By comparing the depth gray level difference value corresponding to the depth image with the infrared gray level difference value corresponding to the infrared image, whether a fire has occurred is finally determined. Furthermore, the position of the fire can be determined based on the images, so that a fire can be detected rapidly and accurately, and accurate position information of the fire and the corresponding image information can be provided to the staff.

Description

Fire detection method, device and equipment
Technical Field
The invention relates to the technical field of image processing, in particular to a fire detection method, a fire detection device and fire detection equipment.
Background
Currently, fire monitoring and detection devices are arranged in various public places, such as schools and shopping malls, so that a fire can be discovered in the shortest possible time and an alarm can be raised promptly when it breaks out.
In the prior art, a smoke sensor is usually adopted for alarming. Such a device has a low cost and is currently the most widely deployed alarm device in public places. However, some fires involve direct open burning, and by the time smoke is detected by a smoke sensor the fire may already have developed to a serious degree, so it cannot be discovered in time. Existing infrared fire alarms are also prone to interference and misjudgment caused by changes in indoor temperature and lighting, so that a fire may likewise not be discovered in time.
Based on this, a scheme for timely, accurate and simple fire detection is needed.
Disclosure of Invention
In view of this, embodiments of the present invention provide a fire detection method, apparatus and device, so as to improve real-time performance and accuracy of indoor fire detection.
In a first aspect, an embodiment of the present invention provides a fire detection method, including:
acquiring a depth image and an infrared image through a TOF depth module;
comparing the depth image with a preset reference depth image to obtain a first comparison result;
comparing the infrared image with a preset reference infrared image to obtain a second comparison result;
and determining whether a fire exists according to the second comparison result and a plurality of continuous first comparison results.
In a second aspect, an embodiment of the present invention provides a fire detection apparatus, including:
the acquisition module is used for acquiring the depth image and the infrared image through the TOF depth module;
the first comparison module is used for comparing the depth image with a preset reference depth image to obtain a first comparison result;
the second comparison module is used for comparing the infrared image with a preset reference infrared image to obtain a second comparison result;
and the determining module is used for determining whether a fire disaster exists according to the second comparison result and a plurality of continuous first comparison results.
In a third aspect, an embodiment of the present invention provides an electronic device, including: a memory, a processor; wherein,
the memory is configured to store one or more computer instructions, wherein the one or more computer instructions, when executed by the processor, implement the fire detection method according to the first aspect.
According to the fire detection method provided by the embodiment of the invention, the depth image and the infrared image are acquired simultaneously through the TOF depth module. The gray values of the acquired depth image and infrared image are respectively compared with the previously acquired reference depth image and reference infrared image. If both the depth image comparison result and the infrared image comparison result indicate that a fire may be occurring, a fire is considered to have occurred. When performing the gray value comparison, it is usually necessary to divide each image into a plurality of blocks and to compare based on the average gray value within each block. After the occurrence of a fire is confirmed, the staff are notified in time. By comparing the depth gray level difference value corresponding to the depth image with the infrared gray level difference value corresponding to the infrared image, whether a fire has occurred is finally determined. Furthermore, the position of the fire can be determined based on the images, so that a fire can be detected rapidly and accurately, and accurate position information of the fire and the corresponding image information can be provided to the staff.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a flow chart of a method for fire detection according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of image partitioning according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a fire detection device according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a fire detection device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the embodiments of the present invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise; "a plurality of" generally denotes at least two, but does not exclude the case of at least one.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that an article or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such article or system. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other like elements in an article or system that includes the element.
In addition, the sequence of steps in each method embodiment described below is only an example and is not strictly limited.
Fig. 1 is a schematic flow chart of a fire detection method according to an embodiment of the present invention, and the main steps of performing indoor fire detection include:
101: and acquiring a depth image and an infrared image through the TOF depth module.
It is easy to understand that the TOF (Time of Flight) depth module emits infrared light and captures a depth image and an infrared image in the same frame. In practical application, when images are acquired through the TOF depth module, they are acquired continuously at a set frequency. When image comparison is performed, two consecutive adjacent frames may be compared, or the current image may be compared with a pre-stored reference image.
102: and comparing the depth image with a preset reference depth image to obtain a first comparison result.
The preset reference depth image refers to the depth image of the frame preceding the currently acquired depth image. When the comparison is performed, each of the current images collected in real time is compared with the preset reference image of its previous frame.
Specifically, the gray value of the depth image acquired in real time is compared with that of the preset reference depth image: the difference between the current depth image and the previous frame of depth image (namely, the preset reference depth image) is calculated, the sum of the squares of the differences is taken as the difference value, and the sum of the difference values over a plurality of consecutive frames is calculated.
If the depth information values of a plurality of consecutive depth images jump by a large amount compared with the gray values of their respective preset reference depth images (namely, the gray difference value exceeds the first preset threshold), it can be judged that the depth information at that position changes greatly and is in an unstable state, and the sum of the difference values over the plurality of consecutive depth images is large. If that sum exceeds the set depth difference threshold, a fire is considered possible; otherwise, no fire is considered to be detected.
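By way of illustration only, the consecutive-frame depth comparison described above might be sketched as follows; the frame resolution, the function name and the threshold value are assumptions made for the example and are not values fixed by this embodiment.

```python
import numpy as np

def depth_change_score(frames):
    """Sum, over consecutive frame pairs, of the squared gray-value
    differences between each depth frame and its previous frame
    (the preset reference depth image)."""
    score = 0.0
    for prev, curr in zip(frames[:-1], frames[1:]):
        diff = curr.astype(np.float64) - prev.astype(np.float64)
        score += float(np.sum(diff ** 2))
    return score

# Illustrative use with five consecutive 400 x 300 depth frames.
frames = [np.random.randint(0, 256, (300, 400), dtype=np.uint8) for _ in range(5)]
DEPTH_DIFF_THRESHOLD = 1e7  # hypothetical "set depth difference threshold"
suspected_fire = depth_change_score(frames) > DEPTH_DIFF_THRESHOLD
```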
103: and comparing the infrared image with a preset reference infrared image to obtain a second comparison result.
The preset reference infrared image is acquired by the TOF module after the fire detection equipment (comprising the depth image acquisition module and the infrared image acquisition module) is installed at its specified position, and the image so obtained serves as the preset reference infrared image. In order to obtain a better reference, a plurality of reference infrared images are usually collected and their pixel values are averaged.
Specifically, assuming that the resolution of the infrared image acquired by the infrared image module is 1280 × 720 and that m reference infrared images are acquired, the gray values of the pixel points are calculated in turn across the m images. Taking the calculation of the gray value of one pixel as an example, for the (0,0) point the formula is Average = (1/m) Σ val_i (val_i: the pixel value in the i-th image, i = 1, 2, ..., m). After the gray value of each pixel point is obtained in this way, the preset reference infrared image is obtained.
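A minimal sketch of building the preset reference infrared image as a pixel-wise average is given below; the resolution and the number of reference frames follow the 1280 × 720, m-image example, and the function name is illustrative.

```python
import numpy as np

def build_reference_infrared(images):
    """Pixel-wise mean over m reference infrared frames:
    Average = (1/m) * sum(val_i), computed for every pixel."""
    stack = np.stack([img.astype(np.float64) for img in images], axis=0)
    return stack.mean(axis=0)

# Illustrative use: m = 10 infrared frames at 1280 x 720.
reference_frames = [np.random.randint(0, 256, (720, 1280), dtype=np.uint8) for _ in range(10)]
reference_ir = build_reference_infrared(reference_frames)
```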
Further, the gray value of the infrared image acquired in real time is compared with that of the preset reference infrared image. It is easy to understand that the second comparison result mentioned here has two cases: one is that the gray value of the infrared image exceeds the gray value of the reference infrared image, in other words, a possible fire is detected; the other is that the gray value of the infrared image does not exceed the gray value of the reference infrared image, in other words, no fire is detected.
104: and determining whether a fire exists according to the second comparison result and a plurality of continuous first comparison results.
In order to determine more accurately whether a fire has occurred, it is further required to determine whether the first comparison result and the second comparison result are consistent. Further, when making the determination based on the first comparison result, a plurality of consecutive first comparison results need to be acquired: if the plurality of consecutive first comparison results all exceed the first preset threshold, the result is marked as a suspected fire; if they do not all exceed the first preset threshold, no suspected fire is marked.
For example, when the first comparison result shows that the depth information (gray value) of the depth image fluctuates greatly (in other words, the first comparison result is checked against the first preset threshold), the first comparison result indicates a suspected fire; if the gray value of the depth image is stable and close to the gray value of the reference depth image, the first comparison result indicates no fire. When the second comparison result is that the gray value corresponding to the infrared image exceeds the gray value corresponding to the preset reference infrared image, the second comparison result indicates a suspected fire; if it is smaller than that of the preset reference infrared image, the second comparison result indicates no fire.
Further, only when the first comparison result indicates a suspected fire and the second comparison result also indicates a suspected fire is the final output that a fire is determined to have occurred. This can effectively improve the accuracy of fire detection.
In one or more embodiments of the present invention, the acquiring the depth image and the infrared image may specifically include: acquiring a depth image and an infrared image in real time based on the TOF depth module; wherein the depth image and the infrared image are acquired in the same frame.
In order to ensure the reliability of the comparison results, images may be acquired in real time once per second during detection, and the acquisition frequency can also be adjusted for different time periods. For example, during the daytime when people are present, the acquisition frequency may be once per minute, and after people leave work at night, it may be adjusted to once per second.
In one or more embodiments of the present invention, the comparing the depth image with a preset reference depth image to obtain a first comparison result may specifically include: dividing the depth image to obtain a first depth block image; marking the index number of each depth block in the first depth block image; dividing the preset reference depth image to obtain a first reference depth block image; marking the index number of each depth block in the first reference depth block image; comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number; a first comparison result is obtained.
It should be noted that, when comparing the depth image with the preset reference depth image, a one-to-one comparison may be performed, for example the currently acquired depth image is compared with the depth image of the previous frame (the preset reference depth image); alternatively, multiple frames of depth images may be compared with the preset reference depth image.
Specifically, as shown in fig. 2, assuming that the resolution of the depth image and the preset reference depth image is 400 × 300, each of the two images may be divided into 4 × 3 blocks, i.e., 12 blocks in total. Further, the index numbers of the blocks are marked, for example in order from left to right and from top to bottom, as 1-12.
When comparing, the gray values of the block images corresponding to the same index number are compared. For example, the gray value of the block image with index number 8 in the depth image acquired in real time is compared with the gray value of the block image with index number 8 in the preset reference depth image. It should be noted that, when comparing gray values, the pixels may be compared one by one; for simplicity, the gray average value of each block image may also be obtained and the averages compared.
In one or more embodiments of the present invention, the comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number may specifically include: obtaining a first gray average value of a block corresponding to each depth block index number in the first depth block image; obtaining a first reference gray level mean value of a block corresponding to each depth block index number in the first reference depth block image; and sequentially comparing the depth gray level difference value between the first gray level mean value and the corresponding first reference gray level mean value according to the depth block index number. According to the principle, the depth gray level difference value of the continuous multi-frame first depth image and the first reference depth image is calculated.
As shown in fig. 2, assume that an image with a resolution of 400 × 300 is divided into 4 × 3 blocks, so that the resolution of each block is 100 × 100; the block average can then be computed as AVG = (1/n) Σ val_i (i = 1 to 10000, n = 10000). Assuming that the first gray level mean value corresponding to the depth image block is AVG1 and the first reference gray level mean value corresponding to the same block of the previous frame of depth image (the preset reference depth image) is AVG2, AVG2 is subtracted from AVG1 to obtain the depth gray level difference value AVG3. The depth gray level difference values of a plurality of consecutive frames of depth images are calculated in the same way.
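The block averaging and the depth gray level difference AVG3 = AVG1 - AVG2 described above may be sketched as follows, assuming the 400 × 300 image and 4 × 3 grid of the example; the function names are illustrative.

```python
import numpy as np

def block_means(img, rows=3, cols=4):
    """Split a 300 x 400 gray image into a rows x cols grid of blocks
    (index numbers 1..12, left to right, top to bottom) and return the
    average gray value AVG = (1/n) * sum(val_i) of each block."""
    h, w = img.shape
    bh, bw = h // rows, w // cols
    means, idx = {}, 1
    for r in range(rows):
        for c in range(cols):
            block = img[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            means[idx] = float(block.mean())
            idx += 1
    return means

def depth_gray_difference(current, reference):
    """Per-block difference AVG3 = AVG1 - AVG2 between the current depth
    image and the preset reference depth image."""
    avg1, avg2 = block_means(current), block_means(reference)
    return {i: avg1[i] - avg2[i] for i in avg1}
```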
in one or more embodiments of the present invention, the obtaining the first comparison result may specifically include: comparing the difference value with the sum of the depth gray levels according to a first preset threshold value; and if the sum of the depth gray level difference values is greater than the first preset threshold value, acquiring the corresponding depth block index number to obtain a first comparison result.
Assuming that the first preset threshold is AVG4, after the depth gray level difference value AVG3 is obtained, AVG3 is compared with AVG4. If a suspected fire is considered to occur when a plurality of consecutive values of AVG3 are larger than AVG4, the corresponding depth block index numbers further need to be acquired, and the first comparison result corresponds to those index numbers.
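The consecutive-frame threshold check might then look as follows; requiring AVG3 to exceed AVG4 in every one of the consecutive frames is one possible reading of the condition above, and all names are illustrative.

```python
def suspected_depth_blocks(diff_history, avg4):
    """Depth block index numbers whose difference AVG3 exceeded the first
    preset threshold AVG4 in every one of the consecutive frames.

    diff_history: list of {block_index: AVG3} dicts, one per frame,
    e.g. produced by depth_gray_difference() above."""
    blocks = set(diff_history[0]) if diff_history else set()
    for diffs in diff_history:
        blocks &= {i for i, avg3 in diffs.items() if avg3 > avg4}
    return blocks  # these index numbers form the first comparison result
```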
In one or more embodiments of the present invention, the comparing the infrared image with a preset reference infrared image to obtain a second comparison result may specifically include: dividing the infrared image to obtain a first infrared block image; marking the index number of each infrared block in the first infrared block image; dividing the preset reference infrared image to obtain a first reference infrared block image; marking the index number of each infrared block in the first reference infrared block image; comparing the blocks in the first infrared block image and the first reference infrared block image according to the infrared block index numbers; a second comparison result is obtained.
For example, assuming that the resolution of the infrared image and the preset reference infrared image is 400 × 300, each of the two images may be divided into 4 × 3 blocks, i.e., 12 blocks in total. Further, the index numbers of the blocks are marked, for example in order from left to right and from top to bottom, as 1-12.
When comparing, the gray values of the block images corresponding to the same index number are compared. For example, the gray value of the block image with index number 10 in the infrared image acquired in real time is compared with the gray value of the block image with index number 10 in the preset reference infrared image. It should be noted that, when comparing gray values, the pixels may be compared one by one; for simplicity, the gray average value of each block image may also be obtained and the averages compared.
In one or more embodiments of the present invention, the comparing the patches in the first infrared patch image and the first reference infrared patch image according to the infrared patch index number may specifically include: obtaining a first infrared gray level mean value of a block corresponding to each infrared block index number in the first infrared block image; obtaining a first reference infrared gray level mean value of a block corresponding to each infrared block index number in the first reference infrared block image; and sequentially comparing the infrared gray level difference value between the first infrared gray level average value and the corresponding first reference infrared gray level average value according to the infrared blocking index number.
Assume that an image with a resolution of 400 × 300 is divided into 4 × 3 block images, so that the resolution of each block is 100 × 100; the block average can then be computed as AVG = (1/n) Σ val_i (i = 1 to 10000, n = 10000). Assuming that the first infrared gray level mean value corresponding to the infrared image block is AVG5 and the first reference infrared gray level mean value corresponding to the same block of the reference infrared image is AVG6, AVG6 is subtracted from AVG5 to obtain the infrared gray level difference value AVG7.
In one or more embodiments of the present invention, the obtaining the second comparison result may specifically include: comparing the infrared gray level difference with a second preset threshold value; and if the infrared gray difference value is greater than the second preset threshold value, acquiring the corresponding infrared block index number and acquiring a second comparison result.
Assuming that the second preset threshold is AVG8, after the infrared gray level difference value AVG7 is obtained, AVG7 is compared with AVG8. The required magnitude relation between the infrared gray level difference value and the second preset threshold can be set manually; in practical application, since the difference value may be positive or negative, either the infrared gray level difference value being greater than the second preset threshold or being smaller than it may be used as the condition for judging that a fire is occurring. If a suspected fire is considered to occur when the infrared gray level difference value is greater than the second preset threshold, the corresponding infrared block index number needs to be acquired, and the second comparison result corresponds to that index number.
In one or more embodiments of the present invention, the determining whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results may specifically include: if a plurality of consecutive first comparison results and the second comparison result all indicate the presence of a fire, and the depth block index numbers correspond to the infrared block index numbers, determining that a fire exists.
In order to improve the accuracy of the detection result, the first comparison result and the second comparison result need to be considered together: only when a plurality of consecutive first comparison results are judged to indicate a suspected fire and the second comparison result is also judged to indicate a suspected fire can it be determined that a fire is occurring.
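A sketch of this combination step is given below, reusing the illustrative helpers above; treating the fire decision as the intersection of the depth-suspect blocks and the infrared-suspect blocks is an assumption that mirrors the requirement that the depth block index numbers correspond to the infrared block index numbers.

```python
def fire_blocks(depth_suspect_blocks, infrared_diffs, avg8):
    """A block is reported as a fire location only if it is suspect in the
    consecutive depth comparisons AND its infrared gray level difference
    AVG7 exceeds the second preset threshold AVG8."""
    ir_suspect = {i for i, avg7 in infrared_diffs.items() if avg7 > avg8}
    return depth_suspect_blocks & ir_suspect

# Illustrative use:
#   depth_blocks = suspected_depth_blocks(diff_history, AVG4)
#   ir_diffs = {i: avg5[i] - avg6[i] for i in avg5}
#   fire_detected = bool(fire_blocks(depth_blocks, ir_diffs, AVG8))
```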
In one or more embodiments of the invention, further comprising: and if the fire disaster exists, determining the space position of the fire disaster according to the corresponding relation between the preset space position and the depth image and/or the infrared image.
For example, as shown in fig. 3, assume that fire detection device 1 is installed in room A1 and fire detection device 2 is installed in room A2. When fire detection device 1 detects that a fire has occurred, it sends its device number and the corresponding room number to the control device or to the staff, so that the staff can quickly locate the room where the fire has occurred and carry out fire extinguishing work.
Further, if the monitored space is large, such as a large conference room or a large warehouse, the correspondence between each block and its actual spatial position needs to be established. Assume that the block image with block number 1 corresponds to the door of the warehouse and the block image with block number 2 corresponds to a file storage cabinet. When an alarm is raised, the block numbers can be sent to the staff as well, so that a more precise alarm is achieved.
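The correspondence between block numbers and spatial positions, and the alarm message built from it, might be recorded as simply as the following sketch; the device identifiers and position labels are purely illustrative.

```python
# Hypothetical mapping from (device number, block index) to a spatial label.
BLOCK_POSITIONS = {
    ("device-1", 1): "room A1, warehouse door",
    ("device-1", 2): "room A1, file storage cabinet",
}

def alarm_message(device_id, blocks):
    """Build the alarm text sent to the control device or the staff."""
    places = [BLOCK_POSITIONS.get((device_id, b), f"block {b}") for b in sorted(blocks)]
    return f"Fire detected by {device_id} at: " + "; ".join(places)
```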
Based on the same idea, a fire detection apparatus, as shown in fig. 4, includes:
an obtaining module 41, configured to obtain a depth image and an infrared image through a TOF depth module;
a first comparing module 42, configured to compare the depth image with a preset reference depth image, and obtain a first comparison result;
a second comparison module 43, configured to compare the infrared image with a preset reference infrared image to obtain a second comparison result;
a determining module 44, configured to determine whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results.
Further, the obtaining module 41 is configured to obtain a depth image and an infrared image in real time based on the TOF depth module; wherein the depth image and the infrared image are acquired in the same frame.
Further, the first comparing module 42 is configured to divide the depth image to obtain a first depth block image;
marking the index number of each depth block in the first depth block image;
dividing the preset reference depth image to obtain a first reference depth block image;
marking the index number of each depth block in the first reference depth block image;
comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number;
a first comparison result is obtained.
Further, obtaining a first gray average value of a block corresponding to each depth block index number in the first depth block image;
obtaining a first reference gray level mean value of a block corresponding to each depth block index number in the first reference depth block image;
and sequentially comparing the depth gray level difference value between the first gray level mean value and the corresponding first reference gray level mean value according to the depth block index number.
Further, comparing the depth gray level difference value with a first preset threshold value;
and if the depth gray level difference value is greater than the first preset threshold value, acquiring the corresponding depth block index number to obtain a first comparison result.
A second comparison module 43, configured to divide the infrared image to obtain a first infrared block image;
marking the index number of each infrared block in the first infrared block image;
dividing the preset reference infrared image to obtain a first reference infrared block image;
marking the index number of each infrared block in the first reference infrared block image;
comparing the blocks in the first infrared block image and the first reference infrared block image according to the infrared block index numbers;
a second comparison result is obtained.
Further, obtaining a first infrared gray level mean value of a block corresponding to each infrared block index number in the first infrared block image;
obtaining a first reference infrared gray level mean value of a block corresponding to each infrared block index number in the first reference infrared block image;
and sequentially comparing the infrared gray level difference value between the first infrared gray level average value and the corresponding first reference infrared gray level average value according to the infrared blocking index number.
Further, comparing the infrared gray level difference value with a second preset threshold value;
and if the infrared gray difference value is greater than the second preset threshold value, acquiring the corresponding infrared block index number and acquiring a second comparison result.
Further, the determining module 44 is configured to determine that a fire exists if a plurality of consecutive first comparison results and the second comparison result all indicate the presence of a fire and the depth block index numbers correspond to the infrared block index numbers.
Further, if the fire disaster exists, the space position where the fire disaster occurs is determined according to the corresponding relation between the preset space position and the depth image and/or the infrared image.
Based on the above embodiment, the depth image and the infrared image are acquired simultaneously through the depth module and the infrared module. The gray values of the acquired depth image and infrared image are respectively compared with the previously acquired reference depth image and reference infrared image, and if both the depth gray level difference value and the infrared gray level difference value meet their corresponding preset difference conditions, it is determined that a fire has occurred. When performing the gray value comparison, it is usually necessary to divide each image into a plurality of blocks and to compare based on the average gray value within each block. After the occurrence of a fire is confirmed, the staff are notified in time. By comparing the depth gray level difference value corresponding to the depth image with the infrared gray level difference value corresponding to the infrared image, whether a fire has occurred is finally determined. Furthermore, the position of the fire can be determined based on the images, so that a fire can be detected rapidly and accurately, and accurate position information of the fire and the corresponding image information can be provided to the staff.
Based on the same idea, an embodiment of the present invention further provides an electronic device, including: a memory, a processor; wherein,
the memory is configured to store one or more computer instructions, wherein the one or more computer instructions, when executed by the processor, implement a method comprising:
acquiring a depth image and an infrared image through a TOF depth module;
comparing the depth image with a preset reference depth image to obtain a first comparison result;
comparing the infrared image with a preset reference infrared image to obtain a second comparison result;
and determining whether a fire exists according to the second comparison result and a plurality of continuous first comparison results.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by adding a necessary general hardware platform, and of course can also be implemented by a combination of hardware and software. With this understanding in mind, the above technical solutions, in essence or in the part contributing to the prior art, may be embodied in the form of a computer program product, which may be embodied on one or more computer-usable storage media (including, without limitation, disk storage, CD-ROM and optical storage) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (9)

1. A fire detection method is applied to a TOF depth module and comprises the following steps:
acquiring a depth image and an infrared image through a TOF depth module; wherein, the obtaining of the depth image and the infrared image comprises: acquiring a depth image and an infrared image in real time based on the TOF depth module; wherein the depth image and the infrared image are acquired in the same frame;
comparing the depth image with a preset reference depth image to obtain a first comparison result; wherein the comparing the depth image with a preset reference depth image to obtain a first comparison result comprises: dividing the depth image to obtain a first depth block image; marking the index number of each depth block in the first depth block image; dividing the preset reference depth image to obtain a first reference depth block image; marking the index number of each depth block in the first reference depth block image; comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number; obtaining a first comparison result;
comparing the infrared image with a preset reference infrared image to obtain a second comparison result;
determining whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results, which comprises: if a plurality of consecutive first comparison results and the second comparison result all indicate the presence of a fire, and the depth block index numbers correspond to the infrared block index numbers, determining that a fire exists; the infrared block index numbers being obtained by dividing the infrared image into blocks and marking the blocks.
2. The method of claim 1, wherein comparing the tiles in the first depth tile image and the first reference depth tile image according to the depth tile index number comprises:
obtaining a first gray average value of a block corresponding to each depth block index number in the first depth block image;
obtaining a first reference gray level mean value of a block corresponding to each depth block index number in the first reference depth block image;
and sequentially comparing the depth gray level difference value between the first gray level mean value and the corresponding first reference gray level mean value according to the depth block index number.
3. The method of claim 2, wherein obtaining the first comparison result comprises:
comparing the depth gray level difference value with a first preset threshold value;
and if the depth gray level difference value is greater than the first preset threshold value, acquiring the corresponding depth block index number to obtain a first comparison result.
4. The method according to claim 1, wherein said comparing said infrared image with a preset reference infrared image to obtain a second comparison result comprises:
dividing the infrared image to obtain a first infrared block image;
marking the index number of each infrared block in the first infrared block image;
dividing the preset reference infrared image to obtain a first reference infrared block image;
marking the index number of each infrared block in the first reference infrared block image;
comparing the blocks in the first infrared block image and the first reference infrared block image according to the infrared block index numbers;
a second comparison result is obtained.
5. The method of claim 4, wherein comparing the patches in the first infrared patch image and the first reference infrared patch image according to the infrared patch index numbers comprises:
obtaining a first infrared gray level mean value of a block corresponding to each infrared block index number in the first infrared block image;
obtaining a first reference infrared gray level mean value of a block corresponding to each infrared block index number in the first reference infrared block image;
and sequentially comparing the infrared gray level difference value between the first infrared gray level average value and the corresponding first reference infrared gray level average value according to the infrared blocking index number.
6. The method of claim 5, wherein obtaining the second comparison result comprises:
comparing the infrared gray level difference with a second preset threshold value;
and if the infrared gray difference value is greater than the second preset threshold value, acquiring the corresponding infrared block index number and acquiring a second comparison result.
7. The method of claim 1, further comprising:
and if the fire disaster exists, determining the space position of the fire disaster according to the corresponding relation between the preset space position and the depth image and/or the infrared image.
8. A fire detection device, comprising:
the acquisition module is used for acquiring a depth image and an infrared image; wherein, the obtaining of the depth image and the infrared image comprises: acquiring a depth image and an infrared image in real time based on a TOF depth module; wherein the depth image and the infrared image are acquired in the same frame;
the first comparison module is used for comparing the depth image with a preset reference depth image to obtain a first comparison result; wherein the comparing the depth image with a preset reference depth image to obtain a first comparison result comprises: dividing the depth image to obtain a first depth block image; marking the index number of each depth block in the first depth block image; dividing the preset reference depth image to obtain a first reference depth block image; marking the index number of each depth block in the first reference depth block image; comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number; obtaining a first comparison result;
the second comparison module is used for comparing the infrared image with a preset reference infrared image to obtain a second comparison result;
a determining module for determining whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results, which comprises: if a plurality of consecutive first comparison results and the second comparison result all indicate the presence of a fire, and the depth block index numbers correspond to the infrared block index numbers, determining that a fire exists; the infrared block index numbers being obtained by dividing the infrared image into blocks and marking the blocks.
9. An electronic device, comprising: a memory, a processor; wherein,
the memory is to store one or more computer instructions, wherein the one or more computer instructions, when executed by the processor, implement the fire detection method of any of claims 1 to 7.
CN201811151379.0A 2018-09-29 2018-09-29 Fire detection method, device and equipment Active CN109118702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811151379.0A CN109118702B (en) 2018-09-29 2018-09-29 Fire detection method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811151379.0A CN109118702B (en) 2018-09-29 2018-09-29 Fire detection method, device and equipment

Publications (2)

Publication Number Publication Date
CN109118702A CN109118702A (en) 2019-01-01
CN109118702B true CN109118702B (en) 2021-07-20

Family

ID=64857753

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811151379.0A Active CN109118702B (en) 2018-09-29 2018-09-29 Fire detection method, device and equipment

Country Status (1)

Country Link
CN (1) CN109118702B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109916518A (en) * 2019-03-18 2019-06-21 杜峰斌 A kind of smoking detection device, system and method
CN110097732B (en) * 2019-05-08 2021-07-20 江西省天眼科技有限公司 Flame detection monitoring device and processing method thereof
WO2021011300A1 (en) 2019-07-18 2021-01-21 Carrier Corporation Flame detection device and method
CN112785587B (en) * 2021-02-04 2024-05-31 上海电气集团股份有限公司 Foreign matter detection method, system, equipment and medium in stacking production process

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101679148B1 (en) * 2015-06-15 2016-12-06 동의대학교 산학협력단 Detection System of Smoke and Flame using Depth Camera
CN106954021A (en) * 2017-03-03 2017-07-14 天津天地伟业信息系统集成有限公司 Infrared thermal imaging monitoring intelligent camera shield cloud deck
EP3428591A1 (en) * 2017-07-12 2019-01-16 Honeywell International Inc. Flame detector field of view verification via reverse infrared signaling

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3269453B2 (en) * 1998-04-08 2002-03-25 三菱電機株式会社 Fire detection system
CN101673448B (en) * 2009-09-30 2012-03-21 青岛科恩锐通信息技术有限公司 Method and system for detecting forest fire
CN103065413B (en) * 2012-12-13 2016-01-20 中国电子科技集团公司第十五研究所 Obtain method and the device of fire size class information
CN104504896B (en) * 2014-08-15 2017-06-27 上海市政工程设计研究总院(集团)有限公司 A kind of traffic offence automatic production record and method
CN105512667B (en) * 2014-09-22 2019-01-15 中国石油化工股份有限公司 Infrared and visible light video image fusion recognition fire method
CN105913604B (en) * 2016-05-18 2018-03-20 中国计量大学 Assay method and its device occur for the fire based on unmanned plane
CN107577981A (en) * 2016-07-04 2018-01-12 高德信息技术有限公司 A kind of road traffic index identification method and device
CN106251567B (en) * 2016-10-11 2019-06-21 广东工业大学 A kind of Intellective Fire Alarm System
CN106485229A (en) * 2016-10-14 2017-03-08 黑龙江科技大学 Agricultural ecotone remote sensing monitoring and early warning fire system
CN108131786B (en) * 2017-10-31 2020-05-29 珠海格力电器股份有限公司 Control method and device of air conditioner

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101679148B1 (en) * 2015-06-15 2016-12-06 동의대학교 산학협력단 Detection System of Smoke and Flame using Depth Camera
CN106954021A (en) * 2017-03-03 2017-07-14 天津天地伟业信息系统集成有限公司 Infrared thermal imaging monitoring intelligent camera shield cloud deck
EP3428591A1 (en) * 2017-07-12 2019-01-16 Honeywell International Inc. Flame detector field of view verification via reverse infrared signaling

Also Published As

Publication number Publication date
CN109118702A (en) 2019-01-01

Similar Documents

Publication Publication Date Title
CN109118702B (en) Fire detection method, device and equipment
US10317206B2 (en) Location determination processing device and storage medium
CN107438766B (en) Image-based monitoring system
US20170109981A1 (en) System and method of using a fire spread forecast and bim to guide occupants using smart signs
CN108629254B (en) Moving target detection method and device
CN110659606A (en) Fire fighting access occupation identification method and device, computer equipment and storage medium
CN112383756B (en) Video monitoring alarm processing method and device
Nghiem et al. Background subtraction in people detection framework for RGB-D cameras
KR20200119921A (en) Intelligent fire identification apparatus, intelligent fire identification method and recording medium
CN115866190A (en) Security monitoring method and device and security monitoring system
CN105701939A (en) Portable terminal capable of carrying out environment monitoring and control method thereof
CN116307740A (en) Fire point analysis method, system, equipment and medium based on digital twin city
TWI679400B (en) System and method for building structural safety detection
CN111228706A (en) GIS-based building interior fire fighting equipment control method and device
JP2020071698A (en) Fire detection device, fire detection method, and fire monitoring system
US20230260383A1 (en) Visible range detection system
CN109074714B (en) Detection apparatus, method and storage medium for detecting event
KR101853700B1 (en) Indoor localization system in disaster relief and localization method thereof
CN113701893B (en) Temperature measurement method, device, equipment and storage medium
CN106507046B (en) Alarm information generation method and device
CN111614938B (en) Risk identification method and device
CN110619737B (en) Joint defense warning method and device
JP6550145B2 (en) INFORMATION PROCESSING SYSTEM AND INFORMATION PROCESSING METHOD
CN115578827B (en) Method and system for tracking dangerous chemical article receiving container based on AI video
CN117975642B (en) Intelligent management system for fire-fighting facilities

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201027

Address after: 261061 north of Yuqing East Street, east of Dongming Road, Weifang High tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 261031 No. 268 Dongfang Road, Weifang hi tech Industrial Development Zone, Shandong, Weifang

Applicant before: GOERTEK Inc.

CB02 Change of applicant information

Address after: 261061 east of Dongming Road, north of Yuqing East Street, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 261061 North of Yuqing East Street, east of Dongming Road, Weifang High-tech Zone, Weifang City, Shandong Province (Room 502, Goertek Office Building)

Applicant before: GoerTek Optical Technology Co.,Ltd.

GR01 Patent grant