Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used in the embodiments of the present invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and "a plurality of" generally means at least two but does not exclude the case of at least one, unless the context clearly dictates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that an article or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such an article or system. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other like elements in an article or system that includes the element.
In addition, the sequence of steps in each method embodiment described below is only an example and is not strictly limited.
Fig. 1 is a schematic flow chart of a fire detection method according to an embodiment of the present invention, and the main steps of performing indoor fire detection include:
101: Acquire a depth image and an infrared image through the TOF depth module.
It is easy to understand that the TOF (Time of Flight) depth module emits infrared light and captures a depth image and an infrared image in the same frame. In practical applications, when images are acquired through the TOF depth module, they are acquired continuously at a set frequency. When comparing images, two consecutive adjacent frames may be compared, or the current image may be compared with a pre-stored reference image.
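For illustration only, the continuous acquisition described above might be sketched as follows in Python; the tof_sensor object, its read_frame() call, and the frame attributes are hypothetical placeholders rather than a specific SDK interface.

import time
import numpy as np

def acquire_frames(tof_sensor, period_s=1.0):
    """Continuously yield (depth_image, infrared_image) pairs captured in the
    same frame, at a set acquisition frequency."""
    while True:
        frame = tof_sensor.read_frame()        # hypothetical driver call
        depth = np.asarray(frame.depth)        # depth image of the scene (assumed attribute)
        infrared = np.asarray(frame.infrared)  # infrared image of the same frame (assumed attribute)
        yield depth, infrared
        time.sleep(period_s)                   # set acquisition frequency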
102: Compare the depth image with a preset reference depth image to obtain a first comparison result.
Here, the preset reference depth image refers to the depth image of the frame preceding the currently acquired depth image. During comparison, each of the depth images collected in real time is compared with the preset reference image formed by its preceding frame.
Specifically, the gray values of the depth image acquired in real time are compared with those of the preset reference depth image: the pixel-wise difference between the current depth image and the depth image of the previous frame (that is, the preset reference depth image) is calculated, the sum of the squares of these differences is taken as the frame difference, and the frame differences of a plurality of consecutive frames are then summed.
If the depth values of a plurality of consecutive depth images jump sharply relative to the gray values of their respective preset reference depth images (their preceding frames), that is, if the gray difference exceeds a first preset threshold, the depth at that position can be judged to vary greatly and to be in an unstable state, and the sum of the frame differences over the consecutive depth images will be large. If this sum exceeds the set depth difference threshold, a fire is considered to have possibly occurred; otherwise, no fire is detected.
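A minimal sketch of this comparison, assuming the depth images are grayscale arrays of equal size; the function name and the threshold value are illustrative placeholders.

import numpy as np

def depth_jump_detected(depth_frames, depth_diff_threshold):
    """Sum the squared frame-to-frame differences over consecutive depth
    images and flag a possible fire if the accumulated sum exceeds the
    set depth difference threshold."""
    total = 0.0
    for prev, curr in zip(depth_frames[:-1], depth_frames[1:]):
        diff = curr.astype(np.float64) - prev.astype(np.float64)
        total += float(np.sum(diff ** 2))  # sum of squared pixel differences for this frame pair
    return total > depth_diff_threshold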
103: Compare the infrared image with a preset reference infrared image to obtain a second comparison result.
The preset reference infrared image is an infrared image acquired by the TOF module after the fire detection device (which comprises a depth image acquisition module and an infrared image acquisition module) is installed at a specified position. To obtain a better reference, a plurality of reference infrared images are usually collected and their pixel values are averaged.
Specifically, assume that the resolution of the infrared image acquired by the infrared image module is 1280 × 720 and that m reference infrared images are acquired. The gray value of each pixel is then calculated across the m images. Taking one pixel as an example, say the point (0, 0), the gray value is Average = (1/m) Σ val_i (val_i: the pixel value at that point in the i-th image; i = 1, 2, ..., m). After the gray value of each pixel has been obtained in this way, the preset reference infrared image is obtained.
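As an illustrative sketch, the per-pixel averaging of the m reference infrared images could be written as follows, assuming all frames are arrays of the same resolution; the function name is a placeholder.

import numpy as np

def build_reference_infrared(reference_frames):
    """Average m reference infrared images pixel by pixel:
    Average = (1/m) * sum(val_i), i = 1, 2, ..., m."""
    stack = np.stack([f.astype(np.float64) for f in reference_frames])
    return stack.mean(axis=0)  # per-pixel mean, e.g. the value at point (0, 0)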
Further, the gray values of the infrared image acquired in real time are compared with those of the preset reference infrared image. It is easy to understand that the second comparison result mentioned here has two cases: either the gray value of the infrared image exceeds that of the reference infrared image, in other words a possible fire is detected, or the gray value of the infrared image does not exceed that of the reference infrared image, in other words no fire is detected.
104: Determine whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results.
In order to determine more accurately whether a fire has occurred, it is further necessary to determine whether the first comparison result and the second comparison result are consistent. Further, when the determination is made based on the first comparison result, a plurality of consecutive first comparison results need to be acquired: if all of them exceed the first preset threshold, a suspected fire is marked; if the consecutive first comparison results do not all exceed the first preset threshold, no suspected fire is marked.
For example, when the first comparison result shows that the depth information (gray value) of the depth image fluctuates widely (in other words, the first comparison result exceeds the first preset threshold), the first comparison result indicates a suspected fire; if the gray value of the depth image is stable and close to that of the reference depth image, the first comparison result indicates no fire. When the second comparison result shows that the gray value of the infrared image exceeds that of the preset reference infrared image, the second comparison result indicates a suspected fire; if it is smaller than that of the preset reference infrared image, the second comparison result indicates no fire.
Further, only when the first comparison result indicates a suspected fire and the second comparison result also indicates a suspected fire does the final output indicate that a fire has occurred. This effectively improves the accuracy of fire detection.
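The combination rule amounts to a logical AND of the two kinds of results; a minimal sketch, with illustrative parameter names:

def fire_detected(consecutive_first_results, second_result):
    """Report a fire only when every one of the consecutive first comparison
    results and the second comparison result all indicate a suspected fire."""
    return all(consecutive_first_results) and second_result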
In one or more embodiments of the present invention, the acquiring the depth image and the infrared image may specifically include: acquiring a depth image and an infrared image in real time based on the TOF depth module; wherein the depth image and the infrared image are acquired in the same frame.
In order to ensure the reliability of the comparison result, images may be acquired in real time at a set frequency during detection, for example once per second, and the acquisition frequency may be adjusted for different time periods. For example, during the daytime when people are present, the acquisition frequency may be once a minute; after people go off duty at night, it may be adjusted to once a second.
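A possible scheduling sketch for such an adjustment; the 9:00-18:00 working-hour boundary below is an assumption used only for illustration.

from datetime import datetime

def acquisition_period_seconds(now=None):
    """Sample once a minute during working hours and once a second at night;
    the hour boundaries are illustrative only."""
    now = now or datetime.now()
    return 60.0 if 9 <= now.hour < 18 else 1.0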
In one or more embodiments of the present invention, the comparing the depth image with a preset reference depth image to obtain a first comparison result may specifically include: dividing the depth image to obtain a first depth block image; marking the index number of each depth block in the first depth block image; dividing the preset reference depth image to obtain a first reference depth block image; marking the index number of each depth block in the first reference depth block image; comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number; a first comparison result is obtained.
It should be noted that, when comparing the depth image with the preset reference depth image, the comparison may be performed frame by frame, for example by comparing the currently acquired depth image one-to-one with the depth image of the previous frame (the preset reference depth image); alternatively, a plurality of frames of depth images may each be compared with the preset reference depth image.
Specifically, as shown in Fig. 2, assuming that the resolution of the depth image and the preset reference depth image is 400 × 300, each of the two images may be divided into 4 × 3 = 12 blocks. Further, the blocks are marked with index numbers, for example in order from left to right and from top to bottom, numbered 1 to 12.
When comparing, the gray values of the block images with the same index number can be compared. For example, the gray value of the block with index number 8 in the depth image acquired in real time is compared with the gray value of the block with index number 8 in the preset reference depth image. It should be noted that, when comparing gray values, each pixel may be compared in turn; for simplification, the gray mean of each block image may instead be obtained and used for the comparison.
In one or more embodiments of the present invention, the comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number may specifically include: obtaining a first gray average value of the block corresponding to each depth block index number in the first depth block image; obtaining a first reference gray level mean value of the block corresponding to each depth block index number in the first reference depth block image; and sequentially calculating, according to the depth block index number, the depth gray level difference value between each first gray average value and the corresponding first reference gray level mean value. According to this principle, the depth gray level difference values between the first depth images of a plurality of consecutive frames and the first reference depth images are calculated.
As shown in Fig. 2, assume that an image with a resolution of 400 × 300 is divided into 4 × 3 blocks, so that the resolution of each block is 100 × 100. The block mean can then be computed as AVG = (1/n) Σ val_i (i = 1 to 10000, where n = 10000 is the number of pixels in the block). Assume that the first gray level mean value corresponding to a block of the depth image is AVG1 and the first reference gray level mean value corresponding to the same block of the previous frame of depth image (the preset reference depth image) is AVG2; subtracting AVG2 from AVG1 gives the depth gray level difference value AVG3. The depth gray level difference values of a plurality of consecutive frames of depth images are then calculated in the same way.
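A sketch of the block division and block-mean differencing described above, assuming the 400 × 300 images are stored as NumPy arrays of shape (300, 400); the function names are illustrative.

import numpy as np

def block_means(image, rows=3, cols=4):
    """Split an image into rows x cols blocks (4 x 3 = 12 blocks of 100 x 100
    pixels for a 400 x 300 image) and return the gray-level mean of each block,
    indexed 1..12 from left to right and top to bottom."""
    height, width = image.shape
    bh, bw = height // rows, width // cols
    means, index = {}, 1
    for r in range(rows):
        for c in range(cols):
            block = image[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            means[index] = float(block.mean())  # AVG = (1/n) * sum(val_i)
            index += 1
    return means

def depth_block_differences(current_means, reference_means):
    """AVG3 = AVG1 - AVG2 for each depth block index number."""
    return {i: current_means[i] - reference_means[i] for i in current_means}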
in one or more embodiments of the present invention, the obtaining the first comparison result may specifically include: comparing the difference value with the sum of the depth gray levels according to a first preset threshold value; and if the sum of the depth gray level difference values is greater than the first preset threshold value, acquiring the corresponding depth block index number to obtain a first comparison result.
Assume that the first preset threshold is AVG4. After the depth gray level difference value AVG3 is obtained, AVG3 is compared with AVG4. If a suspected fire is considered to be occurring when a plurality of consecutive values of AVG3 are each larger than AVG4, the corresponding depth block index numbers further need to be acquired, and the first comparison result corresponds to these index numbers.
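One way to express this check over a plurality of consecutive frames is to keep only the block index numbers whose AVG3 exceeds AVG4 in every frame; how "consecutive" results are combined is an assumption of this sketch.

def suspected_depth_blocks(per_frame_differences, first_threshold):
    """Return the depth block index numbers whose difference AVG3 exceeds the
    first preset threshold AVG4 in every one of the consecutive frames."""
    suspect = None
    for frame_diffs in per_frame_differences:  # one dict of AVG3 values per frame
        exceeding = {i for i, d in frame_diffs.items() if d > first_threshold}
        suspect = exceeding if suspect is None else suspect & exceeding
    return suspect or set()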
In one or more embodiments of the present invention, the comparing the infrared image with a preset reference infrared image to obtain a second comparison result may specifically include: dividing the infrared image to obtain a first infrared block image; marking the index number of each infrared block in the first infrared block image; dividing the preset reference infrared image to obtain a first reference infrared block image; marking the index number of each infrared block in the first reference infrared block image; comparing the blocks in the first infrared block image and the first reference infrared block image according to the infrared block index numbers; a second comparison result is obtained.
For example, assuming that the resolution of the infrared image and the preset reference infrared image is 400 × 300, each of the two images may be divided into 4 × 3 = 12 blocks. Further, the blocks are marked with index numbers, for example in order from left to right and from top to bottom, numbered 1 to 12.
When comparing, the gray values of the block images with the same index number can be compared. For example, the gray value of the block with index number 10 in the infrared image acquired in real time is compared with the gray value of the block with index number 10 in the preset reference infrared image. It should be noted that, when comparing gray values, each pixel may be compared in turn; for simplification, the gray mean of each block image may instead be obtained and used for the comparison.
In one or more embodiments of the present invention, the comparing the blocks in the first infrared block image and the first reference infrared block image according to the infrared block index number may specifically include: obtaining a first infrared gray level mean value of the block corresponding to each infrared block index number in the first infrared block image; obtaining a first reference infrared gray level mean value of the block corresponding to each infrared block index number in the first reference infrared block image; and sequentially calculating, according to the infrared block index number, the infrared gray level difference value between each first infrared gray level mean value and the corresponding first reference infrared gray level mean value.
Assume that an image with a resolution of 400 × 300 is divided into 4 × 3 blocks, so that the resolution of each block is 100 × 100, and the block mean AVG = (1/n) Σ val_i (i = 1 to 10000, where n = 10000) can be used. Assume that the infrared gray level mean value corresponding to a block of the infrared image is AVG5 and the reference infrared gray level mean value corresponding to the same block of the reference infrared image is AVG6; subtracting AVG6 from AVG5 gives the infrared gray level difference value AVG7.
In one or more embodiments of the present invention, the obtaining the second comparison result may specifically include: comparing the infrared gray level difference value with a second preset threshold value; and if the infrared gray level difference value is greater than the second preset threshold value, acquiring the corresponding infrared block index number to obtain the second comparison result.
Assume that the second preset threshold is AVG8. After the infrared gray level difference value AVG7 is obtained, AVG7 is compared with AVG8. The direction of the comparison between the infrared gray level difference value and the second preset threshold can be set manually: in practical applications, because the difference value can be positive or negative, either the difference being greater than the second preset threshold or the difference being smaller than the second preset threshold may be used as the condition for judging that a fire has occurred. If a suspected fire is considered to be occurring when the infrared gray level difference value is greater than the second preset threshold, the corresponding infrared block index numbers need to be acquired, and the second comparison result corresponds to these index numbers.
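A sketch of this configurable comparison direction; the parameter names are illustrative.

def suspected_infrared_blocks(ir_differences, second_threshold, greater_means_fire=True):
    """Flag infrared block index numbers whose difference AVG7 crosses the
    second preset threshold AVG8; the comparison direction is configurable
    because the difference can be positive or negative."""
    if greater_means_fire:
        return {i for i, d in ir_differences.items() if d > second_threshold}
    return {i for i, d in ir_differences.items() if d < second_threshold}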
In one or more embodiments of the present invention, the determining whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results may specifically include: if a plurality of consecutive first comparison results and the second comparison result all indicate a fire, and the depth block index numbers correspond to the infrared block index numbers, determining that a fire exists.
In order to improve the accuracy of the detection result, it is necessary to combine the first comparison result and the second comparison result: only when a plurality of consecutive first comparison results are all judged to indicate a suspected fire and the second comparison result is also judged to indicate a suspected fire can it be determined that a fire is occurring.
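Requiring matching block index numbers can be expressed as a set intersection, for example (a minimal sketch building on the two helper functions above):

def confirm_fire(depth_suspect_blocks, infrared_suspect_blocks):
    """A fire is confirmed only for block index numbers flagged by both the
    consecutive depth comparisons and the infrared comparison."""
    return depth_suspect_blocks & infrared_suspect_blocks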
In one or more embodiments of the invention, the method further comprises: if a fire exists, determining the spatial position where the fire occurs according to the preset correspondence between spatial positions and the depth image and/or the infrared image.
For example, as shown in Fig. 3, assume that fire detection device 1 is installed in room A1 and fire detection device 2 is installed in room A2. When fire detection device 1 detects that a fire has occurred, it sends its device number and the corresponding room number to the control device or to the staff, so that the staff can quickly find the room where the fire occurred and carry out fire extinguishing.
Further, if the monitored space is large, such as a large conference room or a large warehouse, the correspondence between each block and the actual spatial position needs to be established. Assume that the block image with block number 1 corresponds to the door of the warehouse and the block image with block number 2 corresponds to a file storage cabinet. When an alarm is raised, the block numbers can be sent to the staff together with the alarm, so that the alarm is more precise.
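For illustration, the correspondence between block numbers and locations could be a simple lookup table configured at installation time; the location names and message format below are assumptions taken from the example above.

BLOCK_LOCATIONS = {1: "warehouse door", 2: "file storage cabinet"}  # configured at installation

def alarm_message(device_id, room_id, fire_block_numbers):
    """Compose an alarm that names the device, the room, and the affected blocks."""
    locations = [BLOCK_LOCATIONS.get(i, f"block {i}") for i in sorted(fire_block_numbers)]
    return f"Fire detected by device {device_id} in room {room_id}: {', '.join(locations)}"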
Based on the same idea, an embodiment of the present invention further provides a fire detection apparatus which, as shown in Fig. 4, includes:
an obtaining module 41, configured to obtain a depth image and an infrared image through a TOF depth module;
a first comparing module 42, configured to compare the depth image with a preset reference depth image, and obtain a first comparison result;
a second comparison module 43, configured to compare the infrared image with a preset reference infrared image to obtain a second comparison result;
a determining module 44, configured to determine whether a fire exists according to the second comparison result and a plurality of consecutive first comparison results.
Further, the obtaining module 41 is configured to obtain a depth image and an infrared image in real time based on the TOF depth module; wherein the depth image and the infrared image are acquired in the same frame.
Further, the first comparing module 42 is configured to divide the depth image to obtain a first depth block image;
marking the index number of each depth block in the first depth block image;
dividing the preset reference depth image to obtain a first reference depth block image;
marking the index number of each depth block in the first reference depth block image;
comparing the blocks in the first depth block image and the first reference depth block image according to the depth block index number;
a first comparison result is obtained.
Further, obtaining a first gray average value of a block corresponding to each depth block index number in the first depth block image;
obtaining a first reference gray level mean value of a block corresponding to each depth block index number in the first reference depth block image;
and sequentially calculating, according to the depth block index number, the depth gray level difference value between the first gray level mean value and the corresponding first reference gray level mean value.
Further, comparing the depth gray level difference value with a first preset threshold value;
and if the depth gray level difference value is greater than the first preset threshold value, acquiring the corresponding depth block index number to obtain a first comparison result.
A second comparison module 43, configured to divide the infrared image to obtain a first infrared block image;
marking the index number of each infrared block in the first infrared block image;
dividing the preset reference infrared image to obtain a first reference infrared block image;
marking the index number of each infrared block in the first reference infrared block image;
comparing the blocks in the first infrared block image and the first reference infrared block image according to the infrared block index numbers;
a second comparison result is obtained.
Further, obtaining a first infrared gray level mean value of a block corresponding to each infrared block index number in the first infrared block image;
obtaining a first reference infrared gray level mean value of a block corresponding to each infrared block index number in the first reference infrared block image;
and sequentially calculating, according to the infrared block index number, the infrared gray level difference value between the first infrared gray level mean value and the corresponding first reference infrared gray level mean value.
Further, comparing the infrared gray level difference value with a second preset threshold value;
and if the infrared gray difference value is greater than the second preset threshold value, acquiring the corresponding infrared block index number to obtain a second comparison result.
Further, the determining module 44 is configured to determine whether a fire occurs, and to determine that a fire exists if a plurality of consecutive first comparison results and the second comparison result all indicate a fire and the depth block index numbers correspond to the infrared block index numbers.
Further, if a fire exists, the spatial position where the fire occurs is determined according to the preset correspondence between spatial positions and the depth image and/or the infrared image.
Based on the above embodiments, the depth image and the infrared image are acquired simultaneously through the TOF depth module. The gray values of the acquired depth image and infrared image are compared with those of the previously acquired reference depth image and reference infrared image, respectively, and if the depth gray difference and the infrared gray difference both satisfy their corresponding preset difference conditions, it is determined that a fire has occurred. When comparing gray values, the images are typically divided into a plurality of blocks and compared on the basis of the average gray value within each block. After the occurrence of a fire is confirmed, the staff are informed in time. By combining the depth gray difference corresponding to the depth image with the infrared gray difference corresponding to the infrared image, whether a fire has occurred is finally determined; furthermore, the position of the fire can be determined from the images, so that a fire can be detected quickly and accurately, and accurate position information of the fire together with the corresponding image information can be provided to the staff.
Based on the same idea, an embodiment of the present invention further provides an electronic device, including: a memory, a processor; wherein,
the memory is configured to store one or more computer instructions, wherein the one or more computer instructions, when executed by the processor, implement a method comprising:
acquiring a depth image and an infrared image through a TOF depth module;
comparing the depth image with a preset reference depth image to obtain a first comparison result;
comparing the infrared image with a preset reference infrared image to obtain a second comparison result;
and determining whether a fire exists according to the second comparison result and a plurality of continuous first comparison results.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by a combination of hardware and software. Based on this understanding, the above technical solutions, in essence or in the part contributing to the prior art, may be embodied in the form of a computer program product, which may be stored on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include both permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.