WO2024083134A1 - Fire determination method, system and apparatus, and storage medium - Google Patents

Fire determination method, system and apparatus, and storage medium

Info

Publication number
WO2024083134A1
WO2024083134A1 (PCT application no. PCT/CN2023/125066)
Authority
WO
WIPO (PCT)
Prior art keywords
fire
image
location
value
heat source
Prior art date
Application number
PCT/CN2023/125066
Other languages
French (fr)
Chinese (zh)
Inventor
周璐琼
Original Assignee
浙江华感科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 浙江华感科技有限公司
Publication of WO2024083134A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G06V10/245 Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour

Definitions

  • the present invention relates to the field of fire monitoring and image processing, and in particular to a fire determination method, system, device and storage medium.
  • infrared thermal imaging uses infrared detectors to detect the heat emitted by objects, and infrared thermal images can be obtained with the use of optical imaging lenses; by matching the image one by one with the heat distribution field on the surface of the object, high-temperature objects such as flames and the sun can be detected. This method is widely used in forest fire prevention, fire monitoring and other fields.
  • the method of fire detection based on temperature has a high false detection rate.
  • the sun or sunlight reflection points, as well as high-temperature objects such as car engines and exhaust pipes are sometimes mistaken as fire sources, resulting in false fire alarms.
  • This application provides a fire judgment method, system, device and storage medium to solve the problem of false fire alarms caused by misjudging the sun, sunlight reflection points, or the positions of car engines and exhaust pipes as the fire source.
  • the specific implementation scheme is as follows:
  • one of the contents of the present invention provides a fire judgment method, the method comprising: determining a first fire location based on a first image of a target area; determining a matching area of the first fire location in a second image of the target area, and extracting color information of the matching area, wherein the shooting time difference between the first image and the second image satisfies a preset condition; based on the color information, judging whether the fire corresponding to the first fire location is a real fire.
  • the first image is an infrared thermal image
  • the second image is a visible light image
  • the color information includes RGB values
  • judging whether the fire corresponding to the first fire location is a real fire based on the color information includes: in response to the RGB value satisfying a first judgment condition, determining that the fire corresponding to the first fire location is the real fire; in response to the RGB value not satisfying the first judgment condition, determining that the fire corresponding to the first fire location is a non-real fire.
  • the first judgment condition includes: the R value and the G value of the RGB value satisfy a first threshold range, and the B value satisfies a second threshold range.
  • the method further comprises: in response to the fire corresponding to the first fire location being a non-real fire, determining an interference category of the non-real fire.
  • determining the interference category of the non-real fire includes: in response to the RGB value satisfying a second judgment condition, determining that the interference category is a reflective point heat source; in response to the RGB value not satisfying the second judgment condition, determining that the interference category is a non-reflective point heat source.
  • the second judgment condition includes: the R value, the G value and the B value of the RGB value satisfy a third threshold range.
  • the non-reflective point heat source includes a mobile heat source and a stationary heat source.
  • after determining that the interference category of the non-real fire corresponding to the first fire location is the non-reflective point heat source, the method further includes: determining a second fire location based on the first image, the second fire location and the first fire location corresponding to the same fire area; determining whether the second fire location and the first fire location satisfy a third judgment condition; in response to the third judgment condition being satisfied, determining that the interference category is the stationary heat source; in response to the third judgment condition not being satisfied, determining that the interference category is the mobile heat source.
  • the third judgment condition includes: the second fire location is consistent with the first fire location.
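The interference-classification flow summarized in the preceding items can be sketched as follows. This is an illustrative sketch only: the function name, the location representation and especially the threshold range `THIRD_RANGE` are hypothetical placeholders and are not values disclosed in this application.

```python
# Hypothetical third threshold range, applied to R, G and B alike.
THIRD_RANGE = (220, 255)

def interference_category(rgb, first_fire_location, second_fire_location):
    """Classify a non-real fire as a reflective point, stationary or mobile heat source."""
    # Second judgment condition: all three channels fall within the third
    # threshold range (a bright, near-white patch suggests a reflective point).
    if all(THIRD_RANGE[0] <= v <= THIRD_RANGE[1] for v in rgb):
        return "reflective point heat source"
    # Third judgment condition: the second fire location is consistent with
    # the first fire location, i.e. the heat source has not moved.
    if second_fire_location == first_fire_location:
        return "stationary heat source"
    return "mobile heat source"
```

For instance, a near-white patch such as `(240, 235, 250)` would be classified as a reflective point heat source under these placeholder thresholds, while a darker patch whose location changes between the two images would be classified as a mobile heat source.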
  • one of the contents of the present invention provides a fire situation judgment system, the system comprising: a determination module, used to determine a first fire location based on a first image of a target area; an extraction module, used to determine a matching area of the first fire location in a second image of the target area, and extract color information of the matching area, and a shooting time difference between the first image and the second image satisfies a preset condition; a judgment module, used to judge whether the fire corresponding to the first fire location is a real fire based on the color information.
  • one of the contents of the present invention provides a fire situation judgment device, which includes at least one processor and at least one memory; the at least one memory is used to store computer instructions; the at least one processor is used to execute at least part of the computer instructions to implement a fire situation judgment method.
  • one of the contents of the present invention provides a computer-readable storage medium, wherein the storage medium stores computer instructions.
  • the computer reads the computer instructions in the storage medium, the computer executes a fire situation determination method.
  • the beneficial effects brought about by the above invention include but are not limited to: (1) the relevant features of the visible light image (for example, color information) can assist in judging the authenticity of a fire location that has been judged as a suspected fire from the infrared thermal image grayscale map, eliminating misjudgments in many situations, such as strong sun reflection points, high-temperature locations such as stationary car engines or exhaust pipes, and moving high-temperature objects; (2) the RGB information of the visible light image can mitigate the problem that the infrared thermal image grayscale map is easily affected by weather, season and time, thereby improving the accuracy of fire judgment.
  • FIG1 is a schematic diagram of an application scenario of a fire situation judgment system according to some embodiments of this specification.
  • FIG2 is an exemplary module diagram of a fire situation determination system according to some embodiments of this specification.
  • FIG3 is an exemplary flow chart of a fire situation determination method according to some embodiments of this specification.
  • FIG4A is an exemplary schematic diagram of an infrared thermal image grayscale map according to some embodiments of the present specification.
  • FIG4B is an exemplary schematic diagram of a grayscale image of a visible light image according to some embodiments of the present specification.
  • FIG5 is an exemplary schematic diagram of determining interference categories of non-real fire conditions according to some embodiments of the present specification.
  • FIG. 6 is an exemplary schematic diagram of a fire situation determination process according to some embodiments of the present specification.
  • the words "system", "device" and "module" used herein are a means of distinguishing different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, they may be replaced by other expressions.
  • infrared thermal imaging uses infrared detectors to detect the heat emitted by objects, and infrared thermal images can be obtained by using optical imaging lenses; then the image is matched one-to-one with the heat distribution field on the surface of the object to detect high-temperature objects, such as fires.
  • when a circular detection algorithm is used on the fire point, it is easy to misjudge the sun, sunlight reflection points, and the positions of car engines and exhaust pipes as the fire source, resulting in false fire alarms and an increased misjudgment rate of fire.
  • an embodiment of the present application provides a fire judgment method, by which interference from reflective points such as sunlight, static heat sources such as engine exhaust of stationary cars, and mobile heat sources such as moving hot water cups can be eliminated, thereby reducing the misjudgment rate of fire.
  • FIG1 is a schematic diagram of an application scenario of a fire situation judgment system according to some embodiments of this specification.
  • the application scenario 100 involved in the embodiments of this specification may include a thermal imaging device 110, a visible light camera 120, a processor 130, a network 140, a terminal device 150, and a storage device 160.
  • the fire situation determination system may implement the methods and/or processes disclosed in this specification.
  • the thermal imaging device 110 can be used to obtain infrared thermal images.
  • the thermal imaging device 110 can monitor and obtain infrared thermal images of the target area in real time.
  • the thermal imaging device 110 can perform an all-round or fixed-point scan of the monitoring range to obtain an infrared thermal image of any area within the monitoring range.
  • the thermal imaging device 110 can detect the infrared energy of the object in a non-contact manner, and convert the infrared energy into an electrical signal to form an infrared thermal image.
  • the infrared thermal image can be a grayscale image or a color image.
  • the grayscale value of each pixel reflects the received thermal radiation energy or temperature.
  • different colors can be used to intuitively represent the temperature. For example, red can be used to represent a high temperature, and blue can be used to represent a low temperature.
  • the thermal imaging device 110 can send the acquired infrared thermal image to the processor 130 for processing.
  • the thermal imaging device 110 can send the acquired infrared thermal image to the storage device 160 for storage.
  • the visible light camera 120 can be used to obtain visible light images.
  • the visible light camera can be a security camera with an imaging system of an RGB sensor.
  • the visible light camera 120 can monitor and obtain visible light images of the target area in real time.
  • the visible light camera 120 can perform an all-round or fixed-point scan of the monitoring range to obtain a visible light image of any area within the monitoring range.
  • the visible light camera 120 acquires a visible light image by capturing light within the visible light spectrum.
  • the visible light image can be a color image. In a visible light image in the form of a color image, each pixel has color information, such as RGB value, etc. For more explanation, see Figure 3 and its related description.
  • the visible light camera 120 can send the acquired visible light image to the processor 130 for processing.
  • the visible light camera 120 can send the acquired visible light image to the storage device 160 for storage.
  • the processor 130 may process data and/or information obtained from the thermal imaging device 110, the visible light camera 120, the terminal device 150, and/or the storage device 160. For example, the processor 130 may obtain a first image and a second image of a target area. For another example, the processor 130 may determine a first fire location based on the first image of the target area, determine a matching area of the first fire location in the second image of the target area, and extract color information of the matching area. For another example, the processor 130 may determine whether the fire corresponding to the first fire location is a real fire based on the color information.
  • the processor 130 may be a single server or a server group.
  • the server group may be centralized or distributed.
  • the processor 130 may be local or remote.
  • the processor 130 may access information and/or data from the thermal imaging device 110, the visible light camera 120, the terminal device 150, and/or the storage device 160 via the network 140.
  • the processor 130 may be directly connected to the thermal imaging device 110, the visible light camera 120, the terminal device 150, and/or the storage device 160 to access information and/or data.
  • the processor 130 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, etc., or any combination thereof.
  • the network 140 may include any suitable network that facilitates information and/or data exchange.
  • one or more components of the application scenario 100 may exchange information and/or data through the network 140.
  • the processor 130 may receive an infrared thermal image captured by the thermal imaging device 110 and a visible light image captured by the visible light camera 120 through the network 140.
  • the network 140 may include a local area network (LAN), a wide area network (WAN), a wired network, a wireless network, etc., or any combination thereof.
  • the terminal device 150 can communicate and/or connect with the thermal imaging device 110, the visible light camera 120 and/or the storage device 160.
  • the terminal device 150 can send one or more control instructions to the thermal imaging device 110 and/or the visible light camera 120 through the network 140 to control the thermal imaging device 110 and/or the visible light camera 120 to collect infrared thermal images and/or visible light images of the target area according to the instructions.
  • the terminal device 150 may include a mobile device 150-1, a tablet computer 150-2, a laptop computer 150-3, a desktop computer 150-4, or other devices with input and/or output functions, or any combination thereof.
  • the terminal device 150 may include an input device, an output device, etc.
  • the input device may include a keyboard, a touch screen, a mouse, a voice device, etc., or any combination thereof.
  • the output device may include a display, a speaker, a printer, etc., or any combination thereof.
  • the terminal device 150 may be a part of the processor 130. In some embodiments, the terminal device 150 may be integrated with the processor 130 as an operating console of the fire judgment system.
  • the storage device 160 may store data, instructions, and/or any other information.
  • the storage device 160 may store data acquired from the thermal imaging device 110, the visible light camera 120, the processor 130, and/or the terminal device 150.
  • the storage device 160 may store infrared thermal images and/or visible light images acquired from the thermal imaging device 110 and/or the visible light camera 120.
  • the storage device 160 may store data and/or instructions used by the processor 130 to execute or use to complete the exemplary methods described in this specification.
  • the storage device 160 may store instructions for controlling the thermal imaging device 110 and the visible light camera 120 to acquire images.
  • the storage device 160 may include a mass storage, a removable storage, a volatile read-write memory, a read-only memory (ROM), etc. or any combination thereof. In some embodiments, the storage device 160 may be implemented on a cloud platform. In some embodiments, the storage device 160 may be part of the processor 130.
  • the thermal imaging device 110, the visible light camera 120, the processor 130 and the terminal device 150 may share a storage device 160, or may each have their own storage device.
  • these changes and modifications do not deviate from the scope of this specification.
  • FIG2 is an exemplary module diagram of a fire situation determination system according to some embodiments of the present specification.
  • the fire situation determination system 200 may include a determination module 210, an extraction module 220, and a judgment module 230.
  • the fire situation determination system 200 may be implemented by a processor 130.
  • the determination module 210 may be used to determine the first fire location based on the first image of the target area. For more information about the first image and the first fire location, see FIG. 3 and its related description.
  • the extraction module 220 can be used to determine the matching area of the first fire location in the second image of the target area, and extract the color information of the matching area, and the shooting time difference between the first image and the second image meets the preset condition.
  • for more information about the second image, the matching area, the color information, etc., please refer to FIG. 3 and its related description.
  • the judgment module 230 may be used to judge whether the fire corresponding to the first fire location is a real fire based on the color information of the matching area.
  • the color information includes RGB values
  • the judgment module 230 can be further used to: in response to the RGB value satisfying the first judgment condition, determine that the fire corresponding to the first fire location is a real fire; in response to the RGB value not satisfying the first judgment condition, determine that the fire corresponding to the first fire location is a non-real fire.
  • the judgment module 230 may be further used to: in response to the fire corresponding to the first fire position being a non-real fire, determine the interference category of the non-real fire. In some embodiments, the judgment module 230 may be further used to: in response to the RGB value satisfying the second judgment condition, determine the interference category as a reflective point heat source; in response to the RGB value not satisfying the second judgment condition, determine the interference category as a non-reflective point heat source.
  • the non-reflective point heat source includes a mobile heat source and a stationary heat source.
  • the judgment module 230 can be further used to: determine a second fire location based on the first image, the second fire location and the first fire location corresponding to the same fire area; determine whether the second fire location and the first fire location meet a third judgment condition; in response to meeting the third judgment condition, determine that the interference category is a stationary heat source; in response to not meeting the third judgment condition, determine that the interference category is a mobile heat source.
  • the determination module 210, the extraction module 220 and the judgment module 230 disclosed in FIG. 2 can be different modules in a system, or a single module can realize the functions of two or more of the above modules.
  • each module can share a storage module, or each module can have its own storage module.
  • FIG. 3 is an exemplary flow chart of a fire situation determination method according to some embodiments of the present specification.
  • the fire condition determination method may be executed by the processor 130 or the fire condition determination system 200.
  • the process 300 may be stored in a storage device (e.g., the storage device 160) in the form of a program or instructions.
  • when the processor 130 or the fire situation judgment system 200 executes the program or instructions, the process 300 can be implemented.
  • the operation diagram of the process 300 presented below is illustrative. In some embodiments, the process may be completed with one or more additional operations not described and/or without one or more of the operations discussed. In addition, the order of the operations of the process 300 shown in FIG. 3 and described below is not restrictive.
  • Step 310: determining a first fire location based on a first image of a target area.
  • step 310 may be performed by the processor 130 or the determination module 210 .
  • the target area refers to the area where the fire situation is to be determined, wherein the fire situation may include the location where the fire is suspected to have occurred, the actual situation of the fire, etc.
  • the target area can be in various forms.
  • the target area can be in various forms such as an outdoor square, an outdoor parking lot, an indoor square, a shopping mall, etc.
  • the description of the target area is only for illustrative purposes and does not constitute a limitation on the implementation method.
  • the first image refers to image data used to determine the fire situation.
  • the first image is an infrared thermal image.
  • the first image may be an infrared thermal image in the form of a color image.
  • the first image can be acquired by a monitoring device deployed at a monitoring position.
  • the monitoring position can be the location where the monitoring device is located.
  • the monitoring device can include a thermal imaging device.
  • for example, a plurality of monitoring columns are dispersedly arranged in an outdoor square, a pan-tilt is arranged at the top of each monitoring column, and a monitoring device is mounted on each pan-tilt; the pan-tilt can rotate so that the plurality of monitoring devices can collect infrared thermal images of the outdoor square. Therefore, in this application scenario, the monitoring position can be the location of the pan-tilt.
  • a plurality of monitoring devices can be installed on pan-tilts located at different locations, and the monitoring devices on different pan-tilts can exchange information with a monitoring center through a network. For more information about the thermal imaging device, see FIG. 1 and its related description.
  • the first fire location refers to the location area where the fire is suspected to have occurred.
  • the processor 130 may process the first image of the target area in a variety of ways to determine the first fire location.
  • the processor 130 may perform grayscale processing on the original infrared thermal image (e.g., a color infrared thermal image) to obtain an infrared thermal image in grayscale form (hereinafter referred to as the infrared thermal image grayscale image), and determine the first fire location in the infrared thermal image grayscale image according to a fire location determination method.
  • the method for determining the location of a fire includes: when there are one or more pixels in the grayscale image of the infrared thermal image with a grayscale value greater than a preset grayscale threshold, the location area composed of these pixels is determined as the first fire location.
  • the method for determining the location of a fire includes: determining a location area composed of multiple pixels whose grayscale values are greater than a preset grayscale threshold; determining the temperature statistics of the location area (for example, standard deviation, mean difference, etc.); determining the fluctuation characteristics of the location area based on the temperature statistics (for example, determining the ratio of the temperature standard deviation to the temperature mean difference as the fluctuation characteristics); when the fluctuation characteristics (for example, the ratio of the aforementioned temperature standard deviation to the temperature mean difference) is greater than the preset ratio, determining the location area as the first fire location.
  • the preset grayscale threshold and the preset ratio can be system default values, experience values, artificial pre-set values, etc. or any combination thereof, and can be set according to actual needs, and this specification does not limit this.
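The second fire location determination method above can be sketched as follows. This is an illustrative sketch: `GRAY_THRESHOLD` and `RATIO_THRESHOLD` are hypothetical placeholder values, and the "temperature mean difference" is read here as the mean absolute deviation, which is one possible interpretation of the description rather than the application's own wording.

```python
GRAY_THRESHOLD = 200   # hypothetical preset grayscale threshold
RATIO_THRESHOLD = 1.2  # hypothetical preset ratio

def first_fire_location(gray_img):
    """Return the (x, y) pixels of a suspected fire location, or None.

    gray_img is a 2-D list of grayscale values from the infrared thermal
    image grayscale image; grayscale stands in for temperature here.
    """
    hot = [(x, y)
           for y, row in enumerate(gray_img)
           for x, v in enumerate(row)
           if v > GRAY_THRESHOLD]
    if not hot:
        return None
    temps = [gray_img[y][x] for x, y in hot]
    mean = sum(temps) / len(temps)
    std = (sum((t - mean) ** 2 for t in temps) / len(temps)) ** 0.5
    mean_abs_dev = sum(abs(t - mean) for t in temps) / len(temps)  # "mean difference"
    if mean_abs_dev == 0:
        return None  # a perfectly uniform hot region shows no fire-like fluctuation
    if std / mean_abs_dev > RATIO_THRESHOLD:  # fluctuation characteristic
        return hot
    return None
```

A hot region containing a strong local outlier (e.g. one very bright pixel among moderately bright ones) produces a larger std-to-mean-absolute-deviation ratio than a uniformly warm region, which is what the fluctuation characteristic is meant to capture.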
  • the processor 130 can determine the reference coordinate point in the infrared thermal image grayscale map by a reference coordinate point determination method, determine the first fire location coordinates of the first fire location based on the reference coordinate point and a rectangular coordinate system established based on the reference coordinate point, and determine the fire area spreading outward with the first fire location as the center according to the fire area division method.
  • the fire area division method includes: taking the fire location as the center and dividing the fire area by a preset distance value.
  • the rectangular area obtained by this outward spreading is the fire area.
  • alternatively, the fire area can be divided as follows: the area obtained by splicing the pixels whose grayscale difference from the grayscale value at the fire location is within a preset difference range is determined as the fire area.
  • the preset distance value and the preset difference range can be system default values, experience values, artificial preset values, etc. or any combination thereof, and can be set according to actual needs, and this specification does not limit this.
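The first fire area division method (spreading outward from the fire location by a preset distance value) can be sketched as follows; `PRESET_DISTANCE` is a hypothetical placeholder value, and clipping to the image bounds is an added practical detail not stated in the description.

```python
PRESET_DISTANCE = 2  # hypothetical preset distance value, in pixels

def fire_area(center, width, height, d=PRESET_DISTANCE):
    """Rectangular fire area spreading outward from the fire location.

    Returns (x_min, y_min, x_max, y_max), clipped to the image bounds.
    """
    cx, cy = center
    return (max(cx - d, 0), max(cy - d, 0),
            min(cx + d, width - 1), min(cy + d, height - 1))
```

With the coordinates of the example below, a fire location at (5, 5) in a 20 x 20 image and a preset distance of 2 would give the rectangle (3, 3, 7, 7) as the fire area.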
  • the reference coordinate point determination method includes: determining the reference coordinate point based on the upper left corner of the infrared thermal image grayscale image. In some embodiments, the reference coordinate point determination method also includes: determining the reference coordinate point based on the lower left corner of the infrared thermal image grayscale image.
  • the specific reference coordinate point determination method is not limited in the embodiments of this specification.
  • Figure 4A is an infrared thermal image grayscale image after grayscale processing of the original infrared thermal image of the target area taken by the thermal imaging device.
  • a certain position W in the infrared thermal image grayscale image is determined as the first fire location.
  • a rectangular coordinate system is established with the lower left corner of the infrared thermal image grayscale image as the origin (0, 0), and the coordinates of the first fire location (i.e., position W) are determined to be (5, 5). This coordinate is the first fire location coordinate, and the rectangular area radiating outward from position W is the fire area Q.
  • Step 320: determining a matching area of the first fire location in the second image of the target area, and extracting color information of the matching area.
  • step 320 may be performed by the processor 130 or the extraction module 220 .
  • the second image refers to image data used to determine the fire situation.
  • the second image is different from the first image.
  • the second image is a visible light image.
  • the second image can be a visible light image in the form of a color image.
  • the second image may be acquired after the first fire location is determined.
  • the shooting time difference between the first image and the second image satisfies a preset condition.
  • the preset condition may be that the shooting time difference between the first image and the second image is less than a preset time difference threshold. The preset condition and the preset time difference threshold may be set based on actual needs and are not limited here.
  • the second image can be acquired by a monitoring device deployed at the monitoring location.
  • the monitoring device may also include a visible light camera.
  • a security camera based on an imaging system with an RGB sensor may capture a raw image corresponding to the target area; the raw image is processed by a raw image processing method to obtain a corresponding visible light image.
  • for more information about the visible light camera, see FIG. 1 and its related description.
  • the matching area refers to the partial image area corresponding to the fire area corresponding to the first fire location in the second image.
  • the processor 130 can determine the matching area of the first fire location in the second image of the target area in a variety of ways. For example, the processor 130 can register the first image and the second image through an image registration algorithm, and then determine the matching area of the first fire location in the second image based on the fire area of the first fire location in the first image.
  • the image registration algorithm includes but is not limited to a template-based matching algorithm, a grayscale-based matching algorithm, a feature-based matching algorithm, etc.
  • the processor 130 can determine the reference coordinate points of the second image through a reference coordinate point determination method, and construct a rectangular coordinate system based on the reference coordinate points; mark the fire area in the first image in the second image based on the rectangular coordinate system to obtain a matching area.
  • for example, FIG. 4B is a visible light image grayscale image of the target area (i.e., the second image in grayscale form) obtained by the visible light camera; the fire area determined in step 310 (i.e., the rectangular area Q in the infrared thermal image grayscale image of FIG. 4A) is marked with a rectangle in the visible light image grayscale image to obtain the matching area Q'.
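Under the assumption that the two images are already registered and share the same reference coordinate system, differing only in resolution, marking the fire area in the second image reduces to a per-axis coordinate scaling, sketched below. The function name and the pure-scaling approach are illustrative assumptions, not prescribed by this application.

```python
def matching_area(fire_rect, ir_size, vis_size):
    """Map a fire-area rectangle from the infrared image into the visible image.

    fire_rect: (x_min, y_min, x_max, y_max) in infrared-image coordinates.
    ir_size, vis_size: (width, height) of the infrared and visible images.
    Assumes the images are registered and differ only in resolution.
    """
    ir_w, ir_h = ir_size
    vis_w, vis_h = vis_size
    sx, sy = vis_w / ir_w, vis_h / ir_h  # per-axis scale factors
    x0, y0, x1, y1 = fire_rect
    return (round(x0 * sx), round(y0 * sy), round(x1 * sx), round(y1 * sy))
```

When the two images have the same resolution, the matching area is simply the same rectangle; a real deployment would typically rely on an image registration algorithm, as described above, rather than assuming alignment.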
  • FIG4A and FIG4B are respectively the infrared thermal image grayscale image and the visible light image grayscale image of the same target area.
  • the grayscale values of different pixels in the infrared thermal image grayscale image are related to the temperature, while there is no specific corresponding relationship between the grayscale values of different pixels in the visible light image grayscale image and the temperature.
  • the color information refers to information on the color included in the matching area.
  • the color information may include RGB values.
  • the RGB values of the matching area may be obtained in a variety of ways. The embodiments of this specification do not specifically limit the method for extracting the RGB values, and operations familiar to those skilled in the art may be used.
  • the color information of the matching area may include the color information of each pixel in the matching area.
  • the color information of the matching area may be a statistical value of the color information of each pixel in the matching area, such as a mean, an extreme value, a median, or the like.
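As an illustrative sketch (the function name and the pixel-list representation are assumptions, not part of this specification), the statistical values above can be computed per channel from the pixels of the matching area:

```python
from statistics import mean, median

def region_color_stats(pixels):
    """Summarize the color information of a matching area.

    `pixels` is a list of (R, G, B) tuples; the statistics mirror the
    per-channel mean, extreme values, and median mentioned above.
    """
    stats = {}
    for i, channel in enumerate(("R", "G", "B")):
        values = [p[i] for p in pixels]
        stats[channel] = {
            "mean": mean(values),
            "min": min(values),
            "max": max(values),
            "median": median(values),
        }
    return stats

# Example: a 2x2 matching area with illustrative RGB values
area = [(255, 254, 252), (255, 254, 252), (255, 252, 250), (253, 252, 250)]
print(region_color_stats(area)["R"]["mean"])  # 254.5
```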
  • Step 330: Based on the color information, determine whether the fire corresponding to the first fire position is a real fire.
  • step 330 may be executed by the processor 130 or the determination module 230 .
  • the processor 130 can determine whether the fire corresponding to the first fire position is a real fire based on the RGB value in the color information. In some embodiments, the processor 130 can determine whether the fire corresponding to the first fire position is a real fire based on the RGB value of each pixel in the matching area. For example, in response to the RGB values of a preset number of pixels in the matching area satisfying a specific judgment condition, the fire corresponding to the first fire position is judged to be a real fire, otherwise it is an unreal fire. For another example, in response to the average of the RGB values of each pixel satisfying a specific judgment condition, the fire corresponding to the first fire position is judged to be a real fire, otherwise it is an unreal fire.
  • the processor 130 can determine that the fire corresponding to the first fire location is a real fire in response to the RGB value satisfying the first judgment condition; in response to the RGB value not satisfying the first judgment condition, determine that the fire corresponding to the first fire location is a non-real fire.
  • the first judgment condition refers to a judgment condition used to judge whether the fire corresponding to the first fire location is a real fire.
  • the first judgment condition includes: the R value and the G value in the RGB value satisfy the first threshold range, and the B value satisfies the second threshold range.
  • the first threshold range may be [a, b], and the second threshold range may be [c, d], where d < a and a, b, c, d ∈ (0, 255].
  • the first threshold range may be [250,255]
  • the second threshold range may be [210,230].
  • the second threshold range may be [215,225].
  • a and b may be the same; for example, when both a and b are 255, the first threshold range is the single value 255.
  • For example, the R value, G value, and B value of the rectangular area Q' in FIG. 4B are averaged to obtain 255, 254, and 252, respectively. At this time, the R value and G value are within the first threshold range, but the B value is not within the second threshold range, so the RGB values of the matching area do not meet the first judgment condition.
  • For another example, the R value, G value, and B value of the rectangular area Q' in FIG. 4B are averaged to obtain 255, 254, and 220, respectively. At this time, the R value and G value are within the first threshold range, and the B value is within the second threshold range, so the RGB values of the matching area meet the first judgment condition.
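The first judgment condition above can be sketched as a small check; the function name is hypothetical, and the default ranges are the example values [250, 255] and [215, 225] given above:

```python
def is_real_fire(r, g, b,
                 first_range=(250, 255),   # example [a, b] from above
                 second_range=(215, 225)): # example [c, d] from above
    """First judgment condition: R and G fall within the first
    threshold range and B falls within the second threshold range."""
    a, b_hi = first_range
    c, d = second_range
    return a <= r <= b_hi and a <= g <= b_hi and c <= b <= d

# Mean RGB (255, 254, 252): B outside [215, 225] -> not a real fire
print(is_real_fire(255, 254, 252))  # False
# Mean RGB (255, 254, 220): all channels in range -> real fire
print(is_real_fire(255, 254, 220))  # True
```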
  • the first threshold range and the second threshold range may be predetermined.
  • the first threshold range and the second threshold range may be system default values, empirical values, artificially preset values, or any combination thereof, and may be set according to actual needs, and this specification does not limit this.
  • the processor 130 may also dynamically determine the corresponding first threshold range and second threshold range according to the actual situation of the target area. Different actual situations correspond to different first threshold ranges and/or different second threshold ranges.
  • the processor 130 may acquire a visible light image sequence of the target area, and determine a first threshold range and a second threshold range based on the visible light image sequence.
  • the process of acquiring the visible light image sequence may be triggered after acquiring the second image.
  • the processor 130 may control the visible light camera to capture the visible light image of the target area to acquire the visible light image sequence.
  • the processor 130 may control the visible light camera to capture the target area multiple times to obtain a visible light image sequence. For example, the processor 130 may control the visible light camera to capture an image of the target area once at a preset time interval to obtain a visible light image, thereby obtaining a visible light image sequence.
  • the visible light image sequence may be captured at a first capture frequency.
  • the processor 130 may control the visible light camera to capture the target area multiple times at the first capture frequency to acquire the visible light image sequence.
  • the first capture frequency is a capture frequency used to acquire a visible light image sequence.
  • the first capture frequency may be x images/min, etc., where x is a positive integer.
  • the first snapshot frequency may be preset based on prior knowledge or historical data. For example, the snapshot frequency that is used most times in historical data may be determined as the first snapshot frequency.
  • the processor 130 may determine the first capture frequency according to the actual situation of the target area. In some embodiments, the processor 130 may acquire a preset number of initial visible light images at a second capture frequency; determine an ambient light intensity sequence and a difference distribution sequence based on the preset number of initial visible light images; and determine the first capture frequency based on the ambient light intensity sequence and the difference distribution sequence.
  • the second capture frequency is a preliminarily determined capture frequency.
  • the second capture frequency can be used to determine the first capture frequency. For example, firstly, image acquisition is performed at the second capture frequency, and analysis and processing are performed based on the acquired image to determine the first capture frequency.
  • the second capture frequency can be the same as or different from the first capture frequency.
  • the second snapshot frequency can be preset based on prior knowledge or historical data. For example, the snapshot frequency that is used most times in historical data can be determined as the second snapshot frequency.
  • the preset number may be a system default value, an experience value, a manually preset value, or any combination thereof, and may be set according to actual needs, and this specification does not impose any restrictions on this.
  • the initial visible light image refers to a visible light image acquired according to the second capture frequency.
  • the ambient light intensity sequence refers to a sequence consisting of the ambient light intensity of a preset number of initial visible light images.
  • the processor 130 may obtain the ambient light intensity of each of the preset number of initial visible light images to obtain the ambient light intensity sequence. For more information on obtaining the ambient light intensity, see below.
  • the difference distribution sequence refers to a sequence consisting of difference distributions of visible light images captured at adjacent moments in a preset number of initial visible light images.
  • the difference distribution sequence includes multiple difference distributions, each of which is determined based on image information of initial visible light images captured at adjacent moments in a preset number of initial visible light images.
  • the processor 130 may perform semantic segmentation on two initial visible light images taken at adjacent moments to obtain multiple groups of semantic block pairs; calculate the pixel difference matrix of each semantic block pair in the multiple groups of semantic block pairs; perform statistical feature extraction on the pixel difference matrix to obtain difference statistics; obtain multiple difference statistics based on multiple pixel difference matrices corresponding to the multiple groups of semantic block pairs, and the multiple difference statistics constitute a difference distribution.
  • a semantic block pair consists of two corresponding semantic blocks that come from different visible light images; the correspondence between semantic blocks means that the two semantic blocks correspond to the same object.
  • For example, if a semantic block in visible light image A is a car, and a semantic block in visible light image B is the same car, then the two semantic blocks correspond to each other.
  • two adjacent visible light images include visible light image A and visible light image B
  • visible light image A is divided into semantic blocks a1, a2, a3
  • visible light image B is divided into semantic blocks b1, b2, b3, where a1 corresponds to b1, a2 corresponds to b2, and a3 corresponds to b3.
  • the processor 130 can perform semantic segmentation on two adjacent visible light images respectively by various feasible methods, such as semantic segmentation algorithms or semantic segmentation models (e.g., machine learning models), to obtain multiple groups of semantic block pairs.
  • the embodiments of this specification do not specifically limit the semantic segmentation algorithms or semantic segmentation models, and operations familiar to those skilled in the art can be used.
  • when there is a correspondence between two semantic blocks, there may also be a correspondence between the pixels in the two semantic blocks.
  • the correspondence between two pixels may refer to the two pixels having the same relative position.
  • the relative position refers to the relative position of a pixel among multiple pixels contained in a semantic block.
  • For example, pixel R in semantic block X of visible light image A corresponds to pixel R' in semantic block X' of visible light image B, which means that pixel R and pixel R' are in the same relative position; for example, pixel R is the pixel in the third row and third column of visible light image A, and pixel R' is also the pixel in the third row and third column of visible light image B.
  • the pixel difference matrix refers to the matrix composed of the pixel differences between every two corresponding pixels in a semantic block pair.
  • the processor 130 may calculate the difference between two pixel points at corresponding pixel positions in the semantic block pair, and obtain a pixel difference matrix according to the difference of each pixel position.
  • the difference between two pixel points may be calculated and determined by a distance function or the like.
  • the distance function includes but is not limited to Euclidean distance, cosine distance, and the like.
  • the difference statistic refers to a statistic obtained by extracting statistical features from the pixel difference matrix.
  • the difference statistic may be a mean value, a variance value, etc. of the differences between each pixel in the pixel difference matrix.
  • the processor 130 may extract statistical features from the pixel difference matrix using a statistical feature extraction algorithm to obtain a difference statistical value.
  • the statistical feature extraction algorithm includes an algorithm for extracting a mean, an algorithm for extracting a variance, etc., which are not limited here.
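A minimal sketch of the pipeline above, assuming the corresponding semantic block pairs have already been extracted as same-shaped grayscale matrices, with absolute difference as the distance function and the mean as the difference statistic (all names and values are illustrative):

```python
def pixel_difference_matrix(block_a, block_b):
    """Element-wise absolute difference between two corresponding
    semantic blocks (represented as same-shaped grayscale matrices)."""
    return [[abs(pa - pb) for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(block_a, block_b)]

def difference_statistic(diff_matrix):
    """Statistical feature extraction: here, the mean difference."""
    values = [v for row in diff_matrix for v in row]
    return sum(values) / len(values)

def difference_distribution(block_pairs):
    """One difference statistic per semantic block pair; together they
    form the difference distribution of two adjacent images."""
    return [difference_statistic(pixel_difference_matrix(a, b))
            for a, b in block_pairs]

# Two block pairs, e.g. (a1, b1) and (a2, b2) from images A and B
pairs = [
    ([[10, 10], [10, 10]], [[10, 10], [10, 10]]),  # stationary object
    ([[10, 20], [30, 40]], [[14, 24], [26, 36]]),  # moving object
]
print(difference_distribution(pairs))  # [0.0, 4.0]
```

A distribution dominated by near-zero statistics suggests a static scene, while large statistics suggest moving objects, which is what the first capture frequency determination relies on.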
  • the processor 130 may determine the first snapshot frequency based on the ambient light intensity sequence and the difference distribution sequence in a variety of ways.
  • the processor 130 may determine the first snapshot frequency by vector retrieval based on the ambient light intensity sequence and the difference distribution sequence.
  • the processor 130 may construct a vector to be matched based on the ambient light intensity sequence and the difference distribution sequence; perform vector matching in a vector database based on the vector to be matched to determine an associated vector; and determine the first snapshot frequency based on the associated vector.
  • the vector database may include multiple reference vectors and their corresponding reference capture frequencies.
  • the reference vectors may be constructed based on historical data. For example, multiple reference vectors are obtained by vector construction of multiple historical ambient light intensity sequences and multiple historical difference distribution sequences. The reference capture frequencies corresponding to the reference vectors may be manually labeled based on prior knowledge and actual historical capture frequencies.
  • the processor 130 may determine a reference vector that meets a preset matching condition in a vector database based on the vector to be matched, determine the reference vector that meets the preset matching condition as an associated vector, and determine the reference snapshot frequency corresponding to the associated vector as the first snapshot frequency corresponding to the vector to be matched.
  • the preset matching condition may refer to a judgment condition for determining the associated vector.
  • the preset matching condition may include a vector distance less than a distance threshold, a vector distance minimum, etc.
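The vector retrieval described above can be sketched as a nearest-neighbor lookup; the vector layout, database contents, and capture frequencies below are purely illustrative assumptions:

```python
from math import dist  # Euclidean distance, Python 3.8+

def retrieve_capture_frequency(query_vector, vector_database,
                               distance_threshold=None):
    """Match the vector to be matched (built from the ambient light
    intensity sequence and the difference distribution sequence)
    against reference vectors and return the reference capture
    frequency of the closest one (the associated vector).

    `vector_database` maps reference vectors to reference capture
    frequencies (images/min).
    """
    best_vec = min(vector_database, key=lambda v: dist(v, query_vector))
    if distance_threshold is not None and dist(best_vec, query_vector) > distance_threshold:
        return None  # no reference vector meets the preset matching condition
    return vector_database[best_vec]

# Hypothetical database: (mean light intensity, mean difference statistic)
database = {
    (800.0, 0.5): 2,   # stable light, static scene -> low frequency
    (300.0, 6.0): 10,  # changing light, moving objects -> high frequency
}
print(retrieve_capture_frequency((320.0, 5.0), database))  # 10
```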
  • the processor 130 may determine the first capture frequency through a frequency determination model based on the ambient light intensity sequence and the difference distribution sequence.
  • the frequency determination model may be a machine learning model, such as a deep neural network (DNN) model.
  • the input of the frequency determination model includes an ambient light intensity sequence and a difference distribution sequence, and the output includes the first capture frequency.
  • the frequency determination model can be trained by various methods based on multiple first training samples with first labels to update model parameters.
  • the training can be based on the gradient descent method.
  • multiple first training samples with first labels can be input into the initial frequency determination model, and a loss function can be constructed by the first label and the output result of the initial frequency determination model, and the parameters of the initial frequency determination model can be iteratively updated based on the loss function.
  • when the loss function of the initial frequency determination model meets the preset conditions, the model training is completed, and a trained frequency determination model is obtained.
  • the preset conditions can be that the loss function converges, the number of iterations reaches a threshold, etc.
  • the first training sample includes a sample ambient light intensity sequence and a sample difference distribution sequence corresponding to a sample initial visible light image sequence of the sample area (the sample initial visible light image sequence includes multiple initial visible light images acquired at a sample second capture frequency), and the first label includes a first capture frequency for capturing visible light images of the sample area.
  • the sample area may be an area where the first fire location is determined.
  • the first training sample and the first label may be determined based on historical data. For example, the area where the first fire location is determined may be used as the sample area, and the ambient light intensity sequence and the difference distribution sequence of the historical initial visible light image sequence of the sample area may be used as the first training sample.
  • the processor 130 may determine the lowest capture frequency that can identify a real fire under the first training sample as the first label corresponding to the first training sample.
  • In some embodiments of this specification, by first taking multiple initial visible light images of the target area and then determining the ambient light intensity sequence and the difference distribution sequence of those images, it is possible to effectively analyze whether the ambient light intensity of the target area is in a relatively stable state or a frequently changing state (i.e., the changing state of the ambient light), and whether the objects contained in the target area are stationary objects or moving objects (i.e., the dynamic and static state of the objects).
  • the influence of these factors on the judgment of the real fire situation can be fully considered, and then a reasonable first capture frequency can be determined to obtain a reasonable visible light image sequence, so that the first threshold range and the second threshold range set based on the visible light image sequence are more accurate and reasonable.
  • the processor 130 can process the visible light image sequence in a variety of ways to determine the first threshold range and the second threshold range.
  • the processor 130 can identify the ambient light intensity of each visible light image in the visible light image sequence and calculate the average ambient light intensity; based on the average ambient light intensity, the first threshold range and the second threshold range are determined by querying a first preset comparison table.
  • the processor 130 may use a trained image recognition model to recognize the ambient light intensity of the visible light image.
  • the image recognition model may be a machine learning model, such as a convolutional neural network (CNN) model.
  • the input of the image recognition model includes the visible light image, and the output includes the ambient light intensity of the visible light image.
  • the image recognition model can be trained by various methods based on multiple second training samples with second labels to update model parameters.
  • the training process of the image recognition model is similar to the training process of the frequency determination model. For more details, please refer to the relevant description above, which will not be repeated here.
  • the second training sample includes a sample visible light image of the sample area
  • the second label includes the ambient light intensity of the sample visible light image.
  • the second training sample can be obtained based on historical image data.
  • the second label can be obtained based on historical detection data.
  • the historical image data includes historical visible light images of the sample area taken by a visible light camera
  • the historical detection data includes historical ambient light intensity of the sample area collected by a light intensity detection instrument.
  • the first preset comparison table includes correspondences between different reference ambient light intensity means and different reference first threshold ranges and reference second threshold ranges.
  • the correspondences between different reference ambient light intensity means and different reference first threshold ranges and reference second threshold ranges can be constructed by preset construction rules based on prior knowledge or historical data to obtain the first preset comparison table.
  • the preset construction rules include: under different historical ambient light intensity means, according to historical judgment situations, the first threshold range and the second threshold range corresponding to the lowest misjudgment rate are determined as the first threshold range and the second threshold range corresponding to the historical ambient light intensity mean.
  • the historical judgment situation includes the correspondence between the judgment result of whether the fire at the first fire location is a real fire and the actual situation.
  • the historical judgment situation includes wrong judgment and accurate judgment.
  • when the judgment result does not match the actual situation, it is a wrong judgment; otherwise, it is an accurate judgment.
  • For example, when the fire at the first fire location is judged to be a real fire but in fact it is not a real fire, it is a wrong judgment; when the fire at the first fire location is judged to be a real fire and in fact it is a real fire, it is an accurate judgment.
  • the misjudgment rate is the ratio of the number of wrong judgments in multiple judgments to the total number of judgments.
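A minimal sketch of querying the first preset comparison table, matched here by the nearest reference ambient light intensity mean; the table entries below are placeholders, not values from this specification:

```python
def query_threshold_ranges(light_mean, comparison_table):
    """Query the first preset comparison table: pick the entry whose
    reference ambient light intensity mean is closest to the measured
    mean, and return its (first, second) threshold ranges."""
    ref = min(comparison_table, key=lambda m: abs(m - light_mean))
    return comparison_table[ref]

# reference mean -> (first threshold range, second threshold range);
# in practice each entry would be the pair with the lowest historical
# misjudgment rate for that ambient light intensity mean
table = {
    200.0: ((245, 255), (200, 220)),  # dim scene
    800.0: ((250, 255), (215, 225)),  # bright scene
}
print(query_threshold_ranges(750.0, table))  # ((250, 255), (215, 225))
```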
  • the processor 130 can also match the same or similar historical ambient light intensity mean values in the historical data based on the current ambient light intensity mean value, and determine the historical first threshold range and the historical second threshold range corresponding to the same or similar historical ambient light intensity mean values as the currently corresponding first threshold range and the second threshold range.
  • an actual visible light image sequence of the target area can be obtained, and then the first threshold range and the second threshold range can be determined based on the visible light image sequence.
  • This method is conducive to adaptively and dynamically determining the first threshold range and the second threshold range based on the actual situation of the target area, so that the subsequent result of judging the authenticity of the fire based on color information is more accurate.
  • the processor 130 may determine the interference category of the unreal fire in response to the fire corresponding to the first fire location being an unreal fire. For more information about this embodiment, see FIG. 5 and its related description.
  • the relevant features of the visible light image are used to assist in determining the authenticity of the fire location that has been determined as a suspected fire by the infrared thermal image grayscale image, and misjudgment in various situations can be eliminated, such as: strong sun reflection points, high-temperature locations such as stationary car engines or exhaust pipes, moving high-temperature objects, etc.
  • the RGB information of the visible light image can solve the problem that the infrared thermal image grayscale image is easily affected by weather, season, and time, and improve the accuracy of fire judgment.
  • process 300 is only for example and illustration, and does not limit the scope of application of this specification.
  • various modifications and changes can be made to process 300 under the guidance of this specification. However, these modifications and changes are still within the scope of this specification.
  • FIG. 5 is an exemplary schematic diagram of determining interference categories of non-real fire conditions according to some embodiments of the present specification.
  • the processor 130 may determine an interference category of the unreal fire in response to the fire corresponding to the first fire location being an unreal fire.
  • the interference category may refer to a heat source category that is not a real fire. In some embodiments, the interference category includes a reflective point heat source and a non-reflective point heat source.
  • Reflective point heat sources refer to points or objects where the temperature rises due to reflection, for example, the sun's reflection points.
  • Non-reflective point heat sources refer to points or objects whose temperature rises due to reasons other than reflection, such as car engines, exhaust vents, hot water cups, etc.
  • the processor 130 may determine the interference category of the non-real fire in a variety of ways. In some embodiments, the processor 130 may determine the interference category of the non-real fire based on the shape of the fire area corresponding to the first fire location. For example, when the shape of the fire area is circular or quasi-circular, the interference category of the non-real fire at the first fire location is determined to be a reflective point heat source. In some embodiments, the processor 130 may determine the interference category of the non-real fire based on the color information (e.g., RGB values) corresponding to the matching area of the first fire location in the second image of the target area.
  • the processor 130 may determine that the interference category is a reflective point heat source in response to the RGB value satisfying the second judgment condition; and determine that the interference category is a non-reflective point heat source in response to the RGB value not satisfying the second judgment condition.
  • the second judgment condition is a judgment condition for judging the interference category that is not a real fire situation.
  • the second judgment condition includes: the R value, the G value, and the B value in the RGB value satisfy the third threshold range.
  • the third threshold range may be [e, f], where a ≤ e, b ≤ f, and e, f ∈ (0, 255].
  • the first threshold range may be [250,255]
  • the second threshold range may be [215,225]
  • the third threshold range may be [254,255].
  • e and f may be the same; for example, when both e and f are 255, the third threshold range is the single value 255.
  • the R value, G value, and B value of the rectangular area Q' in Figure 4B are all 255 after mean processing, and the R value, G value, and B value are all within the third threshold range.
  • the R value, G value, and B value meet the second judgment condition, and it is determined that the non-real fire corresponding to the position W in the rectangular area Q in Figure 4A is a reflective point heat source.
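The second judgment condition can be sketched as follows; the function name and the non-reflective example values are assumptions, and the default third threshold range is the example [254, 255] given above:

```python
def interference_category(r, g, b, third_range=(254, 255)):
    """Second judgment condition: if R, G, and B all fall within the
    third threshold range [e, f], the interference is classified as a
    reflective point heat source; otherwise, as a non-reflective one."""
    e, f = third_range
    if all(e <= v <= f for v in (r, g, b)):
        return "reflective point heat source"
    return "non-reflective point heat source"

# Mean RGB of area Q' is (255, 255, 255) -> sun reflection point
print(interference_category(255, 255, 255))  # reflective point heat source
# A hypothetical car-engine region, e.g. (230, 180, 150)
print(interference_category(230, 180, 150))  # non-reflective point heat source
```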
  • the third threshold range may be predetermined.
  • the third threshold range may be a system default value, an empirical value, a manually preset value, or any combination thereof, and may be set according to actual needs, which is not limited in this specification.
  • the processor 130 may also dynamically determine a corresponding third threshold range according to the actual situation of the target area.
  • the processor 130 may obtain temperature distribution data of the target area from the first image, obtain color distribution data of the target area from the second image, and determine a third threshold range based on the temperature distribution data and the color distribution data.
  • the temperature distribution data refers to the temperature conditions of different sub-regions after the first image is divided into multiple sub-regions, wherein the sub-region refers to a partial region of the first image.
  • the processor 130 may divide the first image into a plurality of sub-regions in a variety of ways. For example, the processor 130 may divide the first image into a plurality of sub-regions according to a grid of a preset width. For another example, the processor 130 may divide the first image into a plurality of sub-regions by semantic segmentation.
  • for the method of semantically segmenting the first image (e.g., an infrared thermal image), see FIG. 3 and its related description.
  • the temperature data of the sub-region may include one or more of the temperature mean, temperature variance, and temperature median of the sub-region.
  • the processor 130 may directly read the temperature value of each pixel from the first image, and then obtain the temperature data of the sub-region through statistical analysis. The embodiments of this specification do not specifically limit the method of statistical analysis, and operations familiar to those skilled in the art may be used.
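As one hedged sketch, assuming the first image is available as a per-pixel temperature matrix and the sub-regions are a fixed grid of preset width (the function name and readings are illustrative):

```python
def grid_temperature_distribution(temps, block=2):
    """Divide a per-pixel temperature matrix (from the first image)
    into square sub-regions of width `block` and compute each
    sub-region's mean temperature; the median or variance could be
    extracted the same way."""
    rows, cols = len(temps), len(temps[0])
    distribution = {}
    for i in range(0, rows, block):
        for j in range(0, cols, block):
            cells = [temps[y][x]
                     for y in range(i, min(i + block, rows))
                     for x in range(j, min(j + block, cols))]
            distribution[(i // block, j // block)] = sum(cells) / len(cells)
    return distribution

# 4x4 temperature readings (degrees Celsius), illustrative values
temps = [
    [20, 22, 80, 90],
    [21, 21, 85, 95],
    [20, 20, 20, 20],
    [20, 20, 20, 20],
]
print(grid_temperature_distribution(temps)[(0, 1)])  # 87.5
```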
  • the color distribution data refers to the color conditions of different sub-regions after the second image is divided into multiple sub-regions.
  • the method of dividing the second image is similar to the method of dividing the first image, and will not be repeated here.
  • the color information of the sub-region may include the mean R value, the mean G value, the mean B value, etc. of the sub-region.
  • the processor 130 may directly read the RGB value of each pixel from the second image, and then obtain the color information of the sub-region through statistical analysis.
  • the processor 130 can process the temperature distribution data and the color distribution data in a variety of ways to determine the third threshold range.
  • the processor 130 may determine the third threshold range by querying a second preset comparison table based on the temperature distribution data and the color distribution data.
  • the second preset comparison table includes correspondences between different reference temperature distribution data, different reference color distribution data, and different reference third threshold ranges.
  • the second preset comparison table is constructed in a similar manner to the first preset comparison table, and will not be described in detail here. For more information, see step 330 and its related description.
  • the processor 130 can also match the same or similar historical temperature distribution data and historical color distribution data in the historical data based on the current temperature distribution data and color distribution data, and determine the historical third threshold range corresponding to the same or similar historical temperature distribution data and historical color distribution data as the current corresponding third threshold range.
  • the processor 130 may determine material distribution data within a preset range of the first fire location based on the second image; and determine a third threshold range based on the temperature distribution data, the color distribution data, and the material distribution data.
  • the preset range refers to a certain range around the first fire location.
  • the preset range may be a 100-meter range around the first fire location.
  • the preset range may be the fire area of the first fire location.
  • the preset range may be a system default value, an experience value, a manually preset value, or any combination thereof, and may be set according to actual needs, and this specification does not limit this.
  • Material distribution data refers to the material conditions at different locations within a preset range of the first fire location.
  • the material conditions at different locations within the preset range may include, but are not limited to, one or more of iron, aluminum, and aluminum alloy.
  • the processor 130 may determine the material distribution data within a preset range of the first fire location based on the second image by means of a material detection algorithm or the like.
  • Exemplary material detection algorithms include material detection algorithms based on visual features, etc. The embodiments of this specification do not specifically limit the material detection algorithm, and operations familiar to those skilled in the art may be used.
  • the processor 130 may determine the third threshold range by querying a third preset comparison table based on the temperature distribution data, the color distribution data, and the material distribution data.
  • the third preset comparison table includes correspondences between different reference temperature distribution data, different reference color distribution data, different reference material distribution data, and different reference third threshold ranges.
  • the construction method of the third preset comparison table is similar to the construction method of the first preset comparison table, and will not be repeated here. For more explanation, see step 330 and its related description.
  • the processor 130 may also match relevant records in the historical data based on the temperature distribution data, the color distribution data, and the material distribution data, thereby determining the third threshold range. For more information, refer to the above description of matching historical data based on the temperature distribution data and the color distribution data.
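As a hedged illustration of the table query described above, the sketch below models the third preset comparison table as a dictionary keyed by discretized distribution features; the keys and threshold ranges are hypothetical placeholders, not values from this specification.

```python
# Hypothetical sketch of the third preset comparison table, mapping
# (temperature level, dominant color, dominant material) to a reference
# third threshold range (low, high). All entries are illustrative.
THIRD_PRESET_TABLE = {
    ("high", "white", "iron"): (200, 255),
    ("high", "yellow", "aluminum"): (180, 255),
    ("medium", "gray", "aluminum alloy"): (150, 220),
}

def query_third_threshold_range(temperature_level, dominant_color, dominant_material):
    """Return the reference third threshold range matching the given
    distribution features, or None when no reference entry matches."""
    return THIRD_PRESET_TABLE.get(
        (temperature_level, dominant_color, dominant_material)
    )
```

In practice the comparison table would be constructed like the first preset comparison table (see step 330); the dictionary lookup here only illustrates the query step.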
  • the non-reflective point heat source may include a mobile heat source and a stationary heat source.
  • a mobile heat source refers to a non-reflective point heat source in a moving state.
  • a stationary heat source refers to a non-reflective point heat source in a stationary state.
  • a stationary heat source may be, for example, an air conditioner outdoor unit, a fixed hot water cup, etc.
  • the processor 130 may further determine whether the interference category of the non-reflective point heat source is a moving heat source or a stationary heat source.
  • the processor 130 may determine a second fire location based on a third image of the target area; determine whether the second fire location and the first fire location satisfy a third judgment condition; in response to satisfying the third judgment condition, determine that the interference category is a stationary heat source; and in response to not satisfying the third judgment condition, determine that the interference category is a moving heat source.
  • the third image refers to image data used to determine the fire situation.
  • the third image may be an infrared thermal image.
  • the processor 130 may acquire the third image of the target area at a time point after determining the first fire location.
  • the acquisition method of the third image is the same as the acquisition method of the first image, which will not be repeated here.
  • the second fire location refers to the location area where a fire is suspected to have occurred.
  • the processor 130 may determine the second fire location in the third image in the same manner as the first fire location. In some embodiments, the processor 130 may determine the second fire location coordinates of the second fire location based on a rectangular coordinate system established by the reference coordinate points.
  • the processor 130 may align the third image with the first image, and then determine the second fire location coordinates of the second fire location based on the rectangular coordinate system established in the first image, and determine the fire area spreading outward with the second fire location as the center according to the fire area division method.
  • For more information on determining the first fire location, establishing a rectangular coordinate system, and dividing the fire area, refer to step 310 and its related description.
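A minimal sketch of expressing a fire location in the rectangular coordinate system established at a reference coordinate point, as described above (the image alignment step itself is omitted; the pixel and reference coordinates below are illustrative assumptions about step 310, which this excerpt only references):

```python
def fire_location_coordinates(pixel_location, reference_point):
    """Coordinates of a fire location relative to the reference coordinate
    point that establishes the rectangular coordinate system. Both
    arguments are (x, y) pairs in image pixel units (an assumption)."""
    px, py = pixel_location
    rx, ry = reference_point
    return (px - rx, py - ry)
```

With the third image aligned to the first, the same reference point yields comparable first and second fire location coordinates.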
  • the second fire location corresponds to the same fire area as the first fire location.
  • corresponding to the same fire area may mean that the fire area corresponding to the first fire location completely overlaps or partially overlaps the fire area corresponding to the second fire location.
  • the area of the partially overlapping area may meet a preset area threshold.
  • the preset area threshold may be a system default value, an experience value, a manually preset value, or any combination thereof, and may be set according to actual needs, and this specification does not limit this.
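The "same fire area" test described above can be sketched as follows; modeling each fire area as an axis-aligned rectangle is a simplifying assumption for illustration, since the specification leaves the fire area division method open.

```python
def rect_overlap_area(a, b):
    """Overlap area of two axis-aligned rectangles given as (x1, y1, x2, y2)."""
    dx = min(a[2], b[2]) - max(a[0], b[0])
    dy = min(a[3], b[3]) - max(a[1], b[1])
    return dx * dy if dx > 0 and dy > 0 else 0

def same_fire_area(area1, area2, preset_area_threshold):
    """True if the two fire areas fully or partially overlap and the
    overlapping area meets the preset area threshold (a system default,
    experience value, or manually preset value)."""
    return rect_overlap_area(area1, area2) >= preset_area_threshold
```

Complete overlap is covered by the same test, since the overlap area then equals the smaller rectangle's area.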
  • the third judgment condition is a judgment condition for further judging whether the interference category is a moving heat source or a stationary heat source.
  • the third judgment condition includes: the second fire location is consistent with the first fire location.
  • the second fire location is consistent with the first fire location, which may mean that the second fire location coordinates are consistent with the first fire location coordinates.
  • the processor 130 may determine whether the second fire location and the first fire location meet the third judgment condition based on the second fire location coordinates of the second fire location and the first fire location coordinates of the first fire location.
  • if the second fire location coordinates are consistent with the first fire location coordinates, it is determined that the non-real fire at the first fire location belongs to a stationary heat source, such as the engine or exhaust pipe position of a stationary car; if they are inconsistent, it is determined that the non-real fire belongs to a moving heat source, such as the engine of a moving car, a moving hot water cup, etc.
  • by comparing the first fire location coordinates with the second fire location coordinates in the above manner, it is possible to determine whether the unreal fire source comes from a stationary heat source or a moving heat source. At the same time, the above manner only calculates the RGB values for the suspicious location, which greatly reduces the amount of calculation and is convenient and fast.
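The coordinate comparison above can be sketched as a small classifier; the `tolerance` parameter is an illustrative assumption allowing for small coordinate noise, which the specification does not mandate.

```python
def classify_non_reflective_heat_source(first_coords, second_coords, tolerance=0):
    """Third judgment condition, sketched: if the second fire location
    coordinates are consistent with the first fire location coordinates,
    the interference is a stationary heat source (e.g. a parked car's
    engine or exhaust pipe); otherwise it is a moving heat source
    (e.g. a carried hot water cup)."""
    dx = abs(first_coords[0] - second_coords[0])
    dy = abs(first_coords[1] - second_coords[1])
    if dx <= tolerance and dy <= tolerance:
        return "stationary heat source"
    return "moving heat source"
```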
  • FIG. 6 is an exemplary schematic diagram of a fire situation judgment process according to some embodiments of this specification.
  • the fire condition judgment processing process includes:
  • the first fire position in the infrared thermal image may be mapped to the visible light image to determine a matching area.
  • determining the authenticity of the fire corresponding to the first fire location includes:
  • the next step of judging operation may be performed to determine the interference category of the unreal fire, including:
  • the interference category of the non-real fire corresponding to the first fire location is a non-reflective point heat source
  • S12 Determine whether the second fire location is consistent with the first fire location.
  • after determining that the fire corresponding to the first fire location is not a real fire and further determining the interference category of the non-real fire source, the fire situation is reported. For more information, see the related description above.
  • Visible light images can assist in verifying the authenticity of suspicious points identified as fires in the infrared thermal grayscale image, eliminating misjudgments in many situations (such as strong sun reflections, high-temperature locations like a stationary car's engine or exhaust pipe, and moving high-temperature objects) that would otherwise interfere with the judgment result, thereby improving the accuracy of fire judgment.
  • an embodiment of this specification also provides a fire situation judgment device, which includes at least one processor and at least one memory; the at least one memory is used to store computer instructions; the at least one processor is used to execute at least part of the computer instructions to implement the fire situation judgment method discussed above.
  • an embodiment of this specification also provides a storage medium, which stores computer instructions.
  • when the computer instructions are executed on a computer, the computer executes the fire situation judgment method discussed above.
  • various aspects of the fire situation determination method provided in the embodiments of this specification may also be implemented in the form of a program product, which includes a program code.
  • when the program product is run on the device, the program code is used to enable the control device to execute the steps of the fire situation determination method according to the various exemplary embodiments described above in this specification.
  • aspects of this specification may be performed entirely by hardware, entirely by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software.
  • the above hardware or software may be referred to as “data blocks”, “modules”, “engines”, “units”, “components” or “systems”, etc.
  • various aspects of this specification may be expressed as a computer product located in one or more computer-readable media, which includes computer-readable program code.
  • Computer storage media can be any computer-readable media that can be connected to an instruction execution system, device or apparatus to communicate, propagate or transport the program for use.
  • the program code on the computer storage medium can be transmitted via any suitable medium, including radio, cable, fiber optic cable, RF, or similar media, or any combination of the above media.
  • the computer program code required for the operation of the various parts of this specification can be written in any one or more programming languages.
  • the program code can be run entirely on the user's computer, or run on the user's computer as a separate software package, or run partially on the user's computer and partially on a remote computer, or run entirely on a remote computer or processing device.
  • the remote computer can be connected to the user's computer through any network form, such as a local area network (LAN) or a wide area network (WAN), or connected to an external computer (e.g., via the Internet), or in a cloud computing environment, or used as a service such as software as a service (SaaS).
  • in some embodiments, numbers describing the quantity of components and attributes are used. It should be understood that the numbers used to describe the embodiments are modified in some examples by the modifiers "about", "approximately" or "substantially". Unless otherwise specified, "about", "approximately" or "substantially" indicates that the number is allowed to vary by ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations, which may change according to the required features of individual embodiments. Although the numerical ranges and parameters used to establish the breadth of scope in some embodiments of this specification are approximations, in specific embodiments such numerical values are set as accurately as is feasible.


Abstract

Provided in the present invention are a fire determination method, system and apparatus, and a storage medium. The method comprises: determining a first fire location on the basis of a first image of a target area; determining a matched area of the first fire location in a second image of the target area and extracting color information of the matched area, a difference in the photographing times of the first image and the second image meeting a preset condition; and determining, on the basis of the color information, whether a fire corresponding to the first fire location is a real fire. The method can be implemented by means of the fire determination apparatus, or after computer instructions stored on a computer-readable storage medium are read. The technical solution provided by the embodiments of the present application can eliminate the interference of reflection points of sunlight and the like, stationary heat sources such as the engine and exhaust of a stationary automobile, and moving heat sources such as a moving hot water cup, thereby reducing the false determination rate of fires.

Description

A fire situation judgment method, system, device and storage medium
Cross Reference
This application claims priority to Chinese application No. 202211289815.7, filed on October 20, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
This specification relates to the field of fire monitoring and image processing, and in particular to a fire judgment method, system, device and storage medium.
Background Art
Under the influence of global warming, careless use of fire in daily life and other factors, natural and man-made fires are increasingly frequent, causing great harm to public safety and the social economy. In the early stage of a fire, accurate fire judgment and fire alarm forecasting can reduce the loss of life and property. Therefore, accurate fire judgment is of great significance.
At present, contact-type smoke-sensing, light-sensing and temperature-sensing fire detectors, which detect smoke concentration, light signals and temperature signals respectively, are limited by space, have low detection sensitivity in open outdoor areas, and suffer a high fire misjudgment rate. With the rapid development of digital image processing technology, fire judgment based on image processing can overcome shortcomings of contact detectors such as strong dependence on the environment, and has gradually become the main research direction in the fire protection field. Among such techniques, infrared thermal imaging uses an infrared detector to sense the heat emitted by objects and, together with an optical imaging objective, produces an infrared thermal image; by mapping the image one-to-one onto the heat distribution field on the object surface, high-temperature objects such as flames and the sun can be detected. This method is widely used in forest fire prevention, fire monitoring and other fields.
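As a toy illustration of the grayscale-to-temperature correspondence just described (the actual detection algorithm used in the embodiments is not fixed by this passage), a hotspot scan over a grayscale thermal image might look like:

```python
def find_hotspots(gray_image, threshold):
    """Return (row, col) coordinates of pixels whose grayscale value
    exceeds the threshold. `gray_image` is a list of rows of 0-255
    values; the threshold choice is an illustrative assumption."""
    return [
        (r, c)
        for r, row in enumerate(gray_image)
        for c, value in enumerate(row)
        if value > threshold
    ]
```

Hotspots found this way are only candidate fire locations; the rest of this specification addresses why such candidates (e.g. sun reflections, engines) need further verification.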
However, fire detection based solely on temperature level has a high false detection rate. For example, during detection, the sun or sunlight reflection points, as well as high-temperature objects such as car engines and exhaust pipes, are sometimes misjudged as fire sources, resulting in false fire alarms.
Therefore, it is desirable to provide a fire judgment method, system, device and storage medium that can eliminate the interference of heat sources such as the sun, sunlight reflection points and exhaust pipes on fire judgment, and effectively reduce the fire misjudgment rate.
Summary of the Invention
This application provides a fire judgment method, system, device and storage medium to solve the problem of false fire alarms caused by misjudging the sun, sunlight reflection points, or the positions of car engines and exhaust pipes as fire sources. The specific implementation schemes are as follows:
In a first aspect, the present disclosure provides a fire judgment method, the method comprising: determining a first fire location based on a first image of a target area; determining a matching area of the first fire location in a second image of the target area, and extracting color information of the matching area, wherein the shooting time difference between the first image and the second image satisfies a preset condition; and judging, based on the color information, whether the fire corresponding to the first fire location is a real fire.
In some embodiments, the first image is an infrared thermal image, and the second image is a visible light image.
In some embodiments, the color information includes RGB values, and judging, based on the color information, whether the fire corresponding to the first fire location is a real fire includes: in response to the RGB values satisfying a first judgment condition, determining that the fire corresponding to the first fire location is a real fire; and in response to the RGB values not satisfying the first judgment condition, determining that the fire corresponding to the first fire location is a non-real fire.
In some embodiments, the first judgment condition includes: the R value and the G value of the RGB values satisfy a first threshold range, and the B value satisfies a second threshold range.
In some embodiments, the method further comprises: in response to the fire corresponding to the first fire location being a non-real fire, determining an interference category of the non-real fire.
In some embodiments, determining the interference category of the non-real fire in response to the fire corresponding to the first fire location being a non-real fire includes: in response to the RGB values satisfying a second judgment condition, determining that the interference category is a reflective point heat source; and in response to the RGB values not satisfying the second judgment condition, determining that the interference category is a non-reflective point heat source.
In some embodiments, the second judgment condition includes: the R value, the G value and the B value of the RGB values satisfy a third threshold range.
In some embodiments, the non-reflective point heat source includes a moving heat source and a stationary heat source, and after determining that the interference category of the non-real fire corresponding to the first fire location is the non-reflective point heat source, the method further includes: determining a second fire location based on the first image, the second fire location and the first fire location corresponding to the same fire area; judging whether the second fire location and the first fire location satisfy a third judgment condition; in response to satisfying the third judgment condition, determining that the interference category is the stationary heat source; and in response to not satisfying the third judgment condition, determining that the interference category is the moving heat source.
In some embodiments, the third judgment condition includes: the second fire location is consistent with the first fire location.
In a second aspect, the present disclosure provides a fire judgment system, the system comprising: a determination module configured to determine a first fire location based on a first image of a target area; an extraction module configured to determine a matching area of the first fire location in a second image of the target area and extract color information of the matching area, wherein the shooting time difference between the first image and the second image satisfies a preset condition; and a judgment module configured to judge, based on the color information, whether the fire corresponding to the first fire location is a real fire.
In a third aspect, the present disclosure provides a fire judgment device, the device comprising at least one processor and at least one memory; the at least one memory is configured to store computer instructions; and the at least one processor is configured to execute at least part of the computer instructions to implement the fire judgment method.
In a fourth aspect, the present disclosure provides a computer-readable storage medium storing computer instructions; after a computer reads the computer instructions in the storage medium, the computer executes the fire judgment method.
The beneficial effects brought about by the above contents include, but are not limited to: (1) relevant features of the visible light image (for example, color information) assist in judging the authenticity of a fire location that has already been judged as a suspected fire in the infrared thermal grayscale image, which can eliminate misjudgment in many situations, such as strong sun reflection points, high-temperature locations such as a stationary car's engine or exhaust pipe, and moving high-temperature objects; (2) the RGB information of the visible light image can overcome the susceptibility of the infrared thermal grayscale image to weather, season and time of day, improving the accuracy of fire judgment.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an application scenario of a fire judgment system according to some embodiments of this specification;
FIG. 2 is an exemplary module diagram of a fire judgment system according to some embodiments of this specification;
FIG. 3 is an exemplary flowchart of a fire judgment method according to some embodiments of this specification;
FIG. 4A is an exemplary schematic diagram of an infrared thermal grayscale image according to some embodiments of this specification;
FIG. 4B is an exemplary schematic diagram of a visible light grayscale image according to some embodiments of this specification;
FIG. 5 is an exemplary schematic diagram of determining the interference category of a non-real fire according to some embodiments of this specification;
FIG. 6 is an exemplary schematic diagram of a fire judgment process according to some embodiments of this specification.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of this specification, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some examples or embodiments of this specification, and those of ordinary skill in the art can apply this specification to other similar scenarios based on these drawings without creative effort. Unless obvious from the context or otherwise stated, the same reference numerals in the figures represent the same structure or operation.
As used herein, "system", "device", "unit" and/or "module" is a way of distinguishing different components, elements, parts, portions or assemblies at different levels. However, these words may be replaced by other expressions if the other expressions achieve the same purpose.
Unless the context clearly indicates otherwise, the words "a", "an" and/or "the" are not limited to the singular and may also include the plural. Generally speaking, the terms "include" and "comprise" only indicate that the clearly identified steps and elements are included, and these steps and elements do not constitute an exclusive list; a method or device may also include other steps or elements.
Flowcharts are used in this specification to illustrate the operations performed by systems according to embodiments of this specification. It should be understood that the preceding or following operations are not necessarily performed precisely in order; instead, the steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to these processes, or one or more operations may be removed from them.
At present, in fire detection, infrared thermal imaging uses an infrared detector to sense the heat emitted by objects and, together with an optical imaging objective, produces an infrared thermal image; the image is then mapped one-to-one onto the heat distribution field on the object surface to detect high-temperature objects, such as fires. In this approach, when a circular detection algorithm is applied to the ignition point, the sun, sunlight reflection points, and the positions of car engines and exhaust pipes are easily misjudged as fire sources, leading to false fire alarms and an increased fire misjudgment rate.
Therefore, an embodiment of this application provides a fire judgment method that can eliminate interference from reflection points such as sunlight, stationary heat sources such as the engine exhaust of a stationary car, and moving heat sources such as a moving hot water cup, thereby reducing the fire misjudgment rate.
图1是根据本说明书一些实施例所示的火情判断系统的应用场景示意图。FIG1 is a schematic diagram of an application scenario of a fire situation judgment system according to some embodiments of this specification.
如图1所示,本说明书实施例所涉及的应用场景100可以包括热成像设备110、可见光相机120、处理器130、网络140、终端设备150以及存储设备160。在一些实施例中,火情判断系统可以通过实施本说明书中披露的方法和/或过程。As shown in Fig. 1, the application scenario 100 involved in the embodiments of this specification may include a thermal imaging device 110, a visible light camera 120, a processor 130, a network 140, a terminal device 150, and a storage device 160. In some embodiments, the fire situation determination system may implement the methods and/or processes disclosed in this specification.
热成像设备110可以用于获取红外热图像。在一些实施例中,热成像设备110可以实时监测并获取目标区域的红外热图像,热成像设备110可以对监控范围进行全方位或者定点的扫描,以获得监控范围内的任意区域的红外热图像。其中,热成像设备110可以通过非接触方式探测物体的红外能量,并将红外能量转换为电信号,进而形成红外热图像。在一些实施例中,红外热图像可以是灰度图,也可以是彩色图像。灰度图形式的红外热图像中,每个像素点的灰度值反映接收到的热辐射能量或温度高低。彩色图像形式的红外热图像中,可以用不同的颜色来直观的表示温度的高低,例如,可以用红色表示温度高,用蓝色表示温度低等。在一些实施例中,热成像设备110可以将获取的红外热图像发送给处理器130进行处理。在一些实施例中,热成像设备110可以将获取的红外热图像发送至存储设备160中进行存储。The thermal imaging device 110 can be used to obtain infrared thermal images. In some embodiments, the thermal imaging device 110 can monitor and obtain infrared thermal images of the target area in real time. The thermal imaging device 110 can perform an all-round or fixed-point scan of the monitoring range to obtain an infrared thermal image of any area within the monitoring range. Among them, the thermal imaging device 110 can detect the infrared energy of the object in a non-contact manner, and convert the infrared energy into an electrical signal to form an infrared thermal image. In some embodiments, the infrared thermal image can be a grayscale image or a color image. In the infrared thermal image in the form of a grayscale image, the grayscale value of each pixel reflects the received thermal radiation energy or temperature. In the infrared thermal image in the form of a color image, different colors can be used to intuitively represent the temperature. For example, red can be used to represent a high temperature, and blue can be used to represent a low temperature. In some embodiments, the thermal imaging device 110 can send the acquired infrared thermal image to the processor 130 for processing. In some embodiments, the thermal imaging device 110 can send the acquired infrared thermal image to the storage device 160 for storage.
可见光相机120可以用于获取可见光图像。例如,可见光相机可以是带有RGB sensor的成像系统的安防相机。在一些实施例中,可见光相机120可以实时监测并获取目标区域的可见光图像,可见光相机120可以对监控范围进行全方位或者定点的扫描,以获得监控范围 内的任意区域的可见光图像。其中,可见光相机120可以通过捕捉可见光谱范围内的光线来获取可见光图像。在一些实施例中,可见光图像可以是彩色图像。彩色图像形式的可见光图像中,每个像素点具有颜色信息,例如RGB值等。更多说明参见图3及其相关描述。在一些实施例中,可见光相机120可以将获取的可见光图像发送给处理器130进行处理。在一些实施例中,可见光相机120可以将获取的可见光图像发送至存储设备160中进行存储。The visible light camera 120 can be used to obtain visible light images. For example, the visible light camera can be a security camera with an imaging system of an RGB sensor. In some embodiments, the visible light camera 120 can monitor and obtain visible light images of the target area in real time. The visible light camera 120 can perform an all-round or fixed-point scan of the monitoring range to obtain the monitoring range. The visible light camera 120 can acquire a visible light image of any area within the visible light spectrum. Among them, the visible light camera 120 can acquire a visible light image by capturing light within the visible light spectrum. In some embodiments, the visible light image can be a color image. In a visible light image in the form of a color image, each pixel has color information, such as RGB value, etc. For more explanation, see Figure 3 and its related description. In some embodiments, the visible light camera 120 can send the acquired visible light image to the processor 130 for processing. In some embodiments, the visible light camera 120 can send the acquired visible light image to the storage device 160 for storage.
处理器130可以处理从热成像设备110、可见光相机120、终端设备150和/或存储设备160获取的数据和/或信息。例如,处理器130可以获取目标区域的第一图像和第二图像。又例如,处理器130可以基于目标区域的第一图像确定第一火情位置,确定第一火情位置在目标区域的第二图像中的匹配区域,并提取匹配区域的颜色信息。又例如,处理器130可以基于颜色信息,判断第一火情位置对应的火情是否为真实火情。The processor 130 may process data and/or information obtained from the thermal imaging device 110, the visible light camera 120, the terminal device 150, and/or the storage device 160. For example, the processor 130 may obtain a first image and a second image of a target area. For another example, the processor 130 may determine a first fire location based on the first image of the target area, determine a matching area of the first fire location in the second image of the target area, and extract color information of the matching area. For another example, the processor 130 may determine whether the fire corresponding to the first fire location is a real fire based on the color information.
在一些实施例中,处理器130可以是单一服务器或服务器组。服务器组可以是集中式的或分布式的。在一些实施例中,处理器130可以是本地或远程的。例如,处理器130可以通过网络140从热成像设备110、可见光相机120、终端设备150和/或存储设备160访问信息和/或数据。又例如,处理器130可以直接连接到热成像设备110、可见光相机120、终端设备150和/或存储设备160,以访问信息和/或数据。在一些实施例中,处理器130可以在云平台上实现。例如,云平台可以包括私有云、公共云、混合云、社区云、分布式云、云间云、多云等或其任意组合。In some embodiments, the processor 130 may be a single server or a server group. The server group may be centralized or distributed. In some embodiments, the processor 130 may be local or remote. For example, the processor 130 may access information and/or data from the thermal imaging device 110, the visible light camera 120, the terminal device 150, and/or the storage device 160 via the network 140. For another example, the processor 130 may be directly connected to the thermal imaging device 110, the visible light camera 120, the terminal device 150, and/or the storage device 160 to access information and/or data. In some embodiments, the processor 130 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud cloud, a multi-cloud, etc., or any combination thereof.
The network 140 may include any suitable network capable of facilitating the exchange of information and/or data. In some embodiments, one or more components of the application scenario 100 may exchange information and/or data with one another through the network 140. For example, the processor 130 may receive, through the network 140, an infrared thermal image captured by the thermal imaging device 110 and a visible light image captured by the visible light camera 120. The network 140 may include a local area network (LAN), a wide area network (WAN), a wired network, a wireless network, or the like, or any combination thereof.
The terminal device 150 may communicate and/or connect with the thermal imaging device 110, the visible light camera 120, and/or the storage device 160. For example, the terminal device 150 may send one or more control instructions to the thermal imaging device 110 and/or the visible light camera 120 through the network 140 to control them to capture infrared thermal images and/or visible light images of the target area according to the instructions. In some embodiments, the terminal device 150 may include one of, or any combination of, a mobile device 150-1, a tablet computer 150-2, a laptop computer 150-3, a desktop computer 150-4, or other devices with input and/or output functions. In some embodiments, the terminal device 150 may include an input device, an output device, etc. The input device may include a keyboard, a touch screen, a mouse, a voice device, or the like, or any combination thereof. The output device may include a display, a speaker, a printer, or the like, or any combination thereof. In some embodiments, the terminal device 150 may be a part of the processor 130. In some embodiments, the terminal device 150 may be integrated with the processor 130 as an operating console of the fire determination system.
The storage device 160 may store data, instructions, and/or any other information. In some embodiments, the storage device 160 may store data acquired from the thermal imaging device 110, the visible light camera 120, the processor 130, and/or the terminal device 150. For example, the storage device 160 may store infrared thermal images and/or visible light images acquired from the thermal imaging device 110 and/or the visible light camera 120. In some embodiments, the storage device 160 may store data and/or instructions that the processor 130 executes or uses to perform the exemplary methods described in this specification. For example, the storage device 160 may store instructions for controlling the thermal imaging device 110 and the visible light camera 120 to capture images.
In some embodiments, the storage device 160 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. In some embodiments, the storage device 160 may be implemented on a cloud platform. In some embodiments, the storage device 160 may be a part of the processor 130.
It should be noted that the above description is provided for illustrative purposes only and is not intended to limit the scope of this specification. For those of ordinary skill in the art, various changes and modifications may be made under the guidance of the contents of this specification. The features, structures, methods, and other characteristics of the exemplary embodiments described in this specification may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the thermal imaging device 110, the visible light camera 120, the processor 130, and the terminal device 150 may share one storage device 160, or may each have their own storage device. However, such changes and modifications do not depart from the scope of this specification.
FIG. 2 is an exemplary block diagram of a fire determination system according to some embodiments of the present specification. In some embodiments, as shown in FIG. 2, the fire determination system 200 may include a determination module 210, an extraction module 220, and a judgment module 230. In some embodiments, the fire determination system 200 may be implemented by the processor 130.
In some embodiments, the determination module 210 may be configured to determine a first fire location based on a first image of the target area. For more details about the first image and the first fire location, see FIG. 3 and its related description.
In some embodiments, the extraction module 220 may be configured to determine a matching area of the first fire location in a second image of the target area and extract color information of the matching area, wherein the difference between the capture times of the first image and the second image satisfies a preset condition. For more details about the second image, the matching area, the color information, etc., see FIG. 3 and its related description.
In some embodiments, the judgment module 230 may be configured to determine, based on the color information of the matching area, whether the fire corresponding to the first fire location is a real fire.
In some embodiments, the color information includes RGB values, and the judgment module 230 may be further configured to: in response to the RGB values satisfying a first judgment condition, determine that the fire corresponding to the first fire location is a real fire; and in response to the RGB values not satisfying the first judgment condition, determine that the fire corresponding to the first fire location is not a real fire.
In some embodiments, the judgment module 230 may be further configured to: in response to the fire corresponding to the first fire location not being a real fire, determine an interference category of the non-real fire. In some embodiments, the judgment module 230 may be further configured to: in response to the RGB values satisfying a second judgment condition, determine that the interference category is a reflective-point heat source; and in response to the RGB values not satisfying the second judgment condition, determine that the interference category is a non-reflective-point heat source.
In some embodiments, the non-reflective-point heat source includes a moving heat source and a stationary heat source. After determining that the interference category of the non-real fire corresponding to the first fire location is the non-reflective-point heat source, the judgment module 230 may be further configured to: determine a second fire location based on the first image, the second fire location and the first fire location corresponding to the same fire area; determine whether the second fire location and the first fire location satisfy a third judgment condition; in response to the third judgment condition being satisfied, determine that the interference category is a stationary heat source; and in response to the third judgment condition not being satisfied, determine that the interference category is a moving heat source.
For more details about determining whether the fire corresponding to the first fire location is a real fire, see FIG. 3, FIG. 5, and their related descriptions.
It should be noted that the above description of the fire determination system and its modules is provided merely for convenience of description and does not limit this specification to the scope of the illustrated embodiments. It can be understood that, for those skilled in the art, after understanding the principle of the system, the various modules may be arbitrarily combined, or a subsystem may be formed and connected with other modules, without departing from this principle. In some embodiments, the determination module 210, the extraction module 220, and the judgment module 230 disclosed in FIG. 2 may be different modules in one system, or one module may implement the functions of two or more of the above modules. For example, the modules may share one storage module, or each module may have its own storage module. Such variations are all within the scope of protection of this specification.
FIG. 3 is an exemplary flowchart of a fire determination method according to some embodiments of the present specification.
In some embodiments, the fire determination method may be executed by the processor 130 or the fire determination system 200. For example, the process 300 may be stored in a storage device (e.g., the storage device 160) in the form of a program or instructions, and the process 300 may be implemented when the processor 130 or the fire determination system 200 executes the program or instructions. The operations of the process 300 presented below are illustrative. In some embodiments, the process may be completed with one or more additional operations not described and/or without one or more of the operations discussed. In addition, the order of the operations of the process 300 shown in FIG. 3 and described below is not restrictive.
In step 310, a first fire location is determined based on a first image of a target area. In some embodiments, step 310 may be performed by the processor 130 or the determination module 210.
The target area refers to an area in which a fire situation is to be determined. The fire situation may include a location where a fire is suspected to have occurred, whether the fire is real, and the like.
The target area may take various forms. For example, the target area may be an outdoor square, an outdoor parking lot, an indoor plaza, a shopping mall, etc. The description of the target area is merely illustrative and does not limit the implementations.
The first image refers to image data used for determining the fire situation.
In some embodiments, the first image is an infrared thermal image. For example, the first image may be an infrared thermal image in the form of a color image.
In some embodiments, the first image may be captured by a monitoring device deployed at a monitoring position. The monitoring position may be the location where the monitoring device is installed. In some embodiments, the monitoring device may include a thermal imaging device. In an optional application scenario, a plurality of monitoring columns are distributed across an outdoor square, a pan-tilt is arranged at the top of each monitoring column, and a monitoring device is mounted on the pan-tilt; the pan-tilt can rotate so that the monitoring devices can capture infrared thermal images of the outdoor square. In this application scenario, the monitoring position may be the position of the pan-tilt. Multiple monitoring devices may be installed on pan-tilts at different positions, and the monitoring devices on different pan-tilts may exchange information with a monitoring center through a network. For more details about the thermal imaging device, see FIG. 1 and its related description.
The first fire location refers to a location area where a fire is suspected to have occurred.
In some embodiments, the processor 130 may process the first image of the target area in various ways to determine the first fire location. In some embodiments, the processor 130 may apply grayscale processing to the original infrared thermal image (e.g., an infrared thermal image in color form) to obtain an infrared thermal image in grayscale form (hereinafter referred to as the infrared thermal grayscale image), and determine the first fire location in the infrared thermal grayscale image according to a fire-location judgment approach.
In some embodiments, the fire-location judgment approach includes: when one or more pixels in the infrared thermal grayscale image have grayscale values greater than a preset grayscale threshold, determining the location area formed by these pixels as the first fire location. In some embodiments, the fire-location judgment approach includes: determining a location area formed by a plurality of pixels whose grayscale values are greater than the preset grayscale threshold; determining temperature statistics of the location area (e.g., the standard deviation, the mean deviation, etc.); determining a fluctuation feature of the location area based on the temperature statistics (e.g., determining the ratio of the temperature standard deviation to the temperature mean deviation as the fluctuation feature); and, when the fluctuation feature (e.g., the aforementioned ratio of the temperature standard deviation to the temperature mean deviation) is greater than a preset ratio, determining the location area as the first fire location. The preset grayscale threshold and the preset ratio may be system defaults, empirical values, manually preset values, etc., or any combination thereof, and may be set according to actual needs; this specification does not limit them.
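The second judgment approach above can be sketched in a few lines. This is only an illustrative sketch: the function name and both preset values are assumptions, since the specification leaves the grayscale threshold and the preset ratio to system defaults, empirical values, or manual settings.

```python
from statistics import mean, pstdev

GRAY_THRESHOLD = 200   # assumed preset grayscale threshold
RATIO_THRESHOLD = 1.2  # assumed preset ratio for the fluctuation feature

def is_fire_location(gray_values, temperatures):
    """Decide whether a candidate region is a first fire location.

    gray_values / temperatures: per-pixel readings of the candidate region
    in the infrared thermal grayscale image.
    """
    # Keep only pixels whose grayscale value exceeds the preset threshold.
    hot = [t for g, t in zip(gray_values, temperatures) if g > GRAY_THRESHOLD]
    if not hot:
        return False
    std = pstdev(hot)                            # temperature standard deviation
    avg = mean(hot)
    mean_dev = mean(abs(t - avg) for t in hot)   # temperature mean deviation
    if mean_dev == 0:
        return False
    # Fluctuation feature: ratio of standard deviation to mean deviation.
    return std / mean_dev > RATIO_THRESHOLD
```

Note that the standard deviation is never smaller than the mean (absolute) deviation, so the ratio is always at least 1 for non-constant data; a meaningful preset ratio would therefore sit above 1.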
It should be noted that the embodiments of this specification do not specifically limit the grayscale processing approach or the fire-location judgment approach; operations well known to those skilled in the art may be used.
In some embodiments, after the first fire location is determined, the processor 130 may determine a reference coordinate point in the infrared thermal grayscale image according to a reference-coordinate-point determination approach, determine the first fire location coordinates of the first fire location according to the reference coordinate point and a rectangular coordinate system established based on it, and determine, according to a fire-area division approach, a fire area spreading outward with the first fire location as its center.
In some embodiments, the fire-area division approach includes: taking, as the fire area, the rectangular region obtained by spreading outward from the fire location as the center by a preset distance value. In some embodiments, the fire-area division approach includes: determining, as the fire area, the region formed by joining a plurality of pixels whose grayscale difference from the grayscale value at the fire location is within a preset difference range. The preset distance value and the preset difference range may be system defaults, empirical values, manually preset values, etc., or any combination thereof, and may be set according to actual needs; this specification does not limit them.
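The first division rule (a rectangle spreading outward by a preset distance) can be illustrated as follows. The function name and the preset distance are assumptions; the bounds are clamped to the image so the fire area never leaves the frame.

```python
PRESET_DISTANCE = 2  # assumed preset distance value, in pixels

def fire_area(center, width, height, d=PRESET_DISTANCE):
    """Return the pixel bounds (x0, y0, x1, y1) of the fire area obtained
    by spreading outward from the fire location `center` by `d` pixels,
    clamped to an image of size `width` x `height`."""
    cx, cy = center
    x0, y0 = max(cx - d, 0), max(cy - d, 0)
    x1, y1 = min(cx + d, width - 1), min(cy + d, height - 1)
    return x0, y0, x1, y1
```

For the example of FIG. 4A, with position W at (5, 5) this would yield a small square centered on W as the fire area Q.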
It should be noted that the embodiments of this specification do not specifically limit the fire-area division approach; operations well known to those skilled in the art may be used.
In some embodiments, the reference-coordinate-point determination approach includes: determining the reference coordinate point based on the upper-left corner of the infrared thermal grayscale image. In some embodiments, the reference-coordinate-point determination approach further includes: determining the reference coordinate point based on the lower-left corner of the infrared thermal grayscale image. The embodiments of this specification do not limit the specific reference-coordinate-point determination approach.
For example, FIG. 4A is the infrared thermal grayscale image obtained by applying grayscale processing to the original infrared thermal image of the target area captured by the thermal imaging device. A position W in this grayscale image is determined as the first fire location through the fire-location judgment condition. A rectangular coordinate system is established with the lower-left corner of the grayscale image as the origin (0, 0), and the coordinates of the first fire location (i.e., position W) are determined to be (5, 5); these coordinates are the first fire location coordinates. The rectangular region spreading outward with position W as its center is the fire area Q.
In step 320, a matching area of the first fire location in a second image of the target area is determined, and color information of the matching area is extracted. In some embodiments, step 320 may be performed by the processor 130 or the extraction module 220.
The second image refers to image data used for determining the fire situation. The second image is different from the first image.
In some embodiments, the second image is a visible light image. For example, the second image may be a visible light image in the form of a color image.
In some embodiments, the second image may be acquired after the first fire location is determined. In some embodiments, the difference between the capture times of the first image and the second image satisfies a preset condition. In some embodiments, the preset condition may be that the capture time difference between the first image and the second image is less than a preset time difference threshold. The preset condition and the preset time difference threshold may be set based on actual needs and are not limited here.
In some embodiments, the second image may be captured by a monitoring device deployed at the monitoring position. In some embodiments, the monitoring device may further include a visible light camera. For example, a security camera based on an imaging system with an RGB sensor may capture a raw image corresponding to the target area, and the raw image may be processed through a raw-image processing approach to obtain the corresponding visible light image. For more details about the visible light camera, see FIG. 1 and its related description.
It should be noted that the embodiments of this specification do not specifically limit the raw-image processing approach; operations well known to those skilled in the art may be used.
The matching area refers to the partial image region in the second image that corresponds to the fire area of the first fire location.
In some embodiments, the processor 130 may determine the matching area of the first fire location in the second image of the target area in various ways. For example, the processor 130 may register the first image and the second image through an image registration algorithm, and then determine the matching area of the first fire location in the second image according to the fire area of the first fire location in the first image. The image registration algorithm includes, but is not limited to, a template-based matching algorithm, a grayscale-based matching algorithm, a feature-based matching algorithm, etc. As another example, the processor 130 may determine a reference coordinate point of the second image through the reference-coordinate-point determination approach and construct a rectangular coordinate system based on it; the fire area from the first image is then marked in the second image with this rectangular coordinate system as the reference, thereby obtaining the matching area.
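The coordinate-based variant can be sketched as below. This sketch rests on a simplifying assumption not stated in the text: that the two images share the same origin convention and differ only in resolution, so the fire-area bounds can be transferred by per-axis scaling. When the two views are not aligned, a full image registration algorithm would be needed instead.

```python
def matching_area(fire_bounds, first_size, second_size):
    """Map fire-area bounds (x0, y0, x1, y1) from the first image onto the
    second image, assuming both images use the same reference coordinate
    point (e.g. the lower-left corner) and cover the same field of view."""
    sx = second_size[0] / first_size[0]  # horizontal scale between the images
    sy = second_size[1] / first_size[1]  # vertical scale between the images
    x0, y0, x1, y1 = fire_bounds
    return (round(x0 * sx), round(y0 * sy), round(x1 * sx), round(y1 * sy))
```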
For example, FIG. 4B is the visible light grayscale image of the target area (i.e., the second image in grayscale form, acquired by the visible light camera). A rectangular coordinate system is established with the lower-left corner of the visible light grayscale image as the origin (0, 0), and the fire area determined in step 310, i.e., the rectangular region Q in the infrared thermal grayscale image of FIG. 4A, is marked with a rectangle in the visible light grayscale image, thereby obtaining the matching area Q'. It should be noted that FIG. 4A and FIG. 4B are, respectively, the infrared thermal grayscale image and the visible light grayscale image of the same target area; the grayscale values of the pixels in the infrared thermal grayscale image are related to temperature, whereas there is no specific correspondence between the grayscale values of the pixels in the visible light grayscale image and temperature.
The color information refers to information about the colors contained in the matching area.
In some embodiments, the color information may include RGB values. The RGB values of the matching area may be obtained in various ways; the embodiments of this specification do not specifically limit the extraction of RGB values, and operations well known to those skilled in the art may be used.
In some embodiments, the color information of the matching area may include the color information of each pixel in the matching area. In some embodiments, the color information of the matching area may be a statistical value of the color information of the pixels in the matching area, such as a mean, an extreme value, or a median.
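The mean statistic mentioned above can be illustrated with a short helper (the function name is an assumption introduced for this sketch):

```python
def region_rgb_mean(pixels):
    """pixels: iterable of (R, G, B) tuples from the matching area.
    Returns the per-channel mean, rounded to the nearest integer."""
    n = 0
    totals = [0, 0, 0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        n += 1
    return tuple(round(t / n) for t in totals)
```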
In step 330, whether the fire corresponding to the first fire location is a real fire is determined based on the color information. In some embodiments, step 330 may be performed by the processor 130 or the judgment module 230.
In some embodiments, the processor 130 may determine, based on the RGB values in the color information, whether the fire corresponding to the first fire location is a real fire. In some embodiments, the processor 130 may make this determination based on the RGB values of the individual pixels in the matching area. For example, in response to the RGB values of at least a preset number of pixels in the matching area satisfying a specific judgment condition, the fire corresponding to the first fire location is determined to be a real fire, and otherwise not a real fire. As another example, in response to the mean of the RGB values of the pixels satisfying a specific judgment condition, the fire corresponding to the first fire location is determined to be a real fire, and otherwise not a real fire.
In some embodiments, the processor 130 may, in response to the RGB values satisfying a first judgment condition, determine that the fire corresponding to the first fire location is a real fire; and, in response to the RGB values not satisfying the first judgment condition, determine that the fire corresponding to the first fire location is not a real fire.
The first judgment condition refers to a judgment condition used for determining whether the fire corresponding to the first fire location is a real fire.
In some embodiments, the first judgment condition includes: the R value and the G value of the RGB values fall within a first threshold range, and the B value falls within a second threshold range. The first threshold range may be [a, b] and the second threshold range may be [c, d], where d &lt; a and a, b, c, d ∈ (0, 255). Merely by way of example, the first threshold range may be [250, 255] and the second threshold range may be [210, 230]; preferably, the second threshold range may be [215, 225]. In some embodiments, a and b may be equal. Merely by way of example, the first threshold range may be the single value 255.
For example, after mean processing, the R, G, and B values of the rectangular region Q' in FIG. 4B are 255, 254, and 252, respectively; here the R and G values fall within the first threshold range but the B value does not fall within the second threshold range, so it can be determined that the R, G, and B values of the matching area do not satisfy the first judgment condition. As another example, after mean processing, the R, G, and B values of the rectangular region Q' in FIG. 4B are 255, 254, and 220, respectively; here the R and G values fall within the first threshold range and the B value falls within the second threshold range, so it can be determined that the R, G, and B values of the matching area satisfy the first judgment condition.
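The comparison in these two examples can be expressed as a short check. The threshold ranges below use the example values from the text ([250, 255] for R and G, the preferred [215, 225] for B); they are illustrative and not fixed by the specification.

```python
R_G_RANGE = (250, 255)  # example first threshold range for R and G
B_RANGE = (215, 225)    # example (preferred) second threshold range for B

def satisfies_first_condition(r, g, b):
    """Return True when the (mean) RGB values of the matching area satisfy
    the first judgment condition, i.e. the fire is judged to be real."""
    r_g_ok = (R_G_RANGE[0] <= r <= R_G_RANGE[1]
              and R_G_RANGE[0] <= g <= R_G_RANGE[1])
    b_ok = B_RANGE[0] <= b <= B_RANGE[1]
    return r_g_ok and b_ok
```

Applied to the two examples above, (255, 254, 220) satisfies the condition while (255, 254, 252) does not.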
Extracting the R, G, and B values of the matching area in the above manner and comparing them with the first judgment condition improves the accuracy of fire-location determination and reduces the misjudgment rate. Moreover, only the RGB values of the suspected fire points are computed, which effectively reduces the amount of computation.
In some embodiments, the first threshold range and the second threshold range may be determined in advance. For example, they may be system defaults, empirical values, manually preset values, etc., or any combination thereof, and may be set according to actual needs; this specification does not limit them.
In some embodiments, the processor 130 may also dynamically determine the corresponding first threshold range and second threshold range according to the actual conditions of the target area. Different actual conditions correspond to different first threshold ranges and/or different second threshold ranges, and the corresponding first threshold range and second threshold range can be dynamically determined according to the actual conditions.
In some embodiments, the processor 130 may acquire a visible light image sequence of the target area and determine the first threshold range and the second threshold range based on the visible light image sequence.
In some embodiments, the acquisition of the visible light image sequence may be triggered after the second image is acquired. After acquiring the second image, the processor 130 may control the visible light camera to capture visible light images of the target area to obtain the visible light image sequence.
在一些实施例中,处理器130可以控制可见光相机对目标区域进行多次拍摄获取可见光图像序列。例如,处理器130可以控制可见光相机每隔预设时间间隔对目标区域进行一次图像采集,获取一张可见光图像,进而得到可见光图像序列。In some embodiments, the processor 130 may control the visible light camera to capture the target area multiple times to obtain a visible light image sequence. For example, the processor 130 may control the visible light camera to capture an image of the target area once at a preset time interval to obtain a visible light image, thereby obtaining a visible light image sequence.
在一些实施例中,可见光图像序列可以以第一抓拍频率拍摄获取。处理器130可以控制可见光相机按照第一抓拍频率对目标区域进行多次拍摄获取可见光图像序列。In some embodiments, the visible light image sequence may be captured at a first capture frequency. The processor 130 may control the visible light camera to capture the target area multiple times at the first capture frequency to acquire the visible light image sequence.
第一抓拍频率是用于获取可见光图像序列的抓拍频率。例如,第一抓拍频率可以是x张/min等,x为正整数。The first capture frequency is a capture frequency used to acquire a visible light image sequence. For example, the first capture frequency may be x images/min, etc., where x is a positive integer.
在一些实施例中,第一抓拍频率可以基于先验知识或历史数据预设得到。例如,可以将历史数据中使用次数最多的抓拍频率确定为第一抓拍频率。In some embodiments, the first snapshot frequency may be preset based on prior knowledge or historical data. For example, the snapshot frequency that is used most times in historical data may be determined as the first snapshot frequency.
In some embodiments, the processor 130 may determine the first capture frequency according to the actual conditions of the target area. In some embodiments, the processor 130 may acquire a preset number of initial visible-light images at a second capture frequency; determine an ambient-light-intensity sequence and a difference-distribution sequence based on those images; and determine the first capture frequency based on the two sequences.
The second capture frequency is a preliminarily determined capture frequency used to determine the first capture frequency: images are first acquired at the second capture frequency and then analyzed to determine the first capture frequency. The second capture frequency may be the same as or different from the first capture frequency.
In some embodiments, the second capture frequency may be preset based on prior knowledge or historical data. For example, the most frequently used capture frequency in the historical data may be taken as the second capture frequency.
In some embodiments, the preset number may be a system default, an empirical value, a manually preset value, or any combination thereof, set according to actual needs; this specification places no restriction on this.
An initial visible-light image is a visible-light image acquired at the second capture frequency.
The ambient-light-intensity sequence is the sequence formed by the ambient light intensities of the preset number of initial visible-light images. In some embodiments, the processor 130 may obtain the ambient light intensity of each of the initial visible-light images to form the sequence. See below for more on obtaining ambient light intensity.
The difference-distribution sequence is the sequence formed by the difference distributions of initial visible-light images captured at adjacent moments among the preset number of images. In some embodiments, the difference-distribution sequence includes multiple difference distributions, each determined from the image information of initial visible-light images captured at adjacent moments.
In some embodiments, the processor 130 may semantically segment two initial visible-light images captured at adjacent moments to obtain multiple semantic-block pairs; compute a pixel-difference matrix for each semantic-block pair; extract statistical features from each pixel-difference matrix to obtain a difference statistic; and collect the difference statistics of the multiple pixel-difference matrices into one difference distribution. Applying this processing to every two adjacent images among the preset number of initial visible-light images yields the difference-distribution sequence.
A semantic-block pair consists of two semantic blocks from different visible-light images that correspond to each other, where correspondence means the two blocks depict the same object. For example, if a semantic block (say, a car) is recognized in visible-light image A and a semantic block depicting the same car is recognized in visible-light image B, the two blocks correspond. Merely as an example, two adjacent visible-light images A and B are segmented into semantic blocks a1, a2, a3 and b1, b2, b3 respectively, where a1 corresponds to b1, a2 to b2, and a3 to b3.
In some embodiments, the processor 130 may semantically segment the two adjacent visible-light images by any of various feasible means, such as a semantic-segmentation algorithm or a semantic-segmentation model (e.g., a machine learning model), to obtain the semantic-block pairs. The embodiments of this specification place no special restriction on the algorithm or model; operations well known to those skilled in the art suffice.
In some embodiments, when two semantic blocks correspond, the pixels within them may also correspond. Two pixels correspond when they occupy the same relative position, i.e., the same position among the pixels of their respective semantic blocks. For example, if pixel R in semantic block X of visible-light image A corresponds to pixel R' in semantic block X' of visible-light image B, then R and R' occupy the same relative position, e.g., R is the pixel in the third row and third column of semantic block X, and R' is the pixel in the third row and third column of semantic block X'.
The pixel-difference matrix is the matrix formed by the pixel differences between each pair of corresponding pixels in a semantic-block pair.
In some embodiments, the processor 130 may compute the difference between the two pixels at each corresponding pixel position of the semantic-block pair and assemble the per-position differences into the pixel-difference matrix. In some embodiments, the difference between two pixels may be computed with a distance function, including but not limited to Euclidean distance, cosine distance, etc.
A difference statistic is a statistic obtained by extracting statistical features from the pixel-difference matrix, e.g., the mean or variance of the entries of the matrix.
In some embodiments, the processor 130 may extract statistical features from the pixel-difference matrix with a statistical-feature-extraction algorithm, such as an algorithm for computing a mean or a variance; no restriction is placed here.
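The pixel-difference and difference-statistic steps above can be sketched as follows. This is a minimal illustration under stated assumptions: the semantic segmentation itself is not shown (the two blocks of a pair are assumed already aligned and of equal shape), Euclidean distance is chosen as the distance function, and the mean is chosen as the difference statistic; the function names are hypothetical.

```python
import math

def pixel_difference_matrix(block_a, block_b):
    """Per-pixel difference between two corresponding semantic blocks of
    the same shape; here the Euclidean distance between (R, G, B) tuples."""
    return [[math.dist(pa, pb) for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(block_a, block_b)]


def difference_statistic(matrix):
    """Statistical feature of a pixel-difference matrix; the mean is used
    here, though a variance would fit the text equally well."""
    flat = [v for row in matrix for v in row]
    return sum(flat) / len(flat)


def difference_distribution(block_pairs):
    """One difference statistic per semantic-block pair; together they form
    the difference distribution for one pair of adjacent images."""
    return [difference_statistic(pixel_difference_matrix(a, b))
            for a, b in block_pairs]
```

Repeating `difference_distribution` over every two adjacent images of the initial sequence would yield the difference-distribution sequence described above.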
In some embodiments, the processor 130 may determine the first capture frequency based on the ambient-light-intensity sequence and the difference-distribution sequence in a variety of ways.
In some embodiments, the processor 130 may determine the first capture frequency by vector retrieval: construct a vector to be matched from the ambient-light-intensity sequence and the difference-distribution sequence; perform vector matching in a vector database based on the vector to be matched to determine an associated vector; and determine the first capture frequency from the associated vector.
In some embodiments, the vector database may include multiple reference vectors and their corresponding reference capture frequencies. The reference vectors may be constructed from historical data, e.g., from multiple historical ambient-light-intensity sequences and multiple historical difference-distribution sequences. The reference capture frequency of each reference vector may be labeled manually based on prior knowledge and the actual historical capture frequency.
In some embodiments, the processor 130 may find, in the vector database, a reference vector that meets a preset matching condition for the vector to be matched, take that reference vector as the associated vector, and take the associated vector's reference capture frequency as the first capture frequency of the vector to be matched. The preset matching condition is the criterion for determining the associated vector; in some embodiments it may include a vector distance below a distance threshold, a minimum vector distance, etc.
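The retrieval step can be sketched as a nearest-neighbor search. This is an illustrative sketch, not the claimed implementation: the reference "database" is an in-memory list, Euclidean distance is assumed as the matching metric, and the function names are hypothetical.

```python
import math

def build_query_vector(light_seq, diff_distributions):
    """Concatenate the ambient-light-intensity sequence and the flattened
    difference-distribution sequence into one vector to be matched."""
    return list(light_seq) + [v for dist in diff_distributions for v in dist]


def retrieve_capture_frequency(query, references, max_distance=None):
    """Nearest reference vector by Euclidean distance; its labeled capture
    frequency becomes the first capture frequency.

    references: list of (reference_vector, reference_frequency) pairs.
    With max_distance set (the distance-threshold matching condition), a
    nearest match farther than the threshold yields None."""
    dist, freq = min((math.dist(query, vec), f) for vec, f in references)
    if max_distance is not None and dist > max_distance:
        return None
    return freq
```

A production system would replace the linear scan with an approximate-nearest-neighbor index, but the matching condition (minimum distance, optionally bounded by a threshold) is the same.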
In some embodiments, the processor 130 may determine the first capture frequency through a frequency determination model, based on the ambient-light-intensity sequence and the difference-distribution sequence.
In some embodiments, the frequency determination model may be a machine learning model, such as a deep neural network (DNN). In some embodiments, the input of the frequency determination model includes the ambient-light-intensity sequence and the difference-distribution sequence, and its output includes the first capture frequency.
In some embodiments, the frequency determination model may be trained by various methods, e.g., gradient descent, on multiple first training samples bearing first labels, updating the model parameters. Merely as an example, the first training samples are fed into an initial frequency determination model, a loss function is constructed from the first labels and the model's outputs, and the model parameters are updated iteratively based on the loss function. Training completes when the loss function meets a preset condition, such as convergence or the iteration count reaching a threshold, yielding the trained frequency determination model.
In some embodiments, a first training sample includes the sample ambient-light-intensity sequence and sample difference-distribution sequence corresponding to a sample initial visible-light image sequence of a sample area (the sequence comprising multiple initial visible-light images acquired at a sample second capture frequency), and the first label includes the first capture frequency used for visible-light image acquisition of the sample area. In some embodiments, the sample area may be an area for which a first fire location was determined. The first training samples and first labels may be determined from historical data: for example, an area with a determined first fire location serves as the sample area, and the ambient-light-intensity sequence and difference-distribution sequence of its historical initial visible-light image sequence serve as the first training sample. In some embodiments, the processor 130 may take, as the first label of a sample, the lowest capture frequency at which a real fire could be identified under that sample.
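The training loop described above can be sketched as follows. This is a heavily simplified stand-in, not the claimed model: a linear model replaces the DNN, the loss is assumed to be mean squared error, and training runs for a fixed epoch budget rather than a convergence test; all names are hypothetical.

```python
def train_frequency_model(samples, labels, lr=0.05, epochs=2000):
    """Gradient-descent training sketch for the frequency determination model.

    samples: list of feature vectors built from the ambient-light-intensity
    and difference-distribution sequences; labels: first capture frequencies.
    Returns a predict(features) -> frequency function."""
    n, d = len(samples), len(samples[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        # Forward pass: prediction error against the first labels,
        # computed once per epoch with the current parameters.
        errs = [sum(wj * xj for wj, xj in zip(w, x)) + b - y
                for x, y in zip(samples, labels)]
        # Gradient step on the MSE loss for every parameter.
        for j in range(d):
            w[j] -= lr * (2 / n) * sum(e * x[j] for e, x in zip(errs, samples))
        b -= lr * (2 / n) * sum(errs)
    return lambda x: sum(wj * xj for wj, xj in zip(w, x)) + b
```

The construct-loss-then-update structure is the same one the text describes for the DNN; only the model family differs.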
In some embodiments of this specification, by first photographing multiple initial visible-light images of the target area and then determining their ambient-light-intensity sequence and difference-distribution sequence, one can effectively analyze whether the ambient light of the target area is relatively stable or changes frequently (the changing state of the ambient light), and whether the objects in the target area are stationary or moving (the motion state of the objects). Analyzing these states allows the influence of such factors on real-fire judgment to be fully considered, so that a reasonable first capture frequency is determined and a reasonable visible-light image sequence is obtained, making the first and second threshold ranges subsequently set from the sequence more accurate and reasonable.
The processor 130 may process the visible-light image sequence in a variety of ways to determine the first threshold range and the second threshold range. In some embodiments, the processor 130 may identify the ambient light intensity of each visible-light image in the sequence, compute the mean ambient light intensity, and determine the first and second threshold ranges by looking the mean up in a first preset comparison table.
In some embodiments, the processor 130 may use a trained image recognition model to identify the ambient light intensity of a visible-light image. The image recognition model may be a machine learning model, e.g., a convolutional neural network (CNN). In some embodiments, the input of the image recognition model includes the visible-light image and its output includes the image's ambient light intensity.
In some embodiments, the image recognition model may be trained by various methods on multiple second training samples bearing second labels, updating the model parameters. Its training process is similar to that of the frequency determination model; see the relevant description above, which is not repeated here.
In some embodiments, a second training sample includes a sample visible-light image of a sample area, and the second label includes that image's ambient light intensity. The second training samples may be obtained from historical image data and the second labels from historical detection data, where the historical image data include historical visible-light images of the sample area taken by a visible-light camera and the historical detection data include historical ambient light intensities of the sample area collected by a light-intensity detector.
In some embodiments, the first preset comparison table records the correspondence between different reference mean ambient light intensities and different reference first and second threshold ranges. In some embodiments, the table may be built from prior knowledge or historical data by a preset construction rule, e.g.: for each historical mean ambient light intensity, take the first and second threshold ranges that yielded the lowest misjudgment rate in the historical judgment record as the ranges corresponding to that mean.
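The table lookup can be sketched as an interval map. This is illustrative only: the interval boundaries and threshold values below are invented placeholders, not values from the specification; real entries would come from the preset construction rule over historical misjudgment rates.

```python
def lookup_threshold_ranges(light_mean, table):
    """Look a mean ambient light intensity up in the first preset comparison
    table; each row maps a [low, high) intensity interval to a
    (first_threshold_range, second_threshold_range) pair."""
    for (low, high), ranges in table.items():
        if low <= light_mean < high:
            return ranges
    return None  # no interval covers the mean


# Hypothetical table for illustration; intensities in lux.
FIRST_COMPARISON_TABLE = {
    (0, 500):    ((245, 255), (200, 215)),   # dim scenes
    (500, 2000): ((250, 255), (215, 225)),   # ordinary daylight
}
```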
In some embodiments, the historical judgment record captures how the judgment of whether the fire at the first fire location is a real fire compares with the actual situation: a judgment is wrong when it disagrees with reality and accurate when it agrees. For example, judging the fire at the first fire location to be a real fire when it is in fact not real is a wrong judgment; judging it to be a real fire when it is in fact real is an accurate judgment. The misjudgment rate is the ratio of the number of wrong judgments to the total number of judgments.
In some embodiments, the processor 130 may also match the current mean ambient light intensity against the same or a similar historical mean ambient light intensity in the historical data, and take the historical first and second threshold ranges corresponding to that historical mean as the current first and second threshold ranges.
In some embodiments of this specification, after the target area is determined to include a first fire location, an actual visible-light image sequence of the target area can be acquired, and the first and second threshold ranges determined from it. This approach adaptively and dynamically determines the ranges from the actual conditions of the target area, making the subsequent color-based judgment of whether the fire is real more accurate.
In some embodiments, in response to the fire corresponding to the first fire location being a non-real fire, the processor 130 may determine the interference category of the non-real fire. See FIG. 5 and its related description for more on this embodiment.
In some embodiments of this specification, features of the visible-light image (e.g., color information) assist in verifying fire locations that the grayscale infrared thermal image has already flagged as suspected fires, ruling out misjudgments in many situations, such as strong sun-reflection points, stationary high-temperature locations like car engines or exhaust pipes, and moving high-temperature objects. Meanwhile, the RGB information of the visible-light image mitigates the susceptibility of the grayscale infrared thermal image to weather, season and time of day, improving the accuracy of fire judgment.
It should be noted that the above description of process 300 is for example and illustration only and does not limit the scope of application of this specification. Those skilled in the art can make various modifications and changes to process 300 under the guidance of this specification; such modifications and changes remain within the scope of this specification.
FIG. 5 is an exemplary schematic diagram of determining the interference category of a non-real fire according to some embodiments of this specification.
In some embodiments, in response to the fire corresponding to the first fire location being a non-real fire, the processor 130 may determine the interference category of the non-real fire.
In some embodiments, the interference category refers to the heat-source category of the non-real fire. In some embodiments, the interference category includes reflective-point heat sources and non-reflective-point heat sources.
A reflective-point heat source is a point or object whose temperature rises due to reflection, e.g., a sun-reflection point.
A non-reflective-point heat source is a point or object whose temperature rises for reasons other than reflection, e.g., a car engine, an exhaust outlet, or a cup of hot water.
The processor 130 may determine the interference category of the non-real fire in a variety of ways. In some embodiments, the processor 130 may do so from the shape of the fire region corresponding to the first fire location: for example, when the fire region is circular or nearly circular, the interference category of the non-real fire at the first fire location is judged to be a reflective-point heat source. In some embodiments, the processor 130 may determine the interference category from the color information (e.g., RGB values) of the matching area of the first fire location in the second image of the target area.
Referring to FIG. 5, in some embodiments the processor 130 may determine the interference category to be a reflective-point heat source in response to the RGB values satisfying the second judgment condition, and a non-reflective-point heat source in response to the RGB values not satisfying the second judgment condition.
The second judgment condition is the condition used to judge the interference category of a non-real fire.
In some embodiments, the second judgment condition includes: the R value, G value and B value of the RGB values all fall within a third threshold range. The third threshold range may be [e, f], with a < e, b ≤ f and e, f ∈ (0, 255). Merely as an example, the first threshold range may be [250, 255], the second threshold range [215, 225], and the third threshold range [254, 255]. In some embodiments, e and f may be equal; merely as an example, the third threshold range may be 255.
For example, after mean processing, the R, G and B values of the rectangular area Q' in FIG. 4B are all 255, all within the third threshold range; the R, G and B values therefore satisfy the second judgment condition, and the non-real fire corresponding to position W in the rectangular area Q of FIG. 4A is determined to be a reflective-point heat source.
Comparing the extracted R, G and B values with the second judgment condition in this manner makes it possible to determine accurately that the non-real fire comes from a reflective point such as sunlight.
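The second judgment condition can be sketched as a simple range check. This is a minimal illustration: `classify_interference` is a hypothetical name, and the default [254, 255] range mirrors the example values in the text rather than a mandated setting.

```python
def classify_interference(rgb, third_range=(254, 255)):
    """Second judgment condition: R, G and B all inside the third threshold
    range -> reflective-point heat source; otherwise non-reflective-point
    heat source."""
    low, high = third_range
    if all(low <= v <= high for v in rgb):
        return "reflective-point heat source"
    return "non-reflective-point heat source"
```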
In some embodiments, the third threshold range may be predetermined. For example, it may be a system default, an empirical value, a manually preset value, or any combination thereof, set according to actual needs; this specification places no restriction on this.
In some embodiments, the processor 130 may also dynamically determine the third threshold range according to the actual conditions of the target area.
In some embodiments, the processor 130 may obtain temperature distribution data of the target area from the first image, obtain color distribution data of the target area from the second image, and determine the third threshold range based on the temperature distribution data and the color distribution data.
The temperature distribution data describe the temperatures of the different sub-regions obtained after the first image is divided into multiple sub-regions, a sub-region being a partial region of the first image.
In some embodiments, the processor 130 may divide the first image into sub-regions in a variety of ways. For example, the processor 130 may divide the first image along a grid of preset width, or by semantic segmentation. Semantically segmenting the first image (e.g., an infrared thermal image) is similar to semantically segmenting a visible-light image; see FIG. 3 and its related description.
In some embodiments, the temperature data of a sub-region may include one or more of its temperature mean, temperature variance and temperature median. In some embodiments, the processor 130 may read the temperature value of each pixel directly from the first image and obtain the sub-region temperature data by statistical analysis. The embodiments of this specification place no special restriction on the method of statistical analysis; operations well known to those skilled in the art suffice.
The color distribution data describe the colors of the different sub-regions obtained after the second image is divided into multiple sub-regions. The second image is divided in the same manner as the first image, which is not repeated here.
In some embodiments, the color information of a sub-region may include its mean R value, mean G value, mean B value, etc. In some embodiments, the processor 130 may read the RGB value of each pixel directly from the second image and obtain the sub-region color information by statistical analysis.
处理器130可以通过多种方式对温度分布数据和颜色分布数据进行处理，确定第三阈值范围。The processor 130 may process the temperature distribution data and the color distribution data in a variety of ways to determine the third threshold range.
在一些实施例中,处理器130可以基于温度分布数据和颜色分布数据,通过查询第二预设对照表的方式确定第三阈值范围。In some embodiments, the processor 130 may determine the third threshold range by querying a second preset comparison table based on the temperature distribution data and the color distribution data.
在一些实施例中,第二预设对照表中包括不同的参考温度分布数据、不同的参考颜色分布数据与不同的参考第三阈值范围的对应关系。第二预设对照表的构建方式与第一预设对照表的构建方式类似,在此不再赘述,更多说明参见步骤330及其相关描述。In some embodiments, the second preset comparison table includes correspondences between different reference temperature distribution data, different reference color distribution data, and different reference third threshold ranges. The second preset comparison table is constructed in a similar manner to the first preset comparison table, and will not be described in detail here. For more information, see step 330 and its related description.
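A table lookup of this kind might be sketched as follows. This is an illustrative assumption only: the bucket boundaries, table contents, and threshold ranges below are invented for the example and are not values from this disclosure.

```python
# Hypothetical sketch of the "second preset comparison table": reference
# temperature / color distribution features are bucketed, and each bucket
# pair maps to a reference third threshold range ((R, G, B) bounds).
SECOND_PRESET_TABLE = {
    ("high", "bright"): ((220, 255), (200, 255), (180, 255)),
    ("high", "dim"):    ((200, 255), (160, 220), (120, 200)),
    ("low",  "bright"): ((180, 240), (150, 210), (100, 180)),
    ("low",  "dim"):    ((150, 220), (120, 190), (80, 160)),
}

def third_threshold_range(mean_temp, mean_brightness,
                          temp_cut=100.0, bright_cut=128.0):
    """Map distribution summaries to buckets and look up the table."""
    temp_bucket = "high" if mean_temp >= temp_cut else "low"
    color_bucket = "bright" if mean_brightness >= bright_cut else "dim"
    return SECOND_PRESET_TABLE[(temp_bucket, color_bucket)]

print(third_threshold_range(150.0, 200.0))  # high-temperature, bright scene
```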
在一些实施例中,处理器130还可以分别基于当前的温度分布数据和颜色分布数据,在历史数据中匹配相同或相似的历史温度分布数据和历史颜色分布数据,将相同或相似的历史温度分布数据和历史颜色分布数据对应的历史第三阈值范围确定为当前对应的第三阈值范围。In some embodiments, the processor 130 can also match the same or similar historical temperature distribution data and historical color distribution data in the historical data based on the current temperature distribution data and color distribution data, and determine the historical third threshold range corresponding to the same or similar historical temperature distribution data and historical color distribution data as the current corresponding third threshold range.
在一些实施例中,处理器130可以基于第二图像,确定第一火情位置的预设范围内的材质分布数据;以及基于温度分布数据、颜色分布数据以及材质分布数据,确定第三阈值范围。In some embodiments, the processor 130 may determine material distribution data within a preset range of the first fire location based on the second image; and determine a third threshold range based on the temperature distribution data, the color distribution data, and the material distribution data.
预设范围是指第一火情位置的周围一定范围。例如,预设范围可以是第一火情位置的周围100米范围。又例如,预设范围可以是第一火情位置的火情区域。预设范围可以是系统默认值、经验值、人为预先设定值等或其任意组合,可以根据实际需求设定,本说明书对此不做限制。The preset range refers to a certain range around the first fire location. For example, the preset range may be a 100-meter range around the first fire location. For another example, the preset range may be the fire area of the first fire location. The preset range may be a system default value, an experience value, a manually preset value, or any combination thereof, and may be set according to actual needs, and this specification does not limit this.
材质分布数据是指第一火情位置的预设范围内不同位置的材质情况。例如,第一火情位置为汽车尾气排气口时,预设范围内不同位置的材质情况可以包括但不限于铁、铝、铝合金等中的一种或多种。Material distribution data refers to the material conditions at different locations within a preset range of the first fire location. For example, when the first fire location is a car exhaust outlet, the material conditions at different locations within the preset range may include, but are not limited to, one or more of iron, aluminum, and aluminum alloy.
在一些实施例中,处理器130可以基于第二图像,通过材质检测算法等方式确定第一火情位置的预设范围内的材质分布数据。示例性的材质检测算法包括基于视觉特征的材质检测算法等。本说明书实施例对材质检测算法没有特殊的限定,采用本领域技术人员熟知的操作即可。In some embodiments, the processor 130 may determine the material distribution data within a preset range of the first fire location based on the second image by means of a material detection algorithm or the like. Exemplary material detection algorithms include material detection algorithms based on visual features, etc. The embodiments of this specification do not specifically limit the material detection algorithm, and operations familiar to those skilled in the art may be used.
在一些实施例中，处理器130可以基于温度分布数据、颜色分布数据和材质分布数据，通过查询第三预设对照表的方式确定第三阈值范围。在一些实施例中，第三预设对照表中包括不同的参考温度分布数据、不同的参考颜色分布数据、不同的参考材质分布数据与不同的参考第三阈值范围的对应关系。第三预设对照表的构建方式与第一预设对照表的构建方式类似，在此不再赘述，更多说明参见步骤330及其相关描述。In some embodiments, the processor 130 may determine the third threshold range by querying a third preset comparison table based on the temperature distribution data, the color distribution data, and the material distribution data. In some embodiments, the third preset comparison table includes correspondences between different reference temperature distribution data, different reference color distribution data, different reference material distribution data, and different reference third threshold ranges. The third preset comparison table is constructed in a similar manner to the first preset comparison table, which will not be repeated here; for more explanation, see step 330 and its related description.
在一些实施例中,处理器130还可以基于温度分布数据、颜色分布数据和材质分布数据,在历史数据中匹配相关的历史数据,进而确定第三阈值范围。更多说明参考前文基于温度分布数据、颜色分布数据在历史数据中进行匹配的相关描述。In some embodiments, the processor 130 may also match relevant historical data in the historical data based on the temperature distribution data, the color distribution data, and the material distribution data, thereby determining the third threshold range. For more information, refer to the above description of matching the historical data based on the temperature distribution data and the color distribution data.
在一些实施例中,非反光点热源可以包括移动热源和静止热源。移动热源是指处于移动状态的非反光点热源。例如,移动中的汽车的引擎、尾气排气口等。静止热源是指处于静止状态的非反光点热源。例如,空调外机、固定的热水杯等。In some embodiments, the non-reflective point heat source may include a mobile heat source and a stationary heat source. A mobile heat source refers to a non-reflective point heat source in a moving state. For example, an engine of a moving car, an exhaust outlet, etc. A stationary heat source refers to a non-reflective point heat source in a stationary state. For example, an air conditioner outdoor unit, a fixed hot water cup, etc.
参见图5，在一些实施例中，在确定第一火情位置对应的非真实火情的干扰类别为非反光点热源后，处理器130还可以进一步判断非反光点热源的干扰类别为移动热源还是静止热源。Referring to FIG. 5, in some embodiments, after determining that the interference category of the non-real fire corresponding to the first fire location is a non-reflective point heat source, the processor 130 may further determine whether the interference category of the non-reflective point heat source is a moving heat source or a stationary heat source.
参见图5，在一些实施例中，处理器130可以基于目标区域的第三图像确定第二火情位置；判断第二火情位置与第一火情位置是否满足第三判断条件；响应于满足第三判断条件，确定干扰类别为静止热源；响应于不满足第三判断条件，确定干扰类别为移动热源。Referring to FIG. 5, in some embodiments, the processor 130 may determine a second fire location based on a third image of the target area; determine whether the second fire location and the first fire location satisfy a third judgment condition; in response to the third judgment condition being satisfied, determine that the interference category is a stationary heat source; and in response to the third judgment condition not being satisfied, determine that the interference category is a moving heat source.
第三图像是指用于确定火情情况的图像数据。在一些实施例中,第三图像可以是红外热图像。在一些实施例中,处理器130可以在确定第一火情位置之后的时间点,获取目标区域的第三图像。第三图像的获取方式与第一图像的获取方式相同,在此不再赘述。The third image refers to image data used to determine the fire situation. In some embodiments, the third image may be an infrared thermal image. In some embodiments, the processor 130 may acquire the third image of the target area at a time point after determining the first fire location. The acquisition method of the third image is the same as the acquisition method of the first image, which will not be repeated here.
第二火情位置是指疑似发生火情的位置区域。The second fire location refers to the location area where a fire is suspected to have occurred.
在一些实施例中,处理器130可以采用确定第一火情位置的方式,在第三图像中确定第二火情位置。在一些实施例中,处理器130可以根据基准坐标点建立的直角坐标系,确定第二火情位置的第二火情位置坐标。In some embodiments, the processor 130 may determine the second fire location in the third image in the same manner as the first fire location. In some embodiments, the processor 130 may determine the second fire location coordinates of the second fire location based on a rectangular coordinate system established by the reference coordinate points.
在一些实施例中,在第二火情位置确定后,处理器130可以将第三图像与第一图像进行配准,进而根据在第一图像中建立的直角坐标系确定第二火情位置的第二火情位置坐标,并根据火情区域的划分方式确定以第二火情位置为中心向外扩散的火情区域。In some embodiments, after the second fire location is determined, the processor 130 may align the third image with the first image, and then determine the second fire location coordinates of the second fire location based on the rectangular coordinate system established in the first image, and determine the fire area spreading outward with the second fire location as the center according to the fire area division method.
关于确定第一火情位置、建立直角坐标系以及划分火情区域的更多说明参见步骤310及其相关描述。For more information on determining the first fire location, establishing a rectangular coordinate system, and dividing the fire area, refer to step 310 and its related description.
在一些实施例中,第二火情位置与第一火情位置对应同一火情区域。在一些实施例中,对应于同一火情区域可以指第一火情位置对应的火情区域与第二火情位置对应的火情区域完全重叠或部分重叠。其中,部分重叠的区域面积可以满足预设面积阈值。预设面积阈值可以是系统默认值、经验值、人为预先设定值等或其任意组合,可以根据实际需求设定,本说明书对此不做限制。In some embodiments, the second fire location corresponds to the same fire area as the first fire location. In some embodiments, corresponding to the same fire area may mean that the fire area corresponding to the first fire location completely overlaps or partially overlaps the fire area corresponding to the second fire location. The area of the partially overlapping area may meet a preset area threshold. The preset area threshold may be a system default value, an experience value, a manually preset value, or any combination thereof, and may be set according to actual needs, and this specification does not limit this.
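The overlap test described above might be sketched as follows, under the simplifying assumption that each fire area is an axis-aligned bounding box in the shared coordinate system; the box representation and area threshold are illustrative, not from this disclosure.

```python
# Hypothetical check that two fire areas "correspond to the same fire region":
# compute the overlap of two axis-aligned boxes (x1, y1, x2, y2) and compare
# the overlap area against a preset area threshold.
def same_fire_region(box_a, box_b, area_threshold=1):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ox = max(0, min(ax2, bx2) - max(ax1, bx1))  # overlap width
    oy = max(0, min(ay2, by2) - max(ay1, by1))  # overlap height
    return ox * oy >= area_threshold

print(same_fire_region((0, 0, 10, 10), (5, 5, 15, 15)))  # → True (overlap 25)
```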
第三判断条件是用于进一步判断干扰类别为移动热源还是静止热源的判断条件。The third judgment condition is a judgment condition for further judging whether the interference category is a moving heat source or a stationary heat source.
在一些实施例中,第三判断条件包括:第二火情位置与第一火情位置一致。在一些实施例中,第二火情位置与第一火情位置一致可以指第二火情位置坐标与第一火情位置坐标一致。相应的,处理器130可以根据第二火情位置的第二火情位置坐标,以及第一火情位置的第一火情位置坐标判断第二火情位置与第一火情位置是否满足第三判断条件。In some embodiments, the third judgment condition includes: the second fire location is consistent with the first fire location. In some embodiments, the second fire location is consistent with the first fire location, which may mean that the second fire location coordinates are consistent with the first fire location coordinates. Accordingly, the processor 130 may determine whether the second fire location and the first fire location meet the third judgment condition based on the second fire location coordinates of the second fire location and the first fire location coordinates of the first fire location.
若一致，确定上述第一火情位置坐标对应的第一火情位置的非真实火情属于静止热源，比如静止汽车的引擎或者尾气管位置；若不一致，确定上述第一火情位置坐标对应的第一火情位置的非真实火情属于移动热源，比如运动汽车的引擎、移动的热水杯等。If they are consistent, it is determined that the non-real fire at the first fire location corresponding to the above first fire location coordinates comes from a stationary heat source, such as the engine or exhaust pipe of a stationary car; if they are inconsistent, it is determined that the non-real fire comes from a moving heat source, such as the engine of a moving car or a moving hot water cup.
本说明书一些实施例中,通过上述的方式,将第一火情位置坐标与第二火情位置坐标进行比较,可以确定非真实火情源是来自于静止热源还是来自于移动热源。同时,上述方式只针对可疑位置计算RGB值,可以大大减少计算量,方便快捷。In some embodiments of this specification, by comparing the first fire location coordinates with the second fire location coordinates in the above manner, it is possible to determine whether the unreal fire source comes from a stationary heat source or a mobile heat source. At the same time, the above manner only calculates the RGB value for the suspicious location, which can greatly reduce the amount of calculation and is convenient and fast.
图6是根据本说明书一些实施例所示的火情判断处理过程的示例性示意图。参见图6,在一些实施例中,火情判断的处理过程包括:FIG6 is an exemplary schematic diagram of a fire condition judgment processing process according to some embodiments of this specification. Referring to FIG6, in some embodiments, the fire condition judgment processing process includes:
S1、获取目标区域的红外热图像和可见光图像。S1. Acquire infrared thermal images and visible light images of the target area.
S2、基于红外热图像确定第一火情位置。S2. Determine the first fire location based on the infrared thermal image.
S3、获取第一火情位置的第一火情位置坐标。S3. Obtain the first fire location coordinates of the first fire location.
S4、在可见光图像中确定第一火情位置的匹配区域。例如,可以将红外热图像中第一火情位置对应到可见光图像中,确定匹配区域。S4. Determine a matching area of the first fire position in the visible light image. For example, the first fire position in the infrared thermal image may be mapped to the visible light image to determine a matching area.
S5、获取匹配区域的RGB值。S5. Get the RGB value of the matching area.
接着,判断第一火情位置对应的火情的真实性,包括:Next, determining the authenticity of the fire corresponding to the first fire location includes:
S6、判断RGB值中R值和G值是否满足第一阈值范围、B值是否满足第二阈值范围。 S6. Determine whether the R value and the G value in the RGB value meet the first threshold range, and whether the B value meets the second threshold range.
S7、响应于R值和G值满足第一阈值范围、B值满足第二阈值范围,确定第一火情位置对应的火情为真实火情。S7. In response to the R value and the G value satisfying the first threshold range and the B value satisfying the second threshold range, determining that the fire corresponding to the first fire location is a real fire.
S8、响应于R值和/或G值不满足第一阈值范围,和/或B值不满足第二阈值范围,确定第一火情位置对应的火情为非真实火情。S8. In response to the R value and/or the G value not satisfying the first threshold range, and/or the B value not satisfying the second threshold range, determining that the fire corresponding to the first fire location is a non-real fire.
在确定第一火情位置对应的火情为非真实火情之后,可以进行下一步判断操作以确定非真实火情的干扰类别,包括:After determining that the fire corresponding to the first fire location is an unreal fire, the next step of judging operation may be performed to determine the interference category of the unreal fire, including:
S9、判断RGB值中R值、G值和B值是否满足第三阈值范围。S9. Determine whether the R value, the G value, and the B value in the RGB value meet a third threshold range.
S10、响应于R值、G值和B值满足第三阈值范围,确定非真实火情的干扰类别为反光点热源。S10 , in response to the R value, the G value, and the B value satisfying a third threshold range, determining that the interference category of the non-real fire situation is a reflective point heat source.
S11、响应于R值、G值和/或B值不满足第三阈值范围,确定非真实火情的干扰类别为非反光点热源。S11 , in response to the R value, the G value and/or the B value not satisfying the third threshold range, determining that the interference category of the non-real fire situation is a non-reflective point heat source.
在确定第一火情位置对应的非真实火情的干扰类别为非反光点热源后,还可以进一步判断非反光点热源的干扰类别为移动热源还是静止热源,包括:After determining that the interference category of the non-real fire corresponding to the first fire location is a non-reflective point heat source, it is also possible to further determine whether the interference category of the non-reflective point heat source is a moving heat source or a stationary heat source, including:
S12、判断第二火情位置与第一火情位置是否一致。S12: Determine whether the second fire location is consistent with the first fire location.
S13、响应于第二火情位置与第一火情位置一致,确定非真实火情的干扰类别为静止热源。S13: In response to the second fire location being consistent with the first fire location, determining that the interference category of the non-real fire is a stationary heat source.
S14、响应于第二火情位置与第一火情位置不一致,确定非真实火情的干扰类别为移动热源。S14: In response to the second fire location being inconsistent with the first fire location, determining that the interference category of the non-real fire is a moving heat source.
在确定第一火情位置对应的火情为真实火情后,上报火情。After confirming that the fire corresponding to the first fire location is a real fire, report the fire.
在确定第一火情位置对应的火情为非真实火情，并进一步确定非真实火情的干扰类别后，上报火情。更多说明参见前文相关描述。After determining that the fire corresponding to the first fire location is a non-real fire and further determining the interference category of the non-real fire, the fire is reported. For more information, see the related description above.
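The decision chain of steps S6–S14 can be sketched as a single function, purely for illustration; all threshold ranges and coordinates below are placeholders chosen for the example, not values from this disclosure.

```python
def judge_fire(rgb, first_xy, second_xy,
               first_range=((200, 255), (150, 255)),   # placeholder (R, G) bounds
               second_range=(0, 120),                  # placeholder B bounds
               third_range=((230, 255), (230, 255), (230, 255))):
    """Classify a suspected fire point from its matched-region RGB value
    and the first/second fire location coordinates (hypothetical sketch)."""
    r, g, b = rgb
    in_first = (first_range[0][0] <= r <= first_range[0][1]
                and first_range[1][0] <= g <= first_range[1][1])
    in_second = second_range[0] <= b <= second_range[1]
    if in_first and in_second:                         # S6-S7: real fire
        return "real fire"
    tr_r, tr_g, tr_b = third_range                     # S9-S11: reflection test
    if (tr_r[0] <= r <= tr_r[1] and tr_g[0] <= g <= tr_g[1]
            and tr_b[0] <= b <= tr_b[1]):
        return "non-real: reflective point heat source"
    if first_xy == second_xy:                          # S12-S13: same location
        return "non-real: stationary heat source"
    return "non-real: moving heat source"              # S14

print(judge_fire((240, 200, 60), (10, 10), (10, 10)))   # → real fire
print(judge_fire((250, 250, 250), (10, 10), (99, 99)))  # → non-real: reflective point heat source
print(judge_fire((100, 100, 100), (10, 10), (10, 10)))  # → non-real: stationary heat source
```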
通过可见光图像可以辅助判断已经被红外热成像灰度图判断为火情的可疑点的真实性,排除了多种情形的误判,比如:强烈的太阳反光点,静止的汽车引擎或尾气管等高温位置,运动的高温物体等对判断结果的干扰,提高了火情判断的精确性。Visible light images can assist in determining the authenticity of suspicious points that have been identified as fires by infrared thermal imaging grayscale images, eliminating misjudgments in many situations, such as strong sun reflections, high-temperature locations such as stationary car engines or exhaust pipes, and moving high-temperature objects, which interfere with the judgment results and improve the accuracy of fire judgments.
基于同一发明构思,本说明书实施例还提供一种火情判断装置,所述装置包括至少一个处理器以及至少一个存储器;所述至少一个存储器用于存储计算机指令;所述至少一个处理器用于执行所述计算机指令中的至少部分指令以实现前文论述的火情判断方法。Based on the same inventive concept, an embodiment of this specification also provides a fire situation judgment device, which includes at least one processor and at least one memory; the at least one memory is used to store computer instructions; the at least one processor is used to execute at least part of the computer instructions to implement the fire situation judgment method discussed above.
基于同一发明构思,本说明书实施例还提供一种存储介质,该存储介质存储有计算机指令,当该计算机指令在计算机上运行时,使得计算机执行前文论述的火情判断方法。Based on the same inventive concept, an embodiment of this specification also provides a storage medium, which stores computer instructions. When the computer instructions are executed on a computer, the computer executes the fire situation judgment method discussed above.
在一些可能的实施方式中,本说明书实施例提供的火情判断方法的各个方面还可以实现为一种程序产品的形式,其包括程序代码,当程序产品在装置上运行时,程序代码用于使该控制设备执行本说明书上述描述的根据本申请各种示例性实施方式的火情判断方法中的步骤。In some possible implementations, various aspects of the fire situation determination method provided in the embodiments of this specification may also be implemented in the form of a program product, which includes a program code. When the program product is run on the device, the program code is used to enable the control device to execute the steps of the fire situation determination method according to various exemplary embodiments of the present application described above in this specification.
在本说明书的实施例中按步骤说明所执行的操作时,如无特别说明,则步骤的次序均为可调换的,步骤是可以省略的,在操作过程中也可以包括其他步骤。When the operations performed in the embodiments of this specification are described in steps, unless otherwise specified, the order of the steps is interchangeable, steps may be omitted, and other steps may be included in the operation process.
本说明书中的实施例对于系统及其模块的描述,仅为描述方便,并不能限制在所举实施例范围之内。可能在不背离该系统原理的情况下,对各个模块进行任意组合,或者构成子系统与其他模块连接。The description of the system and its modules in the embodiments of this specification is only for convenience of description and is not limited to the scope of the embodiments. It is possible to combine the modules arbitrarily or form a subsystem connected with other modules without departing from the principle of the system.
本说明书中的实施例仅仅是为了示例和说明，而不限定本说明书的适用范围。对于本领域技术人员来说，在本说明书的指导下可以进行的各种修正和改变仍在本说明书的范围之内。The embodiments in this specification are only for illustration and description and do not limit the scope of application of this specification. For those skilled in the art, various modifications and changes that can be made under the guidance of this specification are still within the scope of this specification.
本说明书的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。Certain features, structures or characteristics of one or more embodiments of this specification may be appropriately combined.
本说明书的各个方面可以完全由硬件执行、可以完全由软件(包括固件、常驻软件、微码等)执行、也可以由硬件和软件组合执行。以上硬件或软件均可被称为“数据块”、“模块”、“引擎”、“单元”、“组件”或“系统”等。此外,本说明书的各方面可能表现为位于一个或多个计算机可读介质中的计算机产品,该产品包括计算机可读程序编码。Various aspects of this specification may be performed entirely by hardware, entirely by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as "data blocks", "modules", "engines", "units", "components" or "systems", etc. In addition, various aspects of this specification may be expressed as a computer product located in one or more computer-readable media, which includes computer-readable program code.
计算机存储介质可以是任何计算机可读介质,该介质可以通过连接至一个指令执行系统、装置或设备以实现通讯、传播或传输供使用的程序。位于计算机存储介质上的程序编码可以通过任何合适的介质进行传播,包括无线电、电缆、光纤电缆、RF、或类似介质,或任何上述介质的组合。Computer storage media can be any computer-readable media that can be connected to an instruction execution system, device or apparatus to communicate, propagate or transport the program for use. The program code on the computer storage medium can be transmitted via any suitable medium, including radio, cable, fiber optic cable, RF, or similar media, or any combination of the above media.
本说明书各部分操作所需的计算机程序编码可以用任意一种或多种程序语言编写。该程序编码可以完全在用户计算机上运行、或作为独立的软件包在用户计算机上运行、或部分在用户计算机上运行部分在远程计算机运行、或完全在远程计算机或处理设备上运行。在后种情况下,远程计算机可以通过任何网络形式与用户计算机连接,比如局域网(LAN)或广域网(WAN),或连接至外部计算机(例如通过因特网),或在云计算环境中,或作为服务使用如软件即服务(SaaS)。The computer program code required for the operation of the various parts of this specification can be written in any one or more programming languages. The program code can be run entirely on the user's computer, or run on the user's computer as a separate software package, or run partially on the user's computer and partially on a remote computer, or run entirely on a remote computer or processing device. In the latter case, the remote computer can be connected to the user's computer through any network form, such as a local area network (LAN) or a wide area network (WAN), or connected to an external computer (e.g., via the Internet), or in a cloud computing environment, or used as a service such as software as a service (SaaS).
一些实施例中使用了描述成分、属性数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值参数均为近似值,该近似值根据个别实施例所需特点可以发生改变。尽管本说明书一些实施例中用于确认其范围广度的数值域和参数为近似值,在具体实施例中,此类数值的设定在可行范围内尽可能精确。In some embodiments, numbers describing the quantity of components and attributes are used. It should be understood that such numbers used to describe the embodiments are modified by the modifiers "about", "approximately" or "substantially" in some examples. Unless otherwise specified, "about", "approximately" or "substantially" indicate that the numbers are allowed to vary by ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximate values, which may change according to the required features of individual embodiments. Although the numerical domains and parameters used to confirm the breadth of their range in some embodiments of this specification are approximate values, in specific embodiments, such numerical values are set as accurately as possible within the feasible range.
最后,应当理解的是,本说明书中所述实施例仅用以说明本说明书实施例的原则。其他的变形也可能属于本说明书的范围。因此,作为示例而非限制,本说明书实施例的替代配置可视为与本说明书的教导一致。相应地,本说明书的实施例不仅限于本说明书明确介绍和描述的实施例。 Finally, it should be understood that the embodiments described in this specification are only used to illustrate the principles of the embodiments of this specification. Other variations may also fall within the scope of this specification. Therefore, as an example and not a limitation, alternative configurations of the embodiments of this specification may be considered consistent with the teachings of this specification. Accordingly, the embodiments of this specification are not limited to the embodiments explicitly introduced and described in this specification.

Claims (20)

  1. 一种火情判断方法,其特征在于,所述方法包括:A method for judging a fire situation, characterized in that the method comprises:
    基于目标区域的第一图像确定第一火情位置;determining a first fire location based on the first image of the target area;
    确定所述第一火情位置在所述目标区域的第二图像中的匹配区域,并提取所述匹配区域的颜色信息,所述第一图像与所述第二图像的拍摄时间差满足预设条件;Determine a matching area of the first fire location in the second image of the target area, and extract color information of the matching area, wherein a shooting time difference between the first image and the second image satisfies a preset condition;
    基于所述颜色信息,判断所述第一火情位置对应的火情是否为真实火情。Based on the color information, determine whether the fire corresponding to the first fire location is a real fire.
  2. 如权利要求1所述的方法,其特征在于,所述第一图像为红外热图像,所述第二图像为可见光图像。The method according to claim 1, characterized in that the first image is an infrared thermal image and the second image is a visible light image.
  3. 如权利要求1所述的方法,其特征在于,所述颜色信息包括RGB值,所述基于所述颜色信息,判断所述第一火情位置对应的火情是否为真实火情包括:The method according to claim 1, wherein the color information includes an RGB value, and the determining, based on the color information, whether the fire corresponding to the first fire location is a real fire comprises:
    响应于所述RGB值满足第一判断条件,确定所述第一火情位置对应的火情为所述真实火情;In response to the RGB value satisfying a first judgment condition, determining that the fire corresponding to the first fire position is the real fire;
    响应于所述RGB值不满足所述第一判断条件，确定所述第一火情位置对应的火情为非真实火情。In response to the RGB value not satisfying the first judgment condition, it is determined that the fire corresponding to the first fire location is a non-real fire.
  4. 如权利要求3所述的方法,其特征在于,所述第一判断条件包括:所述RGB值中R值和G值满足第一阈值范围、B值满足第二阈值范围。The method according to claim 3 is characterized in that the first judgment condition includes: the R value and the G value of the RGB value meet the first threshold range, and the B value meets the second threshold range.
  5. 如权利要求3所述的方法,其特征在于,所述方法进一步包括:The method according to claim 3, characterized in that the method further comprises:
    响应于所述第一火情位置对应的火情为所述非真实火情,确定所述非真实火情的干扰类别。In response to the fire corresponding to the first fire location being the unreal fire, an interference category of the unreal fire is determined.
  6. 如权利要求5所述的方法，其特征在于，所述响应于所述第一火情位置对应的火情为所述非真实火情，确定所述非真实火情的干扰类别包括：The method according to claim 5, wherein in response to the fire corresponding to the first fire location being the unreal fire, determining the interference category of the unreal fire comprises:
    响应于所述RGB值满足第二判断条件,确定所述干扰类别为反光点热源;In response to the RGB value satisfying a second judgment condition, determining that the interference category is a reflective point heat source;
    响应于所述RGB值不满足第二判断条件,确定所述干扰类别为非反光点热源。In response to the RGB value not satisfying the second judgment condition, the interference category is determined to be a non-reflective point heat source.
  7. 如权利要求6所述的方法,其特征在于,所述第二判断条件包括:所述RGB值中R值、G值和B值满足第三阈值范围。The method according to claim 6, characterized in that the second judgment condition includes: the R value, the G value and the B value in the RGB value meet a third threshold range.
  8. 如权利要求6所述的方法,其特征在于,所述非反光点热源包括移动热源和静止热源,在确定所述第一火情位置对应的所述非真实火情的所述干扰类别为所述非反光点热源后,所述方法还包括:The method according to claim 6, characterized in that the non-reflective point heat source includes a moving heat source and a stationary heat source, and after determining that the interference category of the non-real fire corresponding to the first fire position is the non-reflective point heat source, the method further includes:
    基于所述目标区域的第三图像确定第二火情位置,所述第二火情位置与所述第一火情位置对应同一火情区域;determining a second fire location based on a third image of the target area, wherein the second fire location corresponds to the same fire area as the first fire location;
    判断所述第二火情位置与所述第一火情位置是否满足第三判断条件;Determining whether the second fire location and the first fire location satisfy a third determination condition;
    响应于满足所述第三判断条件,确定所述干扰类别为所述静止热源;In response to satisfying the third judgment condition, determining that the interference category is the stationary heat source;
    响应于不满足所述第三判断条件,确定所述干扰类别为所述移动热源。 In response to the third judgment condition not being satisfied, the interference category is determined to be the mobile heat source.
  9. 如权利要求8所述的方法,其特征在于,所述第三判断条件包括:所述第二火情位置与所述第一火情位置一致。The method according to claim 8 is characterized in that the third judgment condition includes: the second fire location is consistent with the first fire location.
  10. 一种火情判断系统,其特征在于,所述系统包括:A fire situation judgment system, characterized in that the system comprises:
    确定模块,用于基于目标区域的第一图像确定第一火情位置;A determination module, configured to determine a first fire location based on a first image of a target area;
    提取模块,用于确定所述第一火情位置在所述目标区域的第二图像中的匹配区域,并提取所述匹配区域的颜色信息,所述第一图像与所述第二图像的拍摄时间差满足预设条件;an extraction module, configured to determine a matching area of the first fire location in the second image of the target area, and extract color information of the matching area, wherein a shooting time difference between the first image and the second image satisfies a preset condition;
    判断模块,用于基于所述颜色信息,判断所述第一火情位置对应的火情是否为真实火情。A judgment module is used to judge whether the fire corresponding to the first fire position is a real fire based on the color information.
  11. 如权利要求10所述的系统,其特征在于,所述第一图像为红外热图像,所述第二图像为可见光图像。The system according to claim 10, characterized in that the first image is an infrared thermal image and the second image is a visible light image.
  12. 如权利要求10所述的系统,其特征在于,所述颜色信息包括RGB值,所述判断模块进一步用于:The system of claim 10, wherein the color information comprises RGB values, and the determination module is further configured to:
    响应于所述RGB值满足第一判断条件,确定所述第一火情位置对应的火情为所述真实火情;In response to the RGB value satisfying a first judgment condition, determining that the fire corresponding to the first fire position is the real fire;
    响应于所述RGB值不满足所述第一判断条件，确定所述第一火情位置对应的火情为非真实火情。In response to the RGB value not satisfying the first judgment condition, it is determined that the fire corresponding to the first fire location is a non-real fire.
  13. 如权利要求12所述的系统,其特征在于,所述第一判断条件包括:所述RGB值中R值和G值满足第一阈值范围、B值满足第二阈值范围。The system as described in claim 12 is characterized in that the first judgment condition includes: the R value and the G value of the RGB value meet the first threshold range, and the B value meets the second threshold range.
  14. 如权利要求12所述的系统,其特征在于,所述判断模块进一步用于:The system according to claim 12, wherein the determination module is further configured to:
    响应于所述第一火情位置对应的火情为所述非真实火情,确定所述非真实火情的干扰类别。In response to the fire corresponding to the first fire location being the unreal fire, an interference category of the unreal fire is determined.
  15. 如权利要求14所述的系统,其特征在于,所述判断模块进一步用于:The system according to claim 14, characterized in that the determination module is further used to:
    in response to the RGB value satisfying a second judgment condition, determining that the interference category is a reflective-point heat source;
    in response to the RGB value not satisfying the second judgment condition, determining that the interference category is a non-reflective-point heat source.
  16. The system of claim 15, wherein the second judgment condition includes: the R value, the G value, and the B value of the RGB value falling within a third threshold range.
  17. The system of claim 15, wherein the non-reflective-point heat source includes a moving heat source and a stationary heat source, and after determining that the interference category of the non-real fire corresponding to the first fire location is the non-reflective-point heat source, the judgment module is further configured to:
    determine a second fire location based on a third image of the target area, the second fire location corresponding to the same fire area as the first fire location;
    determine whether the second fire location and the first fire location satisfy a third judgment condition;
    in response to the third judgment condition being satisfied, determine that the interference category is the stationary heat source; and
    in response to the third judgment condition not being satisfied, determine that the interference category is the moving heat source.
  18. The system of claim 17, wherein the third judgment condition includes: the second fire location being consistent with the first fire location.
  19. A fire determination apparatus, wherein the apparatus comprises at least one processor and at least one memory;
    the at least one memory is configured to store computer instructions; and
    the at least one processor is configured to execute at least some of the computer instructions to implement the fire determination method of any one of claims 1-9.
  20. A computer-readable storage medium storing computer instructions, wherein, after a computer reads the computer instructions in the storage medium, the computer executes the fire determination method of any one of claims 1-9.
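Claims 15-18 describe a two-stage classification of interference sources: an RGB threshold test separates reflective-point heat sources from other heat sources, and a location-consistency test across successive images separates stationary from moving heat sources. The following is a minimal sketch of that decision flow; the threshold ranges and the pixel tolerance are hypothetical stand-ins, since the claims do not specify concrete values:

```python
def classify_interference(rgb, first_location, second_location,
                          third_threshold=((200, 255), (200, 255), (200, 255)),
                          tolerance=5):
    """Classify a non-real-fire heat source (illustrative sketch of claims 15-18)."""
    # Second judgment condition (claim 16): each of the R, G, and B values
    # must fall within a threshold range (ranges here are illustrative only).
    in_range = all(lo <= channel <= hi
                   for channel, (lo, hi) in zip(rgb, third_threshold))
    if in_range:
        return "reflective-point heat source"

    # Third judgment condition (claims 17-18): the second fire location must be
    # consistent with the first; "consistent" is modeled here as a pixel offset
    # no greater than an assumed tolerance.
    dx = abs(second_location[0] - first_location[0])
    dy = abs(second_location[1] - first_location[1])
    if dx <= tolerance and dy <= tolerance:
        return "stationary heat source"
    return "moving heat source"
```

Here `third_threshold` and `tolerance` merely model the claimed "third threshold range" and the unspecified notion of location consistency; an actual system would calibrate both to the camera and scene.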
PCT/CN2023/125066 (WO2024083134A1), priority date 2022-10-20, filed 2023-10-17: Fire determination method, system and apparatus, and storage medium

Applications Claiming Priority (2)

CN202211289815.7, priority date 2022-10-20
CN202211289815.7A, published as CN115546727A: Method and system for judging fire condition and electronic equipment

Publications (1)

WO2024083134A1, published 2024-04-25

Family ID: 84735757

Family Applications (1)

PCT/CN2023/125066 (WO2024083134A1), priority date 2022-10-20, filed 2023-10-17: Fire determination method, system and apparatus, and storage medium

Country Status (2)

CN: CN115546727A
WO: WO2024083134A1

Families Citing this family (1)

* Cited by examiner, † Cited by third party

CN115546727A * (priority date 2022-10-20, published 2022-12-30), 浙江华感科技有限公司: Method and system for judging fire condition and electronic equipment

Citations (6)

CN105488941A * (priority date 2016-01-15, published 2016-04-13), 中林信达(北京)科技信息有限责任公司: Double-spectrum forest fire disaster monitoring method and device based on infrared-visible light images
CN108629940A * (priority date 2018-05-02, published 2018-10-09), 北京准视科技有限公司: Image-type fire detection alarm system
US20200012859A1 * (priority date 2017-03-28, published 2020-01-09), Zhejiang Dahua Technology Co., Ltd.: Methods and systems for fire detection
CN111951160A * (priority date 2020-07-03, published 2020-11-17), 广东工业大学: Fire-fighting unmanned aerial vehicle image fusion method based on visible light and infrared thermal imaging
CN112347942A * (priority date 2020-11-09, published 2021-02-09), 深圳英飞拓科技股份有限公司: Flame identification method and device
CN115546727A * (priority date 2022-10-20, published 2022-12-30), 浙江华感科技有限公司: Method and system for judging fire condition and electronic equipment


Also Published As

CN115546727A, published 2022-12-30
