WO2022143052A1 - Fire point detection method and apparatus, electronic device, and storage medium - Google Patents

Fire point detection method and apparatus, electronic device, and storage medium

Info

Publication number
WO2022143052A1
Authority
WO
WIPO (PCT)
Prior art keywords
fire
area
image
fire point
detection
Prior art date
Application number
PCT/CN2021/136265
Other languages
English (en)
French (fr)
Inventor
史飞
毛栊哗
Original Assignee
浙江宇视科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 浙江宇视科技有限公司
Priority to EP21913793.2A (EP4273743A1)
Priority to US18/259,927 (US20240060822A1)
Publication of WO2022143052A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0066 Radiation pyrometry, e.g. infrared or optical thermometry for hot spots detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/02 Constructional details
    • G01J5/08 Optical arrangements
    • G01J5/0859 Sighting arrangements, e.g. cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 Details of sensors, e.g. sensor lenses
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00 Fire alarms; Alarms responsive to explosion
    • G08B17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions, by using a video camera to detect fire or smoke
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/20 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
    • H04N23/23 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from thermal infrared radiation only
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/675 Focus control based on electronic image sensor signals comprising setting of focusing regions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077 Imaging

Definitions

  • the embodiments of the present application relate to the technical field of image monitoring, for example, to a fire detection method, device, electronic device, and storage medium.
  • Fire is a common and frequent disaster. It is highly sudden, highly destructive, and difficult to handle and rescue, and it seriously threatens the safety of people's lives and property. In fire monitoring, therefore, finding the fire point quickly and accurately is particularly important, so that relevant personnel can take rescue measures as soon as possible and minimize the losses caused by the fire.
  • In the related art, fire point detection can be performed by a thermal imaging pan-tilt camera.
  • The thermal imaging pan-tilt camera is set up at a high monitoring point, and traversal cruise scanning of the monitoring scene achieves full fire-point-detection coverage of the scene. Once a fire point is detected, the camera immediately sends an alarm to the control center through the network, and the location of the fire point can be determined from the azimuth of the pan-tilt (PTZ).
  • The control center can thus find the fire point quickly and take measures in the early stage of the fire, avoiding greater losses and disasters.
  • However, the focal length of thermal imaging lenses used for forest fire prevention or other large monitoring areas is generally more than 50 mm, and by the principle of lens imaging, the larger the focal length, the smaller the depth of field.
  • The monitoring scene of the camera is a multi-object-distance scene.
  • If image acquisition during the scanning cruise is still performed with the focusing lens group at a fixed position, the fire point may fall outside the effective depth of field of the camera. The thermal radiation from the fire point area is then too weak, its imaging brightness is low, and fire points are missed.
  • Embodiments of the present application provide a fire point detection method, device, electronic device, and storage medium, so as to improve the accuracy of fire point detection under multiple object distances.
  • In a first aspect, an embodiment of the present application provides a fire point detection method, including:
  • acquiring an image to be detected collected by an image acquisition device with the focusing lens group at a target position, wherein the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the image to be detected;
  • moving a fire detection template in the image to be detected according to a preset movement rule, and taking the range covered by the fire detection template as a detection area;
  • determining whether the detection area is a fire point area according to pixel values in the detection area.
  • an embodiment of the present application also provides a fire point detection device, including:
  • an image acquisition module configured to acquire an image to be detected collected by the image acquisition device based on the focusing lens group at a target position; wherein, the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the to-be-detected image;
  • a template moving module configured to move the fire detection template in the to-be-detected image according to a preset movement rule, and use the range covered by the fire detection template as the detection area;
  • a fire point judgment module configured to determine whether the detection area is a fire point area according to the pixel values in the detection area.
  • an embodiment of the present application also provides an electronic device, including:
  • a storage apparatus configured to store at least one program; and
  • at least one processor, wherein when the at least one program is executed by the at least one processor, the at least one processor implements the fire point detection method according to the first aspect of the present application.
  • an embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the fire point detection method according to the first aspect of the present application.
  • FIG. 1 is a flowchart of the fire point detection method in Embodiment 1 of the present application;
  • FIG. 2 is a schematic diagram of an S-shaped scanning path;
  • FIG. 3 is a schematic diagram of the fire point detection template;
  • FIG. 4 is a schematic diagram of the attenuation of pixel values in the simulated fire point area near the imaging clear point;
  • FIG. 5 is a schematic diagram of the actual detection range of the suspected fire point area;
  • FIG. 6 is a flowchart of the fire point detection method in Embodiment 2 of the present application;
  • FIG. 7 is a schematic diagram of determining the location of the fire point region in the image;
  • FIG. 8 is a schematic structural diagram of the fire point detection device in Embodiment 3 of the present application;
  • FIG. 9 is a schematic structural diagram of the electronic device in Embodiment 4 of the present application.
  • FIG. 1 is a flowchart of a fire point detection method in Embodiment 1 of the present application. This embodiment can be applied to a fire point detection situation in a multi-object distance scenario.
  • the method can be performed by a fire detection device, which can be implemented in software and/or hardware, and can be configured in an electronic device, for example, the electronic device can be a background server and other devices with communication and computing capabilities.
  • the method includes:
  • Step 101: Acquire an image to be detected, collected by the image acquisition device with the focusing lens group at a target position; the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the image to be detected.
  • The image acquisition device refers to a camera capable of long-distance monitoring.
  • Such a camera needs a telephoto lens, and by the imaging principle of telephoto lenses, a telephoto lens inevitably has a small depth of field.
  • In the present application, the image acquisition device is a thermal imaging pan-tilt camera, which achieves full coverage of fire point detection through cruise scanning.
  • Regarding the focusing lens group: inside the lens there is a group of lens elements whose position can be adjusted, called the focusing lens group.
  • The focusing lens group is driven back and forth by a focusing motor, which allows objects at different object distances to be imaged clearly, and there is a one-to-one correspondence between the object distance and the position of the focusing lens group at which the image is clear.
  • the object distance refers to the distance between the camera and the observation target.
  • The minimum object distance in the image to be detected refers to the closest monitoring distance in the image to be detected, and the maximum object distance refers to the farthest monitoring distance in the image to be detected.
  • the minimum object distance refers to the distance between the observation target at the bottom of the image to be detected and the camera
  • the maximum object distance refers to the distance between the observation target at the top of the image to be detected and the camera.
  • the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the captured image, that is, the target position of the focusing motor is determined.
  • For the target object at the minimum object distance to be imaged most clearly by the image acquisition device, there is a closest position of the focusing motor corresponding to the minimum object distance.
  • Likewise, for the target object at the maximum object distance to be imaged most clearly, there is a farthest position of the focusing motor corresponding to the maximum object distance.
  • When the focusing motor is at the closest position, the object at the bottom of the collected image is imaged most clearly, and the farther an object is from that bottom object, the less clear its image.
  • When the focusing motor is at the farthest position, the object at the top of the captured image is imaged most clearly, and the farther an object is from that top object, the less clear its image. Therefore, the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the image to be detected, so that the objects in the entire image are as clear as possible overall. To balance the clarity of the entire image, the scanning process sets the target position of the focusing lens group to F, that is, the target position of the focusing motor is F, where F lies between the closest position and the farthest position of the focusing motor.
  • If the closest position of the focusing motor is F_N and the farthest position is F_F, then F may be set as
  • F = (F_N + F_F) / 2,
  • so that, on the whole, the definition of the image to be detected within the monitoring range is the best.
  • The setting manner of F is not limited in the embodiments of the present application.
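The midpoint rule for F above can be sketched as follows. The averaging rule is from the text; the generalized `weight` parameter and the numeric motor positions are illustrative assumptions, since the embodiments do not limit how F is set.

```python
# Sketch: choose a focus-motor target F between the near clear point F_N and
# the far clear point F_F. weight=0.5 reproduces F = (F_N + F_F) / 2; other
# weights are an illustrative generalization, not from the patent.

def focus_target(f_near: float, f_far: float, weight: float = 0.5) -> float:
    """Return target motor position F between f_near (F_N) and f_far (F_F)."""
    return f_near + weight * (f_far - f_near)

print(focus_target(1200, 1800))  # midpoint: 1500.0
```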
  • When the image acquisition device scans only in the horizontal direction, the minimum object distance and the maximum object distance in the captured picture do not change greatly.
  • When the lens rotates in the vertical direction, the minimum and maximum object distances in the captured picture change. Therefore, when it is detected that the lens of the image acquisition device has changed in the vertical direction, the minimum object distance and the maximum object distance in the acquired image are determined anew, and the target position of the focusing lens group is re-determined; if no vertical change of the lens is detected, the image acquisition device keeps the position of the focusing lens group unchanged while scanning and cruising in the horizontal direction to collect images.
  • That is, whenever the vertical direction changes, the target position of the focusing lens group for capturing the image to be detected is re-determined according to the minimum object distance and the maximum object distance in the current monitoring picture.
  • In an embodiment, before step 101, the method further includes:
  • determining, according to at least two calibrated object distances and at least two calibrated positions of the focusing lens group, the relationship between the object distance and the position of the focusing lens group at which the image is clearest;
  • correspondingly, determining the target position of the focusing lens group according to the minimum object distance and the maximum object distance in the image to be detected includes:
  • determining, according to that relationship, the closest position of the focusing lens group associated with the minimum object distance and the farthest position associated with the maximum object distance, and determining the target position of the focusing lens group according to the closest position and the farthest position.
  • the calibration object distance is determined according to the monitoring range. Select at least two object distances from the monitoring range as the calibrated object distances. The number and range of the calibrated object distances are set according to the accuracy of the monitoring requirements and the actual monitoring range used, and are not limited here. Exemplarily, if the monitoring range is 10 meters to 50 meters, the calibration object distance may be set to 10 meters, 20 meters, 30 meters, 40 meters and 50 meters.
  • Obtain the position of the focusing lens group at which an object at each of the selected calibration object distances is imaged most clearly, and record it as the calibration position, thereby establishing the correspondence between calibration object distances and calibration positions. By the imaging principle of the camera, the relationship between the object distance and the clear-point position of the focusing lens group is nonlinear. Therefore, the position of the focusing lens group corresponding to any object distance can be obtained by segmental interpolation over the calibrated object distances and calibrated positions.
  • When the image acquisition device needs to determine the target position of the focusing lens group, it determines, from the segmental interpolation results, the closest position of the focusing lens group corresponding to the minimum object distance and the farthest position corresponding to the maximum object distance in the captured image, and thereby determines the target position of the focusing lens group. Exemplarily, after obtaining the minimum object distance, determine which two calibrated object distances it lies between, and then perform nonlinear interpolation on the focusing motor positions corresponding to those two calibrated object distances to obtain the target position of the focusing motor.
  • In this way, the target position of the focusing motor corresponding to the currently collected image can be determined quickly during the actual scanning cruise, improving the efficiency of image collection and avoiding the long time that would be needed to determine the focusing motor position in real time during acquisition.
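The calibration-and-interpolation scheme above can be sketched as follows. The calibration distances follow the 10 m to 50 m example in the text, but the motor-position values are illustrative assumptions, and linear interpolation within each segment is used here as a simplification of the nonlinear segmental interpolation the embodiment describes.

```python
import bisect

# Calibrated object distances (m) and the focus-motor positions at which an
# object at that distance images most clearly. Positions are made-up values.
CAL_DIST = [10, 20, 30, 40, 50]
CAL_POS = [2000, 1500, 1250, 1120, 1050]

def motor_position(d: float) -> float:
    """Interpolate the focus-motor position for object distance d within the
    calibration segment [CAL_DIST[i-1], CAL_DIST[i]] that contains it."""
    if d <= CAL_DIST[0]:
        return CAL_POS[0]
    if d >= CAL_DIST[-1]:
        return CAL_POS[-1]
    i = bisect.bisect_left(CAL_DIST, d)       # right edge of the segment
    d0, d1 = CAL_DIST[i - 1], CAL_DIST[i]
    p0, p1 = CAL_POS[i - 1], CAL_POS[i]
    return p0 + (p1 - p0) * (d - d0) / (d1 - d0)

print(motor_position(25))  # halfway between 1500 and 1250 -> 1375.0
```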
  • In an embodiment, the scanning path along which the image acquisition device acquires images includes at least two scanning inflection points, and at each scanning inflection point the lens of the image acquisition device rotates in the vertical direction; the fire point detection method further includes:
  • determining, according to the minimum object distance and the maximum object distance in each pre-scanned image, the nearest position and the farthest position of the focusing lens group at which the image is clear, and establishing the association between each scanning inflection point and the nearest and farthest positions of the focusing lens group;
  • the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the image to be detected, including:
  • the image acquisition device performs cruise scanning according to a pre-planned scanning path, and collects images during the scanning process.
  • the pre-planned scanning path is based on the delineation of the monitoring area. Through reasonable planning, full coverage of the monitoring area can be achieved after scanning.
  • The planning of the scanning path is determined according to the angle of the monitoring area and the field of view of the current image acquisition device, ensuring that no part of the area is scanned repeatedly or omitted.
  • the scanning path is generally planned to be an S-shaped scanning path, as shown in FIG. 2 .
  • The closest position of the focusing lens group corresponding to the minimum object distance in the collected image is the near clear point F_N.
  • The farthest position of the focusing lens group corresponding to the maximum object distance is the far clear point F_F. F_N and F_F remain unchanged across the images collected during horizontal scanning along the scanning path.
  • F_N and F_F in images acquired during vertical scanning along the path, however, will change.
  • Therefore, before the formal scan, the image acquisition device is controlled to perform pre-scanning in the vertical direction to obtain
  • pre-scanned images at each scanning inflection point on the scanning path; auto-focusing is performed on the minimum object distance and the maximum object distance in each pre-scanned image respectively, F_N and F_F in each pre-scanned image are recorded in turn, and the association between the vertical direction of each scanning inflection point and the nearest position F_N and farthest position F_F of the focusing lens group is established.
  • That is, a mapping relationship between the vertical rotation direction of the pan-tilt and F_N and F_F is established.
  • During the formal scan, the current scanning inflection point of the image acquisition device determines its current vertical direction; according to that vertical direction and the association with the nearest position F_N and the farthest position F_F of the focusing lens group, the current nearest position and the current farthest position associated with the current vertical direction are determined, and the target position of the focusing lens group is determined from them.
  • Since the average of the nearest position and the farthest position in the scanned image is used as the target position, the association between the vertical direction and the target position can also be established directly, so that during the formal scan the target position is obtained directly from the current vertical direction, improving the efficiency of image acquisition.
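The inflection-point association above can be sketched as a lookup table built during the vertical pre-scan. The tilt angles and motor positions here are illustrative assumptions; nearest-angle matching is one simple way to resolve the current vertical direction to a recorded inflection point.

```python
# Hypothetical pre-scan result: tilt angle (deg) at each scanning inflection
# point -> (F_N, F_F) recorded by auto-focusing during the vertical pre-scan.
inflection_table = {
    -10.0: (1800, 2200),
    -5.0: (1500, 2000),
    0.0: (1200, 1800),
}

def target_for_tilt(tilt: float) -> float:
    """Look up the inflection point nearest the current tilt angle and return
    the focus target F = (F_N + F_F) / 2 associated with it."""
    nearest = min(inflection_table, key=lambda a: abs(a - tilt))
    f_n, f_f = inflection_table[nearest]
    return (f_n + f_f) / 2

print(target_for_tilt(-5.2))  # nearest inflection -5.0 -> (1500 + 2000) / 2 = 1750.0
```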
  • During the formal scan, the vertical direction of the image acquisition device is monitored. If no change in the vertical direction is detected, the current target position of the focusing lens group is maintained; if a change is detected, the target position of the focusing lens group is re-determined for the current vertical direction, that is, the vertical direction at the moment the image is captured is determined and the current target position is updated.
  • The cruise scheme in the related art generally auto-focuses once at each scanning inflection point, so the overall picture is clear, but the degree of clarity of each local area cannot be quantitatively guaranteed. That is, it cannot be guaranteed that the autofocus position of the focusing lens group yields the highest local clarity everywhere in the picture. Therefore, in the embodiments of the present application, the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the collected image, ensuring both global and local image clarity and improving the accuracy of fire point detection.
  • Step 102 Move the fire detection template in the image to be detected according to a preset movement rule, and use the range covered by the fire detection template as the detection area.
  • the fire point detection template refers to an area template used for locating the fire point position, which may be determined according to the actual accuracy of the fire point detection, which is not limited here.
  • the schematic diagram of the fire point detection template is shown in FIG. 3 , and the three types of fire point detection templates illustrated in FIG. 3 are just some examples, but are not limited to these three types.
  • a fire point detection template of type a is used, that is, a moving traversal search is performed in the image to be detected by using a fire point detection template of four pixels.
  • the preset moving rule is used to set the moving step and moving direction of the fire detection template in the image to be detected.
  • the moving step can be set according to the size of the fire detection template, which is not limited here.
  • fire point detection is performed on each frame of images acquired. After each frame of image is collected, the fire point detection template is used to perform sequential movement traversal in the image to be detected, and each detection area covered by the fire point detection template during the movement and traversal process is determined.
  • the moving step size of the fire detection template can also be set according to actual needs, which is not limited here.
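The template traversal of Step 102 can be sketched as follows, using the 2x2 "type a" template of four pixels mentioned above. The step size and the image values are illustrative assumptions; the mean pixel value of each covered area anticipates the judgment in Step 103.

```python
# Sketch: slide a th x tw fire-detection template over the image with a fixed
# step, treating each covered range as a detection area and recording its
# average pixel value.

def traverse(image, th=2, tw=2, step=2):
    """Yield (row, col, mean pixel value) for every detection area covered by
    the template as it moves over the image."""
    h, w = len(image), len(image[0])
    for r in range(0, h - th + 1, step):
        for c in range(0, w - tw + 1, step):
            area = [image[r + i][c + j] for i in range(th) for j in range(tw)]
            yield r, c, sum(area) / len(area)

# Toy 4x4 grayscale frame with a bright (hot) 2x2 patch at the bottom right.
img = [[10, 10, 10, 10],
       [10, 10, 10, 10],
       [10, 10, 250, 250],
       [10, 10, 250, 250]]
hottest = max(traverse(img), key=lambda t: t[2])
print(hottest)  # (2, 2, 250.0)
```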
  • Step 103 Determine whether the detection area is a fire spot area according to the pixel values in the detection area.
  • the thermal imaging camera uses an infrared detector to capture the infrared light radiated by the object, and then presents an image of the object according to the intensity of the infrared radiation. Taking a grayscale image as an example, the higher the pixel value, the stronger the infrared radiation and the higher the temperature. Therefore, it can be determined whether the temperature in the detection area is too high according to the size of the pixel values in the detection area, and if the temperature in the detection area is too high, it is determined that the detection area is a fire spot area.
  • In an embodiment, the average pixel value in each detection area is determined, and whether the area is a fire area is determined by comparing the average pixel value with a predetermined fire detection threshold; the fire detection threshold can be determined according to the actual temperature of fire. Judging by the average pixel value of multiple pixels in the detection area reduces the interference of random noise at any single pixel and improves the accuracy of fire point detection.
  • In an embodiment, before step 103, the method further includes:
  • setting a simulated fire point, and determining the object distance between the simulated fire point and the image acquisition device;
  • determining the first fire point detection threshold and the second fire point detection threshold according to the pixel value of the simulated fire point area and the pixel value of the non-simulated fire point area, wherein the pixel value of the simulated fire point area is greater than the first fire point detection threshold, the first fire point detection threshold is greater than the second fire point detection threshold, and the second fire point detection threshold is greater than the pixel value of the non-simulated fire point area.
  • the simulated fire point is a selected fire source for simulating the real fire point.
  • an alcohol lamp or other easily controllable fire source may be used for the simulated fire point.
  • the simulated fire point is used for actual measurement to calibrate the fire point detection threshold in advance, so that the determined fire point detection threshold value is closer to the real fire point, and the accuracy of the fire point detection is improved.
  • The simulated fire point is placed in the effective detection area of the image acquisition device, for example at a distance D from it. According to the working principle of the thermal imaging camera, the
  • gray value of the image is related both to the temperature of the photographed object and to the object distance between the object and the camera; in long-distance monitoring fields such as forest fire prevention, however, the influence of distance on the gray value of the collected image can be ignored, and the fire
  • detection threshold depends only on the difference between the fire temperature and the ambient temperature, so the size of the object distance D is not limited.
  • The simulated fire point is imaged clearly by performing regional auto-focusing on the simulated fire point area.
  • the simulated fire area where the simulated fire is located is determined according to the fire detection template, and the pixel value of the simulated fire area and the pixel value of the non-simulated fire area are determined.
  • The average pixel value of the simulated fire point area is taken as the average pixel value of a real fire point area, and the average pixel value of the non-simulated fire point area is taken as the average pixel value of a real non-fire-point area. Since the simulated fire point is placed deliberately, the simulated fire point area is known to contain a real fire point and reflects the pixel value level of a real fire point area; likewise, the non-simulated fire point area is known to contain no fire point, so its pixel value reflects the pixel value level of the background.
  • The pixel value of the simulated fire point area is denoted I_fire,
  • and the pixel value of the non-simulated fire point area is denoted I_b.
  • When the focusing lens group is at the clear point of the simulated fire point, the pixel value of the simulated fire point area is the largest.
  • As the focusing lens group moves away from that position, the image of the simulated fire point begins to form a divergent light spot, and the pixel value of the simulated fire point area begins to decrease.
  • As the defocus increases, the pixel value of the spot area gradually decreases to the same level I_b as the background of the non-fire-point area. Therefore, the first fire detection threshold I_1 and the second fire detection threshold I_2 are chosen between the pixel value I_fire of the simulated fire point area and the pixel value I_b of the non-simulated fire point area, such that I_fire > I_1 > I_2 > I_b.
  • The fire detection thresholds should clearly distinguish fire from the background area while minimizing misjudgment; therefore, two fire detection thresholds are set.
  • The first fire detection threshold is the closer of the two to the pixel value of the simulated fire point area, and is used for accurate judgment of fire point areas.
  • The second fire detection threshold, smaller than the first, is used to avoid missing fire points whose imaging is not the clearest.
  • step 103 includes:
  • If the average pixel value in the detection area is greater than or equal to the first fire point detection threshold, the detection area is determined to be a fire point area; if the average pixel value in the detection area is less than the second fire point detection threshold, the detection area is determined not to be a fire point area.
  • Since the first fire point detection threshold is close to the pixel value of the simulated fire point area, a detection area whose pixel value reaches it can be considered a fire point area; since the second fire point detection threshold is close to the pixel value of the non-simulated fire point area, a detection area in the image to be detected whose pixel value is smaller than the second threshold is closer to the background and is determined not to be a fire point area.
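The two-threshold judgment above can be sketched as follows. The ordering I_fire > I_1 > I_2 > I_b is from the text; the numeric threshold values are illustrative assumptions.

```python
# Sketch of the two-threshold judgment: I1 (first threshold) confirms a fire
# point area, I2 (second threshold) separates background, and the band between
# them marks a suspected fire point area needing secondary detection.
# Threshold values are made up for illustration.
I1, I2 = 200.0, 120.0

def classify(mean_pixel: float) -> str:
    if mean_pixel >= I1:
        return "fire"        # detection area is a fire point area
    if mean_pixel < I2:
        return "background"  # detection area is not a fire point area
    return "suspected"       # between I2 and I1: suspected fire point area

print(classify(230.0), classify(150.0), classify(80.0))  # fire suspected background
```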
  • step 103 further includes:
  • the detection area is determined to be a suspected fire point area
  • the second detection is performed on the suspected fire point area, and the fire point area in the suspected fire point area is obtained.
  • the detection area is a suspected fire area.
  • the suspected fire point area may be some sunlight reflection or a high temperature object, or it may be a real fire point; but because the suspected fire point area is outside the effective depth of field of the current imaging, the suspected fire point diverges when imaged through the lens, resulting in a low pixel value in the suspected fire point area that fails to meet the judgment condition of being greater than or equal to the first fire point detection threshold. Therefore, secondary detection must be performed on the suspected fire point area to ensure the accuracy of fire point detection and avoid false detection and missed detection.
  • secondary detection is performed on the suspected fire point area to obtain the fire point area in the suspected fire point area, including:
  • determine the closest position of the focusing lens group associated with the minimum object distance and the farthest position of the focusing lens group associated with the maximum object distance in the image to be detected; move the focusing lens group from the closest position to the farthest position, and collect at least two images during the moving process.
  • the pixel value of the target suspected fire spot area in any of the at least two images is greater than or equal to the first fire spot detection threshold, it is determined that the target suspected fire spot area is a fire spot area.
  • the number of suspected fire point areas is at least one, and the target suspected fire point area of the present application is any one of the at least one suspected fire point area.
  • when detecting suspected fire point areas with multiple object distances in the same image, the focusing lens group is moved from the closest position associated with the minimum object distance in the image to the farthest position, and multiple images are collected during the moving process; any suspected fire point area will then have a corresponding image in which its imaging is clearer. Therefore, if the pixel value of a suspected fire point area in any image is greater than or equal to the first fire point detection threshold, that suspected fire point area is determined to be a fire point area; if the pixel values of a suspected fire point area in all images are smaller than the first fire point detection threshold, that suspected fire point area is determined not to be a fire point area.
  • the focus motor may move according to a preset step size when the position of the focusing lens group is adjusted, and one frame of image is collected for each step the focus motor moves. The pixel value of the corresponding suspected fire point area in each frame of image is compared with the first fire point detection threshold; if the pixel value of the suspected fire point area is greater than the first fire point detection threshold, the target suspected fire point area is determined to be a fire point area.
  • the setting of the preset step size can be determined according to the object distance range and the depth of field value in the image captured by the image capturing device, which is not limited herein.
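The step-by-step secondary detection can be sketched as below; `grab_frame` stands in for the camera interface (an assumed callable, not a real API) and returns, for a given focus motor position, the average pixel value of each suspected area in the frame collected there:

```python
def confirm_by_focus_sweep(grab_frame, region_ids, f_near, f_far, step, i1):
    """Move the focus motor from the closest position f_near to the farthest
    position f_far by a preset step, collect one frame per step, and confirm
    a suspected area as a fire point area if its pixel value reaches the
    first fire point detection threshold i1 in any frame."""
    confirmed = set()
    pos = f_near
    while pos <= f_far:
        frame = grab_frame(pos)  # dict: region id -> average pixel value
        for rid in region_ids:
            if frame.get(rid, 0) >= i1:
                confirmed.add(rid)
        pos += step
    return confirmed
```

Regions never reaching I_1 at any position in the sweep are reported as non-fire areas, matching the all-images condition above.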
  • the secondary detection of the suspected fire spot area After the secondary detection of the suspected fire spot area is completed, if no fire spot area is found in the image to be detected, the next frame of image will be directly collected and detected; if a fire spot area is found in the to-be-detected image, including If the fire spot area is determined in the spot area, the location information is determined for all the fire spot areas.
  • secondary detection is performed on the suspected fire point area to obtain the fire point area in the suspected fire point area, including:
  • if the number of suspected fire point areas is one, perform regional auto-focusing on the suspected fire point area and determine the focused pixel value of the suspected fire point area in the focused image; if the focused pixel value is greater than or equal to the first fire point detection threshold, the suspected fire point area is determined to be a fire point area.
  • moving the focusing lens group from the closest position to the farthest position for secondary detection results in slow focusing efficiency. Therefore, when there is only one suspected fire point area in the image to be detected, regional focusing is performed directly on that area so that the suspected fire point area is imaged the most clearly, and the focused pixel value of the suspected fire point area is recalculated. If the focused pixel value is greater than or equal to the first fire point detection threshold, the suspected fire point area is determined to be a fire point area; if the focused pixel value is less than the first fire point detection threshold, it is determined not to be a fire point area.
  • the suspected fire point area occupies fewer pixels.
  • the focused pixel value of the suspected fire spot in the region of interest is determined, wherein the size of the region of interest can be determined according to the actual focusing effect, which is not limited here.
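A sketch of averaging over such a region of interest, assuming a plain nested-list gray image and a 5x5 ROI by default (the application leaves the ROI size to the actual focusing effect):

```python
def roi_average(image, cx, cy, half=2):
    """Expand the suspected fire point at (cx, cy) outward into a small
    region of interest, clamped to the image borders, and average its
    pixels to obtain the focused pixel value."""
    h, w = len(image), len(image[0])
    vals = [image[y][x]
            for y in range(max(0, cy - half), min(h, cy + half + 1))
            for x in range(max(0, cx - half), min(w, cx + half + 1))]
    return sum(vals) / len(vals)
```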
  • the pixel value of the suspected fire point area after focusing is close to the pixel value of the simulated fire point area, so it is directly compared with the first fire point detection threshold to improve the accuracy of fire point detection.
  • the remaining suspected fire point areas may still have focused pixel values smaller than the first fire point detection threshold.
  • the manner of secondary detection of suspected fire point areas is determined according to the number of suspected fire point areas in the image to be detected, because different numbers indicate different object distances of the target detection objects in the image; targeted secondary detection can therefore improve detection efficiency.
  • the method further includes:
  • the scanning path of the image captured by the image capturing device is determined according to the actual detected object distance range.
  • FIG. 4 is a schematic diagram of the attenuation of pixel values of the simulated fire point near the clear image point.
  • F is the position of the focusing lens group corresponding to the clearest image of the simulated fire spot on the imaging surface.
  • when the focusing lens group is located at position F, the pixel value of the simulated fire point area in the simulated image is I_fire and the pixel value of the non-simulated fire point area is I_b; the values of I_1 and I_2 are determined between I_fire and I_b. After the first and second fire point detection thresholds are determined, there is one focusing lens group position on each of the near and far sides of the clear point F (i.e., F_1 and F_2) at which the pixel value of the simulated fire point area equals I_2.
  • the position of the focusing lens group is set to F when the current simulated image is captured, but the position of the focusing lens group corresponding to the distance of the actual simulated fire point should be F_1 or F_2; it can be seen that when the position of the focusing lens group is between F_1 and F_2, if the pixel value obtained by detecting the simulated fire point area in the simulated image is greater than the second fire point detection threshold, the area can be determined as a suspected fire point area.
  • a step-by-step search is performed by moving the position of the focusing lens group.
  • Figure 5 is a schematic diagram of the actual detection range of the suspected fire point area. When the lens group is located at F_1 or F_2, the suspected fire point area at the object distance corresponding to the clear point can be detected, and the pixel value of the suspected fire point area satisfies the condition of being greater than the first fire point detection threshold.
  • the positions F_1 and F_2 of the focusing lens group corresponding to the fire points that can actually be detected in the simulated image can be determined.
  • the object distances D_1 and D_2 corresponding to F_1 and F_2 can be determined; that is, the range between D_1 and D_2 in the image to be detected is the actual detected object distance range of a single scan. If a fire point is outside this object distance range, the pixel value of the real fire point area will be smaller than the second fire point detection threshold, resulting in missed detection.
  • the overlapping range of the collected images is set according to the actual detected object distance range, so that the monitoring area can be fully covered in the scanning process according to the actual detected object distance range.
  • the target position of the focusing lens group is determined based on the minimum object distance and the maximum object distance in the collected image, and the image to be detected is collected based on the target position; the detection area is determined as the fire point detection template moves in the image to be detected, and whether the detection area is a fire point area is determined according to the pixel values of the detection area.
  • the target position of the focusing lens group is determined according to the object distances in the captured image, so that under the current object distances, using the focusing lens group at the target position achieves the maximum detection clarity, ensuring that in a multi-object-distance scenario, fire points at multiple object distances in the scene image are detected, thereby improving the accuracy of fire point detection.
  • FIG. 6 is a flowchart of a method for detecting a fire point in Embodiment 2 of the present application, which is an optional embodiment of the present application. As shown in Figure 6, the method includes:
  • the embodiments of the present application include a set of thermal imaging pan-tilt cameras and their supporting network facilities.
  • the thermal imaging pan-tilt camera adopts a medium-telephoto thermal imaging lens.
  • the lens has the function of precise focusing, and can achieve clear imaging of objects with different object distances by adjusting the position of the focusing lens group.
  • Step 601 parameter calibration.
  • Parameter calibration refers to determining the necessary parameter information for the fire point detection process of the thermal imaging camera. Parameters that need to be calibrated include the relationship between the object distance and the position of the focus motor. To achieve long-distance monitoring, a telephoto lens must be used, and a telephoto lens inevitably has the problem of a small depth of field. To meet different monitoring distances, the lens contains a set of adjustable lenses called the focusing lens group. The focusing lens group is driven back and forth by the focusing motor, which can make objects at different object distances image clearly; object distances and in-focus positions are in one-to-one correspondence.
  • the relationship between the object distances and the positions of the focusing motors can be determined.
  • the number and range of the calibrated object distance groups are determined according to the accuracy of the detection requirements and the actual monitoring range.
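Given such calibrated (object distance, focus motor position) pairs, intermediate object distances can be covered by interpolation. The sketch below assumes linear interpolation between neighboring calibration points; the application itself only requires a calibrated one-to-one relationship, so a real lens may need a nonlinear fit:

```python
from bisect import bisect_left

def motor_position_for_distance(calib, d):
    """Interpolate the focus motor position for object distance d from
    calibrated (distance, motor position) pairs sorted by distance;
    distances outside the calibrated range clamp to the end points."""
    dists = [c[0] for c in calib]
    i = bisect_left(dists, d)
    if i == 0:
        return calib[0][1]
    if i == len(calib):
        return calib[-1][1]
    (d0, f0), (d1, f1) = calib[i - 1], calib[i]
    return f0 + (f1 - f0) * (d - d0) / (d1 - d0)
```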
  • the parameters to be calibrated also include a fire point detection threshold, and the fire point detection threshold value includes a first fire point detection threshold value and a second fire point detection threshold value.
  • the gray value of the image collected by the thermal imaging camera is related to the temperature of the photographed object and the distance between the object and the camera; within the monitoring range, the influence of the distance can be ignored, and the fire point detection threshold depends on the difference between the temperature of the fire point area and the surrounding background temperature, which can generally be given by actual measurement.
  • the simulated fire point is used for the actual measurement, and the simulated fire point can be an alcohol lamp or other fire source.
  • the average gray value I_b of the background non-simulated fire point area is calculated.
  • the gray value of the fire point area is the largest.
  • the first fire point detection threshold I_1 and the second fire point detection threshold I_2 are selected between I_fire and I_b such that I_fire > I_1 > I_2 > I_b.
  • Step 602 scan and cruise.
  • control the PTZ to pre-scan one round in the vertical direction, focus clearly on the upper and lower edges of the image corresponding to each official scanning path, and record the F_F and F_N values of each round of scanning in turn.
  • the target position of the focusing motor can be determined according to F_F and F_N when the image is captured.
  • the focus motor positions F_F and F_N corresponding to the two monitoring distances can be obtained respectively, or determined directly according to the focus motor position values obtained by the pre-scanning.
  • the focus motor position is set to F during the scanning process, where F is between F_N and F_F.
  • the embodiment of the present application does not limit the setting manner of F.
  • when the gimbal rotates in the horizontal direction, the monitoring distance generally does not change; when the gimbal rotates in the vertical direction, the monitoring distance changes, and the focus motor position F during scanning needs to be re-determined according to the current monitoring object distance.
  • Step 603 fire point area judgment.
  • the fire point is judged for each frame image captured by the camera.
  • the process is divided into two steps: the first round of judgment and the confirmation of the suspected fire point.
  • the first round of judgment traverses the image with a fire point detection template of four pixels, calculates the average gray value I_avg of all pixels in the template, and uses the average gray value to judge the fire point, which can reduce the interference of random noise from a single pixel. If the average gray value I_avg of the current area is greater than the first fire point detection threshold I_1, it is considered a fire point area; if the average gray value is less than I_1 but greater than or equal to the second fire point detection threshold I_2, it is considered a suspected fire point area.
  • after the entire image is searched, if no fire point area or suspected fire point area is found, scanning continues with the next frame. If a suspected fire point is found, the suspected fire point confirmation step is entered.
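The first-round traversal with the four-pixel template can be sketched as follows, using a 2x2 template over a nested-list gray image (the coordinate convention and return format are illustrative):

```python
def first_round_judgment(image, i1, i2):
    """Slide a 2x2 (four pixel) fire point detection template over the gray
    image, average the pixels under the template to suppress single-pixel
    noise, and collect fire point areas (avg > i1, here >=) and suspected
    areas (i2 <= avg < i1) as top-left template coordinates."""
    fire, suspected = [], []
    h, w = len(image), len(image[0])
    for y in range(h - 1):
        for x in range(w - 1):
            avg = (image[y][x] + image[y][x + 1]
                   + image[y + 1][x] + image[y + 1][x + 1]) / 4
            if avg >= i1:
                fire.append((x, y))
            elif avg >= i2:
                suspected.append((x, y))
    return fire, suspected
```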
  • the suspected fire point may be some sunlight reflection or a high temperature object, or it may be a real fire point; but because the fire point is outside the effective depth of field of the current imaging, its image through the lens is more divergent, resulting in a relatively low average gray value that does not meet the first fire point detection threshold judgment condition. Therefore, the authenticity of the suspected fire point must be confirmed. According to the number of suspected fire points in the whole image, there are two processing methods.
  • if there is one suspected fire point, it is expanded outward into a region of interest, and regional auto-focusing is performed on this region so that the suspected fire point is imaged the most clearly. The average gray value of the suspected fire point is then recalculated: if it is greater than or equal to I_1, the area is considered a fire point area; if it is less than I_1, it is considered a non-fire point area. If multiple suspected fire points are found, the focus motor position of the lens is set at the F_N of the image and then moved to F_F with a fixed step size.
  • the camera collects a frame of image, and the first fire detection threshold is used to judge the fire point according to the grayscale of the suspected fire point area in the image.
  • when the focus motor moves to F_F, the confirmation of the suspected fire points is completed. In this process, the numbers of fire point areas and non-fire point areas among the suspected fire point areas are counted. If all the suspected fire point areas are non-fire point areas, and no fire point area was found in the first round of judgment, the cruise scan continues. If a fire point area is found, proceed to the next step to confirm the fire point position.
  • Step 604 determining the fire point position.
  • Figure 7 shows a schematic diagram of the location of the fire spot in the image.
  • the actual longitude and latitude information of the fire point area can be determined.
  • the object distance information from the fire point area to the PTZ camera can be determined according to the position of the focus motor when the image frame was collected, using the pre-calibrated relationship between focus motor position and object distance. After accurate positioning of the fire point area, the location information is sent to the control center for a fire point alarm.
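The inverse lookup from the focus motor position at capture time back to the fire point object distance can be sketched as below, again assuming linear interpolation between the pre-calibrated pairs and motor positions that increase monotonically with distance:

```python
def distance_for_motor_position(calib, f):
    """Recover the fire point object distance from the focus motor position
    f, using pre-calibrated (distance, motor position) pairs sorted by
    distance; positions outside the calibrated range clamp to the ends."""
    positions = [c[1] for c in calib]
    if f <= positions[0]:
        return calib[0][0]
    if f >= positions[-1]:
        return calib[-1][0]
    for (d0, f0), (d1, f1) in zip(calib, calib[1:]):
        if f0 <= f <= f1:
            return d0 + (d1 - d0) * (f - f0) / (f1 - f0)
```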
  • the embodiment of the present application determines the position of the focusing motor based on the minimum object distance and the maximum object distance in the image, thereby realizing maximum-range detection of fire points at multiple object distances in the collected image; secondly, after determining the suspected fire point areas in the image, the focus motor searches between the nearest position and the farthest position corresponding to the current image, so that the suspected fire point areas in the image can be clearly focused, realizing detection of suspected fire point areas at multiple object distances in the image and improving the accuracy of fire point detection.
  • the embodiment of the present application realizes that the thermal imaging camera can detect the fire point at different object distances in the multi-object distance scene during the scanning cruise, thereby improving the fire point detection accuracy.
  • FIG. 8 is a schematic structural diagram of a fire point detection device in Embodiment 3 of the present application. This embodiment can be applied to a fire point detection situation in a multi-object distance scenario. As shown in Figure 8, the device includes:
  • the image acquisition module 810 is configured to acquire the image to be detected collected by the image acquisition device based on the focusing lens group at the target position; wherein, the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the to-be-detected image;
  • the template moving module 820 is configured to move the fire detection template in the image to be detected according to the preset movement rule, and the range covered by the fire detection template is used as the detection area;
  • the fire point determination module 830 is configured to determine whether the detection area is a fire point area according to pixel values in the detection area.
  • the target position of the focusing lens group is determined based on the minimum object distance and the maximum object distance in the collected image, and the image to be detected is collected based on the target position;
  • whether the detection area is a fire point area is determined according to the pixel values of the detection area.
  • the target position of the focusing lens group is determined according to the object distances in the captured image, so that under the current object distances, using the focusing lens group at the target position achieves the maximum detection clarity, ensuring that in a multi-object-distance scenario, fire points at multiple object distances in the scene image are detected, thereby improving the accuracy of fire point detection.
  • the device further includes a fire point detection threshold determination module, configured to: before determining whether the detection area is a fire point area according to the pixel values in the detection area, obtain a simulated image of a simulated fire point collected by the image acquisition device based on the focusing lens group at a simulated position, and determine the pixel value of the simulated fire point area where the simulated fire point is located in the simulated image; wherein the simulated position of the focusing lens group is determined according to the object distance between the simulated fire point and the image acquisition device;
  • the first fire point detection threshold and the second fire point detection threshold are determined according to the pixel value of the simulated fire point area and the pixel value of the non-simulated fire point area; wherein the pixel value of the simulated fire point area is greater than the first fire point detection threshold, the first fire point detection threshold is greater than the second fire point detection threshold, and the second fire point detection threshold is greater than the pixel value of the non-simulated fire point area.
  • the fire point judgment module includes:
  • a fire point determination unit configured to determine that the detection area is a fire point area if the pixel value in the detection area is greater than or equal to the first fire point detection threshold
  • the non-fire point determination unit is configured to determine that the detection area is not a fire point area if the pixel value in the detection area is smaller than the second fire point detection threshold.
  • the fire point judgment module further includes:
  • a suspected fire point determination unit configured to determine that the detection area is a suspected fire point area if the pixel value in the detection area is less than the first fire point detection threshold and greater than or equal to the second fire point detection threshold;
  • the secondary detection unit is configured to perform secondary detection on the suspected fire point area to obtain the fire point area in the suspected fire point area.
  • the secondary detection unit performs secondary detection on the suspected fire point area in the following manner, and obtains the fire point area in the suspected fire point area:
  • if the number of suspected fire point areas is one, perform regional auto-focusing on the suspected fire point area and determine the focused pixel value of the suspected fire point area in the focused image; if the focused pixel value is greater than or equal to the first fire point detection threshold, the suspected fire point area is determined to be a fire point area.
  • the device further includes an object distance calibration module, and the object distance calibration module is configured to determine at least two calibrated positions of the focusing lens group at which the images of objects at at least two calibrated object distances captured by the image acquisition device are the clearest;
  • the relationship between the object distance and the position of the focusing lens group when the image is the clearest is determined according to the at least two calibrated object distances and the at least two calibrated positions of the focusing lens group;
  • the image acquisition module includes a first determination unit for the target position of the motor, and the first determination unit for the target position of the motor is set to:
  • the target position of the focusing lens group is determined according to the nearest position and the farthest position of the focusing lens group.
  • the device further includes a scan path determination module, and the scan path determination module is set to:
  • the scanning path of the image captured by the image capturing device is determined according to the actual detected object distance range.
  • the scanning path of the image acquisition device for acquiring images includes at least two scanning inflection points, and at each scanning inflection point, the lens of the image acquisition device is rotated in the vertical direction; the device further includes a pre-scanning module, and the pre-scanning module is set to:
  • determine the nearest position and the farthest position of the focusing lens group when the image is clear according to the minimum object distance and the maximum object distance in the pre-scanned image, and establish the relationship between each scanning inflection point and the nearest and farthest positions of the focusing lens group;
  • the image acquisition module includes a second determination unit for the target position of the motor, and the second determination unit for the target position of the motor is set to:
  • the current closest position and the current farthest position of the focusing lens group are determined according to the currently experienced scanning inflection point;
  • the target position of the focusing lens group is determined according to the current closest position and the current farthest position.
  • the fire point detection device provided by the embodiment of the present application can execute the fire point detection method provided by any embodiment of the present application, and has functional modules corresponding to the execution of the fire point detection method.
  • FIG. 9 is a schematic structural diagram of an electronic device provided in Embodiment 4 of the present application.
  • FIG. 9 shows a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present application.
  • the electronic device 12 shown in FIG. 9 is only an example, and should not impose any limitations on the functions and scope of use of the embodiments of the present application.
  • the electronic device 12 takes the form of a general-purpose computing device.
  • Components of the electronic device 12 may include, but are not limited to, at least one processor or processing unit 16, a system storage device 28, and a bus 18 connecting various system components including the system storage device 28 and the processing unit 16.
  • the bus 18 represents at least one of several types of bus structures, including a storage device bus or storage device controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures.
  • these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • Electronic device 12 typically includes a variety of computer system readable media. These media can be any available media that can be accessed by electronic device 12, including both volatile and non-volatile media, removable and non-removable media.
  • System storage 28 may include computer system readable media in the form of volatile storage, such as random access memory (RAM) 30 and/or cache storage 32 .
  • Electronic device 12 may include other removable/non-removable, volatile/non-volatile computer system storage media.
  • storage system 34 may be configured to read and write non-removable, non-volatile magnetic media (not shown in FIG. 9, commonly referred to as a "hard drive").
  • a magnetic disk drive for reading and writing removable non-volatile magnetic disks (such as "floppy disks") and an optical disk drive for reading and writing removable non-volatile optical disks (such as Compact Disc Read-Only Memory) may be provided.
  • each drive may be connected to bus 18 via at least one data medium interface.
  • the storage device 28 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of various embodiments of the present application.
  • a program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in the storage device 28; such program modules 42 include, but are not limited to, an operating system, at least one application program, other program modules, and program data, and an implementation of a network environment may be included in each or some combination of these examples.
  • Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
  • the electronic device 12 may also communicate with at least one external device 14 (e.g., a keyboard, pointing device, display 24, etc.), with at least one device that enables a user to interact with the electronic device 12, and/or with any device (e.g., network card, modem, etc.) that enables the electronic device 12 to communicate with at least one other computing device. Such communication may take place through an input/output (I/O) interface 22. Also, the electronic device 12 can communicate with at least one network (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) through a network adapter 20. As shown in FIG. 9, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18.
  • the processing unit 16 executes various functional applications and data processing by running the programs stored in the system storage device 28, such as implementing the fire detection method provided by the embodiments of the present application, including:
  • the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the to-be-detected image
  • Whether the detection area is a fire spot area is determined according to pixel values in the detection area.
  • the fifth embodiment of the present application also provides a computer-readable storage medium, on which a computer program is stored, and when the program is executed by a processor, implements the fire point detection method provided by the embodiment of the present application, including:
  • the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the to-be-detected image
  • Whether the detection area is a fire spot area is determined according to pixel values in the detection area.
  • the computer storage medium of the embodiments of the present application may adopt any combination of at least one computer-readable medium.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • a computer-readable storage medium can be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal in baseband or as part of a carrier wave, with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms including, but not limited to, electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any suitable medium, including but not limited to wireless, wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out the operations of the present application may be written in one or more programming languages, or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (e.g., through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Toxicology (AREA)
  • General Health & Medical Sciences (AREA)
  • Vascular Medicine (AREA)
  • Fire-Detection Mechanisms (AREA)

Abstract

Embodiments of the present application disclose a fire point detection method and apparatus, an electronic device, and a storage medium. In the embodiments, the target position of the focusing lens group is determined according to the minimum object distance and the maximum object distance in the captured image, so that images captured with the focusing lens group at the target position achieve the greatest attainable detection sharpness. When checking suspected fire point areas, the focusing lens group is moved from the nearest position, corresponding to the minimum object distance, to the farthest position, corresponding to the maximum object distance; this sweep along the focusing-lens-group dimension solves the problem that, in multi-object-distance scenes, the lens depth of field is insufficient to cover all fire points.

Description

火点检测方法、装置、电子设备和存储介质
本申请要求在2020年12月29日提交中国专利局、申请号为202011587808.6的中国专利申请的优先权,该申请的全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及图像监控技术领域,例如涉及一种火点检测方法、装置、电子设备和存储介质。
背景技术
火灾是一种常见且频发的灾害,具有突发性强、破坏性大、处置救援困难的特点,严重影响到了人们的生命和财产安全。因此,在火灾监测环节中,发现火点的及时性与准确性尤为重要,能够使相关人员尽快的采取抢救措施,最大限度的降低火灾带来的损失。
相关技术中的火点检测方法可以通过热成像云台摄像机进行检测,将热成像云台摄像机架设在监控高点上,对监控场景的遍历巡航扫描,即可以实现对监控场景火点检测的全覆盖,一旦检测到着火点,相机立刻通过网络对控制中心发出报警,同时可以通过云台方位定位着火点的位置信息。控制中心可以快速发现着火点,在火情早期采取措施,避免更大的损失和灾难。而为了覆盖更远的监控距离,一般森林防火或者其他较大监控面积领域使用的热成像镜头焦距一般在50mm以上。根据镜头成像原理,焦距越大,景深越小。
当摄像机的监控场景为多物距场景时,如果在扫描巡航中仍然按照固定的聚焦镜片组的位置进行图像采集,则会造成当火点在偏离相机有效景深之外的位置,且火点面积较小时,热辐射强度不够,相机成像亮度不高,会造成火点漏检问题。
发明内容
本申请实施例提供一种火点检测方法、装置、电子设备和存储介质,以提 高多物距下火点检测的准确度。
第一方面,本申请实施例提供了一种火点检测方法,包括:
获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像;其中,聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定;
按照预先设置的移动规则在所述待检测图像中移动火点检测模板,以火点检测模板覆盖的范围作为检测区域;
根据所述检测区域中的像素值确定所述检测区域是否为火点区域。
第二方面,本申请实施例还提供了一种火点检测装置,包括:
图像获取模块,设置为获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像;其中,聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定;
模板移动模块,设置为按照预先设置的移动规则在所述待检测图像中移动火点检测模板,以火点检测模板覆盖的范围作为检测区域;
火点判断模块,设置为根据所述检测区域中的像素值确定所述检测区域是否为火点区域。
第三方面,本申请实施例还提供了一种电子设备,包括:
至少一个处理器;
存储装置,设置为存储至少一个程序,
当所述至少一个程序被所述至少一个处理器执行,使得所述至少一个处理器实现如本申请第一方面所述的火点检测方法。
第四方面,本申请实施例还提供了一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现如本申请第一方面所述的火点检测方法。
附图说明
图1是本申请实施例一中的火点检测方法的流程图;
图2是一种S型扫描路径的示意图;
图3是火点检测模板的示意图;
图4是模拟火点区域在成像清晰点附近的像素值衰减示意图;
图5是疑似火点区域实际检测范围示意图;
图6是本申请实施例二中的火点检测方法的流程图;
图7是火点区域在图像中位置确定示意图;
图8是本申请实施例三中的火点检测装置的结构示意图;
图9是本申请实施例四中的电子设备的结构示意图。
具体实施方式
下面结合附图和实施例对本申请作详细说明。
实施例一
图1是本申请实施例一中的火点检测方法的流程图,本实施例可适用于多物距场景下的火点检测情况。该方法可以由火点检测装置来执行,该装置可以采用软件和/或硬件的方式实现,并可配置在电子设备中,例如电子设备可以是后台服务器等具有通信和计算能力的设备。如图1所示,该方法包括:
步骤101、获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像;其中,聚焦镜片组的目标位置根据待检测图像中的最小物距和最大物距确定。
其中,图像采集装置是指可以实现远距离监控的摄像机,为了实现远距离监控,摄像机需要使用长焦镜头,而基于长焦镜头的成像原理,长焦镜头必然存在景深较小的问题。在本申请实施例中,为了提高火点检测的准确率,本申请中的图像采集装置为热成像云台摄像机,摄像机可以通过云台转动实现对监控场景的遍历巡航扫描,对监控场景的火点检测达到全覆盖。在图像采集装置中为了满足不同的监控距离,镜头中有一组可以调节位置的镜片组,称为聚焦镜片组,聚焦镜片组有聚焦电机带动进行前后调节,可以使得不同物距下的物 体成像清晰,并且在成像清晰时物距与聚焦镜片的位置是一一对应的。物距是指相机与观测目标之间的距离,待检测图像中的最小物距是指待检测图像中最近的监控距离,最远物距是指待检测图像中最远的监控距离。根据相机成像规律,最小物距是指待检测图像最下方的观测目标与相机之间的距离,最大物距是指待检测图像最上方的观测目标与相机之间的距离。
对于图像采集装置在巡航扫描过程中采集图像时,根据采集图像画面中的最小物距和最大物距确定聚焦镜片组的目标位置,即确定聚焦电机的目标位置。示例性的,若要使图像采集装置对最小物距的目标物体成像最清晰,则存在一与该最小物距对应的聚焦电机的最近位置,同样,如要使图像采集装置对最大物距的目标物体成像最清晰,也存在一与该最远物距对应的聚焦电机的最远位置,当聚焦电机在最近位置时,对采集图像的最下方物体成像最清晰,离最下方物体的距离越远成像越不清晰;当聚焦电机在最远位置时,对采集图像的最上方物体成像最清晰,离最上方物体的距离越远成像越不清晰。因此根据待检测图像中的最小物距和最大物距确定聚焦镜片组的目标位置以使得整个图像画面中的物体相对来说是最清晰的,为了兼顾整个图像画面的清晰度,扫描过程设置聚焦镜片组的目标位置为F,即确定聚焦电机的目标位置为F,F位于聚焦电机的最近位置和最远位置之间。示例性的,聚焦电机的最近位置为F N,最远位置为F F,则F=(F N+F F)/2,监控范围内的待检测图像的清晰度最佳。在本申请实施例中对F的设置方式并不限定。
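The focusing rule above (nearest sharp position F_N, farthest sharp position F_F, scan position F between them) can be sketched as follows; the midpoint choice F = (F_N + F_F)/2 is the example given in the text, and the function name and motor-step units are illustrative:

```python
def target_focus_position(f_near: float, f_far: float) -> float:
    """Focus-motor target position F for one scan frame.

    f_near: motor position at which the minimum object distance
            (bottom of the frame) images sharpest.
    f_far:  motor position at which the maximum object distance
            (top of the frame) images sharpest.
    The embodiment only requires F to lie between the two; the
    midpoint used here is the example the description gives.
    """
    return (f_near + f_far) / 2.0
```

With the nearest point sharp at motor step 1200 and the farthest at step 1800, the frame would be scanned at step 1500.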
可选的,由于图像采集装置在水平方向上扫描巡航时,采集图像画面中的最小物距和最大物距不会发生较大的变化,当图像采集装置在垂直方向上扫描巡航时,采集图像画面中的最小物距和最大物距会发生变化。因此获取到图像采集装置的镜头在垂直方向上发生变化时,对采集图像中的最小物距和最大物距进行确定,进而确定聚焦镜片组的目标位置;若没有获取到图像采集装置的镜头在垂直方向上发生变化,图像采集装置仅在水平方向上进行扫描巡航采集图像时,可以保持聚焦镜片组的位置保持不变。示例性的,监测到云台在垂直 方向转动时,根据当前监控画面中的最小物距和最大物距重新确定采集待检测图像时的聚焦镜片组的目标位置。
在一个可行的实施例中,在步骤101之前,还包括:
确定使图像采集装置在至少两个标定物距下的物体成像最清晰时,聚焦镜片组的至少两个标定位置;
根据至少两个标定物距和聚焦镜片组的至少两个标定位置,确定成像最清晰时物距与聚焦镜片组的位置之间的关系;
相应的,聚焦镜片组的目标位置根据待检测图像中的最小物距和最大物距确定,包括:
基于物距与聚焦镜片组的位置之间的关系,根据待检测图像中的最小物距和最大物距确定关联的聚焦镜片组的最近位置和最远位置;
根据聚焦镜片组的最近位置和最远位置确定聚焦镜片组的目标位置。
其中,标定物距根据监控范围进行确定。从监控范围中选取至少两个物距作为标定物距,标定物距的数量和范围依据监控要求的精度和实际使用监控范围进行设置,在此不作限制。示例性的,监控范围为10米到50米,则标定物距可以设置为10米、20米、30米、40米以及50米。
测量在选取得到的至少两个标定物距下的物体成像最清晰时所对应的聚焦镜片组的位置,确定为标定位置。以此建立所述标定物距与所述标定位置之间的对应关系。由于相机的成像原理,物距与清晰点聚焦镜片组的位置是呈非线性关系的,因此根据标定物距和标定位置进行分段插值,可以得到每个物距对应的聚焦镜片组的位置。
在图像采集装置需要确定聚焦镜片组的目标位置时,根据分段插值结果,确定采集图像中最小物距对应的聚集镜片组的最近位置,以及最大物距对应的聚焦镜片组的最远位置,从而确定聚焦镜片组的目标位置。示例性的,得到最小物距后,确定最小物距所处在的两个标定物距之间,进而对该两个标定物距对应的聚焦电机的位置进行非线性插值,得到聚焦电机的目标位置。
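A minimal sketch of the calibrated lookup described above, assuming piecewise-linear interpolation between calibration pairs (the text only says piecewise interpolation is used; the linear segments and all numbers here are illustrative, not from the patent):

```python
import bisect

# Calibration pairs measured in advance: (object distance in metres,
# focus-motor position at which that distance images sharpest).
CALIBRATION = [(10, 2000), (20, 1500), (30, 1200), (40, 1050), (50, 1000)]

def focus_position_for_distance(distance: float) -> float:
    """Interpolate between the two calibration points bracketing
    `distance`; clamp to the table's ends outside the calibrated range."""
    dists = [d for d, _ in CALIBRATION]
    if distance <= dists[0]:
        return CALIBRATION[0][1]
    if distance >= dists[-1]:
        return CALIBRATION[-1][1]
    i = bisect.bisect_right(dists, distance)
    (d0, f0), (d1, f1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (distance - d0) / (d1 - d0)
    return f0 + t * (f1 - f0)
```

The same table can be queried for the frame's minimum and maximum object distances to obtain F_N and F_F during the scan.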
通过对物距和聚焦电机的位置关系进行预先标定,实现在实际扫描巡航采集图像时可以快速确定当前采集图像所对应的聚焦电机的目标位置,提高对图像采集的效率。避免采集时实时确定聚焦电机的位置导致耗时较长。
在一个可行的实施例中,图像采集装置采集图像的扫描路径中包括至少两个扫描拐点,在每个扫描拐点处对图像采集装置的镜头进行垂直方向上的转动,该火点检测方法还包括:
获取图像采集装置在每个扫描拐点处进行垂直方向上的转动后的预扫描图像;
根据预扫描图像中的最小物距和最大物距确定成像清晰时聚焦镜片组的最近位置和最远位置,并建立每个扫描拐点和聚焦镜片组的最近位置和最远位置的关联关系;
相应的,聚焦镜片组的目标位置根据待检测图像中的最小物距和最大物距确定,包括:
确定图像采集装置的当前经历扫描拐点;
基于至少两个扫描拐点中的每个扫描拐点和聚焦镜片组的最近位置和最远位置的关联关系,根据当前经历扫描拐点确定聚焦镜片组的当前最近位置和当前最远位置;
根据当前最近位置和当前最远位置确定聚焦镜片组的目标位置。
其中,图像采集装置是按照预先规划的扫描路径进行巡航扫描,在扫描过程中采集图像。预先规划的扫描路径是根据监控区域的划定范围,通过合理规划,使得扫描结束后能实现全覆盖监控区域,扫描路径的规划需要根据监控区域的角度以及当前图像采集装置的视场角确定,保证不重复不遗漏。为了在保证对监控区域的全覆盖的同时,使得图像采集装置所扫描的路径最短,扫描路径一般都规划为S型,如图2所示为一种S型扫描路径的示意图。按照该扫描路径进行采集图像时,存在水平方向上的扫描以及垂直方向上的扫描,在扫描过程中所采集图像中的最小物距对应的聚焦镜片组的最近位置即近处清晰点F N, 最大物距对应的聚焦镜片组的最远位置即远处清晰点F F,在扫描路径中的水平扫描过程中采集到的每一张图像中的F N和F F保持不变,只有在扫描路径中的垂直扫描过程中采集到的图像中的F N和F F会发生变化。
因此,为了提高后续扫描过程中确定每个扫描点处采集图像的近处清晰点和远处清晰点的效率,在扫描路径规划完成后,控制图像采集装置在垂直方向上进行预扫描,获取垂直扫描路径上每个扫描拐点处的预扫描图像,并对每个预扫描图像中的最小物距和最大物距分别进行自动聚焦,依次记录每张预扫描图像中的F N和F F,并建立每个扫描拐点的垂直方向和聚焦镜片组的最近位置F N和最远位置F F的关联关系。示例性的,建立云台垂直转动方向和F N、F F的映射关系。
在正式扫描过程中,确定图像采集装置的当前经历扫描拐点,即确定图像采集装置的当前垂直方向,并根据垂直方向和聚焦镜片组的最近位置F N和最远位置F F的关联关系,从中确定与当前垂直方向关联的当前最近位置和当前最远位置,根据当前最近位置和当前最远位置确定聚焦镜片组的目标位置。示例性的,若采用扫描图像中的最近位置和最远位置的平均值作为目标位置,则可以直接建立垂直方向和目标位置的关联关系,即在正式扫描过程中可以直接根据当前垂直方向确定当前目标位置,提高图像采集的效率。
可选的,对图像采集装置的垂直方向进行监测,若未检测到垂直方向上发生变化,则直接保持聚焦镜片组的当前目标位置不变;若检测到垂直方向上发生变化,此时需要根据当前垂直方向重新确定对应的聚焦镜片组的目标位置,即确定采集图像时的垂直方向,并对当前目标位置进行更新。
由于相关技术中的巡航方案中一般是在每个扫描拐点自动对焦一次,使得画面全局清晰,但无法定量判断每个局部区域的清晰程度。即无法保证自动对焦时聚焦镜片组的位置可以保证整个画面中的局部清晰程度最高。因此在本申请实施例中根据采集图像中的最小物距和最大物距进行确定聚焦镜片组的目标位置,使得既保证了全局图像的清晰度,也保证了局部区域的清晰程度,提高 对火点检测的准确度。
步骤102、按照预先设置的移动规则在待检测图像中移动火点检测模板,以火点检测模板覆盖的范围作为检测区域。
其中,火点检测模板是指用于对火点位置进行定位的区域模板,可以根据火点检测的实际精度进行确定,在此不作限制。示例性的,火点检测模板的示意图如图3所示,图3中所示例的三种类型的火点检测模板只是其中的几种示例,并不局限于这三种。例如,在本申请实施例中采用类型a的火点检测模板,即通过四个像素的火点检测模板在待检测图像中进行移动遍历搜索。预先设置的移动规则用于对火点检测模板在待检测图像中的移动步长和移动方向进行设置,移动步长可以根据火点检测模板的大小进行设置,在此不作限制。
示例性的,在图像采集装置扫描巡航中,对采集到的每帧图像进行火点检测。每采集到一帧图像后,使用火点检测模板在待检测图像中进行顺序移动遍历,确定火点检测模板移动遍历过程中所覆盖的每个检测区域。火点检测模板移动步长也可以根据实际需求进行设置,在此不作限制。
步骤103、根据检测区域中的像素值确定检测区域是否为火点区域。
由于图像采集装置为热成像摄像机,热成像摄像机是利用红外探测器捕捉物体辐射的红外光,然后根据红外辐射强度的大小,呈现出物体的图像。以灰度图像为例,像素值越高代表该点红外辐射越强,温度也越高。因此根据检测区域中的像素值大小可以确定检测区域内的温度是否过高,在检测区域内的温度过高的情况下,确定该检测区域为火点区域。示例性的,确定每个检测区域内的平均像素值,根据平均像素值与预先确定的火点检测阈值的比较结果确定是否为火点区域,火点检测阈值可以根据火点的实际温度进行确定,通过检测区域内多个像素点的平均像素值进行判断,可以减少单个像素点随机噪声的干扰,提高火点检测的准确率。
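A sketch of the template traversal and average-pixel-value test described above, using the four-pixel (2x2) "type a" template; the step size, threshold value, and plain-list image representation are illustrative choices of this sketch:

```python
def mean_of_window(img, top, left, h=2, w=2):
    """Average pixel value inside the template footprint; averaging over
    several pixels damps single-pixel random noise, as the text notes."""
    vals = [img[r][c] for r in range(top, top + h) for c in range(left, left + w)]
    return sum(vals) / len(vals)

def scan_for_hot_windows(img, threshold, h=2, w=2, step=1):
    """Slide the template over the frame and return the top-left corners
    of every window whose mean pixel value reaches `threshold`."""
    rows, cols = len(img), len(img[0])
    hits = []
    for top in range(0, rows - h + 1, step):
        for left in range(0, cols - w + 1, step):
            if mean_of_window(img, top, left, h, w) >= threshold:
                hits.append((top, left))
    return hits
```

In the full method the returned windows would then be classified against the two fire-detection thresholds rather than a single cutoff.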
在一个可行的实施例中,在步骤103之前,还包括:
获取图像采集装置基于聚焦镜片组在模拟位置处对模拟火点采集的模拟图像,并确定模拟图像中模拟火点所在的模拟火点区域的像素值;其中,聚焦镜片组的模拟位置根据模拟火点与图像采集装置之间的物距确定;
确定模拟图像中非模拟火点区域的像素值;
根据模拟火点区域的像素值和非模拟火点区域的像素值确定第一火点检测阈值和第二火点检测阈值;其中,模拟火点区域的像素值大于第一火点检测阈值,第一火点检测阈值大于第二火点检测阈值,第二火点检测阈值大于非模拟火点区域的像素值。
其中,模拟火点是选取的对真实火点进行模拟测试的火源,例如,模拟火点可以采用酒精灯或其他易控制的火源。采用模拟火点进行实测对火点检测阈值进行提前标定,使得确定的火点检测阈值与真实火点更加接近,提高火点检测的准确率。
示例性的,将模拟火点放置在图像采集装置的有效检测区域内,例如,放在在距离图像采集装置的物距为D的位置处,根据热成像相机的工作原理,热成像相机采集到的图像灰度值和被拍摄的物体自身的温度以及物体和摄像机的物距相关,但是在森林防火等监控距离较远的领域,距离对采集图像灰度值的影响可以忽略,火点判断的检测阈值仅取决于火点温度和周围环境温度的差值,因此对物距D的大小并不作限制。
把模拟火点放置在距离图像采集装置的物距为D的地方,通过调节图像采集装置的聚焦电机的位置实现对聚焦镜片组的位置的调节,使得火点在图像采集装置中成像清晰,示例性的,采用对模拟火点区域进行区域自动对焦的方式实现模拟火点成像清晰。在模拟火点成像清晰的模拟图像中根据火点检测模板确定模拟火点所在的模拟火点区域,确定模拟火点区域的像素值大小,以及非模拟火点区域的像素值大小。示例性的,采用模拟火点区域的平均像素值作为真实火点区域的平均像素值,采用非模拟火点区域的平均像素值作为真实非火点区域的平均像素值。由于模拟火点是人为放置的,因此可以确定模拟火点区域即为真实火点,能反映真实火点区域的像素值水平,也可以确定非模拟火点区域无火点,因此非模拟火点区域的像素值可以反映背景的像素值水平。将模拟火点区域的像素值记为I火,非模拟火点区域的像素值记为Ib。
根据摄像机成像原理,当模拟火点在成像面上成像最清晰时,模拟火点区域像素值最大。当聚焦镜片组的位置偏离清晰点所对应的位置时,模拟火点的成像开始成发散的光斑,模拟火点区域的像素值开始降低,随聚焦镜片组的位置偏离清晰点越远,模拟火点区域的像素值逐渐降低至非火点区域背景同一水平Ib。因此,在模拟火点区域的像素值I火和非模拟火点区域的像素值Ib之间确定第一火点检测阈值I1和第二火点检测阈值I2,使得I火>I1>I2>Ib。火点检测阈值要满足与背景区域存在明显的区分度,尽可能减少误判,因此设置了两个火点检测阈值,第一火点检测阈值与模拟火点区域的像素值最接近,实现对火点区域的精准判断,同时对于聚焦镜片组处于目标位置下拍摄得到的待检测图像中,并非所有物距下的火点均能成像最清晰,因此设置大于非模拟火点区域的像素值小于第一火点检测阈值的第二火点检测阈值,以避免对成像非最清晰的火点的漏检。示例性的,采用如下公式确定第一火点检测阈值和第二火点检测阈值:I1=0.1*Ib+0.9*I火;I2=0.9*Ib+0.1*I火,但是本申请实施例并不对火点检测阈值的设置进行限制。
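The example threshold formulas above can be written directly; the 0.9/0.1 weights come from the text's example, and any pair satisfying I_fire > I1 > I2 > I_bg would be equally valid:

```python
def fire_thresholds(i_fire: float, i_bg: float) -> tuple:
    """First and second fire-detection thresholds from the measured
    in-focus fire-region value i_fire and background value i_bg
    (requires i_fire > i_bg, as in the calibration setup)."""
    i1 = 0.1 * i_bg + 0.9 * i_fire  # close to the in-focus fire level
    i2 = 0.9 * i_bg + 0.1 * i_fire  # just above the background level
    assert i_fire > i1 > i2 > i_bg  # the ordering the text requires
    return i1, i2
```

A detection window at or above I1 is declared a fire point, below I2 is background, and anything in between becomes a suspected fire point for secondary checking.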
在一个可行的实施例中,步骤103,包括:
若检测区域中的像素值大于等于第一火点检测阈值,则确定检测区域为火点区域;
若检测区域中的像素值小于第二火点检测阈值,则确定检测区域不是火点区域。
由于第一火点检测阈值接近模拟火点区域的像素值,因此若待检测图像中的检测区域内的像素值大于等于第一火点检测阈值,即可认为该检测区域为火点区域。由于第二火点检测阈值接近非模拟火点区域的像素值,因此若待检测图像中的检测区域内的像素值小于第二火点检测阈值,则确定检测区域不是火 点区域,更接近于背景区域。
示例性的,若检测区域中的平均像素值大于等于第一火点检测阈值,则确定检测区域为火点区域;若检测区域中的平均像素值小于第二火点检测阈值,则确定检测区域不是火点区域。
在一个可行的实施例中,步骤103,还包括:
若检测区域中的像素值小于第一火点检测阈值且大于等于第二火点检测阈值,则确定检测区域为疑似火点区域;
对疑似火点区域进行二次检测,得到疑似火点区域中的火点区域。
由于待检测图像中会存在多物距的火点,但是对于多物距的火点在同一图像上不会成像是最清晰的,因此为了避免对火点的漏检,若检测区域中的像素值位于第一火点检测阈值和第二火点检测阈值之间,确定该检测区域为疑似火点区域。对于疑似火点区域,可能是某些阳光反射或高温物体,也有可能是真实的火点,但因为疑似火点区域在当前成像的有效景深之外,疑似火点通过镜头在图像上的成像较为发散,导致疑似火点区域内的像素值较低,未满足大于等于第一火点检测阈值判断条件。因此需要对疑似火点区域进行二次检测,以保证对火点检测的准确性,避免误检和漏检。
对整个待检测图像中的检测区域检测完成后,若未发现火点区域和疑似火点区域则继续进行下一帧图像的扫描。若发现疑似火点区域,则根据疑似火点区域的数量进行二次检测。
在一个可行的实施例中,对疑似火点区域进行二次检测,得到疑似火点区域中的火点区域,包括:
确定待检测图像中的最小物距关联的聚焦镜片组的最近位置以及最大物距关联的聚焦镜片组的最远位置,将聚焦镜片组从最近位置移动至最远位置,并采集移动过程中的至少两幅图像,若目标疑似火点区域在至少两幅图像中任一幅图像中的像素值大于等于第一火点检测阈值,则确定目标疑似火点区域是火点区域。需要说明的是,疑似火点区域的数量为至少一个,本申请目标疑似火 点区域为所述至少一个疑似火点区域中的任意一个疑似火点区域。
在本申请实施例中,针对同一幅图像中存在多物距下的疑似火点区域检测时,通过使聚焦镜片组从该幅图像中最小物距关联的最近位置移动至最远位置,并在移动过程中采集多幅图像,在这多幅图像中任一疑似火点区域均会有对应的一幅图像使得该疑似火点区域的成像较为清晰。因此若任一疑似火点区域在任一幅图像中的像素值大于等于第一火点检测阈值,则确定该疑似火点区域为火点区域;若某疑似火点区域在所有图像中的像素值均小于第一火点检测阈值,则确定该疑似火点区域不是火点区域。示例性的,聚焦电机在控制聚焦镜片组的位置移动时可以按照预设步长进行移动,聚焦电机每移动一个步长采集一帧图像。对于每帧图像中对应的疑似火点区域的像素值和第一火点检测阈值比较;若疑似火点区域的像素值大于第一火点检测阈值,则确定目标疑似火点区域是火点区域。预设步长的设置可以根据图像采集装置所采集图像中的物距范围以及景深值进行确定,在此不作限制。
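The multi-region secondary check above (sweep the focus motor from the frame's nearest sharp position to its farthest in fixed steps, grabbing one frame per step and re-testing every suspected region) might be sketched as below; `capture_frame` and `region_mean` stand in for camera and image helpers and are assumptions of this sketch, not a real API:

```python
def confirm_by_focus_sweep(capture_frame, region_mean, regions,
                           f_near, f_far, i1, step=50):
    """Return the subset of suspected `regions` whose mean pixel value
    reaches the first threshold `i1` in at least one frame of the sweep.

    capture_frame(pos) -> frame captured with the focus motor at `pos`.
    region_mean(frame, region) -> mean pixel value of `region` in `frame`.
    """
    confirmed = set()
    pos = f_near
    while pos <= f_far:
        frame = capture_frame(pos)
        for region in regions:
            if region not in confirmed and region_mean(frame, region) >= i1:
                confirmed.add(region)
        pos += step
    return confirmed
```

A region that never reaches I1 at any focus position in the sweep is discarded as a non-fire area.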
在疑似火点区域二次检测完成后,若在待检测图像中均未发现火点区域,则直接对下一帧图像进行采集检测;若在待检测图像中发现火点区域,包括从疑似火点区域中确定的火点区域,则对所有的火点区域进行位置信息确定。
在一个可行的实施例中,对疑似火点区域进行二次检测,得到疑似火点区域中的火点区域,包括:
确定待检测图像中疑似火点区域的数量;
若疑似火点区域的数量为一个,则对疑似火点区域进行区域自动聚焦,并确定聚焦后的图像中疑似火点区域的聚焦像素值,若聚焦像素值大于等于第一火点检测阈值则确定疑似火点区域是火点区域。
若对待检测图像遍历完毕后,确定只有一个疑似火点区域,则采用将聚焦镜片组从最近位置移动至最远位置进行二次检测,则会造成聚焦效率慢。因此在待检测图像中只有一个疑似火点区域时直接对该区域进行区域自动聚焦,使得疑似火点区域成像最清晰,重新计算聚焦后疑似火点区域的聚焦像素值,若 聚焦像素值大于等于第一火点检测阈值,则确定该疑似火点区域为火点区域,若聚焦像素值小于第一火点检测阈值,则确定该疑似火点区域不是火点区域。示例性的,由于疑似火点区域是根据火点检测模板确定的,因此疑似火点区域所占的像素点较少,在自动聚焦前,以疑似火点区域为中心向外扩大至一个感兴趣区域,对该感兴趣区域进行区域自动聚焦后,确定感兴趣区域中疑似火点区域的聚焦像素值,其中,感兴趣区域的大小可以根据实际聚焦效果进行确定,在此不做限制。
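The region-of-interest growth described above (expand the small template region around its position before running area autofocus) is simple to sketch; the margin value and the (x0, y0, x1, y1) box convention are illustrative:

```python
def expand_roi(region, margin, frame_w, frame_h):
    """Grow a small template region into a larger region of interest
    centred on it, clamped to the frame bounds. The text leaves the
    ROI size to the deployment's actual focusing behaviour."""
    x0, y0, x1, y1 = region
    return (max(0, x0 - margin), max(0, y0 - margin),
            min(frame_w, x1 + margin), min(frame_h, y1 + margin))
```

The camera would then autofocus on the expanded box and re-test the original suspected region's mean value against the first threshold.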
由于待检测图像中仅有一个疑似火点区域,若该疑似火点区域为真实火点造成,则对其进行自动聚焦后,该疑似火点区域的像素值为接近模拟火点区域的像素值,因此直接与第一火点检测像素值比较,提高对火点检测的准确率。
在此基础上,若对待检测图像遍历完成后,确定有至少两个疑似火点区域,则说明至少两个疑似火点区域处于不同的物距下,因此若直接对其中一个疑似火点区域进行自动聚焦,则剩下的疑似火点区域仍会存在聚焦后的像素值小于第一火点检测像素值的情况。因此,在待检测图像中有至少两个疑似火点区域进行二次检测时,确定待检测图像中的最小物距关联的聚焦镜片组的最近位置以及最大物距关联的聚焦镜片组的最远位置,将聚焦镜片组从最近位置移动至最远位置,并采集移动过程中的至少两幅图像,若目标疑似火点区域在至少两幅图像中任一幅图像中的像素值大于等于第一火点检测阈值,则确定目标疑似火点区域是火点区域。
疑似火点区域的二次检测需要根据待检测图像中疑似火点区域的数量进行确定,因为数量不同表示了图像中存在目标检测物体的物距也不同,因此针对性的二次检测可以提高检测效率。
在一个可行的实施例中,该方法还包括:
移动聚焦镜片组的位置,确定模拟火点区域的像素值在移动过程中的像素值衰减路径上为第二火点检测阈值时,聚焦镜片组的第一位置和第二位置;
根据聚焦镜片组的第一位置和第二位置确定第一物距和第二物距;
根据第一物距和第二物距确定在图像采集装置采集的图像中对火点检测的实际检测物距范围;
根据实际检测物距范围确定图像采集装置采集图像的扫描路径。
在进行模拟火点实测时,模拟火点区域的物距所对应的聚焦清晰点的位置只有一个,当偏离该清晰点时模拟火点区域的像素值会逐渐降低。如图4所示为模拟火点区域在成像清晰点附近的像素值衰减示意图,图4中F即为使模拟火点在成像面上成像最清晰时对应的聚焦镜片组的位置,当聚焦镜片组位于F位置时,模拟图像上模拟火点区域的像素值为I火,非模拟火点区域的像素值为Ib,在I火和Ib之间确定I1和I2的值,在第二火点检测阈值确定后,在清晰点F远近两侧各存在一个聚焦镜片组的位置(即F1和F2)使得模拟火点区域的像素值为I2。若设置当前模拟图像采集时聚焦镜片组的位置为F,但是实际模拟火点的放置物距所对应的聚焦镜片组的位置应为F1或F2,则可以看出当聚焦镜片组的位置在F1和F2之间时,对模拟图像中的模拟火点区域检测得到的像素值大于第二火点检测阈值,可以判断为是疑似火点区域。示例性的,在对疑似火点区域进行二次检测时,通过移动聚焦镜片组的位置进行逐步搜索,如图5所示为疑似火点区域实际检测范围示意图,从图4中可以确定当聚焦镜片组位于F1或F2时,可以对该清晰点对应的物距下的疑似火点区域进行检测,疑似火点区域的像素值满足大于第一火点检测阈值的条件。
因此在第二火点检测阈值确定后,则可确定在模拟图像中实际能检测出的火点所对应的聚焦镜片组的位置F 1和F 2,根据预先标定的物距和聚焦镜片组的关系,可以确定与F 1和F 2对应的物距D 1和D 2,即在待检测图像中D 1和D 2之间的范围为单次扫描的实际检测物距范围,若存在真实火点位于该物距范围之外,则会出现该真实火点区域的像素值小于第二火点检测阈值,造成漏检。
因此若图像采集装置所采集的待检测图像中的FN和FF之间的范围大于F1和F2之间的范围,且在巡航扫描的路径中未设置重合部分图像,则会造成疑似火点的漏检。在本申请实施例中,根据实际检测物距范围设置采集图像的重合范围,以使得按照实际检测物距范围在扫描过程中能实现对监控区域的全覆盖。
本申请实施例基于所采集图像画面中的最小物距和最大物距确定聚焦镜片组的目标位置,基于该目标位置进行采集待检测图像;通过火点检测模板在待检测图像中的移动结果确定检测区域,根据检测区域的像素值确定检测区域是否是火点区域。根据采集图像画面中的物距确定聚焦镜片组的目标位置,使得在当前物距下,使用目标位置的聚焦镜片组可以实现最大程度的检测清晰度,以保证在多物距场景下,可以对场景图像中的多物距火点进行检测,提高火点检测的准确度。
实施例二
图6是本申请实施例二中的火点检测方法的流程图,本实施例二是本申请的一个可选实施例。如图6所示,该方法包括:
本申请实施例中包含一套热成像云台摄像机及其配套的网络设施。其中热成像云台摄像机采用一颗中长焦热成像镜头,镜头具备精确调焦功能,可以通过调节聚焦镜片组的位置实现对不同物距的物体成像清晰。
步骤601、参数标定。
参数标定是指标定热成像摄像机火点检测过程中必要的参数信息。需要标定的参数包括物距和聚焦电机的位置之间的关系。为了实现远距离监控,需要使用长焦镜头,长焦镜头必然存在景深较小的问题。为了满足不同的监控距离,镜头中有一组可调节的镜片组,称作聚焦镜片组,聚焦镜片组由聚焦电机带动进行前后调节,可以使得不同物距下的物体成像清晰,物距和聚焦清晰点是一一对应的。通过测定不同物距下的清晰点所对应的聚焦电机的位置,可以确定物距和聚焦电机的位置之间的关系。标定的物距组数和范围依据检测要求的精度和实际监控使用范围而定。在确定非标定物距对应的聚焦电机的位置时,采用标定物距中的数据分段插值进行确定。
需要标定的参数还包括火点检测阈值,火点检测阈值包括第一火点检测阈值和第二火点检测阈值。根据热成像相机的工作原理,热成像相机采集到的图像灰度值和被拍摄的物体自身的温度以及物体和摄像机的距离相关,但是在森林防火领域,监控距离较远,距离对图像灰度值的影响可以忽略,火点检测阈值取决于火点区域温度和周围背景环境温度的差值,一般可通过实测给出。
采用模拟火点来进行实测,模拟火点可采用酒精灯或其他火源。把模拟火点放置在距离摄像机物距为D处的地方,调节摄像机的聚焦电机位置,使得模拟火点在摄像机中成像清晰。根据火点检测模板,计算模拟火点区域的平均灰度大小,记做I火。同时计算背景非模拟火点区域平均灰度大小Ib。根据摄像机成像原理,当火点在成像面上成像清晰时,火点区域灰度值最大。当聚焦电机的位置偏离成像清晰点时,成像开始成发散的光斑,图像灰度开始降低,随着聚焦电机的位置偏离清晰点越远,火点区域的灰度逐渐降低至背景同一水平Ib,因此,在I火和Ib之间选定第一火点检测阈值I1和第二火点检测阈值I2,以实现I火>I1>I2>Ib。
步骤602、扫描巡航。
在扫描路径规划完成以后,控制云台先以垂直方向预扫描一轮,在每个正式扫描路径对应的图像的上边缘和下边缘分别聚焦清晰,依次记录每轮扫描的F F和F N的值。在正式扫描时,即可根据F F和F N,确定采集图像时的聚焦电机目标位置。
在每次扫描中,先要确定当前拍摄图像中最远和最近的监控物距。根据相机成像规律,最远端一般在图像最上方,最近端一般在图像最下方。分别记做D F和D N,根据已标定物距和聚焦电机位置之间的关系,可以得到这两个监控距离分别对应的聚焦电机位置F F和F N;或者直接根据预扫描得到的聚焦电机位置值进行确定。为了兼顾整个画面的清晰度,扫描过程设置聚焦电机位置为F,其中F在F N和F F之间,取F=(F F+F N)/2时监控范围最广,效果最佳,但本申请实施例不限制F的设置方式。当云台在水平方向转动时,一般监控距离不发生变化;当云台在垂直方向转动时,监控距离发生变化,此时需要根据当前监控 物距重新确定扫描时的聚焦电机位置F。
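Combining the pre-scan lookup with the midpoint rule, the per-tilt scan focus position might look like this; the tilt-angle keys and motor steps are illustrative values, not from the patent:

```python
# Pre-scan results: vertical (tilt) angle at each scan turning point,
# mapped to the (nearest, farthest) sharp focus-motor positions
# (F_N, F_F) recorded for the frame at that tilt.
PRESCAN = {0.0: (1200, 1800), -2.5: (1300, 1900), -5.0: (1400, 2000)}

def scan_focus_position(tilt_angle: float) -> float:
    """Look up the pre-scanned (F_N, F_F) for the current tilt and
    return the scan focus position F = (F_N + F_F) / 2, which stays
    fixed while the camera pans horizontally at that tilt."""
    f_near, f_far = PRESCAN[tilt_angle]
    return (f_near + f_far) / 2.0
```

Because the lookup is keyed by tilt angle, the focus position only needs recomputing when the pan-tilt head rotates vertically.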
步骤603、火点区域判断。
在扫描巡航过程中,对摄像机捕捉到的每帧图像进行火点判断。流程分为第一轮判断和疑似火点确认两个步骤。第一轮判断是通过四个像素的火点检测模板进行遍历搜索,计算模板内所有像素平均灰度值I avg,采用平均灰度值来判断火点,可以减小单个像素随机噪声的干扰。若当前区域平均灰度I avg大于第一火点检测阈值I 1,则认为是火点区域。若平均灰度小于I 1但是大于等于第二火点检测阈值I 2,则认为是疑似火点区域。对整个图像搜索完成后,若未发现火点区域和疑似火点区域则继续进行下一帧的扫描。若发现疑似火点区域,则进入疑似火点确认环节。对于疑似火点,可能是某些阳光反射或高温物体,也有可能是真实的火点,但因为火点在当前成像的有效景深之外,火点通过镜头在图像上的成像较为发散,导致平均灰度值较低,未满足第一火点检测阈值判断条件。因此,需要确认疑似火点的真伪。根据在整个图像中疑似火点的个数,分为两种处理方法。若发现一个疑似火点,则以疑似火点向外扩大至一个感兴趣区域,针对该区域进行一次区域自动对焦,使得疑似火点成像达到最清晰。重新计算疑似火点的平均灰度值,若疑似火点的灰度值大于或等于I 1,则认为是火点区域;在疑似火点的灰度值小于I 1的情况下,认为非火点区域。若发现有多个疑似火点区域,则将镜头的聚焦电机位置置于该图像的F N处,然后以固定的步长移动至F F处。其中聚焦电机每移动一个步长,摄像机采集一帧图像,针对该图像内疑似火点区域的灰度使用第一火点检测阈值进行火点判断。等聚焦电机运动到F F处,疑似火点确认结束。在这个过程中,统计疑似火点区域中火点区域和非火点区域的数量。若疑似火点区域中全部为非火点区域,且第一轮判断中未发现火点区域,则继续进行巡航扫描。若最终结果发现有火点区域,则进行下一步火点位置确认。
步骤604、火点位置确定。
确定火点区域所在的图像帧采集时的云台水平角度P、该图像帧的视场角α以及火点区域在图像中的位置(x,y)。火点区域在图像中位置确定示意图如图7所示。根据云台水平角度P和视场角α计算出图像左边缘的云台角度PL=P-α/2,进而根据图像左边缘的云台角度、图像的宽度W和高度H及火点区域在图像中的位置(x,y)确定火点区域的云台角度,P火=PL+x/W*α=P-(W/2-x)*α/W。
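The pan-angle formula above, P_fire = P_L + (x/W)*α = P - (W/2 - x)*α/W, can be checked with a small function (names and units are illustrative; angles in degrees, x in pixels):

```python
def fire_pan_angle(p_center: float, fov: float, x: float, width: float) -> float:
    """Horizontal pan angle of a fire region, given the frame's pan
    angle P (p_center), its horizontal field of view alpha (fov), the
    region's x coordinate, and the image width W (width)."""
    p_left = p_center - fov / 2.0          # pan angle of the left image edge
    return p_left + (x / width) * fov      # equals P - (W/2 - x) * fov / W
```

A region at the image centre (x = W/2) maps back to the frame's own pan angle, and the left edge (x = 0) to P - α/2, matching the two forms of the formula.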
确定火点区域的云台角度后,结合云台摄像机安装的经纬度信息,和火点区域到云台摄像机的物距信息,即可确定火点区域的实际经纬度信息。其中火点区域到云台摄像机的物距信息可以根据该图像帧采集时的聚焦电机的位置进行确定,根据预先标定的聚焦电机位置和物距之间的关系,得到火点的物距信息。对火点区域进行精确定位后,将位置信息发送至控制中心进行火点报警。
本申请实施例基于图像中的最小物距和最大物距确定聚焦电机的位置实现了对采集图像中多物距火点的最大范围的检测;其次,在确定图像中的疑似火点区域后,根据聚焦电机在当前图像对应的聚焦电机的最近位置和最远位置之间搜索,使得图像中的疑似火点区域可以聚焦清晰,实现图像中多物距下的疑似火点区域的检测,提高对火点的检测准确率;并且在检测到火点区域后,不用通过额外测距手段就可以准确定位火点的精确位置,提高对火情的阻断处理效率。本申请实施例实现热成像摄像机在扫描巡航中能检测到多物距场景中不同物距下的火点,提高火点检测准确率。
实施例三
图8是本申请实施例三中的火点检测装置的结构示意图,本实施例可适用于多物距场景下的火点检测情况。如图8所示,该装置包括:
图像获取模块810,设置为获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像;其中,聚焦镜片组的目标位置根据待检测图像中的最小物距和最大物距确定;
模板移动模块820,设置为按照预先设置的移动规则在待检测图像中移动火 点检测模板,以火点检测模板覆盖的范围作为检测区域;
火点判断模块830,设置为根据检测区域中的像素值确定所述检测区域是否为火点区域。
本申请实施例基于所采集图像画面中的最小物距和最大物距确定聚焦镜片组的目标位置,基于该目标位置进行采集待检测图像;通过火点检测模板在待检测图像中的移动结果确定检测区域,根据检测区域的像素值确定检测区域是否是火点区域。根据采集图像画面中的物距确定聚焦镜片组的目标位置,使得在当前物距下,使用目标位置的聚焦镜片组可以实现最大程度的检测清晰度,以保证在多物距场景下,可以对场景图像中的多物距火点进行检测,提高火点检测的准确度。
可选的,所述装置还包括火点检测阈值确定模块,设置为:在根据所述检测区域中的像素值确定所述检测区域是否为火点区域之前,获取图像采集装置基于聚焦镜片组在模拟位置处对模拟火点采集的模拟图像,并确定模拟图像中模拟火点所在的模拟火点区域的像素值;其中,聚焦镜片组的模拟位置根据所述模拟火点与图像采集装置之间的物距确定;
确定所述模拟图像中非模拟火点区域的像素值;
根据所述模拟火点区域的像素值和非模拟火点区域的像素值确定第一火点检测阈值和第二火点检测阈值;其中,所述模拟火点区域的像素值大于所述第一火点检测阈值,所述第一火点检测阈值大于所述第二火点检测阈值,所述第二火点检测阈值大于所述非模拟火点区域的像素值。
可选的,火点判断模块,包括:
火点确定单元,设置为若所述检测区域中的像素值大于等于所述第一火点检测阈值,则确定所述检测区域为火点区域;
非火点确定单元,设置为若所述检测区域中的像素值小于所述第二火点检测阈值,则确定所述检测区域不是火点区域。
可选的,火点判断模块,还包括:
疑似火点确定单元,设置为若所述检测区域中的像素值小于第一火点检测阈值且大于等于所述第二火点检测阈值,则确定所述检测区域为疑似火点区域;
二次检测单元,设置为对所述疑似火点区域进行二次检测,得到疑似火点区域中的火点区域。
可选的,二次检测单元,通过以下方式实现对所述疑似火点区域进行二次检测,得到疑似火点区域中的火点区域:
确定所述待检测图像中的最小物距关联的聚焦镜片组的最近位置以及最大物距关联的聚焦镜片组的最远位置,将所述聚焦镜片组从所述最近位置移动至所述最远位置,并采集移动过程中的至少两幅图像,若目标疑似火点区域在所述至少两幅图像中任一幅图像中的像素值大于等于所述第一火点检测阈值,则确定所述目标疑似火点区域是火点区域。
可选的,二次检测单元,通过以下方式实现对所述疑似火点区域进行二次检测,得到疑似火点区域中的火点区域:
确定所述待检测图像中疑似火点区域的数量;
若所述疑似火点区域的数量为一个,则对所述疑似火点区域进行区域自动聚焦,并确定聚焦后的图像中所述疑似火点区域的聚焦像素值,若所述聚焦像素值大于等于所述第一火点检测阈值则确定所述疑似火点区域是火点区域。
可选的,所述装置还包括物距标定模块,所述物距标定模块设置为在获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像之前,确定使所述图像采集装置在至少两个标定物距下的物体成像最清晰时,聚焦镜片组的至少两个标定位置;
根据所述至少两个标定物距和所述聚焦镜片组的至少两个标定位置,确定成像最清晰时物距与聚焦镜片组的位置之间的关系;
相应的,图像获取模块,包括电机目标位置第一确定单元,所述电机目标位置第一确定单元设置为:
基于所述物距与聚焦镜片组的位置之间的关系,根据所述待检测图像中的 最小物距和最大物距确定关联的聚焦镜片组的最近位置和最远位置;
根据所述聚焦镜片组的最近位置和最远位置确定所述聚焦镜片组的目标位置。
可选的,所述装置还包括扫描路径确定模块,所述扫描路径确定模块设置为:
移动聚焦镜片组的位置,确定所述模拟火点区域的像素值在移动过程中的像素值衰减路径上为第二火点检测阈值时,所述聚焦镜片组的第一位置和第二位置;
根据所述聚焦镜片组的第一位置和第二位置确定第一物距和第二物距;
根据所述第一物距和第二物距确定在所述图像采集装置采集的图像中对火点检测的实际检测物距范围;
根据所述实际检测物距范围确定所述图像采集装置采集图像的扫描路径。
可选的,所述图像采集装置采集图像的扫描路径中包括至少两个扫描拐点,在每个扫描拐点处对所述图像采集装置的镜头进行垂直方向上的转动,所述装置还包括预扫描模块,所述预扫描模块设置为:
获取所述图像采集装置在每个扫描拐点处进行垂直方向上的转动后的预扫描图像;
根据所述预扫描图像中的最小物距和最大物距确定成像清晰时聚焦镜片组的最近位置和最远位置,并建立所述每个扫描拐点和所述聚焦镜片组的最近位置和最远位置的关联关系;
相应的,图像获取模块,包括电机目标位置第二确定单元,所述电机目标位置第二确定单元设置为:
确定所述图像采集装置的当前经历扫描拐点;
基于所述至少两个扫描拐点中的每个扫描拐点和所述聚焦镜片组的最近位置和最远位置的关联关系,根据所述当前经历扫描拐点确定所述聚焦镜片组的当前最近位置和当前最远位置;
根据所述当前最近位置和当前最远位置确定所述聚焦镜片组的目标位置。
本申请实施例所提供的火点检测装置可执行本申请任意实施例所提供的火点检测方法,具备执行火点检测方法相应的功能模块。
实施例四
图9是本申请实施例四提供的一种电子设备的结构示意图。图9示出了适于用来实现本申请实施方式的示例性电子设备12的框图。图9显示的电子设备12仅仅是一个示例,不应对本申请实施例的功能和使用范围带来任何限制。
如图9所示,电子设备12以通用计算设备的形式表现。电子设备12的组件可以包括但不限于:至少一个处理器或者处理单元16,系统存储装置28,连接不同系统组件(包括系统存储装置28和处理单元16)的总线18。
总线18表示几类总线结构中的至少一种,包括存储装置总线或者存储装置控制器,外围总线,图形加速端口,处理器或者使用多种总线结构中的任意总线结构的局域总线。举例来说,这些体系结构包括但不限于工业标准体系结构(Industry Standard Architecture,ISA)总线,微通道体系结构(Micro Channel Architecture,MCA)总线,增强型ISA总线、视频电子标准协会(Video Electronics Standards Association,VESA)局域总线以及外围组件互连(Peripheral Component Interconnect,PCI)总线。
电子设备12典型地包括多种计算机系统可读介质。这些介质可以是任何能够被电子设备12访问的可用介质,包括易失性和非易失性介质,可移动的和不可移动的介质。
系统存储装置28可以包括易失性存储装置形式的计算机系统可读介质,例如随机存取存储装置(Random Access Memory,RAM)30和/或高速缓存存储装置32。电子设备12可以包括其它可移动/不可移动的、易失性/非易失性计算机系统存储介质。仅作为举例,存储系统34可以设置为读写不可移动的、非易失性磁介质(图9未显示,通常称为“硬盘驱动器”)。尽管图9中未示出,可以提 供用于对可移动非易失性磁盘(例如“软盘”)读写的磁盘驱动器,以及对可移动非易失性光盘(例如只读光盘(Compact Disc-Read Only Memory,CD-ROM),数字视盘(Digital Video Disc-Read Only Memory,DVD-ROM)或者其它光介质)读写的光盘驱动器。在这些情况下,每个驱动器可以通过至少一个数据介质接口与总线18相连。存储装置28可以包括至少一个程序产品,该程序产品具有一组(例如至少一个)程序模块,这些程序模块被配置以执行本申请各实施例的功能。
具有一组(至少一个)程序模块42的程序/实用工具40,可以存储在例如存储装置28中,这样的程序模块42包括但不限于操作系统、至少一个应用程序、其它程序模块以及程序数据,这些示例中的每一个或某种组合中可能包括网络环境的实现。程序模块42通常执行本申请所描述的实施例中的功能和/或方法。
电子设备12也可以与至少一个外部设备14(例如键盘、指向设备、显示器24等)通信,还可与至少一个使得用户能与该设备12交互的设备通信,和/或与使得该设备12能与至少一个其它计算设备进行通信的任何设备(例如网卡,调制解调器等等)通信。这种通信可以通过输入/输出(Input/Output,I/O)接口22进行。并且,电子设备12还可以通过网络适配器20与至少一个网络(例如局域网(Local Area Network,LAN),广域网(Wide Area Network,WAN)和/或公共网络,例如因特网)通信。如图9所示,网络适配器20通过总线18与电子设备12的其它模块通信。应当明白,尽管图9中未示出,可以结合电子设备12使用其它硬件和/或软件模块,包括但不限于:微代码、设备驱动器、冗余处理单元、外部磁盘驱动阵列、磁盘阵列(Redundant Arrays of Independent Disks,RAID)系统、磁带驱动器以及数据备份存储系统等。
处理单元16通过运行存储在系统存储装置28中的程序,从而执行各种功能应用以及数据处理,例如实现本申请实施例所提供的火点检测方法,包括:
获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像;其中, 聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定;
按照预先设置的移动规则在所述待检测图像中移动火点检测模板,以火点检测模板覆盖的范围作为检测区域;
根据所述检测区域中的像素值确定所述检测区域是否为火点区域。
实施例五
本申请实施例五还提供了一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现如本申请实施例所提供的火点检测方法,包括:
获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像;其中,聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定;
按照预先设置的移动规则在所述待检测图像中移动火点检测模板,以火点检测模板覆盖的范围作为检测区域;
根据所述检测区域中的像素值确定所述检测区域是否为火点区域。
本申请实施例的计算机存储介质,可以采用至少一个计算机可读的介质的任意组合。计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质。计算机可读存储介质例如可以是但不限于电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:具有至少一个导线的电连接、便携式计算机磁盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器((Erasable Programmable Read-Only Memory,EPROM)或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本文件中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。
计算机可读的信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种 形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读的信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。
计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于无线、电线、光缆、射频(Radio Frequency,RF)等等,或者上述的任意合适的组合。
可以以一种或多种程序设计语言或其组合来编写用于执行本申请操作的计算机程序代码,所述程序设计语言包括面向对象的程序设计语言诸如Java、Smalltalk、C++,还包括常规的过程式程序设计语言诸如”C”语言或类似的程序设计语言。程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络包括局域网(LAN)或广域网(WAN)连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。

Claims (12)

  1. 一种火点检测方法,包括:
    获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像;其中,聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定;
    按照预先设置的移动规则在所述待检测图像中移动火点检测模板,以火点检测模板覆盖的范围作为检测区域;
    根据所述检测区域中的像素值确定所述检测区域是否为火点区域。
  2. 根据权利要求1所述的方法,在根据所述检测区域中的像素值确定所述检测区域是否为火点区域之前,还包括:
    获取图像采集装置基于聚焦镜片组在模拟位置处对模拟火点采集的模拟图像,并确定模拟图像中模拟火点所在的模拟火点区域的像素值;其中,聚焦镜片组的模拟位置根据所述模拟火点与图像采集装置之间的物距确定;
    确定所述模拟图像中非模拟火点区域的像素值;
    根据所述模拟火点区域的像素值和非模拟火点区域的像素值确定第一火点检测阈值和第二火点检测阈值;其中,所述模拟火点区域的像素值大于所述第一火点检测阈值,所述第一火点检测阈值大于所述第二火点检测阈值,所述第二火点检测阈值大于所述非模拟火点区域的像素值。
  3. 根据权利要求2所述的方法,其中,根据所述检测区域中的像素值确定所述检测区域是否为火点区域,包括:
    在所述检测区域中的像素值大于或等于所述第一火点检测阈值的情况下,确定所述检测区域为火点区域;
    在所述检测区域中的像素值小于所述第二火点检测阈值的情况下,确定所述检测区域不是火点区域。
  4. 根据权利要求3所述的方法,根据所述检测区域中的像素值确定所述检测区域是否为火点区域,还包括:
    在所述检测区域中的像素值小于第一火点检测阈值且大于或等于所述第二火点检测阈值的情况下,确定所述检测区域为疑似火点区域;
    对所述疑似火点区域进行二次检测,得到疑似火点区域中的火点区域。
  5. 根据权利要求4所述的方法,其中,对所述疑似火点区域进行二次检测,得到疑似火点区域中的火点区域,包括:
    确定所述待检测图像中的最小物距关联的聚焦镜片组的最近位置以及最大物距关联的聚焦镜片组的最远位置,将所述聚焦镜片组从所述最近位置移动至所述最远位置,并采集移动过程中的至少两幅图像,在目标疑似火点区域在所述至少两幅图像中任一幅图像中的像素值大于或等于所述第一火点检测阈值的情况下,确定所述目标疑似火点区域是火点区域。
  6. 根据权利要求4所述的方法,其中,对所述疑似火点区域进行二次检测,得到疑似火点区域中的火点区域,包括:
    确定所述待检测图像中疑似火点区域的数量;
    在所述疑似火点区域的数量为一个的情况下,对所述疑似火点区域进行区域自动聚焦,并确定聚焦后的图像中所述疑似火点区域的聚焦像素值,在所述聚焦像素值大于或等于所述第一火点检测阈值的情况下,确定所述疑似火点区域是火点区域。
  7. 根据权利要求1所述的方法,在获取图像采集装置基于聚焦镜片组在目标位置处采集的待检测图像之前,还包括:
    确定使所述图像采集装置在至少两个标定物距下的物体成像最清晰时,聚焦镜片组的至少两个标定位置;
    根据所述至少两个标定物距和所述聚焦镜片组的至少两个标定位置,确定成像最清晰时物距与聚焦镜片组的位置之间的关系;
    聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定,包括:
    基于所述物距与聚焦镜片组的位置之间的关系,根据所述待检测图像中的最小物距和最大物距确定关联的聚焦镜片组的最近位置和最远位置;
    根据所述聚焦镜片组的最近位置和最远位置确定所述聚焦镜片组的目标位 置。
  8. 根据权利要求2所述的方法,还包括:
    移动聚焦镜片组的位置,确定所述模拟火点区域的像素值在移动过程中的像素值衰减路径上为第二火点检测阈值时,所述聚焦镜片组的第一位置和第二位置;
    根据所述聚焦镜片组的第一位置和第二位置确定第一物距和第二物距;
    根据所述第一物距和第二物距确定在所述图像采集装置采集的图像中对火点检测的实际检测物距范围;
    根据所述实际检测物距范围确定所述图像采集装置采集图像的扫描路径。
  9. 根据权利要求8所述的方法,其中,所述图像采集装置采集图像的扫描路径中包括至少两个扫描拐点,在每个扫描拐点处对所述图像采集装置的镜头进行垂直方向上的转动,所述方法还包括:
    获取所述图像采集装置在每个扫描拐点处进行垂直方向上的转动后的预扫描图像;
    根据所述预扫描图像中的最小物距和最大物距确定成像清晰时聚焦镜片组的最近位置和最远位置,并建立所述每个扫描拐点和所述聚焦镜片组的最近位置和最远位置的关联关系;
    聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定,包括:
    确定所述图像采集装置的当前经历扫描拐点;
    基于所述至少两个扫描拐点中的每个扫描拐点和所述聚焦镜片组的最近位置和最远位置的关联关系,根据所述当前经历扫描拐点确定所述聚焦镜片组的当前最近位置和当前最远位置;
    根据所述当前最近位置和当前最远位置确定所述聚焦镜片组的目标位置。
  10. 一种火点检测装置,包括:
    图像获取模块,设置为获取图像采集装置基于聚焦镜片组在目标位置处采 集的待检测图像;其中,聚焦镜片组的目标位置根据所述待检测图像中的最小物距和最大物距确定;
    模板移动模块,设置为按照预先设置的移动规则在所述待检测图像中移动火点检测模板,以火点检测模板覆盖的范围作为检测区域;
    火点判断模块,设置为根据所述检测区域中的像素值确定所述检测区域是否为火点区域。
  11. 一种电子设备,包括:
    至少一个处理器;
    存储装置,设置为存储至少一个程序,
    当所述至少一个程序被所述至少一个处理器执行,使得所述至少一个处理器实现如权利要求1-9中任一所述的火点检测方法。
  12. 一种计算机可读存储介质,所述计算机可读存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现如权利要求1-9中任一所述的火点检测方法。
PCT/CN2021/136265 2020-12-29 2021-12-08 火点检测方法、装置、电子设备和存储介质 WO2022143052A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21913793.2A EP4273743A1 (en) 2020-12-29 2021-12-08 Method and apparatus for detecting fire spots, electronic device, and storage medium
US18/259,927 US20240060822A1 (en) 2020-12-29 2021-12-08 Method and apparatus for detecting fire spots, electronic device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011587808.6 2020-12-29
CN202011587808.6A CN114758290A (zh) 2020-12-29 2020-12-29 火点检测方法、装置、电子设备和存储介质

Publications (1)

Publication Number Publication Date
WO2022143052A1 true WO2022143052A1 (zh) 2022-07-07

Family

ID=82258664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/136265 WO2022143052A1 (zh) 2020-12-29 2021-12-08 火点检测方法、装置、电子设备和存储介质

Country Status (4)

Country Link
US (1) US20240060822A1 (zh)
EP (1) EP4273743A1 (zh)
CN (1) CN114758290A (zh)
WO (1) WO2022143052A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115209051A (zh) * 2022-07-08 2022-10-18 杭州海康威视数字技术股份有限公司 变焦摄像机的聚焦方法及装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106341596A (zh) * 2016-08-31 2017-01-18 浙江宇视科技有限公司 一种聚焦方法和装置
US20190371147A1 (en) * 2018-05-31 2019-12-05 Boe Technology Group Co., Ltd. Fire alarming method and device
CN111368756A (zh) * 2020-03-09 2020-07-03 上海金掌网络技术有限责任公司 一种基于可见光的明火烟雾快速识别方法和系统
CN111818260A (zh) * 2020-07-06 2020-10-23 浙江大华技术股份有限公司 一种自动聚焦方法和装置及电子设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106341596A (zh) * 2016-08-31 2017-01-18 浙江宇视科技有限公司 一种聚焦方法和装置
US20190371147A1 (en) * 2018-05-31 2019-12-05 Boe Technology Group Co., Ltd. Fire alarming method and device
CN111368756A (zh) * 2020-03-09 2020-07-03 上海金掌网络技术有限责任公司 一种基于可见光的明火烟雾快速识别方法和系统
CN111818260A (zh) * 2020-07-06 2020-10-23 浙江大华技术股份有限公司 一种自动聚焦方法和装置及电子设备

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115209051A (zh) * 2022-07-08 2022-10-18 杭州海康威视数字技术股份有限公司 变焦摄像机的聚焦方法及装置
CN115209051B (zh) * 2022-07-08 2024-02-13 杭州海康威视数字技术股份有限公司 变焦摄像机的聚焦方法及装置

Also Published As

Publication number Publication date
US20240060822A1 (en) 2024-02-22
CN114758290A (zh) 2022-07-15
EP4273743A1 (en) 2023-11-08

Similar Documents

Publication Publication Date Title
US10848685B2 (en) Imaging system, image processing apparatus, imaging system control method, image processing apparatus control method, and program
US10698308B2 (en) Ranging method, automatic focusing method and device
JP2008172523A (ja) 多焦点カメラ装置及びそれに用いられる制御方法並びにプログラム
US11132814B2 (en) Information processing apparatus, information processing method, and storage medium
JP2008516233A5 (zh)
WO2014106303A1 (en) Panoramic lens calibration for panoramic image and/or video capture apparatus
CN113382155B (zh) 自动聚焦方法、装置、设备和存储介质
WO2022143052A1 (zh) 火点检测方法、装置、电子设备和存储介质
US9041798B1 (en) Automated pointing and control of high resolution cameras using video analytics
CN110602376B (zh) 抓拍方法及装置、摄像机
CN113301314B (zh) 对焦方法、投影仪、成像设备和存储介质
CN107071347A (zh) 一种无线定位设备的调整方法以及前端设备
CN108345002A (zh) 结构光测距装置及方法
CN117666116B (zh) 一种周扫成像装置及其凝视补偿方法
KR101664733B1 (ko) 전 방향 고해상도 추적 녹화 장치 및 방법
JP2019120491A (ja) 欠陥検査方法、および、欠陥検査システム
WO2019061650A1 (zh) 三维图像采集设备及方法
JP7048357B2 (ja) 撮影画像事前確認システム及び撮影画像事前確認方法
JP5277600B2 (ja) 架空線撮影システム及び方法
CN105929819B (zh) 一种控制电子设备的方法及电子设备
CN113970424A (zh) 自动跟踪模式下镜头变焦一致性动态纠偏方法和系统
KR20100007444A (ko) 감시카메라 시스템을 통한 감시 방법
JP2017150816A (ja) 波面計測装置及び波面計測方法
JP2007192755A (ja) 測距装置
TW201404120A (zh) 測試系統、成像裝置以及測試方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21913793

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18259927

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021913793

Country of ref document: EP

Effective date: 20230731