CN113536918A - Smoke and fire detection method, system, electronic device and storage medium - Google Patents

Smoke and fire detection method, system, electronic device and storage medium

Info

Publication number
CN113536918A
CN113536918A (application CN202110647346.0A)
Authority
CN
China
Prior art keywords
preset
area
position information
camera
coordinate system
Prior art date
Legal status
Granted
Application number
CN202110647346.0A
Other languages
Chinese (zh)
Other versions
CN113536918B (en)
Inventor
鲁华超
潘武
徐狄权
刘溯
赵炎
Current Assignee
Zhejiang Huagan Technology Co ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202110647346.0A
Publication of CN113536918A
Application granted
Publication of CN113536918B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Fire-Detection Mechanisms (AREA)

Abstract

The application relates to a smoke and fire detection method, system, electronic device and storage medium. An inspection image captured by a camera is acquired, a preset object of interest is identified in the inspection image, and first position information of the area where the preset object of interest is located is obtained; second position information of a preset exclusion area is acquired; according to the first position information and the second position information, it is judged whether the intersection ratio of the area where the preset object of interest is located and the preset exclusion area is greater than a first preset threshold; if it is, a preset shielding area is generated according to the first position information and fire detection of the area where the preset object of interest is located is skipped. This solves the problem in the related art that the false alarm rate of smoke and fire detection is high due to the influence of interference sources at fixed positions, and reduces the false alarm rate of smoke and fire detection.

Description

Smoke and fire detection method, system, electronic device and storage medium
Technical Field
The present application relates to the field of smoke and fire detection, and in particular to a smoke and fire detection method, system, electronic device and storage medium.
Background
Smoke and fire detection is generally implemented with a camera: smoke and high-temperature ignition points are monitored in real time outdoors (in forests and on plains), and an alarm is triggered once a fire is found. During outdoor mobile inspection, however, the camera repeatedly passes over the same scene, which produces repeated alarms.
To address this problem, existing smoke and fire detection schemes mask a detected fire area in the image so that repeated alarms are avoided. Such masked areas are generated automatically after a smoke and fire target or a suspected target is detected, which prevents the camera from reporting the same target many times simply because it keeps appearing in the picture while the camera rotates.
However, such a mask is disposable: once the suspected target rotates out of the video picture, the original mask becomes invalid. Therefore, when the camera rotates back to the position where the suspected target was originally detected and the suspected target has not disappeared, the smoke and fire detection device alarms again. Suspected targets include fixed interference sources such as chimneys. Because the environment in which smoke and fire detection is performed is complex, it cannot be guaranteed that no other smoke and fire interference source appears in the picture captured by the camera; once an interference source at a fixed position appears, a useless alarm is generated every time the camera moves to that position, and alarms for real fires are drowned out among these alarms.
At present, no effective solution has been proposed for the problem in the related art that the false alarm rate of smoke and fire detection is high due to the influence of interference sources at fixed positions.
Disclosure of Invention
The embodiments of the present application provide a smoke and fire detection method, system, electronic device and storage medium, so as to at least solve the problem in the related art that the false alarm rate of smoke and fire detection is high due to the influence of interference sources at fixed positions.
In a first aspect, an embodiment of the present application provides a smoke and fire detection method, including:
acquiring a patrol inspection image shot by a camera, identifying a preset interested object in the patrol inspection image, and acquiring first position information of an area where the preset interested object is located;
acquiring second position information of a preset exclusion area;
judging whether the intersection ratio of the area where the preset interested object is located and the preset exclusion area is larger than a first preset threshold value or not according to the first position information and the second position information;
and under the condition that the intersection ratio of the area where the preset interested object is located and the preset exclusion area is judged to be larger than the first preset threshold value, generating a preset shielding area according to the first position information, and skipping the fire detection of the area where the preset interested object is located.
In some embodiments, the preset exclusion area is pre-stored as third position information within an image coordinate system, and internal and external parameters of a camera corresponding to the image coordinate system; the first position information and the second position information are both position information in a global coordinate system; wherein the obtaining of the second position information of the preset exclusion zone comprises:
acquiring the third position information, the internal parameter and the external parameter which are stored in advance;
determining a first transformation relationship between the image coordinate system and a camera coordinate system according to the internal parameters, and determining a second transformation relationship between the camera coordinate system and the global coordinate system according to the external parameters;
and converting the third position information from the image coordinate system to the global coordinate system according to the first conversion relation and the second conversion relation to obtain the second position information.
In some of these embodiments, the first location information and the second location information are both location information in a global coordinate system; acquiring first position information of a region where the preset interested object is located comprises:
acquiring fourth position information of the area where the preset interested object is located in a current image coordinate system of the inspection image, and acquiring current internal parameters and current external parameters of a camera for shooting the inspection image;
determining a third conversion relation between the current image coordinate system and the current camera coordinate system according to the current internal parameters, and determining a fourth conversion relation between the current camera coordinate system and the global coordinate system according to the current external parameters;
and converting the fourth position information from the current image coordinate system to the global coordinate system according to the third conversion relation and the fourth conversion relation to obtain the first position information.
In some of these embodiments, the method further comprises:
acquiring position information of a preset shielding area;
judging whether the intersection ratio of the area where the preset interested object is located and the preset shielding area is larger than a second preset threshold value or not according to the first position information and the position information of the preset shielding area;
and skipping fire detection on the area where the preset interested object is located under the condition that the intersection ratio of the area where the preset interested object is located and the preset shielding area is larger than the second preset threshold value.
In some of these embodiments, the method further comprises:
and after the primary fire detection task is finished, emptying the preset shielding area and persistently storing the preset exclusion area.
In some of these embodiments, the method further comprises:
selecting one or more preset shielding areas, and storing the one or more preset shielding areas as the preset exclusion area.
In some of these embodiments, the method further comprises:
and under the condition that the intersection ratio of the area where the preset interested object is located and the preset excluding area is not larger than the first preset threshold value, carrying out fire detection on the area where the preset interested object is located, and after the fire detection is finished, generating a preset shielding area according to the first position information.
In a second aspect, embodiments of the present application provide a smoke and fire detection system, comprising: a camera and a control device, wherein the camera is connected to the control device, and the control device is configured to perform the smoke detection method of the first aspect.
In some embodiments, the camera includes a camera body and a support part, the camera body and the support part are connected, and the support part is used for adjusting a horizontal angle and/or a pitch angle of the camera body.
In some of these embodiments, the cameras comprise binocular cameras including visible light cameras and thermal imaging cameras.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform the smoke detection method according to the first aspect.
In a fourth aspect, the present application provides a storage medium, in which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the smoke and fire detection method according to the first aspect.
Compared with the related art, the smoke and fire detection method, the smoke and fire detection system, the electronic device and the storage medium solve the problem that the false alarm rate of smoke and fire detection is high due to the influence of the interference source at the fixed position in the related art, and reduce the false alarm rate of smoke and fire detection.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flowchart of a smoke and fire detection method according to an embodiment of the present application;
FIG. 2 is a first schematic diagram illustrating the principle of relocating a preset exclusion area according to an embodiment of the present application;
FIG. 3 is a second schematic diagram illustrating the principle of relocating a preset exclusion area according to an embodiment of the present application;
FIG. 4 is a flowchart of a smoke and fire detection method according to a preferred embodiment of the present application;
FIG. 5 is a block diagram of a smoke and fire detection system according to an embodiment of the present application;
fig. 6 is a hardware configuration block diagram of a terminal of the smoke and fire detection method according to the embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. References to "a," "an," "the," and similar words throughout this application do not limit the number and may refer to the singular or the plural. The terms "including," "comprising," "having," and any variations thereof in this application are intended to cover non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. References to "connected," "coupled," and the like in this application are not limited to physical or mechanical connections but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "And/or" describes an association relationship of associated objects, meaning that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The terms "first," "second," "third," and the like herein merely distinguish similar objects and do not denote a particular ordering of the objects.
This embodiment provides a smoke and fire detection method. Fig. 1 is a flowchart of the smoke and fire detection method according to an embodiment of the present application; as shown in fig. 1, the flow includes the following steps:
step S101, a patrol inspection image obtained by shooting of a camera is obtained, a preset interested object is identified in the patrol inspection image, and first position information of an area where the preset interested object is located is obtained.
The camera monitors the surrounding environment in a mobile inspection mode; its motion can be realized by setting an inspection track. The inspection image comprises multiple frames of visible-light images and thermal-imaging images acquired by the camera at a preset frame rate.
In this embodiment, a smoke and fire detection algorithm may be used to perform target recognition on the current inspection image to obtain at least one preset object of interest, where the preset object of interest includes any one or more of smoke, fog and fire, and may be a real fire or an interference source. Because the preset objects of interest are usually discrete relative to one another in the objective world, their relative positions in the inspection image are also discrete. To facilitate subsequent processing, this embodiment uses a bounding-box algorithm to enclose each preset object of interest in a closed space and records the area and the first position information of that closed space.
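As an illustration of the bounding-box step, the following is a minimal sketch that assumes the detector outputs a binary mask for each detected object of interest (the function name and the mask representation are assumptions, not taken from the patent):

```python
import numpy as np

def bounding_box(mask: np.ndarray):
    """Enclose a detected object-of-interest mask in an axis-aligned box.

    Returns ((x_min, y_min, x_max, y_max), area), mirroring the closed space
    plus the area/position record described above.
    """
    ys, xs = np.nonzero(mask)                 # pixel coordinates of the detection
    if xs.size == 0:
        return None, 0
    x_min, x_max = int(xs.min()), int(xs.max())
    y_min, y_max = int(ys.min()), int(ys.max())
    area = (x_max - x_min + 1) * (y_max - y_min + 1)
    return (x_min, y_min, x_max, y_max), area
```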
Step S102, second position information of a preset exclusion area is obtained.
The preset exclusion area includes at least one fixed interference source, including but not limited to a chimney, an exhaust duct and other objects that vent smoke or fumes. In this embodiment, a bounding-box algorithm may also be used to enclose the fixed interference source in the inspection image in a closed space, and the area and the second position information of that closed space are recorded. The preset exclusion area may be any polygon or a closed region bounded by a regular curve, which makes it suitable for outlining fixed interference source areas in various complex scenes such as streets.
Step S103, judging whether the intersection ratio of the area where the preset interested object is located and the preset exclusion area is larger than a first preset threshold value or not according to the first position information and the second position information.
The intersection ratio can be determined as the ratio of the area of the intersection of the two regions to the area of the region where the preset object of interest is located; the larger the intersection ratio, the higher the probability that the region where the preset object of interest is located contains the fixed interference source. If the intersection ratio of the two areas is greater than the first preset threshold, the area where the preset object of interest is located is considered to belong to the preset exclusion area.
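Note that the intersection ratio used here is the overlap area divided by the area of the region where the preset object of interest is located, rather than the symmetric intersection-over-union. A minimal sketch for axis-aligned boxes, assuming the (x_min, y_min, x_max, y_max) box format:

```python
def intersection_ratio(obj_box, excl_box):
    """Overlap area divided by the area of the object-of-interest box."""
    ox1, oy1, ox2, oy2 = obj_box
    ex1, ey1, ex2, ey2 = excl_box
    inter_w = max(0, min(ox2, ex2) - max(ox1, ex1))
    inter_h = max(0, min(oy2, ey2) - max(oy1, ey1))
    obj_area = max(0, ox2 - ox1) * max(0, oy2 - oy1)
    if obj_area == 0:
        return 0.0
    return (inter_w * inter_h) / obj_area

# Example: a detection box one quarter covered by the exclusion box.
# intersection_ratio((10, 10, 50, 50), (30, 30, 80, 80)) -> 0.25
```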
And step S104, generating a preset shielding area according to the first position information under the condition that the intersection ratio of the area where the preset interested object is located and the preset exclusion area is judged to be larger than a first preset threshold value, and skipping the fire detection of the area where the preset interested object is located.
After the area where the preset object of interest is located is determined to belong to the preset exclusion area, that area is masked and fire detection of the area corresponding to the first position information is skipped. In other words, the area where the preset object of interest is located is not processed further and no fire alarm is triggered for it, so the influence of the fixed interference source is eliminated and false alarms are avoided.
Compared with the related art, the preset exclusion area does not disappear from the image because of camera movement: as long as the fixed interference source is within the camera's field of view, the preset exclusion area is guaranteed to be in the image. Objects detected inside the preset exclusion area are filtered out, which also saves processing resources.
Through the steps, the problem that the false alarm rate of smoke and fire detection is high due to the influence of the interference source at the fixed position in the related technology is solved, and the false alarm rate of smoke and fire detection is reduced.
During the camera's mobile inspection, the fixed interference source shifts relative to the camera, so the position of the preset exclusion area in the inspection image changes in real time, and the position information of the preset exclusion area used in a previous inspection task is no longer suitable for the next inspection task.
To solve this problem, in some embodiments, the preset exclusion area is pre-stored as third position information within the image coordinate system, and internal and external parameters of the camera corresponding to the image coordinate system; the first position information and the second position information are both position information in a global coordinate system; wherein the obtaining of the second position information of the preset exclusion zone comprises:
acquiring prestored third position information, internal parameters and external parameters;
determining a first transformation relationship between the image coordinate system and the camera coordinate system according to the internal parameters, and determining a second transformation relationship between the camera coordinate system and the global coordinate system according to the external parameters;
and converting the third position information from the image coordinate system to the global coordinate system according to the first conversion relation and the second conversion relation to obtain second position information.
The third position information is known position information; it may be set manually or retrieved from a pre-stored database. When the preset exclusion area is set for the first time, the video picture can be aimed at the fixed interference source, the preset exclusion area is drawn, and its position information is stored.
The first conversion relation represents a perspective projection transformation and the second conversion relation represents a rigid-body transformation. Through the pairwise conversion relations among the image coordinate system, the camera coordinate system and the global coordinate system, position information in the video picture can be converted into position information in the real world, which makes it possible to locate the preset exclusion area in real time.
The first conversion relation and the second conversion relation are reversible, that is, in the image coordinate system, the camera coordinate system and the global coordinate system, as long as the position information of any one coordinate system is determined, the position information of the other two coordinate systems can be obtained. Therefore, when calculating the intersection ratio of the region where the preset object of interest is located and the preset exclusion region, the calculation may be performed in any one of the image coordinate system, the camera coordinate system, and the global coordinate system.
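A minimal sketch of the two conversions, assuming a pinhole camera model with intrinsic matrix K, extrinsic parameters (R, t) taken here as the camera-to-global rotation and translation, and image points back-projected at an assumed depth (the patent does not spell out how depth is handled):

```python
import numpy as np

def image_to_global(u, v, K, R, t, depth=1.0):
    """Image coordinates -> camera coordinates -> global coordinates.

    The first conversion inverts the perspective projection (intrinsics K),
    the second applies the rigid-body transform (extrinsics R, t).
    """
    pixel = np.array([u, v, 1.0])
    cam = depth * (np.linalg.inv(K) @ pixel)   # point in the camera coordinate system
    return R @ cam + t                         # point in the global coordinate system
```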
The internal parameters include, but are not limited to, the focal length, lens magnification and angle of view of the camera, where the angle of view includes a horizontal angle of view and/or a pitch angle of view.
The external parameters include an attitude angle of the camera, which includes a horizontal angle and/or a pitch angle.
The principle of obtaining the first position information is similar to that of obtaining the second position information of the preset exclusion area. In some embodiments, the first position information and the second position information are both position information in the global coordinate system, and acquiring the first position information of the region where the preset object of interest is located includes:
acquiring fourth position information of an area where a preset interested object is located in a current image coordinate system of the inspection image, and acquiring current internal parameters and current external parameters of a camera for shooting the inspection image;
determining a third conversion relation between a current image coordinate system and a current camera coordinate system according to the current internal parameters, and determining a fourth conversion relation between the current camera coordinate system and a global coordinate system according to the current external parameters;
and converting the fourth position information from the current image coordinate system to the global coordinate system according to the third conversion relation and the fourth conversion relation to obtain the first position information.
During the camera's mobile inspection, its internal parameters and/or external parameters may change. If only the internal parameters change, this appears in the inspection image as a scaling transformation; if only the external parameters change, it appears as a translation and/or rotation transformation; if both change, the inspection image undergoes translation and/or rotation together with scaling.
The following embodiments describe how the preset exclusion area is relocated in the image coordinate system.
(1) The camera changes only its external parameters
Fig. 2 is a first schematic diagram of the principle of relocating a preset exclusion area according to an embodiment of the present application. As shown in fig. 2, assume that the preset exclusion area ABCD is the object-side imaging picture of the historical inspection task and F is the lens of the camera. In the historical inspection image, A is taken as the origin of coordinates, AD as the x-axis direction and AB as the y-axis direction, with AB normalized to 1024. When the attitude angle of the camera changes, the object-side imaging picture A′B′C′D′ of the preset exclusion area in the current inspection task is obtained; accordingly, in the current inspection image, A′ is taken as the origin of coordinates. Let Q be a point in ABCD with coordinates (m1, n1) in the historical image coordinate system, and let Q′ be the corresponding point in A′B′C′D′. The coordinates (m1′, n1′) of Q′ in the current image coordinate system are:
    m1′ = m1 - 1024·od_P,    n1′ = n1 - 1024·od_T,

where od_P is the offset degree between the points Q and Q′ along the horizontal field-of-view direction and od_T is the offset degree along the pitch field-of-view direction, both expressed as fractions of the field of view (see (a) and (b) below).
A method of determining the offset between point Q and point Q′ in each preset direction (the horizontal field-of-view direction and the pitch field-of-view direction) between the historical inspection image and the current inspection image is described below.
(a) Referring to fig. 2, in the field of view formed by ABCD, BC corresponds to the horizontal field of view, AB to the pitch field of view, and FE is the preset focal length, where E is the center of ABCD and FE is perpendicular to ABCD. The offset od_P between point Q and point Q′ along the horizontal field-of-view direction and the offset od_T along the pitch field-of-view direction are:

    od_P = h·tan(Δα) / fovVal_P,    od_T = h·tan(Δβ) / fovVal_T,

where Δα is the rotation angle of the camera along the horizontal field-of-view direction, Δβ is the rotation angle of the camera along the pitch field-of-view direction, h is the preset focal length, fovVal_P is the horizontal field-of-view value, and fovVal_T is the pitch field-of-view value. The quantity h·tan(Δα) represents the shift between point Q and point Q′ along the horizontal field-of-view direction from the historical inspection image to the current inspection image, and h·tan(Δβ) represents the corresponding shift along the pitch field-of-view direction.
(b) Referring to fig. 2, the offsets od_P and od_T between point Q and point Q′ can also be expressed using angles alone:

    od_P = tan(Δα) / (2·tan(α/2)),    od_T = tan(Δβ) / (2·tan(β/2)),

where α is the horizontal angle of view and β is the pitch angle of view. This follows because

    fovVal_P = 2h·tan(α/2),    fovVal_T = 2h·tan(β/2),

so substituting these into the expressions in (a) cancels the focal length h.
Therefore, in some preferred embodiments, the offset of the preset exclusion area between the historical inspection image and the current inspection image along each preset direction can be determined from only the rotation angle and the angle of view of the camera in that direction, without acquiring the focal length or the size of the field of view of the camera.
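Expressed as code, the angle-only form of the offsets reads as follows (a minimal sketch; angles are assumed to be in radians and the function name is illustrative):

```python
import math

def normalized_offsets(delta_pan, delta_tilt, hfov, vfov):
    """Offset of the picture content per axis, as a fraction of the field of view."""
    od_p = math.tan(delta_pan) / (2.0 * math.tan(hfov / 2.0))
    od_t = math.tan(delta_tilt) / (2.0 * math.tan(vfov / 2.0))
    return od_p, od_t
```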
(2) The camera changes only its internal parameters
Fig. 3 is a second schematic diagram of the principle of relocating a preset exclusion area according to an embodiment of the present application. As shown in fig. 3, the preset exclusion area ABCD is the imaging picture on the object-side focal plane of the historical inspection task, EFGH is the imaging picture on the image-side focal plane of the camera for the historical inspection task, the corresponding lens magnification is Z, AB = AD = 1024, A is the origin of coordinates, point O is the picture center with coordinates (512, 512), K is the upper vertex of the five-pointed star inside ABCD, and the coordinates of K are (m2, n2). Assume that the attitude angle of the camera is unchanged and only the lens magnification changes, giving the object-side imaging picture A′B′C′D′ of the preset exclusion area in the current inspection task and the imaging picture E′F′G′H′ on the image-side focal plane of the camera for the current inspection task, with corresponding lens magnification Z′; accordingly, A′ is the origin of coordinates and point O′ has coordinates (512, 512). The coordinates (m2′, n2′) of point K′ in the current image coordinate system are:
    m2′ = (Z′/Z)·(m2 - 512) + 512,    n2′ = (Z′/Z)·(n2 - 512) + 512.
In this embodiment, when the lens zooms, the preset exclusion area is repositioned by mapping its vertex coordinates one by one about the picture center point.
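A minimal sketch of this zoom-only repositioning, assuming a 1024 x 1024 normalized picture with center (512, 512) and an image scale proportional to the lens magnification:

```python
def rescale_about_center(x, y, z_old, z_new, center=512.0):
    """Reposition a vertex when only the lens magnification changes."""
    s = z_new / z_old    # zooming in (z_new > z_old) pushes points away from the center
    return center + s * (x - center), center + s * (y - center)
```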
(3) The camera changes both its external and internal parameters
Referring to fig. 3 again, the preset exclusion area ABCD is the imaging picture on the object-side focal plane of the historical inspection task, EFGH is the imaging picture on the image-side focal plane of the camera for the historical inspection task, the corresponding lens magnification is Z, AB = AD = 1024, A is the origin of coordinates, point O has coordinates (512, 512), K is the upper vertex of the five-pointed star inside ABCD, and the coordinates of K are (m2, n2). Assume that both the attitude angle and the lens magnification of the camera change, giving the object-side imaging picture A′B′C′D′ of the preset exclusion area in the current inspection task and the imaging picture E′F′G′H′ on the image-side focal plane of the camera for the current inspection task, with corresponding lens magnification Z′; accordingly, A′ is the origin of coordinates and point O′ has coordinates (512, 512). The coordinates (m2′, n2′) of point K′ in the current image coordinate system are:
    m2′ = (Z′/Z)·(m2 - 1024·od_P - 512) + 512,    n2′ = (Z′/Z)·(n2 - 1024·od_T - 512) + 512,

where Z′/Z is the zoom ratio of the lens magnification as the camera moves from the historical inspection task to the current inspection task, and od_P and od_T are the offsets defined in case (1).
In some preferred embodiments, the manner of relocating the preset exclusion area includes:
Acquire the horizontal angle of view α, the pitch angle of view β and the lens focal length h of the camera, and acquire the horizontal angular position P(θ1), the pitch angular position T(θ2) and the lens magnification Z of the camera for the historical inspection task.
Here, taking the lens as the pole, θ1 represents the angle between the camera and a preset polar axis after the camera rotates about the pole in the horizontal field-of-view direction, and θ2 represents the angle between the camera and the preset polar axis after the camera rotates about the pole in the pitch field-of-view direction.
Let the coordinates of the preset exclusion area in the historical inspection image be R = {[x0, y0], …, [xi, yi]}, where i is the number of vertices of the preset exclusion area. The camera zooms while it rotates, giving the current horizontal position P′(θ1′), the current pitch position T′(θ2′) and the lens magnification Z′. Combining the case in which both the external and internal parameters change, the coordinates R′ = {[x0′, y0′], …, [xi′, yi′]} of the preset exclusion area in the current inspection image are obtained, for each vertex, from:
    xi′ = (Z′/Z)·(xi - 1024·od_P - 512) + 512,    yi′ = (Z′/Z)·(yi - 1024·od_T - 512) + 512,

where od_P = tan(θ1′ - θ1) / (2·tan(α/2)) and od_T = tan(θ2′ - θ2) / (2·tan(β/2)).
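Combining the offset and zoom handling, one possible implementation of the repositioning step is sketched below. The composition order (pan/tilt offset applied at the historical image scale, then scaling about the picture center), the sign convention of the offsets and the function names are assumptions rather than details taken from the patent; angles are in radians.

```python
import math

def reposition_exclusion_zone(vertices, pan_old, tilt_old, z_old,
                              pan_new, tilt_new, z_new,
                              hfov, vfov, size=1024.0):
    """Map preset-exclusion-area vertices from the historical image to the current image."""
    od_p = math.tan(pan_new - pan_old) / (2.0 * math.tan(hfov / 2.0))
    od_t = math.tan(tilt_new - tilt_old) / (2.0 * math.tan(vfov / 2.0))
    s = z_new / z_old                       # zoom ratio Z'/Z
    c = size / 2.0
    repositioned = []
    for x, y in vertices:
        x_shift = x - size * od_p           # pan offset in historical-image pixels
        y_shift = y - size * od_t           # tilt offset in historical-image pixels
        repositioned.append((c + s * (x_shift - c), c + s * (y_shift - c)))
    return repositioned
```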
When it is judged that the intersection ratio of the area where the preset object of interest is located and the preset exclusion area is not greater than the first preset threshold, that area does not belong to the preset exclusion area; it may instead belong to a real fire area or to a preset shielding area. Solutions for both situations are given below.
In some of these embodiments, the method further comprises:
acquiring position information of a preset shielding area;
judging whether the intersection ratio of the area where the preset interested object is located and the preset shielding area is larger than a second preset threshold value or not according to the first position information and the position information of the preset shielding area;
and skipping the fire detection of the area where the preset interested object is located under the condition that the intersection ratio of the area where the preset interested object is located and the preset shielding area is judged to be larger than a second preset threshold value.
Again, the intersection ratio can be determined as the ratio of the area of the intersection of the two regions to the area of the region where the preset object of interest is located; the larger the intersection ratio, the higher the probability that the region where the preset object of interest is located belongs to the preset shielding area. If the intersection ratio of the two areas is greater than the second preset threshold, the area where the preset object of interest is located is considered to belong to the preset shielding area.
In some of these embodiments, the method further comprises: and after the primary fire detection task is finished, emptying the preset shielding area and persistently storing the preset exclusion area.
The shielding area is temporary (non-persistent) and takes effect only in the current inspection task. It contains regions of interest that were excluded because they belong to the preset exclusion area, as well as regions of interest that have already been inspected and should not be inspected again. The preset exclusion areas, in contrast, are persistent: once set, they take effect in every inspection task.
In some of these embodiments, the method further comprises: one or more preset shielding areas are selected, and the one or more preset shielding areas are stored as preset exclusion areas.
Since the shielding areas are temporary rather than persistent, some preset shielding areas may be added to the preset exclusion area before the shielding areas are cleared, which simplifies the work of setting up the preset exclusion area. For example, if an alarm is raised during inspection and manual review finds that the location is actually a newly built chimney, the corresponding shielding area can be saved as a preset exclusion area so that it takes effect in the next inspection task.
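A sketch of how the two kinds of region might be managed in practice (the class name and the JSON persistence format are illustrative assumptions):

```python
import json

class RegionStore:
    """Temporary shielding areas vs. persistent preset exclusion areas."""

    def __init__(self, path="exclusion_zones.json"):
        self.path = path
        self.masks = []                          # cleared after every inspection task
        try:
            with open(path) as f:
                self.exclusions = json.load(f)   # persists across inspection tasks
        except FileNotFoundError:
            self.exclusions = []

    def finish_task(self):
        self.masks.clear()                       # empty the preset shielding areas

    def promote(self, mask_region):
        """Keep a shielding area permanently, e.g. a newly built chimney."""
        self.exclusions.append(mask_region)
        with open(self.path, "w") as f:
            json.dump(self.exclusions, f)
```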
In some of these embodiments, the method further comprises: and under the condition that the intersection ratio of the area where the preset interested object is located and the preset exclusion area is not larger than a first preset threshold value, carrying out fire detection on the area where the preset interested object is located, and after the fire detection is finished, generating a preset shielding area according to the first position information.
After a smoke and fire target or a suspected target is detected, its area is automatically masked, which prevents the camera from reporting the same target many times simply because it keeps being detected in the picture while the camera rotates.
The embodiments of the present application are described and illustrated below by means of preferred embodiments.
In some embodiments, the camera is rotated by the pan-tilt, and the inspection track of the camera is the inspection track of the pan-tilt.
FIG. 4 is a flowchart of a smoke and fire detection method according to a preferred embodiment of the present application. As shown in fig. 4, the flow includes the following steps (a code sketch of the loop is given after step S411):
and S401, setting a cloud platform inspection track. Two paths of video images are obtained by utilizing a visible light and thermal imaging binocular camera.
In step S402, a preset exclusion area R is set. Taking a thermal imaging channel as an example, aligning a video image with a fixed interference source, drawing a preset exclusion area, and storing related information including, but not limited to, a pan-tilt horizontal position P, a tilt position T, a horizontal field angle α, a tilt field angle β, and a lens magnification Z, wherein a coordinate R of the preset exclusion area in the image is { [ x ])0,y0],…,[xi,yi]}. Accordingly, the visible light channel can be treated in a similar manner as the thermal imaging channel.
And S403, starting the cloud deck to inspect, and performing target identification on the current visible light and thermal imaging camera image by using the smoke and fire detection algorithm module. Recording i target information in the picture, wherein the target information includes but is not limited to the area of the region where the preset interested object is located, the horizontal position P ' of the holder, the pitch position T ' and the lens magnification Z '.
In step S404, it is determined whether or not a fire target exists on the screen. If yes, go to step S405; if not, returning to the step S403, and continuing to rotate the holder according to the preset routing inspection track.
In step S405, the preset exclusion area R' is relocated. Updating the coordinates R '{ [ x'0,y′0],…,[x′i,y′i]}。
Step S406, calculating an intersection ratio between the region where the preset interested object is located and the preset exclusion region R', and then comparing the intersection ratio with a preset threshold.
Step S407, determine whether the intersection ratio of the region where the preset interested object is located and the preset exclusion region R' is greater than a preset threshold. If yes, go to step S408; if not, step S409 is executed.
In step S408, the region where the preset object of interest is located is masked, and the process returns to step S403.
Step S409, determining whether the region where the preset interested object is located is in the shielding region. If yes, returning to step S408; if not, go to step S410.
Step S410, determining that the area where the preset interested object is located is a real fire, triggering a fire alarm about the area where the preset interested object is located, and executing step S411.
Step S411: add the area where the preset object of interest is located to the shielding area, so that the same target does not interrupt the pan-tilt motion again, and return to step S403.
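The loop of steps S403 to S411 can be summarized as follows (a sketch; the camera interface, the detect, relocate_exclusions and overlap_ratio callables and the threshold values are illustrative stand-ins for the components described above):

```python
def inspection_loop(camera, detect, relocate_exclusions, overlap_ratio,
                    excl_threshold=0.5, mask_threshold=0.5):
    masks = []                                         # temporary shielding areas
    while camera.patrolling():
        frame, pose = camera.next_frame()              # S403: image plus pan/tilt/zoom state
        targets = detect(frame)                        # suspected smoke/fire targets
        if not targets:
            continue                                   # S404: keep rotating
        exclusions = relocate_exclusions(pose)         # S405: reposition exclusion areas
        for box in targets:
            if any(overlap_ratio(box, e) > excl_threshold for e in exclusions):
                masks.append(box)                      # S407/S408: fixed interference source
            elif any(overlap_ratio(box, m) > mask_threshold for m in masks):
                continue                               # S409: already handled in this task
            else:
                camera.raise_fire_alarm(box)           # S410: treat as a real fire
                masks.append(box)                      # S411: avoid re-triggering
```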
In the related art, detected targets and false-alarm areas are masked with shielding areas, but shielding areas cannot be used to screen out targets such as chimneys, so an alarm is generated every time the pan-tilt rotates to that position; such alarms are useless to the user and interfere with genuine smoke and fire alarms.
In the preferred embodiment, a preset exclusion area is used instead: when the intersection ratio of the area where the preset object of interest is located and the preset exclusion area is greater than the preset threshold, that area is masked and not processed further, which reduces the false alarm rate of smoke and fire detection. The preset exclusion area can be set according to the user's intention while the pan-tilt rotates; because the initial position coordinates of the preset exclusion area become invalid after the pan-tilt rotates, the area is repositioned so that it remains effective. A program running on the binocular camera can directly acquire the camera's external and internal parameters, and from this information and the formulas above it can obtain the relative position of the preset exclusion area at the current pan-tilt angle.
In combination with the smoke and fire detection method of the above embodiments, this embodiment also provides a smoke and fire detection system. Fig. 5 is a block diagram of a smoke and fire detection system according to an embodiment of the present application. As shown in fig. 5, the system includes a camera 51 and a control device 52, where the camera 51 is connected to the control device 52 and the control device 52 is configured to perform the smoke and fire detection method of the above embodiments.
In some of these embodiments, the camera 51 comprises a camera body and a support, the camera body and the support being connected, the support being used to adjust the horizontal and/or pitch angle of the camera body.
The support part comprises a pan-tilt, which allows the camera body to rotate in the horizontal field-of-view direction and the pitch field-of-view direction.
In some of these embodiments, the cameras 51 comprise binocular cameras, including visible light cameras and thermal imaging cameras.
In some preferred embodiments, during the all-around rotation of the pan-tilt, the smoke and fire detection system detects targets and raises alarms using a smoke and fire detection algorithm (hereinafter, the detection algorithm), and a preset exclusion area is set to filter out fixed interference sources so that large numbers of useless alarms are avoided. The preset exclusion area can be set according to the user's needs and, as the pan-tilt rotates, is updated in real time along the pan-tilt inspection track; as long as the fixed interference source is within the camera's field of view, it is guaranteed to remain inside the preset exclusion area at all times, so the normal alarms of the smoke and fire detection system are not affected.
In a specific implementation, the smoke and fire detection system generates the inspection track of the pan-tilt, rotates the pan-tilt, moves the target to be excluded into the video picture, and draws the corresponding preset exclusion area according to the current target position. The camera parameters corresponding to the preset exclusion area include at least: the current attitude angle of the pan-tilt; the horizontal angle of view, pitch angle of view, focal length and lens magnification of the binocular camera; and the position coordinates of the preset exclusion area relative to the video picture. After the position information of the preset exclusion area has been set, the pan-tilt is started for inspection.
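The stored information for a preset exclusion area can be grouped into a record such as the following (a sketch; the field names are illustrative):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ExclusionZoneRecord:
    """Parameters saved when a preset exclusion area is drawn."""
    pan: float                              # pan-tilt horizontal position / attitude angle
    tilt: float                             # pan-tilt pitch position
    hfov: float                             # horizontal angle of view
    vfov: float                             # pitch angle of view
    focal_length: float                     # lens focal length
    zoom: float                             # lens magnification
    vertices: List[Tuple[float, float]]     # polygon coordinates relative to the video picture
```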
During the pan-tilt inspection, the frames collected by the binocular camera are passed to the detection algorithm at a preset frame rate (25 frames per second) to obtain the bounding box of each suspected target relative to the current video frame. After the target bounding-box information is obtained, the proportion of its intersection with the preset exclusion area is calculated; when this proportion exceeds a certain threshold, the target is considered to be inside the preset exclusion area. If the current suspected target is inside the preset exclusion area, it is filtered out directly without any further operation; if the suspected target is outside the preset exclusion area, the pan-tilt is stopped, the suspected target is further confirmed, and the target bounding-box area is masked until the whole detection and alarm process is completed, after which the pan-tilt continues its rotating inspection.
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In some embodiments, the electronic device includes, but is not limited to, a terminal, a computer, or a similar computing device. Taking the operation on the terminal as an example, fig. 6 is a hardware structure block diagram of the terminal of the smoke and fire detection method according to the embodiment of the present application. As shown in fig. 6, the terminal may include one or more (only one shown in fig. 6) processors 602 (the processors 602 may include, but are not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 604 for storing data, and optionally, a transmission device 606 for communication functions and an input-output device 608. It will be understood by those skilled in the art that the structure shown in fig. 6 is only an illustration and is not intended to limit the structure of the terminal. For example, the terminal may also include more or fewer components than shown in fig. 6, or have a different configuration than shown in fig. 6.
The memory 604 may be used for storing computer programs, for example, software programs and modules of application software, such as computer programs corresponding to the smoke detection method in the embodiment of the present application, and the processor 602 executes various functional applications and data processing by running the computer programs stored in the memory 604, so as to implement the method described above. The memory 604 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 604 may further include memory located remotely from the processor 602, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmitting device 606 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the terminal. In one example, the transmission device 606 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmitting device 606 can be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
In addition, in combination with the smoke and fire detection method in the above embodiments, the embodiments of the present application may be implemented by providing a storage medium. The storage medium has a computer program stored thereon; when executed by a processor, the computer program implements any of the smoke and fire detection methods of the above embodiments.
It should be understood by those skilled in the art that various features of the above-described embodiments can be combined in any combination, and for the sake of brevity, all possible combinations of features in the above-described embodiments are not described in detail, but rather, all combinations of features which are not inconsistent with each other should be construed as being within the scope of the present disclosure.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (12)

1. A smoke and fire detection method, comprising:
acquiring a patrol inspection image shot by a camera, identifying a preset interested object in the patrol inspection image, and acquiring first position information of an area where the preset interested object is located;
acquiring second position information of a preset exclusion area;
judging whether the intersection ratio of the area where the preset interested object is located and the preset exclusion area is larger than a first preset threshold value or not according to the first position information and the second position information;
and under the condition that the intersection ratio of the area where the preset interested object is located and the preset exclusion area is judged to be larger than the first preset threshold value, generating a preset shielding area according to the first position information, and skipping the fire detection of the area where the preset interested object is located.
2. The smoke and fire detection method according to claim 1, wherein the preset exclusion area is pre-stored as third position information within an image coordinate system, and internal and external parameters of a camera corresponding to the image coordinate system; the first position information and the second position information are both position information in a global coordinate system; wherein the obtaining of the second position information of the preset exclusion zone comprises:
acquiring the third position information, the internal parameter and the external parameter which are stored in advance;
determining a first transformation relationship between the image coordinate system and a camera coordinate system according to the internal parameters, and determining a second transformation relationship between the camera coordinate system and the global coordinate system according to the external parameters;
and converting the third position information from the image coordinate system to the global coordinate system according to the first conversion relation and the second conversion relation to obtain the second position information.
3. The smoke and fire detection method of claim 1, wherein the first and second position information are both position information in a global coordinate system; acquiring first position information of a region where the preset interested object is located comprises:
acquiring fourth position information of the area where the preset interested object is located in a current image coordinate system of the inspection image, and acquiring current internal parameters and current external parameters of a camera for shooting the inspection image;
determining a third conversion relation between the current image coordinate system and the current camera coordinate system according to the current internal parameters, and determining a fourth conversion relation between the current camera coordinate system and the global coordinate system according to the current external parameters;
and converting the fourth position information from the current image coordinate system to the global coordinate system according to the third conversion relation and the fourth conversion relation to obtain the first position information.
4. The smoke and fire detection method of claim 1, further comprising:
acquiring position information of a preset shielding area;
judging whether the intersection ratio of the area where the preset interested object is located and the preset shielding area is larger than a second preset threshold value or not according to the first position information and the position information of the preset shielding area;
and skipping fire detection on the area where the preset interested object is located under the condition that the intersection ratio of the area where the preset interested object is located and the preset shielding area is larger than the second preset threshold value.
5. The smoke and fire detection method of claim 1, further comprising:
and after the primary fire detection task is finished, emptying the preset shielding area and persistently storing the preset exclusion area.
6. The smoke and fire detection method of claim 5, wherein the method further comprises:
selecting one or more preset shielding areas, and storing the one or more preset shielding areas as the preset exclusion area.
7. The smoke and fire detection method of any of claims 1 to 6, further comprising:
and under the condition that the intersection ratio of the area where the preset interested object is located and the preset excluding area is not larger than the first preset threshold value, carrying out fire detection on the area where the preset interested object is located, and after the fire detection is finished, generating a preset shielding area according to the first position information.
8. A smoke and fire detection system, comprising: a camera and a control device, wherein the camera and the control device are connected, the control device being configured to perform the smoke detection method of any one of claims 1 to 7.
9. The smoke and fire detection system of claim 8, wherein the camera comprises a camera body and a support, the camera body and the support being connected, the support being configured to adjust a horizontal and/or pitch angle of the camera body.
10. The smoke and fire detection system of claim 8, wherein the camera comprises a binocular camera comprising a visible light camera and a thermal imaging camera.
11. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the smoke detection method according to any of claims 1 to 7.
12. A storage medium, characterized in that a computer program is stored in the storage medium, and the computer program, when executed by a processor, implements the steps of the smoke and fire detection method according to any one of claims 1 to 7.
CN202110647346.0A 2021-06-10 2021-06-10 Firework detection method, system, electronic device and storage medium Active CN113536918B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110647346.0A CN113536918B (en) 2021-06-10 2021-06-10 Firework detection method, system, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN113536918A 2021-10-22
CN113536918B 2024-04-16

Family

ID=78124810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110647346.0A Active CN113536918B (en) 2021-06-10 2021-06-10 Firework detection method, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113536918B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100922784B1 (en) * 2009-02-23 2009-10-21 주식회사 이미지넥스트 Image base fire sensing method and system of crime prevention and disaster prevention applying method thereof
CN101587622A (en) * 2009-06-18 2009-11-25 任芳 Forest rocket detection and recognition methods and equipment based on video image intelligent analysis
US20140028803A1 (en) * 2012-07-26 2014-01-30 Robert Bosch Gmbh Fire monitoring system
CN105405244A (en) * 2015-12-22 2016-03-16 山东神戎电子股份有限公司 Interference source shielding method used for forest water prevention
US20190149779A1 (en) * 2016-05-04 2019-05-16 Robert Bosch Gmbh Detection device, method for detection of an event, and computer program
CN110490043A (en) * 2019-06-10 2019-11-22 东南大学 A kind of forest rocket detection method based on region division and feature extraction
CN111339997A (en) * 2020-03-20 2020-06-26 浙江大华技术股份有限公司 Method and apparatus for determining ignition region, storage medium, and electronic apparatus
WO2020214084A1 (en) * 2019-04-17 2020-10-22 Hendricks Corp Pte Ltd Method and system for detecting fire and smoke
CN111858813A (en) * 2020-07-21 2020-10-30 云南电网有限责任公司带电作业分公司 Non-fire area eliminating method based on satellite technology
CN112071016A (en) * 2020-09-14 2020-12-11 广州市几米物联科技有限公司 Fire monitoring method, device, equipment and storage medium
CN112580430A (en) * 2020-11-19 2021-03-30 重庆市科源能源技术发展有限公司 Power plant smoke and fire monitoring method, device and system based on RGB vision and storage medium
CN112614302A (en) * 2020-12-03 2021-04-06 杭州海康微影传感科技有限公司 Fire detection method, device and system and electronic equipment
CN112614165A (en) * 2020-12-04 2021-04-06 浙江大华技术股份有限公司 Smoke and fire monitoring method and device, camera, electronic device and storage medium

Also Published As

Publication number Publication date
CN113536918B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
US20220078349A1 (en) Gimbal control method and apparatus, control terminal and aircraft system
CN105391910B (en) Multiple-camera laser scanner
CN206260046U (en) A kind of thermal source based on thermal infrared imager and swarm into tracks of device
CN113473010B (en) Snapshot method and device, storage medium and electronic device
CN110910459A (en) Camera device calibration method and device and calibration equipment
CN108364369B (en) Unmanned aerial vehicle inspection point determining method, unmanned aerial vehicle inspection point determining device, unmanned aerial vehicle inspection point determining medium, unmanned aerial vehicle inspection point determining equipment and unmanned aerial vehicle inspection point determining system
CN107438152A (en) A kind of motion cameras is to panorama target fast positioning method for catching and system
CN110839127A (en) Inspection robot snapshot method, device and system and inspection robot
CN111445537A (en) Calibration method and system of camera
CN109327656A (en) The image-pickup method and system of full-view image
CN111294563B (en) Video monitoring method and device, storage medium and electronic device
CN110933297B (en) Photographing control method and device of intelligent photographing system, storage medium and system
CN113079369A (en) Method and device for determining image pickup equipment, storage medium and electronic device
CN110351475A (en) Camera system, information processing equipment and its control method and storage medium
CN113536918A (en) Smoke and fire detection method, system, electronic device and storage medium
JP2013021399A (en) Photographing request device, control method for photographing request device, and program
JP2023523364A (en) Visual positioning method, device, equipment and readable storage medium
CN117115935A (en) Substation unmanned plane routing inspection route adjusting method and device and computer equipment
CN113518174A (en) Shooting method, device and system
CN113364980B (en) Device control method, device, storage medium, and electronic apparatus
CN113674356A (en) Camera screening method and related device
CN114333199B (en) Alarm method, equipment, system and chip
CN117255247B (en) Method and device for linkage of panoramic camera and detail dome camera
CN113840073A (en) Control method, device, equipment and medium for shooting equipment
CN113920144B (en) Real-scene photo ground vision field analysis method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230824

Address after: Building A, No. 858 Jianshe Second Road, Xiaoshan District, Hangzhou City, Zhejiang Province, 311200

Applicant after: Zhejiang Huagan Technology Co.,Ltd.

Address before: No. 1187 Bin'an Road, Binjiang District, Hangzhou, Zhejiang Province

Applicant before: ZHEJIANG DAHUA TECHNOLOGY Co.,Ltd.

GR01 Patent grant