KR20170006079A - Fire surveillance apparatus - Google Patents
Fire surveillance apparatus
- Publication number
- KR20170006079A (application number KR1020150096482A)
- Authority
- KR
- South Korea
- Prior art keywords
- image
- lens
- light
- fire
- processor
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B17/00—Fire alarms; Alarms responsive to explosion
- G08B17/12—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
- G08B17/125—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/08—Systems for measuring distance only
- G01S13/10—Systems for measuring distance only using transmission of interrupted, pulse modulated waves
- G01S13/18—Systems for measuring distance only using transmission of interrupted, pulse modulated waves wherein range gates are used
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Alarm Systems (AREA)
- Fire-Detection Mechanisms (AREA)
Abstract
A fire monitoring apparatus is disclosed. The fire monitoring apparatus of the present invention includes an omnidirectional camera with a lens module containing an aspheric lens that focuses light incident from the full 360° foreground and a CMOS sensor that converts the focused light into an electric signal; a PoE socket that receives power and provides network communication; and a processor. The processor performs a polar transformation that converts the annular original image captured by the omnidirectional camera into a rectangular plane image, applying sampling processing that reduces the length of the outer arc and interpolation processing that extends the length of the inner arc so that each sector image, obtained by dividing the original image at regular intervals along the circumference, matches the intermediate arc length. From the corrected plane image, the processor detects regions of motion whose center brightness exceeds that of the surroundings by at least a predetermined depth value, compares preceding and following frames of the corrected plane image to extract a flame region, and detects the occurrence of a fire based on the dynamic texture of the flame region learned according to a machine learning algorithm.
Description
BACKGROUND OF THE INVENTION
Fire accidents can cause extensive damage to people and property. To prevent this, monitoring devices that detect the occurrence of a fire are installed inside buildings or in specific areas.
Conventionally, devices that detect the occurrence of a fire using CCTV cameras suffer from blind spots due to the narrow viewing angle of the cameras. Installing multiple cameras to avoid this incurs a cost burden. In addition, the images captured by the plurality of cameras must be collected at an external server connected over a communication network, where image processing is performed to determine whether a fire has occurred.
SUMMARY OF THE INVENTION
It is an object of the present invention to provide a fire monitoring apparatus that detects the occurrence of a fire from images captured by a surveillance camera over a full 360-degree omnidirectional field of view.
The fire monitoring apparatus according to an embodiment of the present invention includes an omnidirectional camera having a lens module with an aspherical lens that focuses light incident from the full 360-degree foreground and a CMOS sensor that converts the focused light into an electric signal; a PoE socket that receives electric power and provides network communication; and a processor. The processor performs polar transformation processing that converts the annular original image photographed by the omnidirectional camera into a rectangular plane image, applying down-sampling that reduces the length of the outer arc and interpolation processing that extends the length of the inner arc so that each sector image, obtained by dividing the original image at equal intervals along the circumferential direction, matches the intermediate arc length. From the corrected plane image, the processor identifies regions containing motion whose center brightness exceeds that of the surrounding region by at least a predetermined depth value, compares the preceding and following frames of the corrected plane image to extract a flame region, and detects the occurrence of a fire based on the dynamic texture of the extracted flame region according to a machine learning algorithm. The processor performs the polar coordinate transformation processing using the following equations:

xi = cx + ri · cos(θi)
yi = cy + ri · sin(θi)
Here, θi and ri are the angle (deg) and the radius when a pixel position of the original image is expressed in a polar coordinate system; xi and yi are the x-axis and y-axis coordinate values when the position of that pixel is expressed in a rectangular coordinate system; R is the radius of the outer concentric circle measured from the center of the annulus; and cx and cy are the x and y coordinate values of the center of the annulus.
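The polar relation between a pixel's (θi, ri) representation and its rectangular (xi, yi) position can be checked numerically. A minimal sketch (function and variable names are illustrative, not from the patent):

```python
import math

def polar_to_cartesian(theta_deg, r, cx, cy):
    """Map a polar sample (angle in degrees, radius in pixels) back to
    rectangular pixel coordinates in the annular original image."""
    theta = math.radians(theta_deg)
    x = cx + r * math.cos(theta)
    y = cy + r * math.sin(theta)
    return x, y

# Example: a sample at 90 degrees on a circle of radius 100,
# annulus centered at (320, 240)
x, y = polar_to_cartesian(90.0, 100.0, 320.0, 240.0)
```

Iterating this mapping over every (θ, r) sample of the plane image reconstructs the rectangular view from the annular capture.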
The aspherical lens has a convex first incident surface 11 formed on one surface thereof and a first emitting surface 13 formed on the other surface. A first reflecting surface 15 And a second reflection surface 23 is formed on the other surface of the
Meanwhile, the processor may extract the flame region from a background-separated image based on a Gaussian Mixture Model (GMM).
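GMM background separation is commonly implemented with a per-pixel mixture of Gaussians (e.g., OpenCV's MOG2 subtractor). As a dependency-free illustration, the sketch below uses a single running Gaussian per pixel, a deliberate simplification of the full mixture; all names and thresholds are illustrative:

```python
import numpy as np

def update_background(frame, mean, var, alpha=0.05, k=2.5):
    """Single-Gaussian running background model (a simplified stand-in for a
    full Gaussian Mixture Model). Returns a foreground mask plus the updated
    per-pixel mean and variance."""
    frame = frame.astype(np.float64)
    dist = np.abs(frame - mean)
    foreground = dist > k * np.sqrt(var)       # pixels far from the model
    mean = (1 - alpha) * mean + alpha * frame  # slow background adaptation
    var = (1 - alpha) * var + alpha * (frame - mean) ** 2
    return foreground, mean, np.maximum(var, 1.0)

# Static background with one bright moving blob
mean = np.full((4, 4), 10.0)
var = np.full((4, 4), 4.0)
frame = np.full((4, 4), 10.0)
frame[1, 1] = 200.0
fg, mean, var = update_background(frame, mean, var)
```

Pixels flagged in `fg` form the moving regions from which the flame region is then extracted.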
In this case, the processor may determine whether a fire has occurred by evaluating, using Volume Local Binary Patterns (VLBP), the proximity between the dynamic-texture characteristics of the extracted flame region and the flame-motion characteristics estimated by machine learning.
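VLBP extends the ordinary Local Binary Pattern into the time axis by also comparing against pixels of the adjacent frames. The sketch below implements a simplified variant (only the co-located pixels of the previous and next frames are sampled, whereas the published operator also samples their spatial neighbours); names and bit ordering are illustrative:

```python
import numpy as np

def vlbp_code(prev, curr, nxt, y, x):
    """Simplified Volume LBP: threshold the 8 spatial neighbours in the
    current frame plus the co-located pixels of the previous and next
    frames against the centre pixel, packing the results into one code."""
    c = curr[y, x]
    samples = [prev[y, x], nxt[y, x]]
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy or dx:
                samples.append(curr[y + dy, x + dx])
    code = 0
    for bit, s in enumerate(samples):
        code |= (int(s >= c) << bit)
    return code

def vlbp_histogram(frames):
    """Histogram of VLBP codes over a frame volume - the dynamic-texture
    descriptor that can be compared against learned flame motion."""
    hist = np.zeros(1024, dtype=np.int64)  # 10 comparison bits -> 2**10 codes
    for t in range(1, len(frames) - 1):
        f0, f1, f2 = frames[t - 1], frames[t], frames[t + 1]
        for y in range(1, f1.shape[0] - 1):
            for x in range(1, f1.shape[1] - 1):
                hist[vlbp_code(f0, f1, f2, y, x)] += 1
    return hist

# On a constant volume every comparison is "equal or greater", so every code is 1023
frames = [np.full((3, 3), 5, dtype=np.uint8) for _ in range(3)]
hist = vlbp_histogram(frames)
```

The resulting histogram can be compared (e.g., by histogram distance) with descriptors learned from real flame sequences.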
Meanwhile, the processor may transmit at least one of the original image and the plane image to an external storage device connected through the socket.
The fire monitoring apparatus may further include: a first laser generator that emits a laser beam; a first frequency modulation section that modulates the frequency of the laser beam emitted by the first laser generator; a first planar light optical lens that converts the frequency-modulated laser beam into planar light; a first planar light reception section that receives the reflected light returned by the object to be measured; a second laser generator, a second frequency modulation section that modulates the frequency of the laser beam emitted by the second laser generator, a second planar light optical lens that converts that frequency-modulated laser beam into planar light, and a second planar light reception section that receives the reflected light returned by the measurement object; a tilting module, positioned between the first and second laser modules, that tilts the first laser module and the second laser module; a simultaneous emission control section that causes the laser beams of the first and second laser generators to be emitted at the same time; a first distance calculation unit that calculates the distance between the measurement object and the first planar light reception section by measuring the reception frequency at the first planar light reception section, and a second distance calculation unit that likewise measures the reception frequency at the second planar light reception section; and a coordinate analyzer that takes the distances calculated by the first and second distance calculation units and analyzes the X and Y coordinates of the measurement object. The processor can detect the occurrence of a fire based on the distance, coordinate, and color data of the measurement target processed by the control module.
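The claim derives distance from the reception frequency measured at each planar-light receiver. The patent does not spell out the formula, but in frequency-modulated (FMCW-style) ranging the beat frequency between the emitted chirp and its echo is proportional to the round-trip delay, which gives the standard relation sketched below (all parameter values are illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_beat(beat_hz, bandwidth_hz, sweep_s):
    """Standard FMCW range equation: the beat frequency between the emitted
    frequency-modulated beam and its echo equals chirp_slope * round_trip_time,
    so d = c * f_beat * T_sweep / (2 * B)."""
    chirp_slope = bandwidth_hz / sweep_s
    round_trip = beat_hz / chirp_slope
    return C * round_trip / 2.0

# A 1 GHz sweep over 1 ms; a 100 kHz beat corresponds to roughly 15 m
d = distance_from_beat(100e3, 1e9, 1e-3)
```

With two receivers at known positions, the two distances so obtained constrain the X and Y coordinates of the measurement object, which is the coordinate analyzer's role.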
The fire monitoring apparatus may also include: a planar light converter that converts a laser beam into planar light; a light emitting unit in which the planar light emitted from the planar light converter passes through a first optical lens that diffuses it before irradiating the object to be sensed; a light receiving unit with second and third optical lenses that diffuse the reflected light returned from the sensing object as it is irradiated with the diffused planar light; a charge coupled device that detects changes in the pixels of the object; and an arithmetic processor that numerically arranges the object-sensing image data into a plurality of relative coordinate values of the object and numerically analyzes them to calculate the relative position, movement path, and movement speed of the object to be detected. The light emitting unit and the light receiving unit lie on the same vertical plane, and the irradiation angle of the light emitting unit and the receiving angle of the light receiving unit are each adjustable within a range of 0 to 90 degrees. The processor can further detect the occurrence of a fire based on at least one of the calculated relative position, movement path, movement speed, and detected instantaneous rate of change.
The fire monitoring apparatus according to the various embodiments described above can reduce the cost and space required to install an entire monitoring system and realize highly reliable fire alarms.
FIG. 1 is a block diagram showing the configuration of a fire monitoring system according to an embodiment of the present invention;
FIG. 2 is a block diagram showing the configuration of a fire monitoring apparatus according to an embodiment of the present invention;
FIG. 3 is a block diagram showing a more detailed configuration of the fire monitoring apparatus of FIG. 2;
FIG. 4 is a block diagram showing a configuration according to the first embodiment of the additional information apparatus of FIG. 3;
FIG. 5 is a block diagram showing a configuration according to the second embodiment of the additional information apparatus of FIG. 3;
FIG. 6 is a side cross-sectional view of an omnidirectional lens according to an embodiment of the present invention;
FIG. 7 is a side cross-sectional view of a lens module according to an embodiment of the present invention;
FIG. 8 is a view showing the paths of light incident on the lens module of FIG. 7 in all directions;
FIG. 9 is a flowchart illustrating an image processing method for fire detection according to an embodiment of the present invention;
FIG. 10 is a view showing an original image taken by the fire monitoring apparatus of FIG. 2 and a sector image obtained by dividing the original image;
FIG. 11 is a diagram for explaining the conversion processing for flattening the sector image of FIG. 10;
FIG. 12 is a diagram for explaining a polar coordinate transformation method for transforming the original image of FIG. 10 into a plane image; and
FIG. 13 is a flowchart for determining whether a fire has occurred from an image according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear.
1 is a block diagram showing a configuration of a fire monitoring system according to an embodiment of the present invention.
1, a
The
A plurality of fire monitoring apparatuses 100-1, 100-2, ..., 100-N may be installed in one area. For example, a plurality of fire monitoring apparatuses 100-1, 100-2, ..., 100-N may be installed at various places in a building or in a place where a fire hazard exists in one factory complex.
A plurality of fire monitoring apparatuses 100-1, 100-2, ..., 100-N are connected to the
The
The
The
The
The fire department (50) receives the fire occurrence report. Specifically, the
The disaster prevention server (60) monitors a fire occurrence situation in a large area. Specifically, the
2 is a block diagram showing a configuration of a fire monitoring apparatus according to an embodiment of the present invention.
2, the
The
The
The
The
The
The
The above-described AFE and TG can be designed to be replaced with different configurations. In particular, when the
The communication /
The
The
The
The
As described above, the
3 is a block diagram showing a more detailed configuration of the fire monitoring apparatus of FIG.
3, the
The
The converted digital image signal (raw data) is transmitted to a digital signal processor (DSP) 132, where it is processed for omnidirectional monitoring and for determining whether or not a fire has occurred. In detail, the
A microcomputer (MICOM) 133 processes signals received from the
The
The input /
The
The
The
The
The communication /
As described above, the
4 is a block diagram showing a configuration according to the first embodiment of the additional information apparatus of FIG.
4, the
The
The first
The
The first plane
The first plane
The
The second
The second
The second plane
The second plane
In the
The first
The first plane
The
The
The control module 450 includes a simultaneous
The simultaneous
The first
The second
The
The coordinate
The control module 450 analyzes the coordinates of the measurement object analyzed by the coordinate
In addition, the control module 450 can change the color of the measurement target read from the
The
5 is a block diagram showing a configuration according to a second embodiment of the additional information apparatus of FIG.
Referring to FIG. 5, the plane light emitted through the
The reflected light reflected from the object to be detected 530 is received through the
The
The
The irradiation angle of the
For example, when the irradiation angle of the
Note that the
The first
In practice, F-theta lenses are widely used among those skilled in the art in mounting the first
The F-theta lens is a lens designed to receive a laser beam from the
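The defining property of an f-theta design (standard for such lenses, though not stated explicitly in this document) is that the image height grows linearly with the scan angle, h = f·θ, instead of the f·tan θ of a conventional rectilinear lens; this keeps the spot speed uniform across the scan line. A quick numerical comparison with illustrative values:

```python
import math

def image_height_ftheta(f_mm, theta_rad):
    """f-theta design: image height grows linearly with scan angle."""
    return f_mm * theta_rad

def image_height_standard(f_mm, theta_rad):
    """Conventional (rectilinear) lens: height grows with tan(theta)."""
    return f_mm * math.tan(theta_rad)

# At 30 degrees with a 100 mm focal length the two designs differ noticeably
t = math.radians(30.0)
h_ftheta = image_height_ftheta(100.0, t)
h_standard = image_height_standard(100.0, t)
```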
As a second
The MLA (Micro Lens Array) lens is a cylindrical lens designed to uniformly converge the light reflected from the object to be detected 530 when the object is irradiated with the plane light diffused by the first
The focusing lens condenses the reflected light received from the MLA lens and transmits it to the charge coupled device 230.
A charge coupled device (CCD) 230 according to an embodiment of the present invention generates object sensing image data by electrically measuring a plurality of photons present in diffused reflected light.
The charge coupled
The charge coupled
The charge-coupled
The
Various numerical analysis methods can be applied to find the position of the object to be detected from the coordinate values. Accordingly, the numerical analysis applied to the present invention may use various methods such as interpolation, LU decomposition, the CDG (Characteristic-Dissipative-Galerkin) method, and the MDS (Multi-Dimensional Scaling) technique.
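As one concrete instance of the numerical analysis listed above, the measured coordinate values can be related to the object position through a small linear system solved by LU decomposition (NumPy's solver uses an LU factorization internally). The formulation below, a linearized trilateration, is a hypothetical stand-in, since the patent does not fix a specific set of equations:

```python
import numpy as np

def locate_object(anchors, distances):
    """Estimate a 2-D position from distances to known anchor points by
    subtracting the first range equation from the rest, which yields a
    linear system solvable by LU decomposition (np.linalg.solve)."""
    a0, d0 = anchors[0], distances[0]
    A, b = [], []
    for ai, di in zip(anchors[1:], distances[1:]):
        A.append([2 * (ai[0] - a0[0]), 2 * (ai[1] - a0[1])])
        b.append(d0**2 - di**2 + ai[0]**2 - a0[0]**2 + ai[1]**2 - a0[1]**2)
    return np.linalg.solve(np.array(A), np.array(b))

# Object at (3, 4) observed from three anchor positions
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
dists = [np.hypot(*(true_pos - np.array(a))) for a in anchors]
pos = locate_object(anchors, dists)
```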
The
The input /
The
Movement path, moving speed, instantaneous rate of change, etc., including the object detection data output by the
6 is a side cross-sectional view of the omnidirectional lens according to an embodiment of the present invention.
Referring to FIG. 6A, the
6 (b) shows a form in which the
The
The
Here, it is preferable that the
In the case of the omnidirectional lens module mounted on the surveillance camera, the radius of curvature of the
The radius of curvature of the
When the radius of curvature of the
The radius of curvature of the second reflecting
7 is a side cross-sectional view of a lens module according to an embodiment of the present invention.
Referring to FIG. 7, the
The relay lens unit includes a
The relay lens unit may further include an
FIG. 8 is a view showing paths of light incident on the lens module of FIG. 7 in all directions.
8, light rays incident through the
Thus, the
9 is a flowchart illustrating an image processing method for fire detection according to an embodiment of the present invention.
Referring to FIG. 9, the processor receives the original image through the omnidirectional camera (S910). As shown in FIG. 8, where the image is formed on the photoelectric sensor through the omnidirectional lens module, the image captured by the omnidirectional camera is an annular image bounded by two concentric circles on the two-dimensional photoelectric sensor substrate.
Next, the processor converts the annular original image into a rectangular plane image (S920). Specifically, the image, which is distorted by the reflection and refraction of light incident from the 360-degree foreground, is transformed into a rectangular plane image so that it can be recognized easily.
Then, coordinates of the photographed image can be analyzed (S930). Specifically, in the case of the
Then, the distance can be measured (S940). Specifically, the
Steps S930 and S940 above are optional and can be omitted. They may also be performed in parallel with the following steps, with their results feeding into the fire determination of step S980.
If the
Next, the processor performs pixel correction, including up-sampling and down-sampling: it fills in pixels that are missing when the image is developed onto the rectangular plane during planarization, and averages the information of multiple source pixels that must be contained in a single output pixel (S950).
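The up- and down-sampling used in this planarization step correspond, in the claims, to shortening the outer arc and stretching the inner arc of each sector to the intermediate arc length; both amount to 1-D resampling of each arc of pixels. A sketch under that reading (array names are illustrative):

```python
import numpy as np

def resample_arc(row, target_len):
    """Resample one arc of pixels to target_len samples: shortening an
    outer arc is down-sampling, stretching an inner arc is linear
    interpolation - both fall out of the same 1-D remap."""
    src = np.linspace(0.0, 1.0, num=len(row))
    dst = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(dst, src, row.astype(np.float64))

outer = np.arange(8, dtype=np.float64)  # longer outer arc -> shrink
inner = np.array([0.0, 4.0, 8.0])       # shorter inner arc -> stretch
mid_len = 5                             # intermediate arc length
outer_r = resample_arc(outer, mid_len)
inner_r = resample_arc(inner, mid_len)
```

Applying this to every arc of a sector yields a rectangular patch of uniform width, ready to be joined with its neighbours into the plane image.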
In addition, the processor may examine the pixels to resolve bad pixels and then correct distorted color values (S960).
In addition, the processor may perform a sharpening process that enhances edges in high-frequency regions, where the differences between pixel values are large, in order to improve the discriminative power of the image (S970).
The processor detects the fire using the corrected image (S980). Specifically, the processor can determine whether a fire has occurred in a region photographed from the corrected image, based on one or a plurality of algorithm operation processing results. Here, the processor can perform more accurate judgment based on the additional information received from the
FIG. 10 is a view showing an original image taken by the fire monitoring apparatus of FIG. 2 and a sector image obtained by dividing the original image.
Referring to FIG. 10 (a), an example of an
11 is a diagram for explaining the conversion processing for flattening the sector image of Fig.
Referring to FIG. 11 (a), an
The
Accordingly, when the
3 is a block diagram showing a more detailed configuration of the fire monitoring apparatus of FIG.
3, the
The
The converted digital image signal (raw data) is transmitted to a digital signal processing unit (DSP) 132 to be processed so as to monitor omnidirectional observation and determine whether or not a fire has occurred. In detail, the
A microcomputer (MICOM) 133 processes signals received from the
The
The input /
The
The
The
The
The communication /
As described above, the
4 is a block diagram showing a configuration according to the first embodiment of the additional information apparatus of FIG.
4, the
The
The first
The
The first plane
The first plane
The
The second
The second
The second plane
The second plane
In the
The first
The first plane
The
The
The control module 450 includes a simultaneous
The simultaneous
The first
The second
The
The coordinate
The control module 450 analyzes the coordinates of the measurement object analyzed by the coordinate
In addition, the control module 450 can change the color of the measurement target read from the
The
5 is a block diagram showing a configuration according to a second embodiment of the additional information apparatus of FIG.
Referring to FIG. 5, the plane light emitted through the
The reflected light reflected from the object to be detected 530 is received through the
The
The
The irradiation angle of the
For example, when the irradiation angle of the
Note that the
The first
In practice, F-theta lenses are widely used among those skilled in the art in mounting the first
The F-theta lens is a lens designed to receive a laser beam (or a laser beam) from the
As a second
The MLA (Micro Lens Array) lens is a cylindrical lens designed to converge the reflected light reflected from the object to be detected 530 uniformly as the object to be sensed 530 is irradiated with the plane light diffused by the first
The focusing lens transmits the reflected light received from the MLA lens to the reflected light, and transmits the reflected light to the charge coupled
A charge coupled device (CCD) 230 according to an embodiment of the present invention generates object sensing image data by electrically measuring a plurality of photons present in diffused reflected light.
The charge coupled
The charge coupled
The charge-coupled
The
The numerical analysis method for finding the position of the object to be detected by using the coordinate values can be applied to various methods. Accordingly, the numerical analysis method applied to the present invention can be applied to various methods such as interpolation method, LU decomposition method, CDG (Characteristic-Dissipative- Galerkin) (Multi Dimensional Scaling) technique.
The
The input /
The
Movement path, moving speed, instantaneous rate of change, etc., including the object detection data output by the
6 is a side cross-sectional view of the omnidirectional lens according to an embodiment of the present invention.
Referring to FIG. 6A, the
6 (b) shows a form in which the
The
The
Here, it is preferable that the
In the case of the omnidirectional lens module mounted on the surveillance camera, the radius of curvature of the
The radius of curvature of the
When the radius of curvature of the
The radius of curvature of the second reflecting
7 is a side cross-sectional view of a lens module according to an embodiment of the present invention.
Referring to FIG. 7, the
The relay lens unit includes a
The relay lens unit may further include an
FIG. 8 is a view showing paths of light incident on the lens module of FIG. 7 in all directions.
8, light rays incident through the
Thus, the
9 is a flowchart illustrating an image processing method for fire detection according to an embodiment of the present invention.
Referring to FIG. 9, the processor receives the original image through the omnidirectional camera (S910). As shown in Fig. 8 in which an image is formed on the photoelectric sensor through the omnidirectional lens module, the image captured by the omnidirectional camera is an annular image in which two concentric circles are enclosed on the photoelectric sensor substrate of the two-dimensional plane.
Next, the processor converts the circular original image into a rectangular plane image (S920). Specifically, the incidence at 360 degrees foreground is transformed into a rectangular plane image so that the distorted image is processed through reflection and refraction of light, and the image is easily recognized.
Then, coordinates of the photographed image can be analyzed (S930). Specifically, in the case of the
Then, the distance can be measured (S940). Specifically, the
The above steps S930 and S940 are optional steps and can be omitted. Also, this step may be performed in parallel with the following steps to reach the step S980 of judging the fire.
If the
Next, the processor performs a pixel correction including up-sampling and down-sampling, which fills in the lack of pixels to be developed in the plane of the rectangle in the planarization process and performs averaging on a plurality of pixel information to be contained in one- (S950).
In addition, the processor may further examine the pixels to resolve the bad pixels, and then read (S960) correcting the distorted color values.
In addition, the processor may perform a sharpening process for enhancing the edge of the high-frequency region where the difference between the values of the pixels is large, in order to enhance the image discrimination power (S970).
The processor detects the fire using the corrected image (S980). Specifically, the processor can determine whether a fire has occurred in a region photographed from the corrected image, based on one or a plurality of algorithm operation processing results. Here, the processor can perform more accurate judgment based on the additional information received from the
FIG. 10 is a view showing an original image taken by the fire monitoring apparatus of FIG. 2 and a sector image obtained by dividing the original image.
Referring to FIG. 10 (a), an example of an
FIG. 11 is a diagram for explaining the conversion processing for flattening the sector image of FIG. 10.
Referring to FIG. 11 (a), a sector image obtained by dividing the original image is shown, bounded by an inner arc, an outer arc, and the intermediate arc between them. The outer arc contains more pixels than the intermediate arc, and the inner arc fewer. Accordingly, when the sector is flattened into a rectangle whose width matches the intermediate arc length, the outer arc is down-sampled and the inner arc is interpolated, as described above.
FIG. 12 is a diagram for explaining a polar coordinate transformation method for transforming the original image of FIG. 10 into a plane image.
Referring to FIG. 12 (a), the annular original image is shown, with the position of each pixel expressed in a polar coordinate system by an angle θi and a radius ri about the center (cx, cy) of the torus. Referring to FIG. 12 (b), the rectangular plane image obtained by the polar coordinate transformation is shown.
Planarization for converting the toric original image into the rectangular plane image is performed by relating the polar coordinates of the original image to the rectangular coordinates of the plane image. Here, θi and ri are the angle (deg) and the radius when the pixel position of the original image is expressed in a polar coordinate system. FIG. 12 (b) also shows the corresponding position (xf, yf) on the plane image.
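As a hedged sketch of this planarization, the mapping can be implemented by scanning the target rectangle (xf, yf) and sampling the annulus at the corresponding polar position. The function name, the nearest-neighbour sampling, and the inner/outer-radius parameters are illustrative assumptions, not the patent's exact formula:

```python
import numpy as np

def unwrap_annulus(src, cx, cy, r_in, r_out, out_w, out_h):
    """Flatten an annular (toric) image into a rectangle.

    For each target pixel (xf, yf), the polar position is
    theta = 2*pi * xf / out_w,  r = r_in + (r_out - r_in) * yf / out_h,
    and the source pixel is sampled at
    xi = cx + r*cos(theta), yi = cy + r*sin(theta)  (nearest neighbour).
    """
    xf, yf = np.meshgrid(np.arange(out_w), np.arange(out_h))
    theta = 2.0 * np.pi * xf / out_w
    r = r_in + (r_out - r_in) * yf / out_h
    xi = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, src.shape[1] - 1)
    yi = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, src.shape[0] - 1)
    return src[yi, xi]
```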
13 is a flowchart for determining whether a fire has occurred from an image according to an embodiment of the present invention.
Referring to FIG. 13, a sequence of images processed through planarization, pixel correction, and the like is received (S1310). Specifically, in order to identify whether a fire has occurred from the image, the processor flattens the annular original image photographed in all directions by means of the DSP configuration, and receives, in time order, the sequence of plane image frames processed with pixel correction, color correction, and the like.
Then, the coordinates of the photographed image can be analyzed (S1320).
Next, the distance can be measured (S1330).
The above steps S1320 and S1330 are optional and can be omitted. If they are included, the processor can base its judgment of whether a fire has occurred on more accurate information.
Next, flame candidate pixels are detected based on the brightness and color of the received image (S1340). Specifically, the processor can detect a region of the received plane image in which motion exists. For example, the processor can detect an area of the image that differs from the fixed background. Then, the processor can analyze how the brightness is distributed within the moving area. For example, the processor can detect flame candidate pixels, which are likely to be flame, by determining whether the brightness of the central portion of the moving region is greater than or equal to a predetermined depth value and whether its difference in brightness from the peripheral portion is greater than or equal to a predetermined contrast value. More specifically, in the case of a flame, the brightness of the flame is relatively higher than that of the periphery, and when contours are drawn along equal brightness, a plurality of non-intersecting closed curves are nested. The flame candidate region can be detected by comparing against the predetermined depth value and the preset contrast value using this feature of flames. Alternatively, the flame candidate region can be detected from the change in color between the central and peripheral portions of the flame: the flame is bright white in its deepest part, then yellow, orange, and red progressively farther out.
Next, an adaptive background difference image is obtained (S1350). Background subtraction refers to separating and removing the fixed background in order to extract the moving objects of a time-series image. The adaptive background subtraction algorithm separates the background while adaptively and relatively quickly reflecting environmental changes such as changes in lighting, swaying trees, and the appearance of objects. Using this algorithm, the processor can acquire an image in which everything except the moving flame has been removed as background.
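The patent does not name a specific adaptive algorithm; one minimal form, shown here as an illustration, is an exponentially weighted running-average background model (class name and parameter values are assumptions):

```python
import numpy as np

class AdaptiveBackground:
    """Minimal adaptive background model: an exponentially weighted
    running average that slowly absorbs lighting changes and swaying
    trees, while fast-moving objects (e.g. flames) stay foreground."""
    def __init__(self, alpha=0.05, thresh=25):
        self.alpha = alpha      # adaptation rate
        self.thresh = thresh    # foreground decision threshold
        self.bg = None

    def apply(self, frame):
        frame = np.asarray(frame, dtype=float)
        if self.bg is None:
            self.bg = frame.copy()
        fg = np.abs(frame - self.bg) > self.thresh   # foreground mask
        self.bg = (1 - self.alpha) * self.bg + self.alpha * frame  # adapt
        return fg
```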
In addition, the flame region can be extracted using color (S1360). Specifically, since it may be difficult to accurately extract only the region corresponding to the flame from motion alone, the pixel region corresponding to the flame can be detected more accurately by also using the color of the flame. The color of a flame varies with the burning material, temperature, air, and camera shooting environment, so a Gaussian Mixture Model (GMM) can be used to model these various flame colors. The model can be trained with the EM algorithm for the GMM, which iterates between computing the expected component assignments of the accumulated and newly arriving image data (E-step) and re-estimating the parameters from those expected values (M-step).
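A hedged sketch of EM for a 1-D Gaussian mixture over color samples follows; the function name, deterministic initialization, and diagonal (scalar) variances are simplifying assumptions, not the patent's exact model:

```python
import numpy as np

def fit_gmm_em(x, k=2, iters=100):
    """Fit a 1-D Gaussian mixture to color samples with EM.

    E-step: compute the expected responsibility of each component
    for each sample under the current parameters.
    M-step: re-estimate weights, means, and variances from those
    expected assignments."""
    x = np.asarray(x, dtype=float)
    mu = np.linspace(x.min(), x.max(), k)        # deterministic init
    var = np.full(k, x.var() + 1e-6)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[n, j]
        p = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: parameter updates from the expected assignments
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
    return w, mu, var
```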
Next, the dynamic texture of the flame is analyzed (S1370). Specifically, since the flame moves stochastically depending on the shape of the burning material and the wind direction, the dynamic texture of the image sequence corresponding to the previously acquired changes in flame motion has characteristics unique to flame. As an example, the Volume Local Binary Patterns (VLBP) technique can be used to analyze the dynamic texture characteristics of the extracted flame-region images.
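VLBP thresholds neighbours in the previous, current, and next frames; as a hedged illustration, the following computes only its per-frame spatial building block, a basic 8-neighbour LBP histogram (function name illustrative):

```python
import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbour Local Binary Pattern histogram. VLBP extends
    this code by also thresholding neighbours in the previous and next
    frames, capturing the dynamic texture of the flame over time."""
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]                       # interior (center) pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=int)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(int) << bit  # set bit if neighbour >= center
    hist = np.bincount(code.ravel(), minlength=256).astype(float)
    return hist / hist.sum()                  # normalized 256-bin histogram
```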
Finally, it is determined whether a fire has occurred (S1380). Specifically, the dynamic texture properties of flame images extracted using the VLBP technique are learned by a machine learning algorithm. The feature information of the currently input dynamic texture can then be classified as fire or non-fire by that algorithm. In one embodiment, the k-Nearest Neighbor (k-NN) algorithm may be used to classify the dynamic texture features. In this case, the performance of k-NN depends on the distance measure used in the algorithm, so a design suited to the actual implementation is required. In particular, for LBP features, the raw codes are not used as-is; their histograms are, so it is preferable to determine a suitable histogram matching method experimentally.
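A minimal sketch of this classification stage, assuming chi-square matching for the LBP histograms and a simple majority-vote k-NN (all names and the choice of chi-square are illustrative, not mandated by the patent):

```python
import numpy as np

def chi2_distance(h1, h2, eps=1e-10):
    """Chi-square distance, a common matching measure for LBP histograms."""
    h1, h2 = np.asarray(h1, dtype=float), np.asarray(h2, dtype=float)
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def knn_classify(query, train_hists, labels, k=3):
    """Classify a dynamic-texture histogram as fire / non-fire by majority
    vote among its k nearest training histograms under chi-square distance."""
    d = np.array([chi2_distance(query, h) for h in train_hists])
    nearest = np.array(labels)[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]
```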
Such a fire detection method can detect a fire quickly and accurately without complicated large-capacity operation processing.
In addition, the fire detection method according to an embodiment of the present invention can be implemented in the fire detection apparatus of FIGS. Further, the above-described fire detection method may be embodied as at least one program for executing the fire detection method, and the program may be stored in a computer-readable recording medium.
Thus, each block of the present invention may be embodied as computer-readable code on a computer-readable recording medium. The computer-readable recording medium may be any device capable of storing data that can be read by a computer system.
For example, the computer-readable recording medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical disk, or an optical data storage device. In addition, the computer-readable code may be embodied as a computer data signal in a carrier wave.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is clearly understood that the same is by way of illustration and example only and is not to be construed as limiting the scope of the invention as defined by the appended claims. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention.
100: optical device 110: omnidirectional camera
111: Lens module 112: Photoelectric sensor
120: communication / power interface 130: processor
Claims (7)
A lens module including an aspherical lens that focuses light incident from a foreground of 360 degrees in all directions, and a CMOS sensor for converting the focused light into an electrical signal;
A PoE socket for power reception and network communication; and
A polar coordinate transformation process is performed to convert the circular original image taken by the omnidirectional camera into a plane image having a rectangular shape,
Correction is performed including a down-sampling process that reduces the length of the outer arc of each sector image, obtained by dividing the original image at equal intervals along the circumferential direction, to match the intermediate arc length, and an interpolation process that extends the length of the inner arc,
If the brightness of the center of a region in which motion exists in the corrected planar image is greater than or equal to a predetermined depth value and its difference from the surrounding brightness is greater than or equal to a preset contrast value, the region is detected as a flame candidate region; and a processor for detecting the occurrence of a fire based on the dynamic texture of the flame region learned according to a machine learning algorithm,
Wherein the processor performs the polar coordinate conversion processing using the following equation:
Herein, θi and ri are the angle (deg) and the radius when the pixel position of the original image is expressed in a polar coordinate system; xi and yi are the x-axis and y-axis coordinate values when the pixel position of the original image is expressed in a rectangular coordinate system; R is the radius of the outer concentric circle of the torus; and cx and cy are the x-axis and y-axis coordinate values of the center of the torus.
Wherein the aspherical lens comprises:
A first lens (410) having a first incident surface (411), a first reflection surface (415) formed at the center of the first incident surface (411), and a first exit surface (413); and
A second lens (420) having a second incident surface (421), a second reflection surface (423), and a second exit surface (425) formed at the center of the second reflection surface (423),
Wherein the first exit surface (413) is concave and the second incident surface (421) is convex, the first exit surface (413) and the second incident surface (421) being formed with the same radius of curvature, and
Wherein the radius of curvature of the first incident surface (411) is 21 mm to 23 mm, the radius of curvature of the second reflection surface (423) is 10 mm to 12 mm, and the radius of curvature of the first exit surface (413) and the second incident surface (421) is 29 mm to 31 mm, so that no focal point is formed on the joint surface of the first lens (410) and the second lens (420).
The processor comprising:
Wherein the flame region is extracted from an image separated from the background based on a Gaussian Mixture Model (GMM).
The processor comprising:
Wherein the occurrence of a fire is detected by determining the proximity between the dynamic texture characteristics of the extracted flame region, analyzed using Volume Local Binary Patterns (VLBP), and the flame motion characteristics estimated by machine learning.
The processor comprising:
Wherein the processor transmits at least one of the original image and the plane image to an external storage device connected through the socket.
A first laser generator for emitting a laser beam,
A first frequency modulation section for modulating the frequency of the laser beam emitted by the first laser generation section,
A first plane optical lens for converting a laser beam of a frequency modulated by the first frequency modulator into plane light;
A first laser module including a first plane light receiving unit for receiving reflected light reflected by a measurement object;
A second laser generator for emitting a laser beam,
A second frequency modulator for modulating the frequency of the laser beam emitted by the second laser generator,
A second planar optical lens for converting the laser beam of the frequency modulated by the second frequency modulator to plane light,
A second laser module including a second plane light receiving unit receiving reflected light reflected by a measurement object;
A camera module located between the first laser module and the second laser module and reading the color of the measurement object;
A tilting module for tilting the first laser module and the second laser module; And
A simultaneous emission adjusting unit for simultaneously emitting the laser beam emitted from the first laser generating unit and the second laser generating unit,
A first distance calculation unit for measuring a reception frequency of the first planar light reception unit and calculating a distance to an object to be measured,
A second distance calculation unit for measuring a reception frequency of the second planar light reception unit and calculating a distance to an object to be measured,
A coordinate analysis unit that acquires the distances calculated by the first distance calculation unit and the second distance calculation unit and analyzes the X and Y coordinates of the measurement object; and
A control module including the coordinate analysis unit,
The processor comprising:
And detects occurrence of a fire based on data of distance, coordinates, and color of the measurement target processed by the control module.
A light emitting unit having a planar light converter that converts a laser beam into planar light, the light emitting unit transmitting the planar light emitted through the planar light converter to a first optical lens to diffuse it, and then irradiating the diffused planar light onto the object to be sensed; and
A light receiving unit for transmitting the diffused reflected light to second and third optical lenses and then to a charge coupled device;
An arithmetic processor that extracts a plurality of relative coordinate values of the object to be sensed by numerically arranging the object detection image data, and calculates the relative position, movement path, and movement speed of the object by numerically analyzing the plurality of relative coordinate values; and
And an optical measuring unit for detecting an instantaneous rate of change of the object to be sensed by measuring a pixel change with respect to the object in real time based on the plurality of photon moving states,
The light emitting portion and the light receiving portion are on the same vertical plane,
Wherein the charge coupled device generates the object detection image data by electrically measuring a plurality of photons present in the diffused reflected light, and wherein the irradiation angle of the light emitting portion is adjustable from 0 to 90 degrees and the receiving angle of the light receiving portion is adjustable within 0 to 90 degrees, the apparatus further comprising an additional information device,
The processor comprising:
Wherein the occurrence of a fire is detected based on at least one of the calculated relative position, the movement route, the movement speed, and the detected instantaneous change rate.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150096482A KR101716036B1 (en) | 2015-07-07 | 2015-07-07 | Fire surveillance apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170006079A true KR20170006079A (en) | 2017-01-17 |
KR101716036B1 KR101716036B1 (en) | 2017-03-13 |
Family
ID=57990344
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150096482A KR101716036B1 (en) | 2015-07-07 | 2015-07-07 | Fire surveillance apparatus |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101716036B1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106934763A (en) * | 2017-04-17 | 2017-07-07 | 北京果毅科技有限公司 | panoramic camera, drive recorder, image processing method and device |
CN107607960A (en) * | 2017-10-19 | 2018-01-19 | 深圳市欢创科技有限公司 | A kind of anallatic method and device |
CN107798734A (en) * | 2017-12-07 | 2018-03-13 | 梦工场珠宝企业管理有限公司 | The adaptive deformation method of threedimensional model |
KR20190029901A (en) * | 2017-09-13 | 2019-03-21 | 네이버랩스 주식회사 | Light focusing system for detection distance enhancement of area sensor type lidar |
KR101953342B1 (en) * | 2017-12-08 | 2019-05-23 | 주식회사 비젼인 | Multi-sensor fire detection method and system |
CN109917405A (en) * | 2019-03-04 | 2019-06-21 | 中国电子科技集团公司第十一研究所 | A kind of laser distance measurement method and system |
KR200489704Y1 (en) * | 2019-01-21 | 2019-07-25 | 주식회사 엠에이티 | Air quality monitoring apparatus |
KR20190130186A (en) * | 2018-04-16 | 2019-11-22 | 세종대학교산학협력단 | Fire monitoring method and apparatus |
KR102244187B1 (en) * | 2019-10-31 | 2021-04-26 | 한국과학기술원 | Method for video frame interpolation robust to exceptional motion and the apparatus thereof |
KR20210110084A (en) * | 2020-02-28 | 2021-09-07 | (주)트리플렛 | Device and method for detecting fire |
KR102332699B1 (en) * | 2021-06-04 | 2021-12-01 | (주)재상피앤에스 | Event processing system for detecting changes in spatial environment conditions using image model-based AI algorithms |
KR102408171B1 (en) * | 2021-12-20 | 2022-06-13 | 주식회사 코난테크놀로지 | Real-time explosion time detection method in CCTV camera environment and CCTV image processing apparatus |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102067994B1 (en) | 2019-05-20 | 2020-01-20 | 한밭대학교 산학협력단 | System for detecting flame of embedded environment using deep learning |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100901784B1 (en) * | 2008-11-11 | 2009-06-11 | 주식회사 창성에이스산업 | System for fire warning and the method thereof |
KR20100018998A (en) * | 2008-08-08 | 2010-02-18 | 펜타원 주식회사 | Omnidirectional monitoring camera system and an image processing method using the same |
KR101432440B1 (en) * | 2013-04-29 | 2014-08-21 | 홍익대학교 산학협력단 | Fire smoke detection method and apparatus |
KR101439411B1 (en) * | 2014-01-23 | 2014-09-11 | 이선구 | Omnidirectional lens module |
2015-07-07: KR KR1020150096482A patent/KR101716036B1/en active IP Right Grant
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106934763B (en) * | 2017-04-17 | 2023-08-22 | 北京灵起科技有限公司 | Panoramic camera, automobile data recorder, image processing method and device |
CN106934763A (en) * | 2017-04-17 | 2017-07-07 | 北京果毅科技有限公司 | panoramic camera, drive recorder, image processing method and device |
KR20190029901A (en) * | 2017-09-13 | 2019-03-21 | 네이버랩스 주식회사 | Light focusing system for detection distance enhancement of area sensor type lidar |
CN107607960A (en) * | 2017-10-19 | 2018-01-19 | 深圳市欢创科技有限公司 | A kind of anallatic method and device |
CN107798734A (en) * | 2017-12-07 | 2018-03-13 | 梦工场珠宝企业管理有限公司 | The adaptive deformation method of threedimensional model |
KR101953342B1 (en) * | 2017-12-08 | 2019-05-23 | 주식회사 비젼인 | Multi-sensor fire detection method and system |
KR20190130186A (en) * | 2018-04-16 | 2019-11-22 | 세종대학교산학협력단 | Fire monitoring method and apparatus |
KR200489704Y1 (en) * | 2019-01-21 | 2019-07-25 | 주식회사 엠에이티 | Air quality monitoring apparatus |
CN109917405B (en) * | 2019-03-04 | 2021-09-03 | 中国电子科技集团公司第十一研究所 | Laser ranging method and system |
CN109917405A (en) * | 2019-03-04 | 2019-06-21 | 中国电子科技集团公司第十一研究所 | A kind of laser distance measurement method and system |
KR102244187B1 (en) * | 2019-10-31 | 2021-04-26 | 한국과학기술원 | Method for video frame interpolation robust to exceptional motion and the apparatus thereof |
WO2021085757A1 (en) * | 2019-10-31 | 2021-05-06 | 한국과학기술원 | Video frame interpolation method robust against exceptional motion, and apparatus therefor |
KR20210110084A (en) * | 2020-02-28 | 2021-09-07 | (주)트리플렛 | Device and method for detecting fire |
KR102332699B1 (en) * | 2021-06-04 | 2021-12-01 | (주)재상피앤에스 | Event processing system for detecting changes in spatial environment conditions using image model-based AI algorithms |
KR102408171B1 (en) * | 2021-12-20 | 2022-06-13 | 주식회사 코난테크놀로지 | Real-time explosion time detection method in CCTV camera environment and CCTV image processing apparatus |
Also Published As
Publication number | Publication date |
---|---|
KR101716036B1 (en) | 2017-03-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101716036B1 (en) | Fire surveillance apparatus | |
US11842564B2 (en) | Imaging apparatus and imaging system | |
US11405535B2 (en) | Quad color filter array camera sensor configurations | |
US9992457B2 (en) | High resolution multispectral image capture | |
JP4347888B2 (en) | Method and apparatus for generating infrared image and normal image | |
US10848693B2 (en) | Image flare detection using asymmetric pixels | |
US8427632B1 (en) | Image sensor with laser for range measurements | |
US20140028861A1 (en) | Object detection and tracking | |
CN111062378A (en) | Image processing method, model training method, target detection method and related device | |
US9412039B2 (en) | Blur detection system for night scene images | |
CA2654455A1 (en) | Apparatus and method for determining characteristics of a light source | |
CN111294526B (en) | Processing method and device for preventing camera from being burnt by sun | |
US8970728B2 (en) | Image pickup apparatus and image processing method | |
CN108234897B (en) | Method and device for controlling night vision system, storage medium and processor | |
CN108886571B (en) | Imaging apparatus with improved auto-focusing performance | |
US9894255B2 (en) | Method and system for depth selective segmentation of object | |
JP7192778B2 (en) | IMAGING APPARATUS AND METHOD AND IMAGE PROCESSING APPARATUS AND METHOD | |
JP2014138290A (en) | Imaging device and imaging method | |
JP2004222231A (en) | Image processing apparatus and image processing program | |
JP2018056786A (en) | Image processing device, imaging apparatus, movable body and image processing method | |
US11245878B2 (en) | Quad color filter array image sensor with aperture simulation and phase detection | |
JP2017207883A (en) | Monitoring system, color camera device and optical component | |
JPWO2017022331A1 (en) | Control device, control method, computer program, and electronic device | |
CN211880472U (en) | Image acquisition device and camera | |
CN108449547B (en) | Method for controlling a night vision system, storage medium and processor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |