KR20170006079A - Fire surveillance apparatus

Fire surveillance apparatus

Info

Publication number
KR20170006079A
Authority
KR
South Korea
Prior art keywords
image
lens
light
fire
processor
Prior art date
Application number
KR1020150096482A
Other languages
Korean (ko)
Other versions
KR101716036B1 (en)
Inventor
이선구
이충구
Original Assignee
이선구
이충구
Priority date
Filing date
Publication date
Application filed by 이선구, 이충구
Priority to KR1020150096482A
Publication of KR20170006079A
Application granted
Publication of KR101716036B1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00 - Fire alarms; Alarms responsive to explosion
    • G08B17/12 - Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125 - Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions, by using a video camera to detect fire or smoke
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 - Systems determining position data of a target
    • G01S13/08 - Systems for measuring distance only
    • G01S13/10 - Systems for measuring distance only using transmission of interrupted, pulse modulated waves
    • G01S13/18 - Systems for measuring distance only using transmission of interrupted, pulse modulated waves wherein range gates are used
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/08 - Systems determining position data of a target for measuring distance only
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/42 - Simultaneous measurement of distance and other co-ordinates
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 - Constructional features, e.g. arrangements of optical elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Alarm Systems (AREA)
  • Fire-Detection Mechanisms (AREA)

Abstract

A fire monitoring apparatus is disclosed. The fire monitoring apparatus of the present invention includes an omnidirectional camera having a lens module with an aspheric lens that focuses light incident from the full 360-degree surroundings and a CMOS sensor that converts the focused light into an electrical signal; a PoE socket for receiving power and providing network communication; and a processor. The processor performs a polar coordinate transformation that converts the annular original image captured by the omnidirectional camera into a rectangular plane image, applying downsampling to shorten the outer arc and interpolation to lengthen the inner arc of each sector image, obtained by dividing the original image at regular angular intervals, so that both match the length of the intermediate arc. The processor then determines whether the brightness at the center of a region containing motion in the corrected plane image exceeds the brightness of its surroundings by at least a predetermined value, compares preceding and following frames of the corrected plane image to extract a flame region, and detects the occurrence of a fire based on the dynamic texture of the flame region learned according to a machine learning algorithm.

Description

FIRE SURVEILLANCE APPARATUS

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a fire monitoring apparatus, and more particularly, to a fire monitoring apparatus that detects whether a fire has occurred from the image of a surveillance camera photographing 360 degrees in all directions.

Fire accidents can cause extensive damage to human life and property. To prevent this, monitoring devices for detecting the occurrence of a fire are installed inside buildings or in specific areas.

Conventionally, devices that detect fires using CCTV cameras suffer from blind spots due to the narrow viewing angle of each camera, and installing multiple cameras to avoid this imposes a cost burden. In addition, the images captured by the plurality of cameras must be collected in an external server connected over a communication network, where image processing is performed to determine whether a fire has occurred.

SUMMARY OF THE INVENTION

It is an object of the present invention to provide a fire monitoring apparatus that detects the occurrence of a fire from the image of a surveillance camera photographing 360 degrees in all directions.

The fire monitoring apparatus according to an embodiment of the present invention includes an omnidirectional camera having a lens module with an aspherical lens that focuses light incident from the full 360-degree surroundings and a CMOS sensor that converts the focused light into an electrical signal; a PoE socket for receiving electric power and for network communication; and a processor. The processor performs a polar coordinate transformation that converts the annular original image photographed by the omnidirectional camera into a rectangular plane image, a downsampling process that reduces the length of the outer arc of each sector image, obtained by dividing the original image at equal intervals along the circumferential direction, to match the length of the intermediate arc, and an interpolation process that extends the length of the inner arc. The processor then determines whether the brightness at the center of a region containing motion in the corrected plane image exceeds that of its surroundings by at least a predetermined value, compares preceding and following frames of the corrected plane image to extract a flame region, and detects the occurrence of a fire based on the dynamic texture of the flame region learned according to a machine learning algorithm. The processor performs the polar coordinate transformation using the following equations:

[Equations 1 to 4 appear as images (pat00001 to pat00004) in the original publication and are not reproduced here; they express the relationship between the polar coordinates (θi, ri) and the rectangular coordinates (xi, yi) of a pixel of the original image, using the variables defined below.]

Here, θi and ri are the angle (in degrees) and radius of a pixel of the original image expressed in a polar coordinate system; xi and yi are the x-axis and y-axis coordinate values of that pixel expressed in a rectangular coordinate system; R is the radius of the outer concentric circle measured from the center of the annulus; and cx and cy are the x and y coordinate values of the center of the annulus.
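A minimal numpy sketch of this kind of polar-to-rectangular unwrapping is shown below. It assumes that the centre (cx, cy) and the inner and outer radii of the annulus are known and that nearest-neighbour lookup is acceptable; the exact constants of the patent's equations are not reproduced here.

```python
import numpy as np

def unwrap_annular(original, cx, cy, r_inner, r_outer, out_w, out_h):
    """Map an annular (donut-shaped) image onto a rectangular plane image.

    For every output pixel (u, v) an angle theta_i and a radius r_i are chosen,
    and the corresponding source pixel is looked up with
    x_i = cx + r_i * cos(theta_i), y_i = cy + r_i * sin(theta_i).
    """
    u = np.arange(out_w)
    v = np.arange(out_h)
    theta = (u / out_w) * 2.0 * np.pi                      # angle around the ring
    radius = r_inner + (v / out_h) * (r_outer - r_inner)   # radius between the concentric circles
    theta_grid, r_grid = np.meshgrid(theta, radius)

    x = np.clip(np.round(cx + r_grid * np.cos(theta_grid)).astype(int), 0, original.shape[1] - 1)
    y = np.clip(np.round(cy + r_grid * np.sin(theta_grid)).astype(int), 0, original.shape[0] - 1)
    return original[y, x]    # nearest-neighbour sampling of the annular source image
```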

The aspherical lens includes a first lens (10) having a convex first incident surface (11) formed on one surface, a first exit surface (13) formed on the other surface, and a first reflecting surface (15) formed at the center of the first incident surface (11); and a second lens (20) having a second incident surface (21) formed on one surface and joined to the first exit surface (13), a second reflecting surface (23) formed on the other surface, and a second exit surface (25) formed at the center of the second reflecting surface (23). The first exit surface (13) is concave and the second incident surface (21) is convex, and the curvatures of the first exit surface (13) and the second incident surface (21) are formed to be the same. The radius of curvature of the first incident surface (11) is 21 mm to 23 mm, the radius of curvature of the second reflecting surface (23) is 10 mm to 12 mm, and the radius of curvature of the first exit surface (13) and the second incident surface (21) is 29 mm to 31 mm, so that no focus is formed at the contact surface of the first lens (10) and the second lens (20).

Meanwhile, the processor may extract the flame region from a background-separated image based on a Gaussian Mixture Model (GMM).
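The patent does not give implementation details for the GMM-based background separation; the sketch below uses OpenCV's MOG2 Gaussian-mixture background subtractor as a stand-in, with purely illustrative parameter values.

```python
import cv2

# MOG2 is OpenCV's Gaussian-mixture background/foreground segmenter.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=False)

def moving_region_mask(plane_frame):
    """Return a binary mask of moving pixels in the flattened plane image,
    from which candidate flame regions can be extracted."""
    mask = subtractor.apply(plane_frame)                     # per-pixel GMM foreground decision
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # suppress isolated noise
```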

In this case, the processor may determine whether a fire has occurred by evaluating, using Volume Local Binary Patterns (VLBP), the proximity between the dynamic-texture characteristics of the extracted flame region and the flame-motion characteristics estimated by machine learning.
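VLBP itself is only named in the patent; the sketch below is a heavily reduced dynamic-texture descriptor in the same spirit (a 6-bit code per pixel built from three consecutive frames, instead of the full 3P+2-bit VLBP code), intended only to illustrate how such a histogram could be compared against learned flame-texture histograms.

```python
import numpy as np

def reduced_vlbp_histogram(prev, curr, nxt, threshold=0):
    """Build a 6-bit code for each interior pixel of the current frame from its
    four in-plane neighbours plus the co-located pixels of the previous and next
    frames, then return the normalised histogram of codes (a simplified sketch,
    not the full VLBP operator)."""
    c = curr[1:-1, 1:-1].astype(np.int16)
    neighbours = [
        curr[:-2, 1:-1], curr[2:, 1:-1], curr[1:-1, :-2], curr[1:-1, 2:],  # spatial neighbours
        prev[1:-1, 1:-1], nxt[1:-1, 1:-1],                                 # temporal neighbours
    ]
    code = np.zeros_like(c)
    for bit, n in enumerate(neighbours):
        code |= ((n.astype(np.int16) - c) >= threshold).astype(np.int16) << bit
    hist, _ = np.histogram(code, bins=64, range=(0, 64), density=True)
    return hist   # feature vector to compare (e.g. by chi-square distance) with learned flame textures
```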

Meanwhile, the processor may transmit at least one of the original image and the plane image to an external storage device connected through the socket.

The fire monitoring apparatus may further include a first laser module including a first laser generator for emitting a laser beam, a first frequency modulator for modulating the frequency of the laser beam emitted by the first laser generator, a first planar-light optical lens for converting the frequency-modulated laser beam into planar light, and a first planar-light receiver for receiving the reflected light returned from a measurement object; a second laser module including a second laser generator, a second frequency modulator, a second planar-light optical lens, and a second planar-light receiver configured in the same manner; a camera module positioned between the first and second laser modules; a tilting module for tilting the first laser module and the second laser module; and a control module including a simultaneous-emission controller for causing the first and second laser generators to emit laser beams at the same time, a first distance calculator for calculating the distance to the measurement object by measuring the reception frequency of the first planar-light receiver, a second distance calculator for measuring the reception frequency of the second planar-light receiver, and a coordinate analyzer for obtaining the distances calculated by the first and second distance calculators and analyzing the X and Y coordinates of the measurement object. The processor can detect the occurrence of a fire based on the distance, coordinate, and color data of the measurement object processed by the control module.

The fire monitoring apparatus may further include a light emitting unit including a planar-light converter for converting a laser beam into planar light and a first optical lens that diffuses the planar light and irradiates a sensing object with it; a light receiving unit including second and third optical lenses that diffuse the reflected light returning from the sensing object and a charge-coupled device that electrically measures the photons present in the diffused reflected light to generate object-sensing image data; an operation processor that extracts a plurality of relative coordinate values of the sensing object by numerically arranging the object-sensing image data and numerically analyzes them to calculate the relative position, movement path, and movement speed of the sensing object; and an optical measuring unit that detects the instantaneous rate of change of the sensing object by measuring pixel changes of the object in real time. The light emitting unit and the light receiving unit lie on the same vertical plane, the irradiation angle of the light emitting unit is adjustable within 0 to 90 degrees, and the receiving angle of the light receiving unit is adjustable within 0 to 90 degrees. The processor can detect the occurrence of a fire based on at least one of the calculated relative position, movement path, movement speed, and detected instantaneous rate of change.

The fire monitoring apparatus according to various embodiments as described above can reduce the cost and space for installing the entire monitoring system, and realize a fire alarm with high reliability.

FIG. 1 is a block diagram showing the configuration of a fire monitoring system according to an embodiment of the present invention;
FIG. 2 is a block diagram showing the configuration of a fire monitoring apparatus according to an embodiment of the present invention;
FIG. 3 is a block diagram showing a more detailed configuration of the fire monitoring apparatus of FIG. 2;
FIG. 4 is a block diagram showing a configuration according to a first embodiment of the additional information apparatus of FIG. 3;
FIG. 5 is a block diagram showing a configuration according to a second embodiment of the additional information apparatus of FIG. 3;
FIG. 6 is a side cross-sectional view of an omnidirectional lens according to an embodiment of the present invention;
FIG. 7 is a side cross-sectional view of a lens module according to an embodiment of the present invention;
FIG. 8 is a view showing the paths of light incident on the lens module of FIG. 7 from all directions;
FIG. 9 is a flowchart illustrating an image processing method for fire detection according to an embodiment of the present invention;
FIG. 10 is a view showing an original image taken by the fire monitoring apparatus of FIG. 2 and the sector images obtained by dividing the original image;
FIG. 11 is a diagram for explaining the conversion processing for flattening a sector image of FIG. 10;
FIG. 12 is a diagram for explaining a polar coordinate transformation method for transforming the original image of FIG. 10 into a plane image; and
FIG. 13 is a flowchart for determining whether a fire has occurred from an image according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear.

FIG. 1 is a block diagram showing the configuration of a fire monitoring system according to an embodiment of the present invention.

Referring to FIG. 1, a fire monitoring system 1000 according to an embodiment of the present invention includes a plurality of fire monitoring apparatuses 100-1, 100-2, ..., 100-N, a hub 10, a network video recorder 20, and, connected through a network 30, a user terminal 40, a disaster prevention server 60, and a fire department 50.

The fire monitoring apparatus 100 captures images of all directions. Specifically, the fire monitoring apparatus 100 can shoot an image for monitoring all directions of the installed area. Then, the fire monitoring apparatus 100 can determine whether or not a fire has occurred from the photographed image.

A plurality of fire monitoring apparatuses 100-1, 100-2, ..., 100-N may be installed in one area. For example, a plurality of fire monitoring apparatuses 100-1, 100-2, ..., 100-N may be installed at various places in a building or in a place where a fire hazard exists in one factory complex.

The plurality of fire monitoring apparatuses 100-1, 100-2, ..., 100-N are connected to the hub 10. More specifically, each of the fire monitoring apparatuses 100-1, 100-2, ..., 100-N is assigned to a port of the hub 10, so that it can be connected to the local network through the hub 10.

The network video recorder 20 records the images transmitted from the hub 10. Specifically, the network video recorder 20 can separate the images transmitted from each fire monitoring apparatus 100 and store them in a storage. In addition, the network video recorder 20 can provide a recorded image to an external device through the network 30, and can compress an image received from the fire monitoring apparatus 100 or transmit it in real time.

The network 30 includes both wired and wireless communication networks. Here, the wired network includes an Internet network such as a cable network or a public switched telephone network (PSTN), and the wireless communication network includes technologies such as CDMA, WCDMA, GSM, Evolved Packet Core (EPC), and Long Term Evolution (LTE). Therefore, when the network 30 is a wired communication network, an access point can connect to the exchange of a telephone office or the like, whereas in the case of a wireless communication network it may connect to an SGSN or a Gateway GPRS Support Node (GGSN) operated by a communication provider, or to a base station such as a NodeB or e-NodeB.

The network 30 also includes small base stations (APs) such as femto or pico base stations, which are installed in many buildings. The AP includes a short-range communication module for performing short-range communication such as ZigBee and Wi-Fi. In the embodiment of the present invention, short-range communication includes a wide variety of radio frequency (RF) and ultra-wideband (UWB) communication standards such as Bluetooth, ZigBee, IrDA, UHF, and VHF. Accordingly, the AP can extract the location of a data packet, specify the best communication path to the extracted location, and forward the data packet to the next device, e.g., the user terminal device 40, along the designated communication path.

The user terminal device 40 may receive a currently monitored image in real time via the network 30 or may receive a previously recorded image. When the user terminal device 40 accesses the network video recorder 20, it can perform the authentication procedure.

The fire department 50 receives fire occurrence reports. Specifically, the fire department 50 may receive a fire notification from the fire monitoring apparatus 100, reporting a fire in the area where the fire monitoring apparatus 100 is installed.

The disaster prevention server 60 monitors fire occurrence situations over a large area. Specifically, the disaster prevention server 60 can acquire information such as fire occurrence status and images from the fire monitoring systems 1000 of various regions connected through the network 30.

2 is a block diagram showing a configuration of a fire monitoring apparatus according to an embodiment of the present invention.

Referring to FIG. 2, the fire monitoring apparatus 100 includes an omnidirectional camera 110, which comprises a lens module 111 and a photoelectric sensor 112, a communication/power interface 120, and a processor 130.

The omnidirectional camera 110 takes 360-degree omnidirectional images. To this end, the lens module 111 constituting the omnidirectional camera 110 includes an aspherical lens that focuses light incident from the full 360-degree surroundings. The lens module 111 may include a lens barrel in which a plurality of lenses are arranged along the optical axis. The specific configuration of the aspherical lens and the lens module 111 will be described later with reference to FIGS. 6 and 7.

The photoelectric sensor 112 converts the focused light into an electrical signal. Specifically, the photoelectric sensor 112 can convert the image of the subject that the lens module 111 forms at the position of the photoelectric sensor 112 into an electrical signal.

The photoelectric sensor 112 includes a plurality of pixels arranged in a matrix form. The plurality of pixels may form a Bayer pattern. Each of the plurality of pixels accumulates photocharge corresponding to the incident light and outputs the resulting image as an electrical signal. The photoelectric sensor 112 may be a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device). The plurality of pixels according to an embodiment of the present invention may include a plurality of phase-difference pixels.

The omnidirectional camera 110 may include a photodiode PD, a transfer transistor TX, a reset transistor RX, and a floating diffusion node FD. The photodiode PD generates and accumulates photocharge corresponding to the optical image of the subject. The transfer transistor TX transfers the photocharge generated in the photodiode PD to the floating diffusion node FD in response to a transfer signal. The reset transistor discharges the charge stored in the floating diffusion node FD in response to a reset signal. The charge stored in the floating diffusion node FD is output before the reset signal is applied, and CDS (Correlated Double Sampling) processing is performed in the case of a CDS image sensor. An ADC then converts the CDS-processed analog signal into a digital signal.

The omnidirectional camera 110 may include an analog front end (AFE). The AFE samples and digitizes the electrical signal of the subject output from the photoelectric sensor 112. The AFE is controlled by the processor 130.

The omnidirectional camera 110 may include a timing generator (TG). The TG outputs a timing signal for reading out the pixel data of the photoelectric sensor 112. The TG is controlled by the processor 130.

The above-described AFE and TG can be designed to be replaced with different configurations. In particular, when the photoelectric sensor 112 is implemented as a CMOS type, such a configuration may be unnecessary.

The communication/power interface 120 transmits and receives signals. Specifically, the communication/power interface 120 may provide an interface for data exchange and an interface for supplying power. The plug of the cable connecting to the wired communication network and the plug of the power cable supplying power can be realized as a single plug. In one example, the communication/power interface 120 may include an RJ-45 socket. In this case, the pin assignment of the connected cable can comply with the IEEE 802.3af standard.

The processor 130 controls each configuration of the fire monitoring apparatus 100. Specifically, the processor 130 may control the execution of the functions of the respective components of the fire monitoring apparatus 100 for an operation for monitoring a fire.

The processor 130 may be implemented in a number of different ways. For example, the processor 130 may be implemented as an application-specific integrated circuit (ASIC), an embedded processor, a microprocessor, hardware control logic, a hardware finite state machine (FSM), or a digital signal processor (DSP). A control interface (e.g., a bus) may be used for communication between the processor 130 and other components.

The processor 130 may convert the annular original image photographed by the omnidirectional camera 110 into a rectangular plane image. For this purpose, the processor 130 may perform a polar coordinate transformation. The processor 130 may divide the original image at equal intervals along the circumferential direction; here, each divided portion of the original image is referred to as a sector image. The processor 130 may flatten each sector image into a rectangle by adjusting its arc lengths to a predetermined reference length. Here, the reference length may be the length of the arc halfway between the inner and outer concentric circles. A detailed description thereof will be given later with reference to FIGS. 8 to 12.
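For illustration, the arc lengths involved in this flattening can be computed directly; the sketch below assumes only the inner and outer radii of the annulus and the number of sectors (the numbers in the example are hypothetical).

```python
import math

def sector_arc_lengths(r_inner, r_outer, n_sectors):
    """Arc lengths of one sector of the annular image: the inner arc, the
    intermediate (reference) arc used as the target width, and the outer arc."""
    r_mid = (r_inner + r_outer) / 2.0
    return tuple(2.0 * math.pi * r / n_sectors for r in (r_inner, r_mid, r_outer))

# Example: with an inner radius of 100 px, an outer radius of 300 px and 16 sectors,
# the inner, reference and outer arcs are roughly 39, 79 and 118 pixels long.
print(sector_arc_lengths(100, 300, 16))
```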

The processor 130 may perform operations to detect a fire from the flattened image. Specifically, the processor 130 may detect a region containing motion in the plane image and then determine that a flame is possible based on the brightness of that region. The processor 130 extracts a flame region suspected of being a flame based on motion and color, and determines whether a fire has occurred by judging, from the dynamic texture of the moving flame image, whether the motion corresponds to that peculiar to flames. A detailed description thereof will be given later with reference to FIG. 13.

As described above, the fire monitoring apparatus 100 captures an omnidirectional image through a single photographing module and directly determines whether a fire has occurred. Thus, it can secure economy in cost and installation space as well as reliability of fire detection.

FIG. 3 is a block diagram showing a more detailed configuration of the fire monitoring apparatus of FIG. 2.

Referring to FIG. 3, the fire monitoring apparatus 100 includes an omnidirectional camera 110, a communication/power interface (PoE) 120, a processor 130, a temperature sensor 140, a smoke sensor 150, a speaker 160, and a beacon light 170; a hub 20 and a network video recorder 30 are additionally shown for the sake of explanation. The fire monitoring apparatus 100 may further include an additional information apparatus 300 as an optional component. The omnidirectional camera 110, the communication/power interface 120, and the processor 130 are identical in configuration and function to those of FIG. 2, so redundant description is omitted.

The photoelectric sensor 112 constituting the omnidirectional camera 110 may be a CMOS sensor 112. The electrical signal output from the CMOS sensor 112 is transmitted to an image signal processing unit (ISP) 131, which samples the electrical signal and converts it into a digital image signal.

The converted digital image signal (raw data) is transmitted to a digital signal processing unit (DSP) 132, which processes it so that the surroundings can be monitored in all directions and the occurrence of a fire determined. In detail, the digital signal processor 132 may perform interpolation to fill in the pixel information missing in the process of flattening the annular original image, and downsampling to reduce the excess pixel information according to the size of the display image. In addition, the digital signal processor 132 may inspect the converted plane image to interpolate insufficient color information, perform color correction, and the like. Meanwhile, the digital signal processor 132 may convert the RGB color space into the YCbCr color space to perform the above-described image processing.
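The patent only states that the DSP may switch to the YCbCr colour space before this processing; a minimal OpenCV sketch of that conversion (OpenCV names the space YCrCb) follows.

```python
import cv2

def split_ycbcr(plane_image_bgr):
    """Convert the corrected plane image from BGR to YCbCr, separating the luma
    channel (useful for brightness-based flame cues) from the chroma channels."""
    ycrcb = cv2.cvtColor(plane_image_bgr, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)
    return y, cb, cr
```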

A microcomputer (MICOM) 133 processes signals received from the camera 110 and the sensors 140 and 150 and reports the results. For example, the microcomputer 133 can record where and when the omnidirectional image was photographed by the camera. It can also determine whether a fire has occurred from the image processed by the DSP 132 and from the signals detected by the temperature sensor 140 and the smoke sensor 150. When a predetermined condition is satisfied or an event occurs, the microcomputer 133 can control the output units 160 and 170 to generate a corresponding output.
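How the microcomputer combines the image result with the sensor readings is not specified in the patent; the sketch below shows one plausible fusion rule, with hypothetical thresholds chosen purely for illustration.

```python
def fire_decision(image_fire_score, temperature_c, smoke_level,
                  score_threshold=0.8, temp_threshold=60.0, smoke_threshold=0.3):
    """Raise a fire alarm when the image-based evidence is strong on its own,
    or when at least two of the three cues (image, temperature, smoke) agree.
    All thresholds are illustrative assumptions, not values from the patent."""
    cues = [
        image_fire_score >= score_threshold,   # result of the DSP image analysis
        temperature_c >= temp_threshold,       # temperature sensor 140
        smoke_level >= smoke_threshold,        # smoke sensor 150
    ]
    return cues[0] or sum(cues) >= 2
```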

The additional information device 300 may obtain additional information about surrounding objects. Specifically, the additional information device 300 can acquire information such as the distance, movement, position, coordinates, and color of an object around the fire monitoring apparatus 100, and can transmit the obtained distance and/or coordinates of the object to the DSP 132. In this case, the DSP 132 may attach the additional information received from the additional information device 300 and transmit it to the microcomputer 133 as information for determining whether a fire has occurred. The specific configuration and function of the additional information device 300 will be described later with reference to FIGS. 4 and 5.

The input / output unit 134 provides an interface for inputting / outputting information to / from an external device. Although not shown, the input / output unit 134 may include a receiver for receiving a signal of a remote control, a wireless communication module such as Wi-Fi, and a wired communication module such as a USB.

The temperature sensor 140 senses the temperature of the air around the fire monitoring apparatus 100. Using the temperature sensor 140, the fire monitoring apparatus 100 can collect information on the temperature of the air that has been raised in the event of a fire.

The smoke sensor 150 senses the smoke concentration of the air around the fire monitoring apparatus 100. Specifically, the smoke sensor 150 senses an increased smoke concentration in the air and can provide information on the changed composition of the air during a fire.

The speaker 160 outputs sound. Specifically, the speaker 160 may output an alarm sound for notifying occurrence of a fire.

The beacon light 170 emits light. Specifically, the beacon light 170 can output light for notifying the occurrence of a fire.

The communication/power interface 120 may be implemented with PoE (Power over Ethernet) technology. Specifically, the hub 20 can provide power supplied by a UPS along with routing of the data communication, and the fire monitoring apparatus 100 can receive power together with data via the PoE socket from the Ethernet cable that connects the communication/power interface 120 to the hub 20.

As described above, the fire monitoring apparatus 100 captures an omnidirectional image through a single photographing module and directly determines whether a fire has occurred. Thus, it can secure economy in cost and installation space as well as reliability of fire detection.

FIG. 4 is a block diagram showing a configuration according to a first embodiment of the additional information apparatus of FIG. 3.

Referring to FIG. 4, the additional information apparatus 400 according to the first embodiment includes a first laser module 410, a second laser module 420, a camera module 430, a tilting module 440, and a control module 450.

The first laser module 410 includes a first laser generator 411, a first frequency modulator 412, a first plane optical lens 413, and a first plane optical receiver 414.

The first laser generation unit 411 emits a laser beam.

The first frequency modulator 412 modulates the frequency of the laser beam emitted by the first laser generator 411.

The first plane optical lens 413 converts the laser beam frequency-modulated by the first frequency modulator 412 into plane light.

The first plane light receiving section 414 receives the reflected light that returns from the measurement object when the laser beam is emitted in the form of plane light through the first plane optical lens 413.

The second laser module 420 includes a second laser generator 421, a second frequency modulator 422, a second plane optical lens 423, and a second plane optical receiver 424.

The second laser generation unit 421 emits a laser beam.

The second frequency modulating section 422 modulates the frequency of the laser beam emitted by the second laser generating section 421.

The second plane optical lens 423 converts the laser beam frequency-modulated by the second frequency modulator 422 into plane light.

The second plane light receiving section 424 receives the reflected light that returns from the measurement object when the laser beam is emitted in the form of plane light through the second plane optical lens 423.

In the first laser module 410 and the second laser module 420, the first plane optical lens 413 and the second plane optical lens 423 are designed to convert the laser beams emitted by the first laser generator 411 and the second laser generator 421 into wide plane light projected within a range of 60 to 180 degrees.

The first frequency modulating unit 412 and the second frequency modulating unit 422 modulate the frequencies of their respective laser beams into different frequency bands.

The first plane light receiving section 414 and the second plane light receiving section 424 each include a filter unit (not shown) so as to receive only the frequency band provided by the corresponding frequency modulation section 412 or 422.

The camera module 430 is located between the first laser module 410 and the second laser module 420 and reads optical characteristics that cannot be detected by a laser, such as the color or surface pattern of the measurement object.

The tilting module 440 is connected to the first laser module 410 and the second laser module 420 and tilts when the laser beam is emitted to help measurement of the measurement object.

The control module 450 includes a simultaneous emission adjustment unit 451, a first distance calculation unit 452, a second distance calculation unit 453, and a coordinate analysis unit 454.

The simultaneous emission control unit 451 causes the first laser generation unit 411 and the second laser generation unit 421 to emit their laser beams simultaneously.

The first distance calculation section 452 measures the reception frequency of the first plane light reception section 414 and calculates the distance to the measurement object.

The second distance calculation unit 453 measures the reception frequency of the second plane light reception unit 424 and calculates the distance to the measurement target.

The first distance calculator 452 and the second distance calculator 453 calculate the distance by applying a Doppler effect to measure the reflection distance of the laser beam reflected by the measurement object.
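The patent does not give the distance formula; for a linearly frequency-modulated beam, the standard FMCW relation between the beat frequency of the emitted and received signals and the range is sketched below as one possible realisation (variable names and the example numbers are illustrative assumptions).

```python
C = 299_792_458.0   # speed of light in m/s

def fmcw_range(beat_frequency_hz, sweep_bandwidth_hz, sweep_period_s):
    """Range from the beat between the emitted and received frequencies of a
    linearly frequency-modulated beam: R = c * f_beat * T / (2 * B)."""
    return C * beat_frequency_hz * sweep_period_s / (2.0 * sweep_bandwidth_hz)

# Example: a 150 kHz beat with a 1 GHz sweep over 1 ms corresponds to about 22.5 m.
print(fmcw_range(150e3, 1e9, 1e-3))
```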

The coordinate analyzer 454 obtains the distances calculated by the first distance calculator 452 and the second distance calculator 453 to analyze the X and Y coordinates of the measurement object.

The control module 450 determines the size of the measurement object from the coordinates analyzed by the coordinate analysis unit 454 and analyzes the color information received from the camera module 430, producing distance, coordinate, color, and size data for the measurement object.

In addition, the control module 450 can change the color of the measurement target read from the camera module 430 to a different color according to the user's selection.

The DSP 132 receives the distance, coordinate, color, and size data of the measurement object processed by the control module 450, and transmits it to the microcomputer 133, which determines whether a fire has occurred, as additional information for identifying a fire more accurately.

FIG. 5 is a block diagram showing a configuration according to a second embodiment of the additional information apparatus of FIG. 3.

Referring to FIG. 5, the plane light emitted through the light emitting unit 510 is applied to the sensing object 530, the reflected light returning from the sensing object 530 is received through the light receiving unit 520, and the plurality of photons present in the reflected light are electrically measured using the charge coupled device 523 provided in the light receiving unit 520 to generate object sensing image data. The additional information device 500 operating in this way includes the light emitting unit 510, the light receiving unit 520, the charge coupled device 523, an operation processor 540, an optical measuring device 550, an input/output unit 560, and a controller 570.

The light emitting unit 510 includes a planar light converter 511 that converts the laser beam into plane light. The plane light emitted through the planar light converter 511 is transmitted to the first optical lens 512, which diffuses it and irradiates the sensing object 530 with the diffused plane light.

The light receiving unit 520 transmits the reflected light returning from the sensing object 530, which has been irradiated with the diffused plane light, to the second and third optical lenses 521 and 522; these diffuse the reflected light and transmit it to the charge coupled device 523.

The irradiation angle of the light emitting portion 510 can be adjusted within 0 to 90 degrees and the receiving angle of the light receiving portion 520 can be adjusted within 0 to 90 degrees.

For example, when the irradiation angle of the light emitting unit 510 is fixed to a value selected from 0 to 60 degrees, the receiving angle of the light receiving unit may be fixed to a value selected from 0 to 5 degrees. In another example, when the irradiation angle of the light emitting unit 510 is fixed to a value selected from 0 to 5 degrees, the receiving angle of the light receiving unit 520 may be fixed to a value selected from 0 to 60 degrees.

Note that the light emitting portion 510 and the light receiving portion 520 are present on the same vertical plane.

The first optical lens 512 is a concave lens, the second optical lens 521 is a convex lens, and the third optical lens 522 is another concave lens, which is considered one of the important points of the present invention. The first optical lens 512 is provided in the light emitting unit 510, and the second and third optical lenses 521 and 522 are provided in the light receiving unit 520.

In practice, an F-theta lens is widely used by those skilled in the art as the first optical lens 512 mounted on the light emitting unit 510.

The F-theta lens is designed to receive the laser beam from the light emitting unit 510 without loss and to maximize the viewing angle when emitting plane light. The F-theta lens can be filter-coated so that it detects only a specific wavelength (in the present invention, a wavelength range of 150 nm to 10.6 μm corresponding to a laser beam).

Among lenses widely used by those skilled in the art for mounting the second and third optical lenses 521 and 522 on the light receiving unit 520, an MLA (Micro Lens Array) lens is exemplified as the second optical lens 521 and a focusing lens as the third optical lens 522.

The MLA (Micro Lens Array) lens is a cylindrical lens designed to uniformly converge the reflected light returning from the sensing object 530 when the object is irradiated with the plane light diffused by the first optical lens 512; it is one of the lens models used in the present invention to redirect the reflected light horizontally and transmit it to the focusing lens.

The focusing lens focuses the reflected light received from the MLA lens and transmits it to the charge coupled device 523.

The charge coupled device (CCD) 523 according to an embodiment of the present invention generates object sensing image data by electrically measuring the plurality of photons present in the diffused reflected light.

The charge coupled device 523 is also referred to as a charge coupled detector, a charge transfer device, or a charge control device. It is a semiconductor integrated circuit device that, using the property that charge flows toward the control electrode to which the higher voltage is applied, can move the charge injected into the semiconductor region immediately below a minute electrode to below an adjacent electrode, or take it out to the outside, by applying pulse voltages to a group of electrodes; such devices are used in delay circuits or memory circuits in which charge is transferred sequentially.

The charge coupled device 523 is formed by forming an insulating layer about 0.1 mm thick on the surface of an n-type semiconductor substrate and arranging metal electrodes on it; by controlling the voltages of the metal electrodes, the accumulated charges can be transferred sequentially.

The charge coupled device 523 has two functions, namely storage by accumulation of electric charge and transfer by movement of electric charge, and it can store and transmit an analog quantity according to the amount of charge. It is therefore used as an image sensor.

The operation processor 540 extracts a plurality of relative coordinate values of the sensing object 530 by numerically arranging the object sensing image data, and then numerically analyzes the relative coordinate values to calculate the relative position, movement path, and movement speed of the sensing object 530.
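As an illustration of the last step, the sketch below derives a path length and an average speed from a sequence of relative coordinates; the function and its inputs are hypothetical, since the patent does not fix a particular numerical procedure.

```python
import numpy as np

def motion_summary(relative_coords, dt):
    """Given successive relative (x, y) coordinates of the sensing object and the
    sampling interval dt (seconds), return the total path length and the average
    moving speed."""
    coords = np.asarray(relative_coords, dtype=float)
    steps = np.linalg.norm(np.diff(coords, axis=0), axis=1)   # distance covered per interval
    return steps.sum(), (steps / dt).mean()

# Example: an object drifting by roughly half a unit per one-second sample.
print(motion_summary([(0.0, 0.0), (0.5, 0.0), (1.0, 0.1)], dt=1.0))
```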

Various numerical analysis methods can be used to find the position of the sensing object from the coordinate values. Accordingly, the numerical analysis applied to the present invention may use various methods such as interpolation, LU decomposition, the CDG (Characteristic-Dissipative-Galerkin) method, and the MDS (Multi Dimensional Scaling) technique.

The optical measuring device 550 detects the instantaneous rate of change of the sensing object 530 by measuring pixel changes of the object in real time on the basis of the movement of the plurality of photons.

The input / output unit 560 outputs the object detection data, the relative position, the movement path, the moving speed, and the instantaneous change rate to the outside.

The controller 570 collectively controls the signal processing of the arithmetic processor 540, the optical metering unit 550 and the input / output unit 560.

The DSP 132 receives the object detection data, relative position, movement path, movement speed, instantaneous rate of change, and the like output by the input/output unit 560, and transmits the information to the microcomputer 133 as additional information for determining whether a fire has occurred.

FIG. 6 is a side cross-sectional view of the omnidirectional lens according to an embodiment of the present invention.

Referring to FIG. 6(a), the omnidirectional lens 400 is composed of a first lens 610 and a second lens 620. Specifically, the omnidirectional lens 400 includes the first lens 610, which has a convex first incident surface 611 formed on one surface, a first exit surface 613 formed on the other surface, and a first reflecting surface 615 formed at the center of the first incident surface 611; and the second lens 620, which has a second incident surface 621 formed on one surface and joined to the first exit surface 613, a second reflecting surface 623 formed on the other surface, and a second exit surface 625 formed at the center of the second reflecting surface 623.

FIG. 6(b) shows the form in which the first lens 610 and the second lens 620 are joined.

The first lens 610 is a lens to which an external light beam is incident. The first incident surface 611 is convex on one surface and an external light ray is incident on the first incident surface 611. A first exit surface 613 is formed on the other surface of the first lens 610 so that light rays incident on the first incident surface 611 are emitted to the first exit surface 613. In addition, a first reflecting surface 615 is formed at the center of the first incident surface 611.

The second lens 620 is a lens joined to the first lens 610. Specifically, a second incident surface 621 is formed on one surface of the second lens 620 and is joined to the first exit surface 613. A convex second reflecting surface 623 is formed on the other surface of the second lens 620, and a second exit surface 625 is formed at the center of the second reflecting surface 623.

Here, it is preferable that the first reflecting surface 615 and the second reflecting surface 623 are coated with a material capable of reflecting visible light, such as aluminum or silver. In addition, although the first reflecting surface 615 and the second exit surface 625 are shown flat in the drawing, this is only one embodiment; they may also be convex or concave.

In the case of the omnidirectional lens module mounted on the surveillance camera, the radius of curvature of the first exit surface 613 and the second incident surface 621 is preferably 30 mm, which includes a range of 29 mm to 31 mm when the manufacturing tolerance (±1 mm) is taken into account. When the radius of curvature of the first exit surface 613 and the second incident surface 621 is larger than 31 mm, the size of the second lens 620 becomes larger; when it is smaller than 29 mm, manufacturing the second lens 620 becomes difficult.

The radius of curvature of the first incident surface 611 is preferably 21 to 23 mm, so that a field of view in the range of 40° to 85° in the vertical direction can be secured.

When the radius of curvature of the first incident surface 611 is outside the range of 21 to 23 mm, shielding occurs at the second exit surface 625 and a proper angle of view cannot be secured; some of the incident rays cannot be output through the second exit surface 625, so the intended angle of view cannot be obtained. Therefore, it is most preferable to set the radius of curvature of the first incident surface 611 to 22 mm in consideration of the manufacturing tolerance (±1 mm).

The radius of curvature of the second reflecting surface 623 is preferably 10 to 12 mm. If the radius of curvature of the second reflection surface 623 is not 10 to 12 mm, shielding occurs at the first reflection surface 615 and a proper angle of view can not be secured. Therefore, it is most preferable that the radius of curvature of the second reflecting surface 623 is 11 mm in consideration of the manufacturing tolerance (± 1 mm).

7 is a side cross-sectional view of a lens module according to an embodiment of the present invention.

Referring to FIG. 7, the lens module 700 includes, in addition to the first lens 610 and the second lens 620 of the omnidirectional lens 400 of FIG. 6, a plurality of lenses (hereinafter referred to as the 'relay lens unit').

The relay lens unit includes a third lens 710 to a ninth lens 770 arranged in a line so that their optical axes coincide. For correction of chromatic aberration, some lenses of the third lens 710 to the ninth lens 770 may be formed of a crown-based material, and the remaining lenses may be formed of a flint-based material.

The relay lens unit may further include an infrared filter 780 of a shutter type. The infrared filter 780 may be closed to block infrared rays or opened to allow infrared rays to pass through. Disposing the infrared filter 780 at the rearmost portion of the relay lens unit is advantageous in that it is easier to manufacture than disposing it between the lenses of the relay lens unit.

FIG. 8 is a view showing paths of light incident on the lens module of FIG. 7 in all directions.

Referring to FIG. 8, light rays incident through the first incident surface 611 pass through the joined surfaces of the first exit surface 613 and the second incident surface 621 and are reflected by the second reflecting surface 623. The light reflected by the second reflecting surface 623 passes back through the joined surfaces, is reflected by the first reflecting surface 615, and is finally emitted through the second exit surface 625.

Thus, the lens module 700 can acquire a 360-degree omnidirectional image through the reflective-refractive (catadioptric) lens composed of the first lens 610 and the second lens 620.

FIG. 9 is a flowchart illustrating an image processing method for fire detection according to an embodiment of the present invention.

Referring to FIG. 9, the processor receives the original image through the omnidirectional camera (S910). As shown in FIG. 8, in which an image is formed on the photoelectric sensor through the omnidirectional lens module, the image captured by the omnidirectional camera is an annular image bounded by two concentric circles on the two-dimensional plane of the photoelectric sensor.

Next, the processor converts the annular original image into a rectangular plane image (S920). Specifically, the image of the 360-degree surroundings, which is distorted through the reflection and refraction of light, is transformed into a rectangular plane image so that it can be easily recognized.

Then, the coordinates of the photographed object can be analyzed (S930). Specifically, in the case of the fire monitoring apparatus 100 combined with the additional information apparatus 300, the additional information apparatus 300 captures an image of the surrounding object and its coordinates can be analyzed. More specifically, when the additional information apparatus 400 according to the first embodiment of FIG. 4 is used, step S930 may be the coordinate analysis of the measurement object by the coordinate analyzer 454.

Then, the distance can be measured (S940). Specifically, the additional information apparatus 400 can generate data on at least one of the size, distance, coordinates, and color of the measurement object from the analyzed coordinates. For example, if a fire has broken out nearby, the additional information apparatus 400 can generate precise and detailed data on the coordinates of the direction in which the fire occurred, the size of the fire, and the color of the fire.

Steps S930 and S940 described above are optional and can be omitted. They may also be performed in parallel with the following steps before reaching the fire determination step S980.

If the additional information apparatus 500 according to the second embodiment is used, step S930 may be the step of extracting the relative coordinates of the sensing object by the operation processor 540, and step S940 may be the step of calculating the relative position, movement path, and movement speed of the object by numerically analyzing the relative coordinate values. For example, if a fire has broken out nearby, the additional information apparatus 500 can generate precise and detailed information about the relative position of the fire, its motion, and its spreading path and speed.

Next, the processor performs pixel correction including up-sampling, which fills in the pixels lacking when the image is developed onto the rectangular plane in the flattening process, and down-sampling, which averages the multiple pieces of pixel information that must be contained in a single pixel (S950).

In addition, the processor may examine the pixels to resolve bad pixels and correct distorted color values (S960).

In addition, the processor may perform a sharpening process that enhances the edges of high-frequency regions, where the difference between pixel values is large, in order to improve the discriminability of the image (S970).
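A common way to realise such a sharpening step is unsharp masking; the sketch below uses it as a stand-in, since the patent does not name a specific sharpening filter.

```python
import cv2

def sharpen(image, amount=1.0, sigma=1.5):
    """Unsharp masking: subtract a blurred copy to boost the high-frequency
    detail where neighbouring pixel values differ strongly."""
    blurred = cv2.GaussianBlur(image, (0, 0), sigma)
    return cv2.addWeighted(image, 1.0 + amount, blurred, -amount, 0)
```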

The processor detects a fire using the corrected image (S980). Specifically, the processor can determine whether a fire has occurred in the photographed area based on the results of one or more algorithmic operations on the corrected image. Here, the processor can make a more accurate judgment based on the additional information received from the additional information apparatus 300.

FIG. 10 is a view showing an original image taken by the fire monitoring apparatus of FIG. 2 and a sector image obtained by dividing the original image.

Referring to FIG. 10(a), an example of an annular original image 1010 is shown. FIG. 10(b) shows the original image 1010 divided into N parts in the circumferential direction. The sector images 1020-1, 1020-2, ..., 1020-N of the N-divided original image 1010 each have a short arc corresponding to the inner concentric circle and a relatively long arc corresponding to the outer concentric circle. Here N is a positive integer, and in one example N may be a power of 2 (i.e., 2, 4, 8, 16, ...).

FIG. 11 is a diagram for explaining the conversion processing for flattening the sector image of FIG. 10.

Referring to FIG. 11 (a), an arc 1021 intersecting the middle of one sector image 1020 becomes a reference length. Specifically, the reference arc 1021 corresponds to the horizontal length of the converted plane image.

The partial area 1022 outside the reference arc 1021 has more pixels than are required in the plane image. Conversely, the partial area 1023 inside the reference arc 1021 has fewer pixels than are required in the plane image.

Accordingly, when the sector image 1020 is transformed into a plane image as shown in FIG. 11(b), interpolation is performed to fill the empty pixel space 1025 in the inner partial area 1023, and conversely, downsampling is performed in the outer partial area 1022 to merge a plurality of pixels into one pixel 1024. Interpolation can use a non-adaptive interpolation method (e.g., nearest-neighbor replication, bilinear interpolation, median interpolation) or an adaptive interpolation method (e.g., pattern-matching-based interpolation, threshold-based sensing interpolation). Downsampling may select one of the plurality of pixels, or use a value computed by a fixed formula such as the average or median value.
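A minimal sketch of these two operations on a single row of a sector image follows, using linear interpolation for up-sampling and block averaging for down-sampling (other choices listed above would work equally well).

```python
import numpy as np

def resample_row(row, target_len):
    """Resize one row of a sector image to the reference arc length: linear
    interpolation when the row is too short (inner arcs) and block averaging
    when it is too long (outer arcs)."""
    row = np.asarray(row, dtype=float)
    if len(row) < target_len:                                      # up-sampling (interpolation)
        x_new = np.linspace(0, len(row) - 1, target_len)
        return np.interp(x_new, np.arange(len(row)), row)
    edges = np.linspace(0, len(row), target_len + 1).astype(int)   # down-sampling (averaging)
    return np.array([row[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])
```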

3 is a block diagram showing a more detailed configuration of the fire monitoring apparatus of FIG.

3, the fire monitoring apparatus 100 includes an omnidirectional camera 110, a communication / power interface (PoE) 120, a processor 130, a temperature sensor 140, a smoke sensor 150, a speaker 160, And a beacon light 170, and a hub 20 and a network video recorder 30 are additionally shown for the sake of explanation. Here, the fire monitoring apparatus 100 may further include an additional information apparatus 300 as an optional configuration. Here, the omnidirectional camera 110, the communication / power source interface 120, and the processor 130 are identical in configuration and function to the omnidirectional camera 110, communication / power interface 120, and processor 130 of FIG. The description is omitted.

The photoelectric sensor 112 constituting the omnidirectional camera 110 may be the CMOS sensor 112. [ The electrical signal output from the CMOS sensor 112 can be converted into a digital image signal that is transmitted to the image signal processing unit (ISP) 131 and sampled the electrical signal.

The converted digital image signal (raw data) is transmitted to a digital signal processing unit (DSP) 132 to be processed so as to monitor omnidirectional observation and determine whether or not a fire has occurred. In detail, the digital signal processor 132 may perform downsampling to interpolate the missing pixel information in the process of flattening the toric original image, and to reduce the excess pixel information according to the size of the display image. In addition, the digital signal processor 132 may inspect the converted plane image to interpolate insufficient color information, perform color correction, and the like. Meanwhile, the digital signal processor 132 may convert the RGB color space into the YCbCr color space to perform the above-described image processing.

A microcomputer (MICOM) 133 processes and reports the signals received from the camera 110 and the sensors 140 and 150. For example, the microcomputer 133 can indicate where and when the panoramic image captured by the camera was taken. It can also determine whether a fire has occurred from the image processed by the DSP 132 and the signals detected by the temperature sensor 140 and the smoke sensor 150. When a predetermined condition is satisfied or an event occurs, the microcomputer 133 can control the output units 160 and 170 so that a corresponding output is generated.

The additional information device 300 may obtain additional information about surrounding objects. Specifically, the additional information device 300 can acquire information such as the distance, movement, position, coordinates, and color of objects around the fire monitoring apparatus 100. The additional information device 300 can then transmit the obtained distance and/or coordinates of an object to the DSP 132. In this case, the DSP 132 may combine the additional information received from the additional information device 300 with the image data and transmit it to the microcomputer 133 as information for determining whether a fire has occurred. The specific configuration and function of the additional information device 300 will be described later with reference to FIGS. 4 and 5.

The input / output unit 134 provides an interface for inputting / outputting information to / from an external device. Although not shown, the input / output unit 134 may include a receiver for receiving a signal of a remote control, a wireless communication module such as Wi-Fi, and a wired communication module such as a USB.

The temperature sensor 140 senses the temperature of the air around the fire monitoring apparatus 100. Using the temperature sensor 140, the fire monitoring apparatus 100 can collect information on the rise in air temperature that occurs in the event of a fire.

The smoke sensor 150 senses the smoke concentration of the air around the fire monitoring apparatus 100. Specifically, the smoke sensor 150 senses an increased smoke concentration in the air and can provide information on the changed composition of the air during a fire.

The speaker 160 outputs sound. Specifically, the speaker 160 may output an alarm sound for notifying occurrence of a fire.

The beacon light 170 emits light. Specifically, the beacon light 170 can output light for notifying the occurrence of a fire.

The communication/power interface 120 may be implemented with PoE (Power over Ethernet) technology. Specifically, the hub 20 can supply power (for example, power backed by a UPS) along with routing the data communication, and the communication/power interface 120 can receive power together with data through its PoE socket from the Ethernet cable connected to the hub 20.

As described above, the fire monitoring apparatus 100 captures an omnidirectional image through a single photographing module and directly determines whether a fire has occurred. Thus, it can secure economy of cost and installation space as well as reliability of fire detection.

FIG. 4 is a block diagram showing a configuration of the additional information device of FIG. 3 according to the first embodiment.

Referring to FIG. 4, the additional information device 400 according to the first embodiment includes a first laser module 410, a second laser module 420, a camera module 430, a tilting module 440, and a control module 450.

The first laser module 410 includes a first laser generator 411, a first frequency modulator 412, a first plane optical lens 413, and a first plane optical receiver 414.

The first laser generation unit 411 emits a laser beam.

The first frequency modulator 412 modulates the frequency of the laser beam emitted by the first laser generator 411.

The first plane optical lens 413 converts the laser beam of the frequency modulated by the first frequency modulator 412 into plane light.

The first plane light receiving unit 414 receives the reflected light produced when the laser beam, emitted as plane light through the first plane optical lens 413, is reflected by the measurement object.

The second laser module 420 includes a second laser generator 421, a second frequency modulator 422, a second plane optical lens 423, and a second plane light receiving unit 424.

The second laser generation unit 421 emits a laser beam.

The second frequency modulating section 422 modulates the frequency of the laser beam emitted by the second laser generating section 421.

The second plane optical lens 423 converts the laser beam of the frequency modulated by the second frequency modulator 422 into plane light.

The second plane light receiving unit 424 receives the reflected light produced when the laser beam, emitted as plane light through the second plane optical lens 423, is reflected by the measurement object.

In the first laser module 410 and the second laser module 420, the first plane optical lens 413 and the second plane optical lens 423 are designed to convert the laser beams emitted by the first laser generator 411 and the second laser generator 421 into wide plane light spread over a range of 60 to 180 degrees.

The first frequency modulator 412 and the second frequency modulator 422 modulate the laser beams into mutually different frequency bands.

The first plane light receiving unit 414 and the second plane light receiving unit 424 may each include a filter unit (not shown) so as to receive only the frequency band provided by the first frequency modulator 412 and the second frequency modulator 422, respectively.

The camera module 430 is located between the first laser module 410 and the second laser module 420 and reads an optical form that can not be detected by a laser, such as color or planar pattern of a measurement object.

The tilting module 440 is connected to the first laser module 410 and the second laser module 420 and tilts when the laser beam is emitted to help measurement of the measurement object.

The control module 450 includes a simultaneous emission adjustment unit 451, a first distance calculation unit 452, a second distance calculation unit 453, and a coordinate analysis unit 454.

The simultaneous emission control unit 451 simultaneously emits the laser beam emitted from the first laser generation unit 411 and the second laser generation unit 421.

The first distance calculation section 452 measures the reception frequency of the first plane light reception section 414 and calculates the distance to the measurement object.

The second distance calculation unit 453 measures the reception frequency of the second plane light reception unit 424 and calculates the distance to the measurement target.

The first distance calculator 452 and the second distance calculator 453 calculate the distance by applying a Doppler effect to measure the reflection distance of the laser beam reflected by the measurement object.

The coordinate analyzer 454 obtains the distances calculated by the first distance calculator 452 and the second distance calculator 453 to analyze the X and Y coordinates of the measurement object.
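The patent does not spell out the distance and coordinate formulas, so the following is only a hedged sketch: it assumes an FMCW-style beat-frequency range measurement for the frequency-modulated laser and a simple two-receiver triangulation with a hypothetical baseline between the two laser modules; all numeric values are illustrative.

```python
import math

C = 3.0e8  # speed of light (m/s)

def fmcw_range(beat_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    """Range from the beat frequency of a frequency-swept (FMCW-style) laser."""
    return C * beat_hz * sweep_time_s / (2.0 * sweep_bw_hz)

def xy_from_two_ranges(d1: float, d2: float, baseline: float):
    """X/Y of the measurement object, with module 1 at (0, 0) and module 2 at (baseline, 0)."""
    x = (d1 ** 2 - d2 ** 2 + baseline ** 2) / (2.0 * baseline)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))   # clamp noise-induced negatives
    return x, y

# Illustrative numbers only
r = fmcw_range(beat_hz=2.0e5, sweep_bw_hz=1.0e9, sweep_time_s=1.0e-3)  # about 30 m
x, y = xy_from_two_ranges(d1=5.0, d2=4.8, baseline=1.0)                # object near (1.48, 4.78)
```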

The control module 450 determines the size of the measurement object from the coordinates analyzed by the coordinate analysis unit 454, and, by also analyzing the color information received from the camera module 430, produces distance, coordinate, color, and size data for the measurement object.

In addition, the control module 450 can change the color of the measurement target read from the camera module 430 to a different color according to the user's selection.

The DSP 132 receives the distance, coordinate, color, and size data of the measurement object processed by the control module 450, and transmits the data to the microcomputer 133, which determines whether a fire has occurred, as additional information for identifying a fire more accurately.

FIG. 5 is a block diagram showing a configuration of the additional information device of FIG. 3 according to the second embodiment.

Referring to FIG. 5, plane light emitted through the light emitting unit 510 is irradiated onto the sensing object 530, the reflected light from the sensing object 530 is received through the light receiving unit 520, and the plurality of photons present in the reflected light are electrically measured using the charge coupled device 523 provided in the light receiving unit 520. The additional information device 500, which generates object sensing image data on this basis, includes a light emitting unit 510, a light receiving unit 520, a charge coupled device 523, an operation processor 540, an optical measuring unit 550, an input/output unit 560, and a controller 570.

The light emitting unit 510 includes a planar light converter 511 that converts the laser beam into plane light. The plane light emitted through the planar light converter 511 is transmitted to the first optical lens 512, diffused, and then irradiated onto the sensing object 530.

The light receiving unit 520 passes the reflected light returning from the sensing object 530, onto which the diffused plane light was irradiated, through the second and third optical lenses 521 and 522, and transmits the resulting diffused reflected light to the charge coupled device 523.

The irradiation angle of the light emitting portion 510 can be adjusted within 0 to 90 degrees and the receiving angle of the light receiving portion 520 can be adjusted within 0 to 90 degrees.

For example, when the irradiation angle of the light emitting unit 510 is fixed to a value selected within 0 to 60 degrees, the receiving angle of the light receiving unit 520 may be fixed to a value selected within 0 to 5 degrees. In another example, when the irradiation angle of the light emitting unit 510 is fixed to a value within 0 to 5 degrees, the receiving angle of the light receiving unit 520 may be fixed to a value within 0 to 60 degrees.

Note that the light emitting portion 510 and the light receiving portion 520 are present on the same vertical plane.

The first optical lens 512 is a concave lens, the second optical lens 521 is a convex lens, and the third optical lens 522 is another concave lens, which is considered one of the important points of the present invention. The first optical lens 512 is provided in the light emitting unit 510, and the second and third optical lenses 521 and 522 are provided in the light receiving unit 520.

In practice, an F-theta lens, widely used among those skilled in the art, may be employed as the first optical lens 512 mounted on the light emitting unit 510.

The F-theta lens is designed to receive the laser beam from the light emitting unit 510 without loss and to maximize the viewing angle when emitting the plane light. The F-theta lens can be filter-coated so that it detects only a specific wavelength (in the present invention, a wavelength range of 150 nm to 10.6 μm corresponding to the laser beam).

In mounting the second and third optical lenses 521 and 522 on the light receiving unit 520, an MLA (Micro Lens Array) lens widely used among those skilled in the art is exemplified as the second optical lens 521, and a focusing lens is exemplified as the third optical lens 522.

The MLA (Micro Lens Array) lens is a cylindrical lens designed to uniformly converge the reflected light that returns from the sensing object 530 when the object is irradiated with the plane light diffused by the first optical lens 512; it is one lens model of the present invention that redirects the reflected light horizontally and transmits it to the focusing lens.

The focusing lens transmits the reflected light received from the MLA lens to the charge coupled device 523.

A charge coupled device (CCD) 523 according to an embodiment of the present invention generates object sensing image data by electrically measuring the plurality of photons present in the diffused reflected light.

The charge coupled device 523 is also referred to as a charge coupled detector, a charge transfer device, or a charge control device. It is a semiconductor integrated circuit device in which charges injected into the semiconductor region immediately below a minute electrode can be moved under an adjacent electrode, or taken out to the outside, by applying pulse voltages to a group of electrodes, making use of the property that charge flows toward the electrode to which the higher control voltage is applied. Such devices are used, for example, in delay circuits and memory circuits.

The charge coupled device 523 is formed by forming an insulating layer having a thickness of about 0.1 mm on the surface of an n-type semiconductor substrate and arranging metal electrodes on it; by controlling the voltages of the metal electrodes, the accumulated charges can be transferred sequentially.

The charge coupled device 523 has two functions, storage by accumulation of charge and transfer of charge, and can store and transmit an analog quantity according to the amount of charge; for this reason it is widely used as an image sensor.

The operation processor 540 numerically arranges the object sensing image data to extract a plurality of relative coordinate values of the sensing object 530, and then numerically analyzes these relative coordinate values to calculate the relative position, movement path, and movement speed of the sensing object 530.

Various numerical analysis methods can be used to find the position of the sensing object from the coordinate values. Accordingly, the numerical analysis applied to the present invention may use methods such as interpolation, LU decomposition, the CDG (Characteristic-Dissipative-Galerkin) method, and the MDS (Multi Dimensional Scaling) technique.
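As one example of the numerical analysis options listed above, the sketch below applies classical MDS (Multi Dimensional Scaling) to a pairwise distance matrix to recover relative coordinates; the distance values are synthetic, and the choice of classical MDS over the other listed methods is an assumption made only for illustration.

```python
import numpy as np

def classical_mds(dist: np.ndarray, dims: int = 2) -> np.ndarray:
    """Classical MDS: recover relative coordinates from a pairwise distance matrix."""
    n = dist.shape[0]
    d2 = dist ** 2
    j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    b = -0.5 * j @ d2 @ j                        # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:dims]        # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Pairwise distances between three sensed points (illustrative values)
d = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.5],
              [2.0, 1.5, 0.0]])
coords = classical_mds(d)   # relative 2-D coordinates of the sensed points
```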

The optical measuring device 550 detects the instantaneous rate of change of the sensing object 530 by measuring the pixel change with respect to the sensing target object on the basis of a plurality of photon moving states in real time.
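The exact definition of the instantaneous rate of change is not given, so the following sketch assumes one simple interpretation: the mean absolute per-pixel change between two consecutive grayscale frames divided by the frame interval.

```python
import numpy as np

def instantaneous_change_rate(prev: np.ndarray, curr: np.ndarray, dt_s: float) -> float:
    """Mean absolute per-pixel change per second between two grayscale frames."""
    diff = np.abs(curr.astype(np.float32) - prev.astype(np.float32))
    return float(diff.mean() / dt_s)

prev = np.zeros((4, 4), dtype=np.uint8)
curr = np.full((4, 4), 10, dtype=np.uint8)
rate = instantaneous_change_rate(prev, curr, dt_s=1.0 / 30.0)  # about 300 levels/s here
```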

The input / output unit 560 outputs the object detection data, the relative position, the movement path, the moving speed, and the instantaneous change rate to the outside.

The controller 570 collectively controls the signal processing of the arithmetic processor 540, the optical metering unit 550 and the input / output unit 560.

The DSP 132 receives the object sensing data, relative position, movement path, movement speed, instantaneous rate of change, and the like output by the input/output unit 560, and transmits this information to the microcomputer 133 as additional information for determining whether a fire has occurred.

FIG. 6 is a side cross-sectional view of the omnidirectional lens according to an embodiment of the present invention.

Referring to FIG. 6 (a), the omnidirectional lens 400 is composed of a first lens 610 and a second lens 620. Specifically, the omnidirectional lens 400 includes the first lens 610, which has a convex first incident surface 611 formed on one surface, a first exit surface 613 formed on the other surface, and a first reflecting surface 615 formed at the center of the first incident surface 611, and the second lens 620, which has a second incident surface 621 formed on one surface, a convex second reflecting surface 623 formed on the other surface, and a second exit surface 625 formed at the center of the second reflecting surface 623.

FIG. 6 (b) shows a form in which the first lens 610 and the second lens 620 are joined.

The first lens 610 is a lens to which an external light beam is incident. The first incident surface 611 is convex on one surface and an external light ray is incident on the first incident surface 611. A first exit surface 613 is formed on the other surface of the first lens 610 so that light rays incident on the first incident surface 611 are emitted to the first exit surface 613. In addition, a first reflecting surface 615 is formed at the center of the first incident surface 611.

The second lens 620 is a lens that is joined to the first lens 610. Specifically, a second incident surface 621 is formed on one surface of the second lens 620 and is joined to the first exit surface 613 of the first lens 610. A convex second reflecting surface 623 is formed on the other surface of the second lens 620, and a second exit surface 625 is formed at the center of the second reflecting surface 623.

Here, it is preferable that the first reflecting surface 615 and the second reflecting surface 623 are coated with a material capable of reflecting visible light, such as aluminum or silver. In addition, although the first reflecting surface 615 and the second exit surface 625 are shown flat in the drawing, this is only one embodiment; they may also be convex or concave.

In the case of the omnidirectional lens module mounted on the surveillance camera, the radius of curvature of the first exit surface 613 and the second incident surface 621 is preferably 30 mm, which covers 29 mm to 31 mm when the manufacturing tolerance (±1 mm) is taken into account. When the radius of curvature of the first exit surface 613 and the second incident surface 621 is larger than 31 mm, the second lens 620 becomes larger; when it is smaller than 29 mm, manufacturing becomes difficult.

The radius of curvature of the first incident surface 611 is preferably 21 to 23 mm. With a radius of curvature of 21 to 23 mm for the first incident surface 611, a field of view in the range of 40° to 85° in the vertical direction can be secured.

When the radius of curvature of the first incident surface 611 is outside the range of 21 to 23 mm, shielding occurs at the second exit surface 625 and a proper angle of view cannot be secured. Specifically, outside this range some of the incident light rays cannot be output through the second exit surface 625, so the intended angle of view cannot be obtained. Therefore, considering the manufacturing tolerance (±1 mm), it is most preferable to set the radius of curvature of the first incident surface 611 to 22 mm.

The radius of curvature of the second reflecting surface 623 is preferably 10 to 12 mm. If the radius of curvature of the second reflection surface 623 is not 10 to 12 mm, shielding occurs at the first reflection surface 615 and a proper angle of view can not be secured. Therefore, it is most preferable that the radius of curvature of the second reflecting surface 623 is 11 mm in consideration of the manufacturing tolerance (± 1 mm).

FIG. 7 is a side cross-sectional view of a lens module according to an embodiment of the present invention.

Referring to FIG. 7, the lens module 700 includes, in addition to the first lens 610 and the second lens 620 of the omnidirectional lens 400 of FIG. 6, a plurality of lenses that relay the image (hereinafter referred to as the 'relay lens unit').

The relay lens unit includes a third lens 710 to a ninth lens 770 arranged in a line so that their optical axes coincide. For correction of chromatic aberration, some lenses of the third lens 710 to the ninth lens 770 may be formed of a crown-based material, and the remaining lenses may be formed of a flint-based material.

The relay lens unit may further include a shutter-type infrared filter 780. The infrared filter 780 may be closed to block infrared rays or opened to allow infrared rays to pass. Disposing the infrared filter 780 at the rearmost portion of the relay lens unit is advantageous in that it is easier to manufacture than disposing it between the lenses of the relay lens unit.

FIG. 8 is a view showing paths of light incident on the lens module of FIG. 7 in all directions.

Referring to FIG. 8, light rays incident through the first incident surface 611 pass through the joint surface of the first exit surface 613 and the second incident surface 621 and are reflected by the second reflecting surface 623. The rays reflected by the second reflecting surface 623 pass back through the joint surface of the first exit surface 613 and the second incident surface 621, are reflected by the first reflecting surface 615, and are then emitted through the second exit surface 625.

Thus, the lens module 700 can acquire a 360-degree omnidirectional image through the reflective-refractive (catadioptric) lens comprising the first lens 610 and the second lens 620.

FIG. 9 is a flowchart illustrating an image processing method for fire detection according to an embodiment of the present invention.

Referring to FIG. 9, the processor receives the original image through the omnidirectional camera (S910). As shown in FIG. 8, in which an image is formed on the photoelectric sensor through the omnidirectional lens module, the image captured by the omnidirectional camera is an annular image bounded by two concentric circles on the two-dimensional plane of the photoelectric sensor substrate.

Next, the processor converts the annular original image into a rectangular plane image (S920). Specifically, the image of the 360-degree foreground, which is distorted by the reflection and refraction of light, is converted into a rectangular plane image so that it can be easily recognized.

Then, the coordinates of a photographed object can be analyzed (S930). Specifically, in the case of the fire monitoring apparatus 100 combined with the additional information device 300, the additional information device 300 captures information about surrounding objects, and the coordinates of a photographed object can be analyzed. More specifically, when the additional information device 400 according to the first embodiment of FIG. 4 is used, step S930 may be the coordinate analysis of the measurement object by the coordinate analysis unit 454.

Then, the distance can be measured (S940). Specifically, the additional information device 400 can generate data on at least one of the size, distance, coordinates, and hue of the measurement object from the coordinates of the analyzed measurement object. For example, if an ambient fire has occurred, the additional information device 400 may generate precise and detailed data on the coordinates of the direction in which the fire occurred, the size of the fire, and the color of the fire.

The above steps S930 and S940 are optional steps and can be omitted. Also, this step may be performed in parallel with the following steps to reach the step S980 of judging the fire.

If the additional information device 500 according to the second embodiment is used, step S930 may be a step of extracting the relative coordinates of the sensing object by the operation processor 540, and step S940 may be a step of calculating the relative position, movement path, and movement speed of the object by numerically analyzing the relative coordinate values. For example, if a fire has occurred nearby, the additional information device 500 may generate precise and detailed information about the relative position of the fire, the motion of the fire, and the spread path and speed of the fire.

Next, the processor performs pixel correction including up-sampling, which fills in the pixels lacking when the image is developed onto the rectangular plane during planarization, and down-sampling, which averages the plural pixel values to be contained in a single pixel (S950).

In addition, the processor may further examine the pixels to correct bad pixels and distorted color values (S960).

In addition, the processor may perform a sharpening process for enhancing the edge of the high-frequency region where the difference between the values of the pixels is large, in order to enhance the image discrimination power (S970).

The processor detects a fire using the corrected image (S980). Specifically, the processor can determine whether a fire has occurred in the photographed region from the corrected image, based on the results of one or more algorithmic operations. Here, the processor can make a more accurate judgment based on the additional information received from the additional information device 300.

FIG. 10 is a view showing an original image taken by the fire monitoring apparatus of FIG. 2 and a sector image obtained by dividing the original image.

Referring to FIG. 10 (a), an example of an original image 1010 in an annular shape is shown. FIG. 10 (b) shows a state in which the original image 1010 is divided into N parts in the circumferential direction. The sector images 1020-1, 1020-2, ..., 1020-N of the N-divided original image 1010 have a short arc corresponding to the inner concentric circle and a relatively long arc corresponding to the outer concentric circle. Here, N is a positive integer, and in one example N may be a power of 2 (i.e., 2, 4, 8, 16, ...).

FIG. 11 is a diagram for explaining the conversion processing for flattening the sector image of FIG. 10.

Referring to FIG. 11 (a), an arc 1021 intersecting the middle of one sector image 1020 becomes a reference length. Specifically, the reference arc 1021 corresponds to the horizontal length of the converted plane image.

The partial area 1022 corresponding to the outer side around the reference arc 1021 exceeds the number of pixels required in the plane image. On the contrary, the partial area 1023 corresponding to the inside with respect to the reference arc 1021 is smaller than the number of pixels required for the plane image.

Accordingly, when the sector image 1020 is transformed into a planar image as shown in FIG. 11 (b), interpolation is performed to fill the empty pixel space 1025 in the partial area 1023 corresponding to the inner side. Conversely, downsampling is performed in the partial area 1022 corresponding to the outer side to merge a plurality of pixels into one pixel 1024. For interpolation, a non-adaptive method (e.g., nearest-neighbor replication, bilinear interpolation, median interpolation) or an adaptive method (e.g., pattern-matching-based interpolation, threshold-based edge-sensing interpolation) can be used. Downsampling may select one of the plural pixels, or use a value computed by a fixed formula such as the average or the median.
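As a concrete illustration of the averaging-based downsampling and the interpolation described above, the sketch below shrinks a row of outer-arc pixels to the reference arc length by block averaging and stretches a row of inner-arc pixels by linear interpolation; the row lengths are assumptions chosen only for demonstration.

```python
import numpy as np

def downsample_row(pixels: np.ndarray, out_len: int, mode: str = "mean") -> np.ndarray:
    """Shrink a 1-D row of pixel values to out_len samples.

    'mean' averages the source pixels that fall into each output cell;
    'nearest' keeps one source pixel per output cell.
    """
    src = pixels.astype(np.float32)
    edges = np.linspace(0, len(src), out_len + 1)
    out = np.empty(out_len, dtype=np.float32)
    for i in range(out_len):
        lo = int(edges[i])
        hi = max(lo + 1, int(np.ceil(edges[i + 1])))
        block = src[lo:hi]
        out[i] = block.mean() if mode == "mean" else block[0]
    return out

outer_arc = np.arange(12, dtype=np.float32)                    # 12 pixels on an outer arc
reduced = downsample_row(outer_arc, out_len=8)                 # averaged down to 8 pixels

inner_arc = np.arange(5, dtype=np.float32)                     # 5 pixels on an inner arc
stretched = np.interp(np.linspace(0, 4, 8), np.arange(5), inner_arc)  # linearly stretched to 8
```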

FIG. 12 is a diagram for explaining a polar coordinate transformation method for transforming the original image of FIG. 10 into a plane image.

Referring to FIG. 12 (a), an original image 1010 having a width of 2R and a height of 2R is shown on an orthogonal coordinate system.

The original image 1010 is divided into N sector images by line segments 1210-1, 1210-2, ..., 1210-N.

Referring to FIG. 12 (b), a planar image 1240 obtained by flattening the annular original image 1010 is shown. The planar image 1240 has a width of 4R and a height of R. FIG. 12 (b) also shows the positions on the plane image 1240 to which the line segments 1210-1, 1210-2, ..., 1210-N of the original image 1010 are mapped.

Planarization for converting the annular original image 1010 into the rectangular plane image 1240 employs the concept of polar coordinate transformation. Specifically, the position of an arbitrary pixel 1230 of the original image 1010 can be expressed as (xi, yi) in an orthogonal coordinate system and as (θi, ri) in a polar coordinate system. The arc direction of the original image 1010 corresponds to the horizontal direction of the planar image 1240, and the radial direction of the original image 1010 corresponds to the vertical direction of the planar image 1240. Therefore, the position on the planar image can be calculated by the following Equation (1).

[Equation (1): the polar coordinate transformation relating the rectangular coordinates (xi, yi) of the original image to the polar coordinates (θi, ri) and to the plane-image coordinates (xf, yf), using the quantities defined below]

Here, θi and ri are the angle (in degrees) and the radius when the pixel position of the original image 1010 is expressed in the polar coordinate system; xi and yi are the x-axis and y-axis coordinate values when the pixel position of the original image 1010 is expressed in the rectangular coordinate system; cx and cy are the x-axis and y-axis coordinate values of the center of the annulus; R is the radius of the outer concentric circle measured from that center; and xf and yf are the x-axis and y-axis coordinate values on the planar image 1240 corresponding to xi and yi.

FIG. 12 (b) also shows the position (xf, yf) on the plane image 1240 of an arbitrary pixel 1230 transformed according to Equation (1) above.
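A minimal sketch of the polar-to-rectangular unwrapping of Equation (1) is shown below; it assumes the standard mapping (angle to the horizontal axis, radius to the vertical axis) with nearest-neighbor sampling, and the center, radii, and output size are illustrative values only (the exact scaling used in the patent may differ).

```python
import numpy as np

def unwrap_annulus(original: np.ndarray, cx: float, cy: float,
                   r_inner: float, r_outer: float,
                   out_w: int, out_h: int) -> np.ndarray:
    """Flatten an annular image into a rectangle (nearest-neighbor sampling).

    Each output column corresponds to an angle theta, each output row to a
    radius between r_inner and r_outer, following the polar mapping idea of Equation (1).
    """
    out = np.zeros((out_h, out_w), dtype=original.dtype)
    thetas = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radii = np.linspace(r_outer, r_inner, out_h)       # top row = outer circle
    for yf, r in enumerate(radii):
        xi = (cx + r * np.cos(thetas)).round().astype(int)
        yi = (cy + r * np.sin(thetas)).round().astype(int)
        xi = np.clip(xi, 0, original.shape[1] - 1)
        yi = np.clip(yi, 0, original.shape[0] - 1)
        out[yf, :] = original[yi, xi]
    return out

ring = np.random.randint(0, 256, (200, 200), dtype=np.uint8)   # dummy 2R x 2R annular frame
flat = unwrap_annulus(ring, cx=100, cy=100, r_inner=40, r_outer=100, out_w=400, out_h=100)
```

With a 2R x 2R input and an output of 4R x R, the shapes match the plane image proportions described for FIG. 12 (b).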

FIG. 13 is a flowchart for determining whether a fire has occurred from an image, according to an embodiment of the present invention.

Referring to FIG. 13, a sequence of images processed through planarization, pixel correction, and the like is received (S1310). Specifically, in order to identify whether a fire has occurred from the image, the DSP flattens the annular original image photographed in all directions and applies pixel correction, color correction, and so on, and the processor receives the resulting corrected plane image frames in time sequence.

Then, the coordinates of a photographed object can be analyzed (S1320). Specifically, in the case of the fire monitoring apparatus 100 combined with the additional information device 300, the additional information device 300 captures information about surrounding objects, and the coordinates of a photographed object can be analyzed. This step corresponds to step S930 of FIG. 9; apart from the procedure for acquiring more detailed information about the object, the contents are the same, so redundant explanation is omitted.

Next, the distance can be measured (S1330). Specifically, the additional information device 300 can generate data on at least one of the size, distance, coordinates, and hue of the measurement object from the coordinates of the analyzed measurement object. This step also corresponds to step S940 in Fig. 9, and redundant description will be omitted.

The above steps S1320 and S1330 are optional and can be omitted. If these steps are included, the processor can base its determination of whether a fire has occurred on more accurate information.

Next, flame candidate pixels are detected based on the brightness and color of the received image (S1340). Specifically, the processor can detect a region in which motion exists in the received plane image; for example, the processor can detect an area of the image that differs from the fixed background. The processor can then analyze how brightness is distributed within the moving region. For example, the processor can detect flame candidate pixels, which are likely to be flame, by determining whether the brightness of the central portion of the moving region is greater than a predetermined depth value and whether its difference from the brightness of the peripheral portion is equal to or greater than a predetermined contrast value. More specifically, in the case of a flame, the brightness of the core is relatively higher than that of the periphery, and when contours are drawn along lines of equal brightness, a plurality of non-intersecting closed curves are nested. Using this feature of flame, the flame candidate region can be detected by comparison against the predetermined depth value and the preset contrast value. In addition, a flame candidate region can be detected from the change of color between the central and peripheral portions of the flame: a flame is bright white in its deepest part and, moving outward, yellow, orange, and red in that order.
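The depth and contrast thresholds are not specified in the text, so the following sketch uses assumed values (DEPTH_THRESH, CONTRAST_THRESH) and a simple 5x5 box mean as the surrounding brightness; it marks moving pixels whose core brightness and local contrast match the flame characteristics described above.

```python
import numpy as np

DEPTH_THRESH = 200      # assumed minimum brightness of a flame core (0-255 luma)
CONTRAST_THRESH = 60    # assumed minimum core-to-surround brightness difference

def flame_candidates(luma: np.ndarray, motion_mask: np.ndarray) -> np.ndarray:
    """Mark moving pixels whose core brightness and local contrast suggest flame."""
    # surrounding brightness: mean of a 5x5 neighborhood (simple box filter)
    pad = np.pad(luma.astype(np.float32), 2, mode="edge")
    surround = np.zeros_like(luma, dtype=np.float32)
    for dy in range(5):
        for dx in range(5):
            surround += pad[dy:dy + luma.shape[0], dx:dx + luma.shape[1]]
    surround /= 25.0
    bright = luma.astype(np.float32) >= DEPTH_THRESH
    contrast = (luma.astype(np.float32) - surround) >= CONTRAST_THRESH
    return motion_mask & bright & contrast

luma = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
moving = np.ones((120, 160), dtype=bool)            # stand-in for a motion mask
candidates = flame_candidates(luma, moving)
```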

Next, an adaptive background difference image is obtained (S1350). Background subtraction refers to separating and removing the fixed background in order to extract moving objects from a time series of images. The adaptive background subtraction algorithm separates the background while adapting relatively quickly to environmental changes such as lighting changes, swaying trees, and the appearance of new objects. Using the adaptive background subtraction algorithm, the processor can acquire an image in which the background has been removed and the moving flame remains.
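The patent does not name a specific adaptive background algorithm; the sketch below uses OpenCV's MOG2 Gaussian-mixture background subtractor as one well-known adaptive choice, with illustrative parameter values.

```python
import cv2
import numpy as np

# MOG2 adapts its Gaussian-mixture background model to gradual scene changes
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16, detectShadows=False)

def moving_foreground(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a binary mask of pixels that differ from the adaptive background."""
    fg = subtractor.apply(frame_bgr)          # 0 = background, 255 = foreground
    return fg > 0

frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
mask = moving_foreground(frame)
```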

In addition, the flame region can be extracted using color (S1360). Specifically, there may be situations in which it is difficult to accurately extract only the region corresponding to the flame from the presence or absence of motion alone, so the color of the flame can be used to detect the corresponding pixel region more accurately. The color of a flame may vary depending on the burning material, temperature, air, and camera shooting environment. A Gaussian Mixture Model (GMM) can be used to model the colors of these various flames. The model can be learned with the EM algorithm for the GMM, which iterates over the accumulated image data and newly arriving data, computing expected values (E-step) and re-estimating the parameters (M-step).
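As an illustration of the GMM color model fitted with the EM algorithm, the sketch below trains scikit-learn's GaussianMixture (which runs EM internally) on synthetic flame-colored pixels in YCbCr; the training data and the log-likelihood threshold are assumptions made only for demonstration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic "flame-colored" training pixels in YCbCr (illustration only)
rng = np.random.default_rng(0)
flame_pixels = np.column_stack([
    rng.normal(220, 15, 500),   # Y: bright core
    rng.normal(110, 10, 500),   # Cb
    rng.normal(160, 12, 500),   # Cr: reddish
])

# GaussianMixture is fitted with the EM algorithm internally
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(flame_pixels)

def flame_color_mask(ycbcr: np.ndarray, log_lik_thresh: float = -20.0) -> np.ndarray:
    """Mark pixels whose color is likely under the learned flame color model."""
    flat = ycbcr.reshape(-1, 3).astype(np.float64)
    scores = gmm.score_samples(flat)          # per-pixel log-likelihood
    return (scores >= log_lik_thresh).reshape(ycbcr.shape[:2])

test = np.random.randint(0, 256, (60, 80, 3), dtype=np.uint8)
mask = flame_color_mask(test)
```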

Next, the dynamic texture of the flame is analyzed (S1370). Specifically, since a flame moves stochastically depending on the shape of the burning material and the wind direction, the dynamic texture of the image sequence corresponding to the change in flame motion has characteristics unique to flame. As an example, the Volume Local Binary Patterns (VLBP) technique can be used to analyze the dynamic texture characteristics of the extracted flame region images.
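The full VLBP descriptor uses a larger neighbor configuration; the sketch below is a reduced illustration that compares each pixel with four spatial neighbors and the co-located pixels of the previous and next frames, and histograms the resulting 6-bit codes.

```python
import numpy as np

def simple_vlbp_hist(prev: np.ndarray, curr: np.ndarray, nxt: np.ndarray) -> np.ndarray:
    """Simplified volume LBP over three frames: 4 spatial neighbors plus the
    co-located pixels in the previous/next frames give a 6-bit code per pixel,
    whose normalized 64-bin histogram serves as a dynamic texture descriptor."""
    c = curr.astype(np.int16)[1:-1, 1:-1]
    neighbors = [
        curr.astype(np.int16)[:-2, 1:-1],   # up
        curr.astype(np.int16)[2:, 1:-1],    # down
        curr.astype(np.int16)[1:-1, :-2],   # left
        curr.astype(np.int16)[1:-1, 2:],    # right
        prev.astype(np.int16)[1:-1, 1:-1],  # same pixel, previous frame
        nxt.astype(np.int16)[1:-1, 1:-1],   # same pixel, next frame
    ]
    codes = np.zeros_like(c, dtype=np.int32)
    for bit, nb in enumerate(neighbors):
        codes |= ((nb >= c).astype(np.int32) << bit)
    hist, _ = np.histogram(codes, bins=64, range=(0, 64))
    return hist / max(hist.sum(), 1)

frames = [np.random.randint(0, 256, (32, 32), dtype=np.uint8) for _ in range(3)]
descriptor = simple_vlbp_hist(*frames)
```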

Finally, it is determined whether a fire has occurred (S1380). Specifically, the dynamic texture properties of flame images extracted using the VLBP technique are learned by a machine learning algorithm. The machine learning algorithm then judges whether the dynamic texture features of the current input can be classified as fire. In one embodiment, a k-Nearest Neighbor (k-NN) algorithm may be used to classify the dynamic texture features. In this case, the performance of the k-NN depends on the distance formula used in the algorithm, so a design suitable for the actual implementation is required. In particular, since LBP features are usually used in histogram form rather than directly, it is preferable to determine experimentally a histogram matching method suitable for the actual implementation.
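The sketch below illustrates k-NN classification of dynamic-texture histograms with a histogram-intersection distance, one possible histogram matching choice; the training descriptors and labels are synthetic and serve only as an assumption for demonstration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def hist_intersection_distance(h1: np.ndarray, h2: np.ndarray) -> float:
    """1 - histogram intersection: 0 for identical normalized histograms."""
    return 1.0 - np.minimum(h1, h2).sum()

# Synthetic training descriptors: label 1 = fire-like texture, 0 = non-fire
rng = np.random.default_rng(1)
X_train = rng.dirichlet(np.ones(64), size=40)      # 40 normalized 64-bin histograms
y_train = rng.integers(0, 2, size=40)

knn = KNeighborsClassifier(n_neighbors=3, metric=hist_intersection_distance)
knn.fit(X_train, y_train)

query = rng.dirichlet(np.ones(64))
is_fire = bool(knn.predict(query.reshape(1, -1))[0])
```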

Such a fire detection method can detect a fire quickly and accurately without complicated large-capacity operation processing.

In addition, the fire detection method according to an embodiment of the present invention can be implemented in the fire monitoring apparatus described above with reference to the drawings. Further, the above-described fire detection method may be embodied as at least one program for executing the fire detection method, and the program may be stored in a computer-readable recording medium.

Thus, each block of the present invention may be embodied as computer-readable code on a computer-readable recording medium. The computer-readable recording medium may be any device capable of storing data that can be read by a computer system.

For example, the computer-readable recording medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical disk, an optical data storage device, or the like. In addition, the computer-readable code may be embodied as a computer data signal in a carrier wave.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is clearly understood that the same is by way of illustration and example only and is not to be construed as limiting the scope of the invention as defined by the appended claims. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention.

100: fire monitoring apparatus 110: omnidirectional camera
111: Lens module 112: Photoelectric sensor
120: communication / power interface 130: processor

Claims (7)

In a fire monitoring apparatus,
a lens module including an aspherical lens that focuses light incident from a 360-degree omnidirectional foreground, and a CMOS sensor for converting the focused light into an electrical signal;
a PoE socket for power reception and network communication; and
a processor that performs polar coordinate transformation processing to convert the annular original image captured by the omnidirectional camera into a rectangular plane image,
performs correction in which downsampling reduces the length of the outer arc of each sector image, obtained by dividing the original image at equal intervals along the circumferential direction, to match the intermediate arc length, and interpolation extends the length of the inner arc to match that length, and,
if the brightness of the center of a region in which motion exists in the corrected plane image is greater than or equal to a predetermined depth value and its difference from the surrounding brightness is greater than or equal to a preset contrast value, detects the occurrence of a fire based on the dynamic texture of the flame region learned according to a machine learning algorithm,
Wherein the processor performs the polar coordinate conversion processing using the following equation:
[Equation: the polar coordinate transformation relating the rectangular coordinates (xi, yi) of the original image to the polar coordinates (θi, ri) and to the coordinates of the plane image, using the quantities defined below]

Here, θi and ri are the angle (in degrees) and the radius when a pixel position of the original image is expressed in the polar coordinate system, xi and yi are the x-axis and y-axis coordinate values when the pixel position of the original image is expressed in the rectangular coordinate system, R is the radius of the outer concentric circle measured from the center of the annulus, and cx and cy are the x-axis and y-axis coordinate values of the center of the annulus.
The fire monitoring apparatus according to claim 1,
wherein the aspherical lens comprises:
a first lens (410) having a convex first incident surface (411) formed on one surface, a first exit surface (413) formed on the other surface, and a first reflecting surface (415) formed at the center of the first incident surface (411); and
a second lens (420) having a second incident surface (421) formed on one surface and joined to the first exit surface (413), a convex second reflecting surface (423) formed on the other surface, and a second exit surface (425) formed at the center of the second reflecting surface (423),
wherein the first exit surface (413) is concave and the second incident surface (421) is convex, the first exit surface (413) and the second incident surface (421) having the same radius of curvature, and
wherein the radius of curvature of the first incident surface (411) is 21 mm to 23 mm, the radius of curvature of the second reflecting surface (423) is 10 mm to 12 mm, and the first exit surface (413) and the second incident surface (421) are formed to have a radius of curvature of 29 mm to 31 mm so that no focal point is formed on the joint surface of the first lens (410) and the second lens (420).
The fire monitoring apparatus according to claim 1,
wherein the processor
extracts the flame region from the image separated from the background, based on a Gaussian Mixture Model (GMM).
The fire monitoring apparatus according to claim 3,
wherein the processor
detects the occurrence of a fire by determining, using Volume Local Binary Patterns (VLBP), the proximity between the dynamic texture characteristics of the extracted flame region and the flame motion characteristics estimated by machine learning.
The fire monitoring apparatus according to claim 1,
wherein the processor
transmits at least one of the original image and the plane image to an external storage device connected through the PoE socket.
The fire monitoring apparatus according to claim 1, further comprising:
a first laser module including a first laser generator for emitting a laser beam,
a first frequency modulator for modulating the frequency of the laser beam emitted by the first laser generator,
a first plane optical lens for converting the frequency-modulated laser beam into plane light, and
a first plane light receiving unit for receiving reflected light reflected by a measurement object;
A second laser generator for emitting a laser beam,
A second frequency modulator for modulating the frequency of the laser beam emitted by the second laser generator,
A second planar optical lens for converting the laser beam of the frequency modulated by the second frequency modulator to plane light,
A second laser module including a second plane light receiving unit receiving reflected light reflected by a measurement object;
A camera module located between the first laser module and the second laser module and reading the color of the measurement object;
A tilting module for tilting the first laser module and the second laser module; And
A simultaneous emission adjusting unit for simultaneously emitting the laser beam emitted from the first laser generating unit and the second laser generating unit,
A first distance calculation unit for measuring a reception frequency of the first planar light reception unit and calculating a distance to an object to be measured,
A second distance calculation unit for measuring a reception frequency of the second planar light reception unit and calculating a distance to an object to be measured,
and a coordinate analysis unit for acquiring the distances calculated by the first distance calculation unit and the second distance calculation unit and analyzing the X and Y coordinates of the measurement object,
these units constituting a control module,
wherein the processor
detects the occurrence of a fire based on data of the distance, coordinates, and color of the measurement object processed by the control module.
The fire monitoring apparatus according to claim 1, further comprising an additional information device including:
a light emitting unit having a planar light converter for converting a laser beam into plane light, the light emitting unit transmitting the plane light emitted through the planar light converter to a first optical lens for diffusion and then irradiating the diffused plane light onto a sensing object;
a light receiving unit for passing the reflected light from the sensing object through second and third optical lenses and transmitting the diffused reflected light to a charge coupled device;
an operation processor that numerically arranges the object sensing image data to extract a plurality of relative coordinate values of the sensing object and numerically analyzes the plurality of relative coordinate values to calculate the relative position, movement path, and movement speed of the sensing object; and
an optical measuring unit for detecting an instantaneous rate of change of the sensing object by measuring pixel changes with respect to the object in real time based on the plurality of photon movement states,
wherein the light emitting unit and the light receiving unit are on the same vertical plane,
wherein the charge coupled device generates the object sensing image data by electrically measuring the plurality of photons present in the diffused reflected light, the irradiation angle of the light emitting unit is adjustable within 0 to 90 degrees, and the receiving angle of the light receiving unit is adjustable within 0 to 90 degrees,
and wherein the processor
detects the occurrence of a fire based on at least one of the calculated relative position, movement path, movement speed, and the detected instantaneous rate of change.
KR1020150096482A 2015-07-07 2015-07-07 Fire surveillance apparatus KR101716036B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150096482A KR101716036B1 (en) 2015-07-07 2015-07-07 Fire surveillance apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150096482A KR101716036B1 (en) 2015-07-07 2015-07-07 Fire surveillance apparatus

Publications (2)

Publication Number Publication Date
KR20170006079A true KR20170006079A (en) 2017-01-17
KR101716036B1 KR101716036B1 (en) 2017-03-13

Family

ID=57990344

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150096482A KR101716036B1 (en) 2015-07-07 2015-07-07 Fire surveillance apparatus

Country Status (1)

Country Link
KR (1) KR101716036B1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106934763A (en) * 2017-04-17 2017-07-07 北京果毅科技有限公司 panoramic camera, drive recorder, image processing method and device
CN107607960A (en) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 A kind of anallatic method and device
CN107798734A (en) * 2017-12-07 2018-03-13 梦工场珠宝企业管理有限公司 The adaptive deformation method of threedimensional model
KR20190029901A (en) * 2017-09-13 2019-03-21 네이버랩스 주식회사 Light focusing system for detection distance enhancement of area sensor type lidar
KR101953342B1 (en) * 2017-12-08 2019-05-23 주식회사 비젼인 Multi-sensor fire detection method and system
CN109917405A (en) * 2019-03-04 2019-06-21 中国电子科技集团公司第十一研究所 A kind of laser distance measurement method and system
KR200489704Y1 (en) * 2019-01-21 2019-07-25 주식회사 엠에이티 Air quality monitoring apparatus
KR20190130186A (en) * 2018-04-16 2019-11-22 세종대학교산학협력단 Fire monitoring method and apparatus
KR102244187B1 (en) * 2019-10-31 2021-04-26 한국과학기술원 Method for video frame interpolation robust to exceptional motion and the apparatus thereof
KR20210110084A (en) * 2020-02-28 2021-09-07 (주)트리플렛 Device and method for detecting fire
KR102332699B1 (en) * 2021-06-04 2021-12-01 (주)재상피앤에스 Event processing system for detecting changes in spatial environment conditions using image model-based AI algorithms
KR102408171B1 (en) * 2021-12-20 2022-06-13 주식회사 코난테크놀로지 Real-time explosion time detection method in CCTV camera environment and CCTV image processing apparatus

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102067994B1 (en) 2019-05-20 2020-01-20 한밭대학교 산학협력단 System for detecting flame of embedded environment using deep learning

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100901784B1 (en) * 2008-11-11 2009-06-11 주식회사 창성에이스산업 System for fire warning and the method thereof
KR20100018998A (en) * 2008-08-08 2010-02-18 펜타원 주식회사 Omnidirectional monitoring camera system and an image processing method using the same
KR101432440B1 (en) * 2013-04-29 2014-08-21 홍익대학교 산학협력단 Fire smoke detection method and apparatus
KR101439411B1 (en) * 2014-01-23 2014-09-11 이선구 Omnidirectional lens module

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100018998A (en) * 2008-08-08 2010-02-18 펜타원 주식회사 Omnidirectional monitoring camera system and an image processing method using the same
KR100901784B1 (en) * 2008-11-11 2009-06-11 주식회사 창성에이스산업 System for fire warning and the method thereof
KR101432440B1 (en) * 2013-04-29 2014-08-21 홍익대학교 산학협력단 Fire smoke detection method and apparatus
KR101439411B1 (en) * 2014-01-23 2014-09-11 이선구 Omnidirectional lens module

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106934763B (en) * 2017-04-17 2023-08-22 北京灵起科技有限公司 Panoramic camera, automobile data recorder, image processing method and device
CN106934763A (en) * 2017-04-17 2017-07-07 北京果毅科技有限公司 panoramic camera, drive recorder, image processing method and device
KR20190029901A (en) * 2017-09-13 2019-03-21 네이버랩스 주식회사 Light focusing system for detection distance enhancement of area sensor type lidar
CN107607960A (en) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 A kind of anallatic method and device
CN107798734A (en) * 2017-12-07 2018-03-13 梦工场珠宝企业管理有限公司 The adaptive deformation method of threedimensional model
KR101953342B1 (en) * 2017-12-08 2019-05-23 주식회사 비젼인 Multi-sensor fire detection method and system
KR20190130186A (en) * 2018-04-16 2019-11-22 세종대학교산학협력단 Fire monitoring method and apparatus
KR200489704Y1 (en) * 2019-01-21 2019-07-25 주식회사 엠에이티 Air quality monitoring apparatus
CN109917405B (en) * 2019-03-04 2021-09-03 中国电子科技集团公司第十一研究所 Laser ranging method and system
CN109917405A (en) * 2019-03-04 2019-06-21 中国电子科技集团公司第十一研究所 A kind of laser distance measurement method and system
KR102244187B1 (en) * 2019-10-31 2021-04-26 한국과학기술원 Method for video frame interpolation robust to exceptional motion and the apparatus thereof
WO2021085757A1 (en) * 2019-10-31 2021-05-06 한국과학기술원 Video frame interpolation method robust against exceptional motion, and apparatus therefor
KR20210110084A (en) * 2020-02-28 2021-09-07 (주)트리플렛 Device and method for detecting fire
KR102332699B1 (en) * 2021-06-04 2021-12-01 (주)재상피앤에스 Event processing system for detecting changes in spatial environment conditions using image model-based AI algorithms
KR102408171B1 (en) * 2021-12-20 2022-06-13 주식회사 코난테크놀로지 Real-time explosion time detection method in CCTV camera environment and CCTV image processing apparatus

Also Published As

Publication number Publication date
KR101716036B1 (en) 2017-03-13

Similar Documents

Publication Publication Date Title
KR101716036B1 (en) Fire surveillance apparatus
US11842564B2 (en) Imaging apparatus and imaging system
US11405535B2 (en) Quad color filter array camera sensor configurations
US9992457B2 (en) High resolution multispectral image capture
JP4347888B2 (en) Method and apparatus for generating infrared image and normal image
US10848693B2 (en) Image flare detection using asymmetric pixels
US8427632B1 (en) Image sensor with laser for range measurements
US20140028861A1 (en) Object detection and tracking
CN111062378A (en) Image processing method, model training method, target detection method and related device
US9412039B2 (en) Blur detection system for night scene images
CA2654455A1 (en) Apparatus and method for determining characteristics of a light source
CN111294526B (en) Processing method and device for preventing camera from being burnt by sun
US8970728B2 (en) Image pickup apparatus and image processing method
CN108234897B (en) Method and device for controlling night vision system, storage medium and processor
CN108886571B (en) Imaging apparatus with improved auto-focusing performance
US9894255B2 (en) Method and system for depth selective segmentation of object
JP7192778B2 (en) IMAGING APPARATUS AND METHOD AND IMAGE PROCESSING APPARATUS AND METHOD
JP2014138290A (en) Imaging device and imaging method
JP2004222231A (en) Image processing apparatus and image processing program
JP2018056786A (en) Image processing device, imaging apparatus, movable body and image processing method
US11245878B2 (en) Quad color filter array image sensor with aperture simulation and phase detection
JP2017207883A (en) Monitoring system, color camera device and optical component
JPWO2017022331A1 (en) Control device, control method, computer program, and electronic device
CN211880472U (en) Image acquisition device and camera
CN108449547B (en) Method for controlling a night vision system, storage medium and processor

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant