KR101788411B1 - Operating method and system for drones to prevent industrial disasters - Google Patents


Info

Publication number: KR101788411B1
Application number: KR1020150165797A
Authority: KR (South Korea)
Prior art keywords: flight path, image, drones, situation, similarity
Other languages: Korean (ko)
Other versions: KR20170060973A (en)
Inventors: 정의필, 하철근, 이정철, 추상목
Original assignee: 울산대학교 산학협력단
Priority date: 2015-11-25 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2015-11-25
Application filed by 울산대학교 산학협력단
Priority to KR1020150165797A
Publication of KR20170060973A: 2017-06-02
Application granted
Publication of KR101788411B1: 2017-10-19

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 Aircraft indicators or protectors not otherwise provided for
    • B64C2201/127
    • B64C2201/141

Abstract

[0001] The present invention relates to a method and system for operating drones to respond to industrial disasters. The method comprises: a first mode operation step in which a drone, while flying along a normal flight path, transmits a first image captured at each disaster management site to a drone management server, and the first image is analyzed to determine whether each disaster management site is in a normal, abnormal, or emergency situation; a second mode operation step in which, if an abnormal situation is determined in the first mode operation step, the drone changes its flight path to an abnormal flight path including a proximity point, transmits a second image captured there and measured first sensor values to the drone management server, and the second image and first sensor values are analyzed to determine whether the situation is normal or an emergency; and a third mode operation step in which, if an emergency is determined in the first mode operation step or the second mode operation step, the drone changes its flight path to an emergency flight path including a maximum proximity point and transmits a third image captured there and measured second sensor values to the drone management server and the situation room server.

Description

Technical Field: The present invention relates to a drone operating method and system for industrial disaster response.

More specifically, the present invention relates to a drone operating method and system for industrial disaster response that can determine whether a normal, abnormal, or emergency situation has occurred at a disaster management site, automatically determine a proximity point to the management object concerned, and transmit the captured images and per-sensor measurement values to the situation room and the server.

An unmanned aerial vehicle (UAV), commonly called a drone, is an aircraft designed to perform a specified mission without a pilot on board. Drones carry no pilot; they are lifted by aerodynamic forces and fly under autonomous or remote control. They are disposable or reusable powered vehicles that can carry weapons or general cargo.

Equipped with various payloads (for example, cameras and optical, infrared, or radar sensors), drones perform duties such as surveillance, reconnaissance, communication, and information relay. They are also useful for collecting initial information in the event of a disaster.

As drone technology and automatic or remote control techniques have advanced, the application field of drones has expanded to monotonous surveillance tasks, wide-area surveillance tasks, and surveillance in dangerous environments such as extreme heat or cold. Drones are deployed in industrial complexes, forest areas, and so on. In particular, drones help minimize loss of life by monitoring dangerous industrial sites where it is difficult for people to monitor the entire complex.

A drone is equipped with a global positioning system (GPS) receiver to identify its real-time location and obtain route information to a destination, and it can acquire and return image information of a disaster management site under unfavorable conditions where visual inspection is not possible.

However, because a conventional drone management system judges whether the disaster management site is abnormal using only the image information captured by the drone, false alarms may be triggered by errors in the captured images.

Korean Registered Patent No. 10-1363066 (Registered on February 4, 2014)

Embodiments of the present invention analyze images of a disaster management site captured by a drone flying along one of three routes (a normal flight path, an abnormal flight path, and an emergency flight path) together with measurement values from anomaly detection sensors (for example, a flame sensor, a smoke sensor, a gas sensor, and a heat sensor), determine a proximity point to the management object, and transmit the images and per-sensor measurement values taken at that point to the situation room and the server, so that the situation of the disaster management site can be checked in real time and industrial disasters can be responded to quickly. The embodiments aim to provide such a drone operating method and system.

In addition, in the embodiments of the present disclosure, when the disaster management site is determined to be in an emergency situation, the image captured at the point of maximum proximity to the management object and the measured sensor values are transmitted to the situation room server, and control of the drone is handed over from the drone management server to the situation room server, so that the situation room server can check the overall situation of the disaster management site reported from each drone and respond quickly to industrial disasters when an emergency occurs. The embodiments aim to provide such a method and system.

According to a first aspect of the present invention, there is provided a drone operating method comprising: a first mode operation step in which a drone, while flying along a normal flight path, transmits a first image captured at each disaster management site to a drone management server, and the first image is analyzed to determine whether each disaster management site is in an abnormal situation or an emergency situation; a second mode operation step in which, if an abnormal situation is determined in the first mode operation step, the drone changes its flight path to an abnormal flight path including a proximity point, transmits a second image captured there and measured first sensor values to the drone management server, and the second image and first sensor values are analyzed to determine whether the situation is normal or an emergency; and a third mode operation step in which, if an emergency is determined in the first mode operation step or the second mode operation step, the drone changes its flight path to an emergency flight path including a maximum proximity point and transmits a third image captured there and measured second sensor values to the drone management server and the situation room server.

The drone management server may have control of the drone in the first mode operation step and the second mode operation step, and a step of transferring control of the drone to the situation room server may be performed when the first mode operation step or the second mode operation step changes to the third mode operation step.

The method may further comprise the step of constructing a steady state standard image database by image-processing the images collected while the drone flies the normal flight path, extracting steady state image information using the position and outline information of the management objects at each disaster management site, and adding at least one of the date, time, weather, temperature, wind direction, wind speed, and location information of the drone.

In the first mode operation step, a first similarity, which is a probabilistic similarity between the captured first image and a steady state standard image stored in the steady state standard image database, may be calculated, and the situation of the management site may be judged by comparing the calculated first similarity with first and second threshold values.

In the first mode operation step, when the calculated first similarity does not exceed the first threshold but exceeds the second threshold, the flight path may be changed to the abnormal flight path to capture the second image and measure the first sensor values; and when the calculated first similarity exceeds neither the first threshold nor the second threshold, the flight path may be changed to the emergency flight path to capture the third image and measure the second sensor values.

In the second mode operation step, when the calculated first similarity does not exceed the first threshold but exceeds the second threshold, a second similarity (an image similarity) and a third similarity (a sensor value similarity) may be calculated based on the second image captured along the abnormal flight path and the measured first sensor values; the calculated second similarity is compared with a third threshold, or the calculated third similarity is compared with a fourth threshold, to determine whether the management site is in an emergency situation or a normal situation; if an emergency is determined, the flight path is changed to the emergency flight path, and if a normal situation is determined, the drone returns to the normal flight path.

According to a second aspect of the present invention, there is provided a drone operating system for industrial disaster response, comprising: a drone that transmits a first image captured at each disaster management site while flying along a normal flight path, analyzes the first image to determine whether each disaster management site is in an abnormal situation or an emergency situation, changes its flight path to an abnormal flight path including a proximity point to transmit a second image and measured first sensor values when an abnormal situation is determined, determines from the second image and first sensor values whether the situation is normal or an emergency, and changes its flight path to an emergency flight path including a maximum proximity point to transmit a third image and measured second sensor values when an emergency situation is determined; a drone management server that manages the drone and receives the first to third images captured by the drone along the respective flight paths and the first and second sensor values; and a situation room server that manages each disaster management site and receives the third image and second sensor values captured by the drone along the emergency flight path.

The drone management server may have control of the drone in the normal situation and the abnormal situation, and may transfer control of the drone to the situation room server when the normal situation or the abnormal situation changes to an emergency situation.

The drone may image-process the images captured while flying the normal flight path, extract steady state image information using the position and outline information of the management objects at each disaster management site, and construct a steady state standard image database by adding at least one of the date, time, weather, temperature, wind direction, wind speed, and location information of the drone.

The drone may calculate a first similarity, which is a probabilistic similarity between the captured first image and the steady state standard image stored in the steady state standard image database, and judge the situation of the management site by comparing the calculated first similarity with the first and second threshold values.

When the calculated first similarity does not exceed the first threshold but exceeds the second threshold, the drone may change its flight path to the abnormal flight path to capture the second image and measure the first sensor values; and when the calculated first similarity exceeds neither the first threshold nor the second threshold, the drone may change its flight path to the emergency flight path to capture the third image and measure the second sensor values.

When the calculated first similarity does not exceed the first threshold but exceeds the second threshold, the drone may calculate a second similarity (an image similarity) and a third similarity (a sensor value similarity) based on the second image captured along the abnormal flight path and the measured first sensor values, compare the calculated second similarity with a third threshold or the calculated third similarity with a fourth threshold to determine whether the management site is in an emergency situation or a normal situation, change its flight path to the emergency flight path if an emergency is determined, and return to the normal flight path if a normal situation is determined.

According to embodiments of the present invention, by analyzing the images of the disaster management site captured by a drone flying along one of the three routes (the normal flight path, the abnormal flight path, and the emergency flight path) together with the measurement values of the anomaly detection sensors (for example, a flame sensor, a smoke sensor, a gas sensor, and a heat sensor), determining a proximity point, and transmitting the images and per-sensor measurement values taken at that point to the situation room and the server, it is possible to check the situation of the disaster management site in real time and respond to industrial disasters quickly.

In addition, according to the embodiments of the present disclosure, when the disaster management site is determined to be in an emergency situation, the image captured at the point of maximum proximity to the management object and the measured sensor values are transmitted to the situation room server, and control of the drone is taken over from the drone management server, so that the situation room server can check the situation of the disaster management site reported from each drone and respond quickly to industrial disasters.

FIG. 1 is a configuration diagram of a drone operating system for industrial disaster response according to an embodiment of the present invention.
FIG. 2 is an explanatory diagram of each mode in the drone operating system according to an embodiment of the present invention.
FIG. 3 is an explanatory view of the process of converting an industrial complex, which is a disaster management site, into a plan view according to an embodiment of the present invention.
FIG. 4 is an explanatory view of each flight path according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating a drone operation method for industrial disaster response according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, focusing on the parts necessary for understanding the operation according to the present specification. In describing the embodiments of the present invention, descriptions of technical content that is well known in the technical field to which the present invention belongs and that is not directly related to the present specification will be omitted. This is to convey the gist of the present invention clearly, without obscuring it with unnecessary explanation.

In describing the components of the present specification, the same reference numerals may be given to components having the same name, even across different drawings. However, this does not mean that the corresponding component has different functions in different embodiments, or that it necessarily has the same function in different embodiments; the function of each component should be judged based on its description in the corresponding embodiment.

FIG. 1 is a configuration diagram of a drone operating system for industrial disaster response according to an embodiment of the present invention.

As shown in FIG. 1, a drone operating system 100 for industrial disaster response according to an embodiment of the present invention includes drones 110, drone management servers 120, and a situation room server 130.

Hereinafter, the specific configuration and operation of each component of the drone operating system 100 for industrial disaster response of FIG. 1 will be described.

First, looking at the disaster management sites, disaster management site 1 (101), disaster management site 2 (102), ..., through disaster management site N (10N) are monitored by the drones 110 and the drone management servers 120. All of the disaster management sites are managed as a whole by the situation room server 130. A disaster management site may be, for example, a site for managing disasters at an industrial complex.

The drones 110 comprise a plurality of drones: drone 1 (111), drone 2 (112), ..., drone N (11N), which monitor disaster management site 1 (101), disaster management site 2 (102), ..., disaster management site N (10N), respectively. Each drone 110 sets one of three flight paths: a normal flight path when the disaster management site is operating normally, an abnormal flight path when an abnormal situation occurs that makes it difficult to judge whether an emergency exists, and an emergency flight path when an emergency occurs. The images captured along one of these three paths over the flight area including the disaster management site, together with the measured sensor values, are then analyzed by the drone and the drone management server 120 to determine whether a normal, abnormal, or emergency situation has occurred.

The drone 110 changes its flight path according to the result of the situation determination. In normal times, the drone 110 automatically flies along the normal flight path while monitoring the disaster management site for abnormal or emergency situations. When an abnormal situation occurs during normal flight path operation, the drone 110 changes from the normal flight path to the abnormal flight path to determine whether an emergency exists. If the drone 110 determines on the abnormal flight path that the situation is normal, it returns to the normal flight path; otherwise, it changes to the emergency flight path, flies to the point of maximum proximity, collects emergency situation information there, and transmits it to the drone management server 120 and the situation room server 130.

The drone management servers 120 comprise drone management server 1 (121), drone management server 2 (122), ..., drone management server N (12N), each operated by an administrator and managing a plurality of drones. The drone management servers 120 manage the drones 110 and receive the images and sensor measurements taken by the drones 110 along each flight path.

The situation room server 130 communicates with each of the plurality of drones 110 and the plurality of drone management servers 120 to manage all the disaster management sites as a whole. The situation room server 130 may receive the images and sensor measurements taken along the emergency flight path from the drones 110 or the drone management servers 120. For example, a single situation room server 130 may operate in the 119 emergency control room and manage all N disaster management sites.

FIG. 2 is an explanatory diagram of each mode in the drone operating system according to an embodiment of the present invention.

Referring to FIG. 2, each mode in the drone operating system 100 will be described through the example of drone 1 (111), drone management server 1 (121), and the situation room server 130, which manage disaster management site 1 (101) among the plurality of drones and drone management servers.

Drone 1 (111) applies a normality determination algorithm to the images or sensor measurements of disaster management site 1 (101) to classify the situation of disaster management site 1 (101) as a normal situation (Mode 1), an abnormal situation (Mode 2), or an emergency (Mode 3). The normality determination algorithm is described in detail with the drone operation method shown in FIG. 5.

First, a normal situation (Mode 1) means that, based on the image of the disaster management site, the occurrence of flame or smoke is within the range of the normal reference value.

Next, an abnormal situation (Mode 2) means a situation in which, based on the image or sensor measurements, the occurrence of flame or smoke exceeds the normal reference value range but does not stochastically exceed a predetermined ratio of the normal reference value (for example, 50% or more).

Finally, an emergency (Mode 3) means a situation in which measurements based on images or sensors, such as flame, smoke generation, or chemical leakage, fall within the ranges set for hazardous conditions.
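As an illustration only (not part of the patent text), the three modes and the flight path flown in each mode can be written down as a small enumeration. The names Mode, FlightPath, and PATH_FOR_MODE below are hypothetical; only the mode numbers and reference numerals 401-403 come from this description.

```python
from enum import Enum

class Mode(Enum):
    NORMAL = 1     # Mode 1: flame/smoke within the normal reference range
    ABNORMAL = 2   # Mode 2: above the normal range, but below the emergency criterion
    EMERGENCY = 3  # Mode 3: measurements within the ranges set for hazardous conditions

class FlightPath(Enum):
    NORMAL = 401     # normal flight path 401
    ABNORMAL = 402   # abnormal flight path 402
    EMERGENCY = 403  # emergency flight path 403

# Each mode maps to the flight path the drone flies while in that mode.
PATH_FOR_MODE = {
    Mode.NORMAL: FlightPath.NORMAL,
    Mode.ABNORMAL: FlightPath.ABNORMAL,
    Mode.EMERGENCY: FlightPath.EMERGENCY,
}
```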

Next, the role and authority of each component in each mode are described.

In the normal situation (Mode 1), drone 1 (111) captures images at predetermined normal points while flying along the normal flight path and transmits the captured images to drone management server 1 (121).

In the normal situation (Mode 1), drone management server 1 (121) analyzes the captured images and can change the flight path of drone 1 (111). If drone management server 1 (121) determines that an emergency has occurred, it can notify the 119 control room of the emergency through the situation room server 130. At this stage, the situation room server 130 has not yet taken over control of drone 1 (111) and has no role or authority.

In the abnormal situation (Mode 2), when drone 1 (111) determines, by the normality determination algorithm, that an abnormal situation has occurred during flight on the normal flight path, it changes its route to the abnormal flight path. Drone 1 (111) then operates its camera and anomaly detection sensors while flying the abnormal flight path. While flying the abnormal flight path several times, drone 1 (111) uses the normality determination algorithm to judge whether the situation is normal, abnormal, or an emergency, and automatically changes its flight path according to the result. Drone 1 (111) transfers the collected image and sensor data to drone management server 1 (121).

In the abnormal situation (Mode 2), drone management server 1 (121) judges the current situation based on the transmitted images and sensor measurements and can accordingly change the flight path of drone 1 (111). If drone management server 1 (121) determines that an emergency has occurred, it can notify the situation room server 130 of the emergency. At this stage, the situation room server 130 has not taken over control of drone 1 (111) and has no role or authority.

In the emergency situation (Mode 3), when drone 1 (111) determines, by the normality determination algorithm, that an emergency has occurred during flight along the normal flight path or the abnormal flight path, it changes its route to the emergency flight path. Drone 1 (111) moves to the maximum proximity point on the emergency flight path, captures images, and activates its sensors. Drone 1 (111) transmits the collected image and sensor data to drone management server 1 (121) and the situation room server 130.

In the emergency situation (Mode 3), drone management server 1 (121) hands over control of the drone to the situation room server 130 and continues to receive the information acquired by drone 1 (111) and the situation room server 130.

In the emergency situation (Mode 3), the situation room server 130 can judge the images or sensor measurements of the disaster management site transmitted from drone 1 (111) and change the maximum proximity point on the flight path of drone 1 (111). The situation room server 130 may also control drone 1 (111) and provide the images or sensor measurements of the disaster management site to drone management server 1 (121).

Thus, drone management server 1 (121) has control of the drone on the normal flight path and the abnormal flight path, and transfers control of drone 1 (111) to the situation room server 130 when the path changes from the abnormal flight path to the emergency flight path.
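A minimal sketch of this control handover, reusing the Mode enumeration from the earlier sketch; the classes DroneManagementServer and SituationRoomServer and their methods are assumptions for illustration, since the patent does not specify an API.

```python
class SituationRoomServer:
    """Situation room server 130: takes over drone control only in Mode 3."""
    def __init__(self) -> None:
        self.controlled_drones: set[str] = set()

    def take_control(self, drone_id: str) -> None:
        self.controlled_drones.add(drone_id)

class DroneManagementServer:
    """Drone management server 12x: holds drone control in Mode 1 and Mode 2."""
    def __init__(self, situation_room: SituationRoomServer) -> None:
        self.situation_room = situation_room
        self.controlled_drones: set[str] = {"drone-1"}

    def on_mode_change(self, drone_id: str, new_mode: Mode) -> None:
        # Control is transferred only when the drone enters the emergency mode.
        if new_mode is Mode.EMERGENCY and drone_id in self.controlled_drones:
            self.controlled_drones.remove(drone_id)
            self.situation_room.take_control(drone_id)
```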

Drone management server 1 (121) processes the images collected while drone 1 (111) flies the normal flight path and extracts steady state image information using the position and outline information of the management objects at each disaster management site. At least one of the date, time, weather, temperature, wind direction, wind speed, and location information of drone 1 (111) is added to the extracted steady state image information to build the steady state standard image database, which may be held by drone 1 (111) or drone management server 1 (121).

Drone 1 (111) then image-processes the captured first image, calculates the first similarity, which is the probabilistic similarity between the position and shape of the extracted management object and the steady state standard image stored in the steady state standard image database, and compares it with the first and second threshold values.

Specifically, when the first similarity does not exceed the first threshold but exceeds the second threshold, drone 1 (111) captures the second image and measures the first sensor values, and compares the resulting similarities with the third threshold or the fourth threshold to determine whether an accident has occurred.

Drone 1 (111) can set at least one proximity point to the management object based on the image taken along the abnormal flight path and the measured sensor values. Drone 1 (111) moves to the at least one set proximity point and transmits the captured images and measured sensor values to drone management server 1 (121) and the situation room server 130.

On the other hand, if the first similarity exceeds neither the first threshold nor the second threshold, drone 1 (111) can enter the emergency flight path and measure the per-sensor values with the sensors provided on drone 1 (111). In other words, when drone 1 (111) determines that an accident has occurred at the disaster management site, it enters and operates on the emergency flight path; when it determines that the situation is normal, it returns to the normal flight path and performs normal observation.

FIG. 3 is an explanatory view of the process of converting an industrial complex, which is a disaster management site, into a plan view according to an embodiment of the present invention.

As shown in FIG. 3, the industrial complex 301, which is a disaster management site monitored by the drone 110, may be represented by a plan view 310 showing the management objects and the management points for each management object. The plan view 310 is a two-dimensional transformation of the three-dimensional region to be managed by the drone 110. The plan view 310 shows the maximum proximity points to be used in case an emergency occurs and the characteristics of each point (for example, the emergency type and information about surrounding structures).

For example, the industrial complex 301 may include management objects B1, B2, B3, ..., Bm. The drone 110 flies the flight paths R1, R2, R3, ..., Rm assigned to the management objects B1, B2, B3, ..., Bm, respectively, and analyzes the captured images and measured sensor values to determine whether each management object B1, B2, B3, ..., Bm is in a normal, abnormal, or emergency situation.
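As a sketch only, the plan view 310 can be held as a mapping from each management object B1..Bm to its assigned flight route R1..Rm and its candidate maximum proximity points. The class name, field names, and the coordinate values below are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ManagementObject:
    """One managed object (building, structure, or device) in the plan view 310."""
    name: str                                               # e.g. "B3"
    route: str                                              # assigned flight route, e.g. "R3"
    max_proximity_points: list[tuple[float, float]] = field(default_factory=list)

# Hypothetical plan-view layout for an industrial complex with objects B1..B3.
plan_view = {
    "B1": ManagementObject("B1", "R1", [(35.54, 129.31)]),
    "B2": ManagementObject("B2", "R2", [(35.55, 129.32)]),
    "B3": ManagementObject("B3", "R3", [(35.56, 129.33), (35.56, 129.34)]),  # points 311, 312
}
```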

When an emergency occurs at management object B3, the drone 110 moves to maximum proximity point 1 (311) or maximum proximity point 2 (312) determined for management object B3, and transmits the sensor values measured at the determined maximum proximity points 311 and 312 to the situation room server 130.

Among the industrial complexes to which the embodiments of the present invention are applied, the types of fire or explosion in a petrochemical plant are as follows.

In petrochemical plants, fires caused by flammable liquids, flammable gases, and liquefied flammable gases are riskier than other general fires due to three characteristics: high heat of combustion, rapid heat release rate, and fluidity. Accidents related to fire or explosion can be classified into five types, and a combination of one or more of these types can cause an accident.

First, a fire accident may occur when liquid in an open tank, or oil whose surface is exposed to the atmosphere, is ignited.

Second, liquid released onto an open surface can spread over the ground to form a thin film, resulting in a fire accident.

Third, an ignition fire accident may occur when pressurized gas or liquid is ejected.

Fourth, an explosion may occur due to a sudden chemical reaction in a confined space.

Fifth, when a large amount of easily evaporated flammable material leaks into the atmosphere, it forms a vapor cloud and spreads; upon contact with an ignition source, an explosion may occur.

By organizing, analyzing, and formalizing comprehensive accident scenarios for the major facilities that make up these petrochemical processes, it is possible to determine the types of accident scenarios applicable to most petrochemical processes. In other words, facilities handling dangerous materials, or facilities sensitive to process conditions such as temperature and pressure, can be designated as major hazardous facilities, and accident types can be modeled by referring to existing accident cases and the experience of the corresponding process operators. Several accident scenarios can be generated for a facility by analyzing its operating method according to the flow charts and process flow of the selected hazardous facilities, and then analyzing the potential risks according to a qualitative risk assessment method. These accident scenario types can be reflected in the steady state standard image database. The steady state standard image database may be pre-built and stored in the drone 110, in the drone management server 120, or in the situation room server 130, and may be shared among them.

Accordingly, using the constructed steady state standard image database, the drone 110 can determine a maximum proximity point according to the accident scenario type for buildings, structures, devices, and so on that require continuous observation, or apply different threshold values for each accident scenario so that accidents can be observed intensively.
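A sketch of applying different thresholds per accident scenario type, as the paragraph above describes. The scenario names and the numeric threshold values are illustrative assumptions, not values from the patent.

```python
# Hypothetical per-scenario thresholds (threshold 1 / threshold 2 of the
# first-similarity test). Tighter thresholds make observation more sensitive.
SCENARIO_THRESHOLDS = {
    "open_tank_fire":     (0.90, 0.60),
    "spill_film_fire":    (0.88, 0.58),
    "jet_fire":           (0.92, 0.65),
    "confined_explosion": (0.93, 0.70),
    "vapor_cloud":        (0.95, 0.75),
}
DEFAULT_THRESHOLDS = (0.85, 0.55)

def thresholds_for(scenario: str) -> tuple[float, float]:
    """Return (threshold 1, threshold 2) for the accident scenario assigned
    to a management object in the steady state standard image database."""
    return SCENARIO_THRESHOLDS.get(scenario, DEFAULT_THRESHOLDS)
```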

Meanwhile, the drone 110 may determine the maximum proximity point for each management object based on the second image taken along the abnormal flight path and the first sensor values measured by the anomaly detection sensors. The drone 110 transmits the third image and the second sensor values taken at the determined maximum proximity point to the drone management server 120 and the situation room server 130.

FIG. 4 is an explanatory view of each flight path according to an embodiment of the present invention.

As shown in FIG. 4, the per-mode flight paths are divided into a normal flight path 401, an abnormal flight path 402, and an emergency flight path 403.

The normal flight path 401 is a flight path from which the drone 110 can easily capture the whole disaster management site and check the management objects for abnormalities.

The abnormal flight path 402 is a path that lets the drone 110 move closer to the management object at which an abnormal situation was found while flying the normal flight path, so that it can determine whether the situation is normal, abnormal, or an emergency. When the management object in question is B3, the abnormal flight path 402 corresponds to R3.

The emergency flight path 403 is a path determined using the surrounding terrain information so that the drone 110 can fly to the maximum proximity point of each management object, determined based on the measured values of the anomaly detection sensors (for example, a flame detection sensor, a smoke detection sensor, a gas detection sensor, and a heat detection sensor); for example, maximum proximity point 1 (311) and maximum proximity point 2 (312) of management object B3.

FIG. 5 is a flowchart illustrating a drone operation method for industrial disaster response according to an embodiment of the present invention.

First, while flying the normal flight path, the drone 110 applies an image processing algorithm to the collected images to extract steady state image information using the position and outline information of the buildings, structures, and devices provided by the user, and constructs the steady state standard image database by adding information on the date, time, weather, temperature, wind direction, wind speed, and location of the drone 110.

The normal situation image DB 510 of the steady state standard image database stores object information (the position and outline of the objects observed on the normal flight path), normal situation image information (drone status, date, time, weather, and disaster management site images), and, for each shooting point, the similarity 1 (first similarity), threshold 1 (first threshold), and threshold 2 (second threshold) obtained using the object information and normal situation image information. Once the steady state standard image database is constructed, the drone 110 can carry out steady state observation for industrial disaster occurrence based on it.

In addition, the abnormal situation DB 520 of the steady state standard image database includes a proximity normal situation image DB 521 and a proximity normal situation sensor DB 522. The proximity normal situation image DB 521 stores object information (the position and outline of the management objects observed on the abnormal flight path), normal situation image information (drone status, date, time, weather, and disaster management site images), and, for each shooting point, the similarity a (second similarity) and threshold a (third threshold) obtained using the object information and image information. The proximity normal situation sensor DB 522 stores object information (the position and outline of the management objects detected on the abnormal flight path), normal situation sensor information (drone status, date, time, weather, and disaster management site sensor measurements), and, for each measurement point, the similarity b (third similarity) and threshold b (fourth threshold) obtained using the object information and sensor information.
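The database contents described above can be pictured with simple record types. This is a sketch only; the field names are assumptions made for illustration and are not defined in the patent.

```python
from dataclasses import dataclass

@dataclass
class NormalImageRecord:
    """One entry of the normal situation image DB 510 (a shooting point on the normal flight path)."""
    object_position: tuple[float, float]        # position of the observed object
    object_outline: list[tuple[float, float]]   # outline of the observed object
    date_time: str
    weather: str
    image_path: str                             # reference image of the disaster management site
    threshold1: float                           # first threshold (normal vs. abnormal)
    threshold2: float                           # second threshold (abnormal vs. emergency)

@dataclass
class ProximityImageRecord:
    """One entry of the proximity normal situation image DB 521 (abnormal flight path)."""
    object_position: tuple[float, float]
    object_outline: list[tuple[float, float]]
    image_path: str
    threshold_a: float                          # third threshold, compared with similarity a

@dataclass
class ProximitySensorRecord:
    """One entry of the proximity normal situation sensor DB 522 (abnormal flight path)."""
    object_position: tuple[float, float]
    sensor_values: dict[str, float]             # e.g. {"flame": ..., "smoke": ..., "gas": ..., "heat": ...}
    threshold_b: float                          # fourth threshold, compared with similarity b
```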

FIG. 5 shows a drone operation method to which the normality determination algorithm is applied.

As shown in FIG. 5, the drone 110 flies along the normal flight path (S502).

The drone 110 captures the first image of the disaster management site while flying along the normal flight path (S504) and transmits the captured image to the drone management server 120.

The drone 110 then calculates the probabilistic similarity (first similarity) between the captured first image and the steady state standard image and compares it with threshold 1 to determine whether it exceeds threshold 1 (S506). That is, the drone 110 compares the shapes of the buildings, structures, and devices extracted by applying the image processing algorithm to the first image observed while flying the normal flight path with the normal situation images stored in the database. The drone 110 computes the probabilistic similarity (first similarity) even though images of the same object differ according to date, time, weather, temperature, wind direction, and wind speed. The similarity judgement uses threshold 1 to distinguish a normal situation from an abnormal situation and threshold 2 to distinguish an abnormal situation from an emergency situation.
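A minimal sketch of this step is shown below. The patent does not fix a specific similarity metric, so histogram correlation (via OpenCV) is used here purely as one possible proxy for the probabilistic first similarity; the function names and the two-threshold decision wrapper are assumptions for illustration.

```python
import cv2
import numpy as np

def first_similarity(captured_bgr: np.ndarray, standard_bgr: np.ndarray) -> float:
    """Similarity in [0, 1] between a captured frame and a steady-state
    standard image (one possible proxy; not the patent's exact metric)."""
    captured = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    standard = cv2.cvtColor(standard_bgr, cv2.COLOR_BGR2GRAY)
    standard = cv2.resize(standard, (captured.shape[1], captured.shape[0]))
    h1 = cv2.calcHist([captured], [0], None, [64], [0, 256])
    h2 = cv2.calcHist([standard], [0], None, [64], [0, 256])
    cv2.normalize(h1, h1)
    cv2.normalize(h2, h2)
    corr = cv2.compareHist(h1, h2, cv2.HISTCMP_CORREL)   # correlation in [-1, 1]
    return float(max(0.0, min(1.0, (corr + 1.0) / 2.0)))  # map to [0, 1]

def classify_first_image(sim1: float, threshold1: float, threshold2: float) -> str:
    """Two-threshold decision of S506/S508: normal if sim1 > threshold 1,
    abnormal if threshold 2 < sim1 <= threshold 1, emergency otherwise."""
    if sim1 > threshold1:
        return "normal"        # Mode 1: stay on the normal flight path
    if sim1 > threshold2:
        return "abnormal"      # Mode 2: switch to the abnormal flight path
    return "emergency"         # Mode 3: switch to the emergency flight path
```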

If the probabilistic similarity (first similarity) exceeds threshold 1, the drone 110 determines that the situation is normal and repeats the procedure from step S502, flying the normal flight path. That is, when the situation is determined to be normal, the drone 110 returns to the normal flight path and performs normal observation. The drone 110 may use the collected images to update the steady state standard image database.

If the probabilistic similarity (first similarity) does not exceed threshold 1, the drone 110 checks whether the calculated probabilistic similarity (first similarity) exceeds threshold 2 (S508).

If the probabilistic similarity exceeds threshold 2, the drone 110 changes its flight path to the abnormal flight path (S510). The drone 110 then captures a second image of the disaster management site while flying along the abnormal flight path and measures the first sensor values through the anomaly detection sensors (S512). In other words, when the similarity is low but the type of accident is ambiguous (first similarity > threshold 2), the drone 110 moves onto the abnormal flight path and collects the second image through its onboard camera together with measurements from the anomaly detection sensors (for example, a flame detection sensor, a smoke detection sensor, a gas detection sensor, and a heat detection sensor). The drone 110 transmits the captured second image and the measured first sensor values to the drone management server 120.

On the other hand, if the probabilistic similarity (first similarity) does not exceed threshold 2, the drone 110 proceeds to step S516, in which the flight path is changed to the emergency flight path. That is, when the similarity does not exceed threshold 2, the drone 110 can enlarge the region showing a large difference, or convert it into a partial image, and compare it with existing images of fire, explosion, and leakage accidents to recognize the accident type.

After step S512, the drone 110 uses the second image taken along the abnormal flight path and the first sensor values to determine whether similarity a is less than threshold a or similarity b is less than threshold b (S514). Here, similarity a is the image similarity between the second image taken along the abnormal flight path and the proximity normal situation image stored in the proximity normal situation image DB 521, and similarity b is the sensor value similarity between the first sensor values measured along the abnormal flight path and the proximity normal situation sensor values stored in the proximity normal situation sensor DB 522.
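A minimal sketch of the S514 decision, assuming similarity a and similarity b have already been computed against the proximity DBs 521 and 522; the function name is hypothetical.

```python
def second_stage_decision(sim_a: float, threshold_a: float,
                          sim_b: float, threshold_b: float) -> str:
    """S514: emergency if either the image similarity a or the sensor value
    similarity b falls below its threshold; otherwise return to normal."""
    if sim_a < threshold_a or sim_b < threshold_b:
        return "emergency"   # S516: change to the emergency flight path
    return "normal"          # resume the normal flight path from S502
```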

If similarity a is less than threshold a or similarity b is less than threshold b, the drone 110 changes its flight path to the emergency flight path and flies it (S516).

The drone 110 then captures a third image of the disaster management site while flying along the emergency flight path and measures the second sensor values through the anomaly detection sensors (S518). In other words, when the type of accident is clearly identified (similarity a < threshold a or similarity b < threshold b), the drone 110 enters the emergency flight path and collects measurements through the flame detection sensor, smoke detection sensor, gas detection sensor, and heat detection sensor mounted on the drone 110. The drone 110 transmits the captured third image and the measured second sensor values to the situation room server 130 and the drone management server 120.

On the other hand, if similarity a is not less than threshold a and similarity b is not less than threshold b, the drone 110 determines that the situation is normal and resumes the normal flight path from step S502.
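Putting steps S502 through S518 together, one way to sketch the mode-switching loop is shown below. This is an assumption, not the patent's code: it reuses FlightPath, first_similarity, and second_stage_decision from the earlier sketches, and the drone and db objects (with their capture, sensor, transmission, and threshold members) are hypothetical wrappers.

```python
def patrol_once(drone, db) -> None:
    """One pass of the normality determination flow (S502-S518)."""
    first_image = drone.capture()                          # S504, on the normal flight path
    drone.send_to_management_server(first_image)
    sim1 = first_similarity(first_image, db.standard_image)

    if sim1 > db.threshold1:                               # S506: normal situation
        db.update_standard_image(first_image)              # optional DB update
        return

    if sim1 > db.threshold2:                               # S508: abnormal situation
        drone.fly(FlightPath.ABNORMAL)                     # S510
        second_image, sensors = drone.capture(), drone.read_sensors()  # S512
        drone.send_to_management_server(second_image, sensors)
        sim_a = first_similarity(second_image, db.proximity_image)
        sim_b = db.sensor_similarity(sensors)
        if second_stage_decision(sim_a, db.threshold_a, sim_b, db.threshold_b) == "normal":
            drone.fly(FlightPath.NORMAL)                   # return to normal observation
            return

    # Emergency: either sim1 <= threshold 2 at S508, or S514 flagged an emergency.
    drone.fly(FlightPath.EMERGENCY)                        # S516
    third_image, sensors2 = drone.capture(), drone.read_sensors()      # S518
    drone.send_to_situation_room(third_image, sensors2)
    drone.send_to_management_server(third_image, sensors2)
```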

It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from its spirit or essential characteristics. Therefore, the embodiments disclosed herein are intended to illustrate rather than limit the technical idea of the present invention, and the scope of that technical idea is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the scope of their equivalents should be construed as falling within the scope of the present invention.

100: drone operating system
101, 102, ..., 10N: disaster management site 1, disaster management site 2, ..., disaster management site N
110: drones
111, 112, ..., 11N: drone 1, drone 2, ..., drone N
120: drone management servers
121, 122, ..., 12N: drone management server 1, drone management server 2, ..., drone management server N
130: situation room server
301: industrial complex
310: plan view
311, 312: maximum proximity point 1, maximum proximity point 2
401, 402, 403: normal flight path, abnormal flight path, emergency flight path

Claims (12)

1. A method for operating a drone for industrial disaster response, comprising:
a first mode operation step in which a drone, while flying along a normal flight path, transmits a first image captured at each disaster management site to a drone management server, and the first image is analyzed to determine whether each disaster management site is in an abnormal situation or an emergency situation;
a second mode operation step in which, if an abnormal situation is determined in the first mode operation step, the drone changes its flight path to an abnormal flight path including a proximity point, transmits a second image captured there and measured first sensor values to the drone management server, and the second image and first sensor values are analyzed to determine whether the situation is normal or an emergency;
a third mode operation step in which, if an emergency is determined in the first mode operation step or the second mode operation step, the drone changes its flight path to an emergency flight path including a maximum proximity point and transmits a third image captured there and measured second sensor values to the drone management server and the situation room server; and
a step of constructing a steady state standard image database by image-processing the images collected while the drone flies the normal flight path, extracting steady state image information using the position and outline information of the management objects at each disaster management site, and adding at least one of the date, time, weather, temperature, wind direction, wind speed, and location information of the drone.
2. The method according to claim 1, further comprising a step in which the drone management server, which has control of the drone in the first mode operation step and the second mode operation step, transfers control of the drone to the situation room server when the first mode operation step or the second mode operation step changes to the third mode operation step.
3. (Deleted)
4. The method according to claim 1, wherein in the first mode operation step a first similarity, which is a probabilistic similarity between the position and shape of the management object extracted by image-processing the first image and the steady state standard image stored in the steady state standard image database, is calculated, and the situation of the management site is judged by comparing the calculated first similarity with first and second threshold values.
5. The method of claim 4, wherein in the first mode operation step, when the calculated first similarity does not exceed the first threshold but exceeds the second threshold, the flight path is changed to the abnormal flight path to capture the second image and measure the first sensor values, and when the calculated first similarity exceeds neither the first threshold nor the second threshold, the flight path is changed to the emergency flight path to capture the third image and measure the second sensor values.
6. The method of claim 4, wherein in the second mode operation step, when the calculated first similarity does not exceed the first threshold but exceeds the second threshold, a second similarity (an image similarity) and a third similarity (a sensor value similarity) are calculated based on the second image captured along the abnormal flight path and the measured first sensor values, the calculated second similarity is compared with a third threshold or the calculated third similarity is compared with a fourth threshold to determine whether the management site is in an emergency situation or a normal situation, and if an emergency is determined the flight path is changed to the emergency flight path, while if a normal situation is determined the drone returns to the normal flight path.
7. A drone operating system for industrial disaster response, comprising:
a drone that transmits a first image captured at each disaster management site while flying along a normal flight path, determines from the first image whether each disaster management site is in an abnormal situation or an emergency situation, changes its flight path to an abnormal flight path including a proximity point and transmits a second image and measured first sensor values when an abnormal situation is determined, determines from the second image and first sensor values whether the situation is normal or an emergency, and changes its flight path to an emergency flight path including a maximum proximity point and transmits a third image and measured second sensor values when an emergency situation is determined;
a drone management server that manages the drone and receives the first to third images captured by the drone along the respective flight paths and the first and second sensor values; and
a situation room server that manages each disaster management site and receives the third image and second sensor values captured by the drone along the emergency flight path,
wherein the drone image-processes the images captured while flying the normal flight path, extracts steady state image information using the position and outline information of the management objects at each disaster management site, and constructs a steady state standard image database by adding at least one of the date, time, weather, temperature, wind direction, wind speed, and location information of the drone.
8. The system of claim 7, wherein the drone management server has control of the drone in the normal situation and the abnormal situation and transfers control of the drone to the situation room server when the normal situation or the abnormal situation changes to an emergency situation.
9. (Deleted)
10. The system of claim 7, wherein the drone calculates a first similarity, which is a probabilistic similarity between the position and shape of the management object extracted by image-processing the first image and the steady state standard image stored in the steady state standard image database, and judges the situation of the management site by comparing the calculated first similarity with the first and second threshold values.
11. The system of claim 10, wherein, when the calculated first similarity does not exceed the first threshold but exceeds the second threshold, the drone changes its flight path to the abnormal flight path to capture the second image and measure the first sensor values, and when the calculated first similarity exceeds neither the first threshold nor the second threshold, the drone changes its flight path to the emergency flight path to capture the third image and measure the second sensor values.
12. The system of claim 10, wherein, when the calculated first similarity does not exceed the first threshold but exceeds the second threshold, the drone calculates a second similarity (an image similarity) and a third similarity (a sensor value similarity) based on the second image captured along the abnormal flight path and the measured first sensor values, compares the calculated second similarity with a third threshold or the calculated third similarity with a fourth threshold to determine whether the management site is in an emergency situation or a normal situation, changes its flight path to the emergency flight path if an emergency is determined, and returns to the normal flight path if a normal situation is determined.
KR1020150165797A 2015-11-25 2015-11-25 Operating method and system for drones to prevent industrial disasters KR101788411B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150165797A KR101788411B1 (en) 2015-11-25 2015-11-25 Operating method and system for drones to prevent industrial disasters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150165797A KR101788411B1 (en) 2015-11-25 2015-11-25 Operating method and system for drones to prevent industrial disasters

Publications (2)

Publication Number Publication Date
KR20170060973A KR20170060973A (en) 2017-06-02
KR101788411B1 true KR101788411B1 (en) 2017-10-19

Family

ID=59222221

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150165797A KR101788411B1 (en) 2015-11-25 2015-11-25 Operating method and system for drones to prevent industrial disasters

Country Status (1)

Country Link
KR (1) KR101788411B1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101949525B1 (en) * 2017-11-16 2019-03-18 주식회사 아이오티봇 Safety management system using unmanned detector
US10676216B2 (en) * 2018-04-25 2020-06-09 International Business Machines Corporation Non-intrusive unmanned entity inspection
US10629009B2 (en) 2018-04-25 2020-04-21 International Business Machines Corporation Non-intrusive unmanned entity inspection
CN108762306A (en) * 2018-06-22 2018-11-06 深圳市科比特航空科技有限公司 A kind of sensor-based system of unmanned plane and unmanned plane
KR102202395B1 (en) * 2020-10-12 2021-01-12 심정은 Walking Safety System of Industries Site
CN112464859A (en) * 2020-12-10 2021-03-09 交控科技股份有限公司 Platform guiding system and method
KR102437407B1 (en) * 2021-10-18 2022-08-30 (주)레인보우테크 Tunnel maintaining method and system using drone image data

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013051300A1 (en) * 2011-10-03 2013-04-11 Hanabata Mitsuaki Disaster circumstance ascertainment system
JP2013134663A (en) * 2011-12-27 2013-07-08 Mitsubishi Heavy Ind Ltd System and method for supporting disaster relief activities

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101895366B1 (en) 2018-03-29 2018-09-05 이호형 the improved hybrid drone
KR20190119330A (en) * 2018-04-12 2019-10-22 황순신 Festival unmanned management system using drones
KR102074448B1 (en) * 2018-04-12 2020-02-06 황순신 Festival unmanned management system using drones

Also Published As

Publication number Publication date
KR20170060973A (en) 2017-06-02

Similar Documents

Publication Publication Date Title
KR101788411B1 (en) Operating method and system for drones to prevent industrial disasters
US10643444B2 (en) Facility management system using Internet of things (IoT) based sensor and unmanned aerial vehicle (UAV), and method for the same
Yuan et al. A survey on technologies for automatic forest fire monitoring, detection, and fighting using unmanned aerial vehicles and remote sensing techniques
US11253736B2 (en) Dispatching UAVs for wildfire surveillance
KR20170101519A (en) Apparatus and method for disaster monitoring using unmanned aerial vehicle
US20170217588A1 (en) Methods and systems for assessing an emergency situation
KR20170101516A (en) Apparatus and method for fire monitoring using unmanned aerial vehicle
Alharam et al. Real time AI-based pipeline inspection using drone for oil and gas industries in Bahrain
CN105243627A (en) Fire-fighting monitoring system based on unmanned aerial vehicle
Sherstjuk et al. Forest fire monitoring system based on UAV team, remote sensing, and image processing
CA3022200C (en) System and method for visualizing and validating process events
CN111311865A (en) Forest fire prevention unmanned aerial vehicle platform based on carry on thermal imager
Deng et al. Research on application of fire unmanned aerial vehicles in emergency rescue
Jeon et al. A real-time drone mapping platform for marine surveillance
Mehta et al. Internet-of-things enabled forest fire detection system
Ramana Karumanchi et al. Fully Smart fire detection and prevention in the authorized forests
Novac et al. A framework for wildfire inspection using deep convolutional neural networks
Ashour et al. Applications of UAVs in search and rescue
Yebra et al. An integrated system to protect Australia from catastrophic bushfires
CN108280895A (en) A kind of explosion-proof crusing robot of petrochemical industry and explosion protection system
Brown et al. Surveillance for intelligent emergency response robotic aircraft (SIERRA)-VTOL aircraft for emergency response
Patterson et al. Integration of terrain image sensing with UAV safety management protocols
Hinterhofer et al. UAV-based LiDAR and gamma probe with real-time data processing and downlink for survey of nuclear disaster locations
Wang et al. Application of mini-UAV in emergency rescue of major accidents of hazardous chemicals
Zheng et al. Design and research of forest farm fire drone monitoring system based on deep learning

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right