KR101752671B1 - Emergency rescue system using unmanned aerial vehicle and method thereof - Google Patents

Emergency rescue system using unmanned aerial vehicle and method thereof Download PDF

Info

Publication number
KR101752671B1
Authority
KR
South Korea
Prior art keywords
information
mobile terminal
unmanned aerial
image
aerial vehicle
Prior art date
Application number
KR1020150135980A
Other languages
Korean (ko)
Other versions
KR20170037696A (en)
Inventor
전석기
소준영
Original Assignee
주식회사 아이티스테이션
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 아이티스테이션
Priority to KR1020150135980A
Publication of KR20170037696A
Application granted
Publication of KR101752671B1

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/10 Alarms for ensuring the safety of persons responsive to calamitous events, e.g. tornados or earthquakes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B64C39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B19/00 Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/0202 Child monitoring systems using a transmitter-receiver system carried by the parent and the child
    • G08B21/0269 System arrangements wherein the object is to detect the exact location of child or item using a navigation satellite system, e.g. GPS
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/14 Central alarm receiver or annunciator arrangements
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B27/00 Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
    • B64C2201/127
    • B64C2201/145

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Child & Adolescent Psychology (AREA)
  • Environmental & Geological Engineering (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Navigation (AREA)
  • Alarm Systems (AREA)

Abstract

An emergency response system and method using an unmanned aerial vehicle are provided. The emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention includes: an unmanned aerial vehicle that acquires image information while flying along a flight path generated based on received destination information, analyzes the image information, and calculates a traveling route to the destination; and a mobile terminal that receives the traveling route and the image information from the unmanned aerial vehicle.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an emergency response system and method using an unmanned aerial vehicle, and more particularly, to an emergency response system and method using an unmanned aerial vehicle capable of emergency response using an unmanned aerial vehicle in the event of an accident.

The government provides emergency response services using emergency telephone numbers such as 112 and 119 to safely rescue people from various emergency situations such as sudden medical accidents, crime, fire and disaster and to reduce property loss.

In such an emergency response service, it is important to arrive at the site as soon as possible in order to rescue people quickly and safely and to reduce the loss of property. Also, after arriving at the site, it is important to collect information on the site, such as the location of the people who need rescue and the structure of the building, and to take measures accordingly.

However, it is often difficult for the incident response team to arrive at the site quickly due to narrow alleys, traffic congestion, and the like, and it may be difficult to collect accurate site information and establish appropriate response measures due to the complexity of the site, such as the complex structure of high-rise buildings.

Korea Patent Publication No. 2015-0033241 (published on April 21, 2015)

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and it is an object of the present invention to provide an emergency response system and method that, in the event of an accident, use an unmanned aerial vehicle to arrive rapidly at the accident site and to collect accurate site information.

The present invention also provides a computer-readable recording medium storing a program for causing a computer to execute an emergency response method using the above-mentioned unmanned aerial vehicle.

The problems to be solved by the present invention are not limited to those mentioned above, and other problems not mentioned will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided an emergency response system using an unmanned aerial vehicle, including: an unmanned aerial vehicle that acquires image information while flying along a flight path generated based on received destination information, and analyzes the image information to calculate a traveling route to a destination; and a mobile terminal that receives the traveling route and the image information from the unmanned aerial vehicle.

The unmanned aerial vehicle further includes: a flight communication unit for communicating with the mobile terminal and receiving the destination information; a GPS receiver for receiving GPS information; and a flight control unit for generating the flight path based on the destination information and controlling the flight along the flight path based on the GPS information.

In addition, the unmanned aerial vehicle includes: a photographing unit; and an image analyzing unit for analyzing an image from the photographing unit and detecting an event.

In addition, the flight control unit may optimize the traveling route in real time based on the event.

Also, the flight control unit may map the traveling route to GIS information and transmit it to the mobile terminal through the flight communication unit.

In addition, the image analyzer may set a region of interest in the image information of the destination, and may derive feature information of the detected event in the region of interest.

In addition, the flight control unit may calculate positional information on the event using the GIS information and the GPS information, and may transmit the positional information together with the feature information to the mobile terminal through the flight communication unit.

The image analyzing unit may detect at least one of brightness data, color data, and motion data of the image.

The image analyzing unit may detect the event by comparing at least one of the detected brightness data, color data, and motion data with the corresponding at least one of threshold brightness data, threshold color data, and threshold motion data.

Also, the mobile terminal includes: a terminal input unit for inputting command information for the unmanned aerial vehicle; a terminal communication unit for wirelessly communicating with the unmanned aerial vehicle; a terminal control unit for remotely controlling the unmanned aerial vehicle according to the command information; and a terminal output unit for outputting the traveling route and the image information transmitted from the unmanned aerial vehicle.

In addition, the mobile terminal further includes a terminal storage unit, which stores GIS information and may store various information transmitted from the unmanned aerial vehicle.

Further, the terminal control unit may display the traveling route together with the GIS information and output it to the terminal output unit.

In addition, the mobile terminal may further include an authentication unit that authenticates an authority to access the unmanned air vehicle.

In addition, there may be a plurality of mobile terminals, and the mobile terminal that is first authenticated by the authentication unit as having authority to connect to the unmanned aerial vehicle can remotely control the unmanned aerial vehicle.

In addition, there may be a plurality of mobile terminals, and each mobile terminal may further include a GPS receiver for receiving GPS information.

The unmanned aerial vehicle receives the GPS information of each mobile terminal and extracts an event by analyzing image information at the destination, and the control authority over the unmanned aerial vehicle may be changed from one mobile terminal to another based on the GPS information and the event.
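One plausible reading of this handover rule is that control passes to the terminal nearest the detected event. A minimal sketch under that assumption (terminal names and coordinates are invented; real GPS distances would use the haversine formula rather than this flat-plane approximation):

```python
import math

def nearest_terminal(terminals, event_pos):
    """Return the id of the mobile terminal closest to the detected event.

    terminals: dict mapping terminal id -> (lat, lon); event_pos: (lat, lon).
    Flat-plane distance is used purely for illustration.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return min(terminals, key=lambda t: dist(terminals[t], event_pos))

terminals = {"unit-A": (37.50, 127.03), "unit-B": (37.57, 126.98)}
print(nearest_terminal(terminals, (37.56, 126.99)))  # unit-B
```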

According to an aspect of the present invention, there is provided an emergency response method using an unmanned aerial vehicle, comprising: receiving destination information; generating a flight path based on the destination information; flying along the flight path; capturing and acquiring image information during the flight; analyzing the image information; calculating a traveling route to a destination based on the analyzed image information; and transmitting the traveling route to the mobile terminal.
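The claimed steps can be sketched as a simple pipeline. Every name below (`Mission`, `capture`, `analyze`, `plan_route`, `send`) is a hypothetical stand-in introduced for illustration, not something named in the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Mission:
    """Illustrative per-dispatch state; field names are not from the patent."""
    destination: tuple
    flight_path: list = field(default_factory=list)
    images: list = field(default_factory=list)
    traveling_route: list = field(default_factory=list)

def run_emergency_response(destination, capture, analyze, plan_route, send):
    """Execute the claimed steps in order: receive destination information,
    generate a flight path, fly and capture images, analyze them, calculate
    a traveling route, and transmit it to the mobile terminal."""
    m = Mission(destination=destination)
    m.flight_path = [destination]                 # trivial path: fly straight in
    for waypoint in m.flight_path:                # fly along the path
        m.images.append(capture(waypoint))        # capture image information
    analysis = [analyze(img) for img in m.images]
    m.traveling_route = plan_route(analysis)      # route for the ground team
    send(m.traveling_route)                       # transmit to mobile terminal
    return m

# Usage with stub callbacks standing in for camera, analyzer, planner, radio:
sent = []
mission = run_emergency_response(
    destination=(37.56, 126.99),
    capture=lambda wp: f"frame@{wp}",
    analyze=lambda img: {"clear": True},
    plan_route=lambda analysis: ["station", "scene"],
    send=sent.append,
)
print(sent)  # [['station', 'scene']]
```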

The method may further include: analyzing the image information at the destination; extracting an event from the image information; and transmitting the image information and the event to the mobile terminal.

The method may further include: deriving feature information from the event; calculating location information on the event using GIS information and GPS information; and transmitting the location information together with the feature information to the mobile terminal.

According to another aspect of the present invention, there is provided a computer-readable recording medium storing a program for causing a computer to execute an emergency response method using an unmanned aerial vehicle according to an embodiment of the present invention.

Other specific details of the invention are included in the detailed description and drawings.

According to the present invention, when an accident occurs, a team can quickly arrive at an accident site using an unmanned aerial vehicle.

Also, in case of an accident, the team can use the unmanned aerial vehicle to collect correct information at the accident site and take appropriate countermeasures.

In addition, by responding to accidents accurately and quickly, it is possible to provide an emergency response service capable of safely rescuing persons from various emergency situations and reducing property loss.

FIG. 1 is a conceptual diagram of an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 2 is a block diagram of the unmanned aerial vehicle of FIG. 1.
FIG. 3 is a block diagram of the mobile terminal of FIG. 1.
FIG. 4 is a diagram illustrating a flight path of an unmanned aerial vehicle in an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 5 is a view showing a traveling route of an emergency response vehicle using a mobile terminal in an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 6 is a view illustrating a traveling route displayed on a mobile terminal in an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 7 is a diagram for explaining an example of image analysis.
FIG. 8 is a diagram showing an example of the feature information derived from the ROI in FIG. 7.
FIG. 9 is a signal flow diagram of an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 10 is a conceptual diagram of an emergency response system using an unmanned aerial vehicle according to another embodiment of the present invention.
FIG. 11 is a signal flow diagram of an emergency response system using an unmanned aerial vehicle according to another embodiment of the present invention.
FIG. 12 is a flowchart of an emergency response method using an unmanned aerial vehicle according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent with reference to the embodiments described in detail below together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Although the terms first, second, etc. are used to describe various elements, components, and/or sections, these elements, components, and/or sections are of course not limited by these terms. These terms are used only to distinguish one element, component, or section from another. Therefore, a first element, component, or section mentioned below could also be termed a second element, component, or section within the technical spirit of the present invention.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In the present specification, the singular form includes the plural form unless otherwise specified. As used herein, the terms "comprises" and/or "comprising" specify the presence of stated components, steps, operations, and/or elements, but do not preclude the presence or addition of one or more other components, steps, operations, and/or elements.

Unless defined otherwise, all terms (including technical and scientific terms) used herein have the meaning commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used terms defined in dictionaries are not to be interpreted ideally or excessively unless explicitly defined otherwise.

In this regard, throughout the specification, like reference numerals refer to like elements, and it will be understood that each block of the processing flowcharts, and combinations of blocks in the flowcharts, can be implemented by computer program instructions. These computer program instructions may be loaded into a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, so that the instructions, executed through the processor of the computer or other programmable data processing apparatus, create means for performing the functions described in the flowchart blocks.

It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of order. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functions involved.

Hereinafter, the present invention will be described in more detail with reference to the accompanying drawings.

FIG. 1 is a conceptual diagram of an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention. FIG. 2 is a block diagram of the unmanned aerial vehicle of FIG. 1. FIG. 3 is a block diagram of the mobile terminal of FIG. 1.

Referring to FIGS. 1 to 3, an emergency response system 10 using an unmanned aerial vehicle according to an embodiment of the present invention may include an unmanned aerial vehicle 100 and a mobile terminal 200. Here, there may be a plurality of unmanned aerial vehicles 100 and a plurality of mobile terminals 200. Specifically, through the emergency response system 10, the unmanned aerial vehicle 100 collects information ahead of the emergency response team, which must be dispatched to the scene using a fire truck, an ambulance, a police car, or the like, and can quickly and accurately provide that data to the mobile terminal 200 of the emergency response team.

The unmanned aerial vehicle (UAV) 100 refers to an aircraft that carries no human pilot and can be managed by an agency that performs emergency response. The unmanned aerial vehicle 100 acquires image information while flying along a flight path generated based on received destination information, calculates a traveling route to the destination by analyzing the image information, and transmits the information to the mobile terminal 200. Here, the destination information is generated from the contents reported to the agency performing the emergency response (for example, a police station or a fire station) and includes information on the location of the scene where the accident occurred. The destination information is transmitted from a server (not shown) of the agency or from the mobile terminal 200 to the unmanned aerial vehicle 100, and the unmanned aerial vehicle 100 automatically flies to the accident site according to the destination information. The unmanned aerial vehicle 100 may be controlled by the server of the agency performing the emergency response or by the mobile terminal 200. In addition, the unmanned aerial vehicle 100 may modify its own flight path using a captured image signal and a GPS signal transmitted from the outside.

Referring to FIG. 2, the unmanned aerial vehicle 100 includes a flight communication unit 110, a GPS receiving unit 120, a flight control unit 130, a photographing unit 140, an image analysis unit 150, a permission unit 160, a flight object storage unit 170, and the like.

The flight communication unit 110 communicates with the mobile terminal 200 and, in particular, receives destination information from a server (not shown) of the emergency response agency or from the mobile terminal 200. The flight communication unit 110 communicates with the server of the emergency response agency or the mobile terminal 200 through the network 50. That is, the flight communication unit 110 may be connected through the network 50 to the emergency response organization that manages the unmanned aerial vehicle 100.

The GPS receiving unit 120 receives GPS information. The unmanned aerial vehicle 100 can fly along the flight path using the GPS information received by the GPS receiving unit 120, and the server of the emergency response agency or the mobile terminal 200 can check the location of the unmanned aerial vehicle 100.

The flight control unit 130 controls the unmanned aerial vehicle 100 and its components. Specifically, the flight control unit 130 generates a flight path based on the destination information, and can control the unmanned aerial vehicle 100 to fly along the flight path based on the GPS information. For example, the flight control unit 130 can set the flight path to the shortest distance to the accident site according to the destination information and fly along that path. At this time, referring to the GPS information, the flight control unit 130 controls the unmanned aerial vehicle 100 so that it flies to the destination at a proper altitude without leaving the flight path.

In addition, the flight control unit 130 analyzes the image information and calculates the traveling route to the destination. For example, the flight control unit 130 calculates a traveling route that can reach the destination in the shortest time based on the analysis of the images captured during flight, and transmits the traveling route to the mobile terminal 200 through the flight communication unit 110. Accordingly, the emergency response team using the mobile terminal 200 can be helped to reach the accident site in the shortest time. In particular, the flight control unit 130 can optimize the traveling route in real time based on the image analysis result. For example, when a traffic accident appears in the captured image, the flight control unit 130 may calculate a traveling route that bypasses the road where the traffic accident occurred and transmit the calculated traveling route to the mobile terminal 200. Preferably, the flight control unit 130 may map the traveling route to GIS information and transmit it to the mobile terminal 200 through the flight communication unit 110. A specific embodiment will be described later.
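The real-time optimization described above amounts to re-running a shortest-path search over the road network whenever image analysis flags a blocked segment. The patent does not specify an algorithm; the sketch below uses Dijkstra's algorithm over a toy road graph whose node names and edge weights are purely illustrative:

```python
import heapq

def shortest_route(graph, start, goal, blocked=frozenset()):
    """Dijkstra's shortest path; road segments flagged as blocked are skipped.

    graph: dict mapping node -> list of (neighbor, travel_time).
    blocked: set of (node, neighbor) edges reported by image analysis.
    """
    dist, prev = {start: 0}, {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:                       # reconstruct and return path
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return path[::-1]
        if d > dist.get(node, float("inf")):
            continue                           # stale queue entry
        for nxt, w in graph.get(node, []):
            if (node, nxt) in blocked:
                continue                       # bypass the accident road
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(pq, (nd, nxt))
    return None

roads = {"station": [("A", 1), ("B", 4)], "A": [("scene", 1)], "B": [("scene", 1)]}
print(shortest_route(roads, "station", "scene"))                      # via A
print(shortest_route(roads, "station", "scene", {("station", "A")}))  # via B
```

When a new event arrives, the route is simply recomputed with the updated blocked set, which is the "real time" behavior the paragraph describes.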

The photographing unit 140 captures images from the unmanned aerial vehicle 100. The image captured by the photographing unit 140 may be transmitted to the mobile terminal 200 in real time, and may be transmitted to the mobile terminal 200 together with the analyzed image data after being processed by the image analysis unit 150. The photographing unit 140 may include a general camera, but may further include an infrared camera, a thermal camera, and the like so as to perform a monitoring function at night. In addition, the photographing unit 140 may include a color tracking module (not shown); with this function, an object having a specific color can be targeted for photographing or excluded from it, thereby reducing the amount of image information data. The image information obtained by the photographing unit 140 is transmitted to the image analysis unit 150.

The image analysis unit 150 processes the image information to acquire image data and converts the image into a compressed format to facilitate data transmission. The compressed video data may have various formats such as MPEG-1 (Moving Picture Experts Group) or MPEG-4. In particular, the image analysis unit 150 can detect an object by analyzing the image from the photographing unit 140, and can detect a specific event. Here, an object includes a specific thing such as a vehicle or a person, and an event includes any situation in which the image data can change, such as a change in the position of an object or a change in a specific situation such as a fire. At this time, the image analysis unit 150 may set a region of interest in the image information and derive the feature information of the detected event within the region of interest.

Specifically, the image analysis unit 150 can detect an object using feature extraction, which extracts visual feature information of the object to be detected from the image input from the photographing unit 140. For object detection there are learning-based methods using a learning machine such as AdaBoost or an SVM (Support Vector Machine), and non-learning methods using the vector similarity of the extracted features; the learning method and the non-learning method can be appropriately selected according to the complexity of the program. For example, a Haar-like feature, which uses a weighted sum of the differences between the sums of pixel values of two or more adjacent blocks as a local feature of the image, can be applied. To extract the difference between the sums of the pixel values of neighboring blocks when computing the Haar-like feature, a mask based on simple rectangular features is used.
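A two-rectangle Haar-like feature of the kind mentioned here can be computed in constant time from an integral image (summed-area table). The sketch below works on a toy image and is only an illustration of the technique, not the patent's implementation:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border: ii[y, x] = sum of img[:y, :x]."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def block_sum(ii, y, x, h, w):
    """Sum of pixel values in the h x w block at (y, x), in O(1)."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect(img, y, x, h, w):
    """Two-rectangle Haar-like feature: the difference between the pixel
    sums of two horizontally adjacent h x w blocks."""
    ii = integral_image(np.asarray(img, dtype=np.int64))
    return block_sum(ii, y, x, h, w) - block_sum(ii, y, x + w, h, w)

# Example: left half dark (0), right half bright (10) -> strongly negative
img = np.hstack([np.zeros((4, 4), int), np.full((4, 4), 10, int)])
print(haar_two_rect(img, 0, 0, 4, 4))  # 0 - 160 = -160
```

A cascade classifier such as AdaBoost would evaluate thousands of such features at different positions and scales over the same integral image.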

In addition, the image analysis unit 150 can detect a change in the position of an object in the captured image using an image recognition algorithm. For example, the motion of an object can be detected from an image using a mean shift algorithm or a particle filter algorithm. Of course, those skilled in the art will appreciate that other algorithms can also be used to detect the motion of an object.

Here, the mean shift algorithm enables high-speed tracking of a region of interest (ROI) based on the density distribution (feature points, color) in an image: color clusters are generated by iterative color-division calculation, and the motion of the object of interest can be extracted by determining its boundary based on the initially designated color region. The particle filter algorithm is a particle-based variant of the Kalman filter, which estimates the probability distribution of the current state variables using observed values and random state variables obtained from the modeled system equations.
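The core of the mean shift idea, stripped of the color-histogram machinery, is to move a window repeatedly to the mean of the samples inside it until it settles on a density peak. A minimal sketch over synthetic 2-D points (the bandwidth, data, and function name are invented for illustration):

```python
import numpy as np

def mean_shift(points, start, bandwidth=2.0, iters=20):
    """Shift a circular window toward the local density peak: at each step,
    move the window center to the mean of the points falling inside it."""
    center = np.asarray(start, dtype=float)
    for _ in range(iters):
        inside = points[np.linalg.norm(points - center, axis=1) < bandwidth]
        if len(inside) == 0:
            break
        new_center = inside.mean(axis=0)
        if np.linalg.norm(new_center - center) < 1e-3:  # converged
            break
        center = new_center
    return center

# Synthetic "object pixels" clustered around (10, 10); start the window nearby.
rng = np.random.default_rng(0)
cluster = rng.normal(loc=(10.0, 10.0), scale=0.5, size=(200, 2))
found = mean_shift(cluster, start=(9.0, 9.0))
print(found)  # close to (10, 10)
```

In actual color tracking, the "points" are replaced by a back-projected color histogram of the ROI, which is what makes the window follow the designated color region from frame to frame.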

The authority permitting unit 160 may allow an external server or the mobile terminal 200 to control the unmanned aerial vehicle 100. For example, the unmanned aerial vehicle 100 may be controlled by the server at the time it receives the dispatch signal, and while the unmanned aerial vehicle 100 is flying to or arriving at the accident site, the control authority may be changed to the mobile terminal 200. When there are a plurality of mobile terminals 200, the unmanned aerial vehicle 100 receives the GPS information of each mobile terminal and extracts an event by analyzing the video information at the destination, and the control authority of the mobile terminal 200 that remotely controls the unmanned aerial vehicle 100 may be changed to another mobile terminal based on the GPS information and the event.

The flight object storage unit 170 stores various information and data received from the outside, such as the images captured by the photographing unit 140 and the image data analyzed by the image analysis unit 150. In particular, GIS information may be received externally via the flight communication unit 110, but GIS information may also be stored in the flight object storage unit 170. The flight control unit 130 can map the traveling route to the GIS information stored in the flight object storage unit 170 and transmit it to the mobile terminal 200, so that an optimal traveling route can be provided to the mobile terminal 200 even when the GIS information cannot be accessed through the network 50.

The mobile terminal 200 can be connected to the unmanned aerial vehicle 100 through the network 50, receives various information such as the traveling route and events from the unmanned aerial vehicle 100, and can directly control the unmanned aerial vehicle 100. The mobile terminal 200 may have a web browser (Netscape, Internet Explorer, Chrome, etc.) capable of displaying the contents of web pages written in HTML, XML, and the like. The mobile terminal 200 may be a general mobile communication terminal capable of using a wired/wireless network service, such as a cellular phone, a PCS phone (Personal Communications Services phone), a synchronous/asynchronous IMT-2000 (International Mobile Telecommunication 2000) terminal, a Palm Personal Computer, a PDA (Personal Digital Assistant), a smart phone, or a WAP phone (Wireless Application Protocol phone), and may be an apparatus having an interface for a wireless LAN connection, such as an IEEE 802.11 wireless LAN network card. The mobile terminal 200 may also be an information communication device such as a computer or a notebook computer, in addition to a mobile communication terminal. That is, it may include any wired/wireless appliance or communication device that can be carried in a vehicle or directly by a person and can communicate with the unmanned aerial vehicle 100 via the network 50.

Referring to FIG. 3, the mobile terminal 200 includes a terminal input unit 210, a terminal communication unit 220, a terminal control unit 230, a terminal output unit 240, a terminal storage unit 250, an authentication unit 260, a GPS receiver 270, and the like.

The terminal input unit 210 is an input interface through which the user 21 of the mobile terminal 200 inputs and selects information, and can input command information for the unmanned aerial vehicle 100. The terminal input unit 210 may include a button, a wheel, a jog shuttle, and the like to receive commands from the user 21. In addition, if the display provides a touch screen function, the display may serve as the terminal input unit 210. The terminal input unit 210 may also be a separate device by itself; for example, a wireless remote controller may serve as the terminal input unit 210. That is, the user 21 can input information using the buttons and wheels provided on the wireless remote controller, and in this case can control the flight of the unmanned aerial vehicle 100 by operating the wireless remote controller. In addition, the terminal input unit 210 may further include a microphone or the like capable of receiving voice input.

The terminal communication unit 220 wirelessly communicates with the unmanned aerial vehicle 100. Specifically, the terminal communication unit 220 and the flight communication unit 110 are connected to each other through the network 50. The terminal communication unit 220 may also be connected to a server (not shown) of the emergency response agency and may exchange various information with the server. In addition, the terminal communication units 220 of different mobile terminals 200 may be connected to each other to transmit and receive various kinds of information.

The terminal control unit 230 can remotely control the unmanned aerial vehicle according to the command information input by the user 21 through the terminal input unit 210. In addition, the terminal control unit 230 controls the mobile terminal 200 and its constituent elements. Specifically, the terminal control unit 230 may display the traveling route generated and transmitted by the unmanned aerial vehicle 100 on the terminal output unit 240 to inform the user 21 of the traveling route. For example, the terminal control unit 230 may overlay the traveling route generated by the unmanned aerial vehicle 100 on the GIS information and output it to the terminal output unit 240. Of course, as described above, the GIS information may already be mapped to the traveling route in the unmanned aerial vehicle 100, and the terminal control unit 230 may simply output it to the terminal output unit 240.

The terminal output unit 240 serves to display and provide information to the user 21 and the like. The terminal output unit 240 may include audible output means, such as a warning sound or voice, or visual display means. For example, the terminal output unit 240 may include a speaker capable of outputting sound, and may include a display module such as an LCD monitor or an LED capable of displaying characters, symbols, and the like.

The terminal storage unit 250 stores various information and data received from the outside, and can receive and store the information and data generated by the unmanned aerial vehicle 100. In particular, the GIS information may be stored in the terminal storage unit 250. The terminal control unit 230 can map the traveling route transmitted from the unmanned aerial vehicle 100 onto the GIS information stored in the terminal storage unit 250 and display the result on the terminal output unit 240. Of course, as described above, the unmanned aerial vehicle 100 may itself map the traveling route onto the GIS information and transmit the result to the mobile terminal 200; in this case, the terminal storage unit 250 stores the information in which the traveling route has been mapped onto the GIS information.

The authentication unit 260 serves to authenticate the authority to access the unmanned aerial vehicle 100. For example, the authentication unit 260 authenticates the right to access the unmanned aerial vehicle 100 using an identifier or password of a unique authentication module of the user 21. The authentication unit 260 may use at least one authentication protocol among a text file authentication module (mod_auth), a Berkeley DB authentication module (mod_auth_db), a DBM authentication module (mod_auth_dbm), an anonymous authentication module (mod_auth_anon), a PostgreSQL authentication module, and the XNS authentication service, but the present invention is not limited thereto. Since the mobile terminal 200 can be connected to the unmanned aerial vehicle 100 only after it is authenticated by the authentication unit 260, information safety, confidentiality, and the like can be maintained.
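The gate that the authentication unit 260 places in front of the control link can be sketched as follows. This is a minimal illustration only: the credential store, the salting scheme, and the function names are assumptions, not part of the patent, and a real deployment would use one of the authentication modules named above.

```python
import hashlib

# Hypothetical credential store: user identifier -> salted password hash.
_CREDENTIALS = {
    "firefighter_21": hashlib.sha256(b"salt:rescue-pass").hexdigest(),
}

def authenticate(user_id: str, password: str) -> bool:
    """Return True only if the identifier/password pair is registered."""
    digest = hashlib.sha256(f"salt:{password}".encode()).hexdigest()
    return _CREDENTIALS.get(user_id) == digest

def connect_to_uav(user_id: str, password: str) -> str:
    # The terminal may open a control link only after authentication succeeds.
    if not authenticate(user_id, password):
        return "access denied"
    return "control link established"
```

The point of the design is simply that the connection attempt and the credential check are inseparable: no code path reaches the control link without passing `authenticate`.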

The GPS receiver 270 receives GPS information. The mobile terminal 200 can move along the traveling route using the GPS information received by the GPS receiver 270, and the server of the emergency response agency or the unmanned aerial vehicle 100 can track the location of the mobile terminal 200.

FIG. 4 is a diagram illustrating a flight path of an unmanned aerial vehicle in an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention. FIG. 5 is a view showing a traveling route of an emergency response vehicle using a mobile terminal in an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention. FIG. 6 is a diagram illustrating a traveling route displayed on a mobile terminal in an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention.

Referring to FIG. 4, when a fire report is received by a fire department or the like, the unmanned aerial vehicle 100 under the control of the fire department first flies to the fire scene D. That is, the destination information is transmitted to the unmanned aerial vehicle 100 when the report is received, and the unmanned aerial vehicle 100 generates a flight path based on the received destination information and flies along it.

In general, the unmanned aerial vehicle 100 generates and flies the shortest flight path P1, a straight-line flight to the fire scene D. However, the unmanned aerial vehicle 100 may also fly while modifying its own flight path in consideration of various conditions such as the flying distance, the flight time, the traffic situation, and the location of the mobile terminal 200.

Then, the unmanned aerial vehicle 100 captures images with the photographing unit 140 while flying along the flight path, and analyzes them in the image analyzing unit 150 to identify specific situations. The images photographed by the photographing unit 140 and/or the image information analyzed by the image analysis unit 150 are transmitted to the mobile terminal 200 through the flight communication unit 110.

At this time, in consideration of the speed of the network 50 and the like, only images determined to show a specific situation, such as images in which an event is detected by the image analysis unit 150, may be filtered by the flight control unit 130 and transmitted to the mobile terminal 200. For example, in FIG. 4, the unmanned aerial vehicle 100 may determine from the image that vehicles V are congested, and may transmit the road image to the mobile terminal 200 so that the fire truck 20 moving to the fire scene D can avoid the traffic jam, thereby providing accurate information to the user of the mobile terminal 200.
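The bandwidth-saving filter described above can be sketched in a few lines. The `Frame` structure and the `event_detected` flag are illustrative assumptions standing in for the output of the image analysis unit 150; only the selection logic itself reflects the text.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """Hypothetical analysis result attached to one captured frame."""
    frame_id: int
    event_detected: bool  # set by the image analysis stage (unit 150)

def filter_for_transmission(frames):
    """Forward only frames in which the analyzer flagged an event,
    sparing bandwidth on a slow network link (unit 130's filtering)."""
    return [f for f in frames if f.event_detected]

frames = [Frame(1, False), Frame(2, True), Frame(3, False), Frame(4, True)]
to_send = filter_for_transmission(frames)
# Only the flagged frames would be queued for the mobile terminal.
```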

Referring to FIG. 5, the UAV 100 analyzes images taken while flying along the flight path and calculates a travel route to the destination. The travel route of the fire truck 20 could also be provided by a navigation system mounted on the fire truck 20, but the flight control unit 130 can instead use the images photographed by the photographing unit 140 to determine the optimal travel route P2 in real time.

For example, in FIG. 5, even after the unmanned aerial vehicle 100 has analyzed a photographed image to calculate a traveling route, it can re-optimize the traveling route P2 in real time in consideration of fluctuating traffic conditions and the like. The optimized travel route P2 is transmitted in real time to the mobile terminal 200 via the network 50 and displayed on the mobile terminal. At this time, only the optimal traveling route P2 may be transmitted from the UAV 100 to the mobile terminal 200, or the optimal traveling route P2 may be mapped onto the GIS information and then transmitted.

Referring to FIG. 6, the optimal travel route P2 calculated by the unmanned aerial vehicle 100 up to the destination B34 is transmitted to the mobile terminal 200 and displayed on the map M. At this time, as described above, the map M can use the GIS information. The GIS information may be built into the mobile terminal 200, or GIS information onto which the optimal traveling route P2 has been mapped may be transmitted from the unmanned aerial vehicle 100. The traveling route P2 shown in FIG. 6 can change in real time, and the change can be reflected in real time on the mobile terminal 200.

FIG. 7 is a diagram for explaining an example of image analysis. FIG. 8 is a diagram showing an example of the feature information derived from the ROI in FIG. 7.

Referring to FIG. 7, the UAV 100 is automatically dispatched to an accident site according to the destination information, and the image captured by the photographing unit 140 above the destination is analyzed by the image analysis unit 150. At this time, the image analyzer 150 may set a region of interest (ROI) in the image information I of the destination and may detect an object O in the ROI. That is, the image analyzer 150 can detect an event by analyzing the image information I of the destination, and can set the ROI to track an object O such as a person. At this time, the flight control unit 130 may calculate position information on the event using the GIS information and the GPS information.

Specifically, the image analysis unit 150 may extract at least one of brightness data, color data, and motion data from the image, and may compare the extracted data with at least one of threshold brightness data, threshold color data, and threshold motion data to detect an event.

For example, when smoke occurs, or when a person or an object appears and the background darkens, the brightness data of the image may decrease. Using this characteristic, the image analyzer 150 can detect an event such as the generation of smoke or the appearance of an object when the brightness data crosses the preset brightness threshold.

In addition, if color data in the red and yellow ranges occupies more than a certain proportion of the image, it can be judged that flames are spreading. Using this characteristic, the image analyzer 150 can detect a fire event when the proportion of red and yellow data among the color data exceeds a predetermined threshold proportion.

Also, if motion is detected in the image, or if the degree of motion is greater than that of stored reference motion data, an event can be detected indicating that an object is moving or a flame is flickering.
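The three threshold comparisons above can be sketched as follows. The threshold values and the scalar statistics (mean brightness, red/yellow pixel fraction, changed-pixel fraction) are illustrative assumptions; the patent specifies only that extracted data is compared against preset thresholds.

```python
# Assumed per-frame statistics, already extracted by the analyzer.
BRIGHTNESS_THRESHOLD = 80.0  # mean brightness below this suggests smoke/occlusion
FIRE_COLOR_THRESHOLD = 0.30  # fraction of red/yellow pixels suggesting flames
MOTION_THRESHOLD = 0.15      # fraction of changed pixels suggesting movement

def detect_events(mean_brightness, fire_color_ratio, motion_ratio):
    """Compare extracted image statistics against preset thresholds
    and return the list of detected event types."""
    events = []
    if mean_brightness < BRIGHTNESS_THRESHOLD:
        events.append("smoke_or_object")   # darkening: smoke or a new object
    if fire_color_ratio > FIRE_COLOR_THRESHOLD:
        events.append("fire")              # red/yellow proportion too high
    if motion_ratio > MOTION_THRESHOLD:
        events.append("motion")            # movement beyond reference level
    return events
```

For instance, a dim frame dominated by red/yellow pixels would yield both a smoke and a fire event, which the flight control unit 130 could then forward to the mobile terminal.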

The event detected by the image analyzing unit 150 is transmitted to the mobile terminal 200 and used as basic data with which the emergency response team can quickly assess the situation, which helps in establishing quick response measures.

Referring to FIG. 8, the image analyzer 150 may extract the feature information of the object O detected in the region of interest (ROI). That is, the image analysis unit 150 may extract the feature information of the event detected in the ROI and transmit the feature information to the mobile terminal 200.

For example, the image analyzer 150 may extract the feature information of the object O detected in the ROI set in the image. After designating an initial search window at the initial position of the object O shown in the image information, a color probability distribution is calculated, the central pixel of the object is searched for inside and outside the search window, the pixel position is returned, and the search window size is reduced. By repeating these steps, the pixels belonging to the moving object O can be estimated, and the size of the object O can be estimated by converting the pixel count to actual scale using a correction table. Accordingly, in the case of a person, it is possible to estimate whether the person is an adult or a child in consideration of the size of the object O, and to estimate whether the person is male or female in consideration of features such as the hair of the object O.
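The iterative search-window procedure above resembles mean-shift style tracking, and can be sketched on a binary mask. Everything here is a simplification under stated assumptions: the mask (1 = pixel matching the object's color distribution), the shrinking square window, and the metres-per-pixel correction factor are all illustrative stand-ins for the color probability distribution and correction table in the text.

```python
def track_object(mask, start, window=5, min_window=1):
    """Repeatedly re-centre a square window on the centroid of matching
    pixels, shrinking the window each pass; return the final centre and
    the object's total pixel count (for size estimation)."""
    cy, cx = start
    while window >= min_window:
        ys, xs = [], []
        for y in range(max(0, cy - window), min(len(mask), cy + window + 1)):
            for x in range(max(0, cx - window), min(len(mask[0]), cx + window + 1)):
                if mask[y][x]:
                    ys.append(y)
                    xs.append(x)
        if not ys:
            break  # no matching pixels in view; keep last estimate
        cy, cx = sum(ys) // len(ys), sum(xs) // len(xs)
        window -= 1
    pixel_count = sum(v for row in mask for v in row)
    return (cy, cx), pixel_count

# Hypothetical correction-table entry: square metres per pixel at this altitude.
M2_PER_PIXEL = 0.25

mask = [[0] * 10 for _ in range(10)]
for y in range(4, 7):
    for x in range(5, 8):
        mask[y][x] = 1  # a 3x3 blob standing in for a detected person

centre, pixels = track_object(mask, start=(0, 0))
area_m2 = pixels * M2_PER_PIXEL  # actual-scale size estimate
```

The pixel count converted through the correction table is what would support the adult/child size estimate mentioned above.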

At this time, the flight control unit 130 may calculate position information on the event using the GIS information and the GPS information, and may transmit the position information together with the feature information to the mobile terminal 200 through the flight communication unit 110. Accordingly, the mobile terminal 200 can recognize the location of the object O along with its characteristics, thereby obtaining the information needed to secure a physical path for the rescue.

FIG. 9 is a signal flow diagram of an emergency response system using an unmanned aerial vehicle according to an embodiment of the present invention.

Referring to FIG. 9, in an emergency response system 10 using an unmanned aerial vehicle according to an embodiment of the present invention, the UAV 100 transmits the various information it acquires and generates to the mobile terminal 200, so that the user of the mobile terminal 200 can accurately and precisely collect on-site information and the response team can establish quick response measures. Here, it is preferable that the mobile terminal 200 can be connected to the UAV 100 only after authentication is performed (S210).

When the UAV 100 receives the destination information from the outside (S110), it automatically creates a flight path and then flies to the destination (S120). The unmanned aerial vehicle 100 not only takes pictures on the way to the destination while flying, but also continues to shoot the scene above the destination after arriving (S130). The images photographed by the unmanned aerial vehicle 100 can be transmitted to a mobile terminal 200 that has the authority to access the unmanned aerial vehicle 100; the unmanned aerial vehicle 100 may transmit all images or only filtered images. The transmitted images are displayed on the mobile terminal 200 (S220).

The images captured by the unmanned aerial vehicle 100 may be analyzed by the unmanned aerial vehicle 100 itself to acquire image data (S140), and the image data may be transmitted from the unmanned aerial vehicle 100 to the mobile terminal 200 and displayed together with the images on the mobile terminal 200 (S220). In particular, the unmanned aerial vehicle 100 can calculate an optimal traveling route to the destination in real time based on the analyzed images (S150), and the optimal traveling route is transmitted to the mobile terminal 200 and displayed (S230).

In addition, the unmanned aerial vehicle 100 can detect an event through image analysis (S160), and information about the event can be transmitted to the mobile terminal 200 and displayed (S240).

Then, the UAV 100 can calculate the location information of the event using the GIS information and the GPS information, and derive the feature information of the event detected in the ROI (S170). The location information and the feature information are transmitted from the unmanned aerial vehicle 100 to the mobile terminal 200 and may be displayed on the mobile terminal 200 (S250). Since the location of the event as well as its characteristics can be known together, the user of the mobile terminal 200 can be assisted in collecting accurate data.

FIG. 10 is a conceptual diagram of an emergency response system using an unmanned aerial vehicle according to another embodiment of the present invention.

Referring to FIG. 10, an emergency response system using an unmanned aerial vehicle according to another embodiment of the present invention may include a plurality of unmanned aerial vehicles 100 and a plurality of mobile terminals 200. In FIG. 10, only three unmanned aerial vehicles 100a, 100b, and 100c and two mobile terminals 200a and 200b are shown, but it should be apparent to those skilled in the art that the numbers may vary. The following description is made with reference to FIG. 10. Descriptions of the specific components of the respective unmanned aerial vehicles 100a, 100b, and 100c and the respective mobile terminals 200a and 200b are omitted here since they are as described above.

The unmanned aerial vehicle 100 includes a plurality of unmanned aerial vehicles 100a, 100b, and 100c, which move through the air to an accident site when an emergency occurs. At this time, each of the unmanned aerial vehicles 100a, 100b, and 100c can move in a straight line for the shortest response time, but can also move while correcting its flight path to the destination depending on the situation. Preferably, each of the unmanned aerial vehicles 100a, 100b, and 100c flies over a different flight path to calculate an optimal travel route.

The mobile terminal 200 includes a plurality of mobile terminals 200a and 200b, each of which may connect to at least one unmanned aerial vehicle 100. Each of the mobile terminals 200a and 200b may control the plurality of unmanned aerial vehicles 100a, 100b, and 100c, or may control only one unmanned aerial vehicle 100. In addition, each of the unmanned aerial vehicles 100a, 100b, and 100c may be controlled by one mobile terminal 200 or by a plurality of mobile terminals 200.

Preferably, one unmanned aerial vehicle 100 is controlled by one mobile terminal 200. For this purpose, among the plurality of mobile terminals 200, the mobile terminal 200 that is first authenticated by the authentication unit 260 as authorized to connect to the unmanned aerial vehicle 100 can remotely control the unmanned aerial vehicle 100. The unmanned aerial vehicle 100 receives GPS information from each mobile terminal 200 and extracts an event by analyzing image information at the destination, and on that basis the control authority may be changed from the mobile terminal 200 currently controlling the unmanned aerial vehicle to another mobile terminal 200. For example, even if the mobile terminal 200a was first authorized to connect to the unmanned aerial vehicle 100a, the unmanned aerial vehicle 100a can detect the locations of the mobile terminals 200a and 200b, and the control authority for remotely controlling the UAV 100a can be changed from the mobile terminal 200a to the mobile terminal 200b.
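One plausible handover rule consistent with the passage above is "grant control to the terminal nearest the detected event", using the terminals' reported GPS fixes. The coordinates, the terminal identifiers, and the distance-based rule itself are illustrative assumptions; the patent only says control authority may change based on GPS information and the extracted event.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def select_controller(event_pos, terminals):
    """terminals: {terminal_id: (lat, lon)}. Grant control authority to
    the terminal closest to the detected event's position."""
    return min(terminals, key=lambda t: haversine_m(*event_pos, *terminals[t]))

terminals = {
    "200a": (37.5660, 126.9790),  # hypothetical GPS fixes
    "200b": (37.5651, 126.9770),
}
event = (37.5650, 126.9769)       # detected injured person's position
new_controller = select_controller(event, terminals)
```

Here terminal 200b, being a few metres from the event, would receive the control right, matching the handover example in the text.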

FIG. 11 is a signal flow diagram of an emergency response system using an unmanned aerial vehicle according to another embodiment of the present invention.

Referring to FIG. 11, in an emergency response system using an unmanned aerial vehicle according to another embodiment of the present invention, the unmanned aerial vehicle 100c transmits the various information it acquires and generates to the mobile terminals 200a and 200b, so that the users of the terminals 200a and 200b can accurately and precisely collect on-site information and the response team can establish quick response measures.

In this case, each of the mobile terminals 200a and 200b must be authenticated before it can access the unmanned aerial vehicle 100 (S21 and S31), and the mobile terminal 200a that is authenticated first obtains the control right (S23).

In addition, the unmanned aerial vehicle 100c transmits images photographed during flight to each of the mobile terminals 200a and 200b, together with image data (e.g., location information of the event, feature information of the event, etc.) (S13).

During the flight of the unmanned aerial vehicle 100c, it can receive the GPS information of the mobile terminals 200a and 200b (S25 and S33), and the authority to control the unmanned aerial vehicle 100c can be maintained or changed on the basis of the events and images obtained by the unmanned aerial vehicle 100c (S17).

For example, suppose a fire breaks out in a plurality of buildings and an injured person is in one of them, while the user of the mobile terminal 200a that currently controls the unmanned aerial vehicle 100c has been deployed to another building. If the user of the mobile terminal 200b is close to the building where the injured person is, the unmanned aerial vehicle 100c may change the control right from the current mobile terminal 200a to the other mobile terminal 200b (S35). Then, the user of the mobile terminal 200b who has received the control right can collect more precise data by operating the unmanned aerial vehicle 100c and establish more practical response measures. Of course, the control authority over the unmanned aerial vehicle 100c may instead be transferred between the mobile terminals 200a and 200b themselves rather than by the unmanned aerial vehicle 100c.

FIG. 12 is a flowchart of an emergency response method using an unmanned aerial vehicle according to an embodiment of the present invention.

Referring to FIG. 12, in an emergency response method using an unmanned aerial vehicle according to an embodiment of the present invention, the unmanned aerial vehicle 100 receives destination information (S1), generates a flight path based on the destination information (S2), and then flies along the flight path (S3). The unmanned aerial vehicle 100 acquires image information (S4) and analyzes it (S5). A traveling route to the destination is calculated based on the analyzed image information (S6) and transmitted to the mobile terminal 200 (S7), so that the unmanned aerial vehicle 100, which reaches the site before the emergency response team using the mobile terminal 200, helps secure the route to the site.
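The S1–S7 sequence can be sketched as a small pipeline. Every function body here is a stand-in for the corresponding on-board module (the destination coordinates, road names, and congestion rule are all assumptions for illustration); only the ordering of the steps comes from the flowchart.

```python
def receive_destination():             # S1: destination info arrives
    return (37.5650, 126.9769)

def generate_flight_path(dest):        # S2: straight-line path in this sketch
    return ["takeoff", dest]

def fly_and_capture(path):             # S3 + S4: fly and gather frames
    return ["frame-1", "frame-2"]

def analyze(frames):                   # S5: image analysis result
    return {"congested_roads": ["A-road"]}

def calc_travel_route(analysis):       # S6: drop roads flagged as congested
    candidate_roads = ["A-road", "B-road"]
    return [r for r in candidate_roads
            if r not in analysis["congested_roads"]]

def transmit(route):                   # S7: send route to the mobile terminal
    return f"sent route {route} to mobile terminal"

dest = receive_destination()
frames = fly_and_capture(generate_flight_path(dest))
route = calc_travel_route(analyze(frames))
status = transmit(route)
```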

In addition, the UAV 100 analyzes image information at the destination, extracts an event from the image information, and transmits the image information and the event to the mobile terminal 200, so that various information can be collected and provided to the emergency response team using the mobile terminal 200.

In addition, the UAV 100 derives feature information from the event, calculates position information on the event using GIS information and GPS information, and transmits the position information to the mobile terminal together with the feature information, thereby providing the emergency response team with the analyzed information it needs.

Further, more accurate data can be obtained by operating the unmanned aerial vehicle 100 from the mobile terminal 200 in a dangerous place, such as a fire scene, that is very difficult for the emergency response team to access. The right to operate the unmanned aerial vehicle 100 may be transferred between the mobile terminals 200, or may be changed to another mobile terminal 200 by the unmanned aerial vehicle 100 itself after analyzing the information it has obtained.

Meanwhile, the emergency response method using an unmanned aerial vehicle according to an embodiment of the present invention can be implemented as a module combining software and hardware, and the embodiments of the present invention described above can be created as a program that can be executed on a computer and implemented on a general-purpose computer that runs the program from a computer-readable recording medium. The computer-readable recording medium may be a ROM, a floppy disk, a magnetic medium such as a hard disk, an optical medium such as a CD or a DVD, or a carrier wave such as a transmission over the Internet. In addition, the computer-readable recording medium may be distributed over network-connected computer systems so that computer-readable code is stored and executed in a distributed manner.

The components or parts used in the embodiments of the present invention may be implemented as software such as a task, a class, a subroutine, a process, an object, an execution thread, or a program, as hardware such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC), or as a combination of such software and hardware. The components or parts may be included in a computer-readable storage medium, or parts of them may be distributed across a plurality of computers.

While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive.

10: Emergency response system using unmanned aerial vehicle
100: unmanned vehicle
110: Flight communication unit 120: GPS receiver
130: Flight control unit 140: Photographing unit
150: Image analysis unit 160: Authorization unit
170: Flight storage unit
200: mobile terminal
210: terminal input unit 220: terminal communication unit
230: Terminal control unit 240: Terminal output unit
250: terminal storage unit 260: authentication unit
270: GPS receiver

Claims (10)

An unmanned aerial vehicle that acquires image information while flying according to a flight path generated based on the received destination information, and calculates a travel route to the destination by analyzing the image information; And
And a mobile terminal for receiving the traveling route and the image information from the unmanned air vehicle,
In the unmanned aerial vehicle,
A flight communication unit for communicating with the mobile terminal and receiving the destination information,
A GPS receiver for receiving GPS information,
A photographing unit for photographing and acquiring an image,
An image analyzing unit for analyzing images from the photographing unit to detect an event, detecting an event in images photographed during the flight to the destination along the flight path, and detecting an event in images photographed while flying above the destination, and
A flight control unit for generating the flight path based on the destination information, optimizing the traveling route in real time based on an event detected in an image photographed during the flight to the destination, and controlling the unmanned aerial vehicle to fly along the flight path based on the GPS information,
An authorization unit for allowing the mobile terminal to control the unmanned aerial vehicle,
And a flight object storage unit for storing GIS information,
The image analyzing unit sets a region of interest in an image photographed while flying over the destination, extracts feature information of the detected event in the region of interest,
Wherein the flight control unit maps the traveling route onto the GIS information and transmits it to the mobile terminal through the flight communication unit, calculates position information on the event detected in the ROI using the GIS information and the GPS information, and transmits the position information together with the feature information to the mobile terminal through the flight communication unit.
delete
delete
delete
delete
delete
The system according to claim 1,
The mobile terminal,
A terminal input unit for inputting command information for the unmanned aerial vehicle;
A terminal communication unit for wirelessly communicating with the unmanned aerial vehicle;
A terminal control unit for remotely controlling the unmanned aerial vehicle according to the command information; And
And a terminal output unit for outputting the traveling route and the image information transmitted from the unmanned aerial vehicle.
8. The system of claim 7,
The mobile terminal,
Further comprising an authentication unit for authenticating an authority to connect to the unmanned aerial vehicle.
9. The system of claim 8,
The mobile terminal includes a plurality of mobile terminals,
Wherein, among the plurality of mobile terminals, the mobile terminal that is first authorized by the authentication unit to access the unmanned aerial vehicle remotely controls the unmanned aerial vehicle.
delete
KR1020150135980A 2015-09-25 2015-09-25 Emergency rescue system using unmanned aerial vehicle and method thereof KR101752671B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150135980A KR101752671B1 (en) 2015-09-25 2015-09-25 Emergency rescue system using unmanned aerial vehicle and method thereof

Publications (2)

Publication Number Publication Date
KR20170037696A KR20170037696A (en) 2017-04-05
KR101752671B1 true KR101752671B1 (en) 2017-07-03

Family

ID=58586841

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150135980A KR101752671B1 (en) 2015-09-25 2015-09-25 Emergency rescue system using unmanned aerial vehicle and method thereof

Country Status (1)

Country Link
KR (1) KR101752671B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108711273A (en) * 2018-03-30 2018-10-26 榛硕(武汉)智能科技有限公司 A kind of quick processing system of traffic accident and its processing method

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107416207A (en) * 2017-06-13 2017-12-01 深圳市易成自动驾驶技术有限公司 Unmanned plane rescue mode, unmanned plane and computer-readable recording medium
KR102067875B1 (en) * 2017-12-06 2020-01-20 주식회사 아세스 Method for rescue using drone and computer readable record-medium on which program for executing method therefor
KR102441099B1 (en) * 2018-06-29 2022-09-07 현대오토에버 주식회사 System for processing information and operating method thereof
JP6630893B1 (en) * 2019-03-28 2020-01-15 光司商会株式会社 Hanging work support system
CN112972935A (en) * 2019-12-16 2021-06-18 泉州胜日科技有限公司 Textile industry workshop fire emergency rescue method and device
CN110816839A (en) * 2019-12-17 2020-02-21 淮安航空产业研究院有限公司 Unmanned aerial vehicle and system for putting in emergency rescue equipment and people-in-loop putting method thereof
KR102493780B1 (en) * 2020-09-16 2023-02-02 이민형 System and method for monitoring the ground using hybrid unmanned airship
KR102654951B1 (en) * 2022-04-05 2024-04-04 인하대학교 산학협력단 Method and System for Fire Response based on Smart Unmanned Self-Driving Platform
CN115331389B (en) * 2022-07-20 2023-08-11 浙江鹿枫户外用品有限公司 Outdoor rescue device for handling emergency

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002237000A (en) * 2001-02-09 2002-08-23 Chishiki Joho Kenkyusho:Kk Real-time map information communication system and its method
KR101286376B1 (en) * 2013-03-07 2013-07-15 건국대학교 산학협력단 System and method for controlling unmanned aerial vehicle

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150033241A (en) 2013-09-24 2015-04-01 엘에스산전 주식회사 System for disaster using unmanned aerial vehicle

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002237000A (en) * 2001-02-09 2002-08-23 Chishiki Joho Kenkyusho:Kk Real-time map information communication system and its method
KR101286376B1 (en) * 2013-03-07 2013-07-15 건국대학교 산학협력단 System and method for controlling unmanned aerial vehicle

Also Published As

Publication number Publication date
KR20170037696A (en) 2017-04-05

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant