KR20170016684A - The unmanned air vehicle for castaway tracking - Google Patents
- Publication number
- KR20170016684A (application number KR1020150110110A)
- Authority
- KR
- South Korea
- Prior art keywords
- tracking
- unmanned aerial
- transmitter
- aerial vehicle
- displaying
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
-
- B64C2201/12—
-
- B64C2201/127—
-
- B64C2201/146—
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Alarm Systems (AREA)
Abstract
The present invention relates to an unmanned aerial vehicle for rescuing victims who cannot evacuate from situations such as a fire scene or a collapse site. The vehicle comprises a body, a camera for photographing the area ahead, a microphone for receiving external sound, a speaker for transmitting sound to the outside, a directional antenna for determining the direction of a distress signal, a proximity sensor for detecting obstacles, and a control antenna for exchanging control information with the operator. It quickly locates victims and marks a safe path along which a rescue team can enter.
Description
The present invention provides an unmanned aerial vehicle that, instead of committing personnel to a high-risk scene, searches for victims who cannot evacuate from situations such as a fire scene or a collapse site, and marks a safe route through which rescue units can be deployed.
In modern society, technological development has allowed machines to take over many tasks that previously only human beings could perform, and existing manned tools are steadily being replaced by unmanned ones.
"Unmanned aircraft" is the counterpart of the manned aircraft. In the broad sense, it literally refers to any aircraft that can be operated without a pilot on board.
Because the term is defined in contrast to manned flight, in the narrow sense an aircraft can be called unmanned when a vehicle that was originally boarded and operated by a pilot is converted to operate without one.
Unmanned aerial vehicles, which account for most unmanned vehicles, are used to acquire aerial images of disaster areas that are difficult for people to reach, to inspect power lines, to provide information on enemy concealment in battlefield situations, and to perform reconnaissance or surveillance missions.
For example, the air force once relied on manned reconnaissance aircraft to detect the enemy and observe enemy movements, but small, miniaturized unmanned aerial vehicles are now taking over that role, and armed unmanned aircraft are in fact under development.
From a modern point of view, however, it is safe to say that there is almost no truly unmanned aircraft in which no human participates in maneuvering. Implementing fully unmanned operation without any human intervention would require advanced artificial intelligence able to think and judge for itself without human control.
In addition, current technology faces difficulties in applying such artificial intelligence to real unmanned aerial vehicles, which remain at the trial-production or development stage.
Moreover, even if these technical problems were solved, once ethical issues are considered, such as whether a machine should be regarded as having personhood, it is highly likely that unmanned aerial vehicles will continue to be operated under human control.
Meanwhile, the remotely piloted vehicle is the representative embodiment of the unmanned aerial vehicle: literally, a flying object operated by remote control without a person on board.
Such vehicles are expected to take over work in places where it is difficult for people to work. In particular, deploying an unmanned aerial vehicle at points that humans cannot safely enter, such as a fire scene or a collapse site, has the advantage of preventing loss of manpower and secondary disasters.
For example, in fire rescue, a fire brigade entering a burning building attempts to identify an escape route along which victims can leave the scene, and many survivors have in fact been rescued by rescuers in this way.
However, it is also true that the number of people who lost their lives at such scenes is comparable to the number of survivors rescued by this method.
Currently, the common types of remotely controlled takeoff-and-landing aircraft are the single-rotor helicopter, the coaxial contra-rotating helicopter, and the quadcopter. Among these, the quadcopter is an electrically driven model consisting of two rotors turning in one direction and two contra-rotating rotors; it is mechanically the simplest because it has no moving parts other than the four rotors.
Each rotor is driven by its own electric motor, each motor is controlled by a microcomputer, and the microcomputer processes signals from the various sensors and the remote control, which makes the quadcopter very stable.
The present invention relates to an unmanned aerial vehicle for tracking the position of a victim, which can secure the safety of rescue personnel and reduce casualties. In particular, the present invention aims to reduce damage to life and property by transmitting information about a disaster area that humans can hardly access (for example, temperature, humidity, atmospheric composition, and terrain images) to the operator through the unmanned aerial vehicle.
The problems to be solved by the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.
According to an exemplary embodiment of the present invention, an unmanned aerial vehicle for tracking the position of a victim includes a body, a camera for photographing the area ahead, a first microphone for receiving external sound, a first speaker for transmitting sound to the outside, a directional antenna for tracking a distress radio signal, a proximity sensor for detecting obstacles, an environmental sensor for detecting environmental conditions around the vehicle, and a control antenna for transmitting and receiving control information.
In addition, it may further comprise a drop unit disposed at a lower portion of the body, and rescue-route-marking transmitters held inside the drop unit.
In addition, a controller for the unmanned aerial vehicle includes a control stick for maneuvering the vehicle, an image screen for displaying the camera image, a proximity sensor alarm for displaying warnings from the proximity sensor, an RSSI indicator for displaying the strength of the distress signal, a second microphone for receiving sound, and a control for adjusting the photographing direction of the camera.
The controller may further include a release button for dropping a rescue-route-marking transmitter and a remaining-amount indicator showing the number of such transmitters left.
The present invention makes it possible to search for a victim trapped in a disaster area while securing the safety of rescue personnel.
In addition, the vehicle can transmit the temperature, humidity, atmospheric composition, and terrain images of a disaster area, such as a fire, explosion, or contamination site that is difficult for the operator to access, back to the operator, and rescue operations can be planned on the basis of the received data.
In addition, because an unmanned aerial vehicle can be moved in whatever direction the operator desires, the location of a victim can easily be tracked even in extreme situations, without any special infrastructure.
The effects of the embodiments of the present invention are not limited to those exemplified above; further effects are included throughout the specification.
FIG. 1 is a perspective view of an unmanned aerial vehicle for tracking a distress location according to an embodiment of the present invention.
FIG. 2 is a front view of the unmanned aerial vehicle for tracking the location of a victim according to an embodiment of the present invention.
FIG. 3 illustrates a first controller of the unmanned aerial vehicle for tracking the position of a victim according to an embodiment of the present invention.
FIG. 4 is a side view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 5 is a bottom view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 6 shows a rescue-route-marking transmitter dropped from the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 7 illustrates a second controller of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 8 shows a block diagram of a microprocessor of the unmanned aerial vehicle for tracking the position of a victim according to the present invention.
The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described in detail below together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. The present invention is defined only by the scope of the claims.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. The term "and/or" includes each and every combination of one or more of the mentioned items.
The terminology used herein is for describing embodiments and is not intended to limit the present invention. In this specification, singular forms include plural forms unless the context indicates otherwise. The terms "comprises" and/or "comprising" do not exclude the presence or addition of one or more elements other than those stated.
Unless defined otherwise, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.
Embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
FIG. 1 is a perspective view schematically illustrating a victim-location-tracking unmanned aerial vehicle according to an embodiment of the present invention.
Referring to FIG. 1, an unmanned aerial vehicle for tracking a victim's location (hereinafter, the tracking unmanned aerial vehicle) includes a directional antenna for sensing a distress radio signal, a control antenna for exchanging information with the controller, a proximity sensor for detecting nearby obstacles, propellers for generating the lift that keeps the vehicle aloft, motors serving as the power source of the propellers, and connecting arms supporting the motors.
The
In addition, as the tracking
A
The
A wireless communication technique may be used as the means by which the tracking unmanned aerial vehicle and the controller exchange information.
For example, a wireless communication technology using the frequencies of the ISM band, which is a typical license-exempt band, can be used.
The Industrial, Scientific and Medical (ISM) band comprises license-exempt frequency bands, defined by the ITU-R, that can be used by industrial, scientific, and medical devices.
However, owing to the nature of a license-exempt band, interference can arise when other devices communicate on the same frequency, so a separate setting may be required to rotate the operating frequency periodically.
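As a hypothetical sketch of the periodic frequency rotation mentioned above (the channel set, the packet-loss trigger, and all names are illustrative assumptions, not part of this disclosure):

```python
# Illustrative only: rotate among the non-overlapping 2.4 GHz ISM Wi-Fi
# channels, hopping when observed packet loss suggests interference.
NON_OVERLAPPING_CHANNELS = (1, 6, 11)

def next_channel(current: int, channels=NON_OVERLAPPING_CHANNELS) -> int:
    """Return the next channel in a fixed rotation order."""
    i = channels.index(current)
    return channels[(i + 1) % len(channels)]

def should_hop(loss_rate: float, threshold: float = 0.2) -> bool:
    """Trigger a hop when the packet-loss rate exceeds a chosen threshold."""
    return loss_rate > threshold
```

In practice, the rotation schedule would have to be agreed between the vehicle and the controller so that both ends change channel together.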
A representative technology using the ISM band is Wi-Fi (Wireless Fidelity).
Wi-Fi refers to wireless local area network (LAN) technology that provides wireless Internet access within a certain distance. It is a data transmission protocol between devices based on IEEE 802.11 and is usually called wireless LAN or WLAN.
Basically, an access point (AP)-based communication scheme is used: an AP is placed at a certain point, each device around the AP is assigned a unique identifier (for example, an IP address and a MAC address), and the devices are connected to the AP in a star topology.
As the number of connected devices increases, the transmission rate may decrease proportionally, and because Wi-Fi is a short-range technique its coverage is on the order of 150 meters.
In addition, Zigbee communication can be used.
ZigBee is based on the Institute of Electrical and Electronics Engineers (IEEE) 802.15.4 standard, which supports short-range networks, and is a standard technology for data networks with low transmission rates.
ZigBee has the advantages of minimal power consumption and low deployment cost, and is used in short-range communication markets such as intelligent home networks, buildings, logistics, environmental monitoring, and human interfaces.
However, because it is a low-power technology, its effective communication range is only 10 to 20 meters, which may be too narrow to serve as a long-distance communication means.
In addition, Bluetooth can be used.
Bluetooth is the IEEE 802.15.1 standard for low-power wireless communication between standardized devices. It transmits data at about 3 Mbps (megabits per second) over distances of 10 meters or less, and Bluetooth-compatible devices can recognize one another and connect immediately.
However, if a device is not Bluetooth-compatible, a separate Bluetooth dongle must be added to enable communication; moreover, because the effective range of Bluetooth is also narrow, long-distance communication may be impossible, and the transmission speed is low.
The wireless communication means is not, however, limited to the above examples; various wireless communication means may be adopted, weighing the respective advantages and disadvantages.
In addition, the tracking
More specifically, although the flight of the tracking
A
The
In order to support the weight of the
As shown in the figure, the
The
One side of the
The
If the tracking
The
For example, when the tracking
Meanwhile, although not shown in the drawing, the tracking
The speaker 182 and the
The speaker 182 and the
FIG. 2 shows a front view of the unmanned aerial vehicle for tracking the position of a victim according to the present invention.
Referring to FIG. 2, the tracking unmanned aerial vehicle further includes a camera attached to the front surface, a light for securing visibility, a speaker for transmitting the operator's voice, a microphone for receiving the victim's voice, and supports for holding up the unmanned aerial vehicle.
The light 181 is positioned on the front side of the tracking
The
The
A
Speaker 182 also provides a means for delivering voice outside the unmanned aerial vehicle as described above.
For example, it not only makes it easy to determine the victim's position during a search, but also allows the operator to guide the victim to take appropriate action.
More specifically, if the victim is able to move, the operator's voice instructions can quickly convey an evacuation route; if the victim's mobility is impaired, the victim can report his or her present location so that rescue personnel can be directed there.
In addition, an environmental sensor for detecting the environmental conditions around the tracking vehicle may be positioned at the lower end of the body.
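As a minimal sketch of how such environmental readings might be packaged for transmission from the vehicle to the operator (the field names, units, and chosen atmospheric component are assumptions, not part of this disclosure):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class EnvReading:
    """One environmental-sensor sample from the tracking vehicle."""
    temperature_c: float   # ambient temperature, degrees Celsius
    humidity_pct: float    # relative humidity, percent
    co_ppm: float          # one assumed atmospheric component (carbon monoxide)

def encode(reading: EnvReading) -> bytes:
    """Serialize a reading for the radio downlink."""
    return json.dumps(asdict(reading)).encode()

def decode(payload: bytes) -> EnvReading:
    """Reconstruct the reading on the controller side."""
    return EnvReading(**json.loads(payload.decode()))
```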
The environmental sensor can measure the ambient temperature and atmospheric components of the area through which the tracking vehicle is flying, and can transmit the acquired information to the microprocessor.

FIG. 3 illustrates a first controller of the unmanned aerial vehicle according to an embodiment of the present invention.
3, the
As described above, the intensity of the distress radio signal increases as the tracking unmanned aerial vehicle approaches the source of the signal.
In this case, the operator should maneuver so that the tracking unmanned aerial vehicle moves in the direction in which the sensed signal strength increases.
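The RSSI-guided approach described here, moving so that the sensed distress-signal strength keeps increasing, can be sketched as a greedy search (the `probe` callback standing in for the directional antenna, the grid step, and the step limit are all assumptions):

```python
def rssi_step(probe, pos, step=1.0):
    """Try the four horizontal headings and move to the one with the
    strongest distress-signal RSSI; stay put if none improves."""
    best_pos, best_rssi = pos, probe(pos)
    for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
        cand = (pos[0] + dx, pos[1] + dy)
        r = probe(cand)
        if r > best_rssi:
            best_pos, best_rssi = cand, r
    return best_pos

def home_on_signal(probe, pos, max_steps=100):
    """Repeat greedy steps until the RSSI stops improving."""
    for _ in range(max_steps):
        nxt = rssi_step(probe, pos)
        if nxt == pos:
            break
        pos = nxt
    return pos
```

In a real scene, RSSI is noisy and non-monotonic, so a practical system would filter the measurements rather than trust single samples.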
The first control space located at the lower left of the
For example, if the first control stick is pushed upward, the revolutions per minute (RPM) of the motors increase and the tracking unmanned aerial vehicle ascends.
Conversely, if the first control stick is pulled downward, the motor RPM decreases and the vehicle descends.
The second control space located at the lower right of the
For example, when the second control stick is pushed to the left, the tracking unmanned aerial vehicle moves to the left.
Conversely, when the second control stick is pushed to the right, the vehicle moves to the right.
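The stick-to-rotor behavior described above can be sketched as a conventional quadcopter mixer; the sign conventions, rotor layout, and normalized ranges here are illustrative assumptions, not the patent's specified mapping:

```python
def mix(throttle, yaw, pitch, roll, base=0.5):
    """Map stick inputs (each -1..1) to per-rotor thrust commands for a
    quadcopter: front-left, front-right, rear-left, rear-right.
    Raising throttle lifts all rotors equally; roll/pitch/yaw create
    differentials between opposing rotors."""
    fl = base + throttle + pitch + roll - yaw
    fr = base + throttle + pitch - roll + yaw
    rl = base + throttle - pitch + roll + yaw
    rr = base + throttle - pitch - roll - yaw
    return fl, fr, rl, rr
```

A real flight controller would clamp these commands to the motors' valid range and close the loop with gyro feedback.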
Meanwhile, the
The
The information transmitted from the steered
That is, the
The
In this case, the speaker 182 of the unmanned aerial vehicle can be divided into the first speaker and the
Here, the
For example, when the operator intends to transmit a doctor through a microphone, the
A
For example, if an obstacle comes within 1 meter along the traveling direction of the tracking unmanned aerial vehicle, it can automatically perform an evasive maneuver.
However, the automatic evasive-maneuver mode can be disabled according to the mode set on the controller.
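A minimal sketch of this automatic evasive behavior, including the mode switch that disables it (the 1-meter threshold comes from the description above; the normalized speed command is an assumption):

```python
SAFE_DISTANCE_M = 1.0  # obstacle threshold named in the description

def evasive_command(obstacle_dist_m, auto_evade=True):
    """Return a forward-speed command: brake to hover when an obstacle is
    inside the safe distance and automatic evasion is enabled; otherwise
    keep the commanded (normalized) forward speed."""
    if auto_evade and obstacle_dist_m < SAFE_DISTANCE_M:
        return 0.0  # stop and hover
    return 1.0      # continue at commanded speed
```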
On the right side of the
The
When the distress radio signal is strongly sensed from the
FIG. 4 is a side view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
Referring to FIG. 4, an unmanned aerial vehicle according to another embodiment of the present invention (hereinafter, the transport unmanned aerial vehicle) includes a camera attached to the front, a directional antenna for sensing a distress radio signal, a proximity sensor for detecting nearby obstacles, propellers for generating the lift that keeps the vehicle aloft, motors as the power source of the propellers, and connecting arms supporting the motors, and further includes a drop unit at the lower end for releasing rescue-route-marking transmitters.
The components of the portable
The
The
The inner space of the
Although the number of frames is not limited, using more than a certain number of frames increases the load of the unmanned aerial vehicle and makes it harder to obtain lift, so the number may be restricted.
The length of the
FIG. 5 is a bottom view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
As shown in FIG. 5, the
A transmission
The proximity sensor 230 further includes a
Accordingly, the
The
FIG. 6 illustrates a rescue-route-marking transmitter dropped from the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
Referring to FIG. 6, the rescue-route-marking transmitter includes a transmitting antenna disposed at the upper end of its body, a light emitter, a transmitter circuit for emitting a radio signal, and a battery for supplying the power consumed by the transmitter.
As described above with reference to FIG. 5, the
The
The material of the
The
The
The
The
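The rescue-route marking described above can be pictured as releasing transmitters at roughly regular intervals along the flown path, so that rescuers can follow the chain of beacons in; the spacing value and function names below are illustrative assumptions:

```python
import math

def plan_drops(path, spacing_m=10.0):
    """Given the vehicle's flown path (a list of (x, y) points), return the
    points where a route-marking transmitter should be released so that
    consecutive beacons are at least `spacing_m` apart (straight-line)."""
    drops = [path[0]]                      # mark the entry point
    for p in path[1:]:
        if math.dist(p, drops[-1]) >= spacing_m:
            drops.append(p)                # far enough from the last beacon
    return drops
```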
FIG. 7 illustrates a second controller of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
Referring to FIG. 7, the second controller may be configured similarly to the first controller described above with reference to FIG. 3.
The second controller, which manages the movement of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention, includes an image display unit for visualizing the information transmitted from the vehicle's camera, an RSSI display unit for displaying the strength of the distress radio signal, and an antenna for controlling the vehicle and transmitting/receiving information.
A
In the case of the
The
The remaining
FIG. 8 shows a block diagram of a microprocessor and an apparatus connected to the microprocessor of the unmanned aerial vehicle according to the present invention.
Referring to FIG. 8, a microprocessor may be configured as the control means of the unmanned aerial vehicle for tracking the position of the victim according to an embodiment of the present invention.
Devices connected to the microprocessor include a signal accumulation receiver that accumulates signals received from the directional antenna and passes them to the microprocessor, a control transceiver that transmits information processed by the microprocessor to the controller and receives information from the controller, a gyroscope that calculates the attitude of the vehicle, motors that generate torque, a drop control unit that controls the release of transmitters, an environmental sensor that detects environmental conditions, a light for illumination and visual observation, a camera for capturing images, a microphone for receiving voice information, a speaker for outputting voice information, and a proximity sensor that measures the distance between the body and surrounding objects.
The signal accumulation receiver may accumulate the signal information received from the directional antenna that picks up the distress signal and transmit it to the microprocessor.
The pilot transmitting / receiving unit may transmit information processed by the microprocessor to the controller or receive information received from the controller.
The gyroscope uses the gyro effect to estimate how far the current attitude has rotated from the reference attitude, and provides this position information to the microprocessor.
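The gyroscope's dead-reckoning role described above can be sketched as a simple integration of angular-rate samples; a real flight controller would fuse accelerometer or magnetometer data to cancel the drift that pure integration accumulates:

```python
def integrate_yaw(rates_dps, dt_s, yaw0_deg=0.0):
    """Dead-reckon heading by integrating gyroscope yaw-rate samples
    (degrees per second) taken at a fixed interval dt_s, starting from
    yaw0_deg; the result is wrapped into [0, 360)."""
    yaw = yaw0_deg
    for r in rates_dps:
        yaw += r * dt_s  # accumulate rotation over each sample period
    return yaw % 360.0
```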
Other components of the unmanned aerial vehicle connected to the microprocessor include a light, a camera, a speaker, a microphone, motors, a controller, a proximity sensor, and a directional antenna; the other embodiment of the present invention may have the same components.
The remaining components of the unmanned aerial vehicle connected to the microprocessor, other than the signal accumulation receiver, the control transceiver, and the gyroscope, have been described in detail in connection with the tracking unmanned aerial vehicle above, so their description is omitted here.
While the present invention has been described with reference to what are presently considered practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, and those skilled in the art will understand that it can be embodied in various modified forms. The above-described embodiments are therefore illustrative in all aspects and not restrictive.
100: Tracking unmanned vehicle 200: Transport unmanned vehicle
300: First controller 400: Second controller
Claims (4)
1. An unmanned aerial vehicle for tracking the position of a victim, comprising:
a body;
a camera for photographing the area ahead;
a first microphone for receiving an external sound;
a first speaker for transmitting sound to the outside;
a directional antenna for tracking a distress radio signal;
a proximity sensor for detecting an obstacle;
an environmental sensor for detecting environmental conditions; and
a control antenna for transmitting and receiving control information.
2. The unmanned aerial vehicle of claim 1, further comprising:
a drop unit disposed at a lower portion of the body; and
rescue-route-marking transmitters located inside the drop unit.
3. A controller for the unmanned aerial vehicle of claim 1, comprising:
a control stick for maneuvering the unmanned aerial vehicle;
an image screen for displaying the image of the camera;
a proximity sensor alarm for indicating a warning from the proximity sensor;
an RSSI display unit for displaying the strength of the distress signal;
a second microphone for receiving an external sound;
a second speaker for transmitting sound to the outside; and
a control for adjusting the photographing direction of the camera.
4. The controller of claim 3, further comprising:
a drop button for releasing a rescue-route-marking transmitter; and
a remaining-amount display unit for displaying the number of rescue-route-marking transmitters remaining.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150110110A KR101739262B1 (en) | 2015-08-04 | 2015-08-04 | The unmanned air vehicle for castaway tracking |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150110110A KR101739262B1 (en) | 2015-08-04 | 2015-08-04 | The unmanned air vehicle for castaway tracking |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170016684A true KR20170016684A (en) | 2017-02-14 |
KR101739262B1 KR101739262B1 (en) | 2017-05-24 |
Family
ID=58121095
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150110110A KR101739262B1 (en) | 2015-08-04 | 2015-08-04 | The unmanned air vehicle for castaway tracking |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101739262B1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108241349A (en) * | 2018-01-15 | 2018-07-03 | 梁晓龙 | Fire-fighting unmanned plane group system and fire-fighting method |
WO2018161331A1 (en) * | 2017-03-09 | 2018-09-13 | 邹霞 | Unmanned aerial vehicle monitoring system and monitoring method |
CN108657452A (en) * | 2018-03-30 | 2018-10-16 | 利辛县雨若信息科技有限公司 | A kind of photoelectric nacelle protective cover |
CN109987228A (en) * | 2017-12-30 | 2019-07-09 | 广州刀锋智能科技有限公司 | A kind of disaster area personnel search and rescue and goods and materials put-on method based on unmanned plane |
KR20190091064A (en) * | 2018-01-26 | 2019-08-05 | 한국항공우주연구원 | Flight control system and method of aerial vehicles based on electric driven |
CN110162086A (en) * | 2019-03-21 | 2019-08-23 | 中山大学 | A kind of cluster unmanned plane formation method based on Model Predictive Control frame |
CN110162035A (en) * | 2019-03-21 | 2019-08-23 | 中山大学 | A kind of clustered machine people is having the cooperative motion method in barrier scene |
JP2021024488A (en) * | 2019-08-08 | 2021-02-22 | 株式会社T&T | Drone for emergency evacuation guide |
KR20220170078A (en) * | 2021-06-22 | 2022-12-29 | 태경전자주식회사 | Drone traking a target |
KR20230104786A (en) * | 2021-12-30 | 2023-07-11 | 우석대학교 산학협력단 | Firefighting drone with thermal imaging camera |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102626835B1 (en) | 2018-10-08 | 2024-01-18 | 삼성전자주식회사 | Method and apparatus of determining path |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101536095B1 (en) * | 2015-01-14 | 2015-07-13 | 농업회사법인 주식회사 에이치알제주 | Grassland management system using drone |
KR101535401B1 (en) * | 2015-04-01 | 2015-07-08 | 오인선 | Drone type life ring dropping device |
- 2015-08-04: Application KR1020150110110A filed in KR; resulting patent KR101739262B1 active (IP right granted)
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018161331A1 (en) * | 2017-03-09 | 2018-09-13 | 邹霞 | Unmanned aerial vehicle monitoring system and monitoring method |
CN109987228A (en) * | 2017-12-30 | 2019-07-09 | 广州刀锋智能科技有限公司 | A kind of disaster area personnel search and rescue and goods and materials put-on method based on unmanned plane |
CN108241349A (en) * | 2018-01-15 | 2018-07-03 | 梁晓龙 | Fire-fighting unmanned plane group system and fire-fighting method |
KR20190091064A (en) * | 2018-01-26 | 2019-08-05 | 한국항공우주연구원 | Flight control system and method of aerial vehicles based on electric driven |
CN108657452A (en) * | 2018-03-30 | 2018-10-16 | 利辛县雨若信息科技有限公司 | A kind of photoelectric nacelle protective cover |
CN110162086A (en) * | 2019-03-21 | 2019-08-23 | 中山大学 | A kind of cluster unmanned plane formation method based on Model Predictive Control frame |
CN110162035A (en) * | 2019-03-21 | 2019-08-23 | 中山大学 | A kind of clustered machine people is having the cooperative motion method in barrier scene |
CN110162035B (en) * | 2019-03-21 | 2020-09-18 | 中山大学 | Cooperative motion method of cluster robot in scene with obstacle |
JP2021024488A (en) * | 2019-08-08 | 2021-02-22 | 株式会社T&T | Drone for emergency evacuation guide |
KR20220170078A (en) * | 2021-06-22 | 2022-12-29 | 태경전자주식회사 | Drone traking a target |
KR20230104786A (en) * | 2021-12-30 | 2023-07-11 | 우석대학교 산학협력단 | Firefighting drone with thermal imaging camera |
Also Published As
Publication number | Publication date |
---|---|
KR101739262B1 (en) | 2017-05-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101739262B1 (en) | The unmanned air vehicle for castaway tracking | |
US11407526B2 (en) | Systems and methods for UAV docking | |
US11370540B2 (en) | Context-based flight mode selection | |
CN107531217B (en) | Apparatus and method for identifying or detecting obstacles | |
US9665094B1 (en) | Automatically deployed UAVs for disaster response | |
US20180290748A1 (en) | Autonomous in-tunnel intelligence, surveillance, and reconnaissance drone | |
CN112904892A (en) | System and method for monitoring with visual indicia | |
JP6539073B2 (en) | Surveillance system and flight robot | |
JP6539072B2 (en) | Surveillance system and flight robot | |
JP6664209B2 (en) | Investigation method using flight-type survey aircraft | |
WO2020116392A1 (en) | Drone system, drone, movable body, drone system control method, and drone system control program | |
CN113805586A (en) | Autonomous fire-fighting special explosion-proof robot | |
JP2018055362A (en) | Monitoring system | |
US20210011472A1 (en) | System, device and method for time limited communication for remotely controlled vehicles | |
US20220343779A1 (en) | System, device and method for time limited communication for remotely controlled vehicles | |
SEZGİN et al. | Color-Based Object Tracking For Unmanned Aerial Vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |