KR20170016684A - The unmanned air vehicle for castaway tracking - Google Patents

The unmanned air vehicle for castaway tracking Download PDF

Info

Publication number
KR20170016684A
Authority
KR
South Korea
Prior art keywords
tracking
unmanned aerial
transmitter
aerial vehicle
displaying
Prior art date
Application number
KR1020150110110A
Other languages
Korean (ko)
Other versions
KR101739262B1 (en)
Inventor
고봉진
Original Assignee
창원대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 창원대학교 산학협력단 filed Critical 창원대학교 산학협력단
Priority to KR1020150110110A priority Critical patent/KR101739262B1/en
Publication of KR20170016684A publication Critical patent/KR20170016684A/en
Application granted granted Critical
Publication of KR101739262B1 publication Critical patent/KR101739262B1/en

Links

Images

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C39/00 - Aircraft not otherwise provided for
    • B64C39/02 - Aircraft not otherwise provided for characterised by special use
    • B64C39/024 - Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D - EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00 - Aircraft indicators or protectors not otherwise provided for
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D - EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 - Equipment not otherwise provided for
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • B64C2201/12
    • B64C2201/127
    • B64C2201/146

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention relates to an unmanned aerial vehicle comprising a body, a camera for photographing the front, a microphone capable of receiving external sound, a speaker capable of transmitting sound to the outside, a directional antenna for determining the direction of travel, a proximity sensor for detecting obstacles, and a control antenna for transmitting control information to the unmanned aerial vehicle, provided for the rescue of victims who cannot evacuate from situations such as a fire scene or a collapse site, so as to quickly search for victims and provide a safe path along which a rescue team can enter.

Description

The unmanned air vehicle for castaway tracking

The present invention relates to an unmanned aerial vehicle that, instead of committing personnel to a high-risk scene, searches for victims who cannot evacuate from situations such as a fire scene or a collapse site and provides a safe route along which rescue units can enter.

In modern society, technological development has made it possible to perform, without people, tasks that previously only people could do, and existing human-operated tools are increasingly being replaced by unmanned machines.

An unmanned aircraft is the counterpart of a manned aircraft: in broad terms, it literally refers to any aircraft that can be operated without a pilot on board.

Because the term is defined in contrast to manned flight, in a narrow sense an aircraft may be called unmanned when a vehicle originally built to be boarded and operated by a pilot is converted to operate without one.

UAVs, which account for most unmanned vehicles, are used to acquire aerial images of disaster areas that are difficult for people to reach, to inspect power lines, to provide information on enemy concealment in battlefield situations, and to perform reconnaissance or surveillance missions.

For example, air forces once relied on manned reconnaissance aircraft to detect the enemy and monitor its movements, but small, miniaturized unmanned aerial vehicles are now taking over that role, and work to convert reconnaissance aircraft to unmanned operation is in fact under way.

However, from a present-day point of view, it is fair to say that there is almost no unmanned aerial vehicle in whose maneuvering a human plays no part. To implement fully unmanned operation without any human intervention, artificial intelligence built on advanced technology is required, capable of thinking and judging on its own without human control.

In addition, with current technology it remains difficult to apply such artificial intelligence to real UAVs, which are still at the trial-production or development stage.

Moreover, even if these technical problems are solved, once ethical issues are considered, such as whether the machine should be regarded as having a personality, fully autonomous unmanned aerial vehicles are highly likely to remain a distant prospect.

On the other hand, the remotely piloted drone is a representative embodiment of the UAV. Unmanned aerial vehicles of this kind are, literally, flying objects operated by remote control without a person on board.

In recent years there has been a need for such vehicles to work, in place of people, in locations where it is difficult for people to work. In particular, deploying a UAV to points that are hard for humans to reach at disaster sites such as a fire scene or a collapse site is advantageous in that loss of personnel and secondary disasters can be prevented.

For example, in fire rescue, firefighters enter the scene, locate victims, and lead them along an escape route out of the fire; in fact, many survivors have been rescued by rescuers in this way.

However, it is also true that the number of people who lose their lives at the scene is comparable to the number of survivors rescued by this method.

Currently, the common types of remotely controlled vertical takeoff and landing aircraft are the single-rotor helicopter, the coaxial counter-rotating helicopter, and the quadcopter. Among these, the quadcopter is electrically powered and consists of two pairs of counter-rotating rotors; it is mechanically the simplest, because there are no moving parts other than the four rotors.

Each rotor is driven by its own electric motor, each motor is controlled by a microcomputer, and the microcomputer processes signals from various sensors and the remote controller, which makes the quadcopter very stable.

The present invention relates to an unmanned aerial vehicle for tracking the position of a victim, which can secure the safety of the above-mentioned rescue personnel and reduce casualties. In particular, the present invention aims to reduce damage to human lives and property by transmitting information about a disaster area that is hardly accessible to humans (for example, temperature, humidity, air composition, and topographical images) to the operator through the unmanned aerial vehicle.

The problems to be solved by the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

According to an exemplary embodiment of the present invention, an unmanned aerial vehicle for tracking the position of a victim includes a body, a camera for photographing the front, a first microphone for receiving external sound, a first speaker for transmitting sound to the outside, a directional antenna for tracking a distress radio signal, a proximity sensor for detecting obstacles, an environmental sensor for detecting environmental conditions around the unmanned aerial vehicle, and a steering antenna for transmitting control information.

In addition, the vehicle may further comprise a dispenser disposed at a lower portion of the body, and rescue-route marking transmitters located inside the dispenser.

In addition, a controller for the unmanned aerial vehicle for tracking the position of a victim includes control sticks for maneuvering the unmanned aerial vehicle, an image screen for displaying the camera image, a proximity sensor alarm for displaying alarms from the proximity sensor, an RSSI display unit for displaying the strength of the distress signal, a second microphone for receiving external sound, a second speaker for transmitting sound to the outside, and a camera control for adjusting the photographing direction of the camera.

The controller may further include a drop button for releasing a rescue-route marking transmitter and a remaining amount display unit for indicating the remaining number of rescue-route marking transmitters.

The present invention provides an unmanned aerial vehicle that searches a disaster area for victims who have not yet been rescued, making it possible to search for victims while securing the safety of rescue personnel.

In addition, the temperature, humidity, atmospheric composition, and topographic images of a disaster area such as a fire, explosion, or contamination site, which is difficult for the operator to access, can be transmitted to the operator, and rescue operations can be carried out on the basis of the received data.

In addition, owing to the characteristics of the UAV, the unmanned aerial vehicle can be moved in the direction desired by the operator, so the location of a victim can easily be tracked even in extreme situations without special infrastructure.

The effects according to the embodiments of the present invention are not limited by the contents exemplified above, and more various effects are included in the specification.

FIG. 1 is a perspective view of an unmanned aerial vehicle for tracking a distress location according to an embodiment of the present invention.
FIG. 2 is a front view of the unmanned aerial vehicle for tracking the location of a victim according to an embodiment of the present invention.
FIG. 3 illustrates a first controller of the unmanned aerial vehicle for tracking the position of a victim according to an embodiment of the present invention.
FIG. 4 is a side view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 5 is a bottom view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 6 shows a rescue-route marking transmitter dropped from the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 7 illustrates a second controller of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.
FIG. 8 shows a block diagram of a microprocessor of the unmanned aerial vehicle for tracking the position of a victim according to the present invention.

The advantages and features of the present invention, and the manner of achieving them, will become apparent with reference to the embodiments described in detail below in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. The scope of the invention is defined only by the claims.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. The term "and/or" includes each and every combination of one or more of the mentioned items.

The terminology used herein is for the purpose of illustrating embodiments and is not intended to be limiting of the present invention. In the present specification, the singular form includes plural forms unless otherwise specified in the specification. The terms "comprises" and / or "comprising" used in the specification do not exclude the presence or addition of one or more other elements in addition to the stated element.

Unless defined otherwise, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.

Embodiments of the present invention will now be described in detail with reference to the accompanying drawings.

FIG. 1 is a perspective view schematically illustrating a victim location-tracking unmanned aerial vehicle according to an embodiment of the present invention.

Referring to FIG. 1, the unmanned aerial vehicle for tracking a victim's location (hereinafter, the tracking unmanned air vehicle) includes a directional antenna for sensing a distress radio signal, a steering antenna for transmitting information to and receiving information from the operator, proximity sensors for detecting nearby obstacles, propellers for generating the lift that keeps the unmanned aerial vehicle airborne, motors as the power source of the propellers, and connector arms for supporting the motors.

The directional antenna 110 always points in the traveling direction of the tracking unmanned air vehicle 100 in order to track the location of the victim.

In addition, as the tracking unmanned air vehicle 100 approaches the victim, the intensity (for example, the RSSI) of the distress radio signal received through the directional antenna 110 increases, so the operator can steer the tracking unmanned air vehicle 100 toward the stronger signal and thereby determine the victim's position.
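As an illustration only (not part of the claimed invention), the following minimal sketch shows one way this steering-by-signal-strength idea could be organized in software. The functions read_rssi(), yaw(), and move_forward(), as well as the probe angle and step size, are hypothetical placeholders and assumptions, not details given in this disclosure.

    # Illustrative RSSI-gradient tracking: probe a small yaw to each side, keep the
    # heading with the stronger distress-signal RSSI, then advance toward it.
    import time

    def read_rssi() -> float:
        """Hypothetical: current RSSI (dBm) from the directional antenna 110."""
        raise NotImplementedError

    def yaw(degrees: float) -> None:
        """Hypothetical: rotate the vehicle by the given angle."""
        raise NotImplementedError

    def move_forward(meters: float) -> None:
        """Hypothetical: advance along the current heading."""
        raise NotImplementedError

    def track_step(probe_angle: float = 15.0, step: float = 1.0) -> None:
        samples = {}
        for angle in (-probe_angle, 0.0, probe_angle):
            yaw(angle)
            time.sleep(0.2)        # let the receiver settle before sampling
            samples[angle] = read_rssi()
            yaw(-angle)            # return to the original heading
        best = max(samples, key=samples.get)
        yaw(best)                  # turn toward the strongest signal
        move_forward(step)         # and close the distance to the victim

Repeating track_step() drives the vehicle along an increasing-RSSI path, which is the behaviour the operator performs manually using the RSSI display described later.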

A steering antenna 120 is disposed on top of the tracking unmanned air vehicle 100 to transmit acquired information and to receive steering commands.

The steering antenna 120 is an omnidirectional antenna protruding from the upper surface of the tracking unmanned air vehicle 100 and can receive signals transmitted from the controller 300 regardless of direction. The reverse process, transmitting to the controller, is also possible.

A wireless communication technique may be used as the transmitting means for allowing the steering antenna 120 to transmit and receive information.

For example, a wireless communication technology using the frequencies of the ISM band, which is a typical license-exempt band, can be used.

The Industrial, Scientific and Medical (ISM) band is a license-exempt frequency band, defined by the ITU-R, that can be used by industrial, scientific, and medical devices.

However, owing to the nature of a license-exempt band, interference may occur between devices communicating on the same frequency, so a separate setting that periodically hops the frequency may be required.

As a representative example of using ISM-band frequencies, Wi-Fi (Wireless Fidelity) can be used.

Wi-Fi refers to a local area network (LAN) technology that provides wireless Internet access within a certain distance. It is a data transmission protocol between devices based on IEEE 802.11 and is usually called wireless LAN or WLAN.

Basically, an access point (AP) based communication method is used: an AP is placed at a certain point, a unique identifier (for example, an IP address and a MAC address) is assigned to each device, and the devices are connected to the AP in a star topology.

As the number of connected devices increases, the transmission rate may decrease proportionally, and because it is a short-range technology, the coverage may be limited to roughly 150 meters.

In addition, Zigbee communication can be used.

ZigBee is a standard technology, based on the Institute of Electrical and Electronics Engineers (IEEE) 802.15.4 standard, that supports short-range networks for low-data-rate networking.

ZigBee has the advantages of minimal power consumption and low installation cost, and it is used in short-range communication applications such as intelligent home networks, building automation, logistics, environmental monitoring, and human interfaces.

However, because it is a low-power technology, its effective communication range is only 10 to 20 meters, which may make it too short-ranged to serve as a long-distance communication means.

In addition, Bluetooth can be used.

Bluetooth is a low-power wireless communication standard, IEEE 802.15.1, that transmits data at roughly 3 Mbps (megabits per second) over distances of 10 meters or less. Bluetooth-compatible devices can recognize each other and begin communicating immediately.

However, if a device is not Bluetooth compatible, a separate Bluetooth dongle must additionally be provided to enable communication; and since the effective range of Bluetooth is also narrow, long-distance communication may not be possible and the transmission speed is low.

However, the wireless communication means is not limited to the above-described embodiments and may be implemented by various wireless communication means, weighing the respective advantages and disadvantages of each.
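As a rough illustration of weighing these trade-offs (not part of the claimed invention), the sketch below compares the three options by the approximate ranges and data rates quoted above; the Wi-Fi and ZigBee data-rate figures are assumed nominal values, not figures given in this disclosure.

    # Rough comparison of the ISM-band options discussed above. Ranges follow the
    # approximate figures in the text; the Wi-Fi and ZigBee rates are assumptions.
    from typing import Optional

    LINKS = {
        "Wi-Fi (IEEE 802.11)":       {"range_m": 150, "rate_mbps": 50.0},
        "ZigBee (IEEE 802.15.4)":    {"range_m": 20,  "rate_mbps": 0.25},
        "Bluetooth (IEEE 802.15.1)": {"range_m": 10,  "rate_mbps": 3.0},
    }

    def pick_link(required_range_m: float, required_rate_mbps: float) -> Optional[str]:
        """Return the first link whose nominal range and data rate meet the need."""
        for name, spec in LINKS.items():
            if spec["range_m"] >= required_range_m and spec["rate_mbps"] >= required_rate_mbps:
                return name
        return None

    # Example: streaming video over roughly 100 m rules out ZigBee and Bluetooth here.
    print(pick_link(required_range_m=100, required_rate_mbps=2.0))  # Wi-Fi (IEEE 802.11)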

In addition, the tracking unmanned air vehicle 100 includes proximity sensors 130a and 130b on its outer surface, so that the tracking unmanned air vehicle 100 can prevent collision with surrounding objects through the proximity sensors 130a and 130b.

More specifically, although the tracking unmanned air vehicle 100 flies under the control of the operator, it may collide with surrounding objects owing to inadequate steering by the operator; in such cases, the proximity sensors allow the tracking unmanned air vehicle 100 itself to avoid the collision.

A connector arm 140 for supporting the motor 155, to which the propeller 160 is attached, may be disposed on the side of the tracking unmanned air vehicle 100.

The connector arm 140 may protrude from the outer surface of the tracking unmanned air vehicle 100 to have a rod shape having a predetermined length and may support a motor 155 having a predetermined weight.

In order to support the weight of the motor 155, the connector arm 140 is fixed to the outer surface of the body and does not move.

As shown in the figure, the connector arm 140 can be configured in four directions. This is for easily balancing the body through the operation of the propeller 160 connected to the connector arm 140 during the flight of the tracking unmanned air vehicle 100.

The connector arms 140 are not limited to the above embodiment; each connector arm 140 plays the same role of supporting a motor 155 so as to keep the body stably balanced.

One side of the connector arm 140 is coupled to the outer surface of the tracking unmanned air vehicle 100 and the other side of the connector arm 140 can be coupled to the housing 150 surrounding the motor 155.

The motor 155 generates rotational force and transmits that power to the propeller 160. The propeller 160, receiving the rotational force from the motor 155, generates the lift that keeps the tracking unmanned air vehicle 100 in the air.

If the tracking unmanned air vehicle 100 is to change the direction of travel during flight, the direction of the propeller 160 may be changed by adjusting the housing 150 surrounding the motor 155.

The housing 150 may be formed of a circular resin material surrounding the motor 155 and may be connected to the connector arm 140. The housing 150 can tilt in any direction that does not damage the body, thereby changing the direction of travel of the tracking unmanned air vehicle 100. The tilt range may be about 50 to 60 degrees, but it is not limited to this range and may include any angle at which the propeller 160 does not come into contact with the connector arm 140.

For example, when the tracking unmanned air vehicle 100 is to move to the left with respect to the front of the body during flight, the housing 150, which otherwise remains perpendicular to the ground, is tilted to the left, and at the same time the tracking unmanned air vehicle 100 moves in that direction.

Meanwhile, although not shown in the drawing, the tracking unmanned air vehicle 100 may include a speaker 182 and a microphone 183.

The speaker 182 and the microphone 183 are devices for voice communication between the operator and the victim, as will be described later.

The speaker 182 and the microphone 183 will be described later.

FIG. 2 shows a front view of the unmanned aerial vehicle for tracking the position of a victim according to the present invention.

Referring to FIG. 2, the tracking unmanned aerial vehicle further includes a camera attached to the front surface, a light for securing visibility, a speaker for transmitting the operator's voice, a microphone for receiving the victim's voice, and supports for standing the unmanned aerial vehicle.

The light 181 is positioned on the front of the tracking unmanned air vehicle 100 so as to secure its field of view. In addition, by illuminating the area where the tracking unmanned air vehicle 100 is located, it can, for example, allow victims to see in extreme environments.

The camera 180 is located at the front center of the tracking unmanned air vehicle 100 and provides visual information to the operator so that the operator can check the situation of the victim and the surroundings of the tracking unmanned air vehicle 100.

The camera 180 may be circular, and the operator may move the lens of the camera 180 remotely up, down, left, or right via the controller 300 to secure multiple views.

A microphone 183 and a speaker 182 are positioned at the lower end of the camera 180. As described above, the microphone 183 is a means for obtaining voice information around the tracking unmanned air vehicle 100; based on the voice information obtained through the microphone 183, the operator can grasp the situation of the place where the victim is located.

Speaker 182 also provides a means for delivering voice outside the unmanned aerial vehicle as described above.

For example, during a search it not only makes it easier to determine the victim's position but also allows the operator to guide the victim to take appropriate action.

More specifically, if the victim is able to move, the operator can quickly give voice instructions describing the evacuation route; if the victim has difficulty moving, the victim's present location can be noted and rescue personnel can be directed there.

In addition, an environmental sensor for detecting the environmental conditions around the tracking unmanned air vehicle may be positioned below the camera 180.

The environmental sensor can measure the ambient temperature and atmospheric composition of the area in which the tracking unmanned air vehicle is flying and transmit the acquired information to the microprocessor.

FIG. 3 illustrates the first controller of the unmanned aerial vehicle for tracking the position of a victim according to an embodiment of the present invention.

Referring to FIG. 3, the controller 300 for controlling the tracking unmanned air vehicle 100 includes an RSSI display unit 360.

As described above, the intensity of the distress radio signal received through the directional antenna 110 on the upper side of the tracking unmanned air vehicle 100 increases as the vehicle approaches the victim, and since this intensity is displayed on the RSSI display unit 360 of the controller 300, the operator can easily search for the victim.

In this case, the operator maneuvers the tracking unmanned air vehicle 100 toward the position identified through the RSSI display unit 360, and the vehicle can be moved in the direction desired by the operator through the control sticks 310 located at the lower left and lower right of the controller 300.

The first control stick, located at the lower left of the controller 300, remotely controls the altitude of the tracking unmanned air vehicle 100.

For example, if the first control stick is pushed upward, the revolutions per minute (RPM) of the motors 155 increase and the rotational force of the propellers 160 also increases; based on this increased rotational force, lift increases and the tracking unmanned air vehicle 100 can ascend to a higher altitude.

Conversely, if the first control stick is pulled downward, the motors 155, which had been running at the RPM needed to maintain the current altitude, gradually reduce their RPM and rotational force. The reduced rotational force of the motors 155 reduces that of the propellers 160, so lift decreases and the tracking unmanned air vehicle 100 can descend to a lower altitude.

The second control stick, located at the lower right of the controller 300, controls the moving direction of the tracking unmanned air vehicle 100.

For example, when the second control stick is pushed to the left, the housing 150, which as described above can tilt in all directions, is inclined toward the left side of the tracking unmanned air vehicle 100; the body of the tracking unmanned air vehicle 100 tilts to the left and the vehicle moves to the left.

Conversely, if the second control stick is pushed to the right, the housing 150, which as described above can tilt in all directions, is inclined toward the right side of the tracking unmanned air vehicle 100; the body of the tracking unmanned air vehicle 100 tilts to the right and the vehicle moves to the right.
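Purely as an illustration (not part of the claimed invention), the sketch below maps the two control sticks to motor RPM and housing tilt in the manner described above. The hover RPM and RPM span are assumptions; the tilt limit echoes the approximately 50 to 60 degree range mentioned for the housing 150.

    # Illustrative stick-to-actuator mapping for the two control sticks 310.
    HOVER_RPM = 5000      # assumed RPM that holds the current altitude
    RPM_SPAN = 2000       # assumed extra (or removed) RPM at full stick deflection
    MAX_TILT_DEG = 55     # within the roughly 50-60 degree range of housing 150

    def throttle_to_rpm(stick: float) -> float:
        """First stick, -1.0 (full down) to +1.0 (full up) -> target motor RPM."""
        stick = max(-1.0, min(1.0, stick))
        return HOVER_RPM + stick * RPM_SPAN   # up -> higher RPM -> more lift -> climb

    def roll_to_tilt(stick: float) -> float:
        """Second stick, -1.0 (full left) to +1.0 (full right) -> housing tilt angle."""
        stick = max(-1.0, min(1.0, stick))
        return stick * MAX_TILT_DEG           # left -> negative tilt -> body leans left

    # Example: half throttle up, gentle left command.
    print(throttle_to_rpm(0.5))   # 6000.0 RPM, the vehicle climbs
    print(roll_to_tilt(-0.3))     # -16.5 degrees, the vehicle drifts left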

Meanwhile, the image screen 320 may be positioned between the first control stick and the second control stick.

The image screen 320 may be a touch screen in the form of a rectangular window, and it may be a liquid crystal display (LCD) or a low-power light-emitting diode (LED) display.

Information transmitted from the steering antenna 120 of the tracking unmanned air vehicle 100 can be received by the antenna 305 located above the image screen, and the image or video portion of the received information is displayed on the image screen 320.

That is, the image screen 320 can display information obtained from the camera 180 of the tracking unmanned air vehicle 100.

The speaker 330 and the microphone 340 may be positioned on the upper left side of the controller 300. The two components perform complementary functions and are connected, through the antenna 305, to the speaker 182 and the microphone 183 of the tracking unmanned air vehicle 100, respectively.

In this case, the speaker 182 of the unmanned aerial vehicle may be referred to as the first speaker and the speaker 330 of the controller as the second speaker; likewise, the microphone 183 of the unmanned aerial vehicle may be referred to as the first microphone and the microphone 340 of the controller as the second microphone.

Here, the first microphone 183 may be interlocked with the second speaker 330, and the first speaker 182 may be interlocked with the second microphone 340.

For example, when the operator wishes to convey an instruction by voice, the second microphone 340 transmits the operator's voice signal through the antenna of the controller so that it is output from the first speaker 182; conversely, the first microphone 183 transmits the victim's voice signal through the steering antenna of the unmanned aerial vehicle so that it is output from the second speaker 330.

A proximity sensor alarm 350 is located at the upper right of the controller 300 and receives information from the proximity sensors 130a and 130b of the unmanned aerial vehicle. When an obstacle is detected within a radius of 1 meter while the tracking unmanned air vehicle 100 is moving, the proximity sensors 130a and 130b transmit this information to the proximity sensor alarm 350 through the steering antenna 120, and the proximity sensor alarm 350 blinks its built-in LED to warn the operator to maneuver with caution.

For example, if an obstacle is within 1 meter in the traveling direction of the tracking unmanned air vehicle 100, the proximity sensor alarm 350 that receives the signals from the proximity sensors 130a and 130b flashes at intervals of half a second, and the second speaker 330 can output a beep at the same interval so that the obstacle detection is conveyed even to an operator who has not noticed the light. If the operator continues to fly toward the obstacle despite the proximity sensors 130a and 130b transmitting information to the proximity sensor alarm 350 of the controller 300, then when the distance to the obstacle falls within 0.5 meters the proximity sensors transmit the information to the microprocessor, which automatically starts an avoidance maneuver of the tracking unmanned air vehicle 100 regardless of the operator's intention.
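A compact sketch of this two-stage obstacle response is given below for illustration only. The 1 meter, 0.5 meter, and half-second figures come from the description above; the function names and alarm hooks are hypothetical placeholders, not part of this disclosure.

    # Two-stage obstacle response: warn the operator inside 1 m, take over and
    # avoid automatically inside 0.5 m (unless automatic avoidance is disabled).
    WARN_DISTANCE_M = 1.0     # blink the alarm LED and beep every 0.5 s
    AVOID_DISTANCE_M = 0.5    # microprocessor starts an automatic avoidance maneuver
    BLINK_INTERVAL_S = 0.5

    def handle_proximity(distance_m: float, auto_avoid_enabled: bool = True) -> str:
        """Decide the response to the closest obstacle reported by sensors 130a/130b."""
        if distance_m <= AVOID_DISTANCE_M and auto_avoid_enabled:
            start_avoidance_maneuver()        # overrides the operator's input
            return "auto-avoid"
        if distance_m <= WARN_DISTANCE_M:
            blink_alarm_led(BLINK_INTERVAL_S)
            beep_second_speaker(BLINK_INTERVAL_S)
            return "warn"
        return "clear"

    def start_avoidance_maneuver() -> None:
        """Hypothetical hook into the microprocessor's avoidance routine."""

    def blink_alarm_led(interval_s: float) -> None:
        """Hypothetical hook into the proximity sensor alarm 350 on the controller."""

    def beep_second_speaker(interval_s: float) -> None:
        """Hypothetical hook into the second speaker 330 on the controller."""

    # Example: handle_proximity(0.8) returns "warn"; handle_proximity(0.4) returns "auto-avoid".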

However, the automatic evasive maneuver mode can be disabled according to the mode set on the controller 300, and the operator can resume control once the evasive maneuver ends.

An RSSI display unit 360 is located on the right side of the proximity sensor alarm 350. The RSSI display unit 360 can receive information on the distress radio signal from the directional antenna 110 of the tracking unmanned air vehicle 100 and display it as visual information.

Like the image screen 320, the RSSI display unit 360 may present this information on a liquid crystal display (LCD) or a light-emitting diode (LED) display; however, the present invention is not limited to these embodiments, and any of various visual display devices capable of displaying signal-strength information may be used.

When the distress radio signal is sensed strongly by the directional antenna 110 of the tracking unmanned air vehicle 100, the RSSI display unit 360 displays a correspondingly high value; conversely, when the distress signal is sensed weakly, it displays a correspondingly low value.

FIG. 4 is a side view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.

Referring to FIG. 4, an unmanned aerial vehicle according to another embodiment of the present invention (hereinafter, the transport unmanned aerial vehicle) includes a camera attached to the front, a directional antenna for sensing a distress radio signal, proximity sensors for detecting nearby obstacles, propellers for generating lift, motors as the power source of the propellers, and connector arms for supporting the motors, and further includes a dispenser at the lower end of the unmanned aerial vehicle.

The components of the transport unmanned aerial vehicle 200 shown in FIG. 4 are substantially the same as those of the tracking unmanned air vehicle 100 shown in FIGS. 1 and 2, with a dispenser 290 additionally provided.

The dispenser 290 takes the form of a rectangular frame protruding from the lower outer surface of the transport unmanned aerial vehicle 200 for tracking the victim's position, and it encloses a space with a certain internal area.

The proximity sensor 230b and the rescue-route marking transmitters are located in the interior space of the dispenser 290.

The inner space of the dispenser 290 may take the form of a plurality of rectangular frames arranged regularly.

Although the number of frames is not limited in principle, using more than a certain number of frames increases the load of the transport unmanned aerial vehicle 200 and makes it harder to obtain sufficient lift, so the number of frames can be adjusted accordingly.

The supports 270 of the transport unmanned aerial vehicle 200, which bears the dispenser 290, are longer than those of the tracking unmanned air vehicle 100, and the added length is proportional to the height by which the dispenser 290 protrudes.

FIG. 5 is a bottom view of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.

As shown in FIG. 5, the dispenser 290 is disposed on the lower outer surface of the transport unmanned aerial vehicle 200; a space may be formed inside it, and its shape may be a rectangular frame.

Rescue-route marking transmitters 295 may be disposed inside the frames of the dispenser 290. Each rescue-route marking transmitter 295 may have the shape of a rod of a certain length and contains a radio transmitter 20 inside, which transmits a radio signal to the outside.

A proximity sensor 230b is also provided. The victim position-tracking unmanned aerial vehicle 100 according to one embodiment of the present invention has a proximity sensor 130b protruding from its lower outer surface; in the transport unmanned aerial vehicle 200, however, it is difficult to place such a sensor separately because of the arrangement of the dispenser 290.

Accordingly, the proximity sensor 230b may be shaped to match a frame inside the dispenser 290, so that it can be placed in a frame in the same way as the rescue-route marking transmitters 295 located inside the dispenser (not shown).

The proximity sensor 230b may be fixed inside its frame so that it is not discharged, and it may be arranged so as to interlock easily with the proximity sensor 230a located on the upper surface.

FIG. 6 illustrates a rescue-route marking transmitter of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.

Referring to FIG. 6, the rescue-route marking transmitter includes a transmitting antenna disposed at the upper end, a light emitter disposed at the middle, a radio transmitter for transmitting a wireless signal, a battery for supplying the power consumed by the transmitter, and a body.

As described above with reference to FIG. 5, the rescue-route marking transmitter 295 is located inside a frame of the dispenser 290 and waits in a fixed position until a drop command is issued.

The body 30, located at the lower end of the rescue-route marking transmitter 295, supports the transmitter itself and may be bar-shaped with a rounded end.

The body 30 may be made of a shock-absorbing material. When the rescue-route marking transmitter 295 is dropped from the dispenser 290 onto a specific location at the operator's command, there is a high probability that the radio transmitter 20 and the light emitter 10 would otherwise be damaged by the impact; a body 30 made of such a material absorbs the shock and thereby protects the radio transmitter 20 and the light emitter 10.

The battery 25 is located at the upper end of the body 30, supplies power to each unit so that the rescue-route marking transmitter 295 can perform its function, and may be detachably installed.

The light emitter 10 emits light at a disaster site where the power has been cut off, so that rescue units entering along the safe path guided by the unmanned aerial vehicles 100 and 200, or victims being guided by the unmanned aerial vehicles 100 and 200, can secure a field of view and pass through a specific area more safely.

The radio transmitter 20 is placed at the middle of the rescue-route marking transmitter 295 and transmits information through the transmitting antenna 5 located at the top, so that the rescue-route marking transmitter 295 can be found even in areas where the light emitter 10 is out of view, when it has been dropped in a hard-to-reach place while guiding a safe route in a special situation, or during later retrieval of the transmitter 295.

The radio transmitter 20 periodically emits radio waves through the transmitting antenna 5 located at its upper end to announce the position of the rescue-route marking transmitter 295, so that the transmitter can be recovered after it has served its purpose; in places where the risk is high, however, it may be left unrecovered.
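For illustration only (not part of the claimed invention), the sketch below shows such a periodic position beacon. send_frame(), battery_ok(), and the two-second interval are assumptions, not details given in this disclosure.

    # Periodic beacon: the radio transmitter 20 repeatedly sends an identifying
    # frame through transmitting antenna 5 so a dropped marker can be located.
    import time
    from typing import Optional

    def send_frame(payload: str) -> None:
        """Hypothetical hook into radio transmitter 20 / transmitting antenna 5."""

    def battery_ok() -> bool:
        """Hypothetical charge check for battery 25."""
        return True

    def beacon_loop(unit_id: str, interval_s: float = 2.0, max_frames: Optional[int] = None) -> None:
        sent = 0
        while battery_ok() and (max_frames is None or sent < max_frames):
            send_frame(f"rescue-route-marker:{unit_id}")
            sent += 1
            time.sleep(interval_s)

    # Example: emit ten frames from marker "A1", two seconds apart.
    # beacon_loop("A1", max_frames=10)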

FIG. 7 illustrates a second controller of the unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention.

Referring to FIG. 7, the second controller may be configured in a manner similar to the first controller described above with reference to FIG. 3.

The second controller includes control sticks for maneuvering the transport unmanned aerial vehicle for tracking the position of a victim according to another embodiment of the present invention, an image screen for visualizing and displaying the information transmitted from the camera of the unmanned aerial vehicle, an RSSI display unit for displaying the strength of the distress radio signal, an antenna for controlling the unmanned aerial vehicle and transmitting and receiving information, a drop button 470 for dropping a rescue-route marking transmitter 295 from the dispenser 290, and a remaining amount display unit 480 for indicating the number of rescue-route marking transmitters 295 remaining in the dispenser 290.

The control sticks 410, the image screen 420, the proximity sensor alarm 450, the antenna 405, the camera control 425, and the RSSI display unit 460 are substantially the same as those of the first controller, so a detailed description of them is omitted.

The drop button 470 is disposed at the upper left of the second controller 400; when the operator wishes to drop a rescue-route marking transmitter 295 at a specific position, pressing the button issues the drop command.

The remaining amount display unit 480, disposed adjacent to the drop button 470 at the upper left of the second controller 400, displays the number of rescue-route marking transmitters 295 remaining. This count is the number of rescue-route marking transmitters 295 left inside the dispenser 290 at the moment the operator checks the display, and it does not include the proximity sensor 230b located inside.

FIG. 8 shows a block diagram of the microprocessor of the unmanned aerial vehicle according to the present invention and the devices connected to it.

Referring to FIG. 8, a microprocessor may be configured as the control means of the unmanned aerial vehicle for tracking the position of the victim according to an embodiment of the present invention.

The devices connected to the microprocessor include a signal accumulation unit for accumulating the signals received from the directional antenna and passing them to the microprocessor, a pilot transmission/reception unit for transmitting information processed by the microprocessor to the controller and receiving information from the controller, a gyroscope for calculating the attitude of the vehicle, motors for generating torque, a dispenser control unit for controlling the dispenser, an environmental sensor for detecting environmental conditions, a light for illumination to secure visibility, a camera for acquiring image information, a microphone for receiving voice information, a speaker for outputting voice information, and proximity sensors for measuring the distance between the body and surrounding objects.

The signal accumulation unit accumulates the signal information received from the directional antenna, which receives the distress signal, and transmits it to the microprocessor.

The pilot transmitting / receiving unit may transmit information processed by the microprocessor to the controller or receive information received from the controller.

The gyroscope uses the gyroscopic effect to calculate how far the current attitude has deviated from the reference attitude and provides this information to the microprocessor.
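As an illustration only (not part of the claimed invention), the minimal sketch below integrates gyroscope angular-rate readings to estimate how far the heading has drifted from its reference; the single-axis treatment and 100 Hz sample rate are assumptions.

    # Integrate angular-rate samples over time to estimate heading drift, which is
    # the kind of quantity the microprocessor needs from the gyroscope.
    from typing import Iterable

    def estimate_yaw_drift(rate_samples_dps: Iterable[float], dt_s: float = 0.01) -> float:
        """Sum angular-rate samples (degrees per second) times the sample period."""
        yaw_deg = 0.0
        for rate in rate_samples_dps:
            yaw_deg += rate * dt_s
        return yaw_deg   # positive = rotated clockwise from the reference heading

    # Example: one second of a steady 5 deg/s rotation yields about 5 degrees of drift.
    print(estimate_yaw_drift([5.0] * 100))   # ~5.0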

Other components of the unmanned aerial vehicle connected to the microprocessor include the light, the camera, the speaker, the microphone, the motors, the controller, the proximity sensors, and the directional antenna; in the unmanned aerial vehicle according to another embodiment of the present invention, the dispenser control unit may also be connected.

The remaining components connected to the microprocessor, other than the signal accumulation unit, the pilot transmission/reception unit, and the gyroscope, have been described in detail in connection with the tracking unmanned air vehicle 100 of FIGS. 1 to 3 and the transport unmanned aerial vehicle 200 of FIG. 4, so a detailed description of them is omitted here; additional components may be provided as necessary in addition to those described above.

While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, those skilled in the art will understand that the invention is not limited to the disclosed embodiments and may be embodied in other specific forms without departing from its technical spirit or essential features. The above-described embodiments are therefore to be understood as illustrative in all respects and not restrictive.

100: Tracking unmanned vehicle 200: Transport unmanned vehicle
300: First controller 400: Second controller

Claims (4)

An unmanned aerial vehicle for tracking the position of a victim, comprising:
A body;
A camera for photographing the front;
A first microphone for receiving an external sound;
A first speaker for transmitting sound to the outside;
A directional antenna tracking a distress radio signal;
A proximity sensor for detecting an obstacle;
An environmental sensor for detecting an environmental condition; and a steering antenna for transmitting control information.
The unmanned aerial vehicle according to claim 1, further comprising:
A dispenser disposed at a lower portion of the body; and
A rescue-route marking transmitter located inside the dispenser.
The unmanned aerial vehicle according to claim 2, further comprising a controller including:
Control sticks for maneuvering the unmanned aerial vehicle;
An image screen for displaying the image of the camera;
A proximity sensor alarm for indicating alarms from the proximity sensor;
An RSSI display unit for displaying the strength of the distress signal;
A second microphone for receiving an external sound;
A second speaker for transmitting sound to the outside; And
And a camera control for adjusting the photographing direction of the camera.
The unmanned aerial vehicle according to claim 3, wherein the controller further includes:
A drop button for dropping the rescue-route marking transmitter; and
A remaining amount display unit for displaying the remaining number of rescue-route marking transmitters.
KR1020150110110A 2015-08-04 2015-08-04 The unmanned air vehicle for castaway tracking KR101739262B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150110110A KR101739262B1 (en) 2015-08-04 2015-08-04 The unmanned air vehicle for castaway tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150110110A KR101739262B1 (en) 2015-08-04 2015-08-04 The unmanned air vehicle for castaway tracking

Publications (2)

Publication Number Publication Date
KR20170016684A true KR20170016684A (en) 2017-02-14
KR101739262B1 KR101739262B1 (en) 2017-05-24

Family

ID=58121095

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150110110A KR101739262B1 (en) 2015-08-04 2015-08-04 The unmanned air vehicle for castaway tracking

Country Status (1)

Country Link
KR (1) KR101739262B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102626835B1 (en) 2018-10-08 2024-01-18 삼성전자주식회사 Method and apparatus of determining path

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101536095B1 (en) * 2015-01-14 2015-07-13 농업회사법인 주식회사 에이치알제주 Grassland management system using drone
KR101535401B1 (en) * 2015-04-01 2015-07-08 오인선 Drone type life ring dropping device

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018161331A1 (en) * 2017-03-09 2018-09-13 邹霞 Unmanned aerial vehicle monitoring system and monitoring method
CN109987228A (en) * 2017-12-30 2019-07-09 广州刀锋智能科技有限公司 A kind of disaster area personnel search and rescue and goods and materials put-on method based on unmanned plane
CN108241349A (en) * 2018-01-15 2018-07-03 梁晓龙 Fire-fighting unmanned plane group system and fire-fighting method
KR20190091064A (en) * 2018-01-26 2019-08-05 한국항공우주연구원 Flight control system and method of aerial vehicles based on electric driven
CN108657452A (en) * 2018-03-30 2018-10-16 利辛县雨若信息科技有限公司 A kind of photoelectric nacelle protective cover
CN110162086A (en) * 2019-03-21 2019-08-23 中山大学 A kind of cluster unmanned plane formation method based on Model Predictive Control frame
CN110162035A (en) * 2019-03-21 2019-08-23 中山大学 A kind of clustered machine people is having the cooperative motion method in barrier scene
CN110162035B (en) * 2019-03-21 2020-09-18 中山大学 Cooperative motion method of cluster robot in scene with obstacle
JP2021024488A (en) * 2019-08-08 2021-02-22 株式会社T&T Drone for emergency evacuation guide
KR20220170078A (en) * 2021-06-22 2022-12-29 태경전자주식회사 Drone traking a target
KR20230104786A (en) * 2021-12-30 2023-07-11 우석대학교 산학협력단 Firefighting drone with thermal imaging camera

Also Published As

Publication number Publication date
KR101739262B1 (en) 2017-05-24

Similar Documents

Publication Publication Date Title
KR101739262B1 (en) The unmanned air vehicle for castaway tracking
US11407526B2 (en) Systems and methods for UAV docking
US11370540B2 (en) Context-based flight mode selection
CN107531217B (en) Apparatus and method for identifying or detecting obstacles
US9665094B1 (en) Automatically deployed UAVs for disaster response
US20180290748A1 (en) Autonomous in-tunnel intelligence, surveillance, and reconnaissance drone
CN112904892A (en) System and method for monitoring with visual indicia
JP6539073B2 (en) Surveillance system and flight robot
JP6539072B2 (en) Surveillance system and flight robot
JP6664209B2 (en) Investigation method using flight-type survey aircraft
WO2020116392A1 (en) Drone system, drone, movable body, drone system control method, and drone system control program
CN113805586A (en) Autonomous fire-fighting special explosion-proof robot
JP2018055362A (en) Monitoring system
US20210011472A1 (en) System, device and method for time limited communication for remotely controlled vehicles
US20220343779A1 (en) System, device and method for time limited communication for remotely controlled vehicles
SEZGİN et al. Color-Based Object Tracking For Unmanned Aerial Vehicles

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant