WO2016210432A1 - Robotic apparatus, systems, and related methods - Google Patents

Robotic apparatus, systems, and related methods

Info

Publication number
WO2016210432A1
WO2016210432A1 PCT/US2016/039638 US2016039638W
Authority
WO
WIPO (PCT)
Prior art keywords
uav
landing
landing platform
emr
unmanned vehicle
Prior art date
Application number
PCT/US2016/039638
Other languages
French (fr)
Inventor
Taylor Duane DIXON
Alton Vincent WELLS
Jake August HEWITT
Ethan Ryan BURNETT
Steven Pierre HICKS
Original Assignee
Apollo Robotic Systems Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/924,475 external-priority patent/US9444544B1/en
Application filed by Apollo Robotic Systems Incorporated filed Critical Apollo Robotic Systems Incorporated
Publication of WO2016210432A1 publication Critical patent/WO2016210432A1/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/91Radar or analogous systems specially adapted for specific applications for traffic control
    • G01S13/913Radar or analogous systems specially adapted for specific applications for traffic control for landing purposes
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0027Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/0202Control of position or course in two dimensions specially adapted to aircraft
    • G05D1/0204Control of position or course in two dimensions specially adapted to aircraft to counteract a sudden perturbation, e.g. cross-wind, gust
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04Control of altitude or depth
    • G05D1/06Rate of change of altitude or depth
    • G05D1/0607Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004Transmission of traffic-related information to or from an aircraft
    • G08G5/0013Transmission of traffic-related information to or from an aircraft with a ground station
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0026Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located on the ground
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0069Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073Surveillance aids
    • G08G5/0082Surveillance aids for monitoring traffic from a ground station
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073Surveillance aids
    • G08G5/0091Surveillance aids for monitoring atmospheric conditions
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
    • G08G5/025Navigation or guidance aids
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/025Services making use of location information using location based information parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]

Definitions

  • the present disclosure is generally related to robotic systems and related methods, and more particularly is related to unmanned vehicle safety management systems and methods for unmanned vehicle safety management implementation; an unmanned vehicle communication system and methods for unmanned vehicle communication through short message service (SMS); a non-inertial frame lock for unmanned aerial vehicles; a foldable unmanned aerial vehicle landing platform; an unmanned aerial vehicle landing and containment station; a method and design for unmanned aerial vehicle delivery optimization; and a system and methods for determining wind vectors using power consumption data from an unmanned vehicle.
  • the disclosure has particular utility for use with drones and will be described in connection with such utility, although other utilities are contemplated.
  • Robotic devices are increasingly becoming integrated within society, business, and existing technologies.
  • robotic devices are used throughout the manufacturing industries to manufacture automobiles, machinery, computer components, and countless other goods.
  • Robots are also being relied on increasingly to perform service activities.
  • robots are routinely used for cleaning, for delivering goods in manufacturing facilities, and to perform military activities where a human injury or fatality is at risk.
  • As used herein, "UAV" refers to an unmanned aerial vehicle and "UAS" refers to an unmanned aircraft system.
  • While UAV capabilities are steadily improving, there are numerous areas where further enhancements are needed to better utilize UAVs in the future. These areas include, but are not limited to, the following aspects of UAV use:
  • UAV flight requires a UAV to travel from one set point to another, but to also maintain flight stability.
  • One component of flight stability may be to ensure the UAV is able to keep a stable roll and pitch above a fixed position. This is specifically important for a successful UAV landing and to ensure the safety of the UAV and its surroundings.
  • Some UAVs use image detection technology to identify a specific landing place; once the UAV identifies said landing place, the UAV begins its descent to land.
  • Other UAVs may allow another entity to take control of the UAV at the time of landing to provide more guidance in the process.
  • However, many of these technologies do not address the importance of stability throughout the UAV's landing; during the landing phase, the UAV must maintain stability to ensure its safety.
  • one side of the UAV may reach the ground or landing platform before another side and cause an uneven shift in the UAV position. Consequently the UAV may tip over or fall, possibly damaging the UAV itself and/or the UAV's surroundings.
  • Vehicle landing platform: Standard methods of delivery have improved over the past years, as many large companies offer the option of one-day deliveries. However, most of these delivery methods involve the process of an item being picked up at a facility, the item being shipped to a specific shipping carrier, and the shipping carrier completing the process by delivering the item to the buyer. The use of UAVs can drastically simplify this process of delivery and reduce the amount of time it takes to complete a delivery. Current UAV delivery methods offer no reliable landing platform for the delivery of an item. In addition, a majority of UAV owners and users lack a consistent and reliable landing platform, an essential piece of hardware for any sophisticated system that employs UAVs.
  • With UAV uses expanding beyond military applications, there is an evident need for portable and convenient housing units for UAVs. Additionally, aside from recreation, UAVs have been used for agricultural purposes (e.g. crop inspection), photography, and search and rescue, to name a few. As public interest in using UAVs to accomplish a variety of tasks increases, there is a greater demand for UAVs, and consequently UAV owners may need to manage multiple UAVs at once.
  • Wind maps, which show wind speed and direction, are used throughout the aeronautics industry. Wind mapping plays an important role in various applications, such as weather broadcasting, aviation, UAVs, etc. Wind mapping technologies for UAVs include, but are not limited to, sonic anemometers, hot-wire anemometers, pitot tubes, etc. However, many of these technologies consume power that could limit the unmanned vehicle's flight time. The usefulness of a UAV is often limited by the battery life of the vehicle.
  • Embodiments of the disclosure relate to an unmanned vehicle safety management system and methods for unmanned vehicle safety management implementation.
  • a cloud-based unmanned vehicle safety management system for registering and enabling a plurality of unmanned vehicles to accomplish the full extent of their potential tasks while obeying federal and private restrictions.
  • the system comprises a UV communication interface, a network interface, a memory and a microprocessor coupled to the aforementioned components.
  • the safety management system may further comprise a radar to identify locations of any unmanned vehicles within a geographic area. Information related to the geographic area may be preloaded within the memory.
  • the UV communication interface communicates to at least one unmanned vehicle via a UV communication link.
  • the UV communication link is a two-way wireless link supporting both downlink and uplink communication.
  • the unmanned vehicle may transmit signals to or receive signals from the UV communication link.
  • the information transmitted from the unmanned vehicle may include, but is not limited to, the location of the UV, the height of the UV, the speed of the UV, power reserve information, etc.
  • the information transmitted to the unmanned vehicle may include, but is not limited to, any applicable restricted areas near the UV, any tall buildings along the travelling path of the UV, any nearby landing stations/zones, etc.
  • the safety management system assigns travel paths to registered and reporting unmanned vehicles (utilizing path-searching algorithms such as breadth-first, depth-first, A*, or other heuristic methods) to ensure the safety of unmanned vehicles and their surroundings (a simplified sketch of such a risk-weighted path search follows this overview).
  • manually controlled unmanned vehicles registered with the safety management system may be limited to one or more zones of operation by the same safety management system.
  • the management system accomplishes said tasks by assessing risk factors associated with geographical areas (e.g. area population, ground-based activities, tall buildings or structures, weather) and federal, state, or local restrictions (e.g. restricted airspace or roadways, unmanned vehicle operation or use, quantity of unmanned vehicles) so as to appropriately enforce the height, speed, and travel path of unmanned vehicles registered with the management system.
  • the management system may also have the ability to permit specific unmanned vehicles to bypass restrictions if necessary, manually control any unmanned vehicles that attempt to illegally bypass set restrictions, and assess the health of any registered and reporting unmanned vehicle. Additionally, the management system may be further equipped to detect illegal unmanned vehicle presence via low level radar, and also give unmanned vehicles commands based on power consumption predictions. All communications between this safety management system and the unmanned vehicles registered with it will be accomplished through computer units located on the unmanned vehicles, or indirectly through a separate computer that facilitates communication between the safety management system and the unmanned vehicle.
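  • A minimal illustration of how such a risk-weighted path search might be carried out, assuming a simple grid of per-cell risk values; the grid resolution, cost weighting, and function names below are illustrative rather than taken from the disclosure:

```python
import heapq

def plan_path(grid_risk, start, goal):
    """Search a 2D grid of risk values for a low-risk route from start to goal.

    grid_risk maps (row, col) -> numeric risk (higher = riskier); cells absent
    from the mapping are treated as untraversable (e.g. restricted zones).
    Returns a list of grid cells from start to goal, or None if no route exists.
    """
    def heuristic(a, b):
        # Manhattan distance: admissible on a 4-connected grid with step cost >= 1.
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    frontier = [(heuristic(start, goal), 0.0, start, [start])]
    best_cost = {start: 0.0}
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if nxt not in grid_risk:
                continue                                  # restricted or unmapped cell
            new_cost = cost + 1.0 + grid_risk[nxt]        # step cost plus the cell's risk value
            if new_cost < best_cost.get(nxt, float("inf")):
                best_cost[nxt] = new_cost
                heapq.heappush(frontier,
                               (new_cost + heuristic(nxt, goal), new_cost, nxt, path + [nxt]))
    return None
```

  In this sketch the per-cell risk plays the role of the zone-based risk values described elsewhere in this disclosure.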
  • a method for an unmanned vehicle using SMS to relay information to and from a separate management system is disclosed. All or portions of the payload/data to be transmitted between the unmanned vehicle and the management system use SMS, specifically payload/data that aids in the operation and location determination (such as GPS data) of the unmanned vehicle.
  • This method of communication offers a new, low power solution for unmanned vehicle communication by utilizing existing cellular technology.
  • the unmanned vehicle and management system are defined as having at least one computer unit/device, with the management system also requiring a user interface and subsequent additional computers and/or systems to accomplish its management tasks.
  • an unmanned vehicle capable of communication through SMS.
  • the UV may include a telemetry unit to communicate with a cellular network via a bidirectional communication link supporting SMS communication with at least one communication protocol.
  • a processor is coupled to the telemetry unit.
  • a memory device comprises a non-transitory computer-readable medium comprising computer executable instructions that, when executed by the processor, cause the UV to: receive information from the cellular network via the telemetry unit; and transmit short messages to the cellular network via the telemetry unit, the short messages being refreshed and transmitted actively from the unmanned vehicle on a repetitive schedule with a repetitive interval.
  • a method for SMS between an unmanned vehicle and a cellular network is disclosed.
  • a short message is compiled from information obtained from at least one of the current location, speed, altitude, destination, and power reserve of the unmanned vehicle.
  • the short message is sent on a repetitive schedule over a communication path between the UV and the cellular network, the communication path supporting at least one communication protocol.
  • the short messages are sent from the unmanned vehicle using a first communication protocol; if no reply or confirmation is received within a time threshold, the short messages are sent via a second communication protocol different from the first communication protocol; if still no reply or confirmation is received within the time threshold, the short messages are sent via the first communication protocol again, wherein the time threshold is smaller than the repetitive interval, and wherein the short messages are refreshed on the repetitive schedule.
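  • The alternating-protocol retry behaviour described above might look like the following sketch; compile_status, send_via, and wait_for_ack are assumed callables standing in for whatever cellular stack the vehicle actually carries, and the interval and timeout values are placeholders:

```python
import time

def report_loop(compile_status, send_via, wait_for_ack,
                protocols=("sms_protocol_a", "sms_protocol_b"),
                repeat_interval=60.0, ack_timeout=15.0):
    """Send a freshly compiled status message on a fixed repetitive schedule.

    If no reply or confirmation arrives within ack_timeout (which must be shorter
    than the repeat interval), the same message is resent over the other protocol,
    alternating until the next scheduled refresh.
    """
    assert ack_timeout < repeat_interval
    while True:
        message = compile_status()                # location, speed, altitude, destination, power reserve
        deadline = time.monotonic() + repeat_interval
        proto_index = 0
        while time.monotonic() < deadline:
            send_via(protocols[proto_index], message)
            if wait_for_ack(ack_timeout):         # blocks for up to ack_timeout seconds
                break                             # confirmed; wait for the next refresh
            proto_index = 1 - proto_index         # fall back to the other protocol and retry
        time.sleep(max(0.0, deadline - time.monotonic()))
```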
  • Embodiments of the disclosure also relate to a UAV non-inertial frame lock system.
  • a UAV non-inertial frame lock system has a landing platform having at least one electromagnetic radiation (EMR) emitter positioned thereon.
  • a UAV has at least three EMR receivers.
  • a computational unit of the UAV uses at least one EMR signal communicated between the EMR emitter and at least one of the at least three EMR receivers to control at least one of a position, a roll, and a pitch of the UAV during a landing process.
  • a UAV non-inertial frame lock system has a landing platform having at least three electromagnetic radiation (EMR) receivers positioned thereon.
  • a UAV has at least one EMR emitter.
  • a landing platform computational unit receives at least one EMR signal from the at least one EMR emitter, processes the at least one EMR signal, and communicates it to a UAV computational unit, wherein the UAV computational unit uses at least one processed EMR signal to control at least one of a position, a roll, and a pitch of the UAV during a landing process.
  • a method of controlling a landing of a UAV with a non-inertial frame lock system is disclosed.
  • a UAV is positioned proximate to a landing platform having at least one electromagnetic radiation (EMR) emitter.
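  • One way to picture the frame-lock computation described in the preceding items is as a differential comparison of the signal strengths seen by three or more receivers; the sketch below estimates only a lateral (position) correction and assumes a symmetric receiver layout, an illustrative gain, and hypothetical input names:

```python
def frame_lock_correction(receiver_strengths, receiver_offsets, gain=0.1):
    """Estimate a lateral correction that centres the UAV over an EMR emitter.

    receiver_strengths: one measured signal strength per receiver.
    receiver_offsets:   matching (x, y) receiver positions in the UAV body frame.
    Returns an (x, y) nudge toward the strongest signal; it is zero when the signal
    is balanced across the receivers, i.e. the UAV is locked over the emitter.
    """
    total = sum(receiver_strengths)
    if total == 0:
        return (0.0, 0.0)                          # no signal detected: hold position
    # Signal-weighted centroid of the receiver positions points toward the emitter.
    cx = sum(s * x for s, (x, _) in zip(receiver_strengths, receiver_offsets)) / total
    cy = sum(s * y for s, (_, y) in zip(receiver_strengths, receiver_offsets)) / total
    return (gain * cx, gain * cy)
```

  A roll or pitch correction could be derived analogously from the same differential readings during the landing process.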
  • Embodiments of the disclosure also relate to a foldable unmanned aerial vehicle landing platform.
  • an aerial robotic landing apparatus has a landing platform formed from a plurality of portions.
  • a plurality of warning devices are positioned along an outer edge of the landing platform, wherein the plurality of warning devices are activatable by a UAV during a landing of the UAV on the landing platform.
  • a robot landing platform has a polygon-shaped platform having a plurality of sections.
  • a flexible junction is formed between each of the plurality of sections, wherein the polygon-shaped platform is foldable along each of the flexible junctions.
  • a method of landing a UAV on a landing platform is disclosed.
  • a communication link is established between an in-flight UAV and a landing platform positioned on a grounded surface.
  • At least one warning on the landing platform is activated by the in-flight UAV through the communication link, wherein the at least one warning is identifiable by a human observer of the landing platform.
  • Embodiments of the disclosure also relate to an unmanned aerial vehicle landing and containment station.
  • a UAV landing station apparatus has support legs contactable to a grounded surface.
  • An electronics bay is supported by the support legs.
  • a support column extends vertically from the electronics bay.
  • a landing platform support is positioned vertically above the electronics bay with the support column.
  • a charging ring is formed on the landing platform.
  • a UAV landing station apparatus has a plurality of panels forming an enclosable area, wherein at least a portion of the plurality of panels are roof panels.
  • a landing platform is formed as a floor of the enclosable area, wherein the roof panels are movable between open and closed positions, wherein in the open position, the landing platform is accessible for landing a UAV.
  • a UAV landing station apparatus has a landing platform with a charging ring formed on the landing platform.
  • the UAV is landed on the landing platform, wherein at least one leg of the UAV contacts the charging ring.
  • power is supplied to the charging ring to recharge a power source of the UAV.
  • Embodiments of the disclosure also relate to a method and design for unmanned aerial vehicle delivery optimization.
  • an unmanned aerial vehicle apparatus has a control board and at least one rotor plate connectable to the control board, wherein the at least one rotor plate has a plurality of rotor arms extending outwards therefrom.
  • a modular unmanned aerial vehicle apparatus has a control board.
  • a plurality of rotor plates is connected to the control board, wherein the plurality of rotor plates each have at least four rotor arms extending outwards therefrom, wherein each of the rotor arms carries at least one rotor.
  • a plurality of modular, stacked batteries is connected by at least one of: the control board and the plurality of rotor plates.
  • a payload area is included in the UAV.
  • a method of delivering payloads with an unmanned aerial vehicle is disclosed.
  • a weight of a payload is determined.
  • a delivery distance of the payload is determined.
  • a quantity of UAV rotors required for the determined weight and delivery distance of the payload is selected.
  • the payload is delivered.
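  • A toy illustration of the selection step described above, assuming simple per-rotor lift and energy budgets; every constant here is invented for the example and is not taken from the disclosure:

```python
def select_rotor_count(payload_kg, distance_km,
                       lift_per_rotor_kg=1.5, airframe_kg_per_rotor=0.4,
                       battery_kg=0.5, km_per_battery=8.0):
    """Pick the smallest rotor count, in multiples of four (one modular rotor
    plate carries four arms), able to lift the payload plus its own airframe
    and the batteries needed for the delivery distance."""
    batteries = max(1, round(distance_km / km_per_battery))
    for plates in range(1, 9):                     # consider 1..8 stacked rotor plates
        rotors = 4 * plates
        airframe = rotors * airframe_kg_per_rotor + batteries * battery_kg
        if rotors * lift_per_rotor_kg >= payload_kg + airframe:
            return rotors
    raise ValueError("payload/distance exceeds the largest configuration considered")
```

  Under these toy numbers, select_rotor_count(2.0, 5.0) returns 4, while a heavier payload or longer delivery would step up to the next rotor plate.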
  • this disclosure describes an unmanned vehicle using its power consumption data to determine wind vectors, with the computations made by an external management system using that power consumption data. These vectors are mapped to a graphical user interface, which displays not only the wind but also the unmanned vehicle(s), along with a general map of the area and the unmanned vehicle's planned flight path.
  • This method of computing wind vectors offers a low power solution for determining the direction and intensity of wind with respect to an unmanned vehicle.
  • the unmanned vehicle is able to save the battery power that onboard wind mapping would consume, thus allowing improved overall flight time or allocating more power to a specific function of the unmanned vehicle.
  • the wind represents a dynamic variable that will account for the rate of change of wind velocity as well as the intensity or strength of the wind as determined by its velocity and force.
  • the UAV consumes more or less power from the battery, depending on parameters including wind speed and direction.
  • power consumption data depends on wind factors and the angle at which the wind is moving with respect to the UAV.
  • the power consumption data can be analyzed using external computing by a management system to determine the cause of variations in the power consumption data. This computing gives sufficient data to determine the direction and intensity of the wind to a specific accuracy (an illustrative computation is sketched after the interface description below).
  • a graphical user interface includes a map of the UAV, the UAV's flight path, and wind vectors.
  • the details of the flight may be seen in a web browser or other computer window.
  • the background of the GUI may be a map displaying the general area of where the UAV takes off, where the UAV lands, and places in between as shown by the flight path.
  • the map may change as the flight path changes and may be interactive through scrolling, or it may automatically follow the UAVs through tracking.
  • the GUI may display more than one UAV and how each flight path is used to determine wind direction and intensity as shown by the imaging of wind vectors on the map.
  • the unmanned vehicle's flight path may be in the direct path of a wind, which may affect the UAV's power consumption.
  • an external management system is able to compute the direction and intensity of the wind, which is displayed as a visual representation of wind vectors on the GUI.
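  • A rough sketch of the external computation referred to above, given paired samples of heading and motor power draw; the single-harmonic cosine model and the assumption of well-spread headings are illustrative simplifications, not the disclosed method:

```python
import math

def estimate_wind(samples):
    """Estimate wind direction and a relative intensity from (heading_rad, power_watts) samples.

    Model: power ~ P0 + A*cos(heading - wind_from), i.e. the motors draw the most power
    when the vehicle flies into the wind.  With headings spread around the circle, the
    first Fourier coefficients recover the phase (the direction the wind blows from) and
    the amplitude (a proxy for intensity, not a speed in m/s).
    """
    n = len(samples)
    a = 2.0 / n * sum(p * math.cos(h) for h, p in samples)
    b = 2.0 / n * sum(p * math.sin(h) for h, p in samples)
    wind_from = math.atan2(b, a) % (2 * math.pi)
    intensity = math.hypot(a, b)
    return wind_from, intensity
```

  The resulting (direction, intensity) pairs are the kind of values that would be drawn as wind vectors on the GUI described above.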
  • This disclosure describes a method of compiling a plurality of aerial images taken by unmanned aerial vehicles (UAVs) in order to create a single two-dimensional and/or three-dimensional virtual display or environment in real-time, as defined by the limitations or delay inherent with the methods and system used to accomplish the present invention.
  • Multiple images would be sent from multiple UAVs to an image compilation server, which in turn would perform one or more algorithmic processes so as to construct the virtual 3D environment of the present invention.
  • the image compilation server would then have the ability to send the virtual 3D environment data to an enabled computer that would display the environment in a virtual reality, augmented reality, or other similarly capable display format.
  • a light display is mounted on a rotor blade of an unmanned aerial vehicle (UAV).
  • the display includes an array of lights.
  • the array is attached to the underside of the rotor blade, or is otherwise embedded inside the rotor blade itself.
  • a separate computational device, which is equipped with the specific rotations per minute (RPM) of the rotor blades of the UAV, will transmit the current RPM of the rotor blades to the microprocessor.
  • the microprocessor will then have the ability to transmit signals to the array of lights at a frequency corresponding to the RPM input, so as to form a stable image via persistence of vision (a timing sketch follows this description).
  • a microprocessor mounted on the UAV and individually connected to the array of lights by a cable, includes a memory which stores a plurality of display patterns.
  • the microprocessor modulates the array of lights according to a selected one of the plurality of display patterns and the preprogrammed angular velocity of the rotating blade to form an image using persistence of vision of a viewer.
  • Three different implementations with respect to the array of lights are described, including utilizing LEDs, laser diodes and an array of reflectors, and finally light sources coupled to the ends of optical fibers.
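  • The timing relationship referred to above reduces to mapping the blade's instantaneous angle, derived from the reported RPM, onto a stored pattern column; the function and data layout below are an illustrative sketch, not the disclosed firmware:

```python
import time

def column_for_instant(rpm, columns, t=None, phase_offset=0.0):
    """Return the pattern column that should be lit at time t for a blade spinning at rpm.

    columns: per-angle light states (e.g. tuples of booleans), evenly spaced around one
    revolution.  Because the angle is derived from the RPM input, the lit pattern stays
    fixed in space and is perceived as a stable image via persistence of vision.
    """
    if t is None:
        t = time.monotonic()
    revolutions_per_second = rpm / 60.0
    fraction_of_rev = (t * revolutions_per_second + phase_offset) % 1.0
    return columns[int(fraction_of_rev * len(columns))]
```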
  • this disclosure describes a method of utilizing a mobile device equipped with a camera and electronic visual display so as to identify UAVs within the field of vision of the mobile device's camera, and quickly cross-check the location data of the UAV(s) in view of the camera with location data of a UAV management system, so as to inform a user via the same mobile device as to whether or not the observed UAV(s) are registered with or known to the UAV management system (an illustrative matching sketch appears below).
  • the mobile device is further described as being capable of displaying the location of UAVs known to the UAV management system, in addition to UAVs unknown to the UAV management system that have been observed by a user implementing the method of the present invention.
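  • The cross-check itself can be pictured as a nearest-registered-vehicle lookup within some tolerance; the tolerance value and the flat-earth distance approximation below are assumptions for illustration:

```python
import math

def classify_observation(observed, registry, tolerance_m=50.0):
    """Decide whether an observed UAV position matches a vehicle known to the management system.

    observed: (lat, lon) estimated for the UAV seen through the mobile device's camera.
    registry: dict of uav_id -> (lat, lon) most recently reported to the management system.
    Returns the matching uav_id, or None if no registered vehicle is within the tolerance,
    in which case the UAV would be presented to the user as unknown.
    """
    def approx_distance_m(a, b):
        # Equirectangular approximation: adequate over the short ranges a camera can resolve.
        dlat = math.radians(b[0] - a[0])
        dlon = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2.0))
        return 6371000.0 * math.hypot(dlat, dlon)

    best_id, best_dist = None, tolerance_m
    for uav_id, position in registry.items():
        dist = approx_distance_m(observed, position)
        if dist <= best_dist:
            best_id, best_dist = uav_id, dist
    return best_id
```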
  • Figure 1A shows a component diagram of the unmanned vehicle safety management system, in accordance with an embodiment of the present disclosure.
  • Figure 1B shows the core actors that interact with the safety management system and other relevant systems and actors that may be indirectly involved, in accordance with an embodiment of the present disclosure.
  • Figures 2A-2C show a graphical user interface used to interact with the safety management system, including establishing various types of zones to create a risk- based topography and changing a variety of settings with respect to the unmanned vehicles and zones involved, in accordance with an embodiment of the present disclosure.
  • Figure 3 shows a graphical representation of how zones created to establish a risk-based topography would be interpreted by the safety management system, and in turn how a path determined by a pathing algorithm would be sent to an unmanned vehicle, in accordance with an embodiment of the present disclosure.
  • Figures 4A-4D show a graphical user interface used to identify illegal unmanned vehicles versus vehicles registered with the safety management system through the process of overlaying real-time locations of unmanned vehicles with a low level radar map, in accordance with an embodiment of the present disclosure.
  • Figures 5A-5B show a graphical user interface that displays how a power management system may prevent any dangerous traveling activity based on reading a value of potential directly from a battery of an unmanned vehicle, in accordance with an embodiment of the present disclosure.
  • FIG. 6 shows an environment and method by which unmanned vehicles, computers, and the cloud-based safety management system would communicate, in accordance with an embodiment of the present disclosure.
  • Figure 7 shows a configuration and specification of components located on an unmanned vehicle registered with and reporting to the safety management system, in accordance with an embodiment of the present disclosure.
  • Figure 8 shows a process of communication between the external management system, the cellular network, and the unmanned vehicle, in accordance with an embodiment of the present disclosure.
  • Figure 9 shows a minimum composition of the device that is present in the unmanned vehicle, and may be present in the external management system, in accordance with an embodiment of the present disclosure.
  • Figure 10 shows an example communication path diagram between the unmanned vehicle, an external management system and a landing platform, in accordance with an embodiment of the present disclosure.
  • Figure 11 shows one possible configuration of unmanned vehicle positions for a graphical user interface showing the position of unmanned vehicles, cellular towers, and how communication between them occurs, in accordance with an embodiment of the present disclosure.
  • Figure 12 shows an example implementation circuit diagram for the Non-Inertial Frame Lock, in accordance with an embodiment of the present disclosure.
  • Figure 13 shows the flow diagram of the computational unit of the Non-Inertial Frame Lock, in accordance with an embodiment of the present disclosure.
  • Figures 14A-14B show a top-down view of a landing platform and imbedded wiring and circuitry with the electromagnetic radiation receivers and/or electromagnetic radiation emitters, in accordance with an embodiment of the present disclosure.
  • Figures 15A-15D show a UAV approaching and landing on a compatible platform with the electromagnetic wave receivers located on the UAV, and the electromagnetic radiation emitter located on the platform, in accordance with an embodiment of the present disclosure.
  • Figures 16A-16E show an opposite configuration with respect to the example implementation in Figs. 15A-15D, where the UAV is portrayed having the electromagnetic radiation emitter and the compatible platform having the electromagnetic radiation receivers, in accordance with an embodiment of the present disclosure.
  • Figures 17A-17B show a side view of the example embodiment referred to in Figs. 15A-15D, with a UAV aligning itself with a landing platform, and a graphical representation of the electromagnetic radiation being emitted from the landing platform, in accordance with an embodiment of the present disclosure.
  • Figures 18A-18C show an example configuration for electromagnetic radiation receivers on a UAV, utilizing a protruding rectangular apparatus to suspend the receivers beneath the UAV, in accordance with an embodiment of the present disclosure.
  • Figure 19 shows a closer view of a rail fitted with electromagnetic radiation emitters on one of its sides, used specifically for the implementation described in following figures, in accordance with an embodiment of the present disclosure.
  • Figures 20A-20B show a UAV approaching the rail described in Fig. 19, the UAV restricted to movement in only one plane, in accordance with an embodiment of the present disclosure.
  • Figures 21A-21C show a UAV hovering very closely above the rail described in Fig. 19, at what may be a minimal distance of interaction between the UAV and the rail for the purposes of the implementation described, in accordance with an embodiment of the present disclosure.
  • Figure 22 shows an example logic diagram of a landing platform's computational unit, in accordance with an embodiment of the present disclosure.
  • Figure 23 shows a top-down view of the landing platform design with imbedded wiring and circuitry, in accordance with an embodiment of the present disclosure.
  • Figures 24A-24G show platform foldability and what the landing platform looks like when completely folded, in accordance with an embodiment of the present disclosure.
  • Figures 25A-25C show a UAV approach to and landing on the landing platform, in accordance with an embodiment of the present disclosure.
  • Figures 26A-26E show examples of the folding process and final form for the housing of a UAV using the landing platform, in accordance with an embodiment of the present disclosure.
  • Figures 27A-27B show an example implementation of a cellular communication device being used to communicate the landing platform's location to the UAV via an external management system, in accordance with an embodiment of the present disclosure.
  • Figures 28A-28H show a first example station embodiment and its components, in accordance with an embodiment of the present disclosure.
  • Figures 29A-29H show an unmanned aerial vehicle approaching and landing on the station initially described in Figures 28A-28H, in accordance with an embodiment of the present disclosure.
  • Figures 30A-30B show a second example station embodiment that may additionally house an unmanned aerial vehicle, in accordance with an embodiment of the present disclosure.
  • Figures 31A-31E show the station initially described in FIGS. 30A-30B in more detail and from multiple perspectives, specifying components and systems within the station, in accordance with an embodiment of the present disclosure.
  • Figures 32A-32C show the components and assembly technique of stacking components to create an unmanned aerial vehicle, in accordance with an embodiment of the present disclosure.
  • Figure 33 shows the algorithmic logic of an example delivery optimization method used to construct an unmanned aerial vehicle as described in FIGS. 32A-32C, in accordance with an embodiment of the present disclosure.
  • Figure 34 shows a block diagram of an unmanned vehicle in communication with a management system for wind mapping using power consumption data, in accordance with an embodiment of the present disclosure.
  • Figure 35 shows an exemplary configuration of an unmanned vehicle with wind that may be present during the unmanned vehicle's flight and how the wind may approach the unmanned vehicle at various angles, in accordance with an embodiment of the present disclosure.
  • Figure 36 shows a graph of the power consumption data specific to the motors of the unmanned vehicle relative to the direction of the unmanned vehicle according to the example embodiment of FIG. 35, in accordance with an embodiment of the present disclosure.
  • FIG. 37 shows an example graphical user interface (GUI), in accordance with an embodiment of the present disclosure.
  • GUI graphical user interface
  • Figure 38 shows a block diagram of a wind mapping system, in accordance with an embodiment of the present disclosure.
  • Figure 39 shows an example flow diagram for wind mapping, in accordance with an embodiment of the present disclosure.
  • Figures 40A and B show the underside of a UAV, where a stationary locked camera is attached facing the ground.
  • Figure 41 shows a diagram of generic system elements, specific to an implementation where five UAVs communicate with a single image compilation server, which in turn communicates with a computer.
  • Figure 42 shows a more specific block diagram of the components comprising a UAV, and its communication with the image compilation server.
  • Figure 43 shows the algorithmic process by which the image compilation server creates the virtual 3D environment of the present invention, given images acquired by UAVs.
  • Figures 44A-C show a specific implementation with respect to the general implementation of utilizing five UAVs as in Fig. 41, with the geographic locations of the UAVs on a map, along with the area capable of image capture specific to each UAV.
  • Figures 45A-I show a UAV above a building over time, with the perspective of the camera on the UAV shown, and the UAV capturing images over a multitude of angles and perspectives.
  • Figures 46A-C show the first implementation of the present invention, where a rotor blade is equipped with an array of individual lights on its bottom face, shown from multiple perspectives.
  • Figures 47A-D show a second implementation of the present invention, where a rotor blade is equipped with a horizontal array of laser diodes at the center of rotation, and the laser diode beams are directed along hollow cavities inside the blades to equidistant 45-degree diffuse reflectors to yield an effect identical to that of the first implementation.
  • Figures 48A-D show a third implementation of the present invention, where optical fibers are embedded in the rotor blade so as to guide light sources (located at the center of rotation of the rotor blade, similar to the second implementation) to a diffuse or semi-diffuse emission along equidistant points of the rotor blade.
  • Figure 49 shows a block diagram of the control circuitry and electronic components of the present invention.
  • Figures 50A-D show several examples of messages and display possibilities of which the present invention is capable.
  • Figure 51 shows the underside of a UAV utilizing the rotor blades of the present invention to display a message.
  • Figure 52 shows a diagram of generic system elements, specific to an implementation where three UAVs communicate with a single UAV management system, which in turn communicates with an appropriately equipped mobile device. An unknown UAV not in communication with the UAV management system is also shown.
  • Figures 53A-B show the perspective of a user utilizing an example implementation of the present invention, in which four UAVs are present within the view of a user, and the mobile device displays whether or not a selected UAV within the display of the mobile device is registered with the UAV management system of the present invention.
  • Figures 54A-B show another implementation of the present invention, in which a UAV management system would be capable of displaying the location of a user's mobile device on the display of the mobile device in addition to the location of UAVs known (or potentially unknown) to exist in a given airspace.
  • Figure 55 shows a distanced perspective of a user with a mobile device, observing four UAVs as consistently described in FIG. 52, FIGs. 53A-B and FIG. 54A-B. Additionally, lines of perspective are drawn to show the specific field of view presented in FIGs. 53A-B.
  • Embodiments of the disclosure may take the form of computer-executable instructions, including algorithms executed by a programmable computer or a computational unit. However, the disclosure can be practiced with other computer system configurations as well. Certain aspects of the disclosure can be embodied in a special-purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer-executable algorithms described below.
  • a device may include a variety of computing hardware and software, including the use of computerized communication networks.
  • the computing systems disclosed may include any type of computing or computational device, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a network server, and the like.
  • Any of the computing systems described herein may include various computing components, such as at least one processor, an input/output (I/O) interface, and a memory.
  • the at least one processor may be implemented as one or more microprocessors, microcomputers, and the like.
  • the at least one processor is configured to fetch and execute computer-readable instructions stored in the memory.
  • the I/O interface may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like.
  • the I/O interface may allow the system to interact with a user directly or through the client devices. Further, the I/O interface may enable communication with other computing devices, such as web servers and external data servers.
  • the I/O interface can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite.
  • the I/O interface may include one or more ports for connecting a number of devices to one another or to another server.
  • the memory used by the computing device may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable read only memory (EPROM), flash memory, and the like.
  • the memory may include the programmed instructions and data.
  • the disclosed systems and methods may utilize various communication networks.
  • a network may be a wireless network, a wired network or a combination thereof.
  • the network can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the Internet, and the like.
  • the computing devices may be implemented in a cloud- based environment and may be accessed by multiple other computing systems through one or more devices communicatively coupled to the computing device through a network.
  • the network may be a dedicated network or a shared network.
  • the shared network may represent an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), and Wireless Application Protocol (WAP).
  • the network may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
  • This disclosure pertains to various improvements in the field of robotics, and particularly in the use of UAVs and other aerial systems. While the disclosure describes the subject matter relative to UAVs, it is noted that the disclosure is not limited to UAVs and may be equally applied to other robotic devices, including manned, unmanned, aerial, and/or grounded robotic devices, or any combination thereof.
  • Unmanned Vehicle Safety Management To assist in the safe use of UAVs for commercial and recreational use, a management system specially configured for unmanned vehicle safety may be implemented. Such a system may contain all data specific to any deployed unmanned vehicles, while also serving as an environment to set restrictions on said unmanned vehicles.
  • a cloud-based management system may not only enable the registering and tracking of unmanned vehicles, but also restrict or limit unmanned vehicles to specific travel configurations. These restrictions may include determining specific areas through which it is and is not safe to travel, any speed restrictions, and, in the case of aerial vehicles, operational height restrictions.
  • the goal of such a system may be to provide an administrator with a streamlined environment that makes it easy to perform the tasks mentioned above and also for a common user to view areas of travel restrictions and pertinent data.
  • FIG. 1A shows a component diagram of the unmanned vehicle (UV) safety management system 10 in accordance with an embodiment of the present disclosure.
  • the system 10 comprises a UV communication interface 20, a network interface 30, a memory 50 and a microprocessor 40 coupled to the aforementioned components.
  • the microprocessor 40 may be a microprocessor, a central processing unit (CPU), a digital signal processing (DSP) circuit, a programmable logic controller (PLC), or a combination thereof.
  • the network interface 30, memory 50 and the microprocessor 40 are all disposed within a server 15.
  • the server may be a cloud-based server.
  • some or all of the functionalities described herein as being performed by the unmanned vehicle safety management system 10 may be provided by the microprocessor 40 executing instructions stored on a non-transitory computer-readable medium, such as the memory 50, as shown in Figure 1A.
  • the UV communication interface 20 communicates to at least one unmanned vehicle 60 via a UV communication link 22.
  • the UV communication link 22 is a bidirectional wireless link supporting both downlink and uplink communication such that the unmanned vehicle 60 may transmit signals to or receive signals from the UV communication link 22.
  • the UV communication link 22 may be a cellular network link, a Wi-Fi link, a Bluetooth link or a telecommunication channel directly or via a satellite.
  • the information transmitted from the unmanned vehicle 60 may include, but is not limited to, the UV identification number, the location of the UV, the speed of the UV, power reserve information, etc.
  • the power reserve information may also be referred to as battery or fuel reserve information.
  • the signals transmitted to the unmanned vehicle 60 may be an alert, a command for landing or changing the travel path, etc.
  • the signal transmitted to the unmanned vehicle 60 may include, but is not limited to, any applicable restricted areas near the UV, any tall buildings along the travelling path of the UV, any nearby landing stations/zones, etc.
  • the unmanned vehicle 60 may comprise the necessary communication means to communicate with the UV safety management system 10.
  • One exemplary UV component diagram is shown in Figure 7.
  • the safety management system 10 further comprises a radar 55 (typically a low level radar) coupled to the server 15 (or the microprocessor 40) to identify locations of any unmanned vehicles within a geographic area.
  • Information related to the geographic area may be preloaded within the memory 50 or loaded by a user into the memory 50.
  • the information may include geographical information (e.g. area population, ground-based activities, tall buildings or structures, weather) and regulation information, such as any applicable federal, state, or local restrictions (e.g. restricted airspace or roadways, unmanned vehicle operation or use, quantity of unmanned vehicles) within the geographic area.
  • the information may be retrieved and utilized by the microprocessor to appropriately enforce the height, speed, and travel path of unmanned vehicles registered with the management system.
  • the network interface 30 communicates to at least one user device 70 via a network communication link 32.
  • the network communication link 32 is a bidirectional communication link.
  • the network communication link 32 may be a wireless link or a wired link.
  • the network communication link 32 may be a Wi-Fi link, a Bluetooth link, a USB, FireWire, DIN, or an e-SATA/SATA connection.
  • the server 15 hosts a GUI (graphical user interface), which may be visited by a user via a web browser. The GUI receives user input and responds accordingly. A user or customer may use the user device for UV safety management.
  • The safety management system 10 may be operated by multiple administrators (also referred to as actors). Different actors may have different levels of permission and ability to operate the management system.
  • FIG. 1B shows a diagram of an example implementation of the present disclosure that describes multiple actors and their given permissions and abilities with respect to a safety management system. It should be noted that all arrows pointing from one actor to another denote that the actor from which the arrow originates inherits the permissions and abilities of the actor to which the arrow is pointing.
  • the administrator 101 would be an actor who is able to add 102, edit 103, and delete 104 zones. The administrator 101 may also be responsible for granting permission to specific users 111 and entities 115 to travel in restricted zones 106. The administrator 101 may also have the ability to query any of the users 111 or entities 115 and view specific information about them 105. The administrator 101 may view and approve user requests 110 and actions 109.
  • the administrator 101 may also view the current location 107 and operation history 108 of each unmanned vehicle. All system operational settings, including modifying risk-based topography settings 125, modifying unmanned vehicle parameters 126, and modifying unmanned vehicle schedules 127, would also be abilities and permissions available to an administrator 101. Finally, an administrator 101 would have access to all functions of a landing station/zone manager, including the ability to add 128, edit 129, and delete 130 landing stations/zones.
  • a customer 115 or a user 111 may have the ability to search 113 and view zone locations 114, which may be important for viewing potential travel routes and determining which zones an unmanned vehicle would not have access to.
  • the user may also create personal profiles 112 for monitoring their unmanned vehicles, or create entities 116 in which groups of people can collaborate to operate and manage an unmanned vehicle.
  • the entities 115 may have the ability to view 118 and modify 117 their entity person 115 roster, which would allow for the easy addition and subtraction of users.
  • Each customer 115 and entity 116 may have the ability to view their current unmanned vehicle location 107, view the history of operations 108, modify their unmanned vehicle roster 109, and request special permission 110 from the administrator 101.
  • the traffic manager 119 may oversee all unmanned vehicle traffic in one or more defined areas and/or zones.
  • a traffic manager 119 may have the ability to validate zones and/or requests 120 from the administrator 101 or customers 115, which would serve the purpose of ensuring that no conflicting unmanned vehicle traffic and/or differing assigned risk value zones exist.
  • the traffic manager 119 may also validate temporary zone reservations 121 in the scenario that a new or old zone requires modification for a specific period of time due to an unexpected event (e.g. high unmanned vehicle traffic, areas affected by a recent natural disaster, building fire). Additionally the traffic manager 119 may validate paths 122 prior to unmanned vehicle travel and confirm unmanned vehicle landing success 123.
  • the traffic manager 119 may have the ability to allow the unmanned vehicle to bypass said restrictions 124. It should also be noted that the dotted arrow from the traffic manager 119 to the administrator 101 implies that certain permissions and abilities may be inherited by the traffic manager 119 from the administrator 101. These specific inherited permissions and abilities would include risk management (102 through 104), landing station/zone management (128 through 130), and UV management (107 through 110).
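  • The inheritance arrows summarised above amount to a small role hierarchy; the sketch below encodes such a hierarchy as data, with permission names abbreviated from the reference numerals and the exact chain simplified for illustration (it is not a transcription of FIG. 1B):

```python
# Each role lists the permissions it grants directly plus the roles it inherits from.
ROLES = {
    "user":            {"grants": {"create_profile", "search_zones", "view_zone_locations"},
                        "inherits": []},
    "customer":        {"grants": {"view_uv_location", "view_uv_history",
                                   "modify_uv_roster", "request_permission"},
                        "inherits": ["user"]},
    "traffic_manager": {"grants": {"validate_zones_and_requests", "validate_reservations",
                                   "validate_paths", "confirm_landing", "allow_bypass"},
                        "inherits": ["customer"]},
    "administrator":   {"grants": {"add_zone", "edit_zone", "delete_zone",
                                   "manage_landing_stations", "approve_requests"},
                        "inherits": ["traffic_manager"]},
}

def permissions(role, roles=ROLES):
    """Collect a role's effective permissions by walking its inheritance chain."""
    collected = set(roles[role]["grants"])
    for parent in roles[role]["inherits"]:
        collected |= permissions(parent, roles)
    return collected
```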
  • FIG. 2A describes one example implementation of a graphical user interface (GUI) in a web browser, map-based environment 201 being utilized to create a series of geofences to establish a zone 203-1. All of the zones 203 would be created by clicking the map 201 to insert and move zone vertices to create the zone geometries 203. Additionally, the GUI may allow a user to click on objects on a map display so as to quickly create a zone (such as the zones 203) surrounding said object.
  • Specifications and alterations to the properties of the zones 203 may be accomplished through a smaller options window 202.
  • the options window 202 would have an input feature for specifying a risk value for one of the selected zones 203.
  • This input may be a drop-down menu as shown, or another graphical input interface so as to allow a user to manually input a specific risk value.
  • Risk values would follow the convention of representing the risk of an unmanned vehicle failing to operate or operating erratically, introducing the potential for damages to persons or property. Risk values may be directly proportional to values such as the cost of potential crash damages or the statistical probability of a collision. Risk values may be determined by a multitude of other factors related to the area over which a zone is established.
  • the options window 202 may have a feature for establishing different risk values for different times of day, with respect to the same zone.
  • zone 203-2 may have an assigned risk value of 1 from 12am to 8am, a risk value of 2 from 8am to 4pm, and a risk value of 3 at all other times, with the lowest number representing the lowest possible risk, and the highest number representing the highest possible risk.
  • options window 202 may also have a feature for establishing temporary zone restrictions for specific times or days as well as establishing risk values based on variables such as heavy unmanned vehicle traffic or other emergency events. Different bound, scaling, and maximum and minimum value conventions may be employed to establish a similar risk-based topography system.
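  • As a concrete illustration of time-dependent risk values, the zone 203-2 example above could be encoded and queried as follows (the data layout and function names are illustrative):

```python
from datetime import time

def risk_for_zone(schedule, at):
    """Look up a zone's risk value for a given time of day.

    schedule is a list of (start_time, end_time, risk) windows; the first matching
    window wins, and the final entry acts as the catch-all for "all other times".
    """
    for start, end, risk in schedule:
        if start <= at < end:
            return risk
    return schedule[-1][2]

# Zone 203-2 from the example: risk 1 from 12am to 8am, 2 from 8am to 4pm, 3 otherwise.
ZONE_203_2 = [
    (time(0, 0), time(8, 0), 1),
    (time(8, 0), time(16, 0), 2),
    (time(0, 0), time(23, 59, 59), 3),
]

assert risk_for_zone(ZONE_203_2, time(9, 30)) == 2
assert risk_for_zone(ZONE_203_2, time(20, 0)) == 3
```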
  • FIG. 2B shows one example representation of how a customer's GUI 201 may appear.
  • the customer 115 may interface with the safety management system and manage their unmanned vehicle(s) 205, create entities and user profiles, and view zone locations 207.
  • a core feature of the safety management system will be its ability to receive GPS location data specific to unmanned vehicle locations.
  • the customer 115 may view an unmanned vehicle's 205 location either through the graphical representation on the map where an unmanned vehicle 205 may be located, or through another screen, popup, or sidebar that may show exact latitude and longitude coordinates 207 of the unmanned vehicle 205. These methods of viewing an unmanned vehicle 205 are often useful in recovering lost unmanned vehicles 205 and monitoring an unmanned vehicle's 205 behavior.
  • the customer 115 may also track an unmanned vehicle's 205 history of operations 206, which may be important for commercial entities that wish to review travel times and operations. Additionally, the customer 115 may have the ability to add unmanned vehicles 205 to their roster 206 and request permissions to break certain geo-fencing parameters when appropriate 207.
  • FIG. 2C shows another example representation of how an administrator's GUI 201 may appear.
  • a feature of being able to view and query any customers or entities is shown 209.
  • an administrator may view customers' and/or entities' unmanned vehicles 205, the unmanned vehicles' 205 corresponding positions, as well as each unmanned vehicle's 205 history 209.
  • the administrator may have the option to grant any customer requests or permissions 209. It should be noted that the exact configurations and placement of the functionalities of the safety management system may be in any form of a screen, popup, or sidebar in the GUI 201, which may vary from the given example figures.
  • FIG. 3 displays how the risk-based topography and travel planning may be combined to create waypoints 302 and a travel route 219 for an unmanned vehicle 205-3.
  • the safety management system would first process data from the GPS coordinates of the unmanned vehicle 205-3 and risk values of any relevant areas 203 of the map that may be between the starting and final location of the unmanned vehicle 205-3. The system would then convert the unmanned vehicle's 205-3 current location and its final destination's location 204 GPS coordinates into grid coordinates. Then through a pathing algorithm, the safety management system would be enabled to search for paths between the two grid coordinates and establish waypoints 302 along the path where changes in the path direction occur (also known as path derivatives).
  • the safety management system may then either automatically select the travel route for an unmanned vehicle 205-3 (utilizing path-searching algorithms such as breadth-first, depth-first, A*, or other heuristic methods), or different paths may be presented to a customer and/or admin, who would then be given a choice to pick which path they deem suitable.
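  • The following minimal sketch, in Python, illustrates one way the grid search described above could be realized with an A* search over a risk-weighted grid, extracting waypoints where the path changes direction; the function name, grid representation, and cost model are hypothetical, and breadth-first or depth-first search could be substituted as noted above.

```python
import heapq

def plan_route(grid, start, goal):
    """Hypothetical sketch: A* search over a risk-weighted grid.

    grid[r][c] holds a risk-derived traversal cost (None marks a forbidden zone);
    start and goal are (row, col) grid coordinates converted from GPS beforehand.
    Returns the waypoints, i.e. the cells where the path changes direction.
    """
    rows, cols = len(grid), len(grid[0])

    def heuristic(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance

    frontier = [(heuristic(start, goal), 0, start)]
    came_from, best_cost = {start: None}, {start: 0}
    while frontier:
        _, cost, cell = heapq.heappop(frontier)
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] is not None:
                new_cost = cost + grid[nr][nc]
                if new_cost < best_cost.get((nr, nc), float("inf")):
                    best_cost[(nr, nc)] = new_cost
                    came_from[(nr, nc)] = cell
                    heapq.heappush(frontier, (new_cost + heuristic((nr, nc), goal),
                                              new_cost, (nr, nc)))
    if goal not in came_from:
        return []                                    # no admissible path found
    path, cell = [], goal                            # reconstruct the full path
    while cell is not None:
        path.append(cell)
        cell = came_from[cell]
    path.reverse()
    waypoints = [path[0]]                            # keep cells where direction changes
    for prev, cur, nxt in zip(path, path[1:], path[2:]):
        if (cur[0] - prev[0], cur[1] - prev[1]) != (nxt[0] - cur[0], nxt[1] - cur[1]):
            waypoints.append(cur)
    if path[-1] != waypoints[-1]:
        waypoints.append(path[-1])
    return waypoints
```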
  • the safety management system may have the ability to store previous travel paths that an unmanned vehicle may have already performed. By utilizing prior pathing data, the safety management system may not have to repeat pathing calculations when creating a new travel path for an unmanned vehicle.
  • a server may utilize a graphics processing unit (GPU) to create and store a grid that incorporates established zones when calculating a path. By storing said grid, the GPU may store all traveled paths for future use.
  • a path may be stored as a segmented path as opposed to one continuous path from the start to the end of the travel path.
  • the safety management system may be able to reuse the previously executed path or segments of the previously executed path so as to avoid repeating a pathing calculation.
  • the safety management system may simply recalculate the segmented path(s) affected by said zone(s) and reuse all other segmented paths, or if a recalculation is too difficult or inefficient, the previously executed path data may simply be discarded.
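  • The following minimal sketch, in Python, illustrates the segment-reuse idea described above; the function names and data layout are hypothetical.

```python
def reuse_or_replan(segments, changed_zone_cells, replan_segment):
    """Hypothetical sketch of reusing previously executed path segments.

    segments is a list of stored path segments (each a list of grid cells);
    changed_zone_cells is the set of cells whose zones were added or updated;
    replan_segment(start, end) recalculates a single segment (e.g. with A*).
    Segments untouched by the changed zones are reused as-is.
    """
    new_segments = []
    for segment in segments:
        if changed_zone_cells.isdisjoint(segment):
            new_segments.append(segment)                       # reuse the stored segment
        else:
            new_segments.append(replan_segment(segment[0], segment[-1]))
    return new_segments
```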
  • the safety management system may be capable of maintaining a schedule for one or more unmanned vehicles. These scheduled unmanned vehicles may be required to repeatedly execute the same or similar paths consistently over time, the efficiency of the safety management system therein lying in its ability to consistently give the unmanned vehicle(s) directions so as to carry out said consistent paths.
  • the scheduling system described would further enable the safety management system to avoid the risk of heavy unmanned vehicle traffic, and as a result reduce the probability of unmanned vehicles colliding with each other.
  • the safety management system may also autonomously modify unmanned vehicle schedules according to a variety of expected or unexpected factors (of which it is capable of being aware), such as weather, roadways or airspace periodically blocked due to street fairs or community events, police activity, and natural disasters.
  • the administrator(s) will also have the ability to locate any unregistered (or unknown, unidentified) unmanned vehicles that may cause harm.
  • the following is related to the implementation and designs of an overlay method in which a system that has all real-time locations of each active unmanned vehicle overlays a low-level radar for the purposes of identifying unregistered unmanned vehicles. All unmanned vehicles previously described would be registered with and have the ability to report to the safety management system of the present disclosure, thus any unregistered unmanned vehicles that appear in the overlay of real-time locations and low-level radar would be flagged, and the safety management system would be notified. This system provides the advantage of protecting the public from malicious unmanned vehicles.
  • the safety management system may receive location data from one or more registered unmanned vehicles and update their locations on a map-based graphical user interface in real-time 401.
  • the system may then utilize a feed of the low-level radar 402 to determine all of the locations of all vehicles in the air.
  • the two maps may be compared 404 by the safety management system. Upon that comparison, vehicles that are not already reporting their positions to the safety management system and therefore not implementing proper regulations would be rapidly identified 405.
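  • The following minimal sketch, in Python, illustrates the overlay comparison described above; the distance approximation, tolerance, and data layout are hypothetical.

```python
import math

def flag_unregistered(registered_positions, radar_detections, tolerance_m=50.0):
    """Hypothetical sketch: flag radar returns with no nearby reporting vehicle.

    registered_positions maps vehicle IDs to reported (lat, lon) positions;
    radar_detections is a list of (lat, lon) returns from the low-level radar.
    """
    def approx_distance_m(a, b):
        # Rough equirectangular approximation, adequate for a short-range overlay.
        mean_lat = math.radians((a[0] + b[0]) / 2)
        dx = math.radians(b[1] - a[1]) * math.cos(mean_lat) * 6_371_000
        dy = math.radians(b[0] - a[0]) * 6_371_000
        return math.hypot(dx, dy)

    flagged = []
    for detection in radar_detections:
        if not any(approx_distance_m(detection, pos) <= tolerance_m
                   for pos in registered_positions.values()):
            flagged.append(detection)    # no registered vehicle reports from here
    return flagged
```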
  • FIG. 4B shows the initial real-time locations of the unmanned vehicles 406 as it may be seen in the graphical user interface 201.
  • FIG. 4C displays the low-level radar locations of any unmanned vehicles that may or may not be registered 406 and 407 in the same graphical user interface 201.
  • FIG. 4D is the overlaid map of the two previous figures, FIG. 4B and FIG. 4C.
  • the unmanned vehicles 406 that have been flagged as illegal unmanned vehicles 407 may then be uniquely identified in some fashion (e.g. a difference in color, an addition of a border, a unique animation) making it easy for the administrator(s) to locate and identify the illegal system(s) 407.
  • the safety management system may have the ability to eliminate any potentially dangerous activity related to unsafe unmanned vehicle travel (e.g. low battery or damaged unmanned vehicle travel), especially in an emergency situation.
  • the safety management system may achieve this by first reading a value of potential and current received from an unmanned vehicle battery. Using this information the safety management system may determine the unmanned vehicle battery charge by reading the differential of potential as well as estimate if an unmanned vehicle is damaged by reading the resulting reactive power.
  • the safety management system may also read the GPS coordinates of an unmanned vehicle to determine the location of an unmanned vehicle. This location may be used to fetch the location of nearby charging stations, all-purpose stations, or emergency landing zones.
  • An emergency landing zone would be a zone where an unmanned vehicle may land safely and securely away from the public.
  • the safety management system may then determine if an unmanned vehicle is safe to travel by checking if the battery level is below the minimum operating battery potential or if the unmanned vehicle is damaged. In the event either of the prior conditions is true, the safety management system may send a signal to the unmanned vehicle interface to communicate a landing location and a request to land for the unmanned vehicle to prevent the unmanned vehicle from traveling in adverse conditions.
  • the safety management system may similarly determine the minimum operating battery potential by calculating the distance from an unmanned vehicle's current position to a desired charging station or all-purpose station. If the calculated distance is greater than a certain threshold, then the minimum operating battery would be set to a standard minimal operating battery potential. If the distance is in between two thresholds, then the minimum operating battery would be set to be inversely proportional to the distance. If the distance is lower than both thresholds then the minimum operating battery would be set to a constant standard minimal operating battery potential.
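  • The following minimal sketch, in Python, illustrates the piecewise threshold scheme and the resulting land-request decision described above, following the description as stated; all parameter names, units, and constants are hypothetical.

```python
def minimum_operating_potential(distance_m, inner_m, outer_m, standard_v, scale):
    """Hypothetical sketch of the piecewise minimum operating battery potential.

    Beyond the outer threshold, or inside the inner threshold, the standard minimal
    operating battery potential applies; between the thresholds the value is set
    inversely proportional to the distance, as stated above.
    """
    if distance_m > outer_m or distance_m < inner_m:
        return standard_v
    return scale / distance_m          # inversely proportional between thresholds


def should_request_landing(battery_v, damaged, distance_m,
                           inner_m, outer_m, standard_v, scale):
    """Request a landing if the vehicle is damaged or below its minimum potential."""
    minimum_v = minimum_operating_potential(distance_m, inner_m, outer_m,
                                            standard_v, scale)
    return damaged or battery_v < minimum_v
```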
  • the safety management system may send a signal to the unmanned vehicle interface to land by setting a land request as well as by sending the final destination location through any digital communication means (synchronous or asynchronous) to the unmanned vehicle interface.
  • the final destination locations may be differentiated by their core functionality.
  • Charging stations would be secure stations where an unmanned vehicle would arrive and recharge its battery until its battery is at a sufficient level for travel.
  • the all-purpose station may be a station where the unmanned vehicle would be repaired in case of failure, recharged, or picked up by the owner. It should be noted that these are only a few example landing stations; the user may have the option to use these specific stations or any combination of these stations with or without the addition of other stations that may have different functionalities.
  • the station or zone may be removed from the safety management system GUI indicating its unavailability and identified as "in use" in the safety management system database, which may also help to ensure that the safety management system GUI does not display irrelevant information.
  • FIG. 5A shows the logic flow that the safety management system may utilize to implement the previously described battery use planning, and begins with an unmanned vehicle powering on 501.
  • the safety management system may then determine whether the battery potential is appropriate with respect to the minimal operating battery potential 502. If the battery potential is appropriate, then the safety management system would constantly calculate the location of the nearest station and the new minimal operating battery potential 504. In the event that the battery potential is less than the minimal operating battery potential, the safety management system may then send a safety stop and arrival request and the GPS location of the nearest safe unmanned vehicle station location 503.
  • FIG. 5B describes the key elements and concepts that comprise the safety management system.
  • the figure shows an unmanned vehicle 505 that is entering radial zones 510 centered at recharging stations 507, emergency landing zones 508, or all-purpose unmanned vehicle stations 509.
  • When the unmanned vehicle 505 is outside any radial zones it obeys a standard minimal operating battery potential, and upon entering the radial zone 510-1, the unmanned vehicle's 505 minimal operating battery potential is then reduced relative to its position from the recharging station 507; this would also apply to an unmanned vehicle approaching an all-purpose station 509.
  • a relative reduction gradient may stop and a minimal operating battery potential would be constant within this zone.
  • the safety management system may give the location of the optimal unmanned vehicle station position and a request to occupy said station. If a recharging station 507 or an all-purpose station 509 is not within traveling distance, a designated emergency zone 508 may be found and routed to by the unmanned vehicle 505.
  • FIG. 6 describes how the safety management system 601 may communicate with one or more unmanned vehicles 602, computers 603, and manually controlled unmanned vehicles 604.
  • the safety management system 601 should be understood to have the qualities and functionality of a cloud-based server capable of enabling network access among a plurality of computing elements.
  • Independent computers 603 would have access to and control unmanned vehicles 602 via the safety management system 601.
  • Unmanned vehicles 602 would be specially equipped with additional hardware and software so as to communicate directly with the safety management system 601 wirelessly, while manually controlled unmanned vehicles 604 would not.
  • FIG. 6 shows one example implementation of the present disclosure in which a computer 603-2 is connected to a manually controlled unmanned vehicle 604 via a communication link 605, which may be either wireless (e.g. Wi-Fi, Bluetooth, or a cellular network) or wired (e.g. via a USB, FireWire, DIN, or e-SATA/SATA connection).
  • An indirect connection as described, facilitated by a computer 603-2, would allow the manually controlled unmanned vehicle 604 to receive new or updated restriction data from the safety management system 601, and may also permit the manually controlled unmanned vehicle 604 to transmit any pertinent or necessary data (e.g. power consumption data, location data, speed, etc.) it may have stored to the safety management system 601.
  • an unmanned vehicle 602 registered with the safety management system 601 may include a telemetry unit 701, a processor 702, a resource manager 703, a memory device 704, and mechanical components 705. These five components would compose the unmanned vehicle interface as previously mentioned.
  • the memory device 704 may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device 704 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device).
  • the memory device 704 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present disclosure.
  • the memory device 704 could be configured to buffer input data for processing by the processor 702.
  • the memory device 704 could be configured to store instructions for execution by the processor 702.
  • the processor 702 may be embodied in a number of different ways.
  • the processor 702 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the processor 702 may be configured to execute instructions stored in the memory device 704 or otherwise accessible to the processor 702. Alternatively or additionally, the processor 702 may be configured to execute hard coded functionality.
  • the processor 702 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present disclosure while configured accordingly.
  • when the processor 702 is embodied as an ASIC, FPGA or the like, the processor 702 may be specifically configured hardware for conducting the operations described herein.
  • when the processor 702 is embodied as an executor of software instructions, the instructions may specifically configure the processor 702 to perform the algorithms and/or operations necessary for the unmanned vehicle to operate successfully.
  • the processor 702 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present disclosure, and may entail further configuration of the processor 702.
  • the processor 702 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 702.
  • the telemetry unit 701 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to a safety management system 706 and/or any other device or module in communication with the apparatus.
  • the telemetry unit 701 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network.
  • the telemetry unit 701 may alternatively or also support wired communication.
  • the telemetry unit 701 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
  • the processor 702 may be embodied as, include or otherwise control a resource manager 703.
  • the resource manager 703 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 702 operating under software control, the processor 702 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 703 as described herein.
  • a device or circuitry (e.g., the processor 702) executing the software forms the structure associated with such means.
  • the resource manager 703 would be configured to control all mechanical components 705 of the unmanned vehicle. Upon receiving instructions from the processor 702, the resource manager 703 would interpret and translate the instructions into actions performed by the mechanical components 705.
  • the mechanical components 705 of the unmanned vehicle may include but are not limited to rotors, LEDs, rudders, mechanical or robotics arms, retractable cables, gimbals, or other mechanical apparatus.
  • Unmanned vehicle communication systems and methods through SMS
  • As put forth in the background, managing power consumption in robotic devices such as UAVs is essential in providing successful, long-term use of the vehicles.
  • Conventional communication methods and systems require significantly higher levels of power consumption to operate normally relative to the power requirements of SMS. Reducing the power requirement for an unmanned vehicle inherently allows it to operate for longer periods of time or dedicate more power to one of its functions. Any such power improvement is essential in engineering more efficient unmanned vehicles and is applicable to all of its possible functions and assigned tasks.
  • the use of a SMS communication system simplifies the entire communication process and inherently reduces the risk of error being introduced to the system.
  • the use of a SMS has unique capabilities that other communication methods do not offer.
  • the use of SMS is further supported by its current inherent positive characteristics. These characteristics include but are not limited to a wider area of coverage, strong existing infrastructure, and, with an increasing transition to data-supported messaging services by the general population, less intensive usage of networks. Whereas other communication networks such as 4G LTE datalinks are not covered in certain areas of the United States, SMSs are readily available in a majority of inhabited areas.
  • Figure 8 illustrates a generic system diagram in which the management system 1101, cellular network 1102 and unmanned vehicle 1103 exist within a communication environment.
  • the management system 1101 and unmanned vehicle 1103 are in communication with each other via the cellular network 1102 through a communication path 1104 between the unmanned vehicle 1103 and the cellular network 1102 and a communication path 1105 between the management system 1101 and the cellular network 1102.
  • the communication paths 1104 and 1105 are bidirectional and comprise at least one channel and support SMS function to allow SMS communication.
  • the communication paths 1104 and 1105 may or may not be the same, or under the same communication protocol.
  • embodiments of the present disclosure may further include one or more network devices with which the management system 1101 and unmanned vehicle 1103 may communicate to provide, request and/or receive information.
  • the cellular network 1102 may include a collection of various different nodes, devices or functions that may be in communication with each other via corresponding wired and/or wireless interfaces. As such, the cellular network 1102 should be understood to be an example of a broad view of certain elements of the system and not an all-inclusive or detailed view of the cellular network 1102.
  • the unmanned vehicle 1103 and management system 1101 may also communicate via device to device (D2D) communication as part of the attempt to communicate via the cellular network 1102, and each may include a telemetry unit for transmitting signals to and for receiving signals from a cellular base site which could be, for example, a base station that is a part of one or more cellular or mobile networks or an access point that may be coupled to a data network, such as a local area network (LAN), a metropolitan area network (MAN), and/or a wide area network (WAN), such as the Internet.
  • the cellular network may utilize one or more mobile access mechanisms such as wideband code division multiple access (W-CDMA), CDMA2000, global system for mobile communications (GSM), general packet radio service (GPRS), long term evolution (LTE), LTE Advanced and/or other similar mechanisms.
  • the unmanned vehicle 1103 may include a telemetry unit 1201, a processor 1202, a resource manager 1203, a memory device 1204, and mechanical components 1205.
  • the memory device 1204 may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device 1204 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device).
  • the memory device 1204 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present disclosure.
  • the memory device 1204 could be configured to buffer input data for processing by the processor 1202. Additionally or alternatively, the memory device 1204 could be configured to store instructions for execution by the processor 1202.
  • the processor 1202 may be embodied in a number of different ways.
  • the processor 1202 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the processor 1202 may be configured to execute instructions stored in the memory device 1204 or otherwise accessible to the processor 1202. Alternatively or additionally, the processor 1202 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 1202 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present disclosure while configured accordingly. Thus, for example, when the processor 1202 is embodied as an ASIC, FPGA or the like, the processor 1202 may be specifically configured hardware for conducting the operations described herein.
  • when the processor 1202 is embodied as an executor of software instructions, the instructions may specifically configure the processor 1202 to perform the algorithms and/or operations necessary for the unmanned vehicle 1103 to operate successfully.
  • the processor 1202 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present disclosure, and may entail further configuration of the processor 1202.
  • the processor 1202 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 1202.
  • the telemetry unit 1201 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to a cellular network 1102 and/or any other device or module in communication with the apparatus.
  • the telemetry unit 1201 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network.
  • the telemetry unit 1201 may alternatively or also support wired communication.
  • the telemetry unit 1201 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
  • the telemetry unit 1201 may support SMS
  • the telemetry unit 1201 receives information from the cellular network through the communication path 1104.
  • the information may be sent using SMS or other communication formats.
  • the telemetry unit 1201 may be configured to send short messages to the cellular network actively or upon request from the management system 1 101.
  • the short messages may be refreshed and sent on a repetitive schedule.
  • the message may be compiled from information selected from current location, speed, altitude, destination, power reserve information of the unmanned vehicle 1103. Alternatively, the message may be designated or compiled according to a specific request from the management system.
  • the repetitive interval may be a predetermined parameter stored within the memory device 1204.
  • the repetitive interval may be dynamically determined based on at least one parameter selected from current location, speed, altitude, destination, power reserve information of the unmanned vehicle 1103. For example, when the power reserve of the unmanned vehicle 1103 is below a reserve threshold, the telemetry unit 1201 may reduce the repetitive interval to send short messages more frequently.
  • the repetitive interval may be determined by a request from the management system 1101.
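  • The following minimal sketch, in Python, illustrates how the status fields and the dynamically determined repetitive interval described above might be handled; the message layout, attribute names, and interval values are hypothetical.

```python
import json

def compose_status_message(vehicle):
    """Hypothetical sketch: pack the status fields named above into one short message.

    vehicle is assumed to expose location, speed, altitude, destination and
    power_reserve attributes; the JSON layout and field names are illustrative.
    """
    return json.dumps({
        "loc": vehicle.location,        # (lat, lon)
        "spd": vehicle.speed,
        "alt": vehicle.altitude,
        "dst": vehicle.destination,
        "pwr": vehicle.power_reserve,   # e.g. fraction of full charge
    }, separators=(",", ":"))           # compact form to fit an SMS payload


def repetitive_interval_s(power_reserve, normal_s=60, urgent_s=15, threshold=0.2):
    """Shorten the reporting interval when the power reserve drops below a threshold."""
    return urgent_s if power_reserve < threshold else normal_s
```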
  • the telemetry unit 1201 may support SMS
  • the telemetry unit 1201 may send a short message using a first communication protocol. If no reply or confirmation is received within a time threshold, the telemetry unit 1201 then sends the short message via a second communication protocol different from the first communication protocol. If still no reply or confirmation is received within the time threshold, the telemetry unit 1201 may switch back to the first communication protocol to resend the short message.
  • the time threshold may be a predetermined parameter stored within the memory device 1204. In some embodiments, the time threshold is configured to be smaller than the repetitive interval such that the same short message may be resent before the short message to be sent is refreshed.
  • when the power reserve of the unmanned vehicle 1103 is below the reserve threshold, the telemetry unit 1201 is configured to send short messages via both communication protocols and/or reduce the repetitive interval to send short messages more frequently. In some embodiments, when the processor 1202 detects any mechanical breakdown, the telemetry unit 1201 is configured to send short messages via both communication protocols and also reduce the repetitive interval to send short messages more frequently.
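  • The following minimal sketch, in Python, illustrates the alternating-protocol retry described above; the callbacks, retry cutoff, and low-power behavior are hypothetical.

```python
def send_with_fallback(message, send_primary, send_secondary,
                       wait_for_ack, time_threshold_s, low_power=False):
    """Hypothetical sketch of sending a short message with protocol fallback.

    send_primary/send_secondary transmit via the two communication protocols;
    wait_for_ack(timeout_s) returns True once a confirmation arrives. In a
    low-power or breakdown condition the message goes out over both protocols.
    """
    if low_power:
        send_primary(message)
        send_secondary(message)
        return wait_for_ack(time_threshold_s)

    senders = [send_primary, send_secondary]
    for attempt in range(3):                      # illustrative cutoff after a few tries
        senders[attempt % 2](message)             # alternate between the two protocols
        if wait_for_ack(time_threshold_s):
            return True
    return False
```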
  • the processor 1202 may be embodied as, include or otherwise control a resource manager 1203.
  • the resource manager 1203 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 1202 operating under software control, the processor 1202 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 1203 as described herein.
  • a device or circuitry (e.g., the processor 1202) executing the software forms the structure associated with such means.
  • the resource manager 1203 would be configured to control all mechanical components 1205 of the unmanned vehicle 1103. Upon receiving instructions from the processor 1202, the resource manager 1203 would interpret and translate the instructions to actions performed by the mechanical components 1205.
  • the mechanical components 1205 of the unmanned vehicle 1103 may include but are not limited to rotors, rudders, mechanical or robotics arms, retractable cables, gimbals, or other mechanical apparatus.
  • the processor 1202 may be enabled to configure subframes of the cellular (or other communication interface) downlink signaling structure.
  • the processor 1202 may provide information to the unmanned vehicle 1103 and the management system 1101 (or other machines) via the telemetry unit 1201 indicating the configuration to the signaling structure so that the unmanned vehicle 1103 and the management system 1101 may utilize the corresponding signaling structure accordingly.
  • FIG. 10 illustrates one system that may be implemented for SMS communication.
  • the external management system 1101 will relay SMS data to the telemetry unit 1201 of the unmanned vehicle 1103.
  • the telemetry unit 1201 will then transmit the data to the processor 1202 which will then decide whether the data needs to be stored in the memory device 1204 or sent to the telemetry unit via the cellular network 1102.
  • the telemetry unit 1201 on the unmanned vehicle 1103 may communicate with the telemetry unit 1302 that is located on a landing module 1301. Implementations that may arise as a result of this communication between the unmanned vehicle 1103 and landing module 1301 include but are not limited to informing the landing module 1301 that the unmanned vehicle 1103 is traveling or performing an approach towards the landing module 1301.
  • FIG. 11 is a graphical representation of how multiple unmanned vehicles 1404 may communicate with a cellular network via SMS by communicating with the closest cellular towers 1403.
  • Each individual unmanned vehicle 1404 should be understood to possess all the capabilities and components described of the unmanned vehicle 1103 in prior figures.
  • each individual unmanned vehicle 1404 will communicate with a cellular tower 1403 according to which area of ideal connectivity it is located in, as defined by the connectivity boundaries 1401.
  • unmanned vehicles 1404-1 and 1404-2 would communicate with the cellular tower 1403-1, seeing as both lie within the connectivity boundary 1401 corresponding to the cellular tower 1403-1.
  • the unmanned vehicles 1404-3 and 1404-4 would communicate with the cellular tower 1403-2, since both lie within the connectivity boundary 1402 corresponding to the cellular tower 1403-2.
  • Each cellular tower 1403 and corresponding connectivity boundary define a base station at the center of a geographic cell, of which all cellular networks are comprised.
  • the unmanned vehicles 1404 will then have the ability to efficiently communicate across this larger cellular network and with other systems or devices connected to the same network. Examples of said other systems and devices include but are not limited to landing modules, other unmanned vehicles 1404, management systems, and personal cellular devices.
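  • The following minimal sketch, in Python, illustrates the tower selection implied by the connectivity boundaries described above, modeled simply as the nearest tower whose boundary radius contains the vehicle; the data layout and planar coordinates are hypothetical.

```python
import math

def assign_tower(vehicle_position, towers):
    """Hypothetical sketch: pick the tower whose connectivity boundary contains the vehicle.

    towers is a list of (tower_id, (x, y), boundary_radius) entries; positions use an
    arbitrary planar coordinate frame for illustration.
    """
    best_id, best_distance = None, float("inf")
    for tower_id, center, radius in towers:
        distance = math.dist(vehicle_position, center)
        if distance <= radius and distance < best_distance:
            best_id, best_distance = tower_id, distance
    return best_id    # None if the vehicle lies outside every connectivity boundary
```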
  • Non-Inertial Frame Lock for Unmanned Aerial Vehicles
  • As put forth in the background, it is important for the UAV to maintain stability during a landing procedure.
  • the UAV may accomplish stability by utilizing a system of receivers and emitters that can constrain a UAV in an inertial frame.
  • electromagnetic radiation (EMR) receivers and EMR emitters (e.g. radiation in the infrared spectrum) may be used to allow the UAV to maintain stability during landing.
  • the EMR receivers and emitters may allow the UAV's computational unit to calculate the proper adjustments needed for the UAV's flight in order for the UAV's position, roll and pitch to be maintained above a corresponding wave emitter.
  • the electromagnetic radiation would lock the UAV in a non-inertial frame and allow the UAV to land precisely. Consequently through this technology, the UAV will have the ability to land on moving platforms as well as on static platforms. This ability greatly expands the range of locations available for landing the UAV and gives the UAV more flexibility in determining overall flight paths over the conventional art.
  • FIG. 12 shows a circuit diagram for the Non-Inertial Frame Lock, in accordance with one embodiment of the present disclosure.
  • the basic circuit of the non-inertial frame origin consists of a battery 2002 powering a collection of electromagnetic radiation (EMR) emitters 2001. Once powered, the EMR emitters 2001 emit EMR which is received by a matrix of photo-transistors 2003. The photo-transistors 2003 decrease in resistance to the battery's 2005 current as the intensity received increases. This is measured by an increase in potential across the corresponding resistor 2004. These potentials are then input into a high-speed multiplexer 2006 which is controlled by a computational unit 2007. The output from the multiplexer 2006 is input into the computational unit 2007. The computational unit 2007 then sends the directional communication to the aerial vehicle controller 2008 based on the perceived symmetries in potentials.
  • FIG. 13 shows the flow diagram of the computational unit 2007 as seen in FIG. 12.
  • the computational unit 2007 may power on 2009 and may immediately begin to read through the receiver values 2010 by varying the multiplexer 2006 input. With this multiplexer 2006 input, the computational unit 2007 may then calculate the transmission center's 2011 position relative to the UAV's center and create a vector from the UAV's center to the transmission center 2011. Utilizing the transmission center 2011, the computational unit 2007 may then split the vector into Cartesian directional components 2012 and transmit the components as flight commands 2013.
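  • The following minimal sketch, in Python, illustrates the computation outlined above, estimating the transmission center from the multiplexed receiver readings and splitting the resulting vector into Cartesian flight commands; the weighting scheme, names, and gain are hypothetical.

```python
def compute_flight_commands(receiver_positions, intensities, gain=1.0):
    """Hypothetical sketch of the non-inertial frame lock computation.

    receiver_positions gives each receiver's (x, y) offset from the UAV's center and
    intensities the corresponding values read through the multiplexer. The
    transmission center is estimated as the intensity-weighted centroid, and the
    vector from the UAV's center to that estimate is split into Cartesian
    components issued as corrective flight commands.
    """
    total = sum(intensities)
    if total == 0:
        return {"x": 0.0, "y": 0.0}      # no signal received: hold position
    center_x = sum(p[0] * w for p, w in zip(receiver_positions, intensities)) / total
    center_y = sum(p[1] * w for p, w in zip(receiver_positions, intensities)) / total
    # Command the vehicle toward the estimated transmission center.
    return {"x": gain * center_x, "y": gain * center_y}
```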
  • FIGS. 14A- 14B show a top-down view of the landing platform 2301.
  • the landing platform 2301 may be hexagonal in shape, as shown in the example figure, or it may also be octagonal, nonagonal, or have any number of possible sides.
  • In FIG. 14A there is a single EMR emitter 2303 located above a power source 2302. It should be noted that the graphical representation of the power source 2302 is concealed by the graphical representation of the EMR emitter 2303, and should henceforth be recognized in following figures as the smaller, hexagonal center object of the landing platform 2301.
  • In FIG. 14B there are multiple EMR receivers 2304 located around the landing platform 2301.
  • the number of EMR emitters 2303 and EMR receivers 2304 may vary for different implementations of the present disclosure.
  • At the center of the landing platform 2301 may be a power source 2302 which may provide power to the interface and any other receivers belonging to the landing platform 2301.
  • the power source 2302 may be a battery.
  • a system of at least three EMR receivers and one EMR emitter must be employed. Any increase in the number of EMR receivers may result in an increase in accuracy of the present disclosure. Similarly, any increase in the number of EMR emitters 2303 increases the EMR signal strength communicated to the EMR receivers 2304, increasing precision.
  • Both example embodiments described in FIG. 14A and FIG. 14B are viable configurations; however, the configuration described in FIG. 14A may be ideal in that it does not require additional communication between the UAV computational unit 2007 and an additional computational unit integrated into the landing platform 2301. That is, the example implementation represented in FIG. 14B may require the landing platform 2301 to possess or integrate a computational unit that could interpret the EMR receivers' 2304 information and in turn send that message to the UAV via another EMR communication method, or another type of communication means.
  • the implementation in FIG. 14A may be optimal in that all computations with respect to the present disclosure can be made solely by the UAV computational unit 2007 without the need for additional computational units.
  • FIGS. 15A-15D show the configuration of the EMR receivers 2404 and EMR emitter 2303 with respect to the UAV 2401 and a landing platform 2301 for the UAV 2401.
  • FIG. 15A depicts the EMR receivers 2404 on the underside of the UAV 2401 and the EMR emitter 2303 on the landing platform 2301.
  • the EMR emitter 2303 may begin to emit EMR which will trigger the EMR receivers 2404 on the UAV 2401.
  • the algorithmic processes described in FIG. 12 and FIG. 13, or another process may be employed to correct the position, roll and pitch of the UAV as needed.
  • FIG. 15B is a close up of the landing platform 2301 configuration with the EMR emitter 2303 shown specifically in the center of the landing platform 2301.
  • FIG. 15C shows a possible configuration of the EMR receivers 2404 located on the bottom of the UAV 2401, although it is noted that the shape and design of the UAV 2401 may vary and the location of the EMR receivers 2404 may also vary.
  • FIG. 15D depicts a close up of one possible configuration on the UAV 2401, showing the EMR receivers 2404 positioned on the underside of the body or housing of the UAV 2401. As shown, located around the base of the UAV 2401 are multiple EMR receivers 2404-1 through 2404-6; however, the exact locations of the receivers may vary.
  • the number of EMR receivers and emitters is only one possible implementation of the present disclosure.
  • the number of EMR emitters may be increased or decreased with the effect of creating a stronger or weaker EMR signal strength.
  • the number of EMR receivers may be increased or decreased with the effect of increasing or decreasing receiver precision.
  • FIGS. 16A-16E depict a similar system to FIGS. 15A-15D, but where the EMR receivers 2304 are located on the landing platform 2301 and the EMR emitters 2502 are located on the UAV 2401.
  • the EMR emitters 2502 will begin to emit EMR, which will trigger the EMR receivers 2304 on the landing platform 2301.
  • FIG. 16B is a close up of one possible configuration of the EMR receivers 2304 located on the landing platform 2301. Specifically, FIG. 16B shows how the EMR receivers 2304 are connected to the embedded wiring of the landing platform 2301, for example, where each of the EMR receivers 2304 is positioned between portions of the landing platform 2301 and connected by the embedded wiring to the power source 2302.
  • FIG. 16C is a close up of FIG. 16B and specifically shows the EMR receiver 2304-3 and its placement on the landing platform 2301 and connected to the embedded wiring.
  • FIG. 16D shows one possible configuration of the EMR emitters 2502 located on an underside of the body of the UAV 2401.
  • FIG. 16E shows a close up of FIG. 16D with the EMR emitters 2502 located directly on the bottom of the body of the UAV 2401.
  • the number of EMR receivers and emitters is only one possible implementation of the present disclosure. The number of receivers and emitters may be increased or decreased with the effect of creating a stronger or weaker EMR signal strength. This configuration yields the same functionality as the first example as shown in FIGS. 15 A- 15D.
  • FIGS. 17A-17B show how the EMR technology may work with respect to the UAV 2401.
  • In FIG. 17A, the UAV 2401 is seen approaching the landing platform 2301.
  • the EMR emitters 2304 are located on the landing platform 2301 and the EMR receivers are located on the UAV 2401; however, the emitters and receivers may be configured with the emitters on the UAV 2401 and the receivers on the landing platform 2301, without changing the desired effect of the present disclosure.
  • EMR 2601 is emitted from the EMR emitter 2304 and the receivers on the UAV 2401 begin to adjust the position of the UAV 2401 with respect to the frame lock's center or the center of the EMR emitters 2602.
  • the UAV 2401 can be seen substantially above the center of the EMR emitters 2602, which is a result of the EMR receivers adjusting the roll and pitch as well as the overall position of the UAV 2401 with respect to the EMR emitter 2304.
  • the circuitry used in the frame lock may measure the intensities of the EMR 2601 to calculate the corrective movements needed for the UAV to position itself above the EMR emitter center 2602 and maintain a level of consistency in roll and pitch. With this control, the UAV 2401 may descend to land on the landing platform 2301 while maintaining the desired control over the UAV 2401.
  • FIGS. 18A-18C show a UAV 2401 with a descending rectangular apparatus suspending EMR receivers 2701 from the UAV 2401.
  • the EMR receivers 2701-1 through 2701-5 may be positioned substantially vertically in line with one another on the descending structure.
  • FIG. 18A shows a close up perspective of the underside of the UAV 2401 and the individual EMR receivers 2701, with the lowest receiver being the EMR receiver 2701-5 and the highest receiver being the EMR receiver 2701-1, relative to the ground beneath the UAV 2401.
  • FIG. 18B shows the left side view of the UAV 2401 and the attached EMR receivers 2701.
  • FIG. 18C shows the back side view of the UAV 2401 and the attached EMR receivers 2701.
  • FIG. 19 shows a rail 2801 raised by two supports and fitted with functional EMR emitters 2802 on its right face.
  • the EMR emitters 2802 may also share two planes, one plane, or may all be configured to be in a horizontal line. In some designs, it may be important to locate all EMR emitters 2802 within a single plane.
  • the EMR emitters 2802 may be independently powered or an external power source may supply all of the EMR emitters 2802. If an external power source were used, it may be located in or outside of the rail 2801.
  • FIGS. 20A-20B show the UAV 2401 equipped with EMR receivers 2701 approaching the rail 2801 and the attached EMR emitters 2802. These figures show an implementation of the present disclosure where the motion of the UAV 2401 could be restricted to a single plane based on the linearly-aligned EMR receivers 2701 and the EMR emitters 2802 positioned within a single plane. This would be accomplished by the UAV 2401 adjusting its position in an attempt to maintain a constant average intensity across all of the EMR receivers 2701, minimizing if not eliminating movement to the right and left of the rail.
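  • The following minimal sketch, in Python, illustrates the plane-constraint idea described above as a simple proportional correction that tries to hold a constant average intensity across the linearly aligned receivers; the sign convention, target, and gain are hypothetical.

```python
def lateral_correction(intensities, target_average, gain=0.1):
    """Hypothetical sketch: lateral correction from the average receiver intensity.

    Drifting away from the rail lowers the average intensity and drifting toward it
    raises the average; the returned value is a signed lateral adjustment command.
    """
    average = sum(intensities) / len(intensities)
    error = target_average - average
    return gain * error      # positive: move toward the rail; negative: move away
```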
  • FIG. 20A shows a top-right corner view of this implementation.
  • FIG. 20B shows a top side view of this implementation.
  • FIGS. 21A-21C depict one of the ideal standards of precision the UAV 2401 could accomplish when the implementation described in FIGS. 20A-20B is executed correctly.
  • the UAV 2401 could hover just above the rail 2801, and move along the rail 2801 at the height depicted or various other heights.
  • This implementation system is an ideal example of how the UAV 2401 may be utilized in a commercial assembly line setting. In such a setting the UAV 2401 may be further equipped or enabled to add, remove, or inspect objects on an assembly line.
  • An assembly line implementation could entail one or more uses of a similar rail 2801 or may directly integrate the EMR emitters 2802.
  • FIG. 21A shows a top-right view of the implementation system, FIG. 21B shows a front view of the implementation system, and FIG. 21C shows a top view of the implementation system.
  • the number of EMR receivers and emitters are only examples of possible implementations of the present disclosure.
  • the number of EMR emitters may be increased or decreased with the effect of creating a stronger or weaker EMR signal strength.
  • the number of EMR receivers may be increased or decreased with the effect of increasing or decreasing receiver precision.
  • the number of EMR emitters and receivers does not change the basis of the novelty which the present disclosure embodies.
  • the use of a landing platform may decrease the chance of error and danger by specifying a location for the UAV to land before reaching its final destination.
  • a foldable and portable landing platform may enable easy and convenient movement of the landing location, should it need to be changed.
  • FIG. 22 shows an example logic diagram of the landing platform's computational unit if a computational unit were employed.
  • the landing platform begins at start 3101 and proceeds to determine if it is receiving a connection attempt 3102 from the UAV. If there is no connection attempt, the computational unit continues to expect a connection via the same decision 3102. Once a connection attempt is received from the UAV, the computational unit establishes a secure and consistent connection 3103 with the UAV. Once the connection 3103 is established, the computational unit activates the warning capabilities of the landing platform (e.g. changes in LED lighting, sound emitted from a speaker, etc.) and warns observers to stand back 3104 from the landing platform. The computational unit then attempts to communicate confirmation to the UAV that the landing platform is ready 3105.
  • If the UAV receives confirmation 3106, it proceeds to perform a guided landing 3107. If the UAV does not receive confirmation 3106, the landing platform's computational unit continues to attempt to communicate the confirmation 3105 to the UAV.
  • the UAV's guided landing 3107 may entail one or more methods or combinations of methods to safely land the UAV in the center of the landing platform.
  • these methods may include but are not limited to utilizing a system of electromagnetic wave emitters and receivers, utilizing a camera on the UAV or landing platform, and establishing a physical connection of some type between the UAV and landing platform.
  • the UAV may be capable of accomplishing most if not all of the computations necessary to accomplish a safe landing in the center of the landing platform, and as a result may be equipped with one or more transceivers and other electronic devices to accomplish all necessary landing related tasks. If the UAV communicates to the computational unit of the landing platform that the UAV has successfully landed 3108, the computational unit ends any further communication and computation 3109. If the UAV does not communicate a successful landing 3108, the UAV continues to perform its guided landing process 3107 until such a communication 3108 is received.
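  • The following minimal sketch, in Python, illustrates the logic flow of FIG. 22 as a simple polling loop; the interfaces (platform.warn_observers, uav_link.*) and the poll interval are hypothetical.

```python
import time

def landing_platform_loop(platform, uav_link, poll_s=0.5):
    """Hypothetical sketch of the landing platform computational unit's logic flow."""
    while not uav_link.connection_requested():    # 3102: wait for a connection attempt
        time.sleep(poll_s)
    uav_link.establish()                          # 3103: secure, consistent connection
    platform.warn_observers()                     # 3104: LED / speaker warning
    while not uav_link.confirmation_received():   # 3105/3106: confirm platform is ready
        uav_link.send_ready()
        time.sleep(poll_s)
    while not uav_link.landed():                  # 3107/3108: guided landing in progress
        time.sleep(poll_s)
    # 3109: successful landing reported; end further communication and computation.
```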
  • FIG. 23 shows a top-down view of the landing platform 3201.
  • the landing platform 3201 may have a polygon shape when in an unfolded configuration, such as being hexagonal in shape as shown in the example figure, or may also be a symmetrical polygon comprised of more or less sides than the current implementation (octagonal, nonagonal, etc.).
  • the landing platform 3201 may be foldable along junctions between a plurality of portions of the landing platform 3201 and may be folded along the junctions. As shown in FIG. 23, each portion may be triangular in shape and the landing platform 3201 may be folded up along the junctions into a thicker layer of equally symmetrical and stacking triangles. In other words, the folded configuration of the landing platform 3201 may have substantially the same footprint as the footprint of a section of the landing platform 3201.
  • Wires may be embedded along each side and potential folding crease of the landing platform 3201. These wires may be necessary to enable the landing platform's 3201 interface(s), communication, and any other devices located in or on the landing platform 3201.
  • warning devices such as LEDs 3203, audible devices, or other warning devices may be included on the landing platform 3201.
  • the warning devices may be used for indication purposes, which may include but are not limited to warning observers that a UAV is landing, aiding a UAV in its landing process, and displaying certain colors, patterns, or combinations thereof that would allow an observer to interpret the state of or a message communicated by the landing platform 3201.
  • a power source 3202 may be included, which may provide power to the landing platform's interface(s) and any of the other devices located in or on the landing platform 3201.
  • The computational unit for the landing platform 3201, when mentioned henceforth, may be located in the same housing as the power source 3202 or elsewhere in or on the landing platform 3201.
  • This computational unit would be comprised of all software and hardware necessary to accomplish the tasks and processes of the landing platform 3201 described henceforth.
  • FIGS. 24A-24G show one implementation of the present disclosure, specific to how the landing platform 3201 may be folded, demonstrating its portability.
  • FIG. 24A shows a side view of the landing platform 3201 when it is unfolded and in position for UAV landing and/or take off.
  • Through sensor(s) such as optical range finders, ultrasonic range finders, and/or GPS units, the landing platform 3201 will be able to indicate whether it is in position for landing, using its locational data to determine if it is in an open location and in an unfolded state.
  • FIGS. 24B, 24C, and 24D show different perspectives of an intermediate stage in the folding process of the landing platform 3201.
  • FIG. 24E is a top right diagonal view of the landing platform's 3201 final shape, a slightly extruded triangle.
  • FIG. 24F shows a front perspective of this final triangle form.
  • FIG. 24G shows the landing platform 3201 from the side once completely folded, offering a detailed perspective of the resulting thickness of the final shape due to the stacked nature of each of the landing platform's 3201 triangular faces.
  • FIGS. 25A-25C demonstrate an example of a UAV 3401 approaching the landing platform 3201 and successfully landing on the landing platform 3201.
  • the UAV 3401 may approach the landing platform 3201 with an additional payload, not pictured in the drawings of the present disclosure.
  • In FIG. 25A, the UAV 3401 is seen approaching the landing platform 3201 from a distance, which it may accomplish through communication with a computational unit on the landing platform 3201.
  • the UAV 3401 will know the location of the landing platform 3201 and the algorithm described in FIG. 22 will be employed.
  • the landing platform 3201 may also be located in a predetermined location and the UAV 3401 would be aware of said location prior to travel.
  • the UAV 3401 may also receive commands from an external system (e.g. a computational unit on the landing platform or another external system with wireless communication abilities) to establish a flight restriction zone centered at the landing platform's GPS location.
  • This restriction zone may be established with a switch on the landing platform 3201 accessible by the user.
  • the landing platform 3201 may also come with different radii of restriction.
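  • The following minimal sketch, in Python, illustrates a check against the flight restriction zone described above, centered at the landing platform's GPS location with a selectable radius; the function name and radius handling are hypothetical.

```python
import math

def inside_restriction_zone(uav_lat, uav_lon, platform_lat, platform_lon, radius_m):
    """Hypothetical sketch: is the UAV inside the restriction zone around the platform?"""
    # Haversine great-circle distance between the UAV and the platform (meters).
    phi1, phi2 = math.radians(uav_lat), math.radians(platform_lat)
    dphi = math.radians(platform_lat - uav_lat)
    dlmb = math.radians(platform_lon - uav_lon)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    distance_m = 2 * 6_371_000 * math.asin(math.sqrt(a))
    return distance_m <= radius_m
```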
  • FIG. 25B shows the UAV preparing for landing on the landing platform 3201.
  • the landing platform 3201 may utilize various techniques including but not limited to a method of constraining the UAV 3401 to a non-inertial frame, cellular positioning techniques, and techniques involving optical range finders.
  • FIG. 25C depicts the UAV 3401 landing on the landing platform 3201.
  • the methods the UAV 3401 uses to achieve takeoff and travel to another destination may include similar or the same techniques employed when approaching and landing on the landing platform 3201. After the UAV 3401 takes off, its next location may be another landing platform 3201 or the original operator of the UAV 3401.
  • FIGS. 26A-26E show a step-by-step process of folding the landing platform 3201 into a viable housing unit for the UAV 3401 having an interior space or cavity in which the UAV 3401 can be stored.
  • FIG. 26A depicts the landing platform 3201 laid flat with one triangular face of the landing platform 3201 folded up.
  • the landing platform 3201 is hexagonal in shape, but may be a polygon comprised of more or less sides with a similar applicable folding technique. By folding one of the triangular faces of the landing platform 3201, it may become a pentagonal-based pyramid. In FIG. 26B this is shown with the side that is being folded, held up to demonstrate its position for reference throughout the folding process. In FIG. 26C a landing platform 3201 face adjacent to the perpendicularly folded face is brought on top of the face across from it, effectively overlapping the two faces and creating a square pyramid (with the reference side still folded up perpendicularly once more to demonstrate its position for folding clarification).
  • FIG. 26D shows the final result after folding the landing platform 3201 into its final shape, with the face that was previously folded perpendicularly to its adjacent faces for reference now folded down to overlap and become parallel with another adjacent face.
  • FIG. 26E shows the underside of the housing unit as shown in FIG. 26D with a UAV 3401 underneath the newly formed square pyramid created by the folding of the landing platform 3201. This configuration provides ample space for a UAV 3401 to reside underneath the square pyramid.
  • FIGS. 27A-27B show how a UAV 3401 may acquire the necessary positional information of the landing platform 3201 so as to travel 3605 to the landing platform's 3201 position.
  • FIG. 27A demonstrates one implementation whereby a commercialized cellular communication device 3601 is held above the landing platform's 3201 location, and sends GPS data specific to said location via a common cellular communication method 3602-1.
  • this GPS data sent via the wireless communication method 3602-1 is received by an external management system 3603.
  • the external management system 3603 would be equipped with all necessary software and hardware to receive the GPS data from the cellular device 3601, and relay said data to the appropriate UAV 3401 via a wireless communication method 3602-2, which may be similar to or the same as the wireless communication method 3602-1.
  • Once the UAV 3401 successfully receives GPS data specific to the landing platform's 3201 location from the external management system 3603, the UAV 3401 would then travel 3605 to the landing platform's 3201 location, and the algorithm described in FIG. 22 may be employed by the landing platform 3201.
  • Unmanned Aerial Vehicle Landing and Containment Station
  • In order to reduce the pressure of managing and organizing several UAVs, unique stations could be used to combine the landing, housing, and charging components of the UAV in one compact place. Through autonomous control the landing unit will be able to accomplish all of this on its own without further help from a separate entity.
  • the stations may be able to autonomously communicate with a UAV to accomplish landing and take-off.
  • Several stations, combined with their autonomous capabilities, would enable efficient and easy UAV management for station users. Additionally, through the addition of a possible roofing system, such a station would not only be able to provide a secure location for UAV landing, but also provide protection for the UAV from any possible tampering or outside forces, such as weather.
  • FIGS. 28A-28H show an example configuration of a station 4101 and some of its independent components.
  • FIG. 28A shows how the station 4101 may be comprised of various main parts, including: a landing platform 4102, an electronics bay 4103, a charging ring 4104, support legs 4105, and support column 4106.
  • the electronics bay 4103 may be positioned underneath the landing platform 4102 and may serve primarily as a sub-platform for storing and/or holding various electronics for the support of a UAV and the mechanical system of the station 4101 as a whole. This may include any units necessary for the charging ring's 4104 functionality and/or any other parts including but not limited to wires, batteries, sensors, and various circuitry.
  • the electronics bay 4103 may be further configured to hold other objects, such as spare mechanical parts for the landing station 4101 or a UAV utilizing the same landing station 4101.
• a computer and compatible graphical user interface may also be incorporated into the electronics bay 4103, with the ability to be accessed from inside or outside the electronics bay 4103.
• outside accessibility may entail the graphical user interface (at minimum) being on a sliding apparatus integrated into the electronics bay 4103, such
• the station 4101 may be supported by a framework, such as two legs 4105, which may have holes drilled through bottom components so as to enable fastening to the ground underneath. These fasteners may be composed of concrete screws,
• fasteners may be further configured to be easily removable so as to render the station 4101 more portable, and may also be used to level the station 4101 to the ground
  • FIG. 28B shows a top view of the station 4101. In this configuration, the legs
  • FIG. 28C shows a view of the station from the side, with the charging ring
  • FIG. 28D shows a bottom view of the
• the station 4101 is elevated by support legs 4105; however, the support legs may assume a different shape or aesthetic design depending on the overall size
• FIG. 28E shows an independent view of the
  • the support column vertically
• FIG. 28F shows an independent view of one design of one of the support legs 4105.
• FIG. 28G shows an independent view of one half of the charging ring 4104, which may be used to charge or recharge a power source of a UAV, and the
• the capture ring 4106 may be used to help guide the UAV to the appropriate position so it can electrically connect with the charging ring 4104.
• the capture ring 4106 may be ideally shaped so as to account for any possible error in a UAV's landing position.
• the charging ring 4104 may be positioned as the floor of an angled slot within the capture
  • the feet of the UAV may be guided to
• FIG. 28H shows a top view of the same half of the capture ring 4106 and
• each half of the charging ring 4104 may have a strong opposite polarity relative to the other half, and as such the charging rings may act as the terminals of a traditional battery.
  • the power source inducing this change in
• the feet of a UAV utilizing the station 4101 may be equipped with a charging apparatus
  • the charging ring 4104 and the feet of the UAV may utilize another
• FIGS. 29A-29C describe an example process of how a UAV 4201 may land
• FIG. 29A depicts a UAV 4201 interacting with electromagnetic
  • the UAV 4201 may be equipped with multiple
  • FIG. 29B shows another example system in which the UAV 4201
  • the camera 4203 would be powered by and communicate with devices located
  • the UAV 4201 may then have the ability to perform
• FIG. 29C shows the UAV 4201 and station 4101 after a successful landing following an approach as described in FIG. 29A or FIG. 29B, with the feet of the UAV 4201 making contact with the charging ring 4104.
  • FIGS. 30A-30B describe one of many alternative embodiments of the present disclosure for a landing and containment station.
  • the main difference between the design of this alternative embodiment and the design described in FIGS. 28A-29C is the inclusion of a roof system and housing components, thus making this design capable of storing a UAV internally and rendering it exempt from outside exposure, as well as a charging system involving electromagnets.
• FIG. 30A shows a holistic view of the alternative station 4301. The entire structure is held up by the base support 4302, which may be held stable by fasteners through the fastening holes 4303. These fasteners may be composed of concrete screws, welds, nut and bolt configurations, spikes, and other various devices and methods for solidifying the foundations of the station 4301.
  • the base support 4302 may have an I-shape structure with a hollow core that can store network and computational devices.
  • the base support 4302 may also assume a different shape or aesthetic design depending on the overall size, shape, and weight of the station 4301.
• Connected to the base support 4302 is the landing platform 4102. Adjacently connected are the station walls 4305, which are in turn flexibly connected to the triangular roof components 4306. Located on the surface of the roof components 4306 may be solar panels 4307, which may provide power for the station 4301.
  • the flexibly connected roof components 4306 serve the purpose of opening and closing the main bay or interior compartment of the station 4301 in which the UAV is kept.
  • This main bay is defined as the enclosure created by the landing platform 4102, the station walls 4305, and the roof components 4306.
  • the main bay combined with any machinery that may induce the movement of the main bay will henceforth be described as the dynamic roof management system. This system may perform some or all of its features or actions completely autonomously.
  • FIG. 30B shows a top view of the station 4301.
  • the landing platform 4102 can be seen to have embedded charging rings 4308 and a larger conical impression 4309.
• One UAV attraction element and method that may be employed includes electromagnets, which may consist of coiled wire around an inductive element fastened beneath the charging rings 4308 on the landing platform 4102.
• the electromagnets may serve two functions: bringing a UAV down into a centered, stable position in the final inches of descent, and, during take-off, holding a UAV down while the motors of the UAV reach a flight-operational state. The latter function may prevent the UAV from coming into contact with roof components 4306 that may still protrude from the station 4301 when the UAV attempts to take off from the landing platform 4102.
  • the electromagnets may also ensure correct polarity during charging.
  • the charging rings 4308 may charge the UAV by using low-resistive rings that are attached to the landing platform 4102 centered on the center axis of the electromagnets.
• the polarity of the rings would correspond with the polarity of the electromagnetic field, ensuring that potentially battery-harming cross polarity does not occur.
• the polarity of any electromagnets that may exist on a UAV would match the permanent magnetic polarity scheme of the landing platform 4102.
  • current may flow from low-resistive rings to low-resistive permanent magnet casings (components of the charging rings 4308), and then to the appropriate power source circuitry.
  • the larger conical impression 4309 may increase the strength of the electromagnets that may exist underneath the charging rings 4308 with respect to corresponding electromagnets that may exist on a UAV.
  • the charging rings 4308 may be modified in size, shape, or position on the landing platform 4102 in order to accommodate or be compatible with other UAVs. Consequently, the conical impression 4309 may be subject to similar changes in size, shape, and position in order to serve the same purpose as previously described.
• FIGS. 31A-31E show the station 4301 in more detail, in addition to the dynamic roof management system and its important features, each contributing to the overall tasks and performance of the station 4301.
• FIG. 31A shows an initial view of the station 4301 with two of the closest station walls 4305 and corresponding roof components 4306 omitted so as to provide better detail of the inside components of the station 4301. It is important to note that all the components seen in FIGS. 31A-31E from this perspective may exist symmetrically on the sides of the station 4301 that are omitted for the sake of detailed perspective.
• FIG. 31A shows a first such perspective of a slide 4402, fastening section 4403, and the rack drive train section 4404.
  • a spring angular closing system may also be implemented to automatically raise or lower the roof components 4306 as needed.
• This spring angular closing system may provide the functionality of angling the roof components 4306 to meet at a center point only when the station walls 4305 rise to a certain point.
• the purpose of implementing this system would be to allow for the opening and closing of the roof components 4306 without the use of heavy motors at the pivot points where the roof components 4306 flexibly connect with the station walls 4305. This also allows the station walls 4305 of the main bay to retract below the landing platform 4102 without ever coming into contact with the UAV 4401.
• a spring angular closing system may be implemented by using two springs for each of the four vertical sides of the station 4301; one linear spring would be attached to the landing platform 4102 and one of the four roof components 4306, while another radial spring would be attached to the same roof component 4306 and the exterior of the closest of the station walls 4305. These springs would have the appropriate properties of tension and strength necessary to accomplish the appropriate timing for pushing and pulling the roof components 4306 into an opened or closed position.
• FIG. 31B and FIG. 31C show the same perspective and components as FIG. 31A, but with the station 4301 in a midway position of opening the roof components 4306 and lowering the station walls 4305 in FIG. 31B, and the roof components 4306 fully open and station walls 4305 fully lowered in FIG. 31C.
• FIG. 31D shows a top side view of the station 4301 with the roof components 4306 fully open so as to allow for the UAV 4401 to take off.
• FIG. 31A, FIG. 31B, FIG. 31C, and FIG. 31E show the location of where electromagnets 4411 may exist underneath the charging rings 4308 located on the landing platform 4102.
  • the slide 4402 and fastening section 4403 may stabilize a rack 4405, which is essential to facilitating the linear motion of the station walls 4305.
  • This linear retraction of the station walls 4305 and connected roof components 4306 allows for the UAV to land and take-off from the landing platform 4102 without risk of hitting the station walls 4305 or roof components 4306.
  • Stabilization of the rack will increase water-resistance, reliability in functionality, and the robustness of the entire station 4301.
• a constraint of the rack 4405 with respect to the X, Y, and angular dimensions (i.e., all geometric and polar dimensions excluding all vertical dimensions)
  • the landing platform 4102 of the station 4301 utilizes a slot slide 4402 made of nearly frictionless material that appropriately fits the corresponding dimensions of the rack 4405, while still allowing vertical movement.
  • Each secured rack 4405 would be attached to each of the station walls 4305.
  • FIG. 3 IE shows a closer view of the rack drive train section 4404.
  • the rack drive train section 4404 serves the purpose of driving the rack 4405 vertically upwards or downwards.
  • a compound gear 4406 is placed on an axis located at a point in which the larger section of the compound gear's teeth are aligned with the rack 4405.
  • This axle is supported at both ends by a triangular support 4407 connected to the landing platform 4102.
• a separate gear 4408 is then aligned to intertwine with the smaller gear component of the compound gear 4406; this second gear 4408 is similarly supported by a triangular axle support 4409 at both ends.
• a drive motor 4410 may be fastened to a plate that is perpendicular to the landing platform 4102 and parallel to the station wall 4305.
• This motor 4410, along with three other similar motors located in the other rack drive train sections 4404, would drive the previously described axle and therefore the station walls 4305.
  • the motor 4410 would also be equipped with a limit switch, a device that would be capable of sending an electronic signal when the motor approaches the maximum or minimum driving distance along the rack 4405.
  • An operational control system may be responsible for the autonomous operation of the dynamic roof control system's motors 4410 as well as all other aspects of communication and proper landing of the UAV.
• the operational control system may be comprised of a computational unit, a communication interface, and any other electronic units necessary to perform its designated tasks.
  • the operational control system and all of its components may be housed in the structure of the base support 4302.
  • One primary task assigned to the operational control system would be to ensure the smooth operation of the motors 4410 by taking into consideration the signal value of the motor's 4410 respective limit switches.
  • the operational control system may also be tasked with (and appropriately equipped to accomplish) other assignments such as storing charging data, communicating wirelessly with a UAV, or communicating with any other components that may be added to the station 4301. These tasks would aid in increasing robustness and reliability of the station 4301, and with the assistance of an operational control system, the station 4301 would be able to accomplish autonomous UAV landing, housing, and take-off.
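• As a purely illustrative sketch of one task described above, the following Python fragment shows how an operational control system might drive a rack drive motor 4410 until its limit switch reports the end of travel. The motor and switch interfaces are hypothetical stand-ins, not an implementation from the disclosure.
```python
import time

class RackDriveMotor:
    """Hypothetical stand-in for a drive motor 4410 with an attached limit switch."""

    def __init__(self, travel_steps: int = 100):
        self._position = 0
        self._travel_steps = travel_steps

    def step_up(self):
        self._position = min(self._position + 1, self._travel_steps)

    def step_down(self):
        self._position = max(self._position - 1, 0)

    def limit_switch_triggered(self, direction: str) -> bool:
        # The limit switch signals when the motor nears the maximum or
        # minimum driving distance along the rack 4405.
        if direction == "up":
            return self._position >= self._travel_steps
        return self._position <= 0

def drive_walls(motor: RackDriveMotor, direction: str, step_delay_s: float = 0.0):
    """Drive the station walls until the limit switch for that direction trips."""
    step = motor.step_up if direction == "up" else motor.step_down
    while not motor.limit_switch_triggered(direction):
        step()
        time.sleep(step_delay_s)

if __name__ == "__main__":
    motor = RackDriveMotor()
    drive_walls(motor, "up")    # raise the walls/roof until the upper limit switch trips
    drive_walls(motor, "down")  # lower them again
```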
  • UAV flight deliveries may be completed in a matter of hours versus standard delivery methods which often take several days.
• UAV design has become increasingly relevant to ensure safe and accurate deliveries. Since deliveries will vary in size, shape, and weight, it may be necessary to have a versatile UAV design that is able to accomplish a variety of deliveries.
• delivery companies may not only want to utilize unmanned aerial delivery but also optimize the process. Such optimizations may include minimizing UAV down time, rapidly checking the operational status of each UAV component, and quickly replacing and/or repairing UAV components.
• UAVs may be specially designed to be modular such that they can be easily dismantled and reconstructed before and after each delivery if repairs or modifications are in order.
  • a UAV must be designed so as to easily increase or decrease its carrying capacity and power supply in order to achieve maximum efficiency.
• FIGS. 32A-32C show an example of an octocopter constructed using a stacking method with modular UAV components.
• to create a four-rotor copter (hereby referred to as a "quadcopter"), the user may simply connect a control board (or any other electronic devices necessary for the successful operation of a UAV) located on the control plate 5101 to a single rotor plate 5102-1.
• the rotor plate 5102-1 may have four or more rotor arms, each of which is sized to carry a rotor.
  • Using one rotor plate 5102-1 may provide sufficient lift and power for utilization of a lighter payload, but if a user requires a UAV to carry a heavier payload that a quadcopter configuration cannot support, then the user may simply add another rotor-plate 5102-2 that is out of phase with respect to the prior rotor plate 5102-1 by 45°.
  • the addition of this second rotor plate 5102-2 would create an eight rotor copter (hereby referred to as an "octocopter").
  • These plates may be connected and secured at easily accessible and intuitive fastening points 5103 located symmetrically around the structure of the plates.
  • a modular style of battery stacking on the control plate may be implemented to ensure that only the sufficient power required for flight is on board.
  • This ability to specifically select UAV components based on payload weight, a delivery distance, or another aspect, may reduce weight of the UAV, and in turn may increase the lifetime of the battery while also reducing the frequency at which the batteries would have to be replaced.
• Attaching additional sensors, payload holders, or other system elements would simply be a matter of stacking an additional unit (similar to the configuration and compatibility of the plates previously described) with the desired components below the control plate or above the rotor plate, given that it is designed not to impede the rotors.
  • An additional design advantage of a UAV constructed utilizing a stacking method is the ability to additionally dampen vibrations coming from the rotors that may affect the control board. By adding dampening elements on the connection point between each plate, or the connection between the control plate 5101 and the rotor plates 5102, the vibrations described may be minimized or eliminated.
  • the octocopter may include a control plate 5101 that houses both the batteries and the control units 5104. Two rotor plates 5102 are attached at the connection points 5103 above the control plate 5101.
  • FIG. 32B shows a single rotor plate with its motor mounts 5106 and control unit 5104.
• in FIG. 32C, two rotor plates are stacked on top of each other to create an octocopter. Located on each rotor plate of the octocopter are motor mounts 5106 and the respective motor controllers 5105, as well as the control unit 5104.
• FIG. 33 demonstrates an example algorithmic system process of how a UAV may be configured to accomplish a specific delivery. This customization may be made possible by the utilization of a UAV constructed utilizing a stacking method as described in FIGS. 32A-32C. This system optimizes a payload delivery system by minimizing the excess material for each flight, and minimizing the amount of time a UAV control plate is out of commission.
  • One example implementation of the present disclosure may begin with weighing the payload 5201 and determining the distance required to travel 5202 for purposes of optimization.
• the system may then determine through payload thresholds 5203, 5204, and 5205 whether to use a quadcopter with one rotor plate for light payloads 5208, an octocopter with two rotor plates for heavier payloads 5209, or a circular frame with three rotor plates for the heaviest payloads 5210, respectively. In the case that the payload exceeds the maximum weight that a UAV may carry, the payload may be split into separate deliveries 5206 and the system may restart the algorithmic process by weighing each of the divided payloads 5201 and determining the required distance to travel 5202 for each.
• alternatively, it may be communicated to a user or construction system that the payload is not deliverable via the current UAV method 5207. It is important to note that the exact weight configurations for each payload and UAV may vary and the values shown in FIG. 33 are only one example implementation of the present disclosure.
  • the entire system may perform a quick systems test on the motors and control board 5213. Once a UAV is verified as functional, the system may then attach a payload 5214, perform the delivery to a delivery site, return 5215, and the UAV may land 5216. Before a UAV is ready for another payload and delivery, a user or construction system may perform a system check on the vehicle 5217. If all elements are functioning 5218, the system may remove the discharged batteries 5221 and the UAV would be recognized as ready for the next payload and delivery. In the event that one or more elements are dysfunctional, a user or construction system may check that the control board is functioning properly 5219.
• If the control board is functioning, then the system may utilize the same control plate for the next delivery and send the rotor plates for evaluation and repair 5222. Otherwise, if the control board is malfunctioning, the control plate may be decommissioned for repairs 5220. It should also be noted that the control board may be one of, if not the, most expensive component of a UAV. Consequently, constant utilization and maintenance of the control board is essential to maximizing the cost efficiency of a UAV constructed using the stacking method of the present disclosure.
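• The selection logic of FIG. 33 can be summarized with the following Python sketch, offered only as an illustration; the threshold weights are placeholder values, since the disclosure notes that exact weight configurations vary per payload and UAV, and the function name is hypothetical.
```python
def select_airframe(payload_kg: float,
                    light_max_kg: float = 2.0,
                    medium_max_kg: float = 5.0,
                    heavy_max_kg: float = 9.0):
    """Select a UAV configuration for a payload, following the flow of FIG. 33.

    The threshold values are illustrative placeholders only. Returns a
    configuration name, or a list of configurations if the payload must be split.
    """
    if payload_kg <= light_max_kg:
        return "quadcopter (one rotor plate)"
    if payload_kg <= medium_max_kg:
        return "octocopter (two rotor plates)"
    if payload_kg <= heavy_max_kg:
        return "circular frame (three rotor plates)"
    # Payload exceeds the maximum a single UAV may carry: split it into
    # smaller deliveries and run the same selection on each piece.
    pieces = []
    remaining = payload_kg
    while remaining > 0:
        piece = min(remaining, heavy_max_kg)
        pieces.append(select_airframe(piece, light_max_kg, medium_max_kg, heavy_max_kg))
        remaining -= piece
    return pieces

if __name__ == "__main__":
    print(select_airframe(1.5))   # quadcopter
    print(select_airframe(4.0))   # octocopter
    print(select_airframe(22.0))  # split into several heavier deliveries
```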
  • FIG. 34 shows a block diagram of an unmanned vehicle in communication with a management system for wind mapping using power consumption data.
• the unmanned vehicle (also referred to as UAV or unmanned aerial vehicle) 6102 may comprise a processor 6105, a communication device 6104, a memory device 6108, and a battery 6106.
  • the memory device 6108 may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device 6108 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device).
  • the memory device 6108 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present disclosure.
  • the memory device 6108 could be configured to buffer input data for processing by the processor 6105. Additionally or alternatively, the memory device 6108 could be configured to store instructions for execution by the processor 6105.
  • the processor 6105 may be embodied in a number of different ways.
• the processor 6105 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the processor 6105 may be configured to execute instructions stored in the memory device 6108 or otherwise accessible to the processor 6105. Alternatively or additionally, the processor 6105 may be configured to execute hard coded functionality.
  • the processor 6105 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present disclosure while configured accordingly.
• when the processor 6105 is embodied as an ASIC, FPGA or the like, the processor 6105 may be specifically configured hardware for conducting the operations described herein.
• when the processor 6105 is embodied as an executor of software instructions, the instructions may specifically configure the processor 6105 to perform the algorithms and/or operations described herein when the instructions are executed.
  • the processor 6105 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present disclosure by further configuration of the processor 6105 by instructions for performing the algorithms and/or operations described herein.
  • the processor 6105 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 6105.
  • the battery 6106 may be any container consisting of one or more cells by which chemical energy is converted into electricity and used as a source of power for the motor/generator 6107.
  • Examples of possible battery types that may be used include but are not limited to alkaline, lithium, lithium-ion, and carbon-zinc, all of which may or may not have rechargeable properties.
  • the battery depicted may not be restricted to solely providing power for the motor/generator 6107 and could also supply power to the elements within the computing module 6103.
  • the communication device 6104 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the apparatus.
  • the communication device 6104 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling
  • the communication device 6104 may alternatively or also support wired communication.
  • the communication device 6104 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
  • the processor 6105 may be embodied as, include or otherwise control a resource manager 6109.
• the resource manager 6109 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 6105 operating under software control, the processor 6105 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 6109 as described herein.
  • a device or circuitry executing the software forms the structure associated with such means.
• the processor 6105 may be further configured to receive power consumption specific information from the battery 6106, directly or indirectly. That is, a direct communication 6050 may be defined as the processor 6105 already possessing the ability to interpret power consumption data from the battery 6106, or an indirect communication method may be necessary, defined as another device coupled with the processor 6105 and the battery 6106 to act as a translator or relay process between the two. In the case where a direct communication 6050 may not exist, and the battery 6106 may have other power commitments to other devices, the processor 6105 may be further configured to account for these other devices when determining the power consumption data specific to the motor/generator 6107.
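• For the indirect case described above, one simple way to isolate motor-specific consumption is to subtract the estimated draw of the other devices from the total battery draw. The following Python sketch is illustrative only; the load figures are assumptions, not values from the disclosure.
```python
def motor_power_w(total_battery_draw_w: float, other_loads_w: dict) -> float:
    """Estimate power consumed by the motor/generator 6107 when only the total
    battery draw is observable (the 'indirect' case described above).

    other_loads_w maps each non-propulsion device (sensors, camera, compute, ...)
    to its estimated draw in watts; these figures are assumed inputs.
    """
    motor_w = total_battery_draw_w - sum(other_loads_w.values())
    return max(motor_w, 0.0)  # clamp: estimated loads may slightly exceed the measurement

if __name__ == "__main__":
    print(motor_power_w(310.0, {"camera": 4.5, "flight computer": 8.0, "telemetry": 2.5}))
```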
  • the resource manager 6109 is configured to control the allocation of wireless communication resources to enable the
• the resource manager 6109 is configured to allocate resources for use by machines or management systems such as the management system 6101 to communicate directly with a cellular network (e.g., in the downlink direction), and/or to communicate with other machines or management systems (bi-directionally), and/or to communicate with a gateway or relay.
  • the resource manager 6109 may be configured to allocate wireless network downlink resources (e.g., cellular downlink channel resources) for use by the management system 6101 to provide signaling to other machines or management systems or to the gateway.
  • the resource manager 6109 may also be configured to allocate wireless network uplink resources (e.g., cellular uplink channel resources) to receive data from the management system 6101 via the gateway (e.g., the unmanned vehicle 6103). Uplink and downlink resources may also be managed with respect to communications with the unmanned vehicle 6103 for communications that are not related to data being reported by the management system 6101 or other machines or management systems.
• Parameters affecting power consumption data in a UAV may include power usage by equipment installed on the unmanned vehicle, such as sensors, cameras, etc., the speed of the UAV, and the climate where the UAV is traveling. Specifically, the climate is strongly influenced by the presence of wind factors, which may increase or decrease power consumption based on wind speed and direction. This relationship between power consumption and the UAV's ambient wind factors may then be utilized to map wind vectors. Wind mapping is an essential tool for management systems and users, and provides more information about the
  • an unmanned vehicle 6102 may travel in winds, or any other weather related event 6201.
• Wind stands as a general term that includes but is not limited to gusts, breezes, and storms.
  • the unmanned vehicle's travel trajectory and plans are affected by the presence of wind 6201.
  • FIG. 35 depicts the unmanned vehicle 6102 traveling in the same direction as the wind 6201, but it is also possible for the unmanned vehicle 6102 to travel perpendicular, diagonally, or any other directional degree with respect to the wind 6201.
  • the wind may represent a dynamic variable that accounts for the rate of change of wind velocity as well as the intensity or strength of the wind as determined by its velocity and force.
• When traveling with wind 6201, the unmanned vehicle 6102 may consume more or less power from the battery. As a result, actual power consumption data is dependent on wind factors and the angle of the wind with respect to the unmanned vehicle 6102. For example, when the unmanned vehicle 6102 heads opposite the direction of the wind 6201, it typically consumes more power in order to account for the force pushing it away from its trajectory. On the contrary, if the wind 6201 and unmanned vehicle 6102 head in the same direction as seen in FIG. 35, the unmanned vehicle 6102 consumes less power because the wind 6201 provides a force to support the unmanned vehicle's flight.
• the power consumption data can be analyzed using external computing by a management system to determine the cause of varying power consumption data. This computing method may give sufficient data to determine the direction and intensity of the wind 6201 to a specific accuracy.
• FIG. 36 shows a graph of the power consumption data specific to the motors of the unmanned vehicle relative to the direction of the unmanned vehicle according to the example embodiment of FIG. 35.
  • the x-axis of the graph indicates what angle the wind is blowing with respect to the unmanned vehicle, and the y-axis indicates how much power is being consumed by the unmanned vehicle 6102 in watts.
  • the example embodiment of the present disclosure is only one possible representation of the unmanned vehicle's 6102 power consumption; any type of graph or visual representation may be used with any scale of x, y variables and units.
• in FIG. 36 it can be seen that the graph follows a bell curve, but if the graph were extended it would closely resemble a sinusoidal wave.
• the curve is closest to zero in terms of power consumption. It will not be exactly zero watts because the unmanned vehicle will consume power regardless of the surrounding wind intensity. However, these directions would be the graph's minimum according to the example embodiment in FIG. 35, where the unmanned vehicle would be traveling in the same direction as the wind. In this case the unmanned vehicle may consume less power because the force of the wind will assist the unmanned vehicle to continue on its flight path and the wind will not be hindering the unmanned vehicle from going in its intended direction. It can be seen that the peak of the curve is at approximately 180 degrees, which, if the unmanned vehicle were to turn and travel in the horizontal direction of 180 degrees (as shown in FIG.
  • the unmanned vehicle may consume the greatest amount of power.
• the power consumption of the unmanned vehicle may be calculated using an external computing method.
• the other, straight curve represents the base power usage if wind were not present and thus not affecting the unmanned vehicle's flight. Additionally, the exact values of power consumption and degrees will be unique to each unmanned vehicle based upon the unmanned vehicle's design and the existing technology present on or inside the vehicle. In terms of calculating power consumption, the unmanned vehicle may determine its power consumption relative to its base power usage (power usage with no wind present), or another computing process may be employed.
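• A toy model of the relationship plotted in FIG. 36 can be written down directly: consumption sits near the base power usage, decreases toward a tailwind (0 degrees), and increases toward a headwind (180 degrees). The Python sketch below is illustrative only; the base power, coefficient, and floor values are assumed numbers, not figures from the disclosure.
```python
import math

def modeled_power_w(wind_angle_deg: float,
                    wind_speed_ms: float,
                    base_power_w: float = 250.0,
                    wind_coefficient: float = 12.0,
                    hover_floor_w: float = 60.0) -> float:
    """Toy model of the curve described for FIG. 36.

    wind_angle_deg is the angle between the wind vector and the UAV heading:
    0 degrees means a tailwind (minimum power), 180 degrees a headwind (maximum).
    base_power_w is the flat no-wind curve; all constants are illustrative.
    """
    theta = math.radians(wind_angle_deg)
    # Tailwind (cos = 1) reduces consumption, headwind (cos = -1) increases it,
    # producing the roughly sinusoidal curve described above.
    power = base_power_w - wind_coefficient * wind_speed_ms * math.cos(theta)
    return max(power, hover_floor_w)  # the UAV never consumes exactly zero watts

if __name__ == "__main__":
    for angle in (0, 90, 180):
        print(angle, round(modeled_power_w(angle, wind_speed_ms=6.0), 1), "W")
```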
• FIG. 37 shows an example graphical user interface (GUI) in accordance with embodiments of the present disclosure.
  • An interactive map is displayed through a web browser showing wind patterns and directions calculated by the management system using the unmanned vehicles' power consumption data, in accordance with embodiments of the present disclosure.
  • the graphical user interface (GUI) 401 may include the unmanned vehicle(s) 6201, the unmanned vehicle's flight path 6404, and wind vectors 6201-1 and 6202-2.
• the details of the flight may be seen in a web browser or other computer window 6401.
  • the background of the GUI will have a map 6402, which may display the general area of where the unmanned vehicle takes off, where the unmanned vehicle lands 6403, and any places in between the unmanned vehicle's takeoff and landing spot as shown by the flight path 6404.
  • the map 6402 may change as the unmanned vehicle's flight path changes and may be interactive through scrolling, or it may automatically follow the unmanned vehicle(s) through tracking.
  • the GUI will be able to display more than one unmanned vehicle 6404 and how each flight path 6404 may be used to determine wind direction and intensity as shown by the imaging of wind vectors 6201-1 and 6202-2 on the map.
• the unmanned vehicle's flight path may be in the direct path of a wind, which will affect the unmanned vehicle's power consumption. Based on the power consumption data computations, the external management system will be able to compute and display the direction and intensity of the wind through a visual representation of wind vectors 6201-1 and 6202-2 on the GUI.
  • FIG. 38 illustrates a block diagram of a wind mapping system in accordance with embodiments of the present disclosure.
  • an onboard system 6510 in communication with a cloud environment 6520 to transfer onboard information for cloud processing.
• the onboard system 6510 may comprise a collector 6511, passive sensor systems 6512, electrical feedback sensors 6513, and other components, such as an output control module, etc.
• the electrical feedback sensors 6513 function to obtain power consumption information of the UAV.
• the passive sensor systems 6512 may include a Global Positioning System (GPS) receiver to obtain information such as the GPS signal, compass information, UAV speed, etc.
• the passive sensor systems 6512 may also include a temperature sensor, humidity sensor, and barometric pressure sensor to obtain temperature, humidity, and pressure information, respectively.
  • the collector 6511 couples to the aforementioned components, collects information to be sent and transfers the collected information via a wireless communication path to the cloud environment 6520.
• the cloud environment 6520 processes the information for various applications.
  • the electrical load measurement may be used for wind mapping 6521.
• the information may also be used for 3D cell coverage mapping 6523, 3D GPS receiving mapping, and determining magnetic field anomalies using compass error information 6523.
  • the cloud environment 6520 comprises a model installed on a cloud server to calculate wind factor, including wind speed and direction, based on UAV power consumption and other required information.
  • the model may be preloaded with an aerodynamic profile of the UAV, which may be necessary to resolve wind factor information with desired precision.
  • the other required information may comprise UAV speed and position information, which may be obtained through GPS signal received onboard.
  • the other required information may also comprise ambient air temperature and humidity, which may be measured with onboard sensors too.
  • the other required information may also comprise UAV aviation status information, such as pitch angle, roll angle and yaw angle.
• the model may be pre-calibrated by comparing the calculated wind factor information with wind factor information obtained from third parties, such as weather broadcast services. In operation, all obtained information, including UAV power consumption, speed, position, and aviation status, is input to the model for the calculation of wind factors.
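• As an illustration of how such a pre-calibrated model might invert power consumption into a wind estimate, the Python sketch below fits wind speed and direction from motor power readings taken at several headings, using the same toy cosine model as the previous sketch. The coefficients stand in for the pre-loaded aerodynamic profile and are assumptions, not values from the disclosure.
```python
import math

def estimate_wind(samples, base_power_w=250.0, wind_coefficient=12.0):
    """Fit wind speed and direction from (heading_deg, power_w) samples.

    Uses the toy model: power = base - k * v * cos(heading - wind_dir).
    base_power_w and wind_coefficient are placeholders for the calibrated
    aerodynamic profile. Returns (wind_speed_ms, wind_direction_deg).
    """
    # Linearize: base - power = a*cos(h) + b*sin(h), with a = k*v*cos(w), b = k*v*sin(w)
    scc = scs = sss = sc_r = ss_r = 0.0
    for heading_deg, power_w in samples:
        h = math.radians(heading_deg)
        c, s, r = math.cos(h), math.sin(h), base_power_w - power_w
        scc += c * c; scs += c * s; sss += s * s
        sc_r += c * r; ss_r += s * r
    det = scc * sss - scs * scs
    if abs(det) < 1e-9:
        raise ValueError("headings are too similar to resolve a wind direction")
    a = (sc_r * sss - ss_r * scs) / det   # least-squares solution of the 2x2 system
    b = (ss_r * scc - sc_r * scs) / det
    speed = math.hypot(a, b) / wind_coefficient
    direction = math.degrees(math.atan2(b, a)) % 360.0
    return speed, direction

if __name__ == "__main__":
    # Synthetic readings taken while flying legs at several headings.
    obs = [(0, 178.0), (90, 250.0), (180, 322.0), (270, 250.0)]
    print(estimate_wind(obs))  # roughly 6 m/s blowing toward heading 0
```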
  • FIG. 39 illustrates an example flow diagram for wind mapping in accordance with embodiments of the present disclosure.
• the power consumption specific information from the battery of a UAV is obtained.
  • the power consumption may be obtained directly or indirectly, as described earlier.
  • aviation information of the UAV is obtained.
• the aviation information may comprise UAV speed, direction, position, and flight attitude as defined by Euler angles, including roll, pitch, and yaw angles.
• the UAV speed and direction may be obtained via the received GPS signal using an onboard GPS receiver.
  • the received GPS signal may be used in a differential algorithm to calculate the speed, direction and position.
  • the Euler angles may be obtained using an onboard gyroscope.
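• A minimal example of the differential approach mentioned above, assuming nothing beyond two timestamped GPS fixes, is sketched below in Python; it uses a flat-earth approximation and is not an algorithm taken from the disclosure.
```python
import math

EARTH_RADIUS_M = 6371000.0

def speed_and_course(lat1, lon1, t1, lat2, lon2, t2):
    """Differential estimate of ground speed (m/s) and course (degrees from north)
    from two timestamped GPS fixes, using a local flat-earth approximation.
    """
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("fixes must be time-ordered")
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M                       # north displacement
    dx = math.radians(lon2 - lon1) * EARTH_RADIUS_M * math.cos(mean_lat)  # east displacement
    speed = math.hypot(dx, dy) / dt
    course = math.degrees(math.atan2(dx, dy)) % 360.0
    return speed, course

if __name__ == "__main__":
    # Two fixes taken one second apart.
    print(speed_and_course(47.60620, -122.33210, 0.0, 47.60625, -122.33200, 1.0))
```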
  • ambient air information around the UAV is obtained.
• the ambient information may be air temperature, humidity, and pressure, which may be measured using an onboard temperature sensor, humidity sensor, and barometric pressure sensor, respectively.
• Weather information provided by weather broadcast services may not be precise enough or may not be available in real time. Precise ambient air information measured onboard would still be necessary to meet the precision requirement of the calculation.
• the power consumption information, UAV aviation information, and the ambient air information may be refreshed at the same or different rates.
• the power consumption information, UAV aviation information, and the ambient air information are transferred to the management system via a wireless communication path.
  • the transfer schedule is in line with the information refresh rate.
• wind factor is calculated based on the transferred power consumption information, UAV aviation information, and the ambient air information. The calculation may be done on a cloud server accessible via the internet.
• the wind factor comprises wind speed and direction, which are mapped into a wind map displayed on a graphical user interface (GUI).
• the GUI may be displayed as an interactive map accessible through a web browser. Besides the wind map, the GUI may also comprise other layers of information, such as the UAV and its flight path, a geographic background map, etc.
• For an exemplary GUI and its description, refer back to FIG. 35.
• although exemplary steps for wind mapping are shown in FIG. 39, it is understood by one skilled in the art that certain steps may optionally be performed; certain steps may be performed in different orders; and certain steps may be performed concurrently.
• FIGS. 40A-40B show the underside of a UAV 7101 with a stationary locked camera 7102.
  • FIG. 40A shows the entire UAV 7101, with FIG. 40B showing a closer view of the camera 7102 connected to the underside of the UAV 7101. It should be understood that various other implementations with respect to imaging optics and hardware may be used without departing from the core novelties of the present invention.
  • FIG. 41 describes an example system embodiment of the present invention, in which five UAVs 7101 communicate with an image compilation server 7201, which in turn may communicate with a computer 7202.
  • the five UAVs 7101 would capture image data from various geographic locations and send the data wirelessly to the image compilation server 7201.
• the image compilation server 7201 should be understood to have the qualities and functionality of a cloud-based server capable of enabling network access among a plurality of computing elements. Additionally, it should be understood that the image compilation server 7201 would be programmed or otherwise enabled to store image data (which may include image data that is not from the UAVs of the present invention, such as satellite imagery), autonomously perform algorithmic processes on said image data, and store the results of those processes.
• the computer 7202 should be understood to be any device which is at minimum capable of storing and processing data according to instructions given to it by a variable program.
  • at least one computer 7202 would be in communication with the image compilation server 7201 directly or indirectly, via either a wireless (e.g. via WiFi, Bluetooth, or a cellular network) or direct (e.g. via a USB, FireWire, DIN, or e-SATA/SATA) connection.
• At least one computer 7202 would be equipped with a monitor or other display component that would allow the virtual 3D environment constructed by the image compilation server 7201 to be displayed to a user in a virtual reality (e.g. via immersive wearable headgear, or large field of view and depth of field glasses), augmented reality (e.g. via holographic displays or projector-equipped glasses), or traditional display fashion (e.g. via LED, LCD, or CRT displays). All three of these display types would be equipped with necessary hardware and software so as to enable a user to use the computer 7202 to view, navigate, label, partition, save, or otherwise add, subtract, or change the virtual 3D environment viewed for whatever purposes a user may require.
• a UAV 7101 registered or otherwise in communication with the image compilation server 7201 may include a telemetry unit 7301, a processor 7302, a resource manager 7303, camera and imaging components 7304, a memory device 7305, mechanical components 7306, and a GPS unit 7307. These seven components (including the physical frame and supporting body) would compose the unmanned aerial vehicle as previously mentioned.
  • the memory device 7305 may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device 7305 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device).
  • the memory device 7305 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present invention.
• the memory device 7305 could be configured to buffer input data for processing by the processor 7302. Additionally or alternatively, the memory device 7305 could be configured to store instructions for execution by the processor 7302.
  • the processor 7302 may be embodied in a number of different ways.
  • the processor 7302 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the processor 7302 may be configured to execute instructions stored in the memory device 7305 or otherwise accessible to the processor 7302.
  • the processor 7302 may be configured to execute hard coded functionality.
  • the processor 7302 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present invention while configured accordingly.
  • the processor 7302 may be specifically configured hardware for conducting the operations described herein.
• when the processor 7302 is embodied as an executor of software instructions, the instructions may specifically configure the processor 7302 to perform the algorithms and/or operations necessary for the UAV to operate successfully.
  • the processor 7302 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present invention, and may entail further configuration of the processor 7302.
• the processor 7302 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 7302.
• the telemetry unit 7301 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to the image compilation server 7201 and/or any other device or module in communication with the apparatus.
  • the telemetry unit 7301 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network.
  • the telemetry unit 7301 may alternatively or also support wired communication.
• the telemetry unit 7301 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
  • the processor 7302 may be embodied as, include or otherwise control a resource manager 7303.
  • the resource manager 7303 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 7302 operating under software control, the processor 7302 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 7303 as described herein.
• a device or circuitry (e.g., the processor 7302) executing the software forms the structure associated with such means.
• the resource manager 7303 would be configured to control all mechanical components 7306, the camera and imaging components 7304, and the GPS unit 7307 of the UAV. Upon receiving instructions from the processor 7302, the resource manager 7303 would interpret and translate the instructions into actions performed by the mechanical components 7306, and if enabled to do so, the camera and imaging components 7304.
• the mechanical components 7306 of the UAV may include but are not limited to rotors, LEDs, rudders, mechanical or robotic arms, retractable cables, gimbals, or other mechanical apparatus.
• Any and all data from the camera and imaging components 7304 and GPS unit 7307 would be interpreted by the resource manager 7303, and in turn analyzed by the processor 7302 so as to communicate the data to the image compilation server 7201 via the telemetry unit 7301.
  • the geographic location data produced by the GPS unit 7307 would be processed and sent in an identical, parallel manner.
  • FIG. 43 describes the algorithmic process by which the image compilation server creates the virtual 3D environment of the present invention, given images acquired by UAVs.
  • the process begins with multiple UAVs acquiring images 7401.
• the images are then transmitted to the image compilation server via a form of wireless communication 7402, such as ISM radio band communication (most commonly in the 2.4 GHz RF spectrum), cellular communication networks, or other similar long-range communication methods.
  • Each of the images is then associated with or assigned to a specific geographic area 7403.
  • an image may be further cropped, split, or otherwise modified as calculated manually (by a user) or automatically by the image compilation server so as to make more precise assignments or associations of the image data with geographic areas.
• the image compilation server then associates angles of perspective to images of the same geographic area 7405, and accomplishes this task for every geographic area captured by the UAVs. Finally, the image compilation server constructs the 3D environment utilizing a processing algorithm 7406 that those familiar with the art will recognize as feasible, given that all data necessary to construct the 3D environment is supplied by the UAVs of the present invention.
• the 3D environment should be understood to be any form of data that could be interpreted to represent a three-dimensional space containing one or more three-dimensional objects.
• the 3D environment may be in the form of a large three-dimensional matrix, with each data value in the matrix representing whether the space is occupied by an object, and if so, what the properties of the object are (e.g.
• the 3D environment may be in the form of a software-specific data format, where the 3D compilation server utilizes an existing 3D visualization software file format to store the data (e.g. '.obj', '.skp', '.sldprt', among others).
  • This more complex implementation may be advantageous for utilizing the 3D environment to accomplish tasks other than simply viewing the environment, such as 3D printing the environment or otherwise reconstructing the environment in reality.
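• To make the three-dimensional matrix representation concrete, the Python sketch below stores occupied cells sparsely and attaches arbitrary properties to each. It is an illustrative assumption of one possible layout, not the data format of the disclosure.
```python
from dataclasses import dataclass, field

@dataclass
class VoxelEnvironment:
    """Minimal sketch of the '3D matrix' representation described above: a sparse
    grid in which each occupied cell stores the properties of the object there.
    Field names and the property payload are illustrative assumptions.
    """
    cell_size_m: float = 0.5
    cells: dict = field(default_factory=dict)  # (ix, iy, iz) -> properties dict

    def _index(self, x_m: float, y_m: float, z_m: float):
        return (int(x_m // self.cell_size_m),
                int(y_m // self.cell_size_m),
                int(z_m // self.cell_size_m))

    def mark_occupied(self, x_m, y_m, z_m, **properties):
        # e.g. color, surface material, or which source image the cell came from
        self.cells[self._index(x_m, y_m, z_m)] = properties

    def is_occupied(self, x_m, y_m, z_m) -> bool:
        return self._index(x_m, y_m, z_m) in self.cells

if __name__ == "__main__":
    env = VoxelEnvironment(cell_size_m=0.5)
    env.mark_occupied(3.2, 1.0, 7.6, color="gray", source_image="img_0042")
    print(env.is_occupied(3.3, 1.1, 7.7), env.is_occupied(0.0, 0.0, 0.0))
```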
• FIGS. 44A-44C show a series of images describing a specific implementation where five UAVs 7101, as similarly depicted in a general fashion in FIG. 41, are located in certain locations, with the image capture area 7502 of each UAV also shown.
• FIG. 44A shows a map void of actual images taken by UAVs or a satellite, so as to better display a general figurative interface in which UAVs 7101 are located in various positions throughout a given geographic area 7501.
  • the dotted rectangles 7502 should be seen as the area capable of having image data collected by the UAV at the center of each rectangle.
  • FIG. 44B shows the same geographic area 7501 prior to UAVs being present, with outdated aerial images supplied, for example, either by the UAVs of the present invention or by a satellite.
• FIG. 44C shows an updated visualization of the geographic area 7501 as would be stored in the image compilation server previously described. It should be noted that the updated areas of the area 7501 have been denoted in a similar rectangular fashion 7502 as in FIG. 44A. Where overlapping image data occurs, the image compilation server previously described may either choose data collected from one UAV, or perform another image processing algorithm that would combine the best qualities of all the image data specific to the overlapping area to create a new set of image data for the purposes of display onto a graphical user interface.
• FIGS. 45A-45I show an example implementation of the present invention, by means of a series of images pertaining to a UAV 7101 flying over a building over time at multiple angles, so as to gather sufficient image data to allow the image compilation server as previously described to virtually reconstruct the building in 3D.
• FIG. 45A and FIG. 45B both show a UAV at the same position (south of the building, and slightly angled so as to yield an ideal perspective for the fixed camera or other imaging components) near the target building, with FIG. 45C showing a sample image taken by the camera located on the UAV 7101.
• FIG. 45D and FIG. 45E both show the UAV 7101 at the same position at a different time (directly over the building, with the plane formed by the rotor blades parallel to the ground) over the target building, with FIG. 45F showing a sample image taken by the camera located on the UAV 7101.
• FIG. 45G and FIG. 45H both show the UAV 7101 at the same position at yet another time (north of the building, and slightly angled so as to yield an ideal perspective for the fixed camera or other imaging components) over the target building, with FIG. 45I showing a sample image taken by the camera located on the UAV 7101.
  • the apparatus comprises one or more linear arrays 8103 of light-emitting diodes (LEDs) placed on or otherwise embedded in a UAV rotor blade 8102.
  • the LED arrays 8103 each include, for example, nine LEDs. The number of lights may vary for rotor radii of different lengths and desired resolution of the persistence of vision image formed.
• FIG. 46A shows a bottom view of the first implementation, with FIG. 46B showing a side view and FIG. 46C showing a top-left diagonal view.
  • the necessary wiring and connections to the LED arrays 8103 are present, and would be embedded in the UAV rotor blade, with the necessary larger and heavier circuitry components being located in the center compartment 8101.
• FIGS. 47A-47B show a rotor blade with a horizontal array of collimated laser diodes 8201 located inside the center compartment 8101 shining light beams along small rectangular cavities 8202 inside the rotor blade, until the beams come into contact with diffuse reflective surfaces 8203 that emit light underneath the rotor blade.
• the reflective surfaces 8203 may be comprised of additional optical components (e.g. lenslets, gratings, polished metals, etc.) so as to sufficiently reflect the laser diode beams downwards, or underneath the rotor blade 8102, in order to have the same or better lighting effect as an LED.
  • FIG. 47A shows a transparent, close up, top-left diagonal view of where part of the rotor blade 8102-1 conjoins with the center compartment 8101, and where the laser diodes 8201-1 are located and oriented. It should be understood that an identical configuration exists on the opposite side of the center compartment 8101.
  • FIG. 47B shows a holistic top-left diagonal view of the current implementation, where the reflective surfaces 8203 are periodically spaced along the rotor blade 8102.
  • FIG. 47C and FIG. 47D show bottom and side views of the current implementation, respectively. Again, as similarly explained in the first implementation, although not pictured, the necessary wiring and connections to the laser diodes 8201 are present, and would be embedded in the center compartment 8101.
• FIGS. 48A-48D show a rotor blade equipped with a horizontal array of uncollimated light sources 8301 (e.g. LEDs, uncollimated laser diodes, etc.) inside the center compartment 8101 and shining light along an aligned array of optical fibers 8302.
• This third implementation is aimed at addressing the possibility of utilizing uncollimated light sources as the light sources 8301 in the center compartment 8101 and yielding an identical beam redirection effect as in the second implementation, so as to reduce the rotor blade aerodynamic impacts as much as possible.
• FIG. 48A shows a transparent, close up, top-left diagonal view of where part of the rotor blade 8102-1 conjoins with the center compartment 8101, and where the uncollimated light sources 8301-1 are located and oriented.
  • FIG. 48B shows a holistic top-left diagonal view of the current implementation, where the optical fiber ends 8303 are periodically spaced along the rotor blade 8102. The optical fiber ends 8303 are gradually angled downwards so as to maintain a total internal reflection effect until the light is emitted beneath the rotor blade 8102. Additional lenses or optical components may be added to the end of the optical fibers 8302 so as to ensure as equal of a light flux (equal distribution of light across a given angle) as possible from the ends of the optical fibers 8302.
• FIG. 48C and FIG. 48D show top and side views of the current implementation, respectively.
• FIG. 49 is a block diagram of the circuitry 8400 for a light array 8406 of any of the previously described implementations of the present invention. That is, the light array 8406 should be understood to be the LEDs 8103, laser diodes 8201, or uncollimated light sources 8301, depending on which implementation is used.
  • the circuitry includes batteries 8405 for powering the microprocessor 8402, memory 8404, and light sources 8406. The LEDs are activated via series-parallel
• the microprocessor 8402 includes memory 8404 for storing the possible patterns. As shown, the two banks of nine light sources 8406-1 and 8406-2 are driven via serial to parallel converters 8403. When displaying text and other patterns, the microprocessor must know how the individual lights of the light arrays are distributed around the rotor blade. It should be noted that this spacing would be taken into account prior to programming in the specific light frequencies for each RPM of which the UAV is capable.
  • the light arrays 8406 rotate as the rotor blades spin.
  • the microprocessor can synchronize images and patterns displayed by the light arrays 8406 to the speed of the rotor blade. This allows images to be “frozen” or controllably “scrolled” in one direction or the other. Because the entire linear array of lights is swept during motion, it appears to the viewer as if the entire rotor blade is illuminated.
  • a memory 8404 connected to the microprocessor can store a large number of patterns, images, and messages. These images can be played out in a random, sequential, or fixed pattern. Selection of the playback method is done remotely by a user via the circuitry 8400, or may be automated by the circuitry 8400 itself based on the operational state of the UAV or a variety of other factors related to the UAV. This decision process may or may not be executed by the integrated UAV main computational unit 8401.
  • the UAV main computational unit 8401 should be understood to be comprised of some or all of the following components so as to be capable of accomplishing the electronic tasks the present invention may require: a processor, a telemetry unit, a power supply unit, a memory device, and other electronic devices.
  • FIG. 50 shows four different example persistence of vision display possibilities and applications of the present invention.
  • FIG. 50A shows a large capital "A" created by a rotating light array, which may be used as an indicator for which rotor blade is which, or as a single letter of a larger message spelled out by adjacent rotor blades.
  • FIG. 50B shows a sample message and option interface asking a user to "Confirm GPS Coord.?" and two options, "YES” and “NO.” This display interface may be controlled or interacted with remotely by a user looking up at the display from the ground, and used to communicate directions or preferences to the UAV so as to modify its behavior.
  • FIG. 50C shows just such a status report display, with the message "LOW BATTERY" being conveyed to an observer, followed by a low battery symbol. This may be a crucial communication tool so as to inform a remote operator that the UAV needs to recharge its batteries or battery.
  • FIG. 50D shows a QR code being displayed, suggesting the possibility of machine-to-machine communication via this persistence of vision display.
  • the display could also be used to calibrate or coordinate specific autonomous processes such as landing, and utilize images that may not be interpreted by a human observer, but that hold plenty of information for an enabled image processing machine to interpret.
  • all rotor blades of a UAV may be equipped with light arrays as previously described in any of the three implementations so as to display the message "LANDING" twice on two rotor blades, and the message “STAND BACK” on the remaining two rotor blades.
  • Some or all of the electronics described in FIG. 49 (8401 through 8405) may be housed inside the main body of the UAV, with connections to the light arrays on the rotor blades being accomplished via embedded wiring from the UAV body to the UAV arms, and up to the rotor blade motors and center compartments.
  • While the embodiments shown are for UAV rotor blades, it should be understood that certain embodiments may be applicable to automobile wheels, ceiling fans, wiper blades, similar airplane propellers, or other rotating or oscillating objects.
  • the second implementation of the present invention may utilize optical fibers as in the third implementation (or other total internal reflection techniques) for more complex light paths through varying rotor blade shapes.
  • the examples illustrate the various embodiments and are not intended to limit the present disclosure. Therefore, the description and claims should be interpreted liberally with only such limitation as is necessary in view of the pertinent prior art.
  • FIG. 52 describes an example system embodiment of the present invention, in which three UAVs 9101 communicate with a UAV management system 9102, which in turn may communicate with a mobile device 9103.
  • the three UAVs 9101 would send location data (such as GPS coordinates) specific to each respective UAV wirelessly to the UAV management system 9102.
  • An unknown UAV 9101-1 is also shown, to demonstrate that other UAVs that are not known to the UAV management system 9102 may occupy the same airspace. The UAV management system 9102 should be understood to have the qualities and functionality of a cloud-based server capable of enabling network access among a plurality of computing elements. Additionally, it should be understood that the UAV management system 9102 would be programmed or otherwise enabled to store the location data received from the UAVs 9101.
  • the mobile device 9103 and any other similar actors that may be present in various embodiments not described, should be understood to be any type of mobile device that is at minimum capable of storing and processing data according to instructions given to it by a variable program, while also having at least a camera, electronic visual display, accelerometer and gyroscope. With respect to the present invention, at least one mobile device 9103 would be in communication with the UAV management system 9102 directly or indirectly, via a wireless connection (e.g. via WiFi, Bluetooth, or a cellular network).
  • a wireless connection e.g. via WiFi, Bluetooth, or a cellular network
  • At least one mobile device 9103 would be equipped with a monitor or other display component that would allow the location data of the UAV management system 9102 to be displayed to a user utilizing the mobile device 9103.
  • FIGs 53A-B describe an example scenario of how multiple UAVs may occupy an airspace relative to the example system embodiment described in FIG. 52, in which three UAVs 9101-2, 9101-3, 9101-4 communicate with the UAV management system 9102, and an unknown UAV 9101-1 and known UAV 9101-2 are within the field of view of the camera of the mobile device 9103.
  • FIG. 53A specifically shows the electronic display of the mobile device 9103 denoting the UAV 9101-3 as a known registered UAV with a dotted circle 9202-1, and unique message box 9201-1 in the bottom right corner of the mobile device 9103 display.
  • FIG. 53B specifically shows the electronic display of the mobile device 9103 denoting the UAV 9101-1 as an unknown UAV with a dotted circle 9202-2, and unique message box 9201-2 in the bottom right corner of the mobile device 9103 display.
  • FIGs 54A-B show another implementation of the present invention, in which a UAV management system would be capable of displaying the location of a user's mobile device on the electronic display of the mobile device 9103 in addition to the location of UAVs 9101 known to exist in the airspace surrounding the user.
  • FIG. 54A shows the implementation previously described, with graphical icons 9301 (graphical icon representations of UAVs 9101-2, 9101-3, 9101-4) showing known UAVs registered with the UAV management system of the present invention.
  • the UAV management system may be capable of receiving information from a user's mobile device as to the location of unknown UAVs, and additionally display them on the same interface to the user.
  • This implementation is shown in FIG. 54B, where an unknown UAV 9301-1 (a graphical icon representation of UAV 9101-1) is shown on the mobile device 9103 display.
  • FIG. 55 shows a distanced perspective of a user 9401 with a mobile device 9103, observing four UAVs 9101 as consistently described in FIG. 52, FIGs. 53A-B and FIGs. 54A-B, with lines of perspective 9402 drawn to show the field of view of the user suggested in FIGs. 53A-B.
  • the examples illustrate the various embodiments and are not intended to limit the present disclosure.
  • multiple static devices similar to the mobile devices of the present invention may be used to constantly and consistently observe a variety of airspaces so as to update the UAV management system of the present invention consistently. Therefore, the description and claims should be interpreted liberally with only such limitation as is necessary in view of the pertinent prior art.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure is directed to robotic systems, unmanned aerial vehicles in particular, and various systems and improvements within the field of robotics. Specifically, the present disclosure includes advancements in unmanned vehicle safety management systems and methods for unmanned vehicle safety management implementation and unmanned vehicle communication system and methods for unmanned vehicle communication through short message service (SMS). The use of a non-inertial frame lock for unmanned aerial vehicles, a foldable unmanned aerial vehicle landing platform, and an unmanned aerial vehicle landing and containment station is also provided. Additionally, a method and design for unmanned aerial vehicle delivery optimization and wind mapping using power consumption data of unmanned vehicles is disclosed.

Description

ROBOTIC APPARATUS, SYSTEMS, AND RELATED METHODS
The present disclosure is generally related to robotic systems and related methods, and more particularly is related to unmanned vehicle safety management systems and methods for unmanned vehicle safety management implementation, unmanned vehicle communication system and methods for unmanned vehicle communication through short message service (SMS), a non-inertial frame lock for unmanned aerial vehicles, a foldable unmanned aerial vehicle landing platform; an unmanned aerial vehicle landing and containment station; a method and design for unmanned aerial vehicle delivery optimization; and a system and methods for determining wind vectors using power consumption data from an unmanned vehicle. The disclosure has particular utility for use with drones and will be described in connection with such utility, although other utilities are contemplated.
Robotic devices are increasingly becoming integrated within society, business, and existing technologies. In particular, robotic devices are used throughout the manufacturing industries to manufacture automobiles, machinery, computer components, and countless other goods. Robots are also being relied on increasingly to perform service activities. For example, robots are routinely used for cleaning, for delivering goods in manufacturing facilities, and to perform military activities where there is a risk of human injury or fatality. One type of robotic device with an increasing presence is the unmanned aerial vehicle (UAV), also referred to as a drone or an unmanned aircraft system (UAS), which is routinely used in military settings, commercial settings, and more recently for recreational purposes. While UAVs are steadily improving in their use, there are numerous areas of their use where further enhancements are needed to better utilize UAVs for future use. These areas include, but are not limited to, the following aspects of UAV use:
Safety management:
Many public concerns have arisen regarding the safety of commercial and recreational use of unmanned vehicles. Issues such as unmanned vehicles crashing and damaging property, potentially harming bystanders, and vandalism of the unmanned vehicles themselves while in transit are just a few of the concerns relevant to introducing unmanned vehicles to everyday life. Federal restrictions exist for unmanned vehicle travel, but in order to introduce unmanned vehicles into specific markets such as unmanned vehicle delivery, additional restrictions may be required, and a system may be needed to implement those restrictions quickly and efficiently as needed. Before unmanned vehicles can be used to their full potential, specific steps may need to be taken to offer a display of safe unmanned vehicle practices to relevant government departments (e.g. the TSA, FAA, and other branches of the DOT) and to the general public. Only after an approval by the government and the public may unmanned vehicles operate to their fullest potential, aiding fields such as agriculture, firefighting, infrastructure monitoring, mapping, and package delivery.
Communication:
Communications between unmanned vehicles and their controllers exist in a variety of forms; currently, the unmanned vehicles market is saturated with communication methods that fail to recognize the power usage of said communication as an important factor with respect to the overall power usage of the unmanned vehicle. Current examples of communication methods being used are socket supported data connections and ISM radio band communication (more commonly 2.4GHz, 915MHz, RF spectrum). These methods of communication vary in power consumption, but undoubtedly require a higher level of power consumption to operate normally relative to a majority of the various power requirements for other communication mediums.
Flight stability during landing:
UAV flight requires a UAV to travel from one set point to another, but to also maintain flight stability. One component of flight stability may be to ensure the UAV is able to keep a stable roll and pitch above a fixed position. This is specifically important for a successful UAV landing and to ensure the safety of the UAV and its surroundings. Some UAVs use image detection technology to identify a specific landing place; once the UAV identifies said landing place, the UAV begins its descent to land. Other UAVs may allow another entity to take control of the UAV at the time of landing to provide more guidance in the process. However, many of these technologies do not address the importance of stability throughout the UAV's landing. During the landing phase, the UAV must maintain stability to ensure its safety during landing. If the UAV is unable to keep a stable roll and pitch, one side of the UAV may reach the ground or landing platform before another side and cause an uneven shift in the UAV position. Consequently the UAV may tip over or fall, possibly damaging the UAV itself and/or the UAV's surroundings.
Vehicle landing platform:
Standard methods of delivery have improved over the past years as many large companies offer the option to have one-day deliveries. However, most of these delivery methods involve the process of an item being picked up at a facility, the item being shipped to a specific shipping carrier, and the shipping carrier completing the process by delivering the item to the buyer. The use of UAVs can drastically simplify this process of delivery and reduce the amount of time it takes to complete a delivery. Current UAV delivery methods offer no reliable landing platform for the delivery of an item. In addition, a majority of UAV owners and users lack a consistent and reliable landing platform, an essential piece of hardware for any sophisticated system that employs UAVs. Most often a UAV will not know its exact landing destination, or will arrive at its landing destination, and either be controlled by a remote entity or take further time to assess the area for an appropriate descent. This uncertainty increases the possibility of a landing error, possibly endangering the UAV and any nearby life which may be unaware of the UAV or where it may decide to land.
Unmanned Aerial Vehicle Landing and Containment Station:
With UAV uses expanding beyond military applications, there is an evident need for portable and convenient housing units for UAVs. Additionally, aside from recreation, UAVs have been used for agricultural purposes (e.g. crop inspection), photography, and search and rescue to name a few. As public interest regarding using UAVs to accomplish a variety of tasks increases, there is a greater demand for UAVs, and consequently UAV owners may need to manage multiple UAVs at once.
Unmanned aerial vehicle delivery optimization:
Current delivery systems often embody the process of an item being sent from a manufacturing facility to a shipping center, and then to a customer. The final delivery step of getting a package from a shipping center to a customer is commonly accomplished through manually controlled vehicles such as trucks or motorcycles. Although delivery methods have drastically improved to the point where some companies are able to offer one-day shipping services, delivery can be further improved with the introduction of UAV delivery methods. However, any use of UAVs for the delivery of products must be efficient, safe, and optimized to provide success over manually controlled vehicles.
Wind mapping:
Wind maps, which show wind speed and direction, are used throughout the aeronautics industry. Wind mapping plays an important role in various applications, such as weather broadcasting, aviation, and UAV operation. Wind mapping technologies for UAVs include, but are not limited to, sonic anemometers, hot-wire anemometers, and pitot tubes. However, many of these technologies consume power that could limit the unmanned vehicle's flight time. The usefulness of a UAV is often limited by the battery life of the vehicle.
Thus, a heretofore unaddressed need exists in the industry to address the aforementioned deficiencies and inadequacies.
Embodiments of the disclosure relate to unmanned vehicle safety management system and methods for unmanned vehicle safety management implementation.
In accordance with one embodiment of the present disclosure, a cloud-based, unmanned vehicle safety management system is disclosed for registering and enabling a plurality of unmanned vehicles to accomplish the full extent of their potential tasks while obeying federal and private restrictions. The system comprises a UV communication interface, a network interface, a memory and a microprocessor coupled to the aforementioned components. The safety management system may further comprise a radar to identify locations of any unmanned vehicles within a geographic area. Information related to the geographic area may be preloaded within the memory.
In some embodiments, the UV communication interface communicates to at least one unmanned vehicle via a UV communication link. The UV communication link is a bidirectional wireless link supporting both downlink and uplink
communication such that the unmanned vehicle may transmit signals to or receive signals from the UV communication link. The information transmitted from the unmanned vehicle may include, but is not limited to, the location of the UV, height of the UV, speed of the UV, power reserve information, etc. The information transmitted to the unmanned vehicle may include, but is not limited to, any applicable restricted areas near the UV, any tall buildings along the travelling path of the UV, any nearby landing stations/zones, etc.
In some embodiments, the safety management system assigns travel paths to registered and reporting unmanned vehicles (utilizing path-searching algorithms such as breadth-first, depth-first, A*, or other heuristic methods) to ensure the safety of unmanned vehicles and their surroundings. In one embodiment, manually controlled unmanned vehicles registered with the safety management system may be limited to one or more zones of operation by the same safety management system. The management system accomplishes said tasks by assessing risk factors associated with geographical areas (e.g. area population, ground-based activities, tall buildings or structures, weather) and federal, state, or local restrictions (e.g. restricted airspace or roadways, unmanned vehicle operation or use, quantity of unmanned vehicles) so as to appropriately enforce the height, speed, and travel path of unmanned vehicles registered with the management system.
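By way of a non-limiting illustration only, the following sketch shows one way such a risk-weighted path search could be realized in software. The grid representation of the risk-based topography, the cost weighting, and all identifiers are assumptions made for this sketch and are not prescribed by the present disclosure.

```python
import heapq

def plan_path(risk_grid, start, goal, risk_weight=10.0):
    """Find a low-risk path across a grid of per-cell risk values.

    risk_grid: 2D list of floats; float('inf') marks a restricted zone.
    start, goal: (row, col) tuples.
    risk_weight: how strongly risk is penalized relative to distance.
    Returns a list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(risk_grid), len(risk_grid[0])
    frontier = [(0.0, start, [start])]
    best_cost = {start: 0.0}

    while frontier:
        cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue
            risk = risk_grid[nr][nc]
            if risk == float('inf'):        # restricted zone: never enter
                continue
            new_cost = cost + 1.0 + risk_weight * risk
            if new_cost < best_cost.get((nr, nc), float('inf')):
                best_cost[(nr, nc)] = new_cost
                heapq.heappush(frontier, (new_cost, (nr, nc), path + [(nr, nc)]))
    return None                              # no admissible path exists

# Example: a small area with one restricted cell and one high-risk cell.
grid = [[0.0, 0.0, 0.0],
        [0.0, float('inf'), 0.9],
        [0.0, 0.0, 0.0]]
print(plan_path(grid, (0, 0), (2, 2)))
```

The same structure accepts heuristic variants (e.g. A*) by adding an estimated remaining distance to the priority used by the heap.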
In some embodiments, the management system may also have the ability to permit specific unmanned vehicles to bypass restrictions if necessary, manually control any unmanned vehicles that attempt to illegally bypass set restrictions, and assess the health of any registered and reporting unmanned vehicle. Additionally, the management system may be further equipped to detect illegal unmanned vehicle presence via low level radar, and also give unmanned vehicles commands based on power consumption predictions. All communications accomplished between this safety management system and all unmanned vehicles registered with the safety management system will be accomplished through computer units located on the unmanned vehicles, or indirectly through a separate computer that would facilitate communication between the safety management system and the unmanned vehicle.
Embodiments of the disclosure also relate to unmanned vehicle
communication system and methods for unmanned vehicle communication through SMS.
In accordance with one embodiment of the present disclosure, a method for an unmanned vehicle using a SMS to relay information to and from a separate management system is disclosed. All or portions of the payload/data to be transmitted to and from the unmanned vehicle and management system uses this SMS, specifically payload/data that aids in the operation and location determination (such as GPS data) of the unmanned vehicle. This method of communication offers a new, low power solution for unmanned vehicle communication by utilizing existing cellular technology. The unmanned vehicle and management system are defined as having at least one computer unit/device, with the management system also requiring a user interface and subsequent additional computers and/or systems to accomplish its management tasks.
In accordance with one embodiment of the present disclosure, an unmanned vehicle (UV) capable of communication through SMS is disclosed. The UV may include a telemetry unit to communicate with a cellular network via a bidirectional communication link supporting SMS communication with at least one communication protocol. A processor is coupled to the telemetry unit. A memory device comprises a non-transitory computer-readable medium comprising computer executable instructions that, when executed by the processor, cause the UV to: receive information from cellular network via the telemetry unit; and transmit short messages to the cellular network via the telemetry unit, the short messages being refreshed and transmitted actively from the unmanned vehicle on a repetitive schedule with a repetitive interval.
In accordance with one embodiment of the present disclosure, a method for SMS between an unmanned vehicle and a cellular network is disclosed. A short message is compiled from information obtained from at least one of the current location, speed, altitude, destination, and power reserve of the unmanned vehicle. The short message is sent on a repetitive schedule over a communication path between the UV and the cellular network, the communication path supporting at least one communication protocol, wherein the short messages are sent from the unmanned vehicle using a first communication protocol; if no reply or confirmation is received within a time threshold, the short messages are sent via a second communication protocol different from the first communication protocol; if still no reply or confirmation is received within the time threshold, the short messages are sent via the first communication protocol again, wherein the time threshold is smaller than the repetitive interval, and wherein the short messages are refreshed on the repetitive schedule.
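The following is a minimal sketch of the alternating-protocol retry behavior described above. The message fields, the stand-in transport functions, and the timing constants are illustrative assumptions only; an actual UV would use its own telemetry unit and modem rather than the placeholder functions shown here.

```python
import time

REPEAT_INTERVAL = 30.0   # seconds between refreshed status messages (illustrative)
REPLY_TIMEOUT = 5.0      # reply wait; must be smaller than REPEAT_INTERVAL

def compile_short_message(state):
    """Pack the current UV state into a short text payload."""
    return ("LAT:{lat:.5f},LON:{lon:.5f},ALT:{alt:.0f},"
            "SPD:{spd:.1f},PWR:{pwr:.0f}").format(**state)

def send_with_fallback(message, send_primary, send_secondary, wait_for_reply):
    """Send via the first protocol; on timeout, fall back, then retry the first."""
    for send in (send_primary, send_secondary, send_primary):
        send(message)
        if wait_for_reply(REPLY_TIMEOUT):
            return True
    return False

# Stand-in transports for demonstration; a real UV would use its telemetry unit.
def sms_send(msg): print("SMS out:", msg)
def radio_send(msg): print("RF out:", msg)
def wait_for_ack(timeout): time.sleep(0.01); return False   # simulate no reply

state = {"lat": 47.6205, "lon": -122.3493, "alt": 80.0, "spd": 6.2, "pwr": 54.0}
delivered = send_with_fallback(compile_short_message(state),
                               sms_send, radio_send, wait_for_ack)
print("acknowledged:", delivered)
```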
Embodiments of the disclosure also relate to a UAV non-inertial frame lock system.
In accordance with one embodiment of the present disclosure, a UAV non-inertial frame lock system has a landing platform having at least one electromagnetic radiation (EMR) emitter positioned thereon. A UAV has at least three EMR receivers. A computational unit of the UAV uses at least one EMR signal communicated between the EMR emitter and at least one of the at least three EMR receivers to control at least one of a position, a roll, and a pitch of the UAV during a landing process.
In accordance with another embodiment of the present disclosure, a UAV non-inertial frame lock system has a landing platform having at least three electromagnetic radiation (EMR) receivers positioned thereon. A UAV has at least one EMR emitter. A landing platform computational unit receives at least one EMR signal from the at least one EMR emitter, processes the at least one EMR signal, and communicates it to a UAV computational unit, wherein the UAV computational unit uses at least one processed EMR signal to control at least one of a position, a roll, and a pitch of the UAV during a landing process.
In accordance with one embodiment of the present disclosure, a method of controlling a landing of a UAV with a non-inertial frame lock system is disclosed. A UAV is positioned proximate to a landing platform having at least one electromagnetic radiation (EMR) emitter positioned thereon, wherein the UAV has at least three EMR receivers positioned thereon. At least one of a position, a roll, and a pitch of the UAV is controlled during a landing process with a computational unit of the UAV by using at least one EMR signal communicated between the EMR emitter and at least one of the at least three EMR receivers.
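A minimal sketch of how a UAV computational unit might convert readings from three EMR receivers into roll, pitch, and descent decisions is given below. The receiver layout, gains, and thresholds are assumptions of the sketch, not values taken from the disclosure.

```python
def frame_lock_corrections(intensities, gain=0.5):
    """Compute roll/pitch/descent decisions from three EMR receiver readings.

    intensities: dict with readings at receivers 'front', 'left', 'right'
                 (arbitrary units; roughly equal readings mean the UAV is
                 level and centered over the emitter).
    Returns small corrective commands; the signs, gain, and threshold are
    illustrative only.
    """
    front, left, right = intensities['front'], intensities['left'], intensities['right']
    mean = (front + left + right) / 3.0
    roll_cmd = gain * (right - left) / mean                 # left/right imbalance
    pitch_cmd = gain * (front - (left + right) / 2.0) / mean  # fore/aft imbalance
    # If all three readings are weak, the UAV has drifted off the emitter axis
    # and should correct its position before continuing the descent.
    descend_ok = mean > 0.2
    return {'roll': roll_cmd, 'pitch': pitch_cmd, 'descend': descend_ok}

print(frame_lock_corrections({'front': 0.9, 'left': 1.0, 'right': 0.8}))
```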
Embodiments of the disclosure also relate to a foldable unmanned aerial vehicle landing platform.
In accordance with one embodiment of the present disclosure, an aerial robotic landing apparatus has a landing platform formed from a plurality of portions. A plurality of warning devices are positioned along an outer edge of the landing platform, wherein the plurality of warning devices are activatable by a UAV during a landing of the UAV on the landing platform.
In accordance with one embodiment of the present disclosure, a robot landing platform has a polygon-shaped platform having a plurality of sections. A flexible junction is formed between each of the plurality of sections, wherein the polygon-shaped platform is foldable along each of the flexible junctions.
In accordance with one embodiment of the present disclosure, a method of landing a UAV on a landing platform is disclosed. A communication link is established between an in-flight UAV and a landing platform positioned on a grounded surface. At least one warning on the landing platform is activated by the in-flight UAV through the communication link, wherein the at least one warning is identifiable by a human observer of the landing platform.
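By way of illustration only, the sketch below shows one possible ordering of the warning-then-descend landing sequence. The message format and the link objects are placeholders; the disclosure does not prescribe a particular protocol or timing.

```python
import time

def land_with_warnings(uav_link, platform_link, warning_seconds=5.0):
    """Landing sequence: warn observers on the platform, then command descent.

    uav_link / platform_link: objects exposing a simple send() method; these
    are stand-ins for whatever wireless link the UAV and platform actually use.
    """
    platform_link.send({"command": "activate_warning", "mode": "lights_and_sound"})
    time.sleep(warning_seconds)          # give bystanders time to clear the platform
    uav_link.send({"command": "begin_descent"})
    # In practice the warning would stay active until touchdown is confirmed;
    # it is deactivated immediately here only to keep the sketch short.
    platform_link.send({"command": "deactivate_warning"})

class PrintLink:
    """Trivial link used only to demonstrate the call sequence."""
    def __init__(self, name): self.name = name
    def send(self, msg): print(self.name, "<-", msg)

land_with_warnings(PrintLink("UAV"), PrintLink("platform"), warning_seconds=0.1)
```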
Embodiments of the disclosure also relate to an unmanned aerial vehicle landing and containment station.
In accordance with one embodiment of the present disclosure, a UAV landing station apparatus has support legs contactable to a grounded surface. An electronics bay is supported by the support legs. A support column extends vertically from the electronics bay. A landing platform support is positioned vertically above the electronics bay with the support column. A charging ring is formed on the landing platform.
In accordance with one embodiment of the present disclosure, a UAV landing station apparatus has a plurality of panels forming an enclosable area, wherein at least a portion of the plurality of panels are roof panels. A landing platform is formed as a floor of the enclosable area, wherein the roof panels are movable between open and closed positions, wherein in the open position, the landing platform is accessible for landing a UAV.
In accordance with one embodiment of the present disclosure, a method of supplying power to a UAV is disclosed. A UAV landing station apparatus has a landing platform with a charging ring formed on the landing platform. The UAV is landed on the landing platform, wherein at least one leg of the UAV contacts the charging ring. Power is supplied to the charging ring from a power source to recharge a power source of the UAV.
Embodiments of the disclosure also relate to a method and design for unmanned aerial vehicle delivery optimization.
In accordance with one embodiment of the present disclosure, an unmanned aerial vehicle apparatus has a control board and at least one rotor plate connectable to the control board, wherein the at least one rotor plate has a plurality of rotor arms extending outwards therefrom.
In accordance with one embodiment of the present disclosure, a modular unmanned aerial vehicle apparatus has a control board. A plurality of rotor plates is connected to the control board, wherein the plurality of rotor plates each have at least four rotor arms extending outwards therefrom, wherein each of the rotor arms carries at least one rotor. A plurality of modular, stacked batteries is connected by at least one of: the control board and the plurality of rotor plates. A payload area is included in the UAV.
In accordance with one embodiment of the present disclosure, a method of delivering payloads with an unmanned aerial vehicle is disclosed. A weight of a payload is determined. A delivery distance of the payload is determined. A quantity of UAV rotors required for the determined weight and delivery distance of the payload is selected. And, the payload is delivered.
In accordance with one embodiment of the present disclosure, this disclosure describes an unmanned vehicle using its power consumption data to determine wind vectors from computations made by an external management system using the power consumption data. These vectors are mapped to a graphical user interface, which not only displays the wind but also the unmanned vehicle(s) along with a general map of the area and the unmanned vehicle's planned flight path. This method of computing wind vectors offers a low power solution for determining the direction and intensity of wind with respect to an unmanned vehicle. By utilizing an inherent process of producing power consumption data, the unmanned vehicle is able to save battery power from wind mapping onboard, thus allowing improved overall flight time or allowing more power to be allocated to a specific function of the unmanned vehicle.
In accordance with one embodiment of the present disclosure, when the UAV travels in winds, the wind represents a dynamic variable that accounts for the rate of change of wind velocity as well as the intensity or strength of the wind as determined by its velocity and force. When traveling with wind, the UAV consumes more or less power from the battery, depending on parameters including wind speed and direction. As a result, the power consumption data depends on wind factors and the angle at which the wind approaches the UAV. The power consumption data can be analyzed using external computing by a management system to determine the cause of the varying power consumption. This computing gives sufficient data to determine the direction and intensity of the wind to a specific accuracy.
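One possible way for an external management system to recover a wind direction and relative intensity from power-versus-heading samples is sketched below. The cosine power model and the sample data are assumptions made for the sketch; the disclosure itself does not specify the fitting method.

```python
import math

def estimate_wind(samples):
    """Estimate wind direction and relative intensity from power-vs-heading data.

    samples: list of (heading_deg, power_watts) pairs collected while the UAV
             flies legs in different directions at the same airspeed.
    Model (illustrative): power ~ base + amplitude * cos(heading - upwind),
    i.e. flying into the wind costs the most power. Returns the estimated
    upwind direction in degrees and the amplitude as a relative intensity.
    """
    n = len(samples)
    base = sum(p for _, p in samples) / n
    # Least-squares fit of the cosine component via its cosine/sine projections.
    a = sum((p - base) * math.cos(math.radians(h)) for h, p in samples) * 2.0 / n
    b = sum((p - base) * math.sin(math.radians(h)) for h, p in samples) * 2.0 / n
    amplitude = math.hypot(a, b)
    upwind_heading = math.degrees(math.atan2(b, a)) % 360.0   # costliest heading
    return upwind_heading, amplitude

# Headings every 45 degrees, with power highest when flying due north (into wind).
data = [(h, 200.0 + 30.0 * math.cos(math.radians(h))) for h in range(0, 360, 45)]
heading, strength = estimate_wind(data)
print("wind from ~%.0f deg, relative intensity %.1f W" % (heading, strength))
```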
In accordance with another embodiment of the present disclosure, a graphical user interface (GUI) is disclosed. The GUI includes a map, the UAV, the UAV's flight path, and wind vectors. The details of the flight may be seen in a web browser or other computer window. The background of the GUI may be a map displaying the general area of where the UAV takes off, where the UAV lands, and places in between as shown by the flight path. The map may change as the flight path changes and may be interactive through scrolling, or it may automatically follow the UAVs through tracking. The GUI may display more than one UAV and how each flight path is used to determine wind direction and intensity as shown by the imaging of wind vectors on the map. The unmanned vehicle's flight path may be in the direct path of a wind, which may affect the UAV's power consumption. Based on the power consumption data computations, an external management system is able to compute the direction and intensity of the wind, which is displayed as a visual representation of wind vectors on the GUI.
This disclosure describes a method of compiling a plurality of aerial images taken by unmanned aerial vehicles (UAVs) in order to create a single two-dimensional and/or three-dimensional virtual display or environment in real-time, as defined by the limitations or delay inherent with the methods and system used to accomplish the present invention. Multiple images would be sent from multiple UAVs to an image compilation server, which in turn would perform one or more algorithmic processes so as to construct the virtual 3D environment of the present invention. The image compilation server would then have the ability to send the virtual 3D environment data to an enabled computer that would display the environment in a virtual reality, augmented reality, or other similarly capable display format.
A light display is mounted on a rotor blade of an unmanned aerial vehicle (UAV). The display includes an array of lights. The array is attached to the underside of the rotor blade, or is otherwise embedded inside the rotor blade itself. A separate computational device equipped with all specific rotations per minute (RPM) of the rotor blades of the UAV will transmit the current RPM of the rotor blades to the microprocessor. The microprocessor will then have the ability to transmit signals to the array of lights at a frequency corresponding to the RPM input, so as to
appropriately create a persistence of vision effect.
A microprocessor, mounted on the UAV and individually connected to the array of lights by a cable, includes a memory which stores a plurality of display patterns. The microprocessor modulates the array of lights according to a selected one of the plurality of display patterns and the preprogrammed angular velocity of the rotating blade to form an image using persistence of vision of a viewer. Three different implementations with respect to the array of lights (type and method of lighting) are described, including utilizing LEDs, laser diodes and an array of reflectors, and finally light sources coupled to the ends of optical fibers.
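As a non-limiting illustration, the sketch below shows the timing relationship between rotor RPM and the update rate of the light array, together with a simple way of slicing a stored pattern into per-column on/off states. The column count, the RPM value, and the bitmap format are assumptions of the sketch.

```python
def pov_column_interval(rpm, columns_per_revolution):
    """Time between successive column updates for a persistence-of-vision display.

    rpm: current rotor speed in revolutions per minute (supplied to the
         microprocessor by the flight computer).
    columns_per_revolution: how many angular "pixel columns" make up one image.
    Returns the delay, in seconds, at which the light array must be updated so
    the displayed image stays frozen in place as the blade sweeps around.
    """
    revolutions_per_second = rpm / 60.0
    return 1.0 / (revolutions_per_second * columns_per_revolution)

def columns_for_pattern(pattern):
    """Convert a text bitmap (list of strings) into per-column on/off lists."""
    rows = len(pattern)
    width = len(pattern[0])
    return [[pattern[r][c] == '#' for r in range(rows)] for c in range(width)]

# Example: 2400 RPM and 120 columns per revolution.
print("update every %.6f s" % pov_column_interval(2400, 120))
letter_a = ["..###..",
            ".#...#.",
            "#.....#",
            "#######",
            "#.....#",
            "#.....#"]
print(columns_for_pattern(letter_a)[0])   # on/off states for the first column
```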
Finally, this disclosure describes a method of utilizing a mobile device equipped with a camera and electronic visual display so as to identify UAVs within the field of vision of the mobile device's camera, and quickly cross-check the location data of the UAV(s) in view of the camera with location data of a UAV management system, so as to inform a user via the same mobile device as to whether or not the observed UAV(s) are registered with or known to the UAV management system. Additionally, the mobile device is further described as being capable of displaying the location of UAVs known to the UAV management system, in addition to UAVs unknown to the UAV management system that have been observed by a user implementing the method of the present invention.
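A minimal sketch of the registry cross-check is given below: an observed UAV position is compared against the last reported positions held by the UAV management system, and the UAV is reported as known or unknown. The distance tolerance, the example registry, and the flat-earth distance approximation are assumptions of the sketch.

```python
import math

def identify_observed_uav(observed_lat, observed_lon, registered_uavs, tolerance_m=25.0):
    """Cross-check an observed UAV position against the management system registry.

    registered_uavs: dict mapping UAV id -> (lat, lon) as last reported to the
                     UAV management system.
    Returns (uav_id, distance_m) of the closest registered UAV within the
    tolerance, or (None, None) if the observed UAV appears to be unknown.
    """
    best_id, best_dist = None, None
    for uav_id, (lat, lon) in registered_uavs.items():
        # Equirectangular approximation; adequate over the short ranges involved.
        dx = math.radians(observed_lon - lon) * math.cos(math.radians(lat)) * 6371000.0
        dy = math.radians(observed_lat - lat) * 6371000.0
        dist = math.hypot(dx, dy)
        if best_dist is None or dist < best_dist:
            best_id, best_dist = uav_id, dist
    if best_dist is not None and best_dist <= tolerance_m:
        return best_id, best_dist
    return None, None

registry = {"9101-2": (47.62010, -122.34890), "9101-3": (47.62055, -122.34920)}
print(identify_observed_uav(47.62012, -122.34893, registry))   # known UAV
print(identify_observed_uav(47.62300, -122.35200, registry))   # unknown UAV
```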
Other systems, methods, features, and advantages of the present disclosure will be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims
Reference will be made to exemplary embodiments of the present disclosure that are illustrated in the accompanying figures. Those figures are intended to be illustrative, rather than limiting. Although the present disclosure is generally described in the context of those embodiments, it is not intended by so doing to limit the scope of the present disclosure to the particular features of the embodiments depicted and described.
Figure 1A shows a component diagram of the unmanned vehicle safety management system, in accordance with an embodiment of the present disclosure.
Figure 1B shows the core actors that interact with the safety management system and other relevant systems and actors that may be indirectly involved, in accordance with an embodiment of the present disclosure.
Figures 2A-2C show a graphical user interface used to interact with the safety management system, including establishing various types of zones to create a risk- based topography and changing a variety of settings with respect to the unmanned vehicles and zones involved, in accordance with an embodiment of the present disclosure.
Figure 3 shows a graphical representation of how zones created to establish a risk-based topography would be interpreted by the safety management system, and in turn how a path determined by a pathing algorithm would be sent to an unmanned vehicle, in accordance with an embodiment of the present disclosure.
Figures 4A-4D show a graphical user interface used to identify illegal unmanned vehicles versus vehicles registered with the safety management system through the process of overlaying real-time locations of unmanned vehicles with a low level radar map, in accordance with an embodiment of the present disclosure. Figures 5A-5B show a graphical user interface that displays how a power management system may prevent any dangerous traveling activity based on reading a value of potential directly from a battery of an unmanned vehicle, in accordance with an embodiment of the present disclosure.
Figure 6 shows an environment and method by which unmanned vehicles, computers, and the cloud-based safety management system would communicate, in accordance with an embodiment of the present disclosure.
Figure 7 shows a configuration and specification of components located on an unmanned vehicle registered with and reporting to the safety management system, in accordance with an embodiment of the present disclosure.
Figure 8 shows a process of communication between the external management system, the cellular network, and the unmanned vehicle, in accordance with an embodiment of the present disclosure.
Figure 9 shows a minimum composition of the device that is present in the unmanned vehicle, and may be present in the external management system, in accordance with an embodiment of the present disclosure.
Figure 10 shows an example communication path diagram between the unmanned vehicle, an external management system and a landing platform, in accordance with an embodiment of the present disclosure.
Figure 11 shows one possible configuration of unmanned vehicle positions for a graphical user interface showing the position of unmanned vehicles, cellular towers, and how communication between them occurs, in accordance with an embodiment of the present disclosure.
Figure 12 shows an example implementation circuit diagram for the Non-Inertial Frame Lock, in accordance with an embodiment of the present disclosure.
Figure 13 shows the flow diagram of the computational unit of the Non-Inertial Frame Lock, in accordance with an embodiment of the present disclosure.
Figures 14A-14B show a top-down view of landing platform and imbedded wiring and circuitry with the electromagnetic radiation receivers and/or
electromagnetic radiation emitters, in accordance with an embodiment of the present disclosure.
Figures 15A-15D show a UAV approaching and landing on a compatible platform with the electromagnetic wave receivers located on the UAV, and the electromagnetic radiation emitter located on the platform, in accordance with an embodiment of the present disclosure.
Figures 16A- 16E shows an opposite configuration with respect to the example implementation in Figs. 15A-15D, where the UAV is portrayed having the electromagnetic radiation emitter and the compatible platform having the
electromagnetic radiation receivers, in accordance with an embodiment of the present disclosure.
Figures 17A-17B show a side view of the example embodiment referred to in Figs. 15A-15D, with a UAV aligning itself with a landing platform, and a graphical representation of the electromagnetic radiation being emitted from the landing platform, in accordance with an embodiment of the present disclosure.
Figures 18A-18C shows an example configuration for electromagnetic radiation receivers on a UAV, utilizing a protruding rectangular apparatus to suspend the receivers beneath the UAV, in accordance with an embodiment of the present disclosure.
Figure 19 shows a closer view of a rail fitted with electromagnetic radiation emitters on one of its sides, used specifically for the implementation described in following figures, in accordance with an embodiment of the present disclosure.
Figures 20A-20B show a UAV approaching the rail described in Fig. 19, the UAV restricted to movement in only one plane, in accordance with an embodiment of the present disclosure.
Figures 21A-21C show a UAV hovering very closely above the rail described in Fig. 19, at what may be a minimal distance of interaction between the UAV and the rail for the purposes of the implementation described, in accordance with an embodiment of the present disclosure.
Figure 22 shows an example logic diagram of a landing platform's computational unit, in accordance with an embodiment of the present disclosure.
Figure 23 shows a top-down view of the landing platform design with imbedded wiring and circuitry, in accordance with an embodiment of the present disclosure.
Figures 24A-24G show platform foldability and what the landing platform looks like when completely folded, in accordance with an embodiment of the present disclosure. Figures 25A-25C show a UAV approach to and landing on the landing platform, in accordance with an embodiment of the present disclosure.
Figures 26A-26E show examples of the folding process and final form for the housing of a UAV using the landing platform, in accordance with an embodiment of the present disclosure.
Figures 27A-27B show an example implementation of a cellular
communication device being used to communicate the landing platform's location to the UAV via an external management system, in accordance with an embodiment of the present disclosure.
Figures 28A-28H show a first example station embodiment and its components, in accordance with an embodiment of the present disclosure.
Figures 29A-29C show an unmanned aerial vehicle approaching and landing on the station initially described in FIGS. 28A-28H, in accordance with an embodiment of the present disclosure.
Figures 30A-30B show a second example station embodiment that may additionally house an unmanned aerial vehicle, in accordance with an embodiment of the present disclosure.
Figures 31A-31E show the station initially described in FIGS. 30A-30B in more detail and from multiple perspectives, specifying components and systems within the station, in accordance with an embodiment of the present disclosure.
Figures 32A-32C shows the components and assembly technique of stacking components to create an unmanned aerial vehicle, in accordance with an embodiment of the present disclosure.
Figure 33 shows the algorithmic logic of an example delivery optimization method used to construct an unmanned aerial vehicle as described in FIGS. 32A-32C, in accordance with an embodiment of the present disclosure.
Figure 34 shows a block diagram of an unmanned vehicle in communication with a management system for wind mapping using power consumption data, in accordance an embodiment of the present disclosure.
Figure 35 shows an exemplary configuration of an unmanned vehicle with wind that may be present during the unmanned vehicle's flight and how the wind may approach the unmanned vehicle at various angles, in accordance with an embodiment of the present disclosure. Figure 36 shows a graph of the power consumption data specific to the motors of the unmanned vehicle relative to the direction of the unmanned vehicle according to the example embodiment of FIG. 35, in accordance with an embodiment of the present disclosure.
Figure 37 shows an example graphical user interface (GUI), in accordance with an embodiment of the present disclosure.
Figure 38 shows a block diagram of a wind mapping system, in accordance with an embodiment of the present disclosure.
Figure 39 shows an example flow diagram for wind mapping, in accordance with an embodiment of the present disclosure.
Figures 40A and B show the underside of a UAV, where a stationary locked camera is attached facing the ground.
Figure 41 shows a diagram of generic system elements, specific to an implementation where five UAVs communicate with a single image compilation server, which in turn communicates with a computer.
Figure 42 shows a more specific block diagram of the components comprising a UAV, and its communication with the image compilation server.
Figure 43 shows the algorithmic process by which the image compilation server creates the virtual 3D environment of the present invention, given images acquired by UAVs.
Figures 44A-C show a specific implementation with respect to the general implementation of utilizing five UAVs as in Fig. 41, with the geographic locations of the UAVs on a map, along with the area capable of image capture specific to each UAV.
Figures 45A-I show a UAV above a building over time, with the perspective of the camera on the UAV shown, and the UAV capturing images over a multitude of angles and perspectives.
Figures 46A-C show the first implementation of the present invention, where a rotor blade equipped with an array of individual lights on its bottom face, from multiple perspectives.
Figures 47A-D show a second implementation of the present invention, where a rotor blade is equipped with a horizontal array of laser diodes at the center of rotation, and the laser diode beams are directed along hollow cavities inside the blades to equidistant 45-degree diffuse reflectors to yield an identical effect as the first implementation.
Figures 48 A-D show a third implementation of the present invention, where optical fibers are embedded in the rotor blade so as to guide light sources (located at the center of rotation of the rotor blade, similar to the second implementation) to a diffuse or semi-diffuse emission along equidistant points of the rotor blade.
Figure 49 shows a block diagram of the control circuitry and electronic components of the present invention.
Figures 50A-D show several examples of messages and display possibilities of which the present invention is capable.
Figure 51 shows the underside of a UAV utilizing the rotor blades of the present invention to display a message.
Figure 52 shows a diagram of generic system elements, specific to an implementation where three UAVs communicate with a single UAV management system, which in turn communicates with an appropriately equipped mobile device. An unknown UAV not in communication with the UAV management system is also shown.
Figures 53A-B show the perspective of a user utilizing an example implementation of the present invention, in which four UAVs are present within the view of a user, and the mobile device displays whether or not a selected UAV within the display of the mobile device is registered with the UAV management system of the present invention.
Figures 54A-B show another implementation of the present invention, in which a UAV management system would be capable of displaying the location of a user's mobile device on the display of the mobile device in addition to the location of UAVs known (or potentially unknown) to exist in a given airspace.
Figure 55 shows a distanced perspective of a user with a mobile device, observing four UAVs as consistently described in FIG. 52, FIGs. 53A-B and FIG. 54A-B. Additionally, lines of perspective are drawn to show the specific field of view presented in FIGs. 53A-B.
In the following description, for the purpose of explanation, specific details are set forth in order to provide an understanding of the present disclosure. The present disclosure may, however, be practiced without some or all of these details. The embodiments of the present disclosure described below may be incorporated into a number of different means, components, circuits, devices, and systems. Structures and devices shown in block diagram are illustrative of exemplary embodiments of the present disclosure. Connections between components within the figures are not intended to be limited to direct connections. Instead, connections between components may be modified, re-formatted via intermediary components. When the specification makes reference to "one embodiment" or to "an embodiment", it is intended to mean that a particular feature, structure, characteristic, or function described in connection with the embodiment being discussed is included in at least one contemplated embodiment of the present disclosure. Thus, the appearance of the phrase, "in one embodiment," in different places in the specification does not constitute a plurality of references to a single embodiment of the present disclosure.
Many embodiments of the disclosure may take the form of computer- executable instructions, including algorithms executed by a programmable computer or a computational unit. However, the disclosure can be practiced with other computer system configurations as well. Certain aspects of the disclosure can be embodied in a special-purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer- executable algorithms described below.
With regards to the use of a computer and or computational unit, it is noted that such a device may include a variety of computing hardware and software, including the use of computerized communication networks. For example, the computing systems disclosed may include any type of computing or computational device, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a network server, and the like. Any of the computing systems described herein may include various computing components, such as at least one processor, an input/output (I/O) interface, and a memory. The at least one processor may be implemented as one or more microprocessors, microcomputers,
microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor is configured to fetch and execute computer-readable instructions stored in the memory.
The I/O interface may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface may allow the system to interact with a user directly or through the client devices. Further, the I/O interface may enable communication with other computing devices, such as web servers and external data servers. The I/O interface can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface may include one or more ports for connecting a number of devices to one another or to another server.
The memory used by the computing device may include any computer- readable medium known in the art including, for example, volatile memory , such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable
programmable ROM, flash memories, non-transitory memory, hard disks, optical disks, and magnetic tapes. The memory may include the programmed instructions and data.
The disclosed embodiments may utilize various communication networks. Such a network may be a wireless network, a wired network or a combination thereof. The network can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the Internet, and the like. In one implementation, the computing devices may be implemented in a cloud-based environment and may be accessed by multiple other computing systems through one or more devices communicatively coupled to the computing device through a network. The network may be a dedicated network or a shared network. The shared network may represent an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP),
Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
This disclosure pertains to various improvements in the field of robotics, and particularly in the use of UAVs and other aerial systems. While the disclosure describes the subject matter relative to UAVs, it is noted that the disclosure is not limited to UAVs and may be equally applied to other robotic devices, including manned, unmanned, aerial, and/or grounded robotic devices, or any combination thereof.
Unmanned Vehicle Safety Management:
To assist in the safe use of UAVs for commercial and recreational use, a management system specially configured for unmanned vehicle safety may be implemented. Such a system may contain all data specific to any deployed unmanned vehicles, while also serving as an environment to set restrictions on said unmanned vehicles. A cloud-based management system may not only enable the registering and tracking of unmanned vehicles, but also restrict or limit unmanned vehicles to specific travel configurations. These restrictions may include determining specific areas where it is and is not safe to travel, any speed restrictions, and, in the case of aerial vehicles, operational height restrictions. The goal of such a system may be to provide an administrator with a streamlined environment that makes it easy to perform the tasks mentioned above and also for a common user to view areas of travel restrictions and pertinent data.
Figure 1A shows a component diagram of the unmanned vehicle (UV) safety management system 10 in accordance with an embodiment of the present disclosure. The system 10 comprises a UV communication interface 20, a network interface 30, a memory 50 and a microprocessor 40 coupled to the aforementioned components. The microprocessor 40 may be a microprocessor, a central processing unit (CPU), a digital signal processing (DSP) circuit, a programmable logic controller (PLC), or a combination thereof. In some embodiments, the network interface 30, memory 50 and the microprocessor 40 are all disposed within a server 15. The server may be a cloud-based server. In certain embodiments, some or all of the functionalities described herein as being performed by the unmanned vehicle safety management system 10 may be provided by the microprocessor 40 executing instructions stored on a non-transitory computer-readable medium, such as the memory 50, as shown in Figure 1A.
The UV communication interface 20 communicates to at least one unmanned vehicle 60 via a UV communication link 22. The UV communication link 22 is a bidirectional wireless link supporting both downlink and uplink communication such that the unmanned vehicle 60 may transmit signals to or receive signals from the UV communication link 22. The UV communication link 22 may be a cellular network link, a Wi-Fi link, a Bluetooth link or a telecommunication channel directly or via a satellite. The information transmitted from the unmanned vehicle 60 may include, but is not limited to, the UV identification number, location of the UV, speed of the UV, power reserve information, etc. The power reserve information may be referred to as battery or fuel reserve information. The signals transmitted to the unmanned vehicle 60 may be an alert, a command for landing, changing travel path, etc. The signal transmitted to the unmanned vehicle 60 may include, but is not limited to, any applicable restricted areas near the UV, any tall buildings along the travelling path of the UV, any nearby landing stations/zones, etc. The unmanned vehicle 60 may comprise necessary communication means to communicate to the UV safety management system 10. One exemplary UV component diagram is shown in Figure 7.
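By way of illustration only, the sketch below shows one possible structure for the uplink status report and downlink command exchanged over the UV communication link 22. The field names and the JSON encoding are assumptions made for the sketch and are not prescribed by the disclosure.

```python
import json
import time

def build_uv_report(uv_id, lat, lon, speed_mps, battery_pct):
    """Status report a registered UV sends over the UV communication link."""
    return {
        "uv_id": uv_id,
        "timestamp": time.time(),
        "location": {"lat": lat, "lon": lon},
        "speed_mps": speed_mps,
        "power_reserve_pct": battery_pct,
    }

def build_system_command(uv_id, command, **details):
    """Downlink message from the safety management system to a UV."""
    return {"uv_id": uv_id, "command": command, "details": details}

uplink = build_uv_report("UV-0042", 47.6205, -122.3493, 7.5, 63.0)
downlink = build_system_command("UV-0042", "change_path",
                                waypoints=[(47.6210, -122.3500), (47.6230, -122.3520)])
print(json.dumps(uplink))
print(json.dumps(downlink))
```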
In some embodiments, the safety management system 10 further comprises a radar 55 (typically a low level radar) coupled to the server 15 (or the microprocessor 40) to identify locations of any unmanned vehicles within a geographic area.
Information related to the geographic area may be preloaded within the memory 50 or loaded by a user to the memory 50. The information may include geographical information (e.g. area population, ground-based activities, tall buildings or structures, weather) and regulation information, such as any applicable federal, state, or local restrictions (e.g. restricted airspace or roadways, unmanned vehicle operation or use, quantity of unmanned vehicles) within the geographic area. The information may be retrieved and utilized by the microprocessor to appropriately enforce the height, speed, and travel path of unmanned vehicles registered with the management system.
The network interface 30 communicates with at least one user device 70 via a network communication link 32. The network communication link 32 is a bidirectional link such that a user may transmit commands to or receive data from the server 15 via the user device 70. The network communication link 32 may be a wireless link or a wired link. For example, the network communication link 32 may be a Wi-Fi link, a Bluetooth link, a USB, FireWire, DIN, or e-SATA/SATA connection. In some embodiments, the server 15 hosts a GUI (graphical user interface), which may be visited by a user via a web browser. The GUI receives user input and responds accordingly. A user or customer may use the user device for UV registration, to request permission for UV operation, to report a UV schedule or route, etc.
In some embodiments, the safety management system 10 may be operated by multiple administrators (also referred to as actors). Different actors may have different levels of permission and ability to operate the management system.
FIG. 1B shows a diagram of an example implementation of the present disclosure that describes multiple actors and their given permissions and abilities with respect to a safety management system. It should be noted that each arrow pointing from one actor to another denotes that the actor from which the arrow originates inherits the permissions and abilities of the actor to which the arrow points. The administrator 101 would be an actor who is able to add 102, edit 103, and delete 104 zones. The administrator 101 may also be responsible for granting permission to specific users 111 and entities 115 to travel in restricted zones 106. The administrator 101 may also have the ability to query any of the users 111 or entities 115 and view specific information about them 105. The administrator 101 may view and approve user requests 110 and actions 109. Additionally, the administrator 101 may view the current location 107 and operation history 108 of each unmanned vehicle. All system operational settings, including modifying risk-based topography settings 125, modifying unmanned vehicle parameters 126, and modifying unmanned vehicle schedules 127, would also be abilities and permissions available to an administrator 101. Finally, an administrator 101 would have access to all functions of a landing station/zone manager, including the ability to add 128, edit 129, and delete 130 landing stations/zones.
A customer 115 or a user 111 may have the ability to search 113 and view zone locations 114, which may be important for viewing potential travel routes and determining which zones an unmanned vehicle would not have access to. The user may also create personal profiles 112 for monitoring their unmanned vehicles, or create entities 116 in which groups of people can collaborate to operate and manage an unmanned vehicle. The entities 115 may have the ability to view 118 and modify 117 their entity person 115 roster, which would allow for the easy addition and removal of users. Each customer 115 and entity 116 may have the ability to view their current unmanned vehicle location 107, view the history of operations 108, modify their unmanned vehicle roster 109, and request special permission 110 from the administrator 101.
The traffic manager 119 may oversee all unmanned vehicle traffic in one or more defined areas and/or zones. A traffic manager 119 may have the ability to validate zones and/or requests 120 from the administrator 101 or customers 115, which would serve the purpose of ensuring that no conflicting unmanned vehicle traffic and/or zones with differing assigned risk values exist. The traffic manager 119 may also validate temporary zone reservations 121 in the scenario that a new or old zone requires modification for a specific period of time due to an unexpected event (e.g. high unmanned vehicle traffic, areas affected by a recent natural disaster, a building fire). Additionally, the traffic manager 119 may validate paths 122 prior to unmanned vehicle travel and confirm unmanned vehicle landing success 123. In the case that an unmanned vehicle must bypass specific zone restrictions, the traffic manager 119 may have the ability to allow the unmanned vehicle to bypass said restrictions 124. It should also be noted that the dotted arrow from the traffic manager 119 to the administrator 101 implies that certain permissions and abilities may be inherited by the traffic manager 119 from the administrator 101. These specific inherited permissions and abilities would include risk management (102 through 104), landing station/zone management (128 through 130), and UV management (107 through 110).
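By way of illustration only, the following Python sketch shows one way the actor inheritance of FIG. 1B might be resolved in software; the role names, the permission strings, and the treatment of the dotted (selective) inheritance arrow as full inheritance are assumptions made for the example and are not part of the disclosed system.

```python
# Minimal sketch of resolving actor permissions by following "inherits from" edges,
# loosely mirroring FIG. 1B (role names and permission strings are illustrative).
ROLE_PERMISSIONS = {
    "customer": {"search_zones", "view_zones", "create_profile", "view_uv_location",
                 "view_uv_history", "modify_uv_roster", "request_permission"},
    "landing_manager": {"add_station", "edit_station", "delete_station"},
    "traffic_manager": {"validate_zones", "validate_reservations",
                        "validate_paths", "confirm_landing", "allow_bypass"},
    "administrator": {"add_zone", "edit_zone", "delete_zone", "query_users",
                      "grant_restricted_access", "modify_topography",
                      "modify_uv_parameters", "modify_uv_schedules"},
}

# Each key inherits the permissions of every role it points to (arrows in FIG. 1B);
# the dotted administrator arrow is treated here as full inheritance for simplicity,
# whereas the figure describes it as selective.
ROLE_INHERITS = {
    "administrator": {"landing_manager", "customer"},
    "traffic_manager": {"administrator"},
}

def resolve_permissions(role: str) -> set:
    """Collect a role's own permissions plus those of every role it inherits from."""
    seen, stack, permissions = set(), [role], set()
    while stack:
        current = stack.pop()
        if current in seen:
            continue
        seen.add(current)
        permissions |= ROLE_PERMISSIONS.get(current, set())
        stack.extend(ROLE_INHERITS.get(current, ()))
    return permissions

print(sorted(resolve_permissions("administrator")))
```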
FIG. 2A describes one example implementation of a graphical user interface (GUI) in a web-browser, map-based environment 201 being utilized to create a series of geofences to establish a zone 203-1. All of the zones 203 would be created by clicking the map 201 to insert and move zone vertices to create the zone geometries 203. Additionally, the GUI may allow a user to click on objects in the map display so as to quickly create a zone (such as one of the zones 203) surrounding said object.
Specifications and alterations to the properties of the zones 203 may be accomplished through a smaller options window 202. The options window 202 would have an input feature for specifying a risk value for one of the selected zones 203. This input may be a drop-down menu as shown, or another graphical input interface that allows a user to manually input a specific risk value. Risk values would follow the convention of representing the risk of an unmanned vehicle failing to operate or operating erratically, introducing the potential for damage to persons or property. Risk values may be directly proportional to values such as the cost of potential crash damages or the statistical probability of a collision. Risk values may also be determined by a multitude of other factors related to the area over which a zone is established. Every area will have a risk value in order to properly evaluate an unmanned vehicle's path. The options window 202 may have a feature for establishing different risk values for different times of day with respect to the same zone. For example, zone 203-2 may have an assigned risk value of 1 from 12am to 8am, a risk value of 2 from 8am to 4pm, and a risk value of 3 at all other times, with the lowest number representing the lowest possible risk and the highest number representing the highest possible risk.
Additionally, the options window 202 may also have a feature for establishing temporary zone restrictions for specific times or days, as well as establishing risk values based on variables such as heavy unmanned vehicle traffic or other emergency events. Different bounding, scaling, and maximum and minimum value conventions may be employed to establish a similar risk-based topography system.
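As a purely illustrative sketch of the time-of-day risk convention described above (the schedule format and the use of Python's datetime module are assumptions for the example, not part of the disclosure):

```python
from datetime import time

# Minimal sketch of a zone whose risk value varies by time of day, following the
# zone 203-2 example above.
ZONE_RISK_SCHEDULE = [
    (time(0, 0), time(8, 0), 1),    # 12am-8am: lowest risk
    (time(8, 0), time(16, 0), 2),   # 8am-4pm
]
DEFAULT_RISK = 3                    # all other times: highest risk

def risk_at(t: time) -> int:
    """Return the zone's risk value in effect at local time t."""
    for start, end, risk in ZONE_RISK_SCHEDULE:
        if start <= t < end:
            return risk
    return DEFAULT_RISK

print(risk_at(time(7, 30)))   # 1
print(risk_at(time(22, 0)))   # 3
```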
FIG. 2B shows one example representation of how a customer's GUI 201 may appear. The customer 115 may interface with the safety management system and manage their unmanned vehicle(s) 205, create entities and user profiles, and view zone locations 207. A core feature of the safety management system will be its ability to receive GPS location data specific to unmanned vehicle locations. Within the safety management system interface 201, the customer 115 may view an unmanned vehicle's 205 location either through the graphical representation on the map where an unmanned vehicle 205 may be located, or through another screen, popup, or sidebar that may show the exact latitude and longitude coordinates 207 of the unmanned vehicle 205. These methods of viewing an unmanned vehicle 205 are often useful in recovering lost unmanned vehicles 205 and monitoring an unmanned vehicle's 205 behavior. The customer 115 may also track an unmanned vehicle's 205 history of operations 206, which may be important for commercial entities that wish to review travel times and operations. Additionally, the customer 115 may have the ability to add unmanned vehicles 205 to their roster 206 and request permissions to break certain geo-fencing parameters when appropriate 207.
FIG. 2C shows another example representation of how an administrator's GUI 201 may appear. In this particular figure, a feature of being able to view and query any customers or entities is shown 209. In addition to being able to view any zones 203, landing locations 204, and unmanned vehicles 205, an administrator may view customers' and/or entities' unmanned vehicles 205, the unmanned vehicles' 205 corresponding positions, as well as each unmanned vehicle's 205 history 209.
Additionally the administrator may have the option to grant any customer requests or permissions 209. It should be noted that the exact configurations and placement of the functionalities of the safety management system may be in any form of a screen, popup, or sidebar in the GUI 201, which may vary from the given example figures.
FIG. 3 displays how the risk-based topography and travel planning may be combined to create waypoints 302 and a travel route 219 for an unmanned vehicle 205-3. The safety management system would first process data from the GPS coordinates of the unmanned vehicle 205-3 and the risk values of any relevant areas 203 of the map that may lie between the starting and final locations of the unmanned vehicle 205-3. The system would then convert the GPS coordinates of the unmanned vehicle's 205-3 current location and its final destination 204 into grid coordinates. Then, through a pathing algorithm, the safety management system would be enabled to search for paths between the two grid coordinates and establish waypoints 302 along the path where changes in path direction occur (also known as path derivatives). These waypoints would serve as ideal data to direct an unmanned vehicle along a calculated path. After all possible paths are created, the safety management system may then either automatically select a path (utilizing path-searching algorithms such as breadth-first, depth-first, A*, or other heuristic methods) as the travel route for the unmanned vehicle 205-3, or different paths may be presented to a customer and/or administrator, who would then be given a choice to pick which path they deem suitable.
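By way of illustration, the following Python sketch shows an A*-style search over a risk-weighted grid, with waypoints emitted wherever the path direction changes; the grid contents, the use of each cell's risk value as a step cost, and the Manhattan heuristic are assumptions made for the example rather than the disclosed pathing algorithm.

```python
import heapq

# Risk grid: each cell carries the risk value of the zone covering it (1 = lowest risk).
RISK = [
    [1, 1, 5, 5, 1],
    [1, 1, 5, 5, 1],
    [1, 1, 1, 1, 1],
    [2, 2, 2, 1, 1],
]

def a_star(grid, start, goal):
    """Search for a least-total-risk path between two grid coordinates."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda cell: abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    frontier = [(heuristic(start), 0, start, None)]
    came_from, best_cost = {}, {start: 0}
    while frontier:
        _, cost, cell, parent = heapq.heappop(frontier)
        if cell in came_from:
            continue                                    # already expanded at lower cost
        came_from[cell] = parent
        if cell == goal:
            break
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols:
                new_cost = cost + grid[nxt[0]][nxt[1]]  # risk value acts as the step cost
                if new_cost < best_cost.get(nxt, float("inf")):
                    best_cost[nxt] = new_cost
                    heapq.heappush(frontier, (new_cost + heuristic(nxt), new_cost, nxt, cell))
    path, cell = [], goal
    while cell is not None:                             # walk parent links back to the start
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]

def waypoints(path):
    """Keep only the cells where the direction of travel changes, plus the endpoints."""
    points = [path[0]]
    for prev, cur, nxt in zip(path, path[1:], path[2:]):
        if (cur[0] - prev[0], cur[1] - prev[1]) != (nxt[0] - cur[0], nxt[1] - cur[1]):
            points.append(cur)
    points.append(path[-1])
    return points

# The high-risk block in the upper rows is avoided; only the turning points are kept.
print(waypoints(a_star(RISK, (0, 0), (0, 4))))
```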
To accelerate the path calculation process, the safety management system may have the ability to store previous travel paths that an unmanned vehicle may have already performed. By utilizing prior pathing data, the safety management system may not have to repeat pathing calculations when creating a new travel path for an unmanned vehicle. A server may utilize a graphics processing unit (GPU) to create and store a grid that incorporates established zones when calculating a path. By storing said grid, the GPU may store all traveled paths for future use. A path may be stored as a segmented path as opposed to one continuous path from the start to the end of the travel path. If an unmanned vehicle were to travel along a path segment similar or identical to a previously executed path, and all assigned risk values in the travel area remained constant, the safety management system may be able to reuse the previously executed path or segments of the previously executed path so as to avoid repeating a pathing calculation. In the case that the risk value of one or more zones has changed, the safety management system may simply recalculate the segmented path(s) affected by said zone(s) and reuse all other segmented paths, or, if a recalculation is too difficult or inefficient, the previously executed path data may simply be discarded.
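A minimal, illustrative sketch of the segment-reuse idea follows; the cache key, the zone-risk snapshot, and the recompute callable are assumptions standing in for whatever grid and GPU-backed storage an implementation would actually use.

```python
# Minimal sketch of reusing previously computed path segments when the relevant zone
# risk values have not changed.
segment_cache = {}   # (start, end) -> (risk snapshot, list of waypoints)

def risk_snapshot(zone_risks: dict) -> tuple:
    """Freeze the risk values that were in force when a segment was computed."""
    return tuple(sorted(zone_risks.items()))

def get_segment(start, end, zone_risks, recompute):
    key = (start, end)
    cached = segment_cache.get(key)
    if cached is not None and cached[0] == risk_snapshot(zone_risks):
        return cached[1]                      # reuse: no relevant risk value changed
    waypoints = recompute(start, end)         # otherwise recalculate only this segment
    segment_cache[key] = (risk_snapshot(zone_risks), waypoints)
    return waypoints

# First call computes and stores the segment; the second call reuses it unchanged.
get_segment((0, 0), (0, 4), {"zone_a": 1}, lambda s, e: [s, e])
print(get_segment((0, 0), (0, 4), {"zone_a": 1}, lambda s, e: [s, e]))
```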
The safety management system may be capable of maintaining a schedule for one or more unmanned vehicles. These scheduled unmanned vehicles may be required to repeatedly execute the same or similar paths consistently over time, the efficiency of the safety management system lying in its ability to consistently give the unmanned vehicle(s) directions so as to carry out said consistent paths. The scheduling system described would further enable the safety management system to avoid the risk of heavy unmanned vehicle traffic, and as a result reduce the probability of unmanned vehicles colliding with each other. The safety management system may also autonomously modify unmanned vehicle schedules according to a variety of expected or unexpected factors (of which it is capable of being aware), such as weather, roadways or airspace periodically blocked due to street fairs or community events, police activity, and natural disasters.
As seen in FIG. 4, the administrator(s) will also have the ability to locate any unregistered (or unknown, unidentified) unmanned vehicles that may cause harm. The following is related to the implementation and design of an overlay method in which a system that has the real-time locations of each active unmanned vehicle overlays a low-level radar feed for the purpose of identifying unregistered unmanned vehicles. All unmanned vehicles previously described would be registered with, and have the ability to report to, the safety management system of the present disclosure; thus any unregistered unmanned vehicles that appear in the overlay of real-time locations and low-level radar would be flagged, and the safety management system would be notified. This system provides the advantage of protecting the public from malicious unmanned vehicles. FIG. 4A gives one example implementation of the present disclosure describing the process by which the system may identify illegal unmanned vehicles. To implement this method, the safety management system may receive location data from one or more registered unmanned vehicles and update their locations on a map-based graphical user interface in real time 401. The system may then utilize a feed of the low-level radar 402 to determine the locations of all vehicles in the air. By overlaying the two data feeds 403 through either data analysis techniques, image processing techniques, or other computational methods (performed by a GPU, CPU, or other appropriate computational unit), the two maps may be compared 404 by the safety management system. Upon that comparison, vehicles that are not already reporting their positions to the safety management system, and are therefore not implementing proper regulations, would be rapidly identified 405. Once the illegal unmanned vehicle(s) are identified, the vehicle(s) and their operators may be reported to the appropriate government regulator or responsible organization. This system would also further secure the space in which unmanned vehicles travel by ensuring that if there is a malfunctioning regulator (e.g. if an unmanned vehicle is reported to be in one location but is actually distant from what is reported), then the unmanned vehicle would be flagged as a malfunctioning unmanned vehicle. FIG. 4B shows the initial real-time locations of the unmanned vehicles 406 as they may be seen in the graphical user interface 201. FIG. 4C displays the low-level radar locations of any unmanned vehicles that may or may not be registered 406 and 407 in the same graphical user interface 201. FIG. 4D is the overlaid map of the two previous figures, FIG. 4B and FIG. 4C. The unmanned vehicles 406 that have been flagged as illegal unmanned vehicles 407 may then be uniquely identified in some fashion (e.g. a difference in color, an addition of a border, a unique animation), making it easy for the administrator(s) to locate and identify the illegal system(s) 407.
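The overlay comparison may be sketched, for illustration only, as a nearest-neighbor match between reported positions and radar contacts; the matching tolerance, the data shapes, and the treatment of unmatched registered vehicles as possibly malfunctioning are assumptions made for the example, not the disclosed implementation.

```python
import math

# Minimal sketch of the FIG. 4 overlay: radar contacts that cannot be matched to a
# registered vehicle's reported position within a tolerance are flagged as unregistered,
# and registered vehicles with no nearby radar contact are flagged for review.
MATCH_TOLERANCE_M = 50.0

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def overlay(reported: dict, radar_contacts: list):
    """reported: {uv_id: (x, y)} from telemetry; radar_contacts: [(x, y), ...]."""
    unregistered, matched_ids = [], set()
    for contact in radar_contacts:
        best = min(reported.items(), key=lambda kv: distance(kv[1], contact), default=None)
        if best and distance(best[1], contact) <= MATCH_TOLERANCE_M:
            matched_ids.add(best[0])
        else:
            unregistered.append(contact)       # nobody is reporting from this location
    malfunctioning = [uv for uv in reported if uv not in matched_ids]
    return unregistered, malfunctioning

# One contact matches the registered vehicle; the distant contact is flagged.
print(overlay({"uv-1": (0, 0)}, [(10, 10), (500, 500)]))
```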
In FIG. 5, another potential capability of the safety management system is shown. The safety management system may have the ability to eliminate any potentially dangerous activity related to unsafe unmanned vehicle travel (e.g. travel with a low battery or a damaged unmanned vehicle), especially in an emergency situation. The safety management system may achieve this by first reading a value of potential and current received from an unmanned vehicle battery. Using this information, the safety management system may determine the unmanned vehicle battery charge by reading the differential of potential, as well as estimate whether an unmanned vehicle is damaged by reading the resulting reactive power. The safety management system may also read the GPS coordinates of an unmanned vehicle to determine the location of the unmanned vehicle. This location may be used to fetch the locations of nearby charging stations, all-purpose stations (i.e. a landing station with additional features such as charging, housing, automated repair, etc.), and emergency landing zones. An emergency landing zone would be a zone where an unmanned vehicle may land safely and securely away from the public. The safety management system may then determine whether an unmanned vehicle is safe to travel by checking whether the battery level is below the minimum operating battery potential or whether the unmanned vehicle is damaged. In the event either of the prior conditions is true, the safety management system may send a signal to the unmanned vehicle interface communicating a landing location and a request to land, to prevent the unmanned vehicle from traveling in adverse conditions.
In a non-emergency scenario, the safety management system may similarly determine the minimum operating battery potential by calculating the distance from an unmanned vehicle's current position to a desired charging station or all-purpose station. If the calculated distance is greater than a certain threshold, then the minimum operating battery potential would be set to a standard minimal operating battery potential. If the distance is between the two thresholds, then the minimum operating battery potential would be set to be inversely proportional to the distance. If the distance is lower than both thresholds, then the minimum operating battery potential would be set to a constant standard minimal operating battery potential. The safety management system may send a signal to the unmanned vehicle interface to land by setting a land request as well as by sending the final destination location through any digital communication (synchronous or asynchronous) to the unmanned vehicle interface. The final destination locations may be differentiated by their core functionality. Charging stations would be secure stations where an unmanned vehicle would arrive and recharge its battery until the battery is at a sufficient level for travel. The all-purpose station may be a station where the unmanned vehicle would be repaired in case of failure, recharged, or picked up by the owner. It should be noted that these are only a few example landing stations; the user may have the option to use these specific stations or any combination of these stations, with or without the addition of other stations that may have different functionalities. In the case of a station or zone already being occupied, the station or zone may be removed from the safety management system GUI to indicate its unavailability and identified as "in use" in the safety management system database, which may also help to ensure that the safety management system GUI does not display irrelevant information.
FIG. 5A shows the logic flow that the safety management system may utilize to implement the previously described battery use planning, which begins with an unmanned vehicle powering on 501. The safety management system may then determine whether the battery potential is appropriate with respect to the minimal operating battery potential 502. If the battery potential is appropriate, then the safety management system would continually calculate the location of the nearest station and the new minimal operating battery potential 504. In the event that the battery potential is less than the minimal operating battery potential, the safety management system may then send a safety stop and arrival request and the GPS location of the nearest safe unmanned vehicle station location 503.
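For illustration, the following sketch combines the FIG. 5A decision with a distance-dependent minimum operating battery potential in the spirit of FIG. 5B; the linear interpolation between the inner and outer radii and all numeric values are assumptions, since the text above describes the relation only qualitatively.

```python
# Minimal sketch of a distance-dependent minimum operating battery potential and the
# FIG. 5A check. Following FIG. 5B, the required minimum shrinks as the vehicle nears
# a station between the outer (510-1) and inner (510-2) radii, and is constant outside
# that band; every constant below is an illustrative assumption.
STANDARD_MIN_POTENTIAL_V = 11.1    # outside the outer radial zone
FLOOR_MIN_POTENTIAL_V = 10.5       # inside the inner radial zone
OUTER_RADIUS_M = 2000.0
INNER_RADIUS_M = 200.0

def min_operating_potential(distance_to_station_m: float) -> float:
    if distance_to_station_m >= OUTER_RADIUS_M:
        return STANDARD_MIN_POTENTIAL_V
    if distance_to_station_m <= INNER_RADIUS_M:
        return FLOOR_MIN_POTENTIAL_V
    # Between the two radii, interpolate so the required reserve shrinks as the
    # vehicle nears the station.
    span = (distance_to_station_m - INNER_RADIUS_M) / (OUTER_RADIUS_M - INNER_RADIUS_M)
    return FLOOR_MIN_POTENTIAL_V + span * (STANDARD_MIN_POTENTIAL_V - FLOOR_MIN_POTENTIAL_V)

def check_battery(potential_v: float, distance_to_station_m: float) -> str:
    """FIG. 5A logic: land at the nearest station if below the minimum, else keep flying."""
    if potential_v < min_operating_potential(distance_to_station_m):
        return "send safety stop + nearest station GPS location"   # step 503
    return "continue; keep recalculating nearest station"          # step 504

print(check_battery(10.8, 1500.0))
```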
FIG. 5B describes the key elements and concepts that comprise the safety management system. The figure shows an unmanned vehicle 505 that is entering radial zones 510 centered at recharging stations 507, emergency landing zones 508, or all-purpose unmanned vehicle stations 509. When the unmanned vehicle 505 is outside any radial zones it obeys a standard minimal operating battery potential, and upon entering the radial zone 510-1, the unmanned vehicle's 505 minimal operating battery potential is then reduced relative to its position with respect to the recharging station 507; this would also apply to an unmanned vehicle approaching an all-purpose station 509. As the unmanned vehicle 505 enters the inner circle 510-2, the relative reduction gradient may stop and the minimal operating battery potential would be constant within this zone. If the unmanned vehicle's 505 battery potential drops below the minimal operating battery potential, then the safety management system may give the location of the optimal unmanned vehicle station position and a request to occupy said station. If a recharging station 507 or an all-purpose station 509 is not within traveling distance, a designated emergency zone 508 may be found and routed to by the unmanned vehicle 505.
FIG. 6 describes how the safety management system 601 may communicate with one or more unmanned vehicles 602, computers 603, and manually controlled unmanned vehicles 604. The safety management system 601 should be understood to have the qualities and functionality of a cloud-based server capable of enabling network access among a plurality of computing elements. Independent computers 603 would have access to and control unmanned vehicles 602 via the safety management system 601. Unmanned vehicles 602 would be specially equipped with additional hardware and software so as to communicate directly and wirelessly with the safety management system 601, while manually controlled unmanned vehicles 604 would not. FIG. 6 shows one example implementation of the present disclosure in which a computer 603-2 is connected to a manually controlled unmanned vehicle 604 via a communication link 605, which may be either wireless (e.g. via Wi-Fi, Bluetooth, or a cellular network) or wired (e.g. via a USB, FireWire, DIN, or e-SATA/SATA connection), so as to send and receive data to and from the safety management system 601 indirectly. An indirect connection as described, facilitated by a computer 603-2, would allow the manually controlled unmanned vehicle 604 to receive new or updated restriction data from the safety management system 601, and may also permit the manually controlled unmanned vehicle 604 to transmit any pertinent or necessary data (e.g. power consumption data, location data, speed, etc.) it may have stored to the safety management system 601.
Referring to FIG. 7, an unmanned vehicle 602 registered with the safety management system 601 may include a telemetry unit 701, a processor 702, a resource manager 703, a memory device 704, and mechanical components 705. These five components would compose the unmanned vehicle interface as previously mentioned. The memory device 704 may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device 704 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device). The memory device 704 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present disclosure. For example, the memory device 704 could be configured to buffer input data for processing by the processor 702. Additionally or alternatively, the memory device 704 could be configured to store instructions for execution by the processor 702.
The processor 702 may be embodied in a number of different ways. For example, the processor 702 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In an example embodiment, the processor 702 may be configured to execute instructions stored in the memory device 704 or otherwise accessible to the processor 702. Alternatively or additionally, the processor 702 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 702 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present disclosure while configured accordingly. Thus, for example, when the processor 702 is embodied as an ASIC, FPGA or the like, the processor 702 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 702 is embodied as an executor of software instructions, the instructions may specifically configure the processor 702 to perform the algorithms and/or operations necessary for the unmanned vehicle to operate successfully. However, in some cases, the processor 702 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present disclosure, and may entail further configuration of the processor 702. The processor 702 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 702.
Meanwhile, the telemetry unit 701 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to a safety management system 706 and/or any other device or module in communication with the apparatus. In this regard, the telemetry unit 701 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. In some environments, the telemetry unit 701 may alternatively or also support wired communication. As such, for example, the telemetry unit 701 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
In an example embodiment, the processor 702 may be embodied as, include or otherwise control a resource manager 703. The resource manager 703 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., the processor 702 operating under software control, the processor 702 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof), thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 703 as described herein. Thus, in implementations in which software is employed, a device or circuitry (e.g., the processor 702) executing the software forms the structure associated with such means. In the example embodiment of the present disclosure, the resource manager 703 would be configured to control all mechanical components 705 of the unmanned vehicle. Upon receiving instructions from the processor 702, the resource manager 703 would interpret and translate the instructions into actions performed by the mechanical components 705. The mechanical components 705 of the unmanned vehicle may include but are not limited to rotors, LEDs, rudders, mechanical or robotic arms, retractable cables, gimbals, or other mechanical apparatus.
Unmanned vehicle communication systems and methods through SMS
As put forth in the background, managing power consumption in robotic devices such as UAVs is essential to providing successful, long-term use of the vehicles. Conventional communication methods and systems require significantly higher levels of power consumption to operate normally relative to the power requirements of SMS. Reducing the power requirement for an unmanned vehicle inherently allows it to operate for longer periods of time or to dedicate more power to one of its functions. Any such power improvement is essential in engineering more efficient unmanned vehicles and is applicable to all of their possible functions and assigned tasks. Overall, the use of an SMS communication system simplifies the entire communication process and inherently reduces the risk of error being introduced to the system. The use of SMS offers unique capabilities that other communication methods do not. Specifically, the use of SMS is supported by its current inherent positive characteristics. These characteristics include but are not limited to a wider area of coverage, a strong existing infrastructure, and, with the general population's increasing transition to data-supported messaging services, lighter SMS usage on networks. Whereas other communication networks such as 4G LTE data links do not provide coverage in certain areas of the United States, SMS is readily available in a majority of inhabited areas.
Figure 8 illustrates a generic system diagram in which the management system 1101, the cellular network 1102 and the unmanned vehicle 1103 exist within a communication environment in which embodiments of the present disclosure may be employed. As shown in FIG. 8, the management system 1101 and the unmanned vehicle 1103 are in communication with each other via the cellular network 1102, through a communication path 1104 between the unmanned vehicle 1103 and the cellular network 1102 and a communication path 1105 between the management system 1101 and the cellular network 1102. The communication paths 1104 and 1105 are bidirectional, comprise at least one channel, and support the SMS function to allow SMS communication. The communication paths 1104 and 1105 may or may not be the same, or under the same communication protocol. In some cases, embodiments of the present disclosure may further include one or more network devices with which the management system 1101 and the unmanned vehicle 1103 may communicate to provide, request and/or receive information.
The cellular network 1102 may include a collection of various different nodes, devices or functions that may be in communication with each other via corresponding wired and/or wireless interfaces. As such, the cellular network 1102 should be understood to be an example of a broad view of certain elements of the system and not an all-inclusive or detailed view of the cellular network 1102. The unmanned vehicle 1103 and the management system 1101 may also communicate via device-to-device (D2D) communication as part of the attempt to communicate via the cellular network 1102, and each may include a telemetry unit for transmitting signals to and receiving signals from a cellular base site, which could be, for example, a base station that is part of one or more cellular or mobile networks, or an access point that may be coupled to a data network, such as a local area network (LAN), a metropolitan area network (MAN), and/or a wide area network (WAN), such as the Internet. In some embodiments, the cellular network may utilize one or more mobile access mechanisms such as wideband code division multiple access (W-CDMA), CDMA2000, global system for mobile communications (GSM), general packet radio service (GPRS), long term evolution (LTE), LTE Advanced and/or other similar mechanisms.
Referring to FIG. 9, the unmanned vehicle 1103 may include a telemetry unit 1201, a processor 1202, a resource manager 1203, a memory device 1204, and mechanical components 1205. The memory device 1204 may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device 1204 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device). The memory device 1204 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present disclosure. For example, the memory device 1204 could be configured to buffer input data for processing by the processor 1202. Additionally or alternatively, the memory device 1204 could be configured to store instructions for execution by the processor 1202.
The processor 1202 may be embodied in a number of different ways. For example, the processor 1202 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In an example embodiment, the processor 1202 may be configured to execute instructions stored in the memory device 1204 or otherwise accessible to the processor 1202. Alternatively or additionally, the processor 1202 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 1202 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present disclosure while configured accordingly. Thus, for example, when the processor 1202 is embodied as an ASIC, FPGA or the like, the processor 1202 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 1202 is embodied as an executor of software instructions, the instructions may specifically configure the processor 1202 to perform the algorithms and/or operations necessary for the unmanned vehicle 1103 to operate successfully. However, in some cases, the processor 1202 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present disclosure, and may entail further configuration of the processor 1202. The processor 1202 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 1202.
Meanwhile, the telemetry unit 1201 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to the cellular network 1102 and/or any other device or module in communication with the apparatus. In this regard, the telemetry unit 1201 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. In some environments, the telemetry unit 1201 may alternatively or also support wired communication. As such, for example, the telemetry unit 1201 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
In some embodiments, the telemetry unit 1201 may support SMS communication through the communication path 1104 to the cellular network 1102 via at least one communication protocol, such as GSM, CDMA, TDMA, LTE, etc. The telemetry unit 1201 receives information from the cellular network through the communication path 1104. The information may be sent using SMS or other communication formats. During the operation of the unmanned vehicle 1103, the telemetry unit 1201 may be configured to send short messages to the cellular network actively or upon request from the management system 1101. The short messages may be refreshed and sent on a repetitive schedule. The message may be compiled from information selected from the current location, speed, altitude, destination, and power reserve information of the unmanned vehicle 1103. Alternatively, the message may be designated or compiled according to a specific request from the management system. In some embodiments, the repetitive interval may be a predetermined parameter stored within the memory device 1204. In some embodiments, the repetitive interval may be dynamically determined based on at least one parameter selected from the current location, speed, altitude, destination, and power reserve information of the unmanned vehicle 1103. For example, when the power reserve of the unmanned vehicle 1103 is below a reserve threshold, the telemetry unit 1201 may reduce the repetitive interval to send short messages more frequently. In some embodiments, the repetitive interval may be determined by a request from the management system 1101.
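A minimal sketch of composing the periodic status message and shortening the reporting interval when the power reserve is low follows; the field names, the thresholds, and the single-segment (160-character) handling are assumptions made for the example, not part of the disclosure.

```python
import json

# Minimal sketch of the dynamic reporting interval and the SMS status payload.
DEFAULT_INTERVAL_S = 60
LOW_RESERVE_INTERVAL_S = 15
RESERVE_THRESHOLD = 0.20           # 20 % of capacity (illustrative)

def report_interval(power_reserve: float) -> int:
    """Report more frequently once the power reserve drops below the threshold."""
    return LOW_RESERVE_INTERVAL_S if power_reserve < RESERVE_THRESHOLD else DEFAULT_INTERVAL_S

def compose_status_sms(uv_id, lat, lon, speed_mps, altitude_m, power_reserve) -> str:
    payload = {"id": uv_id, "lat": round(lat, 5), "lon": round(lon, 5),
               "spd": round(speed_mps, 1), "alt": round(altitude_m, 1),
               "pwr": round(power_reserve, 2)}
    message = json.dumps(payload, separators=(",", ":"))
    return message[:160]           # stay within a single SMS segment

print(report_interval(0.15))
print(compose_status_sms("UV-42", 37.77493, -122.41942, 12.3, 80.0, 0.15))
```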
In some embodiments, the telemetry unit 1201 may support SMS communication through the communication path 1104 via multiple communication protocols. For example, the telemetry unit 1201 may send a short message using a first communication protocol. If no reply or confirmation is received within a time threshold, the telemetry unit 1201 then sends the short message via a second communication protocol different from the first communication protocol. If still no reply or confirmation is received within the time threshold, the telemetry unit 1201 may switch back to the first communication protocol to resend the short message. The time threshold may be a predetermined parameter stored within the memory device 1204. In some embodiments, the time threshold is configured to be smaller than the repetitive interval such that the same short message may be resent before the short message to be sent is refreshed.
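The protocol-fallback behaviour may be sketched, for illustration, as follows; the send and acknowledgment callables are placeholders for whatever modem interface the telemetry unit 1201 exposes, and the numeric thresholds are assumptions.

```python
import time

# Minimal sketch of alternating protocols until a confirmation arrives or the
# message is due to be refreshed.
TIME_THRESHOLD_S = 5
REPETITIVE_INTERVAL_S = 60          # the threshold is kept smaller than this on purpose

def send_with_fallback(message, protocols, send, wait_for_ack):
    """protocols: ordered list of protocol names, e.g. ["GSM", "LTE"]."""
    deadline = time.monotonic() + REPETITIVE_INTERVAL_S
    index = 0
    while time.monotonic() < deadline:
        protocol = protocols[index % len(protocols)]
        send(protocol, message)
        if wait_for_ack(protocol, timeout_s=TIME_THRESHOLD_S):
            return protocol          # acknowledged on this protocol
        index += 1                   # no confirmation: try the other protocol
    return None                      # give up once the message would be refreshed anyway
```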
In some embodiments, when the power reserve of the unmanned vehicle 1103 is below the reserve threshold, the telemetry unit 1201 is configured to send the short message via both communication protocols and/or reduce the repetitive interval to send short messages more frequently. In some embodiments, when the processor 1202 detects any mechanical breakdown, the telemetry unit 1201 is configured to send the short message via both communication protocols and also reduce the repetitive interval to send short messages more frequently.
In an example embodiment, the processor 1202 may be embodied as, include or otherwise control a resource manager 1203. The resource manager 1203 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 1202 operating under software control, the processor 1202 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 1203 as described herein. Thus, in implementations in which software is employed, a device or circuitry (e.g., the processor 1202) executing the software forms the structure associated with such means.
In the example embodiment of the present disclosure, the resource manager 1203 would be configured to control all mechanical components 1205 of the unmanned vehicle 1103. Upon receiving instructions from the processor 1202, the resource manager 1203 would interpret and translate the instructions into actions performed by the mechanical components 1205. The mechanical components 1205 of the unmanned vehicle 1103 may include but are not limited to rotors, rudders, mechanical or robotic arms, retractable cables, gimbals, or other mechanical apparatus. In some cases, the processor 1202 may be enabled to configure subframes of the cellular (or other communication interface) downlink signaling structure. Furthermore, the processor 1202 may provide information to the unmanned vehicle 1103 and the management system 1101 (or other machines) via the telemetry unit 1201 indicating the configuration of the signaling structure so that the unmanned vehicle 1103 and the management system 1101 may utilize the corresponding signaling structure accordingly.
FIG. 10 illustrates one system that may be implemented for SMS communication between a landing module's 1301 telemetry unit 1302, an unmanned vehicle 1103 and a management system 1101 via a cellular network 1102. Through this system, the external management system 1101 will relay SMS data to the telemetry unit 1201 of the unmanned vehicle 1103. The telemetry unit 1201 will then transmit the data to the processor 1202, which will then decide whether the data needs to be stored in the memory device 1204 or sent to the telemetry unit 1302 via the cellular network 1102. Specifically, the telemetry unit 1201 on the unmanned vehicle 1103 may communicate with the telemetry unit 1302 that is located on a landing module 1301. Implementations that may arise as a result of this communication between the unmanned vehicle 1103 and the landing module 1301 include but are not limited to informing the landing module 1301 that the unmanned vehicle 1103 is traveling toward or performing an approach to the landing module 1301.
FIG. 11 is a graphical representation of how multiple unmanned vehicles 1404 may communicate with a cellular network via SMS by communicating with the closest cellular towers 1403. Each individual unmanned vehicle 1404 should be understood to possess all the capabilities and components described of the unmanned vehicle 1103 in prior figures. As shown in the present example of the present disclosure, each individual unmanned vehicle 1404 will communicate with a cellular tower 1403 according to which area of ideal connectivity it is located in, as defined by the connectivity boundaries 1401. For example, as seen in the example configuration, unmanned vehicles 1404-1 and 1404-2 would communicate with the cellular tower 1403-1, seeing as both lie within the connectivity boundary 1401 corresponding to the cellular tower 1403-1. Similarly, the unmanned vehicles 1404-3 and 1404-4 would communicate with the cellular tower 1403-2, since both lie within the connectivity boundary 1402 corresponding to the cellular tower 1403-2. Each cellular tower 1403 and its corresponding connectivity boundary define a base station at the center of a geographic cell, of which all cellular networks are comprised. The unmanned vehicles 1404 will then have the ability to efficiently communicate across this larger cellular network and with other systems or devices connected to the same network. Examples of said other systems and devices include but are not limited to landing modules, other unmanned vehicles 1404, management systems, personal cellular communications devices, and relay stations.
Non-Inertial Frame Lock for Unmanned Aerial Vehicles
As put forth in the background, it is important for the UAV to maintain stability during a landing procedure. The UAV may accomplish stability by utilizing a system of receivers and emitters that can constrain the UAV in an inertial frame. In accordance with the present disclosure, electromagnetic radiation (EMR) receivers and EMR emitters (e.g. radiation in the infrared spectrum) may be used to allow the UAV to maintain stability during landing. Specifically, the EMR receivers and emitters may allow the UAV's computational unit to calculate the proper adjustments needed for the UAV's flight in order for the UAV's position, roll and pitch to be maintained above a corresponding wave emitter. The electromagnetic radiation would lock the UAV in a non-inertial frame and allow the UAV to land precisely. Consequently, through this technology, the UAV will have the ability to land on moving platforms as well as on static platforms. This ability greatly expands the range of locations available for landing the UAV and gives the UAV more flexibility in determining overall flight paths over the conventional art.
FIG. 12 shows a circuit diagram for the Non-Inertial Frame Lock, in accordance with one embodiment of the present disclosure. The basic circuit of the non-inertial frame origin consists of a battery 2002 powering a collection of EMR emitters 2001. Once powered, the EMR emitters 2001 emit EMR which is received by a matrix of photo-transistors 2003. The photo-transistors 2003 decrease in resistance to the battery's 2005 current as the received intensity increases. This is measured by an increase in potential across the corresponding resistor 2004. These potentials are then input into a high-speed multiplexer 2006, which is controlled by a computational unit 2007. The output from the multiplexer 2006 is input into the computational unit 2007. The computational unit 2007 then sends the directional communication to the aerial vehicle controller 2008 based on the perceived symmetries in the potentials.
FIG. 13 shows the flow diagram of the computational unit 2007 as seen in FIG. 12. Initially, the computational unit 2007 may power on 2009 and may immediately begin to read through the receiver values 2010 by varying the multiplexer 2006 input. With this multiplexer 2006 input, the computational unit 2007 may then calculate the transmission center's 2011 position relative to the UAV's center and create a vector from the UAV's center to the transmission center 2011. Utilizing the transmission center 2011, the computational unit 2007 may then split the vector into Cartesian directional components 2012 and transmit the components as flight commands 2013.
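For illustration only, the loop of FIG. 13 may be sketched as an intensity-weighted centroid computation; the receiver geometry, the gain, and the read and command interfaces below are assumptions rather than the disclosed circuit behaviour.

```python
# Minimal sketch of the computational unit's loop: sample each photo-transistor
# through the multiplexer, estimate the emitter's offset as an intensity-weighted
# centroid of the receiver positions, and split the vector into flight commands.
RECEIVER_POSITIONS = [               # (x, y) offsets of each receiver from the UAV center, metres
    (0.10, 0.00), (-0.10, 0.00), (0.00, 0.10), (0.00, -0.10),
]

def estimate_offset(read_receiver):
    """read_receiver(i) returns the measured potential across resistor i (FIG. 12)."""
    intensities = [max(read_receiver(i), 0.0) for i in range(len(RECEIVER_POSITIONS))]
    total = sum(intensities)
    if total == 0:
        return (0.0, 0.0)            # no signal: hold position
    # Higher intensity on one side means the emitter lies toward that side, so the
    # weighted centroid of receiver positions points from the UAV center to the emitter.
    x = sum(w * px for w, (px, _) in zip(intensities, RECEIVER_POSITIONS)) / total
    y = sum(w * py for w, (_, py) in zip(intensities, RECEIVER_POSITIONS)) / total
    return (x, y)

def flight_command(read_receiver, gain=1.0):
    dx, dy = estimate_offset(read_receiver)
    return {"roll": gain * dx, "pitch": gain * dy}   # Cartesian components of the correction

# Example with a fake read: a stronger signal on the +x receiver pulls the UAV toward +x.
print(flight_command(lambda i: [1.0, 0.2, 0.5, 0.5][i]))
```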
FIGS. 14A-14B show a top-down view of the landing platform 2301. The landing platform 2301 may be hexagonal in shape, as shown in the example figure, or it may also be octagonal, nonagonal, or have any number of possible sides. Along each side of the landing platform 2301 shape, including the outer edge, is embedded wire. These embedded wires may be used to enable the interface, communication, and other receivers to operate and integrate successfully with the landing platform 2301. In FIG. 14A there is a single EMR emitter 2303 located above a power source 2302. It should be noted that the graphical representation of the power source 2302 is concealed by the graphical representation of the EMR emitter 2303, and should henceforth be recognized in the following figures as the smaller, hexagonal center object of the landing platform 2301. In FIG. 14B, there are multiple EMR receivers 2304 located around the landing platform 2301. The number of EMR emitters 2303 and EMR receivers 2304 may vary for different implementations of the present disclosure. At the center of the landing platform 2301 may be a power source 2302 which may provide power to the interface and any other receivers belonging to the landing platform 2301. The power source 2302 may be a battery.
In order for the non-inertial frame lock to operate successfully, a system of at least three EMR receivers and one EMR emitter must be employed. Any increase in the number of EMR receivers may result in an increase in the accuracy of the present disclosure. Similarly, any increase in the number of EMR emitters 2303 increases the EMR signal strength communicated to the EMR receivers 2304, increasing precision. Both example embodiments described in FIG. 14A and FIG. 14B are viable configurations; however, the configuration described in FIG. 14A may be ideal in that it does not require additional communication between the UAV computational unit 2007 and an additional computational unit integrated into the landing platform 2301. That is, the example implementation represented in FIG. 14B may require the landing platform 2301 to possess or integrate a computational unit that could interpret the EMR receivers' 2304 information and in turn send that message to the UAV via another EMR communication method, or another type of communication means. The implementation in FIG. 14A may be optimal in that all computations with respect to the present disclosure can be made solely by the UAV computational unit 2007 without the need for additional computational units.
FIGS. 15A-15D show the configuration of the EMR receivers 2404 and the EMR emitter 2303 with respect to the UAV 2401 and a landing platform 2301 for the UAV 2401. FIG. 15A depicts the EMR receivers 2404 on the underside of the UAV 2401 and the EMR emitter 2303 on the landing platform 2301. As the UAV 2401 approaches the landing platform 2301, the EMR emitter 2303 may begin to emit EMR which will trigger the EMR receivers 2404 on the UAV 2401. Following the EMR receivers' 2404 reception of the EMR signal, the algorithmic processes described in FIG. 12 and FIG. 13, or another process, may be employed to correct the position, roll and pitch of the UAV as needed. FIG. 15B is a close-up of the landing platform 2301 configuration with the EMR emitter 2303 shown specifically in the center of the landing platform 2301. FIG. 15C shows a possible configuration of the EMR receivers 2404 located on the bottom of the UAV 2401, although it is noted that the shape and design of the UAV 2401 may vary and the location of the EMR receivers 2404 may also vary. FIG. 15D depicts a close-up of one possible configuration on the UAV 2401, showing the EMR receivers 2404 positioned on the underside of the body or housing of the UAV 2401. As shown, located around the base of the UAV 2401 are multiple EMR receivers 2404-1 through 2404-6; however, the exact locations of the receivers may vary. In these figures the number of EMR receivers and emitters is only one possible implementation of the present disclosure. The number of EMR emitters may be increased or decreased with the effect of creating a stronger or weaker EMR signal strength. Similarly, the number of EMR receivers may be increased or decreased with the effect of increasing or decreasing receiver precision.
FIGS. 16A-16E depict a similar system to FIGS. 15A-15D, but where the EMR receivers 2304 are located on the landing platform 2301 and the EMR emitters 2502 are located on the UAV 2401. In FIG. 16A, as the UAV 2401 approaches the landing platform 2301, the EMR emitters 2502 will begin to emit EMR, which will trigger the EMR receivers 2304 on the landing platform 2301. FIG. 16B is a close-up of one possible configuration of the EMR receivers 2304 located on the landing platform 2301. Specifically, FIG. 16B shows how the EMR receivers 2304 are connected to the embedded wiring of the landing platform 2301, for example, where each of the EMR receivers 2304 is positioned between portions of the landing platform 2301 and connected through the embedded wiring to the power source 2302. FIG. 16C is a close-up of FIG. 16B and specifically shows the EMR receiver 2304-3 and its placement on the landing platform 2301, connected to the embedded wiring. FIG. 16D shows one possible configuration of the EMR emitters 2502 located on an underside of the body of the UAV 2401. FIG. 16E shows a close-up of FIG. 16D with the EMR emitters 2502 located directly on the bottom of the body of the UAV 2401. In these figures the number of EMR receivers and emitters is only one possible implementation of the present disclosure. The number of receivers and emitters may be increased or decreased with the effect of creating a stronger or weaker EMR signal strength. This configuration yields the same functionality as the first example as shown in FIGS. 15A-15D.
FIGS. 17A-17B show how the EMR technology may work with respect to the UAV 2401. In FIG. 17A the UAV 2401 is seen approaching the landing platform 2301. In this configuration, the EMR emitters 2304 are located on the landing platform 2301 and the EMR receivers are located on the UAV 2401; however, the emitters and receivers may be configured with the emitters on the UAV 2401 and the receivers on the landing platform 2301, without changing the desired effect of the present disclosure. As the UAV 2401 approaches the landing platform 2301, EMR 2601 is emitted from the EMR emitter 2304 and the receivers on the UAV 2401 begin to adjust the position of the UAV 2401 with respect to the frame lock's center, or the center of the EMR emitters 2602. In FIG. 17B, the UAV 2401 can be seen substantially above the center of the EMR emitters 2602, which is a result of the EMR receivers adjusting the roll and pitch as well as the overall position of the UAV 2401 with respect to the EMR emitter 2304. The circuitry used in the frame lock may measure the intensities of the EMR 2601 to calculate the corrective movements needed for the UAV to position itself above the EMR emitter center 2602 and maintain a level of consistency in roll and pitch. With this control, the UAV 2401 may descend to land on the landing platform 2301 while maintaining the desired control over the UAV 2401.
FIGS. 18A-18C show a UAV 2401 with a descending rectangular apparatus suspending EMR receivers 2701 from the UAV 2401. In this design, the EMR receivers 2701-1 through 2701-5 may be positioned substantially vertically of one another on the descending structure. FIG. 18A shows a close-up perspective of the underside of the UAV 2401 and the individual EMR receivers 2701, with the lowest receiver being the EMR receiver 2701-5 and the highest receiver being the EMR receiver 2701-1, relative to the ground beneath the UAV 2401. FIG. 18B shows the left side view of the UAV 2401 and the attached EMR receivers 2701. FIG. 18C shows the back side view of the UAV 2401 and the attached EMR receivers 2701.
FIG. 19 shows a rail 2801 raised by two supports and fitted with functional EMR emitters 2802 on its right face. The EMR emitters 2802 may also share two planes, one plane, or may all be configured to be in a horizontal line. In some designs, it may be important to locate all EMR emitters 2802 within a single plane. The EMR emitters 2802 may be independently powered or an external power source may supply all of the EMR emitters 2802. If an external power source were used, it may be located in or outside of the rail 2801.
FIGS. 20A-20B show the UAV 2401 equipped with EMR receivers 2701 approaching the rail 2801 and the attached EMR emitters 2802. These figures show an implementation of the present disclosure where the motion of the UAV 2401 could be restricted to a single plane based on the linearly-aligned EMR receivers 2701 and the EMR emitters 2802 positioned within a single plane. This would be accomplished by the UAV 2401 adjusting its position in an attempt to maintain a constant average intensity across all of the EMR receivers 2701, minimizing if not eliminating movement to the right and left of the rail. By recognizing the differences in intensities, the symmetry between the signal receptions, and the strength of the intensities received by the EMR receivers 2701, the UAV 2401 could make further precise computations so as to execute more precise positioning when moving up, down, forwards and backwards. FIG. 20A shows a top-right corner view of the described implementation. FIG. 20B shows a top side view of the described implementation.
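A purely illustrative sketch of the rail-following corrections follows; the intensity set-point, the gains, and the split of the receivers 2701 into upper and lower groups are assumptions made for the example, not the disclosed control law.

```python
# Minimal sketch of the rail-following behaviour in FIGS. 20A-21C: hold the average
# received intensity at a set-point (distance from the rail) and keep the intensity
# distribution symmetric between upper and lower receivers (height along the rail face).
INTENSITY_SETPOINT = 0.6
K_DISTANCE = 1.0
K_SYMMETRY = 1.0

def rail_corrections(intensities):
    """intensities: readings from receivers 2701-1 (top) through 2701-5 (bottom)."""
    average = sum(intensities) / len(intensities)
    upper = sum(intensities[:2]) / 2.0
    lower = sum(intensities[-2:]) / 2.0
    return {
        # Too little signal overall: move toward the rail; too much: move away.
        "toward_rail": K_DISTANCE * (INTENSITY_SETPOINT - average),
        # Stronger signal on the upper receivers means the emitter line sits above
        # the array's center, so a positive value commands a climb.
        "vertical": K_SYMMETRY * (upper - lower),
    }

# Stronger readings toward the bottom of the array: descend slightly, hold distance.
print(rail_corrections([0.50, 0.55, 0.60, 0.65, 0.70]))
```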
FIGS. 21A-21C depict one of the ideal standards of precision the UAV 2401 could accomplish when the implementation described in FIGS. 20A-20B is executed correctly. The UAV 2401 could hover just above the rail 2801 and move along the rail 2801 at the height depicted or at various other heights. This implementation is an ideal example of how the UAV 2401 may be utilized in a commercial assembly line setting. In such a setting the UAV 2401 may be further equipped or enabled to add, remove, or inspect objects on an assembly line. An assembly line implementation could entail one or more uses of a similar rail 2801 or may directly integrate the EMR emitters 2802. FIG. 21A shows a top-right view of the implementation and FIG. 21B shows a front view of the implementation. FIG. 21C shows a top view of the implementation. In all figures, it can be seen how the UAV 2401 can be positioned to align relative to the rail 2801 using the EMR receivers 2701 and the EMR emitters 2802.
In the figures describing the present disclosure, the numbers of EMR receivers and emitters shown are only examples of possible implementations of the present disclosure. The number of EMR emitters may be increased or decreased with the effect of creating a stronger or weaker EMR signal strength. Similarly, the number of EMR receivers may be increased or decreased with the effect of increasing or decreasing receiver precision. The number of EMR emitters and receivers does not change the basis of the novelty which the present disclosure embodies.
Vehicle Landing Platform
The use of a landing platform may decrease the chance of error and danger by specifying a location for the UAV to land before reaching its final destination.
Additionally, it may offer built-in features to improve the quality and safety of the landing process, such as ensuring that the landing location is an open and free area, warning any observers to avoid the landing location, and providing landing guidance to the UAV. A foldable and portable landing platform may enable easy and convenient movement of the landing location, should it need to be changed.
FIG. 22 shows an example logic diagram of the landing platform's computational unit if a computational unit were employed. The landing platform begins at start 3101 and proceeds to determine if it is receiving a connection attempt 3102 from the UAV. If there is no connection attempt, the computational unit continues to expect a connection via the same decision 3102. Once a connection attempt is received from the UAV, the computational unit establishes a secure and consistent connection 3103 with the UAV. Once the connection 3103 is established, the computational unit activates the warning capabilities of the landing platform (e.g. changes in LED lighting, sound emitted from a speaker, etc.) and warns observers to stand back 3104 from the landing platform. The computational unit then attempts to communicate confirmation to the UAV that the landing platform is ready 3105. If the UAV receives confirmation 3106, it proceeds to performing a guided landing 3107. If the UAV does not receive confirmation 3106, the landing platform's computational unit continues to attempt to communicate the confirmation 3105 to the UAV. The UAV's guided landing 3107 may entail one or more methods or combinations of methods to safely land the UAV in the center of the landing platform.
These methods may include but are not limited to utilizing a system of electromagnetic wave emitters and receivers, utilizing a camera on the UAV or landing platform, and establishing a physical connection of some type between the UAV and landing platform. In a majority of potential implementations, the UAV may be capable of accomplishing most if not all of the computations necessary to accomplish a safe landing in the center of the landing platform, and as a result may be equipped with one or more transceivers and other electronic devices to accomplish all necessary landing-related tasks. If the UAV communicates to the computational unit of the landing platform that the UAV has successfully landed 3108, the computational unit ends any further communication and computation 3109. If the UAV does not communicate a successful landing 3108, the UAV continues to perform its guided landing process 3107 until such a communication 3108 is received.
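As a minimal sketch of the FIG. 22 flow, the Python loop below walks a computational unit through the connection, warning, confirmation, and landing-report steps. The "platform", "uav_link", and "session" helpers are hypothetical placeholders assumed for illustration; they are not interfaces defined by the disclosure.

# Sketch of the FIG. 22 logic flow for the landing platform's computational
# unit. Connection, warning, and messaging helpers are hypothetical.
import time

def run_landing_sequence(platform, uav_link, poll_s=0.5):
    # 3102: wait for a connection attempt from the UAV
    while not platform.connection_attempted():
        time.sleep(poll_s)

    # 3103: establish a secure, persistent connection
    session = platform.establish_connection(uav_link)

    # 3104: activate warning devices (LEDs, speaker) to warn observers
    platform.activate_warnings()

    # 3105/3106: repeat the "platform ready" confirmation until acknowledged
    while not session.send_ready_confirmation():
        time.sleep(poll_s)

    # 3107/3108: the UAV performs its guided landing until it reports success
    while not session.landing_reported():
        time.sleep(poll_s)

    # 3109: end further communication and computation
    platform.deactivate_warnings()
    session.close()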
FIG. 23 shows a top-down view of the landing platform 3201. The landing platform 3201 may have a polygon shape when in an unfolded configuration, such as being hexagonal in shape as shown in the example figure, or may also be a symmetrical polygon comprised of more or fewer sides than the current implementation (octagonal, nonagonal, etc.). The landing platform 3201 may be foldable along junctions between a plurality of portions of the landing platform 3201 and may be folded along the junctions. As shown in FIG. 23, each portion may be triangular in shape and the landing platform 3201 may be folded up along the junctions into a thicker layer of equally symmetrical and stacking triangles. In other words, the folded configuration of the landing platform 3201 may have substantially the same footprint as the footprint of a section of the landing platform 3201.
Wires may be embedded along each side and potential folding crease of the landing platform 3201. These wires may be necessary to enable the landing platform's 3201 interface(s), communication, and any other devices located in or on the landing platform 3201. Along an outer edge or along each corner of the landing platform 3201 there may be warning devices, such as LEDs 3203, audible devices, or other warning devices. The warning devices may be used for indication purposes, which may include but are not limited to warning observers that a UAV is landing, aiding a UAV in its landing process, and displaying certain colors, patterns, or combinations thereof that would allow an observer to interpret the state of or a message communicated by the landing platform 3201. At the center of the landing platform may be a power source 3202 which may provide power to the landing platform's interface(s) and any of the other devices located in or on the landing platform 3201. One device not depicted in FIG. 23 that may be implemented is a computational unit specifically for the landing platform 3201. This computational unit for the landing platform 3201, when mentioned henceforth, may be located in the same housing as the power source 3202 or elsewhere in or on the landing platform 3201. This computational unit would be comprised of all software and hardware necessary to accomplish the tasks and processes of the landing platform 3201 described henceforth.
FIGS. 24A-24G show one implementation of the present disclosure, specific to how the landing platform 3201 may be folded, demonstrating its portability. FIG. 24A shows a side view of the landing platform 3201 when it is unfolded and in position for UAV landing and/or take off. By utilizing one or more sensors such as optical range finders, ultrasonic range finders, and/or GPS units, the landing platform 3201 will be able to indicate whether it is in position for landing, using its locational data to determine if it is in an open location and in an unfolded state. FIGS. 24B, 24C, and 24D show different perspectives of an intermediate stage in the folding process of the landing platform 3201. FIG. 24E is a top-right diagonal view of the landing platform's 3201 final shape, a slightly extruded triangle. FIG. 24F shows a front perspective of this final triangle form. FIG. 24G shows the landing platform 3201 from the side once completely folded, offering a detailed perspective of the resulting thickness of the final shape due to the stacked nature of each of the landing platform's 3201 triangular faces.
FIGS. 25A-25C demonstrate an example of a UAV 3401 approaching the landing platform 3201 and successfully landing on the landing platform 3201. In other potential implementations, the UAV 3401 may approach the landing platform 3201 with an additional payload, not pictured in the drawings of the present disclosure. In FIG. 25A, the UAV 3401 is seen approaching the landing platform 3201 from a distance, which it may accomplish through communication with a computational unit on the landing platform 3201. After communicating the location of the landing platform 3201 to the UAV 3401 or communicating the location of the UAV 3401 to the landing platform 3201 through the use of sensors and/or GPS units, the UAV 3401 will know the location of the landing platform 3201 and the algorithm described in FIG. 22 will be employed. The landing platform 3201 may also be located in a predetermined location and the UAV 3401 would be aware of said location prior to travel.
Prior to landing, the UAV 3401 may also receive commands from an external system (e.g. a computational unit on the landing platform or another external system with wireless communication abilities) to establish a flight restriction zone centered at the landing platform's GPS location. This restriction zone may be established with a switch on the landing platform 3201 accessible by the user. The landing platform 3201 may also come with different radii of restriction. FIG. 25B shows the UAV preparing for landing on the landing platform 3201. In order for the UAV 3401 to achieve high-precision landing, the landing platform 3201 may utilize various techniques including but not limited to a method of constraining the UAV 3401 to a non-inertial frame, cellular positioning techniques, and techniques involving optical range finders. Cellular positioning techniques refer to any method embodying the idea of using a cellular-communication-capable device to communicate with a cellular base station and Location Service Client to determine and assist with measurements for position estimation. FIG. 25C depicts the UAV 3401 landing on the landing platform 3201. The methods the UAV 3401 uses to achieve takeoff and travel to another destination may include similar or the same techniques employed when approaching and landing on the landing platform 3201. After the UAV 3401 takes off, its next location may be another landing platform 3201 or the original operator of the UAV 3401.
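For illustration only, one simple way to enforce such a restriction zone is to compare the UAV's GPS fix against the platform's coordinates and a configured radius, as in the Python sketch below. The 50 m default radius, the example coordinates, and the function names are assumptions, not values from the disclosure.

# Illustrative check for a flight-restriction zone centered on the landing
# platform's GPS coordinates (haversine distance on a spherical Earth).
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000.0

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def inside_restriction_zone(uav_fix, platform_fix, radius_m=50.0):
    """True when the UAV is within the configured restriction radius."""
    return distance_m(*uav_fix, *platform_fix) <= radius_m

# Example: a UAV roughly 25-30 m east of the platform is inside a 50 m zone.
print(inside_restriction_zone((37.7750, -122.4191), (37.7750, -122.4194)))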
FIGS. 26A-26E show a step-by-step process of folding the landing platform 3201 into a viable housing unit for the UAV 3401, having an interior space or cavity in which the UAV 3401 can be stored. FIG. 26A depicts the landing platform 3201 laid flat with one triangular face of the landing platform 3201 folded up, perpendicular to the ground plane. In this housing implementation of the present disclosure, the landing platform 3201 is hexagonal in shape, but may be a polygon comprised of more or fewer sides with a similar applicable folding technique. By folding one of the triangular faces of the landing platform 3201, it may become a pentagon-based pyramid. In FIG. 26B this is shown with the side that is being folded held up to demonstrate its position for reference throughout the folding process. In FIG. 26C a landing platform 3201 face adjacent to the perpendicularly folded face is brought on top of the face across from it, effectively overlapping the two faces and creating a square pyramid (with the reference side still folded up perpendicularly once more to demonstrate its position for folding clarification). FIG. 26D shows the final result after folding the landing platform 3201 into its final shape, with the face that was previously folded perpendicularly to its adjacent faces for reference now folded down to overlap and become parallel with another adjacent face. FIG. 26E shows the underside of the housing unit as shown in FIG. 26D with a UAV 3401 underneath the newly formed square pyramid created by the folding of the landing platform 3201. This configuration provides ample space for a UAV 3401 to reside underneath the square pyramid.
FIGS. 27A-27B show how a UAV 3401 may acquire the necessary positional information of the landing platform 3201 so as to travel 3605 to the landing platform's 3201 position. FIG. 27A demonstrates one implementation whereby a commercialized cellular communication device 3601 is held above the landing platform's 3201 location, and sends GPS data specific to said location via a common cellular communication method 3602-1. In FIG. 27B, this GPS data sent via the wireless communication method 3602-1 is received by an external management system 3603. The external management system 3603 would be equipped with all necessary software and hardware to receive the GPS data from the cellular device 3601, and relay said data to the appropriate UAV 3401 via a wireless communication method 3602-2, which may be similar to or the same as the wireless communication method 3602-1. Once the UAV 3401 successfully receives GPS data specific to the landing platform's 3201 location from the external management system 3603, the UAV 3401 would then travel 3605 to the landing platform's 3201 location, and the algorithm described in FIG. 22 may be employed by the landing platform 3201.
Unmanned Aerial Vehicle Landing and Containment Station
In order to reduce the pressure of managing and organizing several UAVs, unique stations could be used to combine the landing, housing, and charging components of the UAV in one compact place. Through autonomous control, the landing unit will be able to accomplish all of this on its own without further help from a separate entity. The stations may be able to autonomously communicate with a UAV to accomplish landing and take-off. Several stations, combined with their autonomous capabilities, would enable efficient and easy UAV management for station users. Additionally, through the addition of a possible roofing system, such a station would not only be able to provide a secure location for UAV landing, but also provide protection for the UAV from any possible tampering or outside forces, such as weather.
FIGS. 28A-28H show an example configuration of a station 4101 and some of its independent components. FIG. 28A shows how the station 4101 may be comprised of various main parts, including: a landing platform 4102, an electronics bay 4103, a charging ring 4104, support legs 4105, and a support column 4106. The electronics bay 4103 may be positioned underneath the landing platform 4102 and may serve primarily as a sub-platform for storing and/or holding various electronics for the support of a UAV and the mechanical system of the station 4101 as a whole. This may include any units necessary for the charging ring's 4104 functionality and/or any other parts including but not limited to wires, batteries, sensors, and various circuitry. Additionally, the electronics bay 4103 may be further configured to hold other objects, such as spare mechanical parts for the landing station 4101 or a UAV utilizing the same landing station 4101. A computer and compatible graphical user interface may also be incorporated into the electronics bay 4103, with the ability to be accessed from in or outside the electronics bay 4103. With regards to the addition of such a system, outside accessibility may entail the graphical user interface (at minimum) being on a sliding apparatus integrated into the electronics bay 4103, such that the graphical user interface or other components can be easily accessed by an operator. While the figures depict the electronics bay 4103 as being a platform, the edges of the landing platform 4102 and electronics bay 4103 may be connected by metal sheets or another durable material so as to enclose the contents of the electronics bay 4103 as a compartment. These enclosing sides may be further designed to be easily removable or passable to allow for modifications or additions to the electronics bay 4103 as needed.
The station 4101 may be supported by a framework, such as two legs 4105, which may have holes drilled through bottom components so as to enable fastenings to the ground underneath. These fasteners may be composed of concrete screws, welds, nut and bolt configurations, spikes, and various other devices and methods for solidifying the foundations of the station in a grounded, stable position. These fasteners may be further configured to be easily removable so as to render the station 4101 more portable, and may also be used to level the station 4101 to the ground surface. FIG. 28B shows a top view of the station 4101. In this configuration, the legs 4105 utilize screws as fasteners.
FIG. 28C shows a view of the station from the side, with the charging ring 4104 encompassing the landing pad center. FIG. 28D shows a bottom view of the station 4101. The station 4101 is elevated by support legs 4105; however, the support legs may assume a different shape or aesthetic design depending on the overall size, shape, and weight of the station 4101. FIG. 28E shows an independent view of the support column 4107, located in previous and following figures in between the landing platform 4102 and electronics bay 4103. The support column vertically connects the landing platform 4102 and electronics bay 4103, while still providing ample space above the electronics bay 4103 for storing necessary devices as previously described. FIG. 28F shows an independent view of one design of one of the support legs 4105.
FIG. 28G shows an independent view of one half of the charging ring 4104, which may be used to charge or recharge a power source of a UAV, and the surrounding capture ring 4106 located on the station 4101. The capture ring 4106 may be used to help guide the UAV to the appropriate position so it can electrically connect with the charging ring 4104. The capture ring 4106 may be ideally shaped so as to account for any possible error in a UAV's landing position. For example, the charging ring 4104 may be positioned as the floor of an angled slot within the capture ring 4106, which has curved and slanted faces collimating to a center half-circle. When the UAV lands on the landing platform, the feet of the UAV may be guided to slide down the angled sides of the capture ring 4106 and make an effective and guaranteed connection with the charging ring 4104 located in a floor thereof.
FIG. 28H shows a top view of the same half of the capture ring 4106 and charging ring 4104 configuration. When combining two halves of the capture ring 4106 and the charging ring 4104 pictured to obtain the entire capture and charging ring component of the station 4101, each half of the charging ring 4104 may have a strong opposite polarity relative to the other half, and as such the charging rings may act as the terminals of a traditional battery. The power source inducing this change in polarity may be located in the electronics bay 4103. If such a polarity difference were induced to allow the charging ring 4104 to act as the terminals of a battery, the feet of a UAV utilizing the station 4101 may be equipped with a charging apparatus so as to enable a charging connection between the station 4101 and the UAV. Alternatively, the charging ring 4104 and the feet of the UAV may utilize another charging technique that is enabled through their direct contact.
FIGS. 29A-29C describe an example process of how a UAV 4201 may land on the station 4101. FIG. 29A depicts a UAV 4201 interacting with electromagnetic radiation emitters 4202, which may be added to the landing platform 4102 of the station 4101. These emitters 4202 would broadcast radiation to the UAV 4201, which would be enabled to interpret and utilize these signals to aid in the landing process. In this specific implementation, the UAV 4201 may be equipped with multiple electromagnetic radiation receivers that would receive the electromagnetic radiation intensity from the emitters 4202 and translate those intensities into corrective action with respect to roll and pitch so as to modify its position and achieve the most precise landing possible. FIG. 29B shows another example system in which the UAV 4201 may land on the station 4101 utilizing a camera 4203 located on the landing platform 4102. The camera 4203 would be powered by and communicate with devices located in the electronics bay 4103. One or more of these devices in the electronics bay 4103 would be enabled to receive data input from the camera 4203 and wirelessly send that data to the UAV 4201. The UAV 4201 may then have the ability to perform corrective action with respect to roll and pitch so as to modify its position using the data from the camera 4203 to accomplish a precise landing onto the landing platform 4102. FIG. 29C shows the UAV 4201 and station 4101 after a successful landing following an approach as described in FIG. 29A or FIG. 29B, with the feet of the UAV 4201 making contact with the charging ring 4104.
FIGS. 30A-30B describe one of many alternative embodiments of the present disclosure for a landing and containment station. The main differences between the design of this alternative embodiment and the design described in FIGS. 28A-29C are the inclusion of a roof system and housing components, thus making this design capable of storing a UAV internally and shielding it from outside exposure, as well as a charging system involving electromagnets. FIG. 30A shows a holistic view of the alternative station 4301. The entire structure is held up by the base support 4302, which may be held stable by fasteners through the fastening holes 4303. These fasteners may be composed of concrete screws, welds, nut and bolt configurations, spikes, and other various devices and methods for solidifying the foundations of the station 4301. The base support 4302 may have an I-shaped structure with a hollow core that can store network and computational devices. The base support 4302 may also assume a different shape or aesthetic design depending on the overall size, shape, and weight of the station 4301.
Connected to the base support 4302 is the landing platform 4102. Adjacently connected are the station walls 4305, which are in turn flexibly connected to the triangular roof components 4306. Located on the surface of the roof components 4306 may be solar panels 4307, which may provide power for the station 4301. The flexibly connected roof components 4306 serve the purpose of opening and closing the main bay or interior compartment of the station 4301 in which the UAV is kept. This main bay is defined as the enclosure created by the landing platform 4102, the station walls 4305, and the roof components 4306. The main bay, combined with any machinery that may induce the movement of the main bay, will henceforth be described as the dynamic roof management system. This system may perform some or all of its features or actions completely autonomously.
FIG. 30B shows a top view of the station 4301. Here, the landing platform 4102 can be seen to have embedded charging rings 4308 and a larger conical impression 4309. One UAV attraction element and method that may be employed is the inclusion of electromagnets, which may consist of coiled wire around an inductive element fastened beneath the charging rings 4308 on the landing platform 4102. The electromagnets may serve two functions: bringing a UAV down into a centered, stable position in the final inches of descent, and holding a UAV down during take-off while the motors of the UAV reach a flight-operational state. The latter function may prevent the UAV from coming into contact with the roof components 4306 that may still protrude from the station 4301 when the UAV attempts to take off from the landing platform 4102. The electromagnets may also ensure correct polarity during charging.
The charging rings 4308 may charge the UAV by using low-resistive rings that are attached to the landing platform 4102, centered on the center axis of the electromagnets. The polarity of the rings would relate to the polarity of the electromagnetic field, ensuring that potentially battery-harming cross-polarity does not occur. As the UAV descends onto the landing platform 4102, the electromagnet polarity of electromagnets that may exist on a UAV would match the permanent magnetic polarity scheme of the landing platform 4102. Once the UAV is down and secured to the landing platform 4102, current may flow from the low-resistive rings to low-resistive permanent magnet casings (components of the charging rings 4308), and then to the appropriate power source circuitry. All of these functions may be automated by a jointly connected microcontroller, which may be housed inside the base support 4302. The use of these components together may automate and optimize the safety of the UAV 4201 during take-off, landing, and charging. The larger conical impression 4309 may increase the strength of the electromagnets that may exist underneath the charging rings 4308 with respect to corresponding electromagnets that may exist on a UAV. The charging rings 4308 may be modified in size, shape, or position on the landing platform 4102 in order to accommodate or be compatible with other UAVs. Consequently, the conical impression 4309 may be subject to similar changes in size, shape, and position in order to serve the same purpose as previously described.
FIGS. 31A-31E show the station 4301 in more detail, in addition to the dynamic roof management system and its important features, each contributing to the overall tasks and performance of the station 4301. FIG. 31A shows an initial view of the station 4301 with two of the closest station walls 4305 and corresponding roof components 4306 omitted so as to provide better detail of the inside components of the station 4301. It is important to note that all the components seen in FIGS. 31A-31E from this perspective may exist symmetrically on the sides of the station 4301 that are omitted for the sake of detailed perspective. FIG. 31A shows a first such perspective of a slide 4402, fastening section 4403, and the rack drive train section 4404. A spring angular closing system may also be implemented to automatically raise or lower the roof components 4306 as needed. This spring angular closing section may provide the functionality of angling the roof components 4306 to meet at a center point only when the station walls 4305 rise to a certain point. The purpose of implementing this system would be to allow for the opening and closing of the roof components 4306 without the use of heavy motors at the pivot points where the roof components 4306 flexibly connect with the station walls 4305. This also allows the station walls 4305 of the main bay to retract below the landing platform 4102 without ever coming into contact with the UAV 4401. A spring angular closing system may be implemented by using two springs for each of the four vertical sides of the station 4301; one linear spring would be attached to the landing platform 4102 and one of the four roof components 4306, while another radial spring would be attached to the same roof component 4306 and the exterior of the closest of the station walls 4305. These springs would have the appropriate properties of tension and strength necessary to accomplish the appropriate timing for pushing and pulling the roof components 4306 into an opened or closed position.
FIG. 31B and FIG. 31C show the same perspective and components as FIG. 31A, but with the station 4301 in a midway position of opening the roof components 4306 and lowering the station walls 4305 in FIG. 31B, and the roof components 4306 fully open and station walls 4305 fully lowered in FIG. 31C. FIG. 31D shows a top side view of the station 4301 with the roof components 4306 fully open so as to allow for the UAV 4401 to take off. Additionally, FIG. 31A, FIG. 31B, FIG. 31C, and FIG. 31E show the location of where electromagnets 4411 may exist underneath the charging rings 4308 located on the landing platform 4102.
The slide 4402 and fastening section 4403 may stabilize a rack 4405, which is essential to facilitating the linear motion of the station walls 4305. This linear retraction of the station walls 4305 and connected roof components 4306 allows for the UAV to land and take off from the landing platform 4102 without risk of hitting the station walls 4305 or roof components 4306. Stabilization of the rack will increase water-resistance, reliability in functionality, and the robustness of the entire station 4301. To achieve this stabilization, a constraint of the rack 4405 with respect to the X, Y, Θ, and Φ dimensions (i.e. all geometric and polar dimensions excluding all vertical dimensions) around the origin of the slide 4402 at the landing platform 4102 may be implemented. To accomplish this, the landing platform 4102 of the station 4301 utilizes a slot slide 4402 made of nearly frictionless material that appropriately fits the corresponding dimensions of the rack 4405, while still allowing vertical movement. Each secured rack 4405 would be attached to one of the station walls 4305.
FIG. 31E shows a closer view of the rack drive train section 4404. Located below the landing platform 4102, the rack drive train section 4404 serves the purpose of driving the rack 4405 vertically upwards or downwards. To perform this action, a compound gear 4406 is placed on an axle located at a point at which the larger section of the compound gear's teeth is aligned with the rack 4405. This axle is supported at both ends by a triangular support 4407 connected to the landing platform 4102. A separate gear 4408 is then aligned to intertwine with the smaller gear component of the compound gear 4406; this second gear 4408 is similarly supported by a triangular axle support 4409 at both ends. At one end of the latter-mentioned axle is a drive motor 4410 fastened to a plate that is perpendicular to the landing platform 4102 and parallel to the station wall 4305. This motor 4410, along with three other similar motors located in the other rack drive train sections 4404, may drive the station walls 4305 and would therefore drive the previously described axle. The motor 4410 would also be equipped with a limit switch, a device that would be capable of sending an electronic signal when the motor approaches the maximum or minimum driving distance along the rack 4405.
An operational control system may be responsible for the autonomous operation of the dynamic roof control system's motors 4410 as well as all other aspects of communication and proper landing of the UAV. The operational control system may be comprised of a computational unit, a communication interface, and any other electronic units necessary to perform its designated tasks. The operational control system and all of its components may be housed in the structure of the base support 4302. One primary task assigned to the operational control system would be to ensure the smooth operation of the motors 4410 by taking into consideration the signal values of the motors' 4410 respective limit switches. Feedback to the operational control system from the limit switches, triggered when the station walls 4305 are at their lowest or highest, would allow the operational control system to issue commands aimed at preventing wear on the rack 4405, specifically the teeth, and also preventing a motor from overloading. The operational control system may also be tasked with (and appropriately equipped to accomplish) other assignments such as storing charging data, communicating wirelessly with a UAV, or communicating with any other components that may be added to the station 4301. These tasks would aid in increasing the robustness and reliability of the station 4301, and with the assistance of an operational control system, the station 4301 would be able to accomplish autonomous UAV landing, housing, and take-off.
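As a loose illustration of the limit-switch handling just described, the Python sketch below drives the wall motors only until every limit switch for the chosen direction reports tripped, then stops them. The motor interface and helper names are assumptions introduced for this example, not interfaces defined by the disclosure.

# Sketch of limit-switch supervision: run the four wall motors until each
# reports its limit for the commanded direction, protecting rack teeth and
# motors from overload. All helper names are hypothetical.

def drive_walls(motors, direction, poll):
    """direction: +1 to raise the station walls, -1 to lower them."""
    for m in motors:
        m.set_velocity(direction * m.nominal_speed)
    # Keep driving until every motor's limit switch for this direction trips.
    while not all(m.limit_switch_tripped(direction) for m in motors):
        poll()                      # e.g. sleep briefly, service telemetry
    for m in motors:
        m.stop()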
Unmanned Aerial Vehicle Delivery Optimization
The use of UAVs in delivery of products may provide benefits over the conventional art. In comparison to manual delivery systems, UAV flight deliveries may be completed in a matter of hours, versus standard delivery methods which often take several days. With the introduction of UAVs to the delivery market, UAV design has become increasingly relevant to ensure safe and accurate deliveries. Since deliveries will vary in size, shape, and weight, it may be necessary to have a versatile UAV design that is able to accomplish a variety of deliveries. In order to capitalize on improving shipping methods, delivery companies may not only want to utilize unmanned aerial delivery but also optimize power. Such optimizations may include minimizing UAV down time, rapidly checking the operational status of each UAV component, and quickly replacing and/or repairing UAV components. Thus, it may be necessary to use UAVs specially designed to be modular such that they can be easily dismantled and reconstructed before and after each delivery if repairs or modifications are in order. Additionally, to accommodate for changes in payload size, a UAV must be designed so as to easily increase or decrease its carrying capacity and power supply in order to achieve maximum efficiency.
FIGS. 32A-32C show an example of an octocopter constructed using a stacking method with modular UAV components. As seen in FIG. 32A, to create a four-rotor copter (hereby referred to as a "quadcopter"), the user may simply connect a control board (or any other electronic devices necessary for the successful operation of a UAV) located on the control plate 5101 to a single rotor plate 5102-1. The rotor plate 5102-1 may have four or more rotor arms, each of which is sized to carry a rotor. Using one rotor plate 5102-1 may provide sufficient lift and power for utilization of a lighter payload, but if a user requires a UAV to carry a heavier payload that a quadcopter configuration cannot support, then the user may simply add another rotor plate 5102-2 that is out of phase with respect to the prior rotor plate 5102-1 by 45°. The addition of this second rotor plate 5102-2 would create an eight-rotor copter (hereby referred to as an "octocopter"). In other design implementations it may be possible to utilize a circular plate structure (compared to an octagonal one) and achieve 12 total rotors by setting the plates out of phase by 30°. It would also follow that additional rotors may be added with each plate out of phase at increasingly smaller angles, depending on the configuration and size of the UAV. These plates may be connected and secured at easily accessible and intuitive fastening points 5103 located symmetrically around the structure of the plates.
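The 45° and 30° offsets quoted above are consistent with evenly interleaving identical rotor plates, so that the rotors end up spaced 360° divided by the total rotor count. Treating that spacing as the plate-to-plate offset is an assumption made here for illustration; the Python sketch below simply evaluates it.

# Assumed relationship: evenly interleaved plates are offset by
# 360 / (rotors_per_plate * number_of_plates) degrees.

def plate_phase_offset(rotors_per_plate, num_plates):
    """Angular offset (degrees) between successive stacked rotor plates."""
    total_rotors = rotors_per_plate * num_plates
    return 360.0 / total_rotors

print(plate_phase_offset(4, 2))  # 45.0 -> octocopter from two 4-rotor plates
print(plate_phase_offset(4, 3))  # 30.0 -> 12 rotors from three 4-rotor plates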
To power a UAV constructed via the method previously described, a modular style of battery stacking on the control plate may be implemented to ensure that only the power sufficient for flight is on board. This ability to specifically select UAV components based on payload weight, delivery distance, or another aspect may reduce the weight of the UAV, and in turn may increase the lifetime of the battery while also reducing the frequency at which the batteries would have to be replaced. Attaching additional sensors, payload holders, or other system elements would simply be a matter of stacking an additional unit (similar to the configuration and compatibility of the plates previously described) with the desired components below the control plate or above the rotor plate, given that it is designed to not impede the rotors. An additional design advantage of a UAV constructed utilizing a stacking method is the ability to further dampen vibrations coming from the rotors that may affect the control board. By adding dampening elements at the connection points between each plate, or the connection between the control plate 5101 and the rotor plates 5102, the vibrations described may be minimized or eliminated.
In FIG. 32A it is important to note that the dotted lines connecting the plates represent the location, path, and direction of connection between the plates. The octocopter may include a control plate 5101 that houses both the batteries and the control units 5104. Two rotor plates 5102 are attached at the connection points 5103 above the control plate 5101. FIG. 32B shows a single rotor plate with its motor mounts 5106 and control unit 5104. In FIG. 32C two rotor plates are stacked on top of each other to create an octocopter. Located on each rotor plate of the octocopter are motor mounts 5106 and the respective motor controllers 5105 as well as the control unit 5104.
FIG. 33 demonstrates an example algorithmic system process of how a UAV may be configured to accomplish a specific delivery. This customization may be made possible by the utilization of a UAV constructed utilizing a stacking method as described in FIGS. 32A-32C. This system optimizes a payload delivery system by minimizing the excess material for each flight and minimizing the amount of time a UAV control plate is out of commission. One example implementation of the present disclosure may begin with weighing the payload 5201 and determining the distance required to travel 5202 for purposes of optimization. The system may then determine through payload thresholds 5203, 5204, and 5205 whether to use a quadcopter with one rotor plate for light payloads 5208, an octocopter with two rotor plates for heavier payloads 5209, or a circular frame with three rotor plates for the heaviest payloads 5210, respectively. In the case that the payload exceeds the maximum weight that a UAV may carry, the payload may be split into separate deliveries 5206 and the system may restart the algorithmic process by weighing each of the divided payloads 5201 and determining the required distance to travel 5202 for each.
However, if the payload cannot be split into separate deliveries, it may be communicated to a user or construction system that the payload is not deliverable via the current UAV method 5207. It is important to note that the exact weight configurations for each payload and UAV may vary and the values shown in FIG. 33 are only one example implementation of the present disclosure. Once the number of rotor plates to utilize within this delivery is determined, the rotor plates are set onto the control plate 5211. With the knowledge of the flight distance 5202, the system may then determine the appropriate number of batteries required and could set them in battery slots 5212.
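The following Python sketch mirrors the FIG. 33 selection logic described above. The weight thresholds, the round-trip battery-sizing rule, and the return values are placeholder assumptions, since the disclosure notes that exact values vary per UAV and payload.

# Sketch of the FIG. 33 configuration flow with assumed threshold values.
import math

def select_configuration(payload_kg, distance_km,
                         light_max=1.0, medium_max=3.0, heavy_max=6.0,
                         km_per_battery=5.0):
    # Over the maximum carriable weight: split the payload and re-run the
    # process (5206), or report it as undeliverable by this method (5207).
    if payload_kg > heavy_max:
        return {"action": "split_or_reject"}

    if payload_kg <= light_max:
        rotor_plates = 1          # quadcopter, one rotor plate (5208)
    elif payload_kg <= medium_max:
        rotor_plates = 2          # octocopter, two rotor plates (5209)
    else:
        rotor_plates = 3          # circular frame, three rotor plates (5210)

    # Battery count scaled to the round-trip distance (an assumed rule, 5212).
    batteries = max(1, math.ceil(2 * distance_km / km_per_battery))
    return {"action": "build", "rotor_plates": rotor_plates, "batteries": batteries}

print(select_configuration(payload_kg=2.2, distance_km=7.5))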
Before attaching the payload, the entire system may perform a quick systems test on the motors and control board 5213. Once a UAV is verified as functional, the system may then attach a payload 5214, perform the delivery to a delivery site, return 5215, and the UAV may land 5216. Before a UAV is ready for another payload and delivery, a user or construction system may perform a system check on the vehicle 5217. If all elements are functioning 5218, the system may remove the discharged batteries 5221 and the UAV would be recognized as ready for the next payload and delivery. In the event that one or more elements are dysfunctional, a user or construction system may check that the control board is functioning properly 5219. If the control board is functioning, then the system may utilize the same control plate for the next delivery and send the rotor plates for evaluation and repair 5222. Otherwise, if the control board is malfunctioning, the control plate may be decommissioned for repairs 5220. It should also be noted that the control board may be one of the most expensive components of a UAV, if not the most expensive. Consequently, constant utilization and maintenance of the control board is essential to maximizing the cost efficiency of a UAV constructed using the stacking method of the present disclosure.
Wind Mapping Using Power Consumption Data of Unmanned Vehicles
Figure 34 shows a block diagram of an unmanned vehicle in communication with a management system for wind mapping using power consumption data.
Referring to FIG. 34, the unmanned vehicle (also referred to as a UAV or unmanned aerial vehicle) 6102 may comprise a processor 6105, a communication device 6104, a memory device 6108, and a battery 6106. The memory device 6108 may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device 6108 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device). The memory device 6108 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present disclosure. For example, the memory device 6108 could be configured to buffer input data for processing by the processor 6105. Additionally or alternatively, the memory device 6108 could be configured to store instructions for execution by the processor 6105.
The processor 6105 may be embodied in a number of different ways. For example, the processor 6105 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In an example embodiment, the processor 6105 may be configured to execute instructions stored in the memory device 6108 or otherwise accessible to the processor 6105. Alternatively or additionally, the processor 6105 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 6105 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present disclosure while configured accordingly. Thus, for example, when the processor 6105 is embodied as an ASIC, FPGA or the like, the processor 6105 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 6105 is embodied as an executor of software instructions, the instructions may specifically configure the processor 6105 to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor 6105 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present disclosure by further configuration of the processor 6105 by instructions for performing the algorithms and/or operations described herein. The processor 6105 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 6105.
In addition, the battery 6106 may be any container consisting of one or more cells by which chemical energy is converted into electricity and used as a source of power for the motor/generator 6107. Examples of possible battery types that may be used include but are not limited to alkaline, lithium, lithium-ion, and carbon-zinc, all of which may or may not have rechargeable properties. In some cases, the battery depicted may not be restricted to solely providing power for the motor/generator 6107 and could also supply power to the elements within the computing module 6103.
Meanwhile, the communication device 6104 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the apparatus. In this regard, the communication device 6104 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling
communications with a wireless communication network. In some environments, the communication device 6104 may alternatively or also support wired communication. As such, for example, the communication device 6104 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
In an example embodiment, the processor 6105 may be embodied as, include or otherwise control a resource manager 6109. The resource manager 6109 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 6105 operating under software control, the processor 6105 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 6109 as described herein. Thus, in examples in which software is employed, a device or circuitry executing the software forms the structure associated with such means.
In an example embodiment, the processor 6105 may be further configured to receive power consumption specific information from the battery 6106, directly or indirectly. That is, a direct communication 6050 may be defined as the processor 6105 already possessing the ability to interpret power consumption data from the battery 6106, or an indirect communication method may be necessary and defined as another device coupled with the processor 6105 and the battery 6106 to act as a translator or relay process between the two. In the case where a direct communication 6050 may not exist, and the battery 6106 may have other power commitments to other devices, the processor 6105 may be further configured to account for these other devices when determining the power consumption data specific to the motor/generator 6107.
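One simple reading of "accounting for these other devices" is to subtract their known draws from the measured battery output, as in the Python sketch below. The numeric example values and function names are assumptions for illustration, not figures from the disclosure.

# Sketch of indirectly estimating motor/generator power: measured battery
# output minus the known draws of other devices sharing the battery.

def motor_power_w(battery_voltage_v, battery_current_a, other_loads_w):
    """Estimate power consumed by the motor/generator alone, in watts."""
    total_w = battery_voltage_v * battery_current_a
    return max(0.0, total_w - sum(other_loads_w))

# Example: 22.2 V pack at 18 A, with a camera (6 W) and sensors (4 W) attached.
print(motor_power_w(22.2, 18.0, [6.0, 4.0]))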
In an example embodiment, the resource manager 6109 is configured to control the allocation of wireless communication resources to enable the
communications described above in accordance with an example embodiment of the present disclosure. For example, the resource manager 6109 is configured to allocate resources for use by machines or management systems such as the management system 6101 to communicate directly with a cellular network (e.g., in the downlink direction), and/or to communicate with other machines or management systems (bi-directionally), and/or to communicate with a gateway or relay. In an example embodiment, as described above, the resource manager 6109 may be configured to allocate wireless network downlink resources (e.g., cellular downlink channel resources) for use by the management system 6101 to provide signaling to other machines or management systems or to the gateway. The resource manager 6109 may also be configured to allocate wireless network uplink resources (e.g., cellular uplink channel resources) to receive data from the management system 6101 via the gateway (e.g., the unmanned vehicle 6103). Uplink and downlink resources may also be managed with respect to communications with the unmanned vehicle 6103 for communications that are not related to data being reported by the management system 6101 or other machines or management systems. Parameters affecting power consumption data in a UAV may include power usage by equipment installed on the unmanned vehicle, such as sensors, cameras, etc., the speed of the UAV, and the climate where the UAV is traveling. Specifically, the climate is strongly influenced by the presence of wind factors, which may increase or decrease power consumption based on wind speed and direction. This relationship between power consumption and the UAV's ambient wind factors may then be utilized to map wind vectors. Wind mapping is an essential tool for management systems and users, and provides more information about the
environment in which the unmanned vehicles travel so as to make more informed, efficient, and accurate decisions about the unmanned vehicles' travel paths.
Now referring to FIG. 35, an unmanned vehicle 6102 may travel in winds, or any other weather-related event 6201. Wind stands as a general term that includes but is not limited to gusts, breezes, and storms. In this situation, the unmanned vehicle's travel trajectory and plans are affected by the presence of wind 6201. FIG. 35 depicts the unmanned vehicle 6102 traveling in the same direction as the wind 6201, but it is also possible for the unmanned vehicle 6102 to travel perpendicular, diagonal, or at any other angle with respect to the wind 6201. The wind may represent a dynamic variable that accounts for the rate of change of wind velocity as well as the intensity or strength of the wind as determined by its velocity and force. When traveling with wind 6201, the unmanned vehicle 6102 may consume more or less power from the battery. As a result, actual power consumption data is dependent on wind factors and the angle of the wind with respect to the unmanned vehicle 6102. For example, when the unmanned vehicle 6102 heads opposite the direction of the wind 6201, it typically consumes more power in order to account for the force pushing it away from its trajectory. On the contrary, if the wind 6201 and unmanned vehicle 6102 head in the same direction as seen in FIG. 35, the unmanned vehicle 6102 consumes less power because the wind 6201 provides a force to support the unmanned vehicle's flight. The power consumption data can be analyzed using external computing by a management system to determine the cause of variations in power consumption. This computing method may give sufficient data to determine the direction and intensity of the wind 6201 to a specific accuracy.
Figure 36 shows a graph of the power consumption data specific to the motors of the unmanned vehicle relative to the direction of the unmanned vehicle according to the example embodiment of FIG. 35. The x-axis of the graph indicates the angle at which the wind is blowing with respect to the unmanned vehicle, and the y-axis indicates how much power is being consumed by the unmanned vehicle 6102 in watts. The example embodiment of the present disclosure is only one possible representation of the unmanned vehicle's 6102 power consumption; any type of graph or visual representation may be used with any scale of x, y variables and units. In FIG. 36, it can be seen that the graph follows a bell curve, but if the graph were extended it would closely resemble a sinusoidal wave. At 0 degrees and 360 degrees, the curve is closest to zero in terms of power consumption. It will not be exactly zero watts because the unmanned vehicle will consume power regardless of the surrounding wind intensity. However, these directions would be the graph's minimum according to the example embodiment in FIG. 35, where the unmanned vehicle would be traveling in the same direction as the wind. In this case the unmanned vehicle may consume less power because the force of the wind will assist the unmanned vehicle to continue on its flight path and the wind will not be hindering the unmanned vehicle from going in its intended direction. It can be seen that the peak of the curve is at approximately 180 degrees, which indicates that if the unmanned vehicle were to turn and travel in the horizontal direction of 180 degrees (as shown in FIG. 35), it would be opposing the direction of the wind and the unmanned vehicle may consume the greatest amount of power. In between the angles of 0 degrees, 180 degrees, and 360 degrees, the power consumption of the unmanned vehicle may be calculated using an external computing device. The other, straight curve represents the base power usage if wind were not present and thus not affecting the unmanned vehicle's flight. Additionally, the exact values of power consumption and degrees will be unique to each unmanned vehicle based upon the unmanned vehicle's design and the existing technology present on or inside the vehicle. In terms of calculating power consumption, the unmanned vehicle may determine its power consumption relative to its base power usage (power usage with no wind present), or another computing process may be employed.
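One hedged way to turn the heading-versus-power relationship sketched in FIG. 36 into a wind estimate is to fit a first-harmonic model, P(h) ≈ c0 + a·cos(h) + b·sin(h), to (heading, power) samples and treat the heading of maximum power as flying into the wind. The Python sketch below does this with simple Fourier projections; the model form, the uniform-sampling assumption, and the use of the amplitude as an intensity proxy are assumptions of this example, not a method stated in the disclosure.

# Illustrative wind estimate from (heading_deg, power_w) samples, assuming
# headings roughly cover the full circle so Fourier projections are valid.
import math

def estimate_wind(samples):
    n = len(samples)
    s_c = sum(math.cos(math.radians(h)) * p for h, p in samples)
    s_s = sum(math.sin(math.radians(h)) * p for h, p in samples)
    a, b = 2.0 * s_c / n, 2.0 * s_s / n              # first-harmonic coefficients
    upwind_heading = math.degrees(math.atan2(b, a)) % 360.0
    wind_from = upwind_heading                       # heading with peak power = headwind
    wind_toward = (upwind_heading + 180.0) % 360.0
    intensity_proxy = math.hypot(a, b)               # watts of heading-dependent load
    return wind_from, wind_toward, intensity_proxy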
Figure 37 shows an example graphical user interface (GUI) in accordance with embodiments of the present disclosure. An interactive map is displayed through a web browser showing wind patterns and directions calculated by the management system using the unmanned vehicles' power consumption data, in accordance with embodiments of the present disclosure. The graphical user interface (GUI) 401 may include the unmanned vehicle(s) 6201, the unmanned vehicle's flight path 6404, and wind vectors 6201-1 and 6202-2. The details of the flight may be seen in a web browser or other computer window 6401. The background of the GUI will have a map 6402, which may display the general area of where the unmanned vehicle takes off, where the unmanned vehicle lands 6403, and any places in between the unmanned vehicle's takeoff and landing spot as shown by the flight path 6404. The map 6402 may change as the unmanned vehicle's flight path changes and may be interactive through scrolling, or it may automatically follow the unmanned vehicle(s) through tracking. The GUI will be able to display more than one unmanned vehicle 6404 and how each flight path 6404 may be used to determine wind direction and intensity as shown by the imaging of wind vectors 6201-1 and 6202-2 on the map. The unmanned vehicle's flight path may be in the direct path of a wind, which will affect the unmanned vehicle's power consumption. Based on the power consumption data computations, the external management system will be able to compute and display the direction and intensity of the wind through a visual representation of wind vectors 6201-1 and 6202-2 on the GUI.
FIG. 38 illustrates a block diagram of a wind mapping system in accordance with embodiments of the present disclosure. As shown in FIG. 38, an onboard system 6510 is in communication with a cloud environment 6520 to transfer onboard information for cloud processing. The onboard system 6510 may comprise a collector 6511, passive sensor systems 6512, electrical feedback sensors 6513, and other components, such as an output control module, etc. The electrical feedback sensors 6513 function to obtain power consumption information of the UAV. The passive sensor systems 6512 may include a Global Positioning System (GPS) receiver to obtain information such as the GPS signal, compass information, UAV speed, etc. The passive sensor systems 6512 may also include a temperature sensor, humidity sensor, and barometric pressure sensor to obtain temperature, humidity, and pressure information respectively. The collector 6511 couples to the aforementioned components, collects information to be sent, and transfers the collected information via a wireless communication path to the cloud environment 6520. Upon receiving the information, the cloud environment 6520 processes the information for various applications. For example, the electrical load measurement may be used for wind mapping 6521. The information may also be used for 3D cell coverage mapping 6523, 3D GPS receiving mapping, and determining magnetic field anomalies using compass error information 6523. In some embodiments, the cloud environment 6520 comprises a model installed on a cloud server to calculate the wind factor, including wind speed and direction, based on UAV power consumption and other required information. For example, the model may be preloaded with an aerodynamic profile of the UAV, which may be necessary to resolve wind factor information with the desired precision. The other required information may comprise UAV speed and position information, which may be obtained through the GPS signal received onboard. The other required information may also comprise ambient air temperature and humidity, which may be measured with onboard sensors too. The other required information may also comprise UAV aviation status information, such as pitch angle, roll angle, and yaw angle. The model may be pre-calibrated by comparing the calculated wind factor information with wind factor information obtained from third parties, such as weather broadcast services. In operation, all obtained information, including UAV power consumption, speed, position, and aviation status, is input to the model for the calculation of wind factors.
FIG. 39 illustrates an example flow diagram for wind mapping in accordance with embodiments of the present disclosure. In step 6610, the power consumption specific information from the battery of a UAV is obtained. The power consumption may be obtained directly or indirectly, as described earlier. In step 6620, aviation information of the UAV is obtained. The aviation information may comprise UAV speed, direction, position, and flight attitude as defined by Euler angles, including the roll, pitch, and yaw angles. The UAV speed and direction may be obtained via the received GPS signal using an onboard GPS receiver. The received GPS signal may be used in a differential algorithm to calculate the speed, direction, and position. The Euler angles may be obtained using an onboard gyroscope. In step 6630, ambient air information around the UAV is obtained. The ambient information may be air temperature, humidity, and pressure, which may be measured using an onboard temperature sensor, humidity sensor, and barometric pressure sensor respectively. Weather information provided by weather broadcast services may not be precise enough or may not be available in real time. Precise ambient air information measured onboard would still be necessary to meet the precision calculation requirement. The power consumption information, UAV aviation information, and the ambient air information may be refreshed at the same or different rates. In step 6640, the power consumption information, UAV aviation information, and the ambient air information are transferred to the management system via a wireless communication path. In some embodiments, the transfer schedule is in line with the information refresh rate. In step 6650, the wind factor is calculated based on the transferred power consumption information, UAV aviation information, and the ambient air information. The calculation may be done on a cloud server accessible via the internet. The wind factor comprises wind speed and direction, which are mapped into a wind map displayed on a graphical user interface (GUI). The GUI may be displayed as an interactive map accessible through a web browser. Besides the wind map, the GUI may also comprise other layers of information, such as the UAV and its flight path, a geographic background map, etc. An exemplary GUI and its description may be referred back to FIG. 37. Although exemplary steps are shown in FIG. 39 for wind mapping, it is understood by one skilled in the art that certain steps may optionally be performed; certain steps may be performed in different orders; and certain steps may be done concurrently.
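A minimal sketch of steps 6610-6640 is shown below in Python: gather the power, aviation, and ambient-air readings onboard and hand them to the management system as one record. The dataclass fields, the "sensors" and "uplink" helpers, and the JSON transport are assumptions made for illustration; the disclosure does not specify a particular data format or transport.

# Package one wind-mapping sample (6610-6630) and transmit it (6640);
# the cloud side computes the wind factor (6650). Helper names are assumed.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class WindMappingSample:
    timestamp: float
    power_w: float                 # 6610: motor power consumption
    speed_mps: float               # 6620: aviation info from GPS/gyroscope
    heading_deg: float
    lat: float
    lon: float
    roll_deg: float
    pitch_deg: float
    yaw_deg: float
    air_temp_c: float              # 6630: ambient air info from onboard sensors
    humidity_pct: float
    pressure_hpa: float

def collect_and_send(sensors, uplink):
    """Build one sample from onboard sensor readings and transmit it."""
    sample = WindMappingSample(timestamp=time.time(), **sensors.read_all())
    uplink.send(json.dumps(asdict(sample)))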
FIG. 40A-B shows the underside of a UAV 7101 with a stationary locked camera 7102. FIG. 40A shows the entire UAV 7101, with FIG. 40B showing a closer view of the camera 7102 connected to the underside of the UAV 7101. It should be understood that various other implementations with respect to imaging optics and hardware may be used without departing from the core novelties of the present invention.
FIG. 41 describes an example system embodiment of the present invention, in which five UAVs 7101 communicate with an image compilation server 7201, which in turn may communicate with a computer 7202. The five UAVs 7101 would capture image data from various geographic locations and send the data wirelessly to the image compilation server 7201. The image compilation server 7201 should be understood to have the qualities and functionality of a cloud-based server capable of enabling network access among a plurality of computing elements. Additionally, it should be understood that the image compilation server 7201 would be programmed or otherwise enabled to store image data (which may include image data that is not from the UAVs of the present invention, such as satellite imagery), autonomously perform algorithmic processes on said image data, and store the results of those processes. The computer 7202, and any other similar computer actors that may be present in various embodiments not described, should be understood to be any device which is at minimum capable of storing and processing data according to instructions given to it by a variable program. With respect to the present invention, at least one computer 7202 would be in communication with the image compilation server 7201 directly or indirectly, via either a wireless (e.g. via WiFi, Bluetooth, or a cellular network) or direct (e.g. via a USB, FireWire, DIN, or e-SATA/SATA) connection. Mobile devices should be considered as an obvious implementation of the computer 7202. Additionally, at least one computer 7202 would be equipped with a monitor or other display component that would allow the virtual 3D environment constructed by the image compilation server 7201 to be displayed to a user in a virtual reality (e.g. via immersive wearable headgear, or large field of view and depth of field glasses), augmented reality (e.g. via holographic displays or projector-equipped glasses) or traditional display fashion (e.g. via LED, LCD, or CRT displays). All three of these display types would be equipped with the necessary hardware and software so as to enable a user to use the computer 7202 to view, navigate, label, partition, save, or otherwise add, subtract, or change the virtual 3D environment viewed for whatever purposes a user may require.
Referring to FIG. 42, a UAV 7101 registered or otherwise in communication with the image compilation server 7201 may include a telemetry unit 7301, a processor 7302, a resource manager 7303, camera and imaging components 7304, a memory device 7305, mechanical components 7306, and a GPS unit 7307. These seven components (including the physical frame and supporting body) would compose the unmanned aerial vehicle as previously mentioned. The memory device 7305 may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device 7305 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device). The memory device 7305 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present invention. For example, the memory device 7305 could be configured to buffer input data for processing by the processor 7302. Additionally or alternatively, the memory device 7305 could be configured to store instructions for execution by the processor 7302.
The processor 7302 may be embodied in a number of different ways. For example, the processor 7302 may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In an example embodiment, the processor 7302 may be configured to execute instructions stored in the memory device 7305 or otherwise accessible to the processor 7302. Alternatively or additionally, the processor 7302 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 7302 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when the processor 7302 is embodied as an ASIC, FPGA or the like, the processor 7302 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 7302 is embodied as an executor of software instructions, the instructions may specifically configure the processor 7302 to perform the algorithms and/or operations necessary for the UAV to operate successfully. However, in some cases, the processor 7302 may be a processor of a specific device (e.g., an eNB, AP or other network device) adapted for employing embodiments of the present invention, and may entail further configuration of the processor 7302. The processor 7302 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 7302.
Meanwhile, the telemetry unit 7301 may be any means such as a device or circuitry embodied in either hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to the image compilation server 7201 and/or any other device or module in communication with the apparatus. In this regard, the telemetry unit 7301 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. In some environments, the telemetry unit 7301 may alternatively or also support wired communication. As such, for example, the telemetry unit 7301 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
In an example embodiment, the processor 7302 may be embodied as, include or otherwise control a resource manager 7303. The resource manager 7303 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 7302 operating under software control, the processor 7302 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the resource manager 7303 as described herein. Thus, in implementations in which software is employed, a device or circuitry (e.g., the processor 7302) executing the software forms the structure associated with such means. In the example embodiment of the present invention, the resource manager 7303 would be configured to control all mechanical components 7306, the camera and imaging components 7304, and the GPS unit 7307 of the UAV. Upon receiving instructions from the processor 7302, the resource manager 7303 would interpret and translate the instructions into actions performed by the mechanical components 7306, and if enabled to do so, the camera and imaging components 7304. The mechanical components 7306 of the UAV may include but are not limited to rotors, LEDs, rudders, mechanical or robotic arms, retractable cables, gimbals, or other mechanical apparatus. Any and all data from the camera and imaging components 7304 and the GPS unit 7307 would be interpreted by the resource manager 7303, and in turn analyzed by the processor 7302 so as to communicate the data to the image compilation server 7201 via the telemetry unit 7301. The geographic location data produced by the GPS unit 7307 would be processed and sent in an identical, parallel manner.
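As a rough illustration of the role described above, and not the actual resource manager 7303, the sketch below shows one way such a component could receive abstract instructions from a processor and translate them into calls on mechanical, imaging, and GPS components. The command names and component interfaces are hypothetical assumptions introduced for this example.

```python
class ResourceManager:
    """Translates high-level instructions into actions on injected components."""

    def __init__(self, rotors, gimbal, camera, gps):
        self.rotors = rotors    # mechanical components (e.g. motor controller)
        self.gimbal = gimbal    # camera pointing mechanism
        self.camera = camera    # imaging component
        self.gps = gps          # position source

    def execute(self, instruction):
        """Interpret one instruction dictionary and dispatch it to a component."""
        kind = instruction["type"]
        if kind == "set_throttle":
            self.rotors.set_throttle(instruction["value"])
        elif kind == "point_camera":
            self.gimbal.set_angles(instruction["pitch"], instruction["yaw"])
        elif kind == "capture_image":
            # Return the frame together with the position it was captured at,
            # so the processor can forward both to the image compilation server.
            frame = self.camera.capture()
            return {"image": frame, "position": self.gps.read_fix()}
        else:
            raise ValueError(f"unknown instruction: {kind}")
```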
It should be noted that when referring to images with respect to the present invention, all possible forms of image data should be considered in addition to the possibility of data collected outside of the visible spectrum. That is, other devices capable of detecting and/or emitting electromagnetic radiation outside the visible spectrum may be present on the UAV to enable or aid in the capture of accurate data for virtual 3D image reconstruction. Specifically, low-frequency radar (characterized by a frequency slightly less than 1 GHz) and other nominal radar frequencies (ranging from 2 GHz to 40 GHz) may be utilized so as to detect the volumetric specifications of a target area.
FIG. 43 describes the algorithmic process by which the image compilation server creates the virtual 3D environment of the present invention, given images acquired by UAVs. The process begins with multiple UAVs acquiring images 7401. The images are then transmitted to the image compilation server via a form of wireless communication 7402, such as ISM radio band communication (most commonly in the 2.4 GHz RF spectrum and other ISM bands), cellular communication networks, or other similar distanced communication methods. Each of the images is then associated with or assigned to a specific geographic area 7403. In some implementations, an image may be further cropped, split, or otherwise modified, either manually (by a user) or automatically by the image compilation server, so as to make more precise assignments or associations of the image data with geographic areas. The image compilation server then associates angles of perspective to images of the same geographic area 7405, and accomplishes this task for every geographic area captured by the UAVs. Finally, the image compilation server constructs the 3D environment utilizing a processing algorithm 7406 that those familiar in the art will recognize as possible given that all data necessary to construct the 3D environment is supplied by the UAVs of the present invention. The 3D environment should be understood to be any form of data that could be interpreted to represent a three-dimensional space containing one or more three-dimensional objects. In a simple implementation, the 3D environment may be in the form of a large three-dimensional matrix, with each data value in the matrix representing whether the space is occupied by an object, and if so, what the properties of the object are (e.g. color, texture, etc.). In a more complex implementation, the 3D environment may be in the form of a software-specific data format, where the 3D compilation server utilizes an existing 3D visualization software file format to store the data (e.g. '.obj', '.skp', '.sldprt'). This more complex implementation may be advantageous for utilizing the 3D environment to accomplish tasks other than simply viewing the environment, such as 3D printing the environment or otherwise reconstructing the environment in reality.
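For the "simple implementation" just described, the sketch below illustrates one way the 3D environment could be held as a dense three-dimensional occupancy matrix with a per-voxel color property. The class name, resolution, and choice of properties are illustrative assumptions rather than the disclosed storage format.

```python
import numpy as np

class VoxelEnvironment:
    """Dense occupancy-grid representation of a reconstructed 3D environment."""

    def __init__(self, size_x, size_y, size_z):
        # True where a voxel is occupied by part of an object.
        self.occupied = np.zeros((size_x, size_y, size_z), dtype=bool)
        # One RGB color per voxel; other properties (texture id, etc.) could be added.
        self.color = np.zeros((size_x, size_y, size_z, 3), dtype=np.uint8)

    def set_voxel(self, x, y, z, rgb):
        self.occupied[x, y, z] = True
        self.color[x, y, z] = rgb

    def is_occupied(self, x, y, z):
        return bool(self.occupied[x, y, z])

# Example: mark a 10 m x 10 m x 5 m building footprint at 1 m resolution.
env = VoxelEnvironment(100, 100, 30)
for x in range(20, 30):
    for y in range(40, 50):
        for z in range(0, 5):
            env.set_voxel(x, y, z, (180, 180, 180))
```

A dense matrix like this is simple to query but memory-hungry; the "more complex implementation" mentioned above, using an existing 3D file format, trades that simplicity for interoperability with visualization and 3D-printing tools.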
FIG. 44A-C shows a series of images describing a specific implementation where five UAVs 7101, as similarly depicted in a general fashion in FIG. 41, are located at certain locations, with the image capture area 7502 of each UAV also shown. FIG. 44A shows a map void of actual images taken by UAVs or a satellite, so as to better display a general figurative interface in which UAVs 7101 are located in various positions throughout a given geographic area 7501. The dotted rectangles 7502 should be seen as the area capable of having image data collected by the UAV at the center of each rectangle. With respect to the image data rectangles 7502-2, 7502-3, 7502-4, and 7502-5, there may be an overlap in image data capture area. FIG. 44B shows the same geographic area 7501 prior to UAVs being present, with outdated aerial images supplied, for example, either by the UAVs of the present invention or by a satellite. FIG. 44C shows an updated visualization of the geographic area 7501 as would be stored in the image compilation server previously described. It should be noted that the updated areas of the area 7501 have been denoted in a similar rectangular fashion 7502 as in FIG. 44A. Where overlapping image data occurs, the image compilation server previously described may either choose data collected from one UAV, or perform another image processing algorithm that would combine the best qualities of all the image data specific to the overlapping area to create a new set of image data for the purposes of display on a graphical user interface.
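As a minimal sketch of the simplest "combine the overlap" strategy mentioned above, the snippet below averages co-registered image tiles from different UAVs that cover the same area. A real system would more likely weight tiles by quality, recency, or resolution; the equal weighting and function name here are assumptions for illustration only.

```python
import numpy as np

def combine_overlap(tiles):
    """Average a list of aligned HxWx3 uint8 tiles covering the same area."""
    stack = np.stack([tile.astype(np.float32) for tile in tiles], axis=0)
    return stack.mean(axis=0).round().astype(np.uint8)

# Example: blend two synthetic 4x4 tiles from different UAVs.
tile_a = np.full((4, 4, 3), 100, dtype=np.uint8)
tile_b = np.full((4, 4, 3), 200, dtype=np.uint8)
print(combine_overlap([tile_a, tile_b])[0, 0])  # -> [150 150 150]
```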
FIG. 45A-I shows an example implementation of the present invention, by means of a series of images pertaining to a UAV 7101 flying over a building over time at multiple angles, so as to gather sufficient image data to allow the image compilation server as previously described to virtually reconstruct the building in 3D. FIG. 45A and FIG. 45B both show a UAV at the same position (south of the building, and slightly angled so as to yield an ideal perspective for the fixed camera or other imaging components) near the target building, with FIG. 45C showing a sample image taken by the camera located on the UAV 7101. FIG. 45D and FIG. 45E both show the UAV 7101 at the same position at a different time (directly over the building, with the plane formed by the rotor blades parallel to the ground) over the target building, with FIG. 45F showing a sample image taken by the camera located on the UAV 7101. FIG. 45G and FIG. 45H both show the UAV 7101 at the same position at yet another time (north of the building, and slightly angled so as to yield an ideal perspective for the fixed camera or other imaging components) over the target building, with FIG. 45I showing a sample image taken by the camera located on the UAV 7101.
As shown in FIG. 46A-C, the apparatus according to the first implementation of the invention comprises one or more linear arrays 8103 of light-emitting diodes (LEDs) placed on or otherwise embedded in a UAV rotor blade 8102. The LED arrays 8103 each include, for example, nine LEDs. The number of lights may vary for rotor radii of different lengths and desired resolution of the persistence of vision image formed. Specifically, FIG. 46A shows a bottom view of the first
implementation of the present invention, followed by FIG. 46B showing a side view, and FIG. 46C showing a top-left diagonal view. Although not pictured, the necessary wiring and connections to the LED arrays 8103 are present, and would be embedded in the UAV rotor blade, with the necessary larger and heavier circuitry components being located in the center compartment 8101.
FIGs 47A-D show a rotor blade with a horizontal array of collimated laser diodes 8201 located inside the center compartment 8101 shining light beams along small rectangular cavities 8202 inside the rotor blade, until the beams come into contact with diffuse reflective surfaces 8203 that emit light underneath the rotor blade. The reflective surfaces 8203 may be comprised of additional optical components (e.g. lenslets, gratings, polished metals, etc.) so as to sufficiently reflect the laser diode beams downwards, or underneath the rotor blade 8102, in order to have the same or better lighting effect as an LED. Specifically, FIG. 47A shows a transparent, close-up, top-left diagonal view of where part of the rotor blade 8102-1 conjoins with the center compartment 8101, and where the laser diodes 8201-1 are located and oriented. It should be understood that an identical configuration exists on the opposite side of the center compartment 8101. FIG. 47B shows a holistic top-left diagonal view of the current implementation, where the reflective surfaces 8203 are periodically spaced along the rotor blade 8102. FIG. 47C and FIG. 47D show bottom and side views of the current implementation, respectively. Again, as similarly explained in the first implementation, although not pictured, the necessary wiring and connections to the laser diodes 8201 are present, and would be embedded in the center compartment 8101.
FIGs 48A-D show a rotor blade equipped with a horizontal array of uncollimated light sources 8301 (e.g. LEDs, uncollimated laser diodes, etc.) inside the center compartment 8101 and shining light along an aligned array of optical fibers 8302. This third implementation is aimed at addressing the possibility of utilizing uncollimated light sources as the light sources 8301 in the center compartment 8101 and yielding an identical beam redirection effect as in the second implementation so as to reduce the rotor blade aerodynamic impacts as much as possible. Specifically, FIG. 48A shows a transparent, close-up, top-left diagonal view of where part of the rotor blade 8102-1 conjoins with the center compartment 8101, and where the uncollimated light sources 8301-1 are located and oriented. FIG. 48B shows a holistic top-left diagonal view of the current implementation, where the optical fiber ends 8303 are periodically spaced along the rotor blade 8102. The optical fiber ends 8303 are gradually angled downwards so as to maintain a total internal reflection effect until the light is emitted beneath the rotor blade 8102. Additional lenses or optical components may be added to the ends of the optical fibers 8302 so as to ensure as equal a light flux (equal distribution of light across a given angle) as possible from the ends of the optical fibers 8302. FIG. 48C and FIG. 48D show top and side views of the current implementation, respectively.
FIG. 49 is a block diagram of the circuitry 8400 for a light array 8406 of any of the previously described implementations of the present invention. That is, the light array 8406 should be understood to be the LEDs 8103, laser diodes 8201, or uncollimated light sources 8301, depending on which implementation is used. The circuitry includes batteries 8405 for powering the microprocessor 8402, memory 8404, and light sources 8406. The LEDs are activated via serial-to-parallel conversion LED drivers 8403. The microprocessor 8402 includes memory 8404 for storing the possible patterns. As shown, the two banks of nine light sources 8406-1 and 8406-2 are driven via the serial-to-parallel converters 8403. When displaying text and other patterns, the microprocessor must know how the individual lights of the light arrays are distributed along the rotor blade. It should be noted that this spacing would be taken into account prior to programming in the specific light frequencies for each RPM of which the UAV is capable.
When a UAV is in flight, the light arrays 8406 rotate as the rotor blades spin. Utilizing known RPM rotor speed information, the microprocessor can synchronize images and patterns displayed by the light arrays 8406 to the speed of the rotor blade. This allows images to be "frozen" or controllably "scrolled" in one direction or the other. Because the entire linear array of lights is swept during motion, it appears to the viewer as if the entire rotor blade is illuminated.
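As a rough illustration of this synchronization, and not the disclosed circuitry's firmware, the sketch below computes how long each LED column pattern must be held at a known rotor speed so that a swept image appears frozen in place. The column count and RPM value are assumptions chosen only for the example.

```python
def column_period_s(rpm, columns_per_revolution):
    """Time the LED column pattern must be held before advancing one column."""
    seconds_per_rev = 60.0 / rpm
    return seconds_per_rev / columns_per_revolution

def frozen_image_schedule(rpm, image_columns):
    """Yield (column_index, hold_time_s) so the swept image appears stationary.

    Each revolution sweeps the full image once; restarting at column 0 at the
    same angular position every revolution "freezes" the picture in space.
    Offsetting the restart angle slightly each revolution would scroll it.
    """
    hold = column_period_s(rpm, len(image_columns))
    for index, _column_bits in enumerate(image_columns):
        yield index, hold

# Example: a 64-column pattern at 2400 RPM gives roughly 0.39 ms per column.
if __name__ == "__main__":
    columns = [0] * 64
    print(round(column_period_s(2400, len(columns)) * 1000, 3), "ms per column")
```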
A memory 8404 connected to the microprocessor can store a large number of patterns, images, and messages. These images can be played out in a random, sequential, or fixed pattern. Selection of the playback method may be made remotely by a user via the circuitry 8400, or may be automated by the circuitry 8400 itself based on the operational state of the UAV or a variety of other factors related to the UAV. This decision process may or may not be executed by the integrated UAV main computational unit 8401. The UAV main computational unit 8401 should be understood to be comprised of some or all of the following components so as to be capable of accomplishing the electronic tasks the present invention may require: a processor, a telemetry unit, a power supply unit, a memory device, and other electronic devices.

FIG. 50 shows four different example persistence of vision display possibilities and applications of the present invention. FIG. 50A shows a large capital "A" created by a rotating light array, which may be used as an indicator for which rotor blade is which, or as a single letter of a larger message spelled out by adjacent rotor blades. FIG. 50B shows a sample message and option interface asking a user to "Confirm GPS Coord.?" and two options, "YES" and "NO." This display interface may be controlled or interacted with remotely by a user looking up at the display from the ground, and used to communicate directions or preferences to the UAV so as to modify its behavior. Having visual confirmation directly from the UAV ensures that the information displayed is uncorrupted and current on the UAV, whereas trying to communicate with the UAV by other means may introduce error or incorrectness due to the telemetry interface being used. It should be noted that this is an important novelty in that in the event that traditional wireless communication between the UAV and a controlling system is lost, the UAV may still be able to report its status and behavior via the persistence of vision display. FIG. 50C shows just such a status report display, with the message "LOW BATTERY" being conveyed to an observer, followed by a low battery symbol. This may be a crucial communication tool so as to inform a remote operator that the UAV needs to recharge its batteries or battery. FIG. 50D shows a QR code being displayed, suggesting the possibility of machine-to-machine communication via this persistence of vision display. The display could also be used to calibrate or coordinate specific autonomous processes such as landing, and utilize images that may not be interpreted by a human observer, but that hold plenty of information for an enabled image processing machine to interpret.
As shown in FIG. 51, all rotor blades of a UAV may be equipped with light arrays as previously described in any of the three implementations so as to display the message "LANDING" twice on two rotor blades, and the message "STAND BACK" on the remaining two rotor blades. Some or all of the electronics described in FIG. 49 (8401 through 8405) may be housed inside the main body of the UAV, with connections to the light arrays on the rotor blades being accomplished via embedded wiring from the UAV body to the UAV arms, and up to the rotor blade motors and center compartments. Although the embodiments shown are for UAV rotor blades, it should be understood that certain embodiments may be applicable to automobile wheels, ceiling fans, wiper blades, similar airplane propellers, or other rotating or oscillating objects.
While various embodiments have been described, those skilled in the art will recognize modifications or variations which might be made without departing from the present disclosure. For example, the second implementation of the present invention (utilizing laser diodes located inside the center compartment 8101) may utilize optical fibers as in the third implementation (or other total internal reflection techniques) for more complex light paths through varying rotor blade shapes. The examples illustrate the various embodiments and are not intended to limit the present disclosure. Therefore, the description and claims should be interpreted liberally with only such limitation as is necessary in view of the pertinent prior art.
FIG. 52 describes an example system embodiment of the present invention, in which three UAVs 9101 communicate with a UAV management system 9102, which in turn may communicate with a mobile device 9103. The three UAVs 9101 would send location data (such as GPS coordinates) specific to each respective UAV wirelessly to the UAV management system 9102. An unknown UAV 9101-1 is also shown, to demonstrate that other UAVs that are not known to the UAV management system 9102 may also occupy the airspace. The UAV management system 9102 should be understood to have the qualities and functionality of a cloud-based server capable of enabling network access among a plurality of computing elements. Additionally, it should be understood that the UAV management system 9102 would be programmed or otherwise enabled to store location data,
autonomously perform algorithmic processes on said location data, and store the results of those processes. The mobile device 9103, and any other similar actors that may be present in various embodiments not described, should be understood to be any type of mobile device that is at minimum capable of storing and processing data according to instructions given to it by a variable program, while also having at least a camera, electronic visual display, accelerometer and gyroscope. With respect to the present invention, at least one mobile device 9103 would be in communication with the UAV management system 9102 directly or indirectly, via a wireless connection (e.g. via WiFi, Bluetooth, or a cellular network). Additionally, at least one mobile device 9103 would be equipped with a monitor or other display component that would allow the location data of the UAV management system 9102 to be displayed to a user utilizing the mobile device 9103.

FIGs 53A-B describe an example scenario of how multiple UAVs may occupy an airspace relative to the example system embodiment described in FIG. 52, in which three UAVs 9101-2, 9101-3, 9101-4 communicate with the UAV management system 9102, and an unknown UAV 9101-1 and known UAV 9101-2 are within the field of view of the camera of the mobile device 9103. FIG. 53A specifically shows the electronic display of the mobile device 9103 denoting the UAV 9101-3 as a known registered UAV with a dotted circle 9202-1, and a unique message box 9201-1 in the bottom right corner of the mobile device 9103 display. FIG. 53B specifically shows the electronic display of the mobile device 9103 denoting the UAV 9101-1 as an unknown UAV with a dotted circle 9202-2, and a unique message box 9201-2 in the bottom right corner of the mobile device 9103 display. Those familiar in the art will recognize that by using the information provided by the mobile device's 9103 accelerometer and gyroscope, combined with image processing techniques that may be implemented by the UAV management system of the present invention, the viewing angle and field of view of the mobile device, and in turn the relative location of the UAVs being observed, could be determined.
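As one hedged illustration of that geometry, and not the disclosed system's implementation, the sketch below combines an assumed device heading and pitch with the pixel position of a detected UAV under a simple pinhole-camera approximation to produce an approximate azimuth and elevation toward the UAV. The field-of-view values, linear angle mapping, and function names are assumptions introduced for this example.

```python
import math

def bearing_to_uav(device_heading_deg, device_pitch_deg,
                   pixel_x, pixel_y, image_w, image_h,
                   horizontal_fov_deg=66.0, vertical_fov_deg=52.0):
    """Return (azimuth_deg, elevation_deg) of a UAV detected in the camera frame."""
    # Offset of the detection from the image center, as a fraction of half-frame.
    dx = (pixel_x - image_w / 2.0) / (image_w / 2.0)
    dy = (image_h / 2.0 - pixel_y) / (image_h / 2.0)   # +dy means above center
    azimuth = device_heading_deg + dx * (horizontal_fov_deg / 2.0)
    elevation = device_pitch_deg + dy * (vertical_fov_deg / 2.0)
    return azimuth % 360.0, elevation

# Example: detection slightly right of center while the phone faces due east
# (90 degrees) and is tilted up 20 degrees.
print(bearing_to_uav(90.0, 20.0, pixel_x=1200, pixel_y=500,
                     image_w=1920, image_h=1080))
```

Such a bearing, combined with the management system's known UAV locations, is one way a detection could be matched to a registered UAV or flagged as unknown.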
FIGs 54A-B show another implementation of the present invention, in which a UAV management system would be capable of displaying the location of a user's mobile device on the electronic display of the mobile device 9103 in addition to the location of UAVs 9101 known to exist in the airspace surrounding the user. FIG. 54A shows the implementation previously described, with graphical icons 9301 (graphical icon representations of UAVs 9101-2, 9101-3, 9101-4) showing known UAVs registered with the UAV management system of the present invention. In addition to simply displaying the location of UAVs known to and/or registered with the UAV management system, the UAV management system may be capable of receiving information from a user's mobile device as to the location of unknown UAVs, and additionally display them on the same interface to the user. This implementation is shown in FIG. 54B, where an unknown UAV 9301-1 (a graphical icon representation of UAV 9101-1) is shown on the mobile device 9103 display.
FIG. 55 shows a distanced perspective of a user 9401 with a mobile device 9103, observing four UAVs 9101 as consistently described in FIG. 52, FIGs. 53A-B and FIGs. 54A-B, with lines of perspective 9402 drawn to show the field of view of the user suggested in FIGs. 53A-B. While various embodiments have been described, those skilled in the art will recognize modifications or variations which might be made without departing from the present disclosure. The examples illustrate the various embodiments and are not intended to limit the present disclosure. For example, multiple static devices similar to the mobile devices of the present invention may be used to constantly and consistently observe a variety of airspaces so as to update the UAV management system of the present invention consistently. Therefore, the description and claims should be interpreted liberally with only such limitation as is necessary in view of the pertinent prior art.
While various embodiments have been described, those skilled in the art will recognize modifications or variations which might be made without departing from the present disclosure. The examples illustrate the various embodiments and are not intended to limit the present disclosure. For example, terminology such as "landing" and other unmanned aerial vehicle specific terminology may have been used to describe unmanned vehicles in general; specific unmanned vehicle terminology should be interpreted as being synonymous with the actions or descriptions of any other unmanned vehicle where appropriate.
Therefore, the description and claims should be interpreted liberally with only such limitation as is necessary in view of the pertinent prior art.
Although the foregoing disclosure has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the disclosure is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims

1. A method for unmanned vehicle (UV) safety management, the method comprising:
receiving signals from at least one UV in real-time within a geographic area; retrieving geographical information and regulation information within the geographic area; and
transmitting signals to the at least one UV based at least on the signals from the at least one UV and the geographical information and the regulation information.
2. The method of claim 1, characterized by at least one of the following features:
(a) wherein the signals from the at least one UV comprise information selected from UV identification number, UV locations, UV height, UV speed and scheduled UV travel path;
(b) wherein the signals from the at least one UV comprise power reserve information, the power reserve information being fuel reserve information or remaining battery information, and preferably wherein when the power reserve information is below a threshold, the at least one UV is configured to send a request to land;
(c) wherein the geographical information comprises information selected from population, ground-based activities, tall buildings or structures, weather within the geographic area;
(d) wherein the restriction information comprises information selected from specific areas not safe to travel through, speed restrictions, and height restrictions;
(e) wherein the restriction information comprises temporary restrictions established for a specific time interval;
(f) further comprising locating unidentified UVs within the geographic area, wherein the unidentified UVs preferably are located by comparing a low level radar feed to a location map of all identified UVs; and
(g) wherein the signals transmitting to at least one UV comprise a calculated path for the at least one UV to follow.
3. A system for unmanned vehicle (UV) safety management, the system comprising: a UV communication interface to communicate with at least one unmanned vehicle via a bidirectional wireless UV communication link within a geographic area; a network interface to communicate with at least one user device via a bidirectional network communication link;
a cloud server coupled to the UV communication interface and the network interface, the cloud server comprising a microprocessor and a non-transitory computer-readable medium comprising computer executable instructions that, when executed by the microprocessor, cause the cloud server to:
receive signals from the at least one UV in real-time within the geographic area;
retrieve geographical information and restriction information within the geographic area; and
transmit signals to the at least one UV based at least on the signals from the at least one UV and the geographical information and the restriction information.
4. The system of claim 3, characterized by at least one of the following features:
(a) wherein the signals from the at least one UV comprise information selected from UV identification number, UV locations, UV height, UV speed and scheduled UV travel path, and optionally wherein when the scheduled UV travel path crosses a restricted zone, the cloud server is configured to direct the at least one UV to follow a calculated path along multiple established waypoints;
(b) wherein the scheduled UV travel path and other relevant mission data can also be sent to a simulation environment located on the cloud server, so as to test said travel path or other UV task;
(c) wherein the signals from the at least one UV comprise power reserve information, the power reserve information being fuel reserve information or remaining battery information;
(d) wherein when the power reserve information is below a threshold, the at least one UV is configured to send a safety stop and arrival request;
(e) wherein upon receiving the land request from the at least one UV, the server sends the at least one UV locations of nearby recharging stations, emergency landing zones or an all-purpose unmanned vehicle station; and
(f) wherein the cloud server is further configured to establish a restriction zone and assign a risk value to the established restriction zone, wherein the risk value to the established restriction zone preferably varies at different time intervals.
5. A method for risk management for an unmanned vehicle (UV) travel planning, the method comprising:
obtaining a starting location and a final location of the UV;
identifying restriction zones on a map via a graphical user interface (GUI), the map being relevant to the starting location and the final location, the restriction zones being retrieved or input manually;
identifying risk values for each of the restriction zones, the risk values representing the risk of the UV failing to operate or operating erratically, introducing the potential for damages; and
calculating paths between the starting location and the final location based at least on the risk values for each of the identified restriction zones; and
selecting one path and establishing waypoints along the selected path as the UV travelling route.
6. The method of claim 5, wherein the risk value to the created restriction zone varies at different time intervals.
7. A method for unmanned vehicle communication through short message service (SMS), the method comprising:
receiving information at an unmanned vehicle, the information being sent from a cellular network via a communication path between the cellular network and the unmanned vehicle; and
transmitting short messages from the unmanned vehicle to the cellular network via the communication path.
8. The method of claim 7, characterized by at least one of the following features:
(a) wherein the short messages are refreshed and transmitted actively from the unmanned vehicle on a repetitive schedule with a repetitive interval;
(b) wherein the short messages are compiled according to a specific request received at the unmanned vehicle;
(c) wherein the short messages are selected from the group consisting of a current location, a speed, an altitude, a destination, and power reserve information of the unmanned vehicle;
(d) wherein the repetitive interval is a predetermined parameter stored within the unmanned vehicle;
(e) wherein the repetitive interval is determined by a request received at the unmanned vehicle;
(f) wherein the repetitive interval is dynamically determined based on at least one parameter selected from the group consisting of a current location, a speed, an altitude, a destination, and a power reserve of the unmanned vehicle; and
(g) wherein the short messages are sent from the unmanned vehicle using a first communication protocol, and if no reply or confirmation is received within a time threshold, the short messages are sent via a second communication protocol different from the first communication protocol, wherein if still no reply or confirmation is received within the time threshold, the short messages preferably are sent via the first communication protocol again, and/or wherein the time threshold is smaller than the repetitive interval such that the short messages may be resent before the short messages are refreshed.
9. An unmanned vehicle (UV) capable of communication through short message service (SMS), the UV comprising:
a telemetry unit to communicate with a cellular network via a bidirectional communication link supporting SMS communication with at least one communication protocol;
a processor coupled to the telemetry unit;
a memory device comprising non-transitory computer-readable medium comprising computer executable instructions that, when executed by the processor, cause the UV to:
receive information from the cellular network via the telemetry unit; and transmit short messages to the cellular network via the telemetry unit, the short messages being refreshed and transmitted actively from the unmanned vehicle on a repetitive schedule with a repetitive interval.
10. The UV of claim 9, characterized by at least one of the following:
(a) further comprising a resource manager coupled to the processor to interpret and translate instructions from the processor to actions performed by mechanical components of the UV;
(b) wherein the telemetry unit is also configured to communicate to a second telemetry unit located on a landing module to inform the landing module that the UV is performing an approach towards the landing module;
(c) wherein the short messages are selected from a current location, speed, altitude, destination, power reserve information of the unmanned vehicle;
(d) wherein the repetitive interval is a predetermined parameter stored within the memory device, or determined by a request received at the unmanned vehicle;
(e) wherein the repetitive interval is dynamically determined based on at least one parameter selected from a current location, a speed, an altitude, a destination, and a power reserve of the unmanned vehicle;
(f) wherein the short messages are sent from the unmanned vehicle using a first communication protocol, and if no reply or confirmation is received within a time threshold, the short messages are sent via a second communication protocol different from the first communication protocol, and if still no reply or confirmation is received within the time threshold, the short messages are sent via the first communication protocol again;
(g) wherein the time threshold is smaller than the repetitive interval such that the short messages may be resent before the short messages are refreshed;
(h) wherein the modem used for SMS communication is electrically isolated, and in the event of a UAV system power failure, the modem still transmits cellular messages based on position estimates;
(i) wherein the communication utilizes a proxy enabled by another mobile device such as a cell phone, in order to achieve a better or alternative communication method of sending SMS messages; and
(j) wherein the SMS communication is used to send encryption keys to a UAV so as to enable it to communicate securely via other wireless frequencies such as WiFi, Bluetooth, 4G.
11. A method for short message service (SMS) between an unmanned vehicle and a cellular network, the method comprising:
compiling a short message from information obtained from at least one of current location, speed, altitude, destination, power reserve of the unmanned vehicle; and
sending the short message on a repetitive schedule over a communication path between the UV and the cellular network, the communication path supporting at least one communication protocol; wherein the short messages are sent from the unmanned vehicle using a first communication protocol, and if no reply or confirmation is received within a time threshold, the short messages are sent via a second communication protocol different from the first communication protocol, and if still no reply or confirmation is received within the time threshold, the short messages are sent via the first communication protocol again;
wherein the time threshold is smaller than the repetitive interval; and wherein the short messages are refreshed on the repetitive schedule.
12. The method of claim 11, characterized by at least one of the following features:
(a) wherein when the power reserve of the unmanned vehicle is below a reserve threshold, the short messages are sent via both the communication protocols and with a reduced repetitive interval;
(b) wherein the latency and signal strength of the communication method is used to validate local position;
(c) wherein the latency and signal strength is additionally used to determine the direction in which a stronger signal may be achieved, and additionally store the location of where that stronger signal is locally on the UAV once the UAV has reached a position of that stronger signal, so as to return to it in the event of signal deterioration or loss; and
(d) wherein the locally stored positions of strong communication signals are sent to a cloud data structure, where multiple positions of signal strength can be stored and organized for use by the same or other UAVs enabled to communicate with the cloud.
13. An unmanned aerial vehicle (UAV) non-inertial frame lock system comprising: a landing platform having at least one electromagnetic wave (EMR) emitter positioned thereon;
a UAV having at least three EMR receivers; and
a computational unit of the UAV, wherein the computational unit uses at least one EMR signal communicated between the EMR emitter and at least one of the at least three EMR receivers to control at least one of a position, a roll, and a pitch of the UAV during a landing process.
14. The system of claim 13, characterized by at least one of the following features:
(a) wherein the computational unit of the UAV controls the at least one of the position, the roll, and the pitch of the UAV during the landing process by calculating a transmission center position of the EMR emitter relative to the UAV;
(b) wherein the landing platform further comprises a plurality of embedded wires positioned along edges of portions of the landing platform;
(c) wherein the at least three EMR receivers are spaced about a substantial center point of the UAV;
(d) wherein the at least three EMR receivers are positioned on a lowermost point of a housing of the UAV;
(e) wherein the at least three EMR receivers are positioned on a suspended structure suspended from a housing of the UAV, wherein the at least three EMR receivers are vertically oriented relative to one another, wherein the suspended structure preferably terminates at a position higher than a terminating end of a leg of the UAV;
(g) wherein the landing platform further comprises a landing bar, wherein the at least one EMR emitter further comprises a plurality of EMR emitters positioned on at least one face of the landing bar, wherein the at least one face preferably further comprises a vertically-oriented face of the landing bar, and/or wherein the plurality of EMR emitters are positioned in a single plane; and
(h) wherein the landing platform further comprises a control circuit, wherein the control circuit communicates landing instructions to the UAV based on a decrease of resistance in at least one photo-transistor receiving EMR radiation from the at least one EMR emitter.
15. An unmanned aerial vehicle (UAV) non-inertial frame lock system comprising: a landing platform having at least three electromagnetic wave (EMR) receivers positioned thereon;
a UAV having at least one EMR emitter; and
a landing platform computational unit receiving at least one EMR signal from the at least one EMR emitter, processing the at least one EMR signal, and communicating it to a UAV computational unit, wherein the UAV computational unit uses at least one processed EMR signal to control at least one of a position, a roll, and a pitch of the UAV during a landing process.
16. The system of claim 15, characterized by at least one of the following features:
(a) wherein the at least three EMR receivers are spaced about a substantial center point of the landing platform;
(b) wherein the landing platform further comprises a multi-sided platform having a plurality of triangular portions, wherein at least one of the at least three EMR receivers is positioned at a junction between two of the plurality of triangular portions; and
(c) further comprising a power source positioned proximate to a center point of the landing platform, wherein the power source provides power to the at least three EMR receivers.
17. A method of controlling a landing of a UAV with a non-inertial frame lock system, the method comprising:
positioning a UAV proximate to a landing platform having at least one electromagnetic wave (EMR) emitter positioned thereon, wherein the UAV has at least three EMR receivers positioned thereon; and
controlling at least one of a position, a roll, and a pitch of the UAV during a landing process with a computational unit of the UAV by using at least one EMR signal communicated between the EMR emitter and at least one of the at least three EMR receivers.
18. The method of claim 17, characterized by at least one of the following features:
(a) further comprising adjusting the position of the UAV with respect to a substantial center point of the landing platform;
(b) further comprising controlling corrective movements of the UAV using an intensity of the at least one EMR signal;
(c) wherein the landing platform further comprises a control circuit, wherein the control circuit communicates landing instructions to the UAV based on a decrease of resistance in at least one photo-transistor receiving EMR radiation from the at least one EMR emitter; and
(d) wherein the at least three EMR receivers are positioned on a raised bar and in a single plane.
19. An aerial robotic landing apparatus comprising: a landing platform formed from a plurality of portions; and a plurality of warning devices positioned along an outer edge of the landing platform, wherein the plurality of warning devices are activatable by a UAV during a landing of the UAV on the landing platform.
20. The apparatus of claim 19, characterized by at least one of the following features:
(a) wherein the plurality of warning devices further comprise at least one of: a plurality of light emitting diodes (LEDs); and a plurality of audible warnings, and optionally further comprising at least one wire connected between a power source of the landing platform and the plurality of LEDs;
(b) wherein the landing platform is foldable along junctions between the plurality of portions, wherein the plurality of portions preferably are triangular in shape and are substantially equivalent in size, or wherein the landing platform has a polygon shape when unfolded and has a triangular shape when the portions of the landing platform are folded, or wherein the plurality of portions are foldable into a housing for the UAV when the UAV is in a grounded position, and optionally wherein the housing further comprises a pyramid-shaped housing with the UAV positioned therein, wherein at least two of the plurality of portions are positioned overlapping one another;
(c) wherein the UAV and the landing platform are in communication with one another, wherein GPS coordinates of the landing platform are communicated to the UAV, wherein the GPS coordinates of the landing platform preferably are communicated to the UAV using a cellular communication device positioned about the landing platform, wherein the GPS coordinates are communicated using at least one cellular message, and wherein the GPS coordinates preferably are communicated through an external management system;
(d) wherein the foldability of the platform enables the structure of the platform to act as an antenna for enabling wireless communication; and
(e) wherein the platform is unfoldable via a mechanical system that utilizes potential energy to unfold the platform.
21. A method of landing an unmanned aerial vehicle (UAV) on a landing platform, the method comprising:
establishing a communication link between an in-flight UAV and a landing platform positioned on a grounded surface; and activating at least one warning on the landing platform by the in-flight UAV through the communication link, wherein the at least one warning is identifiable by a human observer of the landing platform.
22. The method of claim 21, characterized by at least one of the following features:
(a) wherein the communication link further comprises a system of electromagnetic (EMR) wave emitters and receivers;
(b) wherein the communication link further comprises receiving permission for the UAV to land on the landing platform by a computational unit of the landing platform, wherein the UAV preferably aborts a landing process if permission is not received from the computational unit of the landing platform; and
(c) wherein activating the at least one warning on the landing platform further comprises at least one of: illuminating an LED; and activating an audible alarm.
23. A robot landing platform comprising:
a polygon-shaped platform having a plurality of sections; and
a flexible junction between each of the plurality of sections, wherein the polygon-shaped platform is foldable along each of the flexible junctions.
24. The robot landing platform of claim 23, characterized by at least one of the following features:
(a) wherein each of the plurality of sections is triangular-shaped;
(b) wherein when the polygon-shaped platform is in a folded state, a footprint of the polygon-shaped platform is substantially equivalent to a footprint of one of the plurality of sections; and
(c) wherein the polygon-shaped platform is foldable into a structure having an interior cavity.
25. A UAV landing station apparatus comprising:
support legs contactable to a grounded surface;
an electronics bay supported by the support legs;
a support column extending vertically from the electronics bay;
a landing platform supported vertically above the electronics bay by the support column; and a charging ring formed on the landing platform.
26. The apparatus of claim 25, characterized by at least one of the following features:
(a) wherein the electronics bay is enclosed with sidewalls connected between edges of the electronics bay and the landing platform, wherein the capture ring further preferably comprises two half-circle structures, wherein one of the half-circle structures preferably has an opposite polarity from another of the half-circle structures;
(b) wherein the electronics bay further comprises a graphical user interface; and
(c) wherein the charging ring further comprises a capture ring, wherein the capture ring has two opposing, slanted surfaces, and wherein the charging ring is positioned at a lower floor between the two opposing, slanted surfaces, wherein the charging ring further preferably comprises a conductive material.
27. A method of supplying power to a UAV comprising:
providing a UAV landing station apparatus having a landing platform with a charging ring formed on the landing platform;
landing the UAV on the landing platform, wherein at least one leg of the UAV contacts the charging ring; and
supplying a power source to the charging ring to recharge a power source of the
UAV.
28. The method of claim 27, characterized by at least one of the following features:
(a) wherein landing the UAV on the landing platform to contact at least one leg of the UAV with the charging ring further comprises guiding the at least one leg of the UAV towards the charging ring with at least one slanted sidewall of a capture ring;
(b) further comprising guiding the UAV to the landing platform with an EMR system having at least one EMR transmitter and at least one EMR receiver; and
(c) further comprising guiding the UAV to the landing platform with a camera.
29. A UAV landing station apparatus comprising:
a plurality of panels forming an enclosable area, wherein at least a portion of the plurality of panels are roof panels; and a landing platform formed as a floor of the enclosable area, wherein the roof panels are movable between open and closed positions, wherein in the open position, the landing platform is accessible for landing a UAV.
30. The apparatus of claim 29, characterized by at least one of the following features:
(a) further comprising solar panels on at least a portion of the roof panels;
(b) wherein the roof panels are each flexibly connected to a wall panel;
(c) wherein at least one charging ring is positioned on the landing platform, optionally further comprising at least one electromagnet positioned between the at least one charging ring;
(d) further comprising a conical impression positioned substantially centered on the landing platform;
(e) wherein the roof panels are movable with a mechanical drive system having a slide and a drive train, wherein activation of the drive train causes side panels of the plurality of panels forming the enclosable area to move relative to the landing platform, wherein activation of the drive train preferably causes the roof panels to move between the open and closed positions, and optionally further comprising a fastening section, wherein the slide and the fastening section stabilize a rack connected to the side panels during movement between the open and closed positions;
(f) wherein the landing platform is enabled to communicate with similar or identical landing platforms, so as to enable collaborative landing and recharging processes; and
(g) wherein the landing platform creates a static electric field, and a UAV within close proximity of the landing platform uses an onboard mechanical apparatus to generate current utilizing said static electric field.
31. An unmanned aerial vehicle apparatus comprising:
a control board; and
at least one rotor plate connectable to the control board, wherein the at least one rotor plate has a plurality of rotor arms extending outwards therefrom.
32. The apparatus of claim 31, characterized by at least one of the following features:
(a) wherein the at least one rotor plate has a continuous frame with a substantially open interior space, wherein the continuous frame preferably further comprises at least one of: a polygon shape; and a circular shape, or wherein an outer shape of a footprint of the control board substantially matches a shape of the continuous frame;
(b) wherein the at least one rotor plate further comprises at least two rotor plates connectable to the control board, wherein the at least two rotor plates are positioned out of phase from one another, wherein the out of phase position of the at least two rotor plates preferably further comprises a 45-degree out of phase position;
(c) wherein the at least one rotor plate further comprises at least three rotor plates connectable to the control board, wherein the at least three rotor plates are positioned substantially 30-degrees out of phase from one another;
(d) further comprising a modular, stacked battery connected to the control board;
(e) further comprising a payload carrying area positioned below the control board;
(f) further comprising dampening elements positioned at a connection point between the control board and the at least one rotor plate, wherein the dampening elements reduce a transfer of vibrations from the at least one rotor plate to the control board; and
(g) further comprising a motor mount positioned on each of the plurality of rotor arms, wherein the motor mount positions optionally may also be utilized as points of electrical and/or power connections for the purpose of recharging or communicating with other systems, and/or optionally further comprising a motor positioned on the motor mount of each of the plurality of rotor arms, wherein the motor is controlled with a motor controller positioned on at least one of: the control board; and one of the plurality of rotor arms.
33. A method of delivering payloads with an unmanned aerial vehicle comprising:
determining a weight of a payload;
determining a delivery distance of the payload;
selecting a quantity of UAV rotors required for the determined weight and delivery distance of the payload; and
delivering the payload.
34. The method of claim 33, characterized by at least one of the following features:
(a) further comprising selecting a quantity of modular batteries required for delivering the payload, and optionally
(b) further comprising constructing the UAV by connecting the selected quantity of UAV rotors to a control board, and further comprising:
returning the UAV from delivering the payload;
performing a status check on the UAV; and
removing at least one discharged battery from the UAV;
(c) wherein a payload weight of less than 5 kg requires a single rotor plate, a payload weight of between 5 kg and 15 kg requires two rotor plates, and a payload weight of more than 15 kg requires at least three rotor plates, wherein each rotor plate has at least four UAV rotors.
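Purely as an illustration of the selection step of claim 33 using the weight thresholds recited in feature (c), a minimal Python sketch follows; the boundary handling at exactly 5 kg and 15 kg and the default of four rotors per plate are assumptions made for the example.

```python
def rotor_plates_for_payload(payload_kg):
    """Select a rotor-plate count from payload weight, following the thresholds
    of claim 34(c): <5 kg -> 1 plate, 5-15 kg -> 2 plates, >15 kg -> >=3 plates."""
    if payload_kg < 5:
        return 1
    if payload_kg <= 15:
        return 2
    return 3  # "at least three"; a real planner might add plates as weight grows

def total_rotors(payload_kg, rotors_per_plate=4):
    """Each plate is assumed to carry at least four rotors."""
    return rotor_plates_for_payload(payload_kg) * rotors_per_plate

print(rotor_plates_for_payload(3), total_rotors(3))    # 1 plate, 4 rotors
print(rotor_plates_for_payload(12), total_rotors(12))  # 2 plates, 8 rotors
print(rotor_plates_for_payload(20), total_rotors(20))  # 3 plates, 12 rotors
```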
35. A modular unmanned aerial vehicle apparatus comprising:
a control board;
a plurality of rotor plates connected to the control board, wherein the plurality of rotor plates each have at least four rotor arms extending outwards therefrom, wherein each of the rotor arms carries at least one rotor;
a plurality of modular, stacked batteries connected by at least one of: the control board and the plurality of rotor plates; and
a payload area.
36. The apparatus of claim 35, characterized by one or both of the following features:
(a) wherein a quantity of the plurality of rotor plates connected to the control board and a quantity of the plurality of modular, stacked batteries is operator-selectable based on a payload weight; and
(b) wherein the quantity of the plurality of rotor plates connected to the control board and the quantity of the plurality of modular, stacked batteries is operator-selectable based on both the payload weight and a delivery distance of the payload.
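Claim 36 makes the rotor-plate and battery counts operator-selectable from payload weight and delivery distance. A hypothetical sizing sketch for the modular, stacked batteries appears below; every coefficient (energy per kilogram-kilometre, pack capacity, reserve factor) is an assumed placeholder rather than a value taken from the application.

```python
import math

def battery_packs_needed(payload_kg, distance_km,
                         wh_per_kg_km=2.0, base_wh_per_km=15.0,
                         pack_wh=250.0, reserve=1.3):
    """Illustrative sizing of modular, stacked battery packs from payload
    weight and a round-trip delivery distance; all coefficients are
    assumptions for the example, not disclosed values."""
    round_trip_km = 2.0 * distance_km
    energy_wh = round_trip_km * (base_wh_per_km + wh_per_kg_km * payload_kg)
    return max(1, math.ceil(reserve * energy_wh / pack_wh))

print(battery_packs_needed(3.0, 5.0))    # light payload, short hop
print(battery_packs_needed(12.0, 20.0))  # heavier payload, longer delivery
```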
37. A method for determining wind vectors around an unmanned aerial vehicle (UAV), the method comprising:
obtaining power consumption information of the UAV;
transferring the obtained power consumption, aviation information and ambient air information from the UAV to a management system via a wireless communication path; and
calculating at the management system wind vectors around the UAV based at least on the transferred power consumption.
38. The method of claim 37, characterized by at least one of the following features:
(a) further comprising obtaining aviation information of the UAV;
(b) further comprising obtaining ambient air information of the UAV, wherein the ambient air information preferably comprises air temperature, humidity and pressure, and wherein the air temperature, humidity and pressure preferably are obtained from an onboard temperature sensor, an onboard humidity sensor and an onboard barometer respectively;
(c) wherein the aviation information of the UAV comprises UAV speed, direction and position, wherein the UAV speed, direction and position preferably are calculated from GPS signals received at an onboard GPS receiver;
(d) further comprising displaying the wind vectors as a wind map on a graphic user interface (GUI), wherein the GUI preferably further comprises the UAV, the UAV's flight path and geographic background information; and
(e) wherein the power consumption is obtained via onboard electrical feedback sensors.
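Claims 37 and 38 describe calculating wind vectors from the UAV's power consumption together with aviation and ambient air information. One plausible way to realize this, sketched below with illustrative names and a toy power profile, is to compute air density from the ambient readings, invert a preloaded power-versus-airspeed profile to estimate airspeed, and subtract the air-relative velocity (assumed here to lie along the UAV heading) from the GPS ground velocity.

```python
import math

def air_density(temp_c, pressure_pa, rel_humidity):
    """Approximate moist-air density (kg/m^3) from onboard ambient-air readings."""
    t_k = temp_c + 273.15
    p_sat = 610.78 * math.exp(17.27 * temp_c / (temp_c + 237.3))   # Magnus formula (Pa)
    p_v = rel_humidity * p_sat                                     # vapor partial pressure
    return (pressure_pa - p_v) / (287.058 * t_k) + p_v / (461.495 * t_k)

def airspeed_from_power(power_w, rho, profile):
    """Invert a (hypothetical) preloaded power-vs-airspeed profile by searching
    candidate airspeeds; profile(v, rho) returns the expected power draw."""
    best_v, best_err = 0.0, float("inf")
    for i in range(400):                      # 0 .. 40 m/s in 0.1 m/s steps
        v = i * 0.1
        err = abs(profile(v, rho) - power_w)
        if err < best_err:
            best_v, best_err = v, err
    return best_v

def estimate_wind(power_w, ground_vel_ne, heading_rad,
                  temp_c, pressure_pa, rel_humidity, profile):
    """Wind (north, east) = GPS ground velocity minus estimated air-relative velocity."""
    rho = air_density(temp_c, pressure_pa, rel_humidity)
    v_air = airspeed_from_power(power_w, rho, profile)
    air_vel = (v_air * math.cos(heading_rad), v_air * math.sin(heading_rad))
    return (ground_vel_ne[0] - air_vel[0], ground_vel_ne[1] - air_vel[1])

# Toy profile: power grows with dynamic pressure; coefficients are illustrative only.
toy_profile = lambda v, rho: 150.0 + 0.8 * rho * v ** 2
print(estimate_wind(180.0, (12.0, 0.0), 0.0, 20.0, 101325.0, 0.5, toy_profile))
```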
39. A system for performing external wind mapping using onboard sensors of an unmanned aerial vehicle (UAV), the system comprising:
a UAV comprising:
electrical feedback sensors to measure power consumption of the UAV;
a GPS receiver to obtain aviation information of the UAV;
a communication device to transfer at least the power consumption and the aviation information; and
a cloud server coupled to the communication device via a wireless communication path, the cloud server receiving the power consumption and the aviation information and calculating wind vectors around the UAV based at least on the power consumption and the aviation information.
40. The system of claim 39, characterized by at least one of the following features:
(a) wherein the UAV comprises a temperature sensor, a humidity sensor and a barometer to measure ambient air temperature, ambient air humidity and ambient air pressure respectively, wherein the ambient air temperature, ambient air humidity and ambient air pressure optionally are also transferred to the cloud server, wherein the wind vectors are calculated based also on the ambient air temperature, ambient air humidity and ambient air pressure, and optionally wherein the ambient air temperature, ambient air humidity and ambient air pressure are refreshed at the same refresh rate as the aviation information; and
(b) wherein the calculated wind vectors are mapped as a wind map on a graphic user interface (GUI), wherein the GUI preferably is displayed as an interactive map accessible through a web browser, and optionally wherein the GUI further comprises the UAV, flight path of the UAV and a geographic background map;
(c) wherein a UAV is additionally equipped with a magnetometer and aviation information;
(d) wherein the error in the magnetometer data is used to determine magnetic disturbances on the ground below the UAV;
(e) wherein a UAV is additionally equipped with an accelerometer, gyroscope, and aviation information;
(f) wherein impulse events such as wind gusts or being hit by a physical object are detected by time differential analysis of accelerometer and gyroscope data; and
(g) wherein an onboard processor determines that an analytics load exceeds its capabilities, and automatically sends the additional computational analytics load to a cloud server via a wireless communications protocol for faster processing.
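Feature (f) of claim 40 detects impulse events such as wind gusts or being hit by a physical object through time-differential analysis of accelerometer and gyroscope data. A minimal sketch of such an analysis follows; the threshold values and the fixed-rate sample format are assumptions for illustration and would need tuning per airframe.

```python
def detect_impulses(samples, dt, accel_jerk_thresh=30.0, gyro_rate_thresh=8.0):
    """Flag impulse events (e.g. a gust or a strike) by time-differencing
    consecutive accelerometer/gyroscope samples.

    'samples' is a list of (accel_xyz, gyro_xyz) tuples captured at a fixed
    period dt; thresholds are illustrative placeholders."""
    events = []
    for k in range(1, len(samples)):
        a_prev, g_prev = samples[k - 1]
        a_cur, g_cur = samples[k]
        jerk = max(abs(a_cur[i] - a_prev[i]) for i in range(3)) / dt       # m/s^3
        gyro_step = max(abs(g_cur[i] - g_prev[i]) for i in range(3)) / dt  # rad/s^2
        if jerk > accel_jerk_thresh or gyro_step > gyro_rate_thresh:
            events.append((k * dt, jerk, gyro_step))
    return events

dt = 0.01
quiet = ([0.0, 0.0, 9.8], [0.0, 0.0, 0.0])
gust = ([4.0, 0.0, 9.8], [0.9, 0.0, 0.0])
print(detect_impulses([quiet, quiet, gust, quiet], dt))  # flags the sudden change
```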
41. A non-transitory computer-readable medium or media comprising one or more instructions which, when executed by one or more microprocessors, cause steps to be performed comprising:
receiving aviation information of the UAV;
receiving ambient air information of the UAV;
calculating wind vectors around the UAV based at least on the received power consumption, aviation information and the ambient air information; and
displaying the calculated wind vectors as a wind map on a graphic user interface (GUI).
42. The non-transitory computer-readable medium or media of claim 41, characterized by at least one of the following features:
(a) wherein the GUI is displayed as an interactive map accessible through a web browser;
(b) wherein the GUI further comprises the UAV, flight path of the UAV and a geographic background map; and
(c) wherein calculating wind vectors is done using a model preloaded with a profile of the UAV.
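Claims 41 and 42 display the calculated wind vectors as a wind map on a GUI, optionally as an interactive map accessible through a web browser. As one hedged example of how the vectors might be packaged for such a map, the sketch below emits a GeoJSON FeatureCollection a browser map layer could render as arrows; the property names are illustrative assumptions, not a defined interface.

```python
import json

def wind_map_geojson(samples):
    """Pack (lat, lon, wind_north, wind_east) samples into a GeoJSON
    FeatureCollection for display on a web-browser map."""
    features = []
    for lat, lon, w_n, w_e in samples:
        speed = (w_n ** 2 + w_e ** 2) ** 0.5
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"wind_north_ms": w_n, "wind_east_ms": w_e,
                           "speed_ms": round(speed, 2)},
        })
    return json.dumps({"type": "FeatureCollection", "features": features})

print(wind_map_geojson([(47.61, -122.33, 3.2, -1.1)]))
```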
PCT/US2016/039638 2015-06-26 2016-06-27 Robotic apparatus, systems, and related methods WO2016210432A1 (en)

Applications Claiming Priority (22)

Application Number Priority Date Filing Date Title
US201562185098P 2015-06-26 2015-06-26
US62/185,098 2015-06-26
US201562188855P 2015-07-06 2015-07-06
US62/188,855 2015-07-06
US201562191009P 2015-07-10 2015-07-10
US62/191,009 2015-07-10
US201562191670P 2015-07-13 2015-07-13
US62/191,670 2015-07-13
US201562193854P 2015-07-17 2015-07-17
US62/193,854 2015-07-17
US201562194897P 2015-07-21 2015-07-21
US62/194,897 2015-07-21
US201562201394P 2015-08-05 2015-08-05
US62/201,394 2015-08-05
US201562218696P 2015-09-15 2015-09-15
US62/218,696 2015-09-15
US14/924,475 2015-10-27
US14/924,475 US9444544B1 (en) 2015-07-13 2015-10-27 Unmanned vehicle communication through short message service
US201562247918P 2015-10-29 2015-10-29
US62/247,918 2015-10-29
US201562255413P 2015-11-14 2015-11-14
US62/255,413 2015-11-14

Publications (1)

Publication Number Publication Date
WO2016210432A1 true WO2016210432A1 (en) 2016-12-29

Family

ID=57586490

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/039638 WO2016210432A1 (en) 2015-06-26 2016-06-27 Robotic apparatus, systems, and related methods

Country Status (1)

Country Link
WO (1) WO2016210432A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070093946A1 (en) * 2004-10-22 2007-04-26 Proxy Aviation Systems, Inc. Methods and apparatus for unmanned vehicle command, control, and communication
US8838289B2 (en) * 2006-04-19 2014-09-16 Jed Margolin System and method for safely flying unmanned aerial vehicles in civilian airspace
US8515609B2 (en) * 2009-07-06 2013-08-20 Honeywell International Inc. Flight technical control management for an unmanned aerial vehicle
US8965598B2 (en) * 2010-09-30 2015-02-24 Empire Technology Development Llc Automatic flight control for UAV based solid modeling
US20140018979A1 (en) * 2012-07-13 2014-01-16 Honeywell International Inc. Autonomous airspace flight planning and virtual airspace containment system
US20140277854A1 (en) * 2013-03-15 2014-09-18 Azure Sky Group Llc Modular drone and methods for use

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10952159B2 (en) 2017-03-30 2021-03-16 At&T Intellectual Property I, L.P. Altitude based device management in a next generation wireless communications system
US11595915B2 (en) 2017-03-30 2023-02-28 At&T Intellectual Property I, L.P. Altitude based device management in a wireless communications system
US10455520B2 (en) 2017-03-30 2019-10-22 At&T Intellectual Property I, L.P. Altitude based device management in a wireless communications system
US11188866B2 (en) 2017-05-08 2021-11-30 Wing Aviation Llc Methods and systems for requesting and displaying UAV information
US10467578B2 (en) 2017-05-08 2019-11-05 Wing Aviation Llc Methods and systems for requesting and displaying UAV information
CN107390714A (en) * 2017-06-13 2017-11-24 深圳市易成自动驾驶技术有限公司 Control method, device and the computer-readable recording medium of unmanned plane
CN110998695A (en) * 2017-08-04 2020-04-10 意造科技私人有限公司 UAV system emergency path planning during communication failure
US11403814B2 (en) 2017-08-04 2022-08-02 Walmart Apollo, Llc Systems, devices, and methods for generating a dynamic three dimensional communication map
WO2019057464A1 (en) * 2017-09-20 2019-03-28 Robert Bosch Gmbh Transport system
CN111316182A (en) * 2017-11-03 2020-06-19 IPCom两合公司 Access-enabling unmanned aerial vehicle
RU2791630C2 (en) * 2017-11-03 2023-03-13 АйПиКОМ ГМБХ УНД КО. КГ Provision of access to unmanned aerial vehicles
WO2019086335A1 (en) 2017-11-03 2019-05-09 Ipcom Gmbh & Co. Kg Allowing access to unmanned aerial vehicles
CN111316182B (en) * 2017-11-03 2024-04-30 IPCom两合公司 Access-enabled unmanned aerial vehicle
CN109155819A (en) * 2017-12-27 2019-01-04 深圳市大疆创新科技有限公司 Camera system, bogey, filming apparatus and their control method
US11235875B2 (en) 2019-05-08 2022-02-01 Ford Global Technologies, Llc Zone-based unmanned aerial vehicle landing systems and methods
WO2020237458A1 (en) * 2019-05-27 2020-12-03 深圳市大疆创新科技有限公司 Flight control method, control terminal, and unmanned aerial vehicle
US11900823B2 (en) 2019-09-13 2024-02-13 Honeywell International Inc. Systems and methods for computing flight controls for vehicle landing
US11355022B2 (en) * 2019-09-13 2022-06-07 Honeywell International Inc. Systems and methods for computing flight controls for vehicle landing
CN111413671B (en) * 2020-04-28 2023-11-07 苏州碧林威智能科技有限公司 LiFi technology-based multi-PD positioning method and system
CN111413671A (en) * 2020-04-28 2020-07-14 苏州碧林威智能科技有限公司 L iFi technology-based multi-PD positioning method and system
CN111459172A (en) * 2020-05-20 2020-07-28 中国北方车辆研究所 Autonomous navigation system of boundary security unmanned patrol car
RU199157U1 (en) * 2020-05-21 2020-08-19 Публичное акционерное общество "Межрегиональная распределительная сетевая компания Центра" AUTOMATIC BASE STATION FOR UNMANNED AIRCRAFT
CN111976983A (en) * 2020-08-26 2020-11-24 天津市特种设备监督检验技术研究院(天津市特种设备事故应急调查处理中心) Method for detecting boiler by using unmanned aerial vehicle
WO2022114847A1 (en) * 2020-11-27 2022-06-02 삼성전자주식회사 Electronic device and control method therefor
WO2022247498A1 (en) * 2021-05-27 2022-12-01 北京三快在线科技有限公司 Unmanned aerial vehicle monitoring
CN113543066B (en) * 2021-06-07 2023-11-03 北京邮电大学 Integrated interaction and multi-target emergency networking method and system for sensing communication guide finger
CN113543066A (en) * 2021-06-07 2021-10-22 北京邮电大学 Sensory-guidance integrated interaction and multi-target emergency networking method and system
CN114495402B (en) * 2022-02-07 2023-06-27 阿维塔科技(重庆)有限公司 Dangerous area warning method and magnetic adsorption equipment
CN114495402A (en) * 2022-02-07 2022-05-13 阿维塔科技(重庆)有限公司 Dangerous area warning method and magnetic adsorption equipment
WO2024081451A3 (en) * 2022-03-02 2024-08-15 Skygrid, Llc Displaying electromagnetic spectrum information for unmanned aerial vehicle (uav) navigation

Similar Documents

Publication Publication Date Title
WO2016210432A1 (en) Robotic apparatus, systems, and related methods
US20240038081A1 (en) Transportation using network of unmanned aerial vehicles
US11868131B2 (en) Flight path determination
US11610493B1 (en) Unmanned aerial vehicles utilized to collect updated travel related data for deliveries
JP6904608B2 (en) Devices and methods for generating flight restriction zones along boundaries or for assessing the aerial response of unmanned aerial vehicles (UAVs) to flight restriction zones.
US11287835B2 (en) Geo-fiducials for UAV navigation
US11874676B2 (en) Cooperative unmanned autonomous aerial vehicles for power grid inspection and management
US10131451B2 (en) Pre-flight self test for unmanned aerial vehicles (UAVs)
US11066184B2 (en) Automated recovery system for unmanned aircraft
AU2017241363B2 (en) Vision Based Calibration System For Unmanned Aerial Vehicles
US20200026720A1 (en) Construction and update of elevation maps
US9262929B1 (en) Ground-sensitive trajectory generation for UAVs
CN108290633A (en) The method and system transported using unmanned aviation carrier
CN105807784A (en) Distributed, unmanned aerial vehicle package transport network
EP4235344A1 (en) A system for repositioning uav swarm
Mugala et al. Leveraging the technology of unmanned aerial vehicles for developing countries
JP2023126465A (en) Electric power supply information determination device, electric power supply information determination method, and program
Lu et al. Research on trajectory planning in thunderstorm weather based on dynamic window algorithm during approach segment
KR20230100833A (en) Flying object and autonomous flight control system with space rendering function
Sakalle et al. The Internet of Drones for Enhancing Service Quality in Smart Cities
CN116795132A (en) Express delivery transportation method and device of unmanned aerial vehicle and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16815500

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23/04/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16815500

Country of ref document: EP

Kind code of ref document: A1