US20170225800A1 - Visual landing aids for unmanned aerial systems - Google Patents
Visual landing aids for unmanned aerial systems
- Publication number
- US20170225800A1 (U.S. application Ser. No. 15/017,263)
- Authority
- US
- United States
- Prior art keywords
- shapes
- landing
- visual
- pattern
- landing aid
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64F—GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
- B64F1/00—Ground or aircraft-carrier-deck installations
- B64F1/18—Visual or acoustic landing aids
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C19/00—Aircraft control not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64F—GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
- B64F1/00—Ground or aircraft-carrier-deck installations
- B64F1/007—Helicopter portable landing pads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U70/00—Launching, take-off or landing arrangements
- B64U70/80—Vertical take-off or landing, e.g. using rockets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/176—Urban or other man-made structures
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0017—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
- G08G5/0021—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0017—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
- G08G5/0026—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located on the ground
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/0069—Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/02—Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/02—Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
- G08G5/025—Navigation or guidance aids
-
- B64C2201/141—
-
- B64C2201/18—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
Definitions
- the present disclosure relates generally to landing systems for unmanned aerial vehicles.
- visual landing aids for camera-equipped unmanned aerial vehicles are described.
- GPS positioning is subject to potential loss of satellite signal, and to be fully effective requires the loading of a terrain database into the unmanned aerial vehicle's flight controller. Without the terrain database, the unmanned aerial vehicle will not know its height above ground in order to make a correct landing and, in any event, can suffer limited precision.
- Laser or radar range finding requires additional equipment on the unmanned aerial vehicle which diminishes payload, and while providing accurate vertical guidance, does not by itself provide for homing into a particular designated landing target. Human intervention and manual control, while being accurate, is subject to pilot error and precludes a fully autonomous mission profile.
- conventional landing targets may be subject to false positives and/or may require special hardware for detection.
- Landing aids that use radio beacons may be subject to jamming or glitches.
- the present disclosure is directed to a visual landing aid comprised of a series of circles and polygons for unmanned aerial vehicles that is capable of being accurately detected over a wide range of angles and distances by an unmanned aerial vehicle equipped with a camera and shape detection capabilities.
- the visual landing aid may be implemented using contrasting colors for the pattern which reflect visible and/or UV or infrared light, or by light emitting elements.
- the landing aid includes a secondary smaller version of the landing aid shape pattern that is embedded within the larger pattern, to enable greater detection range while facilitating close-in precision guidance.
- the light emitting elements may be pulsed at a rate that is synchronized with the camera shutter on the unmanned aerial vehicle to further enhance accurate detection.
- FIGS. 1A through 1F are diagrams of various arrangements of visual landing aids.
- FIG. 2A is a diagram of a second arrangement for a visual landing aid, showing a secondary precision target pattern embedded in the primary pattern that is presented in reverse contrast.
- FIG. 2B is a diagram of the second arrangement for a visual landing aid depicted in FIG. 2A , showing the secondary precision target pattern embedded in the primary pattern presented in the same contrast as the primary pattern.
- FIG. 3 is a perspective view of an unmanned aerial vehicle (UAV) in a flight setting where the visual landing aid can be used to guide the UAV in for a landing.
- FIG. 4 is a block diagram of the components on board a UAV for image processing and detection of a visual landing aid to be used to guide the UAV relative to the position of the visual landing aid.
- Landing aid 100 functions to provide a fixed ground reference useful by an unmanned aerial vehicle that is in flight to effect a precision landing.
- Landing aid 100 is designed to be detectable by a wide range of camera resolutions, at a wide range of distances, and in any orientation.
- landing aid 100 is intended to allow an unmanned aerial vehicle to determine its orientation with respect to landing aid 100 .
- the reader will appreciate from the figures and description below that landing aid 100 addresses shortcomings of conventional visual landing aids.
- landing aid 100 does not require the unmanned aerial vehicle to rely upon a GPS database or onboard range finding equipment.
- Landing aid 100 can allow an unmanned aerial vehicle to guide itself into a landing point using only an onboard camera, which is typically carried by unmanned aerial vehicles, and a processing module capable of performing shape detection and interfacing with the unmanned aerial vehicle's flight controller. Further, landing aid 100 allows the unmanned aerial vehicle to land itself precisely upon or near landing aid 100 completely autonomously, without the need for human intervention. Landing aid 100 , when implemented as a reflective target, is not itself subject to equipment failure, and the shapes and colors of landing aid 100 can be selected so as to minimize false positives.
- Landing aid 100 for unmanned aerial vehicles includes a background 102, upon which are located one or more shapes 104 in a color that contrasts with the background. Shapes 104 are arranged in a mathematically verifiable pattern that is useful for determining the orientation and distance of an unmanned aerial vehicle (UAV) 300 relative to the landing aid 100.
- the shapes 104 are capable of being detected by a camera and then recognized by an image processing system on board UAV 300 .
- landing aid 100 may include additional or alternative features, such as a pulsed light source and possibly a beacon for synchronizing with a camera located onboard UAV 300 .
- shapes 104 comprise at a minimum a plurality of circles, and can optionally include one or more polygons 108 , typically implemented as a rectangle, as seen in FIGS. 1A and 1F , or as one or more conjoined rectangles, as can be seen in FIGS. 1C and 1D .
- Each shape 104 has a minimum height 106 , and is accordingly separated from all other shapes 104 by at least minimum height 106 , or preferably some integer multiple of minimum height 106 . It is seen in the figures that the various shapes are arranged in an approximate grid-like fashion, and in a pattern that is unique from all orientations, viz. there is no mirroring of the pattern.
- This arrangement allows for ready detection by UAV 300 and further allows UAV 300 to determine its orientation relative to landing aid 100 .
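The requirement that the pattern be unique from all orientations (no rotational symmetry, no mirroring) can be checked programmatically. The sketch below is illustrative and not from the disclosure: it models shapes as cells on the grid described above and verifies that no rotation of the pattern coincides with another rotation, or with any rotation of its mirror image.

```python
def rotations(cells):
    """Yield the cell set rotated by 0, 90, 180 and 270 degrees."""
    pts = set(cells)
    for _ in range(4):
        yield frozenset(pts)
        pts = {(y, -x) for x, y in pts}

def normalize(cells):
    """Translate a cell set so its minimum corner sits at the origin."""
    mx = min(x for x, _ in cells)
    my = min(y for _, y in cells)
    return frozenset((x - mx, y - my) for x, y in cells)

def orientation_unique(cells):
    """True if all four rotations are distinct and none matches a mirror
    image, so every viewing orientation is distinguishable."""
    rots = [normalize(r) for r in rotations(cells)]
    if len(set(rots)) != 4:
        return False  # pattern has rotational symmetry
    mirror = {(-x, y) for x, y in cells}
    return all(normalize(m) not in rots for m in rotations(mirror))

# An L-shaped arrangement like the one in the discussed figures passes:
L_PATTERN = {(0, 0), (0, 2), (4, 0)}  # corner i, short-axis k, long-axis j
```

A pattern with 180-degree symmetry, such as two cells on a diagonal, would be rejected because an approaching UAV could not tell two orientations apart.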
- Orientation determination is useful if a landing approach is best effected from only certain directions, such as where high terrain or obstacles surround most of a landing site, or where landing systems such as hooks or nets are employed that require UAV 300 to approach from a particular direction to be snagged by the landing system.
- The selection of circles and polygons on landing aid 100 is intended to reduce false positives against most terrain.
- other types of shapes that are detectable by machine shape detection algorithms can be used for landing aid 100 without departing from the disclosed invention.
- other types of machine readable patterns such as bar codes, 2D matrices such as QR codes, and other types of tags may be integrated into landing aid 100 in order to reduce false positives, and to accurately and positively identify a desired landing target.
- landing aid 100 has background 102 rendered in white, with shapes 104 rendered in black, to provide maximum contrast.
- background 102 may be rendered in black, with shapes 104 rendered in white.
- background 102 and shapes 104 may be rendered in any combination of colors and/or shades that sufficiently contrast against each other as well as the surrounding background.
- background 102 may be colored in a contrasting color such as red or blue, with shapes 104 rendered in a color that contrasts against background 102 .
- landing aid 100 is colored in such a fashion to be clearly detectable against the surrounding environment.
- These various colors can be rendered using paint, ink, or any other material that is reflective of the desired colors in the visible light spectrum.
- the shape patterns on landing aid 100 can be rendered using any material or method capable of rendering the shape patterns in a sufficiently contrasting fashion that can be detected by a camera system on UAV 300 .
- landing aid 100 could be rendered using materials that are reflective of non-visible light, such as infrared or ultraviolet.
- infrared reflective material for shapes 104 would allow UAV 300 , when equipped with a suitable infrared-sensitive camera, to locate landing aid 100 in low-visibility or low-visible light conditions.
- landing aid 100 could be rendered using light emitting materials, such as lamps or LEDs, thereby allowing detection by and guidance of UAV 300 in night or total darkness conditions, where a landing aid 100 implemented using purely reflective materials would be undetectable. Where landing aid 100 is implemented using light emission, the light emission may be continuous or optionally pulsed, which will be addressed in greater detail herein.
- FIGS. 1A through 1F demonstrate multiple possible patterns of shapes, although it will be appreciated by a person skilled in the relevant art that possible patterns are not limited to the depicted examples.
- the use of varying patterns allows landing aid 100 to be tailored so as to clearly contrast with the surrounding environment.
- the flight controller of unmanned aerial vehicle 300 can be programmed to recognize and track in on a single pattern, thereby allowing several landing targets, each specific to a particular UAV 300, to be placed proximate to each other.
- Landing aid 100 can be scaled to any size, with larger sizes being detectable at greater ranges, but having a practical limit of needing to be within the field of view of whatever camera is installed upon UAV 300 . If landing aid 100 is too large, then UAV 300 will be unable to track in once landing aid 100 exceeds the field of view of UAV 300 's camera as UAV 300 draws near to landing aid 100 . The limit of potential detectability of landing aid 100 will depend upon the resolution of the camera installed upon UAV 300 .
- landing aid 100 is only detectable when shapes 104 each cover approximately four pixels of the imaging sensor of UAV 300's camera; if shapes 104 do not cover at least four pixels, they will become indistinct due to aliasing.
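The four-pixel floor translates directly into a maximum detection range for a given camera. The small-angle estimate below is an illustrative sketch, not from the disclosure; the field of view and resolution values are assumed example parameters.

```python
import math

def pixels_across(shape_size_m, distance_m, fov_deg, resolution_px):
    """Approximate number of pixels a shape of the given size spans at a
    given distance, using the small-angle approximation."""
    ifov = math.radians(fov_deg) / resolution_px  # angle seen by one pixel
    return shape_size_m / (distance_m * math.tan(ifov))

def max_detection_range(shape_size_m, fov_deg, resolution_px, min_px=2.0):
    """Farthest distance at which a shape still spans min_px pixels per
    side (2 px per side corresponds to the ~4-pixel coverage minimum)."""
    ifov = math.radians(fov_deg) / resolution_px
    return shape_size_m / (min_px * math.tan(ifov))
```

For example, a 0.5 m circle viewed by a 1920-pixel-wide camera with a 60-degree field of view remains distinct out to roughly 450 m under these assumptions.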
- Landing aid 200 includes many similar or identical features to landing aid 100 . Thus, for the sake of brevity, each feature of landing aid 200 will not be redundantly explained. Rather, key distinctions between landing aid 200 and landing aid 100 will be described in detail and the reader should reference the discussion above for features substantially similar between the two landing aids.
- landing aid 200 includes a background 202 with a series of shapes 204 set upon background 202 in contrasting colors. The configuration and contrasting colors and/or shades of shapes 204 are identical to landing aid 100 .
- landing aid 200 also includes a close range target 206 which is embedded within one of the shapes 204 , which is preferably rendered in the same color or shade of background 202 , thus causing close range target 206 to contrast with surrounding shape 204 .
- the pattern of close range target 206 is preferably identical to the pattern of shapes 204 , but scaled down sufficiently with respect to the size of landing aid 200 so as to become effective once UAV 300 approaches close enough so that landing aid 200 exceeds the field of view of a camera attached to UAV 300 .
- landing aid 200 can be made larger than would be possible with landing aid 100 , by essentially nesting subsequently smaller landing aids so that UAV 300 always has a target within the field of view of its camera until landing.
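The altitude at which a target of a given width overflows a downward-looking camera's field of view, and hence where each nested close range target must take over, can be estimated geometrically. The function names, the scale factor, and the 60-degree field of view below are illustrative assumptions, not values from the disclosure.

```python
import math

def fill_altitude(aid_width_m, fov_deg):
    """Altitude (looking straight down) at which a landing aid of the
    given width exactly fills the camera's field of view; below this the
    aid overflows the frame and a nested target must take over."""
    return aid_width_m / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

def handoff_schedule(outer_width_m, scale_factor, levels, fov_deg):
    """Handoff altitudes for a chain of nested targets, each scaled down
    by scale_factor relative to its parent."""
    return [fill_altitude(outer_width_m * scale_factor ** i, fov_deg)
            for i in range(levels)]
```

With a 4 m outer aid and a 60-degree field of view, the outer pattern overflows the frame below about 3.5 m, at which point a quarter-scale embedded target would still fit comfortably in view.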
- FIG. 2B shows a variation of the landing aid 200 depicted in FIG. 2A.
- Close range target 208 is rendered as a smaller version of landing aid 200 , with a small background and shapes that match the colors and/or shading of background 202 and shapes 204 .
- close range targets 206 and 208 can further have additional, smaller close range targets embedded within them, for even greater precision.
- close range targets 206 and 208 are depicted as having an identical pattern to landing aid 200 , this is not necessary; close range targets 206 and 208 can be different patterns, which can further signal a close range to UAV 300 , or possibly trigger additional landing preparations in UAV 300 , such as extending landing gear.
- close range targets 206 and 208 do not need to be centered within the middle of landing aid 200 .
- the close range target can be located anywhere upon landing aid 200 so long as it sufficiently contrasts against background 202 or shapes 204 , depending upon its location.
- close range targets 206 and 208 need not match the colors of background 202 and shapes 204 , but can be in additional different contrasting colors, as light emissive elements, or can be implemented to reflect or emit non-visible spectrum light such as UV or infrared.
- FIG. 3 depicts landing aid 100 in use with UAV 300 .
- UAV 300 is shown with a camera 302 , which is capable of detecting landing aid 100 , which in turn is rendered in a fashion as described above that is within the visual detection range of camera 302 .
- Camera 302 is able to locate landing aid 100 along angle of view 310.
- By centering landing aid 100 within the field of view of camera 302, UAV 300 can make a precise approach to landing by following angle of view 310, where landing aid 100 simply continually increases in size as UAV 300 approaches.
- the angle of camera 302 relative to landing aid 100 can be determined.
- Where UAV 300 is equipped with GPS location, its altitude above ground can be known.
- a right triangle can be visualized between landing aid 100 and UAV 300 .
- the distance to landing aid 100 can be approximated using well-known trigonometric techniques, such as the law of sines.
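The right-triangle relationship between the UAV's altitude and the camera's line of sight can be sketched as follows. The disclosure only notes that well-known trigonometric techniques apply; the helper below is a hypothetical illustration of one such calculation.

```python
import math

def slant_and_ground_distance(altitude_m, depression_deg):
    """Given the UAV's altitude above ground (the vertical leg of the
    right triangle) and the depression angle of the camera's line of
    sight to the landing aid, return the slant range (hypotenuse) and
    the horizontal offset (ground leg)."""
    theta = math.radians(depression_deg)
    slant = altitude_m / math.sin(theta)   # hypotenuse to the aid
    ground = altitude_m / math.tan(theta)  # horizontal leg to the aid
    return slant, ground
```

At 30 m altitude with the aid seen 30 degrees below the horizon, the slant range is 60 m and the horizontal offset about 52 m.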
- UAV 300 is any unmanned aerial vehicle that is equipped with electronics and guidance systems typical for its size, so long as UAV 300 is equipped with camera 302 .
- the size can range from small UAVs such as the DJI Phantom series of quadcopters (www.dji.com) to large scale UAV systems in use by military and government organizations, such as the General Atomics MQ-9 Reaper.
- the image detection system is comprised of a camera 402 , which is in data communication with an image processor 404 .
- Image processor 404 in turn feeds information about the location of landing aid 100 into flight controller 406 , which in turn controls the motors 408 and/or flight controls 410 of UAV 300 so as to guide UAV 300 relative to landing aid 100 .
- Camera 402 is any standard camera that is suitable for being attached to UAV 300 and is consistent with UAV 300's payload parameters. Camera 402 and landing aid 100 must be compatible insofar as camera 402 must be capable of imaging landing aid 100.
- Camera 402 accordingly may be sensitive to infrared, ultraviolet, visible light, or a combination of the foregoing, as appropriate to how the shapes are rendered on landing aid 100 .
- Camera 402 may use CCD, CMOS, or any other suitable imaging technology now known or later developed.
- Image processor 404 receives a video stream from camera 402 , and performs an initial shape detection upon the video stream, specifically to identify circles and polygons.
- Shape detection may be carried out using any suitable algorithm now known or later developed in the relevant art that is capable of conveying the geometry and location in the field of view of each detected shape, such as convolution, correlation, edge detection, Hough transform, or a combination of techniques. The selection of technique or techniques can be tailored to the processing power of image processor 404 and the relative needs for speed and accuracy.
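As one concrete instance of the techniques listed, a bare-bones Hough transform for circles of a known radius might look like the sketch below. This is illustrative only, with assumed parameter values; a production image processor would use an optimized library implementation and search over a range of radii.

```python
import numpy as np

def hough_circle_centers(edges, radius, threshold=0.5):
    """Minimal Hough-transform sketch for circles of a known radius:
    every edge pixel votes for all candidate centers one radius away,
    and accumulator cells whose votes exceed `threshold` (as a fraction
    of the angular samples) are reported as detected centers."""
    h, w = edges.shape
    acc = np.zeros((h, w), dtype=np.int32)
    angles = np.linspace(0.0, 2.0 * np.pi, 90, endpoint=False)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        cy = np.round(y - radius * np.sin(angles)).astype(int)
        cx = np.round(x - radius * np.cos(angles)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)  # unbuffered vote accumulation
    peak = threshold * len(angles)
    return [tuple(p) for p in np.argwhere(acc >= peak)]
```

Feeding the detector a synthetic edge image containing a circle recovers that circle's center, which is the kind of geometry-plus-location output the pattern-matching stage needs.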
- a second pattern matching algorithm is carried out to detect the presence of landing aid 100 within camera 402's field of view, to ensure that identification is accurate, and not a false positive.
- a further step may be carried out for certain implementations of landing aid 100 that are illuminated for night or low-light detection.
- Landing aid 100 can be implemented with a pulsed light source that repeatedly flashes. By synchronizing the flashing of landing aid 100 with the frame rate of camera 402 at a ratio of two to one, landing aid 100 can be made to appear in only every other frame of the video stream from camera 402. Detection of landing aid 100 can then be accomplished by simple comparison between frames, with the difference between frames revealing the position of landing aid 100.
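The every-other-frame scheme reduces detection to frame differencing: the static scene cancels and only the pulsing target remains. The minimal sketch below is illustrative; the intensity threshold is an assumed tuning parameter.

```python
import numpy as np

def locate_pulsed_target(frame_on, frame_off, threshold=50):
    """With the landing aid's light pulsed at half the camera frame rate,
    the aid appears in every other frame; differencing consecutive frames
    cancels the static background and leaves only the pulsing target.
    Returns the centroid (row, col) of the difference, or None if no
    pixel changed by more than `threshold`."""
    diff = np.abs(frame_on.astype(np.int16) - frame_off.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())
```

Two identical frames yield no detection; adding a bright pulsed blob to one frame yields that blob's centroid, which can then be handed to the pattern-verification stage.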
- Landing aid 100 and camera 402 can be synchronized by reference to an external time base, such as a GPS receiver installed on both UAV 300 and landing aid 100 that can provide a common synchronized time base.
- other synchronizing methods may also be possible, including a radio beacon either on UAV 300 or landing aid 100 that signals shutter actuation or light pulsing, respectively. These latter methods are useful where camera 402 may have a variable frame rate. It will be understood by a person skilled in the relevant art that using GPS as a common time base for synchronization alone, apart from a real-time exchange of information between UAV 300 and landing aid 100, will require a predetermined common frame and pulse rate.
- the L-shaped arrangement of the circles is detected.
- Each of the three circles corresponds to a point i, j, and k: point i is at the corner of the L-shape, point j is located along the long axis of the L-shape, and point k is located along the short axis.
- It is observed that the example landing aid pattern is arranged with a 1:2 ratio, where point k is located two diameters of a circle from point i, and point j is located four diameters from point i.
- the negated version of the relationship equation is useful for detecting landing aid 100 when the L-shape is presented in reverse.
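The relationship equation itself is not reproduced in this excerpt, but one consistent with the stated geometry can be sketched: verify the two-diameter and four-diameter spacings from point i, require the two axes to be near-perpendicular, and use the sign of a cross product to distinguish the normal pattern from its mirrored ("negated") presentation. The function name and tolerance below are illustrative assumptions.

```python
import math

def check_l_pattern(i, j, k, diameter, tol=0.15):
    """Verify that circle centers i, j, k form the 1:2 L-pattern
    described above: k two diameters from i along the short axis, j four
    diameters from i along the long axis, axes perpendicular. Returns +1
    for the normal pattern, -1 for its mirror image (the negated
    relationship), or 0 if the geometry does not match."""
    ijx, ijy = j[0] - i[0], j[1] - i[1]
    ikx, iky = k[0] - i[0], k[1] - i[1]
    d_ij = math.hypot(ijx, ijy)
    d_ik = math.hypot(ikx, iky)
    if abs(d_ij - 4 * diameter) > tol * 4 * diameter:
        return 0
    if abs(d_ik - 2 * diameter) > tol * 2 * diameter:
        return 0
    dot = ijx * ikx + ijy * iky
    if abs(dot) > tol * d_ij * d_ik:  # axes must be near-perpendicular
        return 0
    cross = ijx * iky - ijy * ikx     # sign encodes the handedness
    return 1 if cross > 0 else -1
```

The handedness check is what prevents a mirrored detection from being mistaken for the normal orientation, preserving the no-mirroring property of the pattern.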
- the foregoing algorithm is specific to the examples in the listed figures.
- the parameters of the foregoing algorithm will be varied to accommodate the differing spacing between circles in other examples, such as seen in FIGS. 1A, 1B, 1D, 1E and 1F, or any other patterns devised for landing aid 100. Varying the detection parameters of the foregoing algorithm allows for discrimination and isolation of one particular pattern when multiple targets are within camera 402's field of view.
- image processor 404 can pass directional information to flight controller 406 for directing UAV 300 relative to landing aid 100 .
- Such information can be used to direct UAV 300 to perform a predetermined flight path, such as reorienting UAV 300 to a certain position relative to landing aid 100 , and automatically bringing UAV 300 in to land.
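The directional hand-off from image processor 404 to flight controller 406 can be as simple as normalizing the target's pixel offset from the image center. The mapping below, with a hypothetical gain parameter, is one minimal sketch and not the disclosure's implementation.

```python
def steering_correction(target_px, frame_size, gain=1.0):
    """Convert the detected landing-aid position in the camera frame
    into normalized roll/pitch corrections for the flight controller:
    the error is the target's offset from the image center, scaled and
    clamped to [-1, 1]. (gain is an illustrative tuning parameter.)"""
    cx, cy = frame_size[0] / 2.0, frame_size[1] / 2.0
    roll = gain * (target_px[0] - cx) / cx    # + : target right of center
    pitch = gain * (target_px[1] - cy) / cy   # + : target below center
    return max(-1.0, min(1.0, roll)), max(-1.0, min(1.0, pitch))
```

A centered target produces zero correction; a target at a frame corner saturates both axes, driving the UAV to re-center the landing aid in the field of view.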
- Flight controller 406 can be a commercial off-the-shelf flight controller, such as the open source ArduPilotMega or DJI's Naza line of flight controllers, or can be custom developed to interact with image processor 404 in a more complex fashion.
- the algorithms that flight controller 406 uses to direct UAV 300 are well known in the art, and any such algorithms now known or later developed may be utilized, depending upon the mission parameters for UAV 300 .
- the disclosed visual landing aids can be utilized for purposes other than just landing.
- a UAV 300 can navigate with respect to the fixed point of reference so long as the landing aid is visible within UAV 300 's camera.
- a series of landing aids could be supplied at various points along a predetermined course, which could then be used by UAV 300 to navigate the predetermined course.
- UAV 300 could be programmed to perform different actions as it approaches each consecutive different landing aid.
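This waypoint-style use can be sketched as a simple sequencer that fires a per-marker action as each consecutive landing aid is reached. The class below is entirely illustrative and not part of the disclosure.

```python
class CourseNavigator:
    """Illustrative sketch: advance through a predetermined sequence of
    visual markers along a course, triggering a per-marker action when
    each one is reached."""

    def __init__(self, actions):
        self.actions = list(actions)  # one callable per marker, in order
        self.index = 0

    def marker_reached(self):
        """Call when the current marker satisfies the detection
        criterion; returns True while more markers remain."""
        if self.index < len(self.actions):
            self.actions[self.index]()
            self.index += 1
        return self.index < len(self.actions)
```

Actions might include extending landing gear at the final marker, or simply logging passage of an intermediate one.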
- the disclosed visual landing aids have applicability beyond UAVs, and can be deployed any place a visual target that is readily identifiable and trackable by machine vision is desired.
- Such applications could include space and maritime docking procedures, in-air refueling, and assistance to landing manned aircraft, to name a few possible alternative applications.
Abstract
Description
- The present disclosure relates generally to landing systems for unmanned aerial vehicles. In particular, visual landing aids for camera-equipped unmanned aerial vehicles are described.
- Known systems and methods for landing for unmanned aerial vehicles are not entirely satisfactory for the range of applications in which they are employed. For example, existing systems and methods typically employ either GPS positioning, some type of laser or radar range finding, human intervention or manual control, or some combination of the foregoing. GPS positioning is subject to potential loss of satellite signal, and to be fully effective requires the loading of a terrain database into the unmanned aerial vehicle's flight controller. Without the terrain database, the unmanned aerial vehicle will not know its height above ground in order to make a correct landing and, in any event, can suffer limited precision. Laser or radar range finding requires additional equipment on the unmanned aerial vehicle which diminishes payload, and while providing accurate vertical guidance, does not by itself provide for homing into a particular designated landing target. Human intervention and manual control, while being accurate, is subject to pilot error and precludes a fully autonomous mission profile.
- Finally, conventional landing targets may be subject to false positives and/or may require special hardware for detection. Landing aids that use radio beacons may be subject to jamming or glitches.
- Thus, there exists a need for landing aids that improve upon and advance the design of known systems and methods for landing unmanned aerial vehicles. Examples of new and useful landing aids relevant to the needs existing in the field are discussed below.
- Disclosure addressing one or more of the identified existing needs is provided in the detailed description below. Examples of references relevant to visual landing aids include U.S. Patent References: U.S. Pat. No. 7,705,879 and patent application publication US20090009596. The complete disclosures of the above patents and patent applications are herein incorporated by reference for all purposes.
- The present disclosure is directed to a visual landing aid comprised of a series of circles and polygons for unmanned aerial vehicles that is capable of being accurately detected over a wide range of angles and distances by an unmanned aerial vehicle equipped with a camera and shape detection capabilities. The visual landing aid may be implemented using contrasting colors for the pattern which reflect visible and/or UV or infrared light, or by light emitting elements. In some examples, the landing aid includes a secondary smaller version of the landing aid shape pattern that is embedded within the larger pattern, to enable greater detection range while facilitating close-in precision guidance. In still further examples, the light emitting elements may be pulsed at a rate that is synchronized with the camera shutter on the unmanned aerial vehicle to further enhance accurate detection.
-
FIGS. 1A through 1F are diagrams of various arrangements of visual landing aids. -
FIG. 2A is a diagram of a second arrangement for a visual landing aid, showing a secondary precision target pattern embedded in the primary pattern that is presented in reverse contrast. -
FIG. 2B is a diagram of the second arrangement for a visual landing aid depicted in FIG. 2A, showing the secondary precision target pattern embedded in the primary pattern presented in the same contrast as the primary pattern. -
FIG. 3 is a perspective view of an unmanned aerial vehicle (UAV) in a flight setting where the visual landing aid can be used to guide the UAV in for a landing. -
FIG. 4 is a block diagram of the components on board a UAV for image processing and detection of a visual landing aid to be used to guide the UAV relative to the position of the visual landing aid. - The disclosed visual landing aids will become better understood through review of the following detailed description in conjunction with the figures. The detailed description and figures provide merely examples of the various inventions described herein. Those skilled in the art will understand that the disclosed examples may be varied, modified, and altered without departing from the scope of the inventions described herein. Many variations are contemplated for different applications and design considerations; however, for the sake of brevity each and every contemplated variation is not individually described in the following detailed description.
- Throughout the following detailed description, examples of various visual landing aids are provided. Related features in the examples may be identical, similar, or dissimilar in different examples. For the sake of brevity, related features will not be redundantly explained in each example. Instead, the use of related feature names will cue the reader that the feature with a related feature name may be similar to the related feature in an example explained previously. Features specific to a given example will be described in that particular example. The reader should understand that a given feature need not be the same or similar to the specific portrayal of a related feature in any given figure or example.
- With reference to
FIGS. 1A-4, a first example of a visual landing aid, landing aid 100, will now be described. Landing aid 100 functions to provide a fixed ground reference usable by an unmanned aerial vehicle in flight to effect a precision landing. Landing aid 100 is designed to be detectable by a wide range of camera resolutions, at a wide range of distances, and in any orientation. Moreover, landing aid 100 is intended to allow an unmanned aerial vehicle to determine its orientation with respect to landing aid 100. The reader will appreciate from the figures and description below that landing aid 100 addresses shortcomings of conventional visual landing aids. - For example, by providing a predetermined and fixed landing point on the ground,
landing aid 100 does not require the unmanned aerial vehicle to rely upon a GPS database or onboard range-finding equipment. Landing aid 100 can allow an unmanned aerial vehicle to guide itself to a landing point using only an onboard camera, which unmanned aerial vehicles typically carry, and a processing module capable of performing shape detection and interfacing with the unmanned aerial vehicle's flight controller. Further, landing aid 100 allows the unmanned aerial vehicle to land itself precisely upon or near landing aid 100 completely autonomously, without the need for human intervention. Landing aid 100, when implemented as a reflective target, is not itself subject to equipment failure, and the shapes and colors of landing aid 100 can be selected so as to minimize false positives. -
Landing aid 100 for unmanned aerial vehicles (UAVs) includes a background 102, upon which are located one or more shapes 104 in a color that contrasts with the background. Shapes 104 are arranged in a mathematically verifiable pattern that is useful for determining the orientation and distance of an unmanned aerial vehicle (UAV) 300 relative to landing aid 100. The shapes 104 are capable of being detected by a camera and then recognized by an image processing system onboard UAV 300. In other examples, landing aid 100 may include additional or alternative features, such as a pulsed light source and possibly a beacon for synchronizing with a camera located onboard UAV 300. - As can be seen in
FIGS. 1A through 1F, shapes 104 comprise at a minimum a plurality of circles, and can optionally include one or more polygons 108, typically implemented as a rectangle, as seen in FIGS. 1A and 1F, or as one or more conjoined rectangles, as can be seen in FIGS. 1C and 1D. Each shape 104 has a minimum height 106, and is accordingly separated from all other shapes 104 by at least minimum height 106, or preferably some integer multiple of minimum height 106. As seen in the figures, the various shapes are arranged in an approximate grid-like fashion, and in a pattern that is unique from all orientations, viz. there is no mirroring of the pattern. This arrangement allows for ready detection by UAV 300 and further allows UAV 300 to determine its orientation relative to landing aid 100. Orientation determination is useful if a landing approach is best effected from only certain directions, such as where high terrain or obstacles surround most of a landing site, or where landing systems such as hooks or nets are employed that require UAV 300 to approach from a particular direction to be snagged by the landing system. - The selection of circles and polygons on landing
aid 100 is intended to reduce false positives against most terrain. However, other types of shapes that are detectable by machine shape detection algorithms can be used for landing aid 100 without departing from the disclosed invention. Moreover, other types of machine-readable patterns, such as bar codes, 2D matrices such as QR codes, and other types of tags may be integrated into landing aid 100 in order to reduce false positives, and to accurately and positively identify a desired landing target. - One conventional implementation of
landing aid 100 has background 102 rendered in white, with shapes 104 rendered in black, to provide maximum contrast. Alternatively, where landing aid 100 is located within a predominantly light background, background 102 may be rendered in black, with shapes 104 rendered in white. Still further, background 102 and shapes 104 may be rendered in any combination of colors and/or shades that sufficiently contrast against each other as well as the surrounding background. For example, where a landing aid 100 is placed within a predominantly green background, background 102 may be colored in a contrasting color such as red or blue, with shapes 104 rendered in a color that contrasts against background 102. Thus, landing aid 100 is colored in such a fashion as to be clearly detectable against the surrounding environment. These various colors can be rendered using paint, ink, or any other material that is reflective of the desired colors in the visible light spectrum. - In addition to materials capable of reflecting and/or absorbing visible light, the shape patterns on
landing aid 100 can be rendered using any material or method capable of rendering the shape patterns in a sufficiently contrasting fashion that can be detected by a camera system on UAV 300. For example, landing aid 100 could be rendered using materials that are reflective of non-visible light, such as infrared or ultraviolet. Using infrared-reflective material for shapes 104 would allow UAV 300, when equipped with a suitable infrared-sensitive camera, to locate landing aid 100 in low-visibility or low-visible-light conditions. Furthermore, landing aid 100 could be rendered using light-emitting materials, such as lamps or LEDs, thereby allowing detection by and guidance of UAV 300 at night or in total darkness, where a landing aid 100 implemented using purely reflective materials would be undetectable. Where landing aid 100 is implemented using light emission, the light emission may be continuous or optionally pulsed, which will be addressed in greater detail herein. -
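Pulsed emission lends itself to a simple detection scheme, elaborated later in this description: if the light flashes at half the camera's frame rate, subtracting consecutive frames cancels static scenery and leaves only the beacon's pixels. The following is a hypothetical pure-Python sketch of that comparison, assuming 8-bit grayscale frames represented as nested lists; the patent does not prescribe an implementation.

```python
def detect_pulsed_beacon(frame_a, frame_b, threshold=64):
    # With the light pulsed at half the camera frame rate, the beacon
    # is lit in one frame and dark in the next; static scenery cancels
    # in the difference, leaving only the beacon's pixels.
    h, w = len(frame_a), len(frame_a[0])
    hits = [(r, c)
            for r in range(h) for c in range(w)
            if abs(frame_a[r][c] - frame_b[r][c]) >= threshold]
    if not hits:
        return None
    # The centroid of the differing pixels approximates the beacon position.
    return (sum(r for r, _ in hits) / len(hits),
            sum(c for _, c in hits) / len(hits))
```

At a 2:1 flash-to-frame-rate ratio, any pair of adjacent frames contains one lit and one dark view of the beacon, so the same comparison works regardless of which frame the pulse lands in.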
FIGS. 1A through 1F demonstrate multiple possible patterns of shapes, although it will be appreciated by a person skilled in the relevant art that possible patterns are not limited to the depicted examples. The use of varying patterns allows landing aid 100 to be tailored so as to clearly contrast with the surrounding environment. Furthermore, the flight controller of unmanned aerial vehicle 300 can be programmed to recognize and track in on a single pattern, thereby allowing several landing targets, each specific to a particular UAV 300, to be placed proximate to each other. -
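The requirement stated above, that a pattern be unique from all orientations with no mirroring, can be checked mechanically. A hypothetical sketch, assuming the shapes are reduced to occupied cells of the approximate grid described earlier (this representation is an assumption, not part of the patent's disclosure):

```python
def normalize(cells):
    # Translate a set of (row, col) cells so the pattern starts at the origin.
    rmin = min(r for r, _ in cells)
    cmin = min(c for _, c in cells)
    return frozenset((r - rmin, c - cmin) for r, c in cells)

def rotations(cells):
    # The four 90-degree rotations of a cell pattern, each normalized.
    out, cur = [], set(cells)
    for _ in range(4):
        out.append(normalize(cur))
        cur = {(c, -r) for r, c in cur}
    return out

def orientation_unique(cells):
    # True when all four rotations are mutually distinct (so the UAV can
    # always tell its heading) and no rotation coincides with a mirrored
    # rotation (so there is no mirroring of the pattern).
    rots = rotations(cells)
    mirrored = rotations({(r, -c) for r, c in cells})
    return len(set(rots)) == 4 and len(set(rots + mirrored)) == 8
```

An asymmetric L of circles with arms of different lengths passes this check, while any rotationally symmetric layout, such as a plain square of cells, fails it.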
Landing aid 100 can be scaled to any size, with larger sizes being detectable at greater ranges, but with the practical limit that it must remain within the field of view of whatever camera is installed upon UAV 300. If landing aid 100 is too large, then UAV 300 will be unable to track in once landing aid 100 exceeds the field of view of UAV 300's camera as UAV 300 draws near to landing aid 100. The limit of potential detectability of landing aid 100 will depend upon the resolution of the camera installed upon UAV 300. In keeping with Nyquist's Theorem, landing aid 100 is only detectable when shapes 104 each cover approximately four pixels of the imaging sensor of UAV 300's camera; if shapes 104 do not cover at least four pixels, they will become indistinct due to aliasing. Thus, the higher the resolution of UAV 300's camera, the smaller the associated pixels, and the greater the range of potential detection of landing aid 100. - Turning attention to
FIGS. 2A and 2B, a second variation of landing aid 100 will now be described. Landing aid 200 includes many features similar or identical to landing aid 100. Thus, for the sake of brevity, each feature of landing aid 200 will not be redundantly explained. Rather, key distinctions between landing aid 200 and landing aid 100 will be described in detail, and the reader should reference the discussion above for features substantially similar between the two landing aids. - As can be seen in
FIG. 2A, landing aid 200 includes a background 202 with a series of shapes 204 set upon background 202 in contrasting colors. The configuration and contrasting colors and/or shades of shapes 204 are identical to landing aid 100. However, landing aid 200 also includes a close range target 206, which is embedded within one of the shapes 204 and is preferably rendered in the same color or shade as background 202, thus causing close range target 206 to contrast with the surrounding shape 204. The pattern of close range target 206 is preferably identical to the pattern of shapes 204, but scaled down sufficiently with respect to the size of landing aid 200 so as to become effective once UAV 300 approaches close enough that landing aid 200 exceeds the field of view of a camera attached to UAV 300. Thus, landing aid 200 can be made larger than would be possible with landing aid 100, by essentially nesting successively smaller landing aids so that UAV 300 always has a target within the field of view of its camera until landing. -
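The two size limits discussed above, far detection bounded by sensor resolution and near usefulness bounded by field of view, can be estimated with small-angle arithmetic. A hypothetical sketch; the function names and the figures used below are illustrative, not taken from the patent:

```python
import math

def pixels_across(shape_m, distance_m, fov_deg, width_px):
    # Small-angle estimate of how many pixels a shape of a given
    # physical size spans at a given distance.
    rad_per_px = math.radians(fov_deg) / width_px
    return (shape_m / distance_m) / rad_per_px

def max_detection_range(shape_m, fov_deg, width_px, min_px=2.0):
    # Farthest distance at which the shape still spans min_px pixels
    # across (about four pixels of area, per the Nyquist argument).
    return shape_m * width_px / (min_px * math.radians(fov_deg))

def fov_exceeded_range(aid_m, fov_deg):
    # Distance below which the whole landing aid overflows the frame,
    # the point at which a nested close range target would take over.
    return (aid_m / 2) / math.tan(math.radians(fov_deg) / 2)
```

For example, a 0.2 m circle seen by a 1920-pixel-wide camera with a 90-degree field of view stays resolvable out to roughly 120 m, while a 2 m landing aid overflows that same field of view at about 1 m of range.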
FIG. 2B shows a variation of the landing aid 200 depicted in FIG. 2A. Here, close range target 208 is rendered as a smaller version of landing aid 200, with a small background and shapes that match the colors and/or shading of background 202 and shapes 204. - It should be appreciated that, depending upon the size of
landing aid 200, close range targets 206 and 208 can further have additional, smaller close range targets embedded within them, for even greater precision. Furthermore, while close range targets 206 and 208 are depicted as having a pattern identical to landing aid 200, this is not necessary; close range targets 206 and 208 can be different patterns, which can further signal a close range to UAV 300, or possibly trigger additional landing preparations in UAV 300, such as extending landing gear. Still further, close range targets 206 and 208 do not need to be centered within the middle of landing aid 200. The close range target can be located anywhere upon landing aid 200 so long as it sufficiently contrasts against background 202 or shapes 204, depending upon its location. It should also be appreciated that close range targets 206 and 208 need not match the colors of background 202 and shapes 204, but can be rendered in additional contrasting colors, as light-emissive elements, or can be implemented to reflect or emit non-visible-spectrum light such as UV or infrared. -
FIG. 3 depicts landing aid 100 in use with UAV 300. UAV 300 is shown with a camera 302, which is capable of detecting landing aid 100, which in turn is rendered in a fashion as described above that is within the visual detection range of camera 302. Camera 302 is able to locate landing aid 100 along angle of view 310. By centering landing aid 100 within the field of view of camera 302, UAV 300 can make a precise approach to landing by following angle of view 310, where landing aid 100 simply continually increases in size as UAV 300 approaches. Furthermore, where camera 302 is mounted on a gimbal, the angle of camera 302 relative to landing aid 100 can be determined. Where UAV 300 is equipped with GPS location, its altitude above ground can be known. As depicted in FIG. 3, a right triangle can be visualized between landing aid 100 and UAV 300. By combining the angle of camera 302 with GPS altitude, the distance to landing aid 100 can be approximated using well-known trigonometric techniques, such as the law of sines. -
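The right-triangle relationship just described reduces to elementary trigonometry: GPS altitude is the vertical leg, and the gimbal's depression angle fixes the remaining sides. A hypothetical sketch; the function and parameter names are illustrative:

```python
import math

def range_to_landing_aid(altitude_m, depression_deg):
    # Altitude above ground and the camera's angle below horizontal
    # give the slant range along the line of sight and the horizontal
    # ground distance to the landing aid.
    a = math.radians(depression_deg)
    slant = altitude_m / math.sin(a)
    ground = altitude_m / math.tan(a)
    return slant, ground
```

At 30 m altitude with the gimbal depressed 45 degrees, for instance, the aid lies about 42.4 m away along the line of sight and 30 m ahead on the ground.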
UAV 300 is any unmanned aerial vehicle that is equipped with electronics and guidance systems typical for its size, so long as UAV 300 is equipped with camera 302. The size can range from small UAVs such as the DJI Phantom series of quadcopters (www.dji.com) to large-scale UAV systems in use by military and government organizations, such as the General Atomics MQ-9 Reaper. - Turning to
FIG. 4, the components of an image detection system that can be implemented on UAV 300 for detecting landing aid 100 are shown. The image detection system comprises a camera 402, which is in data communication with an image processor 404. Image processor 404 in turn feeds information about the location of landing aid 100 into flight controller 406, which in turn controls the motors 408 and/or flight controls 410 of UAV 300 so as to guide UAV 300 relative to landing aid 100. Camera 402 is any standard camera that is suitable for attachment to UAV 300 and is consistent with UAV 300's payload parameters. Camera 402 and landing aid 100 must be compatible insofar as camera 402 must be capable of imaging landing aid 100. Camera 402 accordingly may be sensitive to infrared, ultraviolet, visible light, or a combination of the foregoing, as appropriate to how the shapes are rendered on landing aid 100. Camera 402 may use CCD, CMOS, or any other suitable imaging technology now known or later developed. -
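The pattern-matching role of the image processor can be illustrated with the 1:2 circle spacing that the description later gives for FIGS. 1C, 2A and 2B: the arm from the corner circle to the long-axis circle is twice the arm to the short-axis circle, and the arms meet at a right angle. A hypothetical pure-Python sketch over detected circle centers; this is not the patent's implementation, and the names and tolerance are illustrative:

```python
import math
from itertools import permutations

def find_l_pattern(centers, tol=0.05):
    # Search detected circle centers for three points i, j, k where the
    # arm i->j is twice the arm i->k and the arms are perpendicular,
    # i.e. the 1:2 L-shape. The tolerance absorbs detection noise.
    for i, j, k in permutations(centers, 3):
        ij = (j[0] - i[0], j[1] - i[1])
        ik = (k[0] - i[0], k[1] - i[1])
        lij = math.hypot(*ij)
        lik = math.hypot(*ik)
        if lik == 0 or abs(lij - 2 * lik) > tol * lij:
            continue
        dot = ij[0] * ik[0] + ij[1] * ik[1]
        if abs(dot) <= tol * lij * lik:  # perpendicular arms; collinear triples fail here
            return i, j, k
    return None
```

Because the perpendicularity test accepts either winding of the arms, the same search finds the pattern when the L-shape is presented in reverse, in the spirit of the negated relationship described later for the detection equation.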
Image processor 404 receives a video stream from camera 402, and performs an initial shape detection upon the video stream, specifically to identify circles and polygons. Shape detection may be carried out using any suitable algorithm now known or later developed in the relevant art that is capable of conveying the geometry and location in the field of view of each detected shape, such as convolution, correlation, edge detection, Hough transform, or a combination of techniques. The selection of technique or techniques can be tailored to the processing power of image processor 404 and the relative needs for speed and accuracy. Once shape detection is carried out, a second pattern matching algorithm is carried out to detect the presence of landing aid 100 within camera 402's field of view, to ensure that identification is accurate, and not a false positive. - A further step may be carried out for certain implementations of
landing aid 100 that are illuminated for night or low-light operation. Landing aid 100 can be implemented with a pulsed light source that repeatedly flashes. By synchronizing the flashing of landing aid 100 with the frame rate of camera 402 at a ratio of two to one, landing aid 100 can be made to appear in only every other frame of the video stream from camera 402. Detection of landing aid 100 can then be accomplished by simple comparison between frames, with the difference between frames revealing the position of landing aid 100. Landing aid 100 and camera 402 can be synchronized by reference to an external time base, such as a GPS receiver installed on both UAV 300 and landing aid 100 that can provide a common synchronized time base. Other synchronizing methods may also be possible, including a radio beacon on either UAV 300 or landing aid 100 that signals shutter actuation or light pulsing, respectively. These latter methods are useful where camera 402 may have a variable frame rate. It will be understood by a person skilled in the relevant art that using GPS alone as a common time base for synchronization, apart from a real-time exchange of information between UAV 300 and landing aid 100, will require a predetermined common frame and pulse rate. - For the example landing aid pattern depicted in
FIGS. 1C, 2A and 2B, the L-shaped arrangement of the circles is detected. Each of the three circles corresponds to a point i, j and k. Point i is at the corner of the L-shape, point j is located along the long axis of the L-shape, and point k is located along the short axis. It is observed that the example landing aid pattern is arranged with a 1:2 ratio, where point k is located two circle diameters from point i, and point j is located four diameters from point i. Thus, the relationship Dx(ij)*2=Dx(ik)*4, or Dx(ij)*2=neg(Dx(ik)*4), is applied to any detected circles of approximately equal diameter to locate landing aid 100 within the field of view of camera 402. Observe that the negated version of the relationship equation is useful for detecting landing aid 100 when the L-shape is presented in reverse. - It will be appreciated by a person skilled in the relevant art that the foregoing algorithm is specific to the examples in the listed figures. The parameters of the foregoing algorithm can be varied to accommodate the differing spacing between circles in other examples, such as seen in
FIGS. 1A, 1B, 1D, 1E and 1F, or any other patterns devised for landing aid 100. Varying the detection parameters of the foregoing algorithm allows for discrimination and isolation of one particular pattern when multiple targets are within camera 402's field of view. - Once
image processor 404 has detected and determined the location of landing aid 100, it can pass directional information to flight controller 406 for directing UAV 300 relative to landing aid 100. Such information can be used to direct UAV 300 to perform a predetermined flight path, such as reorienting UAV 300 to a certain position relative to landing aid 100, and automatically bringing UAV 300 in to land. Flight controller 406 can be a commercial off-the-shelf flight controller, such as the open-source ArduPilotMega or DJI's Naza line of flight controllers, or can be custom developed to interact with image processor 404 in a more complex fashion. The algorithms that flight controller 406 uses to direct UAV 300 are well known in the art, and any such algorithms now known or later developed may be utilized, depending upon the mission parameters for UAV 300. - It will be recognized by a person skilled in the relevant art that the disclosed visual landing aids can be utilized for purposes other than just landing. By providing a fixed visual point of reference on the terrain, a
UAV 300 can navigate with respect to the fixed point of reference so long as the landing aid is visible within the field of view of UAV 300's camera. In one possible alternative use, a series of landing aids could be supplied at various points along a predetermined course, which could then be used by UAV 300 to navigate the predetermined course. By varying the patterns of the various landing aids, UAV 300 could be programmed to perform different actions as it approaches each successive landing aid. Moreover, the disclosed visual landing aids have applicability beyond UAVs, and can be deployed any place a visual target that is readily identifiable and trackable by machine vision is desired. Such applications could include space and maritime docking procedures, in-air refueling, and assistance in landing manned aircraft, to name a few possible alternative applications. - The disclosure above encompasses multiple distinct inventions with independent utility. While each of these inventions has been disclosed in a particular form, the specific embodiments disclosed and illustrated above are not to be considered in a limiting sense as numerous variations are possible. The subject matter of the inventions includes all novel and non-obvious combinations and subcombinations of the various elements, features, functions and/or properties disclosed above and inherent to those skilled in the art pertaining to such inventions. Where the disclosure or subsequently filed claims recite “a” element, “a first” element, or any such equivalent term, the disclosure or claims should be understood to incorporate one or more such elements, neither requiring nor excluding two or more such elements.
- Applicant(s) reserves the right to submit claims directed to combinations and subcombinations of the disclosed inventions that are believed to be novel and non-obvious. Inventions embodied in other combinations and subcombinations of features, functions, elements and/or properties may be claimed through amendment of those claims or presentation of new claims in the present application or in a related application. Such amended or new claims, whether they are directed to the same invention or a different invention and whether they are different, broader, narrower or equal in scope to the original claims, are to be considered within the subject matter of the inventions described herein.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/017,263 US9738401B1 (en) | 2016-02-05 | 2016-02-05 | Visual landing aids for unmanned aerial systems |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170225800A1 true US20170225800A1 (en) | 2017-08-10 |
US9738401B1 US9738401B1 (en) | 2017-08-22 |
Family
ID=59496758
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/017,263 Active US9738401B1 (en) | 2016-02-05 | 2016-02-05 | Visual landing aids for unmanned aerial systems |
Country Status (1)
Country | Link |
---|---|
US (1) | US9738401B1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10710719B1 (en) * | 2018-04-23 | 2020-07-14 | Amazon Technologies, Inc. | Deployable navigation beacons |
US11105921B2 (en) | 2019-02-19 | 2021-08-31 | Honeywell International Inc. | Systems and methods for vehicle navigation |
US11378986B2 (en) | 2019-04-01 | 2022-07-05 | Honeywell International Inc. | Systems and methods for landing and takeoff guidance |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7705879B2 (en) | 2006-02-13 | 2010-04-27 | Max-Viz, Inc. | System for and method of synchronous acquisition of pulsed source light in performance of monitoring aircraft flight operation |
JP5690539B2 (en) * | 2010-09-28 | 2015-03-25 | 株式会社トプコン | Automatic take-off and landing system |
US20160122038A1 (en) * | 2014-02-25 | 2016-05-05 | Singularity University | Optically assisted landing of autonomous unmanned aircraft |
-
2016
- 2016-02-05 US US15/017,263 patent/US9738401B1/en active Active
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3668792A4 (en) * | 2017-08-14 | 2021-08-18 | American Robotics, Inc. | Methods and systems for improving the precision of autonomous landings by drone aircraft on landing targets |
CN108427061A (en) * | 2018-03-05 | 2018-08-21 | 国网湖南省电力有限公司 | A kind of transmission line forest fire distribution monitoring device and method based on unmanned plane |
US11887329B2 (en) * | 2018-03-13 | 2024-01-30 | Nec Corporation | Moving body guidance apparatus, moving body guidance method, and computer-readable recording medium |
US20200401163A1 (en) * | 2018-03-13 | 2020-12-24 | Nec Corporation | Moving body guidance apparatus, moving body guidance method, and computer-readable recording medium |
US20210300591A1 (en) * | 2018-07-23 | 2021-09-30 | Shanghai Autoflight Co. Ltd. | Landing platform for unmanned aerial vehicle |
US11891192B2 (en) * | 2018-07-23 | 2024-02-06 | Shanghai Autoflight Co., Ltd. | Landing platform for unmanned aerial vehicle |
CN110871893A (en) * | 2018-09-03 | 2020-03-10 | 中强光电股份有限公司 | Unmanned aerial vehicle landing system and landing method thereof |
US20240203267A1 (en) * | 2018-10-19 | 2024-06-20 | Anduril Industries, Inc. | Ruggedized autonomous helicopter platform |
US11443640B2 (en) * | 2018-10-19 | 2022-09-13 | Anduril Industries, Inc. | Ruggedized autonomous helicopter platform |
US11721222B2 (en) * | 2018-10-19 | 2023-08-08 | Anduril Industries, Inc. | Ruggedized autonomous helicopter platform |
WO2020081101A1 (en) * | 2018-10-19 | 2020-04-23 | Anduril Industries Inc. | Ruggedized autonomous helicopter platform |
US20220335840A1 (en) * | 2018-10-19 | 2022-10-20 | Anduril Industries, Inc. | Ruggedized autonomous helicopter platform |
CN111694370A (en) * | 2019-03-12 | 2020-09-22 | 顺丰科技有限公司 | Visual method and system for multi-stage fixed-point directional landing of unmanned aerial vehicle |
EP4026770A4 (en) * | 2019-10-11 | 2022-10-19 | Mitsubishi Heavy Industries, Ltd. | Automatic landing system for vertical takeoff/landing aircraft, vertical takeoff/landing aircraft, and control method for landing of vertical takeoff/landing aircraft |
US11511885B2 (en) | 2020-03-13 | 2022-11-29 | Wing Aviation Llc | Adhoc geo-fiducial mats for landing UAVs |
AU2021274616B2 (en) * | 2020-03-13 | 2023-12-07 | Wing Aviation Llc | Adhoc geo-fiducial mats for landing UAVs |
US11745899B2 (en) | 2020-03-13 | 2023-09-05 | Wing Aviation Llc | Adhoc geo-fiducial mats for landing UAVs |
CN115280398A (en) * | 2020-03-13 | 2022-11-01 | Wing航空有限责任公司 | Ad hoc geographic reference pad for landing UAV |
WO2021236195A3 (en) * | 2020-03-13 | 2022-02-24 | Wing Aviation Llc | Adhoc geo-fiducial mats for landing uavs |
JP2021172318A (en) * | 2020-04-30 | 2021-11-01 | イームズロボティクス株式会社 | Descending system for unmanned flight body |
JP7539688B2 (en) | 2020-04-30 | 2024-08-26 | イームズロボティクス株式会社 | Unmanned aerial vehicle descent system |
US20220028289A1 (en) * | 2020-07-23 | 2022-01-27 | Aurora Flight Sciences Corporation, a subsidiary of The Boeing Company | System for navigating an aircraft based on infrared beacon signals |
TWI746234B (en) * | 2020-10-29 | 2021-11-11 | 仲碩科技股份有限公司 | Method for distance measurement and positioning of unmanned helicopter to sea surface target |
GB2602876A (en) * | 2020-11-13 | 2022-07-20 | Dropzone Store Llc | Aerial delivery location identifier |
CN113283030A (en) * | 2021-05-25 | 2021-08-20 | 西安万飞控制科技有限公司 | Design method for assisting high-precision positioning grid two-dimensional code |
CN113642423A (en) * | 2021-07-28 | 2021-11-12 | 南京石知韵智能科技有限公司 | Aerial target accurate positioning method and system for unmanned aerial vehicle |
CN113741534A (en) * | 2021-09-16 | 2021-12-03 | 中国电子科技集团公司第五十四研究所 | Unmanned aerial vehicle vision and positioning double-guidance landing method |
CN114489129A (en) * | 2022-01-24 | 2022-05-13 | 北京远度互联科技有限公司 | Unmanned aerial vehicle landing method and related device |
CN114815905A (en) * | 2022-06-29 | 2022-07-29 | 中国航空工业集团公司沈阳飞机设计研究所 | Multi-machine continuous landing guide control method and device |
Also Published As
Publication number | Publication date |
---|---|
US9738401B1 (en) | 2017-08-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9738401B1 (en) | Visual landing aids for unmanned aerial systems | |
US10408936B2 (en) | LIDAR light fence to cue long range LIDAR of target drone | |
US20170313439A1 (en) | Methods and syststems for obstruction detection during autonomous unmanned aerial vehicle landings | |
US9933521B2 (en) | Aerial positioning systems and methods | |
US9754498B2 (en) | Follow-me system for unmanned aircraft vehicles | |
US9401094B2 (en) | Method for assisting the piloting of an aircraft on the ground and system for its implementation | |
US20170305537A1 (en) | Un-manned aerial vehicle | |
EP3078988B1 (en) | Flight control system with dual redundant lidar | |
EP2892040A1 (en) | Obstacle detection system providing context awareness | |
WO2016201359A1 (en) | A low altitude aircraft identification system | |
EP3392153A1 (en) | Method and system for providing docking guidance to a pilot of a taxiing aircraft | |
CN111498127B (en) | Directional lighting system mounted on an aircraft and associated lighting method | |
KR20170004508A (en) | Method and apparaus for automatic landing of aircraft using vision-based information | |
KR20170139326A (en) | Autonomous flight system and method of unmanned aerial vehicle | |
US20220081113A1 (en) | Systems and methods for delivery using unmanned aerial vehicles | |
EP4066080A1 (en) | Method for controlling a formation of a collaborating swarm of unmanned mobile units | |
Nowak et al. | Development of a plug-and-play infrared landing system for multirotor unmanned aerial vehicles | |
CN115127544A (en) | Thermal imaging system and method for navigation | |
CN114729804A (en) | Multispectral imaging system and method for navigation | |
RU155323U1 (en) | UNMANNED AIRCRAFT CONTROL SYSTEM | |
EP3979124A1 (en) | An active protection system and method of operating active protection systems | |
Scholz et al. | Concept for Sensor and Processing Equipment for Optical Navigation of VTOL during Approach and Landing | |
KR102079727B1 (en) | Automatic drones landing gear and method using vision recognition | |
US9868546B2 (en) | Dual-mode vehicle light system | |
EP3618037A1 (en) | Systems and methods for identifying air traffic objects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: SIGHTLINE APPLICATIONS, INC., OREGON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HOLT, JORDAN;SARAO, JEREMY A;BARCHET, ALEX;SIGNING DATES FROM 20210820 TO 20210823;REEL/FRAME:057276/0474 |
|
AS | Assignment |
Owner name: CAMBRIDGE SAVINGS BANK, MASSACHUSETTS Free format text: SECURITY INTEREST;ASSIGNOR:SIGHTLINE APPLICATIONS, LLC;REEL/FRAME:064263/0645 Effective date: 20230705 |