WO2020040773A1 - Drone waypoint navigation system - Google Patents

Drone waypoint navigation system Download PDF

Info

Publication number
WO2020040773A1
Authority
WO
WIPO (PCT)
Prior art keywords
path
drone
divert
waypoint
goal
Application number
PCT/US2018/047737
Other languages
French (fr)
Inventor
Adam Watkins
Joshua S. MCCONKEY
Original Assignee
Siemens Energy, Inc.
Application filed by Siemens Energy, Inc. filed Critical Siemens Energy, Inc.
Priority to PCT/US2018/047737 priority Critical patent/WO2020040773A1/en
Publication of WO2020040773A1 publication Critical patent/WO2020040773A1/en

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 — Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/10 — Simultaneous control of position or course in three dimensions
    • G05D 1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • the present disclosure is directed, in general, to autonomous drones and more specifically to the navigation of autonomous flyable drones.
  • a method of navigating a drone from a first waypoint to a second waypoint includes providing a detection system coupled to the drone, operating the detection system to detect a bearing and distance from a current position of the drone to an obstacle, operating the drone in a go to goal state in response to not detecting the obstacle on a bearing between the current position and the second waypoint within a predefined first distance, wherein the drone travels on a direct path from the current position toward the second waypoint, and operating the drone in a divert state in response to detecting the obstacle on the bearing between the current position and the second waypoint within the predefined first distance, wherein the detection system analyzes potential paths to a left side and a right side of the obstacle and selects a desired divert path for continued travel.
  • a method of navigating a drone from a first waypoint to a second waypoint includes operating the drone in a go to goal state in which the drone travels along a current path which is a straight line from the first waypoint toward the second waypoint, operating a detection system to detect obstacles on the current path, switching the drone to a select path state in response to the detection of an obstacle on the current path, and operating the detection system to scan a potential left-side path and a right-side path and to analyze and select a divert path from the left-side path and the right-side path, the selected divert path becoming a current path.
  • the method also includes switching from the select path state to a divert state, scanning the current path, a sub-goal path, and a direct path, the direct path being a direct path from a current position of the drone to the second waypoint, and the sub-goal path being a path that is angled a non-zero angle from the divert path toward the second waypoint, and selecting for continued travel the direct path when the direct path is clear and the sub-goal path when the sub-goal path is clear and the direct path is not clear.
  • a drone in another construction, includes a frame, an engine supported by the frame, a propulsion element coupled to the engine and operable to propel the drone along a travel path, and a detection system coupled to the frame and operable to detect a bearing and distance to an obstacle.
  • a steering assembly is coupled to the detection system and is operable to steer the drone along a go to goal path from a first waypoint to a second waypoint in response to the detection system not detecting the obstacle along a direct path and within a predetermined distance from the drone, and operable to divert the drone along a divert path in response to the detection system detecting the obstacle along the direct path within the predetermined distance.
  • a controller is operable to analyze data from the detection system and transition the drone from a go to goal state to a divert state in response to the detection of the obstacle, the controller operable to analyze the divert path, a sub-goal path, and a direct path and to select one of the direct path, the sub-goal path and the divert path for continued travel.
  • FIG. 1 is a perspective view of a drone in the form of a vertical take-off and landing aerial vehicle.
  • FIG. 2 is a schematic illustration of an imaging system scanning in a searchlight mode and in a go to goal state.
  • FIG. 3 is a perspective view of an imaging system scanning in a full circle mode.
  • Fig. 4 is a schematic illustration of the drone operating in a select path state with the imaging system scanning one or more potential paths in the searchlight mode.
  • FIG. 5 is a schematic illustration of an imaging system operating in a divert state performing scans in four directions each in searchlight mode.
  • Fig. 6 is a schematic illustration of the drone operating in a corridor detection state with the imaging system scanning in the full circle mode to find multiple corridors through which the drone could pass.
  • Fig. 7 is a schematic illustration of the drone in a back away state and further illustrating an exclusion zone, a normal size, and an intermediate size.
  • Fig. 8 is a schematic illustration of a travel plan and actual flight path of a drone between three waypoints.
  • Fig. 9 is a more detailed schematic illustration of a portion of the travel plan and actual flight path of Fig. 8 between the second waypoint and the third waypoint.
  • Fig. 10 is a more detailed schematic illustration of a portion of the travel plan and actual flight path of Fig. 8 between the third waypoint and the first waypoint.
  • Fig. 11 is a schematic illustration of a travel plan and actual flight path of a drone operating in a corridor detection state.
  • FIG. 12 schematically illustrates a control system for the drone of Fig. 1.
  • The phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like.
  • Although the terms first, second, third, and so forth may be used herein to refer to various elements, information, functions, or acts, these elements, information, functions, or acts should not be limited by these terms. Rather, these numeral adjectives are used to distinguish different elements, information, functions, or acts from each other. For example, a first element, information, function, or act could be termed a second element, information, function, or act, and, similarly, a second element, information, function, or act could be termed a first element, information, function, or act, without departing from the scope of the present disclosure.
  • The term “adjacent to” may mean: that an element is relatively near to but not in contact with a further element; or that the element is in contact with the further element, unless the context clearly indicates otherwise.
  • The phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Terms “about” or “substantially” or like terms are intended to cover variations in a value that are within normal industry manufacturing tolerances for that dimension. If no industry standard is available, a variation of 20 percent would fall within the meaning of these terms unless otherwise stated.
  • Fig. 1 illustrates an autonomous vehicle or drone 10 in the form of a multi-engine vertical take-off and landing aircraft.
  • While the drone 10 of Fig. 1 is an aircraft, different forms of aircraft, or vehicles that are not aircraft, could also be used as autonomous vehicles or drones 10.
  • For example, land-based vehicles such as automobiles, or watercraft such as surface vessels and submersibles, or combinations of these vehicles could be operated as autonomous vehicles or drones 10. While the remainder of this description will focus on a flyable drone 10, it should be clear that the invention is not limited to these types of vehicles.
  • the drone 10 of Fig. 1 includes a frame 15 that supports four engines 20, in the form of electric motors that each drive a rotor or blade 25 to provide lift, steering, and speed control for the drone 10 as is well known.
  • One or more batteries are supported by the frame 15 and connected to the engines 20 to provide the necessary power.
  • the frame 15 also supports landing elements 30 that can be fixed as illustrated in Fig. 1, or that can retract when the drone 10 is in the air.
  • a sensor or imaging device 35 in the form of a LIDAR imaging device 35 (laser or light imaging, detection, and ranging) is mounted to the frame 15 and positioned to allow for clear imaging 360-degrees around the drone 10.
  • a preferred LIDAR system includes a rotating imaging device 35 that can scan a 360-degree circle around the drone 10 with an accuracy of at least one-half degree (i.e., 720 points around the circle) and can return accurate distance measurements for objects up to one hundred meters away.
  • the imaging device 35 is positioned on the lower portion of the drone 10 but could also be positioned on the top.
  • multiple imaging devices 35 could be positioned at various locations around the drone 10 to provide the same 360-degree imaging as the illustrated device.
  • LIDAR is employed in the illustrated example, other constructions may use other imaging devices 35 such as color cameras, black & white cameras, stereo cameras, ultrasonic sensors, RADAR, infrared, and the like.
  • the imaging device 35 is part of a detection system 37 that is operable to detect obstacles around the drone 10 as it travels between waypoints.
  • the detection system 37 includes a controller 40 in the form of a microprocessor-based controller that receives data from the imaging device 35 and uses that data to determine a course for the drone 10.
  • the controller 40 can also be coupled to the engines 20 and a steering device to also directly control the drone 10 or can be coupled to a second controller 45 that then controls the flight operations of the drone 10.
  • In such a construction, the two separate controllers 40, 45 would operate, for the sake of the present invention, as a single controller 40a such that any reference to the controller 40 could also be considered a reference to the combined controller 40a.
  • the controller 40 can be programmed to direct the drone 10 to travel to several waypoints, in a desired order.
  • the controller 40 cooperates with the imaging device 35 to chart the actual course as the drone 10 travels between the various waypoints as will be described with reference to Figs. 7-11.
  • the drone 10 may include a steering device that is operable to control the orientation, direction, and speed of travel of the drone 10.
  • the type of steering device employed is largely a function of the type of drone 10 and the environment in which the drone 10 travels.
  • the drone 10 is in the form of a four-engine aerial vehicle or quadcopter.
  • steering can be controlled by varying the speed of the individual engines 20.
  • typical quadcopters include two engines 20 that rotate clockwise and two that rotate counterclockwise to balance torque. Increasing the speed of certain engines 20 with respect to the remaining engines 20 can create an unbalanced torque which rotates the drone 10 about its vertical axis.
  • Increasing the speed of any two adjacent engines 20 will cause the drone 10 to move in a direction away from those engines 20 as the drone 10 tips slightly.
  • Typical aircraft control surfaces (e.g., rudders, ailerons, etc.) are unnecessary, as the drone 10 is able to control roll, pitch, yaw, speed, and course using only the engines 20.
  • Other drones 10 may include a rudder or wheels that are used to control the direction of travel.
  • the drone 10 operates in one of several states at any given time to navigate between waypoints while simultaneously avoiding both fixed and dynamic obstacles.
  • the detection system 37 may perform different types of scans depending upon the current operating state of the drone 10. It should be noted that the detection system 37 described herein scans what is essentially a two-dimensional plane centered on the drone 10. Other systems may include a three-dimensional scanner or imaging device as may be required for the particular application. The use of a three-dimensional imaging device provides the added benefit of collecting height data that can be used to enhance navigation.
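The operating states described in the following paragraphs can be summarized as a small state machine. A minimal sketch in Python, with state names taken from the text (the enum itself is an illustration, not part of the patent):

```python
from enum import Enum, auto

class DroneState(Enum):
    GO_TO_GOAL = auto()          # straight-line travel toward the target waypoint
    SELECT_PATH = auto()         # obstacle ahead: compare left/right divert options
    DIVERT = auto()              # travel a divert path while rescanning for better ones
    CORRIDOR_DETECTION = auto()  # all candidate paths blocked: full-circle scan for gaps
    BACK_AWAY = auto()           # object inside the exclusion zone: retreat immediately
```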
  • Fig. 2 illustrates a first mode of scanning referred to as a searchlight scan 50.
  • the imaging device 35 scans a small angle or wedge 55 to determine the distance and direction to the nearest object within that wedge 55.
  • the searchlight scan 50 first scans a small wedge 55 followed by one or more scans of wider wedges 55a, 55b. This can be useful as the narrow wedge 55 can miss objects that are slightly off a desired path 60 but that may be close enough to interfere with navigation.
  • the second or third wedges 55a, 55b may detect these objects that are slightly off the desired path 60 but that may need to be avoided.
  • the controller 40 can calculate not only a distance to the nearest object near a particular bearing but can also calculate an approach distance or a minimum distance between the obstacle and the drone as the drone travels along the scanned bearing.
  • Fig. 2 illustrates a typical searchlight scan 50 that includes multiple differently angled wedges 55, 55a, 55b.
  • Other figures herein illustrate searchlight scans as a single cone or even a single arrow for clarity.
  • preferred searchlight scans 50 include multiple differently angled wedges 55, 55a, 55b.
  • other constructions could use more or fewer wedges 55, 55a, 55b including only a single wedge 55.
  • Fig. 3 illustrates a second scanning mode referred to as a full circle scan 65.
  • the imaging device 35 scans a 360-degree circle around the drone 10 and provides the distance to the nearest object on every bearing around the drone 10.
  • the full circle scan 65 provides at least one distance for each degree of arc, while more preferred constructions provide at least one distance to the nearest object for each half of a degree resulting in 720 measurements. In still other constructions, greater resolution may be provided (e.g., at least 3600 measurements).
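The relationship between angular resolution and the number of returns per revolution is simple arithmetic; a small sketch (the function name is mine, not the patent's):

```python
def full_circle_samples(resolution_deg: float) -> int:
    """Number of range returns in one 360-degree scan at the given
    angular resolution, e.g. 0.5 degrees -> 720 returns."""
    return round(360.0 / resolution_deg)
```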
  • the drone 10 is capable of operating in a number of states with the type of scan performed by the imaging device 35 or the way the data collected by the various scans is used varying for each state.
  • In a first state, referred to as the “go to goal” state 70 shown in Fig. 2, the drone 10 travels along the desired path 60 from its current location to its target location or waypoint.
  • the drone 10 begins travel in the go to goal state 70.
  • the detection system 37 uses data from the searchlight scan 50 that is directed along the desired path 60 of the drone 10.
  • the imaging device 35 may also be performing full circle scans 65 but until an obstacle is detected along the desired path 60 within a predetermined range, data from the full circle scan 65 is not needed for navigation.
  • Upon detecting an obstacle within the predetermined range along the desired path 60, the drone 10 switches to a second state referred to as the “select path” state 75.
  • In the select path state 75, the detection system 37 receives searchlight scan data from the imaging device 35 from both a left side 80 and a right side 85 (e.g., 45 degrees left and 45 degrees right) of a current bearing 73.
  • the controller 40 determines the best path to avoid the obstacle 74 based on the left side 80 and right side 85 scans and selects a divert path 88 (shown in Fig. 5).
  • the controller 40 also saves a divert bias 90 for use in biasing the drone 10 back toward the target should another divert be required.
  • a sub-goal path 95, or angle can also be selected by the controller 40 at this time.
  • a divert further bearing 100 can be selected at this point. The divert further bearing 100 could be used should an additional diversion be required.
  • A path that is “clear” is a path that, if followed, does not bring the drone 10 within a predefined distance (the approach distance) of any obstacle 74 over a preselected travel distance.
  • the predefined distance could be ten meters and the preselected distance could be one hundred meters.
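The clearance test can be sketched as follows, assuming the scan yields (range, bearing) pairs. The geometry (minimum separation ≈ range × sine of the relative bearing for obstacles ahead of the drone) and the function names are my assumptions; the 10-meter and 100-meter thresholds come from the example above:

```python
import math

def approach_distance(rng_m, obstacle_brg_deg, travel_brg_deg):
    """Closest the drone would come to an obstacle while flying a straight
    line along travel_brg_deg, assuming it continues past the obstacle."""
    rel = math.radians(obstacle_brg_deg - travel_brg_deg)
    if abs(rel) >= math.pi / 2:  # obstacle abeam or behind: current range is the minimum
        return rng_m
    return rng_m * abs(math.sin(rel))

def path_is_clear(scan_returns, travel_brg_deg,
                  approach_limit_m=10.0, lookahead_m=100.0):
    """True if no return within the lookahead distance would pass closer
    than approach_limit_m to the drone along the scanned bearing."""
    return all(
        rng > lookahead_m
        or approach_distance(rng, brg, travel_brg_deg) >= approach_limit_m
        for rng, brg in scan_returns
    )
```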
  • the controller 40 analyzes the various paths 80, 85 using a weight function to enable selection of the most desirable path 80, 85.
  • the left side 80 and the right side 85 are scanned.
  • the weight function may take into account factors such as the desirability of the direction, the distance to the nearest detected obstacle 74, the smallest distance between the drone 10 and the obstacle 74 during travel along the path (the approach distance) as well as other factors. Once the weight function is calculated the controller 40 selects the most desirable path 80, 85 based on the weight function.
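A toy version of such a weight function, combining the factors the text names; the coefficients and normalization ranges are illustrative assumptions, not values from the patent:

```python
def path_weight(bearing_error_deg, nearest_obstacle_m, approach_m, bias=0.0):
    """Score a candidate path: prefer headings near the goal direction,
    distant obstacles, and large approach distances.  Higher is better."""
    direction = 1.0 - min(abs(bearing_error_deg), 180.0) / 180.0
    clearance = min(nearest_obstacle_m, 100.0) / 100.0
    approach = min(approach_m, 10.0) / 10.0
    return 0.5 * direction + 0.25 * clearance + 0.25 * approach + bias

def select_divert_path(left, right):
    """left/right: keyword dicts of the factors above.  Returns the side
    with the higher weight, 'left' winning ties."""
    return "left" if path_weight(**left) >= path_weight(**right) else "right"
```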
  • A third state of operation is referred to as the “divert” state 105 and is illustrated in Fig. 5.
  • the drone 10 switches to the divert state 105 after completing the path selection in the select path state 75.
  • the detection system 37 instructs the imaging device 35 to perform one or more separate searchlight scans 50.
  • the searchlight scans 50 are performed in an order that reflects the most desirable course for the drone 10. By scanning in this order, the drone 10 can select the first clear path and skip any remaining scans of the less desirable paths.
  • The first scan is directed along a direct bearing path 110 centered on a bearing that leads from the drone 10 directly to the target location or waypoint. If this scan is clear, the drone 10 selects the direct bearing path 110 and the remaining scans are omitted.
  • the second scan is directed along the sub-goal path 95 that is angled a predetermined angle 96 (e.g., 15 degrees) off the divert or current course 88 and toward the target location or waypoint. Again, if the second scan is clear, the drone 10 selects the sub-goal path 95 and any additional scans are omitted.
  • the third searchlight scan 50 is centered on the divert or current course 88 on which the drone 10 is currently traveling. If none of the three courses 88, 95, 110 is clear, the drone 10 can perform a searchlight scan 50 along the divert further bearing 100 and then select the best course using the weight function.
  • the weight function is calculated for each path and can take into account the desirability of the direction, the distance to the nearest detected obstacle 74, the approach distance, any directional bias from prior diverts, as well as other factors.
  • the direct bearing path 110 is the most desirable path with the sub-goal path 95 being the second most desirable and the current or divert path 88 being the third most desirable.
  • the weight factor would account for these desirabilities as well as other factors.
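The scan-and-select loop of the divert state can be sketched like this; `is_clear` and `weight` stand in for the searchlight scan and the weight function (the interface is my assumption):

```python
def choose_divert_course(paths, is_clear, weight):
    """paths: candidate courses in desirability order (direct, sub-goal,
    current divert, divert-further).  Take the first clear path, skipping
    the remaining, less desirable scans; if none is clear, fall back to
    the best weight score."""
    for path in paths:
        if is_clear(path):
            return path
    return max(paths, key=weight)
```

For example, `choose_divert_course(["direct", "sub_goal", "current", "further"], ...)` returns "direct" whenever the direct bearing is clear, without ever scanning the other three.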
  • As the drone 10 travels, it might reach a point where an obstacle enters a predetermined range along the most desirable course (based on the weight factor). When this occurs, the drone 10 can switch to a fourth state of operation.
  • The fourth state of operation is activated if, while in the divert state 105, all four (or more) travel options are deemed not clear and the drone 10 has closed to within a predefined distance of the obstacle 74 along the current path 88.
  • The fourth state, referred to as the “corridor detection” state 115 and illustrated in Fig. 6, uses the full circle scan 65 to find the edges of one or more corridors 120 around the drone 10.
  • the controller 40 scores the various corridors 120 based on width, mean range, closest range, and the distance of the corridor 120 from the target waypoint and selects the best corridor 120 for further travel. Upon selecting a direction and beginning travel, the drone 10 switches back to the divert state 105.
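A corridor-scoring sketch using the four criteria the text lists (width, mean range, closest range, and distance of the corridor from the target); the coefficients are illustrative assumptions:

```python
def corridor_score(width_m, mean_range_m, closest_range_m, goal_offset_deg):
    """Higher is better: wide, deep corridors whose near edges are far
    away and that point toward the target waypoint."""
    return (2.0 * width_m + 0.5 * mean_range_m + 1.0 * closest_range_m
            - 0.2 * abs(goal_offset_deg))

def best_corridor(corridors):
    """corridors: dict of name -> (width, mean_range, closest, goal_offset)."""
    return max(corridors, key=lambda name: corridor_score(*corridors[name]))
```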
  • Fig. 7 illustrates a fifth state of operation, referred to as the “back away” state 125, which is activated in response to an object or obstacle 126 being detected within an exclusion zone 130 (e.g., 5 meters) of the drone 10.
  • the controller 40 immediately redirects the drone 10 away from the detected object 126 until the object 126 is out of the exclusion zone 130.
  • the drone 10 moves along a path 131 that is 180 degrees away from the detected object 126.
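The retreat heading is simply the reciprocal of the bearing to the intruding object; a one-line sketch (the function name is mine):

```python
def back_away_bearing(object_bearing_deg: float) -> float:
    """Heading 180 degrees from the detected object, normalized to [0, 360)."""
    return (object_bearing_deg + 180.0) % 360.0
```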
  • Typically, the back away state 125 is triggered by a dynamic or moving object 126, as the other states 70, 75, 105, 115 operate to keep fixed obstacles 74 from coming this close to the drone 10.
  • To facilitate obstacle avoidance, the size of the drone 10 must be defined.
  • the drone 10 has an exclusion zone 130, illustrated in Fig. 7 in which no objects 126 are allowed.
  • this exclusion zone 130 can be considered the smallest size of the drone 10 for purposes of object avoidance.
  • the exclusion zone 130 is intended to be absolutely avoided.
  • Other zones or sizes can be provided that can be violated under certain circumstances.
  • A normal size or short range 135 (e.g., 15 meters) applies during the go to goal state 70 and the divert state 105. Those states 70, 105 would both operate to avoid entry of any obstacle 74, 126 within the circle or sphere defined by the normal size 135.
  • In the corridor detection state 115, the size could be reduced to an intermediate size 140 (e.g., 10 meters), between the normal size 135 and the exclusion zone 130, to aid in finding a suitable corridor 120. While three sizes 130, 135, 140 or zones have been described, additional sizes or zones that define differently sized circles, spheres, or other shapes could be employed as desired to increase efficiency or speed, or to provide additional clearances during travel.
  • Fig. 8 illustrates one possible travel plan 145 for the drone 10 traveling from a first waypoint 150, to a second waypoint 155, to a third waypoint 160, and then returning to the first waypoint 150.
  • Fig. 8 illustrates fixed obstacles 74 such as trees and dynamic objects 126 such as people that can hinder travel.
  • any object 74, 126 could be a hindrance to travel including buildings, wires, utility poles, wind turbines, etc.
  • no go zones can be defined for the controller 40 and these areas can be treated the same as other objects 74, 126 that hinder travel.
  • no go zones are bounded using GPS (global positioning system) coordinates to define fixed volumes or areas to be avoided.
  • An advantage of the system described herein is that programming or modeling of obstacles 74, 126 prior to traveling is not necessary. Rather, the system navigates and detects obstacles 74, 126 without being pre-programmed to know where those obstacles 74, 126 are located.
  • the use of no go zones may be advantageous for avoiding things that really are not obstacles for travel. For example, it may be desirable to not pass over large bodies of water, busy intersections, airports, etc. while traveling between waypoints. These areas would typically not be detected as obstacles 74, 126 as they are flat and do not present a physical hindrance that cannot be passed through (such as a tree or building) but are nonetheless to be avoided.
  • Operation of the drone 10 will now be described with reference to Figs. 8-11. It is important to note that to complete the travel plan 145 illustrated in Fig. 8, all that is required is that the drone 10 be programmed with the GPS coordinates of the waypoints 150, 155, 160. The actual courses, course changes, and locations of obstacles 74, 126 do not need to be programmed into the controller 40 or drone 10. As illustrated in Fig. 8, the drone 10 begins travel at the first waypoint 150 with the second waypoint 155 being the target or goal. The drone 10 begins in the go to goal state 70 in which it takes a direct course along the desired path 60 to the second waypoint 155. As illustrated, the only potential obstacle between the first waypoint 150 and the second waypoint 155 is a person 126a.
  • The person 126a is off the desired path 60, thereby allowing the drone 10 to travel all the way to the second waypoint 155 while in the go to goal state 70.
  • the drone 10 could be programmed to land, hover, or perform other activities at the second waypoint 155 before proceeding to the third waypoint 160.
  • the drone 10 next proceeds from the second waypoint 155 toward the third waypoint 160.
  • the path between the second waypoint 155 and the third waypoint 160 includes a number of obstacles 74a-74c in the form of trees or bushes.
  • Fig. 9 illustrates in greater detail how the detection system 37 and the controller 40 operate to direct the drone 10 around the obstacles 74a-74c.
  • the drone 10 initially begins its travel in the go to goal state 70 and follows the desired path 60 that is a straight path between the second waypoint 155 and the third waypoint 160.
  • the drone 10 transitions to the select path state 75.
  • the detection system 37 scans both left 80 and right 85 of the desired path 60.
  • the left scan 80 and right scan 85 are angled 45 degrees from the desired path 60 or current path of the drone 10 with other angles also being possible.
  • the scan to the left 80 detects the edge of the first object 74a and also detects the second object 74b.
  • the scan to the right 85 does not detect any obstacles and the right-side path is selected as the divert path 88 for continued travel.
  • a left bias is stored for application to weight factors to bias any additional divert decisions to the left and toward the third waypoint 160.
  • the drone 10 proceeds along the divert path 88 and switches to the divert state 105.
  • the divert path 88 becomes the current path.
  • the sub-goal path 95 is selected to be a path angled a predetermined amount (e.g., 15 degrees) to the left of the divert path 88 (i.e., toward the third waypoint 160) and the direct path 110 is the direct path from the current position of the drone 10 to the third waypoint 160.
  • the detection system 37 continues to scan the possible paths 88, 95, 100, 110 as discussed with regard to Fig. 5 and selects the first path that is free from obstructions 74, 126 within a predetermined distance (e.g., 50 meters) in the order of the direct path 110 and the sub-goal path 95.
  • a predetermined distance e.g. 50 meters
  • the best path is selected using the weight factors should the direct path 110 and the sub-goal path 95 not be clear.
  • the drone 10 at a first scan point 195 in the divert state 105 performs the scans and selects the sub-goal path 95 as the direct path 110 is blocked by the third tree 74c. While the divert path 88 is also unblocked, the sub-goal path 95 is more desirable as it is closer to the correct direction to get to the third waypoint 160.
  • the drone 10 proceeds to a second scan point 200, while still in the divert state 105 and again begins to perform the predefined scans in the predefined order. At the second scan point 200, all the paths are clear, so the drone performs the first scan along the direct path 110 and selects the direct path 110 for continued travel. The remaining scans are unnecessary and are not performed. Upon selecting the direct path 110, the drone 10 transitions to the go to goal state 70.
  • the drone 10 next travels from the third waypoint 160 back to the first waypoint 150.
  • the only obstacle 74, 126 on the path between the third waypoint 160 and the first waypoint 150 is a person 126b wandering aimlessly.
  • the drone 10 transitions from the go to goal state 70 to the select path state 75.
  • the detection system 37 and the controller 40 determine that the best path forward is to veer to the left to go around the person 126b.
  • the drone 10 selects the left path, sets that path as its divert path 88 and transitions to the divert state 105.
  • the drone 10 again detects the person 126b in its path.
  • the person 126b has wandered in front of the drone 10 once again.
  • the divert or current path 88 is blocked, the sub-goal path 95 is blocked but the direct path 110 is open.
  • the drone 10 therefore selects the direct path 110 and proceeds toward the first waypoint 150. Once the direct path 110 is selected, the drone 10 transitions back to the go to goal state 70.
  • Referring to Fig. 11, an example of the corridor detection state 115 is illustrated.
  • the drone 10 has entered an area with several obstacles 74, 126 while the drone 10 is in the divert state 105.
  • the drone 10 scans the paths of the divert state 105 and finds that all are blocked by obstacles 74f, 126d, 74h.
  • the drone 10 transitions to the corridor detection state 115 and does a full circle scan 65 to find potential paths out of the crowded area.
  • the controller 40 could set the size of the drone 10 to a smaller value to allow for travel through smaller corridors 120 while in the corridor detection state 115 if desired.
  • Upon completing the full circle scan 65, the controller 40 identifies two possible corridors 120a, 120b. The controller 40 calculates weight factors for the available options based on several criteria (e.g., size, direction, distance from goal, etc.) and selects the desired corridor 120a, 120b or path. The drone 10 proceeds along the selected path 120b, transitions to the divert state 105 and sets the selected course as the divert or current course 88.
  • the drone 10 determines that the direct path 110 is still blocked but the sub-goal path 95 is clear. The drone 10 thus turns onto the sub-goal path 95 and the sub-goal path becomes the current path 88. The drone 10 reaches a third scan point 240 and again performs a scan. At the third scan point 240, the direct path 110 is clear, and the drone 10 turns to travel along the direct path 110. Once on the direct path 110, the drone 10 transitions to the go to goal state 70 and proceeds to a desired waypoint 111.
  • the system just described provides for fully autonomous drone navigation through a complex environment that can include stationary obstacles as well as dynamic or moving obstacles.
  • the system is fully predictable with decisions being logged for later review and analysis.
  • the system requires no pre-programming other than to define the desired waypoints, typically using GPS coordinates. It should be noted that the description provided describes and illustrates scans occurring at various scan points. However, the scans are performed on an almost continuous basis to achieve the shortest route around any obstacles and toward the desired waypoint.
  • the system identifies the edges of obstacles 74, 126 and is able to traverse the perimeter of objects 74, 126 until the target waypoint is visible, at which point the drone 10 travels on a direct path to that waypoint.
  • Using a deterministic approach, rather than map-based or deep-learning based approaches, allows for more efficient troubleshooting and analysis of the algorithm.
  • the system is far simpler than systems in which the environment is modeled or pre-mapped and added to the navigation system for guidance.
  • the ability to address dynamic obstacles 126 is unique when compared to pre-programmed devices.
  • the system is fully powered and implemented on the drone 10 so that no outside communication or direction is required for the drone 10 to navigate between waypoints.
  • the only user intervention required is the identification of the specific waypoints.

Abstract

A method of navigating a drone from a first waypoint to a second waypoint includes providing a detection system coupled to the drone, operating the detection system to detect a bearing and distance from a current position of the drone to an obstacle, operating the drone in a go to goal state in response to not detecting the obstacle on a bearing between the current position and the second waypoint within a predefined first distance, wherein the drone travels on a direct path from the current position toward the second waypoint, and operating the drone in a divert state in response to detecting the obstacle on the bearing between the current position and the second waypoint within the predefined first distance, wherein the detection system analyzes potential paths to a left side and a right side of the obstacle and selects a desired divert path for continued travel.

Description

DRONE WAYPOINT NAVIGATION SYSTEM
TECHNICAL FIELD
[001] The present disclosure is directed, in general, to autonomous drones and more specifically to the navigation of autonomous flyable drones.
BACKGROUND
[002] Autonomous vehicles or drones have become much more common for applications such as remote video recording, surveillance, transportation of goods, and the like. In most cases, multi-engine vertical take-off and landing vehicles are employed as drones. Typically, a user controls the drone as it travels between waypoints.
SUMMARY
[003] A method of navigating a drone from a first waypoint to a second waypoint includes providing a detection system coupled to the drone, operating the detection system to detect a bearing and distance from a current position of the drone to an obstacle, operating the drone in a go to goal state in response to not detecting the obstacle on a bearing between the current position and the second waypoint within a predefined first distance, wherein the drone travels on a direct path from the current position toward the second waypoint, and operating the drone in a divert state in response to detecting the obstacle on the bearing between the current position and the second waypoint within the predefined first distance, wherein the detection system analyzes potential paths to a left side and a right side of the obstacle and selects a desired divert path for continued travel.
[004] In another construction, a method of navigating a drone from a first waypoint to a second waypoint includes operating the drone in a go to goal state in which the drone travels along a current path which is a straight line from the first waypoint toward the second waypoint, operating a detection system to detect obstacles on the current path, switching the drone to a select path state in response to the detection of an obstacle on the current path, and operating the detection system to scan a potential left-side path and a right-side path and to analyze and select a divert path from the left-side path and the right-side path, the selected divert path becoming a current path. The method also includes switching from the select path state to a divert state, scanning the current path, a sub-goal path, and a direct path, the direct path being a direct path from a current position of the drone to the second waypoint, and the sub-goal path being a path that is angled a non-zero angle from the divert path toward the second waypoint, and selecting for continued travel the direct path when the direct path is clear and the sub-goal path when the sub-goal path is clear and the direct path is not clear.
[005] In another construction, a drone includes a frame, an engine supported by the frame, a propulsion element coupled to the engine and operable to propel the drone along a travel path, and a detection system coupled to the frame and operable to detect a bearing and distance to an obstacle. A steering assembly is coupled to the detection system and is operable to steer the drone along a go to goal path from a first waypoint to a second waypoint in response to the detection system not detecting the obstacle along a direct path and within a predetermined distance from the drone, and operable to divert the drone along a divert path in response to the detection system detecting the obstacle along the direct path within the predetermined distance.
A controller is operable to analyze data from the detection system and transition the drone from a go to goal state to a divert state in response to the detection of the obstacle, the controller operable to analyze the divert path, a sub-goal path, and a direct path and to select one of the direct path, the sub-goal path and the divert path for continued travel.
[006] The foregoing has outlined rather broadly the technical features of the present disclosure so that those skilled in the art may better understand the detailed description that follows.
Additional features and advantages of the disclosure will be described hereinafter that form the subject of the claims. Those skilled in the art will appreciate that they may readily use the conception and the specific embodiments disclosed as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Those skilled in the art will also realize that such equivalent constructions do not depart from the spirit and scope of the disclosure in its broadest form.
[007] Also, before undertaking the Detailed Description below, it should be understood that various definitions for certain words and phrases are provided throughout this specification and those of ordinary skill in the art will understand that such definitions apply in many, if not most, instances to prior as well as future uses of such defined words and phrases. While some terms may include a wide variety of embodiments, the appended claims may expressly limit these terms to specific embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] Fig. 1 is a perspective view of a drone in the form of a vertical take-off and landing aerial vehicle.
[009] Fig. 2 is a schematic illustration of an imaging system scanning in a searchlight mode and in a go to goal state.
[0010] Fig. 3 is a perspective view of an imaging system scanning in a full circle mode.
[0011] Fig. 4 is a schematic illustration of the drone operating in a select path state with the imaging system scanning one or more potential paths in the searchlight mode.
[0012] Fig. 5 is a schematic illustration of an imaging system operating in a divert state performing scans in four directions each in searchlight mode.
[0013] Fig. 6 is a schematic illustration of the drone operating in a corridor detection state with the imaging system scanning in the full circle mode to find multiple corridors through which the drone could pass.
[0014] Fig. 7 is a schematic illustration of the drone in a back away state and further illustrating an exclusion zone, a normal size, and an intermediate size.
[0015] Fig. 8 is a schematic illustration of a travel plan and actual flight path of a drone between three waypoints.
[0016] Fig. 9 is a more detailed schematic illustration of a portion of the travel plan and actual flight path of Fig. 8 between the second waypoint and the third waypoint.
[0017] Fig. 10 is a more detailed schematic illustration of a portion of the travel plan and actual flight path of Fig. 8 between the third waypoint and the first waypoint.
[0018] Fig. 11 is a schematic illustration of a travel plan and actual flight path of a drone operating in a corridor detection state.
[0019] Fig. 12 schematically illustrates a control system for the drone of Fig. 1.
[0020] Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
DETAILED DESCRIPTION
[0021] Various technologies that pertain to systems and methods will now be described with reference to the drawings, where like reference numerals represent like elements throughout.
The drawings discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged apparatus. It is to be understood that functionality that is described as being carried out by certain system elements may be performed by multiple elements. Similarly, for instance, an element may be configured to perform functionality that is described as being carried out by multiple elements. The numerous innovative teachings of the present application will be described with reference to exemplary non-limiting embodiments.
[0022] Also, it should be understood that the words or phrases used herein should be construed broadly, unless expressly limited in some examples. For example, the terms “including,” “having,” and “comprising,” as well as derivatives thereof, mean inclusion without limitation. The singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Further, the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The term “or” is inclusive, meaning and/or, unless the context clearly indicates otherwise. The phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like.
[0023] Also, although the terms "first", "second", "third" and so forth may be used herein to refer to various elements, information, functions, or acts, these elements, information, functions, or acts should not be limited by these terms. Rather these numeral adjectives are used to distinguish different elements, information, functions or acts from each other. For example, a first element, information, function, or act could be termed a second element, information, function, or act, and, similarly, a second element, information, function, or act could be termed a first element, information, function, or act, without departing from the scope of the present disclosure.
[0024] In addition, the term "adjacent to" may mean: that an element is relatively near to but not in contact with a further element; or that the element is in contact with the further portion, unless the context clearly indicates otherwise. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Terms “about” or “substantially” or like terms are intended to cover variations in a value that are within normal industry manufacturing tolerances for that dimension. If no industry standard is available, a variation of 20 percent would fall within the meaning of these terms unless otherwise stated.
[0025] When navigating, terms such as “path”, “bearing”, “course”, “direction”, and the like are used interchangeably and should be considered as having the same or similar meanings.
[0026] Fig. 1 illustrates an autonomous vehicle or drone 10 in the form of a multi-engine vertical take-off and landing aircraft. Of course, different forms of aircraft or vehicles that are not aircraft could also be used as autonomous vehicles or drones 10. For example, land-based vehicles such as automobiles, or water craft such as surface vessels and submersibles or combinations of these vehicles could be operated as autonomous vehicles or drones 10. While the remainder of this description will focus on a flyable drone 10, it should be clear that the invention is not limited to these types of vehicles.
[0027] The drone 10 of Fig. 1 includes a frame 15 that supports four engines 20, in the form of electric motors that each drive a rotor or blade 25 to provide lift, steering, and speed control for the drone 10 as is well known. One or more batteries are supported by the frame 15 and connected to the engines 20 to provide the necessary power. The frame 15 also supports landing elements 30 that can be fixed as illustrated in Fig. 1, or that can retract when the drone 10 is in the air. A sensor or imaging device 35, in the form of a LIDAR imaging device 35 (laser or light imaging, detection, and ranging) is mounted to the frame 15 and positioned to allow for clear imaging 360-degrees around the drone 10. A preferred LIDAR system includes a rotating imaging device 35 that can scan a 360-degree circle around the drone 10 with an accuracy of at least one-half degree (i.e., 720 points around the circle) and can return accurate distance measurements for objects up to one hundred meters away. In the illustrated construction, the imaging device 35 is positioned on the lower portion of the drone 10 but could also be positioned on the top. Alternatively, multiple imaging devices 35 could be positioned at various locations around the drone 10 to provide the same 360-degree imaging as the illustrated device. Also, while LIDAR is employed in the illustrated example, other constructions may use other imaging devices 35 such as color cameras, black & white cameras, stereo cameras, ultrasonic sensors, RADAR, infrared, and the like.
[0028] The imaging device 35 is part of a detection system 37 that is operable to detect obstacles around the drone 10 as it travels between waypoints. As illustrated in Fig. 12, the detection system 37 includes a controller 40 in the form of a microprocessor-based controller that receives data from the imaging device 35 and uses that data to determine a course for the drone 10. The controller 40 can also be coupled to the engines 20 and a steering device to also directly control the drone 10 or can be coupled to a second controller 45 that then controls the flight operations of the drone 10. The two separate controllers 40, 45 would operate for the sake of the present invention as a single controller 40a such that any reference to the controller 40 could also be considered a reference to the combined controller 40a. In the illustrated construction, the controller 40 can be programmed to direct the drone 10 to travel to several waypoints, in a desired order. The controller 40 cooperates with the imaging device 35 to chart the actual course as the drone 10 travels between the various waypoints as will be described with reference to Figs. 7-11.
[0029] The drone 10 may include a steering device that is operable to control the orientation, direction, and speed of travel of the drone 10. The type of steering device employed is largely a function of the type of drone 10 and the environment in which the drone 10 travels. In the illustrated construction, the drone 10 is in the form of a four-engine aerial vehicle or quadcopter. In this device, steering can be controlled by varying the speed of the individual engines 20. For example, typical quadcopters include two engines 20 that rotate clockwise and two that rotate counterclockwise to balance torque. Increasing the speed of certain engines 20 with respect to the remaining engines 20 can create an unbalanced torque which rotates the drone 10 about its vertical axis. In addition, increasing the speed of any two adjacent engines 20 will cause the drone 10 to move in a direction away from those engines 20 as the drone 10 tips slightly. Thus, typical aircraft control surfaces (e.g., rudders, ailerons, etc.) are unnecessary and the drone 10 is able to control roll, pitch, yaw, speed and course using only the engines 20. Other drones 10 may include a rudder or wheels that are used to control the direction of travel.
[0030] The drone 10 operates in one of several states at any given time to navigate between waypoints while simultaneously avoiding both fixed and dynamic obstacles. The detection system 37 may perform different types of scans depending upon the current operating state of the drone 10. It should be noted that the detection system 37 described herein scans what is essentially a two-dimensional plane centered on the drone 10. Other systems may include a three-dimensional scanner or imaging device as may be required for the particular application. The use of a three-dimensional imaging device provides the added benefit of collecting height data that can be used to enhance navigation.
[0031] Fig. 2 illustrates a first mode of scanning referred to as a searchlight scan 50. In a searchlight scan 50, the imaging device 35 scans a small angle or wedge 55 to determine the distance and direction to the nearest object within that wedge 55. In some constructions, the searchlight scan 50 first scans a small wedge 55 followed by one or more scans of wider wedges 55a, 55b. This can be useful as the narrow wedge 55 can miss objects that are slightly off a desired path 60 but that may be close enough to interfere with navigation. The second or third wedges 55a, 55b may detect these objects that are slightly off the desired path 60 but that may need to be avoided. With a searchlight scan 50, the controller 40 can calculate not only a distance to the nearest object near a particular bearing but can also calculate an approach distance or a minimum distance between the obstacle and the drone as the drone travels along the scanned bearing.
[0032] Fig. 2 illustrates a typical searchlight scan 50 that includes multiple differently angled wedges 55, 55a, 55b. Other figures herein illustrate searchlight scans as a single cone or even a single arrow for clarity. However, preferred searchlight scans 50 include multiple differently angled wedges 55, 55a, 55b. Of course, other constructions could use more or fewer wedges 55, 55a, 55b including only a single wedge 55.
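The approach-distance calculation described above reduces to simple trigonometry in the two-dimensional scan plane. The following is an illustrative sketch, not the patent's actual implementation; the function name and the assumption of a straight flight line are mine:

```python
import math

def approach_distance(obstacle_range_m, bearing_offset_deg):
    """Minimum distance between drone and obstacle if the drone flies
    straight along the scanned bearing.

    The obstacle is detected at `obstacle_range_m` metres, offset
    `bearing_offset_deg` degrees from the travel bearing.  The closest
    point of approach is the perpendicular distance from the obstacle
    to the flight line.
    """
    theta = math.radians(abs(bearing_offset_deg))
    if theta >= math.pi / 2:
        # Obstacle lies beside or behind the flight line; the drone
        # never gets closer than its current range.
        return obstacle_range_m
    return obstacle_range_m * math.sin(theta)
```

For example, an obstacle 10 meters away at a 30-degree offset yields an approach distance of 5 meters, while an obstacle dead ahead (0 degrees) yields an approach distance of zero.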
[0033] Fig. 3 illustrates a second scanning mode referred to as a full circle scan 65. In a full circle scan 65, the imaging device 35 scans a 360-degree circle around the drone 10 and provides the distance to the nearest object on every bearing around the drone 10. In some constructions, the full circle scan 65 provides at least one distance for each degree of arc, while more preferred constructions provide at least one distance to the nearest object for each half degree, resulting in 720 measurements. In still other constructions, greater resolution may be provided (e.g., at least 3600 measurements).
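A full circle scan of this kind can be represented as a flat list of ranges, one per angular step. The sketch below (function name and data layout are assumptions, not taken from the patent) shows how a controller might locate the nearest return in a 720-point, half-degree scan:

```python
def full_circle_nearest(ranges_m):
    """Given a full circle scan as a list of ranges in metres (one per
    angular step, e.g. 720 entries at half-degree resolution), return
    the bearing in degrees and the range of the nearest detected object.
    Index 0 is taken to be the reference bearing (e.g. the current course).
    """
    step_deg = 360.0 / len(ranges_m)
    # Find the index of the smallest range in the scan.
    idx = min(range(len(ranges_m)), key=lambda i: ranges_m[i])
    return idx * step_deg, ranges_m[idx]
```

With 720 entries, entry 90 corresponds to a bearing of 45 degrees from the reference direction.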
[0034] As mentioned, the drone 10 is capable of operating in a number of states with the type of scan performed by the imaging device 35 or the way the data collected by the various scans is used varying for each state. In a first state, referred to as the “go to goal” state 70 shown in Fig. 2, the drone 10 travels along the desired path 60 from its current location to its target location or waypoint. Typically, the drone 10 begins travel in the go to goal state 70. While traveling in the go to goal state 70, the detection system 37 uses data from the searchlight scan 50 that is directed along the desired path 60 of the drone 10. The imaging device 35 may also be performing full circle scans 65 but until an obstacle is detected along the desired path 60 within a predetermined range, data from the full circle scan 65 is not needed for navigation.
[0035] As illustrated in Fig. 4, if an obstacle 74 is detected in the desired path 60 of the drone 10 and within the predetermined range (e.g., 30 meters), the drone 10 switches to a second state referred to as the “select path” state 75. In the select path state 75, the detection system 37 receives searchlight scan data from the imaging device 35 from both a left side 80 and a right side 85 (e.g., 45 degrees left and 45 degrees right) of a current bearing 73. The controller 40 determines the best path to avoid the obstacle 74 based on the left side 80 and right side 85 scans and selects a divert path 88 (shown in Fig. 5). The controller 40 also saves a divert bias 90 for use in biasing the drone 10 back toward the target should another divert be required. A sub-goal path 95, or angle, can also be selected by the controller 40 at this time. In addition, a divert further bearing 100 can be selected at this point. The divert further bearing 100 could be used should an additional diversion be required.
[0036] Before proceeding, it is important to understand how particular paths are selected. A path that is “clear” is a path that, if followed, does not include an obstacle 74 that passes within a predefined distance (approach distance) of the drone 10 for a preselected distance. For example, the predefined distance could be ten meters and the preselected distance could be one hundred meters. Thus, if the drone 10 were to follow a certain path for one hundred meters and not pass within ten meters of an obstacle 74, that path would be deemed “clear”. If no path is clear, the controller 40 analyzes the various paths 80, 85 using a weight function to enable selection of the most desirable path 80, 85. When operating in the select path state, the left side 80 and the right side 85 are scanned. The weight function may take into account factors such as the desirability of the direction, the distance to the nearest detected obstacle 74, the smallest distance between the drone 10 and the obstacle 74 during travel along the path (the approach distance) as well as other factors. Once the weight function is calculated the controller 40 selects the most desirable path 80, 85 based on the weight function.
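The "clear path" test and the fallback weight function described in this paragraph might be sketched as follows. The ten-meter approach limit and one-hundred-meter lookahead repeat the example in the text, but the weight coefficients are purely illustrative assumptions; the patent names the factors without giving their relative importance:

```python
def is_path_clear(scan_hits, approach_limit_m=10.0, lookahead_m=100.0):
    """A path is 'clear' if following it for `lookahead_m` metres never
    brings the drone within `approach_limit_m` of any detected obstacle.

    `scan_hits` is a list of (range_m, approach_m) tuples for obstacles
    in the searchlight wedge: range along the path, and the closest
    point of approach if the drone keeps flying.
    """
    for range_m, approach_m in scan_hits:
        if range_m <= lookahead_m and approach_m < approach_limit_m:
            return False
    return True

def path_weight(direction_desirability, nearest_range_m, approach_m):
    """Fallback score when no path is clear; higher is better.
    Coefficients here are assumptions for illustration only."""
    return (1.0 * direction_desirability
            + 0.05 * nearest_range_m
            + 0.1 * approach_m)
```

A hit 40 meters out with a 12-meter approach distance leaves the path clear; the same hit with a 5-meter approach distance does not.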
[0037] A third state of operation is referred to as the “divert” state 105 and is illustrated in Fig. 5. The drone 10 switches to the divert state 105 after completing the path selection in the select path state 75. When in the divert state 105, the detection system 37 instructs the imaging device 35 to perform one or more separate searchlight scans 50. In order to increase the speed of the scans and the decision making of the controller 40, the searchlight scans 50 are performed in an order that reflects the most desirable course for the drone 10. By scanning in this order, the drone 10 can select the first clear path and skip any remaining scans of the less desirable paths. The first scan is directed along a direct bearing path 110 centered on a bearing that leads from the drone 10 directly to the target location or waypoint. If this scan is clear, no further scans are necessary, and the drone 10 selects the direct bearing path 110 for continued travel. The second scan is directed along the sub-goal path 95 that is angled a predetermined angle 96 (e.g., 15 degrees) off the divert or current course 88 and toward the target location or waypoint. Again, if the second scan is clear, the drone 10 selects the sub-goal path 95 and any additional scans are omitted. The third searchlight scan 50 is centered on the divert or current course 88 on which the drone 10 is currently traveling. If none of the three courses 88, 95, 110 is clear, the drone 10 can perform a searchlight scan 50 along the divert further bearing 100 and then select the best course using the weight function. Again, the weight function is calculated for each path and can take into account the desirability of the direction, the distance to the nearest detected obstacle 74, the approach distance, any directional bias from prior diverts, as well as other factors.
In the divert state 105 the direct bearing path 110 is the most desirable path with the sub-goal path 95 being the second most desirable and the current or divert path 88 being the third most desirable. The weight factor would account for these desirabilities as well as other factors. As the drone 10 proceeds, it might reach a point where the obstacle enters a predetermined range along the most desirable course (based on the weight factor). When this occurs, the drone 10 can switch to a fourth state of operation.
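The priority-ordered scanning in the divert state (direct path first, then sub-goal, then the current divert course, then the divert further bearing) can be sketched as a lazy evaluation that stops at the first clear path and otherwise falls back to the weight function. The callback-based structure below is an assumption for illustration, not the patent's implementation:

```python
def choose_divert_path(scan, weight):
    """Evaluate candidate bearings in the divert-state priority order
    and return the first clear one, skipping any remaining scans; if
    no candidate is clear, fall back to the highest-weighted option.

    `scan(name)` performs (or looks up) the searchlight scan for the
    named candidate and returns (is_clear, weight_inputs);
    `weight(name, weight_inputs)` computes the fallback score.
    """
    priority = ["direct", "sub_goal", "current", "divert_further"]
    best_name, best_weight = None, float("-inf")
    for name in priority:
        is_clear, inputs = scan(name)  # scans run lazily, in order
        if is_clear:
            return name                # first clear path wins
        w = weight(name, inputs)
        if w > best_weight:
            best_name, best_weight = name, w
    return best_name
```

Because the scans run lazily, a clear direct path means the sub-goal, current, and divert-further bearings are never scanned at all, mirroring the skipped scans described above.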
[0038] The fourth state of operation is activated if, while in the divert state 105, all four (or more) travel options are deemed not clear and the drone has closed to within a predefined distance of the obstacle 74 along the current path 88. The fourth state, referred to as the “corridor detection” state 115 and illustrated in Fig. 6, uses the full circle scan 65 to find the edges of one or more corridors 120 around the drone 10. The controller 40 then scores the various corridors 120 based on width, mean range, closest range, and the distance of the corridor 120 from the target waypoint and selects the best corridor 120 for further travel. Upon selecting a direction and beginning travel, the drone 10 switches back to the divert state 105.
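Corridor scoring on width, mean range, closest range, and offset from the goal might look like the sketch below. The coefficients are assumptions, since the patent names the criteria but not their relative weights:

```python
def score_corridor(width_m, mean_range_m, closest_range_m, offset_from_goal_deg):
    """Score a candidate corridor; wider, deeper corridors aimed closer
    to the goal bearing score higher.  Coefficients are illustrative
    assumptions only."""
    return (2.0 * width_m
            + 0.5 * mean_range_m
            + 1.0 * closest_range_m
            - 0.2 * abs(offset_from_goal_deg))

def pick_corridor(corridors):
    """`corridors` is a list of (label, width_m, mean_range_m,
    closest_range_m, offset_from_goal_deg) tuples; return the label of
    the best-scoring corridor."""
    return max(corridors, key=lambda c: score_corridor(*c[1:]))[0]
```

Under these assumed weights, a 20-meter-wide corridor pointing 5 degrees off the goal outscores an 8-meter-wide corridor pointing 10 degrees off the goal.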
[0039] Fig. 7 illustrates a fifth state of operation, referred to as the “back away” state 125, which is activated in response to an object or obstacle 126 being detected within an exclusion zone 130 (e.g., 5 meters) of the drone 10. When the detection system 37 detects the object 126 in the exclusion zone 130, the controller 40 immediately redirects the drone 10 away from the detected object 126 until the object 126 is out of the exclusion zone 130. Preferably, the drone 10 moves along a path 131 that is 180 degrees away from the detected object 126. Typically, the back away state 125 is triggered by a dynamic or moving object 126, as the other states 70, 75, 105, 115 will inhibit the entry of stationary objects within the exclusion zone 130.
[0040] In order for the drone 10 to navigate between waypoints while avoiding obstacles 74, 126, the size of the drone 10 must be defined. As discussed, the drone 10 has an exclusion zone 130, illustrated in Fig. 7, in which no objects 126 are allowed. Thus, this exclusion zone 130 can be considered the smallest size of the drone 10 for purposes of object avoidance. However, the exclusion zone 130 is intended to be absolutely avoided. Other zones or sizes can be provided that can be violated under certain circumstances. For example, a normal size or short range 135 (e.g., 15 meters) could be used during travel in the go to goal state 70 or the divert state 105 as the predetermined range. Those states 70, 105 would both operate to avoid entry of any obstacle 74, 126 within the circle or sphere defined by the normal size 135. However, when switching to the corridor detection state 115, the size could be reduced to an intermediate size 140 (e.g., 10 meters) between the normal size 135 and the exclusion zone 130 to aid in finding a suitable corridor 120. While three sizes 130, 135, 140 or zones have been described, additional sizes or zones that define differently sized circles, spheres or other shapes, could be employed as desired to increase efficiency or speed, or to provide additional clearances during travel.
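The back away maneuver and the three zone sizes from this passage reduce to a few constants and one bearing calculation. The numeric radii repeat the example values given in the text; the constant and function names are illustrative assumptions:

```python
# Zone radii repeating the example values in the text (metres).
EXCLUSION_ZONE = 5.0      # never violated; an intruder triggers back away
INTERMEDIATE_SIZE = 10.0  # reduced size used while hunting for corridors
NORMAL_SIZE = 15.0        # used in the go to goal and divert states

def back_away_bearing(object_bearing_deg):
    """In the back away state the drone retreats directly away from the
    intruding object: 180 degrees opposite the object's bearing."""
    return (object_bearing_deg + 180.0) % 360.0
```

An object detected at a bearing of 30 degrees thus produces a retreat bearing of 210 degrees.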
[0041] Fig. 8 illustrates one possible travel plan 145 for the drone 10 traveling from a first waypoint 150, to a second waypoint 155, to a third waypoint 160, and then returning to the first waypoint 150. Fig. 8 illustrates fixed obstacles 74 such as trees and dynamic objects 126 such as people that can hinder travel. However, it should be noted that any object 74, 126 could be a hindrance to travel including buildings, wires, utility poles, wind turbines, etc. In addition, no go zones can be defined for the controller 40 and these areas can be treated the same as other objects 74, 126 that hinder travel. Typically, no go zones are bounded using GPS (global positioning system) coordinates to define fixed volumes or areas to be avoided. However, it is important to note that an advantage of the system described herein is that programming or modeling of obstacles 74, 126 prior to traveling is not necessary. Rather, the system navigates and detects obstacles 74, 126 without being pre-programmed to know where those obstacles 74, 126 are located. The use of no go zones may be advantageous for avoiding things that really are not obstacles for travel. For example, it may be desirable to not pass over large bodies of water, busy intersections, airports, etc. while traveling between waypoints. These areas would typically not be detected as obstacles 74, 126 as they are flat and do not present a physical hindrance that cannot be passed through (such as a tree or building) but are nonetheless to be avoided.
[0042] The operation of the drone 10 will now be described with reference to Figs. 8-11. It is important to note that to complete the travel plan 145 illustrated in Fig. 8, all that is required is that the drone 10 be programmed with the GPS coordinates of the waypoints 150, 155, 160. The actual courses, course changes, and location of obstacles 74, 126 do not need to be programmed into the controller 40 or drone 10. As illustrated in Fig. 8, the drone 10 begins travel at the first waypoint 150 with the second waypoint 155 being the target or goal. The drone 10 begins in the go to goal state 70 in which it takes a direct course along the desired path 60 to the second waypoint 155. As illustrated, the only potential obstacle between the first waypoint 150 and the second waypoint 155 is a person 126a. However, that person 126a is off the desired path 60, thereby allowing the drone 10 to travel all the way to the second waypoint 155 while in the go to goal state 70. The drone 10 could be programmed to land, hover, or perform other activities at the second waypoint 155 before proceeding to the third waypoint 160.
[0043] The drone 10 next proceeds from the second waypoint 155 toward the third waypoint 160. However, the path between the second waypoint 155 and the third waypoint 160 includes a number of obstacles 74a-74c in the form of trees or bushes. Fig. 9 illustrates in greater detail how the detection system 37 and the controller 40 operate to direct the drone 10 around the obstacles 74a-74c. As discussed, the drone 10 initially begins its travel in the go to goal state 70 and follows the desired path 60 that is a straight path between the second waypoint 155 and the third waypoint 160. Upon detection of the first obstacle 74a directly in the desired path 60 of the drone 10, the drone 10 transitions to the select path state 75. In the select path state 75, the detection system 37 scans both left 80 and right 85 of the desired path 60. In the illustrated construction, the left scan 80 and right scan 85 are angled 45 degrees from the desired path 60 or current path of the drone 10 with other angles also being possible. The scan to the left 80 detects the edge of the first object 74a and also detects the second object 74b. The scan to the right 85 does not detect any obstacles and the right-side path is selected as the divert path 88 for continued travel. A left bias is stored for application to weight factors to bias any additional divert decisions to the left and toward the third waypoint 160. The drone 10 proceeds along the divert path 88 and switches to the divert state 105. In the divert state 105, the divert path 88 becomes the current path, the sub-goal path 95 is selected to be a path angled a predetermined amount (e.g., 15 degrees) to the left of the divert path 88 (i.e., toward the third waypoint 160) and the direct path 110 is the direct path from the current position of the drone 10 to the third waypoint 160.
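The sub-goal path, offset a fixed angle (15 degrees in the example) from the divert course toward the target waypoint, can be computed from two compass bearings. This sketch uses assumed function names and clamps the offset so the sub-goal never swings past the goal bearing itself:

```python
import math

def sub_goal_bearing(divert_bearing_deg, goal_bearing_deg, angle_deg=15.0):
    """Offset the current divert bearing by up to `angle_deg` degrees
    toward the bearing of the target waypoint."""
    # Signed shortest angular difference from divert course to goal,
    # normalised into (-180, 180].
    diff = (goal_bearing_deg - divert_bearing_deg + 180.0) % 360.0 - 180.0
    # Step toward the goal, but never further than the goal bearing.
    step = math.copysign(min(angle_deg, abs(diff)), diff) if diff else 0.0
    return (divert_bearing_deg + step) % 360.0
```

For a divert course of 90 degrees and a goal bearing of 0 degrees, the sub-goal path is 75 degrees, i.e. 15 degrees to the left of the current course, matching the behavior described above.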
[0044] During travel in the divert state 105 the detection system 37 continues to scan the possible paths 88, 95, 100, 110 as discussed with regard to Fig. 5 and selects the first path that is free from obstructions 74, 126 within a predetermined distance (e.g., 50 meters) in the order of the direct path 110 and the sub-goal path 95. Alternatively, the best path is selected using the weight factors should the direct path 110 and the sub-goal path 95 not be clear. As illustrated in Fig. 9, the drone 10 at a first scan point 195 in the divert state 105 performs the scans and selects the sub-goal path 95 as the direct path 110 is blocked by the third tree 74c. While the divert path 88 is also unblocked, the sub-goal path 95 is more desirable as it is closer to the correct direction to get to the third waypoint 160.
[0045] The drone 10 proceeds to a second scan point 200, while still in the divert state 105, and again performs the predefined scans in the predefined order. At the second scan point 200, the first scan, performed along the direct path 110, finds that path clear, and the drone 10 selects the direct path 110 for continued travel. The remaining scans are unnecessary and are not performed. Upon selecting the direct path 110, the drone 10 transitions to the go to goal state 70.
[0046] Returning to Fig. 8, the drone 10 next travels from the third waypoint 160 back to the first waypoint 150. The only obstacle 74, 126 on the path between the third waypoint 160 and the first waypoint 150 is a person 126b wandering aimlessly. As illustrated in Fig. 10, as the drone 10 approaches the person 126b, the drone 10 transitions from the go to goal state 70 to the select path state 75. The detection system 37 and the controller 40 determine that the best path forward is to veer to the left to go around the person 126b. The drone 10 selects the left path, sets that path as its divert path 88, and transitions to the divert state 105. At a subsequent scan point 210, the drone 10 again detects the person 126b in its path, the person 126b having wandered in front of the drone 10 once again. In this case, the divert or current path 88 is blocked and the sub-goal path 95 is blocked, but the direct path 110 is open. The drone 10 therefore selects the direct path 110 and proceeds toward the first waypoint 150. Once the direct path 110 is selected, the drone 10 transitions back to the go to goal state 70.
[0047] Had the person 126b of Fig. 10 wandered into the exclusion zone 130 of the drone 10, the drone 10 would have detected the person's presence, immediately transitioned to the back away state 125, and traveled 180 degrees away from the person 126b until the person 126b was outside of the exclusion zone 130.
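The exclusion zone check and the 180-degree retreat can be sketched with simple planar geometry. The function names and the x-y coordinate convention are assumptions made for this sketch:

```python
import math

def in_exclusion_zone(drone_xy, object_xy, radius):
    """True when a detected object lies inside the drone's exclusion zone."""
    return math.hypot(object_xy[0] - drone_xy[0],
                      object_xy[1] - drone_xy[1]) < radius

def back_away_bearing(drone_xy, object_xy):
    """Bearing (radians) pointing 180 degrees away from the object, i.e.
    along the line from the object through the drone's current position."""
    return math.atan2(drone_xy[1] - object_xy[1],
                      drone_xy[0] - object_xy[0])
```

In the back away state the drone would hold this bearing, re-evaluating on each scan, until the object no longer satisfies the exclusion-zone test.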
[0048] Turning to Fig. 11, an example of the corridor detection state 115 is illustrated. In this example, the drone 10 has entered an area with several obstacles 74, 126 while the drone 10 is in the divert state 105. At a first scan location 220, the drone 10 scans the paths of the divert state 105 and finds that all are blocked by obstacles 74f, 126d, 74h. The drone 10 transitions to the corridor detection state 115 and performs a full circle scan 65 to find potential paths out of the crowded area. As previously noted, the controller 40 could set the size of the drone 10 to a smaller value to allow for travel through smaller corridors 120 while in the corridor detection state 115 if desired. Upon completing the full circle scan 65, the controller 40 identifies two possible corridors 120a, 120b. The controller 40 calculates weight factors for the available options based on several criteria (e.g., size, direction, distance from goal, etc.) and selects one of the corridors 120a, 120b as the desired path. The drone 10 proceeds along the selected path 120b, transitions to the divert state 105, and sets the selected course as the divert or current course 88.
At a second scan position 235 illustrated in Fig. 11, the drone 10 determines that the direct path 110 is still blocked but the sub-goal path 95 is clear. The drone 10 thus turns onto the sub-goal path 95 and the sub-goal path becomes the current path 88. The drone 10 reaches a third scan point 240 and again performs a scan. At the third scan point 240, the direct path 110 is clear, and the drone 10 turns to travel along the direct path 110. Once on the direct path 110, the drone 10 transitions to the go to goal state 70 and proceeds to a desired waypoint 111.
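The full circle corridor search can be sketched as a pass over per-degree range readings from the 360-degree scan. The parameter names are assumptions, and angular wraparound at the 0/360 boundary is ignored for brevity:

```python
def find_corridors(ranges, min_width_deg, min_range):
    """Return (start, end) degree spans whose readings all exceed min_range
    and that are at least min_width_deg wide; each span is a candidate
    corridor to be scored by the weight factors (size, direction, etc.)."""
    corridors, start = [], None
    for i in range(len(ranges) + 1):
        is_open = i < len(ranges) and ranges[i] >= min_range
        if is_open and start is None:
            start = i                       # corridor opens at this bearing
        elif not is_open and start is not None:
            if i - start >= min_width_deg:  # wide enough to fly through
                corridors.append((start, i - 1))
            start = None
    return corridors
```

Reducing the assumed drone size, as the description permits in the corridor detection state, would correspond to lowering min_width_deg so that narrower spans qualify.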
[0049] The system just described provides for fully autonomous drone navigation through a complex environment that can include stationary obstacles as well as dynamic or moving obstacles. The system is fully predictable, with decisions being logged for later review and analysis. The system requires no pre-programming other than to define the desired waypoints, typically using GPS coordinates. It should be noted that the description above illustrates scans occurring at discrete scan points. In practice, however, the scans are performed on a nearly continuous basis to achieve the shortest route around any obstacles and toward the desired waypoint.
[0050] The system identifies the edges of obstacles 74, 126 and is able to traverse the perimeter of objects 74, 126 until the target waypoint is visible, at which point the drone 10 travels on a direct path to that waypoint. Using a deterministic approach, rather than a map-based or deep-learning-based approach, allows for more efficient troubleshooting and analysis of the algorithm. In addition, the system is far simpler than systems in which the environment is modeled or pre-mapped and added to the navigation system for guidance. Additionally, the ability to address dynamic obstacles 126 is unique when compared to pre-programmed devices.
[0051] The system is fully powered and implemented on the drone 10 so that no outside communication or direction is required for the drone 10 to navigate between waypoints. The only user intervention required is the identification of the specific waypoints.
[0052] Although an exemplary embodiment of the present disclosure has been described in detail, those skilled in the art will understand that various changes, substitutions, variations, and improvements disclosed herein may be made without departing from the spirit and scope of the disclosure in its broadest form.
[0053] None of the description in the present application should be read as implying that any particular element, step, act, or function is an essential element, which must be included in the claim scope: the scope of patented subject matter is defined only by the allowed claims.
Moreover, none of these claims are intended to invoke a means plus function claim construction unless the exact words "means for" are followed by a participle.

Claims

What is claimed is:
1. A method of navigating a drone from a first waypoint to a second waypoint, the method comprising:
providing a detection system coupled to the drone;
operating the detection system to detect a bearing and distance from a current position of the drone to an obstacle;
operating the drone in a go to goal state in response to not detecting the obstacle on a bearing between the current position and the second waypoint within a predefined first distance, wherein the drone travels on a direct path from the current position toward the second waypoint; and
operating the drone in a divert state in response to detecting the obstacle on the bearing between the current position and the second waypoint within the predefined first distance, wherein the detection system analyzes potential paths to a left side and a right side of the obstacle and selects a desired divert path for continued travel.
2. The method of claim 1, wherein the detection system includes a LIDAR system and wherein the LIDAR system is operable to scan a wedge having a predetermined angle along a scan bearing.
3. The method of claim 2, wherein the LIDAR system scans a plurality of wedges each having a different dispersion angle along the scan bearing.
4. The method of claim 1, wherein when the drone is in the divert state the detection system scans along the divert path, a direct path which extends from the current position toward the second waypoint, and a sub-goal path which is angled a predetermined angle from the divert path toward the second waypoint.
5. The method of claim 4, further comprising selecting a desired path of travel from one of the divert path, the sub-goal path, the direct path, and a further divert path wherein the direct path is selected if the direct path is clear and the sub-goal path is selected if the direct path is not clear and the sub-goal path is clear, the selected desired path becoming a current path.
6. The method of claim 5, further comprising calculating a weight factor for each of the current path, the sub-goal path, the direct path, and the further divert path when none of the direct path, the sub-goal path, and the current path are clear, and selecting the desired path based on the weight factor.
7. The method of claim 5, further comprising transitioning the drone to a corridor detection state in response to detecting an obstacle within a predefined distance along the current path.
8. The method of claim 7, wherein the detection system scans a 360-degree area around the drone when in the corridor detection state to collect data including a distance to a nearest object along at least 360 directions, and wherein the drone selects a desired direction of travel based on an analysis of the data collected in the 360-degree scan.
9. The method of claim 8, wherein the drone transitions to the divert state upon selecting the desired direction of travel.
10. The method of claim 1, further comprising defining an exclusion zone around the drone, transitioning to a back away state in response to detecting an object within the exclusion zone, and directing the drone along a path that is 180 degrees away from the object.
11. The method of claim 1, wherein the detection system scans a two-dimensional area that extends 360 degrees around the drone.
12. A method of navigating a drone from a first waypoint to a second waypoint, the method comprising:
operating the drone in a go to goal state in which the drone travels along a current path which is a straight line from the first waypoint toward the second waypoint;
operating a detection system to detect obstacles on the current path;
switching the drone to a select path state in response to the detection of an obstacle on the current path;
operating the detection system to scan a potential left-side path and a right-side path and to analyze and select a divert path from the left-side path and the right-side path, the selected divert path becoming a current path;
switching from the select path state to a divert state;
scanning the current path, a sub-goal path, and a direct path, the direct path being a direct path from a current position of the drone to the second waypoint, and the sub-goal path being a path that is angled a non-zero angle from the divert path toward the second waypoint; and
selecting for continued travel the direct path when the direct path is clear and the sub-goal path when the sub-goal path is clear and the direct path is not clear.
13. The method of claim 12, wherein the detection system includes a LIDAR system and wherein the LIDAR system is operable to scan a wedge having a predetermined dispersion angle along a scan bearing.
14. The method of claim 13, wherein the LIDAR system scans a plurality of wedges each having a different dispersion angle along the scan bearing.
15. The method of claim 12, further comprising selecting a desired path of travel from one of the current path, the sub-goal path, the direct path, and a further divert path, the method further comprising calculating a weight factor for each of the current path, the sub-goal path, the direct path, and the further divert path when none of the direct path, the sub-goal path, and the current path are clear, and selecting the desired path based on the weight factor, the selected desired path becoming the current path.
16. The method of claim 12, further comprising transitioning the drone to a corridor detection state in response to detecting an obstacle within a predefined distance along the current path.
17. The method of claim 16, wherein the detection system scans a 360-degree area around the drone when in the corridor detection state to collect data including a distance to a nearest object in at least 360 directions, and wherein the drone selects a desired direction of travel based on an analysis of the data collected in the 360-degree scan.
18. The method of claim 17, wherein the drone transitions to the divert state upon selecting the desired direction of travel.
19. The method of claim 12, further comprising defining an exclusion zone around the drone, transitioning to a back away state in response to detecting an object within the exclusion zone, and directing the drone along a path that is 180 degrees away from the object.
20. The method of claim 12, wherein the detection system scans a two-dimensional area that extends 360 degrees around the drone.
21. A drone comprising:
a frame;
an engine supported by the frame;
a propulsion element coupled to the engine and operable to propel the drone along a travel path;
a detection system coupled to the frame and operable to detect a bearing and distance to an obstacle;
a steering assembly coupled to the detection system and operable to steer the drone along a go to goal path from a first waypoint to a second waypoint in response to the detection system not detecting the obstacle along a direct path and within a predetermined distance from the drone, and operable to divert the drone along a divert path in response to the detection system detecting the obstacle along the direct path within the predetermined distance; and
a controller operable to analyze data from the detection system and transition the drone from a go to goal state to a divert state in response to the detection of the obstacle, the controller operable to analyze the divert path, a sub-goal path, and a direct path and to select one of the direct path, the sub-goal path and the divert path for continued travel.
22. The drone of claim 21, wherein the propulsion element includes a rotor and the drone is an autonomous aircraft.
PCT/US2018/047737, filed 2018-08-23: Drone waypoint navigation system (published as WO2020040773A1).

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2018/047737 WO2020040773A1 (en) 2018-08-23 2018-08-23 Drone waypoint navigation system


Publications (1)

Publication Number: WO2020040773A1
Publication Date: 2020-02-27

Family ID: 63556447


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113589821A (en) * 2020-08-20 2021-11-02 Shenzhen Hairou Innovation Technology Co., Ltd. Warehouse robot navigation route reservation
FR3110999A1 (en) * 2020-05-28 2021-12-03 Airbus Helicopters Method and system for the detection and avoidance of obstacles in several detection spaces for aircraft
WO2022179277A1 (en) * 2021-02-25 2022-09-01 Jingdong Kunpeng (Jiangsu) Technology Co., Ltd. Unmanned vehicle path optimization method and related device
CN116301060A (en) * 2023-05-24 2023-06-23 Wuhan Tianyan Zhida Technology Co., Ltd. Unmanned aerial vehicle control method, unmanned aerial vehicle control device, computer equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010007115A2 (en) * 2008-07-15 2010-01-21 Aerospy Sense And Avoid Technology Gmbh System and method for preventing a collision
US20160068267A1 (en) * 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Context-based flight mode selection
WO2017222542A1 (en) * 2016-06-24 2017-12-28 Intel IP Corporation Unmanned aerial vehicle avoiding obstacles




Legal Events

121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 18769000; Country of ref document: EP; Kind code of ref document: A1.
NENP: Non-entry into the national phase. Ref country code: DE.
122 (EP): PCT application non-entry in European phase. Ref document number: 18769000; Country of ref document: EP; Kind code of ref document: A1.