AU2013101142A4 - Method of defining a UAV flight path using 3D parametric geometry - Google Patents


Info

Publication number
AU2013101142A4
AU2013101142A4 (application AU2013101142A)
Authority
AU
Australia
Prior art keywords
waypoints
gui
vehicle
target
payload
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2013101142A
Inventor
Scott Charles Parker
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Parker, Scott Charles MR
Original Assignee
Parker, Scott Charles MR
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Parker, Scott Charles MR filed Critical Parker, Scott Charles MR
Priority to AU2013101142A priority Critical patent/AU2013101142A4/en
Application granted granted Critical
Publication of AU2013101142A4 publication Critical patent/AU2013101142A4/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Landscapes

  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Instructional Devices (AREA)

Abstract

This disclosure relates to methods and systems for planning and generating an unmanned aerial vehicle's flight path using parametric geometry in a 3D graphical user interface. The generated flight path, including the waypoints, camera orientations and payload activities at each waypoint, is completely parametric and defined in relative and potentially unit-less coordinates. It remains independent of real-world coordinates until the virtual coordinate system is mapped to a specific location and orientation.

Description

DESCRIPTION

Title

Method of defining a UAV flight path using 3D parametric geometry

Technical Field

[0001] The present invention relates to flight path generation for autonomous unmanned aerial vehicles (UAVs).

Background

[0002] Autopilots are well known in the art for sensing and controlling the position and orientation of unmanned aerial vehicles (UAVs). Autopilots can also be used to control the position and orientation of stabilised sensor payloads, such as pan/tilt camera gimbals, that are commonly mounted to unmanned vehicles for surveillance purposes. It is well known in the art that unmanned vehicle autopilots can be programmed to follow a set of GPS waypoints and perform a set of payload activities at each waypoint.

[0003] UAVs such as helicopters and multi-rotor aircraft form a subset of the unmanned vehicle category and may have vertical take-off-and-landing (VTOL) capability. VTOL typically means they also have the ability to maintain an approximately stationary hover. Unlike fixed-wing aircraft, which need to maintain forward flight, these VTOL UAVs are able to move in an arbitrary direction, stop mid-flight and even reverse their movements along their flight path.

[0004] VTOL UAVs are good candidates for photographic inspection and other forms of surveillance because their arbitrary flight paths and VTOL capabilities allow them to be launched and landed from unprepared sites without any additional infrastructure. Automating these UAVs, as opposed to having a human actively fly them via remote control, greatly increases their capabilities and allows them to be operated beyond visual line of sight, along repeatable flight paths and/or with a speed, agility and efficiency that would be difficult or impossible to achieve with human-in-the-loop control.

[0005] In addition to automating the flight path of the unmanned vehicle, it is equally possible and advantageous to automate the functions of one or more onboard sensor payloads.
In some UAV platform configurations a vehicle's autopilot will control one or more sensor payloads directly, e.g. triggering a camera shutter. In other configurations a sensor payload may contain an autopilot capable of interpreting high-level commands while maintaining lower levels of autonomy. For example, the inertial stabilisation of a camera could be performed automatically by the sensor payload autopilot, while high-level steering commands could be received from the vehicle's autopilot and combined with the lower-level stabilisation. Typically the functionality of a vehicle's autopilot is limited as much as possible because it must have very high reliability and in some cases meet certification standards. Sensor payloads are not typically flight-critical and are therefore better candidates for complex processing tasks that are inherently less reliable. Due to the variety of unmanned vehicle configurations, a mission plan may need to be separated and different parts uploaded to the vehicle's autopilot and one or more sensor autopilots to affect the vehicle position, vehicle orientation, payload orientation and payload activities.

[0006] Computer-aided design (CAD) software is well known in the art for the three-dimensional display of coordinate systems, points, lines, surfaces and solids, with various view perspectives, view rotation methods and human-machine interfaces. Computer-aided machining (CAM) is a CAD-related field that involves generating G-code from CAD geometry. G-code is a standard set of macro commands that can be interpreted by a wide variety of Computer Numerically Controlled (CNC) machines to produce linear and rotational motion of multiple axes at specified linear and rotational speeds. G-code may also contain commands for controlling machine activities, such as turning a coolant liquid on or off.

Summary

[0007] Many countries are now permitting unmanned aircraft to fly in civilian airspace and over populated areas.
This is a significant shift from their use by military forces for long-range, overhead surveillance and weapons deployment. Access to civilian airspace has created new missions that require close-range, oblique-angle surveillance. Bridge and power-pole inspection are good examples, as is taking aerial photographs of a multi-vehicle highway accident from many different viewing angles. This shift in surveillance requirements, and the need to empower unskilled end users, prompted the development of this software and the new techniques described herein for generating oblique-angled surveillance flight paths with a simpler and more intuitive interface than was previously available.

[0008] Prior methods for building a UAV flight path in a GUI involved clicking on a 2D geo-referenced map and sequentially creating a series of GPS waypoints. In addition to the GPS coordinates extracted from the map, each waypoint is assigned an altitude component and any specific payload activities. Other, more advanced flight path software can automatically create waypoints for a UAV to follow such that, given specific camera parameters and image overlap, the UAV can survey a user-defined polygon area of a map at a user-defined altitude.

[0009] The present invention describes software having a graphical user interface (GUI) that combines the basic elements of parametric 3D CAD software with a range of aircraft and mission parameters to generate a surveillance "flight path" comprising a plurality of waypoints, payload orientations at each waypoint and payload activities to be completed at each waypoint. The generated flight path is precise and repeatable. It is also created in a virtual world that is relative to a local origin and independent of GPS coordinates. It can be converted to real-world coordinates and/or visualised at a real location using augmented reality at any time by establishing a coordinate transform between the virtual world and the real world.
[0010] Those skilled in the art will appreciate that a single flight between take-off and landing may include several distinct flight paths that contain commands and instructions for different phases of a UAV's flight, including launching, travel to target, surveillance, emergency landing, return to base, and landing. In an exemplary embodiment the methods and processes described herein would most likely generate a segment of a flight path in one of these phases. Multiple segments could be created and followed by a UAV's autopilot, most often in a sequential manner. In this description a "Flightpath" refers to a segment of a flight path in one phase of the UAV's entire flight.

[0011] An exemplary embodiment would contain a user-interface feature that enables the user to interactively visualise changes to a Flightpath resulting from a parameter varying, perhaps discretely, over a range of values. The user has the option of interactively selecting their preferred value for that parameter, and in this way achieves improved efficiency and avoids the errors of manual data entry. A novice would create a Flightpath by stepping through a number of parameters and visually selecting their preferred option from a range of alternatives presented on screen. In an exemplary embodiment the range of alternatives would be represented graphically and shown as a number of different Flightpaths rather than as a range of values for that specific parameter. The steps required to define a Flightpath can be compared to CAM software that generates a tool path for a CNC machine, where the machine and tool parameters in the CAM software are replaced with aircraft and payload parameters in the flight path generation software described herein. This combination of GUI elements and method of interactively selecting alternatives is innovative and a significant improvement on existing flight path generation methods that use 2D geo-referenced maps.
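The parameter-sweep interaction of paragraph [0011] can be sketched in a few lines of code. The following Python fragment is a minimal illustration only; the function name and data layout are assumptions, not taken from the patent. It generates one candidate per discrete elevation value, assuming a cylindrical Vehicle Locus coaxial with a vertical Target, so a GUI could render each alternative for graphical selection:

```python
import math

def elevation_alternatives(radius, target_height, step_deg=20, count=5):
    """Produce one candidate per discrete elevation angle, in the spirit
    of paragraph [0011]: the user is shown alternatives graphically and
    picks one, rather than typing a value. Assumes a cylindrical Vehicle
    Locus of the given radius around a vertical Target."""
    alternatives = []
    for i in range(count):
        elevation = math.radians(i * step_deg)
        # Height of the waypoint ring that views the target point at this elevation
        ring_height = target_height + radius * math.tan(elevation)
        alternatives.append({"elevation_deg": i * step_deg,
                             "ring_height": ring_height})
    return alternatives
```

A GUI built this way would draw each alternative as a distinct candidate Flightpath and let the user click their preferred one rather than enter a number.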
Brief Description of Drawings

[0012] Drawing #1 illustrates the definition of the Target, which in this example is a straight line 202 defined by two points 200 and 201 in 3D space. The coordinate system is typical of CAD software and uses orthogonal X, Y and Z axes. The measurement units of the coordinate system can be real, as in metres, or arbitrary, as in unit-less. If the units are arbitrary they will need to be converted to real units at a later stage so that a meaningful Flightpath can be created and followed.

[0013] Drawing #2 illustrates the definition of the Vehicle Locus 300. In this example a cylindrical surface is being used as the Vehicle Locus; however, a number of other surfaces could be used instead, including a flat plate, sphere or capsule. The Vehicle Locus is parametric, which in the case of a cylinder means that the radius, height, position and orientation are all controlled by parameters that can be modified to effect an update of the geometry.

[0014] Drawing #3 illustrates the presentation of a set of alternative elevations 400, 401, 402, 403 and 404 that are all spaced twenty degrees apart. These alternatives are presented to the user and one or more of them could be selected. Line 400 represents an elevation of zero degrees from the Target with respect to the XY plane. If the mission requires the UAV to capture images along the length of the Target at a number of different elevation angles then multiple lines would be selected; otherwise, a single line would be selected for the elevation.

[0015] Drawing #4 illustrates the definition of Waypoints 500 and 501 located on the Vehicle Locus 300. In this example the Waypoints have a positive elevation of twenty degrees from the XY plane and zero azimuth from their respective points 200 and 201 at either end of the Target.

[0016] Drawing #5 illustrates the definition of Waypoint Vectors 600 and 601 from the Waypoints 500 and 501 to their respective points on the Target.
The Waypoint Vector 600 represents the desired orientation of the UAV payload's field-of-view when the UAV payload is located at Waypoint 500.

[0017] Drawing #6 illustrates the definition of three Intermediate Waypoints 502 that are located on the Vehicle Locus and defined by interpolating between the two Waypoints 500 and 501. Three interpolated Waypoint Vectors 602 are also shown; these are automatically derived from the Intermediate Waypoints 502.
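The geometry in Drawings #4 to #6 can be expressed compactly. The sketch below is illustrative only: it assumes a cylindrical Vehicle Locus whose axis is vertical, and the function names are not from the patent. The first function places a Waypoint from a Target point plus azimuth and elevation angles; the second creates Intermediate Waypoints by interpolating angle and height so every point stays on the cylinder, deriving each Waypoint Vector automatically:

```python
import math

def waypoint_from_angles(target_pt, azimuth_deg, elevation_deg, radius):
    """Place a Waypoint on a cylinder of the given radius around a
    vertical axis through target_pt (cf. Drawings #4 and #5). Returns
    the Waypoint and its Vector back toward the Target point."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    wp = (target_pt[0] + radius * math.cos(az),
          target_pt[1] + radius * math.sin(az),
          target_pt[2] + radius * math.tan(el))
    vector = tuple(t - w for w, t in zip(wp, target_pt))
    return wp, vector

def interpolate_waypoints(wp_a, wp_b, target_a, target_b, n):
    """Create n Intermediate Waypoints between wp_a and wp_b
    (cf. Drawing #6). The cylinder axis is assumed to be the Z axis
    through the origin; azimuth and height are interpolated so the
    points stay on the cylinder, and each interpolated Waypoint Vector
    toward its interpolated Target point is derived automatically."""
    radius = math.hypot(wp_a[0], wp_a[1])   # assume equal radius at both ends
    th_a = math.atan2(wp_a[1], wp_a[0])
    th_b = math.atan2(wp_b[1], wp_b[0])
    out = []
    for i in range(1, n + 1):
        t = i / (n + 1)
        th = th_a + t * (th_b - th_a)
        wp = (radius * math.cos(th), radius * math.sin(th),
              wp_a[2] + t * (wp_b[2] - wp_a[2]))
        tp = tuple(a + t * (b - a) for a, b in zip(target_a, target_b))
        out.append((wp, tuple(q - p for p, q in zip(wp, tp))))
    return out
```

Because the interpolation is performed in the cylinder's angle/height parameters rather than in Cartesian space, the Intermediate Waypoints remain constrained to the Vehicle Locus surface, as the drawings require.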
Description of Embodiments

[0018] There is a wide array of civilian aerial imaging applications, including inspection of power poles, photographing a highway accident scene, searching a valley for a missing person and open-cut mine surveys. These surveillance and/or aerial imaging missions are ideal candidates for automated UAVs because these UAVs can complete the task efficiently, at lower cost than manned aircraft and with a high level of precision and repeatability. While these aircraft have sophisticated autopilots, they are only as good as the instructions they are programmed with. The challenge, then, is to generate a set of instructions the aircraft can follow that will complete the surveillance mission and achieve the mission goals. This invention describes a method and software for generating the set of instructions, referred to as a Flightpath, in a way that is intuitive and simple, so that a low-skilled operator can efficiently plan a successful and potentially quite complicated surveillance mission that is physically achievable by their selected UAV.

[0019] This invention applies mostly to VTOL UAVs having the necessary flight characteristics required to fly along an arbitrary trajectory. For aerial imaging these UAVs must also carry a payload that can be oriented at specific azimuth and elevation angles with respect to a distant subject. Those skilled in the art will appreciate that for an aerial imaging mission it is the field of view of the sensor optics that is of interest, and that the intent is to position and align the optics with respect to a focus point on the subject. Since the payload is onboard an aircraft being guided by an autopilot, the GPS coordinates for navigation may need to be relative to the aircraft's coordinate system, while the orientation of the sensor steering device, referred to herein as a gimbal, would ideally be represented in the gimbal's own coordinate system.
For the purpose of describing this invention, the vehicle and payload positions are used interchangeably, with the understanding that a coordinate transform may be required depending on where the reference point is located and which autopilot the Flightpath is being sent to. Some embodiments may adjust this offset dynamically, depending on the physical relationship between the UAV's airframe origin and the imaging sensor location inside a multi-axis gimbal, so that the vehicle location in the GUI corresponds to a point on the innermost axis of a multi-axis gimbal.

[0020] In an exemplary embodiment, the graphical user interface uses a 3D rendering engine, such as OpenGL, to display a three-dimensional world on a computer monitor and, via keyboard, mouse and other human-machine interfaces, enables the user to virtually rotate the world to view it from an arbitrary angle and to interact with the various elements in the world that are required for this method of creating a three-dimensional surveillance flight path.

[0021] Planning a surveillance mission begins with the definition of a subject of interest called a Target. In an exemplary embodiment, rather than working with a detailed model of a Target, a simple representation is used instead. The Target is defined by a point, curve, polyline or surface geometry. In later steps the Target will be used as the focal point of the camera's field-of-view. If a point is used then the focal point remains constant. If other Target representations are used then the focal point will vary. In an exemplary embodiment the focal point on the Target will vary according to some parametric formula. The simple case would be a straightforward interpolation from one end of a Target line to the other that is dependent on the vehicle's relative location along one or more specified segments of its Flightpath.
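The "simple case" at the end of paragraph [0021], interpolating the focal point along a Target line in step with the vehicle's progress, reduces to a linear interpolation. The fragment below is a sketch under that assumption; the function name is illustrative, not from the patent:

```python
def focal_point(target_start, target_end, progress):
    """Slide the camera focal point along a straight-line Target in
    proportion to the vehicle's fractional progress (0..1) along the
    specified Flightpath segment, per paragraph [0021]."""
    t = max(0.0, min(1.0, progress))  # clamp so the focus never leaves the Target
    return tuple(a + t * (b - a) for a, b in zip(target_start, target_end))
```

A more elaborate parametric formula could replace the linear term, but the clamped interpolation already guarantees the focal point stays on the Target line for any progress value.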
[0022] The next piece of geometry is a three-dimensional surface representing possible vehicle locations, collectively called the Vehicle Locus. As previously described, the Vehicle Locus may refer to the position of a sensor element within a multi-axis gimbal rather than the vehicle itself. In an exemplary embodiment the Vehicle Locus would be selected from a limited number of primitive geometries, including a flat plate, sphere, cylinder and capsule. In other embodiments the Vehicle Locus could be any 3D surface that can be created within the program or generated and imported from some other CAD software. The intent is that the waypoints used to describe the flight path can only be located on this Vehicle Locus geometry, according to the parameters described below.

[0023] After establishing the Target and Vehicle Locus, the next step is to establish the desired azimuth and elevation viewing angles (Vectors) relative to the Target. The Vectors at the beginning and end of the flight path need to be defined. The Vector definition may use a point on the Target, an azimuth angle and an elevation angle. Alternatively, the definition may use one point on the Target and another point on the Vehicle Locus. Ultimately, for each Vector a point (Waypoint) is created on the Vehicle Locus, and a vector from this Waypoint to an associated point on the Target defines the desired orientation of the sensor's field-of-view at that Waypoint. In an exemplary embodiment, additional intermediate Waypoints may also be defined individually or by interpolating a specified number of times between two sequential Waypoints (Interpolated Waypoints). All Waypoints are constrained to the surface of the Vehicle Locus.

[0024] The ordered list of Waypoints and their corresponding Vectors forms a basic Flightpath. At this stage the Flightpath is a list of vehicle or payload coordinates and associated sensor payload field-of-view orientations.
In an exemplary embodiment a speed attribute would be added for each leg of the Flightpath between waypoints. Some autopilots are also able to follow a curved path between two waypoints, and this could be added by extracting the curvature from the Vehicle Locus geometry between two successive Waypoints.

[0025] A surveillance mission involves collecting images or other forms of data, so in an exemplary embodiment one or more activities would also be selected for each Waypoint. These may include hovering for a specified duration, taking a photo, capturing video, recording a sensor's output, activating a low-light filter, etc. In an exemplary embodiment there would be an option to associate these activities with Waypoints individually or with multiple Waypoints at once through the GUI.

[0026] An important feature of the GUI is that it is parametric. This means that the Target, Vehicle Locus, Vectors and Waypoints are defined by Parameters that can be changed and will effect an update of any related geometry and ultimately the Flightpath. In an exemplary embodiment the Parameters are hidden from the novice user, who is instead provided with a range of alternative options shown in graphical form, with the ability to select the preferred option. Other embodiments may enable an expert to modify the Parameters directly. In an exemplary embodiment the user would be presented with statistics computed for each alternative, such as the total flight duration, fuel required and/or distance travelled. This method of displaying and selecting a range of options for each parameter greatly reduces the level of skill required by the user and reduces input time and errors by visually revealing the results of a decision before it is made.
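The per-alternative statistics mentioned in paragraph [0026] are straightforward to derive from the Waypoint list. A minimal sketch follows; the function name and the constant-cruise-speed assumption are mine, not the patent's:

```python
import math

def flightpath_stats(waypoints, cruise_speed, hover_times=None):
    """Summarise a candidate Flightpath for display beside the graphical
    alternatives: total distance over all legs, plus an estimated flight
    time assuming a constant cruise speed and optional per-Waypoint
    hover durations."""
    distance = sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))
    hover = sum(hover_times) if hover_times else 0.0
    return {"distance": distance, "duration": distance / cruise_speed + hover}
```

With per-leg speeds, as suggested in paragraph [0024], the duration term would become a sum over legs instead of a single division, but the display-and-select interaction is the same.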
[0027] A further feature in the preferred embodiment is the addition of vehicle, payload and mission parameters that further constrain the numerical range of various Parameters to ensure the Flightpath is physically achievable and the mission parameters can be satisfied. The mission parameters may include a minimum allowed altitude. The payload parameters may include the minimum and maximum tilt angle of a multi-axis sensor gimbal, the transform from the payload's coordinate system to the vehicle's coordinate system, the sensor chip's resolution, optical magnification limits, etc. The vehicle parameters may include the maximum flight time, minimum and maximum speeds, maximum acceleration, maximum climb and descent rates, etc. In an exemplary embodiment the vehicle, payload and mission parameters would be stored in some form of database and then selected by the user, rather than entered each time a new Flightpath is created.

[0028] The additional vehicle, payload and operational parameters enable a further time-saving feature to be implemented: the automatic creation of intermediate Waypoints, between two existing Waypoints, based on a number of mission goals. In an exemplary embodiment the goals may include an image overlap parameter, which would affect the density of interpolated Waypoints, and a desired image resolution, which would affect the optical magnification and/or the Vehicle Locus distance from the Target, e.g. by automatically adjusting the radius of a cylindrical Vehicle Locus.

[0029] Finally, the coordinate system in the GUI is mapped to the real world by mapping its position and orientation to the physical world. This can be achieved graphically by importing a geo-referenced map into the GUI and establishing its position, scale and orientation. It may also be achieved by a number of other means, including specifying the GPS coordinates and altitude of two points in the GUI.
Once the coordinate transform is established, it is trivial for a person skilled in the art to convert from the GUI coordinates into real-world coordinates and format the Flightpath, including its positions, orientations and activities, into machine-readable forms suitable for a vehicle and/or payload autopilot to follow.

[0030] In an exemplary embodiment the conversion of a Flightpath into a machine-readable format, suitable for upload to a particular autopilot, would be handled by a post-processing software component that uses an autopilot definition that a novice user could simply select from a list.
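One of the mapping methods in paragraph [0029], specifying the GPS coordinates of two points in the GUI, fixes the translation, rotation and scale of the horizontal transform. The sketch below uses a local equirectangular (flat-earth) approximation, which is adequate over a typical survey site; altitude handling is omitted and all names are illustrative assumptions:

```python
import math

EARTH_R = 6371000.0  # mean Earth radius, metres

def make_transform(p0, p1, gps0, gps1):
    """Build a virtual-to-real transform from two GUI points (x, y) and
    their known (lat, lon) positions, as suggested in paragraph [0029].
    Uses a local equirectangular approximation of the Earth's surface."""
    lat0, lon0 = map(math.radians, gps0)
    lat1, lon1 = map(math.radians, gps1)
    # Real-world offset of the second point, in metres east/north of the first
    east = (lon1 - lon0) * math.cos(lat0) * EARTH_R
    north = (lat1 - lat0) * EARTH_R
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    scale = math.hypot(east, north) / math.hypot(dx, dy)
    rot = math.atan2(north, east) - math.atan2(dy, dx)

    def to_gps(pt):
        """Convert a GUI point to (lat, lon) in degrees."""
        x, y = pt[0] - p0[0], pt[1] - p0[1]
        e = scale * (x * math.cos(rot) - y * math.sin(rot))
        n = scale * (x * math.sin(rot) + y * math.cos(rot))
        lat = lat0 + n / EARTH_R
        lon = lon0 + e / (EARTH_R * math.cos(lat0))
        return math.degrees(lat), math.degrees(lon)

    return to_gps
```

Each virtual Waypoint would be passed through the returned converter, and the resulting latitude/longitude list formatted for the chosen autopilot by the post-processing component described in paragraph [0030].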

Claims (5)

1. A graphical user interface (GUI) that displays a three-dimensional (3D) virtual world containing:
an origin that all other elements are positioned relative to (Origin);
a point, polyline or surface (Target) that represents a unique 3D position or locus of positions;
a flat plate, sphere, cylinder, capsule or other surface (Vehicle Locus) describing the locus of all possible vehicle positions;
a series of points (Waypoints), each being located on the Vehicle Locus and each having an associated line (Vector) to an associated point on the Target, the Vector therefore representing the desired orientation of a payload's field-of-view;
a means of defining the Target, Vehicle Locus, Vectors and Waypoints through a set of variables (Parameters) that are associative and therefore effect an update of their associated geometry when modified;
a means of interpolating between Waypoints to create additional intermediate Waypoints;
a means of associating payload commands (Activities) with each Waypoint, either individually or in groups; and
a method of mapping the virtual coordinates into real-world coordinates so that Waypoints, Vectors and Activities (Flightpath) can be converted into a form suitable for one or more autopilots to interpret and follow.
2. The GUI in Claim 1 where the result of varying a Parameter over a range is visually shown to the user and where the GUI enables the user to select their preferred parameter value by making a graphical selection of one of the displayed alternatives.
3. The GUI in Claim 1 where the Waypoint locations are calculated automatically based on mission parameters that represent higher level goals such as the desired horizontal and vertical image overlap.
4. The GUI in Claim 1 where the Vehicle Locus is scaled automatically based on a distance to Target parameter that is specified or calculated automatically based on a desired image resolution and/or a particular set of sensor payload capabilities including the optical zoom multiplier and image sensor resolution.
5. The GUI in Claim 1 where a Flightpath can also be stored, retrieved and combined with other Flightpaths in a specific sequence to create a longer Flightpath.
AU2013101142A 2013-08-28 2013-08-28 Method of defining a UAV flight path using 3D parametric geometry Ceased AU2013101142A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2013101142A AU2013101142A4 (en) 2013-08-28 2013-08-28 Method of defining a UAV flight path using 3D parametric geometry

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2013101142A AU2013101142A4 (en) 2013-08-28 2013-08-28 Method of defining a UAV flight path using 3D parametric geometry

Publications (1)

Publication Number Publication Date
AU2013101142A4 true AU2013101142A4 (en) 2013-10-03

Family

ID=49237822

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2013101142A Ceased AU2013101142A4 (en) 2013-08-28 2013-08-28 Method of defining a UAV flight path using 3D parametric geometry

Country Status (1)

Country Link
AU (1) AU2013101142A4 (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2608430C2 (en) * 2015-06-03 2017-01-18 Акционерное общество "Корпорация "Тактическое ракетное вооружение" Method of processing telemetry data of unmanned aerial vehicle and device therefore
CN109885089A (en) * 2015-11-20 2019-06-14 深圳市大疆创新科技有限公司 The control method and relevant apparatus of unmanned plane
CN109885089B (en) * 2015-11-20 2022-04-08 深圳市大疆创新科技有限公司 Control method of unmanned aerial vehicle and related device
CN114578842A (en) * 2020-11-30 2022-06-03 南京理工大学 Collaborative path planning method for cluster reconnaissance of unmanned vehicle-mounted rotor unmanned aerial vehicle
CN114578842B (en) * 2020-11-30 2023-04-21 南京理工大学 Collaborative path planning method for unmanned vehicle-mounted rotor unmanned aerial vehicle cluster reconnaissance
CN114463489A (en) * 2021-12-28 2022-05-10 上海网罗电子科技有限公司 Oblique photography modeling system and method for optimizing unmanned aerial vehicle air route
CN116257082A (en) * 2022-11-03 2023-06-13 天津大学 Distributed active cooperative detection method for multiple unmanned aerial vehicles
CN116257082B (en) * 2022-11-03 2023-09-22 天津大学 Distributed active cooperative detection method for multiple unmanned aerial vehicles

Similar Documents

Publication Publication Date Title
US10860040B2 (en) Systems and methods for UAV path planning and control
US20210341527A1 (en) Unmanned Aerial Vehicle Electromagnetic Avoidance And Utilization System
EP3903164B1 (en) Collision avoidance system, depth imaging system, vehicle, map generator, amd methods thereof
CN108139759B (en) System and method for unmanned aerial vehicle path planning and control
Kong et al. Vision-based autonomous landing system for unmanned aerial vehicle: A survey
CN104854428B (en) sensor fusion
Johnson et al. Real-time terrain relative navigation test results from a relevant environment for Mars landing
AU2013101142A4 (en) Method of defining a UAV flight path using 3D parametric geometry
Meier et al. The pixhawk open-source computer vision framework for mavs
CN112789672B (en) Control and navigation system, gesture optimization, mapping and positioning techniques
CN109564434B (en) System and method for positioning a movable object
Tahar Multi rotor UAV at different altitudes for slope mapping studies
US20200221056A1 (en) Systems and methods for processing and displaying image data based on attitude information
Moore et al. UAV altitude and attitude stabilisation using a coaxial stereo vision system
Gandor et al. Photogrammetric mission planner for RPAS
Ali et al. The impact of UAV flight planning parameters on topographic mapping quality control
WO2020225979A1 (en) Information processing device, information processing method, program, and information processing system
Bailey Unmanned aerial vehicle path planning and image processing for orthoimagery and digital surface model generation
US20220390940A1 (en) Interfaces And Control Of Aerial Vehicle For Automated Multidimensional Volume Scanning
US20230107289A1 (en) Information processing method, information processor, and program
Zhang et al. Visual navigation based on stereo camera for water conservancy UAVs
Cieśluk et al. Computationaly simple obstacle avoidance control law for small unmanned aerial vehicles
US20230142394A1 (en) Contour scanning with an unmanned aerial vehicle
Yue et al. A fast target localization method with multi-point observation for a single UAV
Lee et al. Implementation of vision-based automatic guidance system on a fixed-wing unmanned aerial vehicle

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry