AU2021105674A4 - Interface, System and Method for an Unmanned Vehicle - Google Patents


Info

Publication number
AU2021105674A4
Authority
AU
Australia
Prior art keywords
interface
path
icons
data
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2021105674A
Inventor
Thomas Caska
Rakesh Routhu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerologix Group Pty Ltd
Original Assignee
Aerologix Group Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2020904149A external-priority patent/AU2020904149A0/en
Application filed by Aerologix Group Pty Ltd filed Critical Aerologix Group Pty Ltd
Application granted granted Critical
Publication of AU2021105674A4 publication Critical patent/AU2021105674A4/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/021Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

There is disclosed an interface, a system and a method for the control of an unmanned or autonomous vehicle, such as an unmanned aerial vehicle or drone, to collect data about one or more features of interest. The control of the unmanned or autonomous vehicle is provided by way of path data or a data package that is determined and communicated to the unmanned or autonomous vehicle. The interface may be configured to display a map showing the one or more features of interest, one or more icons representative of one or more operational plans to control the unmanned vehicle to collect the data, and one or more operational plan overlays associated with each of the one or more icons, the one or more operational plan overlays each being indicative of a path to be taken by the unmanned vehicle to collect the data. (Figure 1)

Description

(Sheet 1/4, Figure 1: system diagram showing the UAV/drone, a local device/smart phone 12 with interface 16, wireless network links, and a server 17 with data store 23.)
Interface, System and Method for an Unmanned Vehicle
Technical Field
[001] The invention relates to an interface, a system and a method for control and navigation of an unmanned or autonomous vehicle such as an unmanned aerial vehicle or drone.
Background
[002] Unmanned vehicles such as unmanned aerial vehicles or drones may be used to perform various activities such as mapping, photography, videography, identifying features or points of interest, and search and rescue.
[003] Some activities such as mapping require specific field data to be collected about a particular feature or point of interest, such as a structure or building. The data to be collected may be from various sources, such as images, video, lidar or the like, that needs to be collected in a particular manner and/or sequence to build up the data set of the feature. An example of this is photogrammetry in which the data set is built up with overlapping images.
[004] To enable the unmanned vehicle to correctly collect the field data, a predetermined operational plan may be provided to the unmanned vehicle. In the case of a drone, this may be a predetermined flight plan. Such a plan may include instructional or operational data to autonomously control the unmanned vehicle. This is particularly important where the operator of the drone is unskilled or semi-skilled, or not directly supervised.
[005] The operational plan may be specific to the feature from or about which data is to be collected. For example, the operational plan to collect data about a building or tower may be different to a plan to collect data about a beach, park or oval. As such, the operational plan may need to be changed to suit.
[006] A problem with such predetermined operational plans is that the actual in-situ conditions for data collection may be different to those used to prepare the predetermined operational plan. For example, the location of a feature of interest may have moved, or there may be other factors present, such as a weather event or a gathering of people, that will require some modification to the plan. Further, the data to be collected and/or the feature may need to be changed in-situ, therefore requiring the selection of a new operational plan in the field.
[007] The invention disclosed herein seeks to overcome one or more of the above identified problems or at least provide a useful alternative.
Summary
[008] In accordance with a first broad aspect there is provided an interface for the control of an unmanned or autonomous vehicle, such as a drone, to collect data about one or more features of interest, the interface including: a map showing the one or more features of interest; one or more icons representative of one or more operational plans to control the unmanned vehicle to collect the data; and one or more operational plan overlays associated with each of the one or more icons, the one or more operational plan overlays each being indicative of a path to be taken by the unmanned vehicle to collect the data.
[009] The interface may be configurable so that a select one of the one or more icons is displayed on the interface in association with its associated one or more operational plan overlays to indicate the location of at least one of the one or more features of interest, the select one of the one or more icons being repositionable by a user relative to the one or more features of interest to modify the operational plan, and display a modified operational plan overlay on the interface with a modified path.
[0010] In an aspect, the select one of the one or more icons is draggable relative to the one or more features of interest displayed on the map.
[0011] In another aspect, the select one of the one or more icons is partially or fully translucent to enable viewing of the map.
[0012] In yet another aspect, the interface includes a start selector to communicate path data based on at least one of the path and modified path to the unmanned vehicle.
[0013] In yet another aspect, the start selector is provided by selection of the select one of the one or more icons displayed on the interface.
[0014] In yet another aspect, the interface includes a simulation selector to enable display on the interface of a simulation of at least one of the path and modified path.
[0015] In yet another aspect, the one or more icons are shaped to be representative of the one or more features of interest.
[0016] In yet another aspect, the unmanned vehicle is a drone and the operational plan is a flight plan.
[0017] In accordance with a second aspect, there is provided a system for the control of an unmanned or autonomous vehicle to collect data about one or more features of interest, the system being configurable to: display a map on an interface of the system showing the one or more features of interest; display on the interface one or more icons representative of one or more operational plans to control the unmanned vehicle to collect the data; in a selected state in which one of the one or more icons is selected by a user, determine an operational plan associated with the selected one of the one or more icons; display an operational plan overlay indicative of the operational plan to show a path to be taken by the unmanned vehicle to collect the data; enable a user to reposition a select one of the one or more icons relative to the one or more features of interest; modify the operational plan based on the repositioning of the select one of the one or more icons; and display a modified operational plan overlay on the interface with a modified path.
[0018] In an aspect, the system includes a local device with a screen on which the interface is displayable, in communication with a remote computer.
[0019] In another aspect, the local device includes a location device to determine the location of the local device relative to the map.
[0020] In another aspect, the system is configurable to communicate path data to the unmanned vehicle based on at least one of the path and modified path.
[0021] In yet another aspect, the interface includes a start selector configured upon selection thereof to send the path data.
[0022] In accordance with a third aspect there is provided a method for the control of an unmanned or autonomous vehicle to collect data about one or more features of interest using an interface, the method may include: displaying a map on the interface showing the one or more features of interest; displaying on the interface one or more icons representative of one or more operational plans to control the unmanned vehicle to collect the data; receiving a user selection of one of the one or more icons to generate an operational plan associated with the selected one of the one or more icons; displaying an operational plan overlay indicative of the operational plan to show a path to be taken by the unmanned vehicle; receiving a user repositioning of the select one of the one or more icons relative to the one or more features of interest; generating a modified operational plan based on the repositioning of the select one of the one or more icons; and displaying the modified operational plan overlay on the interface with a modified path.
[0023] In an aspect, the method includes receiving input from a start selector; and sending path data based on one of the path and modified path to the unmanned vehicle.
Brief Description of the Figures
[0024] The invention is described, by way of non-limiting example only, by reference to the accompanying figures, in which:
[0025] Figure 1 is a system diagram illustrating a system for control of an unmanned vehicle to collect data about one or more features of interest using an interface;
[0026] Figure 2 is a plan form view illustrating an example of the interface;
[0027] Figure 3 is a flow diagram illustrating a first method of control using the interface to collect data about one or more features of interest;
[0028] Figure 4 is a flow diagram illustrating a second method of control using the interface to collect data about one or more features of interest.
Detailed Description
[0029] Referring to Figure 1, there is shown a system 10 for the control of an unmanned or autonomous vehicle 11, such as an unmanned aerial vehicle or drone, to collect data about one or more features of interest. The features of interest may include, but are not limited to, towers, buildings, natural features such as beaches, or the like. The control of the unmanned or autonomous vehicle is provided by way of path data or a data package that is determined and communicated to the unmanned or autonomous vehicle.
[0030] Such path data or a data package may operate the unmanned or autonomous vehicle to perform an operational sequence including predetermined spatial movements and the operation of onboard data collection devices such as, but not limited to, a camera, lidar and video. Such operational sequences may include, for example, flying around a feature of interest, such as a tower or building, in a series of orbits while operating the camera. Other operational sequences may include search and rescue patterns.
[0031] This path data or a data package may be predetermined, such as being developed to map a specific type of feature of interest such as a tower or building, and may initially include details of the feature of interest such as its actual or estimated spatial dimensions and geographic coordinates, and the locations and/or heights of, for example, known obstacles such as a tree near the tower around which the unmanned or autonomous vehicle must navigate. There may also be no-travel or no-fly zones.
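By way of illustration only, such a predetermined data package could be represented as follows. This is a minimal sketch; every field and class name here is hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class Obstacle:
    """A known obstacle the vehicle must navigate around."""
    latitude: float   # degrees
    longitude: float  # degrees
    height_m: float   # height in metres to be cleared

@dataclass
class DataPackage:
    """Hypothetical predetermined path data for a feature of interest."""
    feature_type: str                 # e.g. "tower", "building"
    centre_lat: float                 # estimated centre of the feature
    centre_lon: float
    width_m: float                    # estimated spatial dimensions
    height_m: float
    obstacles: list = field(default_factory=list)     # e.g. nearby trees
    no_fly_zones: list = field(default_factory=list)  # regions to avoid

# Example: a 40 m tower with one nearby tree to navigate around
package = DataPackage(
    feature_type="tower",
    centre_lat=-33.8688, centre_lon=151.2093,
    width_m=10.0, height_m=40.0,
    obstacles=[Obstacle(-33.8689, 151.2095, 15.0)],
)
```

In practice such a package would also carry the data-collection instructions (camera timings, angles and so on) described later in the specification.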
[0032] At the site of use, however, the predetermined path data or a data package may not be appropriate for the site. For example, the estimated location, spatial dimensions and coordinates of the feature of interest may be incorrect. Further, new obstacles, hazards or no-travel or no-fly zones may be identified. As such, the present system 10 is configured to present the predetermined path data or a data package, allow a user to visually check or preview the path, and then modify the predetermined path data or a data package to account for the in-situ conditions.
[0033] Turning to the system 10 in more detail, the system 10 includes a local device 12 in communication with a remote computer system 14. The local device 12 includes a screen 14 and may be, but is not limited to, a smart phone or tablet that includes communication technology and location technology such as a GPS chip or the like. The screen 14 may display an interface 16 to display information, and to communicate and receive inputs from a user. An example of the interface 16 is shown in Figure 2 and is described in further detail below. The local device 12 may be configured to communicate with the unmanned or autonomous vehicle 11.
[0034] The remote computer system 14 may be a remote server system 17 such as a cloud server that includes access to memory 19, one or more processors 21 and storage 23. In this example, the local device 12 may operate application software that configures and operates the interface 16, and communicates data to and from the remote computer system 14. For example, the remote computer system 14 may be configured to determine and provide the predetermined path data or a data package based on inputs received from the interface 16 of the local device 12.
[0035] Software that determines the path data or a data package may be stored on the remote computer system 14 and executed thereby, and the path data or a data package may include a calculated time-based sequence of coordinates and associated data collection actions, such as the operational direction of a camera and timings of operation. The path data or a data package may be communicated directly to the unmanned or autonomous vehicle 11 or via the application operated by the local device 12. Various other configurations are contemplated, such as the local device 12 performing all calculations, or the autonomous vehicle 11 itself being configured to run application software to determine its path data or a data package based on inputs from the interface 16.
[0036] In use, the system 10 is configurable to perform a method 100 as shown in Figure 3, including displaying the interface 16 on the local device and communicating with the remote system to ultimately control the autonomous vehicle 11 by configuring and providing the path data or a data package. The method 100 is further detailed below with reference to Figure 3, and a more specific method 200 is shown in Figure 4.
[0037] Turning now to the interface 16, the interface may be displayed within a window 18 of the local device 12. At method step 110, the interface 16 is configured to display a map 20 showing the one or more features of interest 22 and at least one of an icon bar 24 and/or icons 30. The map may be or include a satellite image or other overlays. The one or more features of interest 22 may be visible in the image data of the map or be shown as an overlay as a point, zone or region on the map. The interface 16 is operated by application software on the local device 12 to receive user input and send data between the remote system 14 and the unmanned or autonomous vehicle 11.
[0038] In this example, the interface 16 is provided in the form of an icon-based flight navigation system and includes one or more of the icons 30, which may be based on icon selectors 25a, 25b and 25c that are representative of one or more operational plans to control the unmanned vehicle 11 to collect the data. The icons may be any form of suitable indicia. However, in this example, the icons have shapes that represent types of features of interest, in this example a building, a tower and a house. Other icon shapes and types may also be used.
[0039] Each of these icons is then associated with a particular operational plan that, in this example, may include a flight path and sequences of data collection, such as image capture from a camera carried by a drone. This allows a user to easily identify a suitable operational plan and make a selection of that plan. The selection of an icon 30 may also present a data input window in which the user may input information about the feature 22 such as, but not limited to, its width and height. In other examples, the icon 30 may be preselected if, for example, a user has been engaged to collect data about a particular feature. However, the user may still be able to view the data, such as height and width, and make amendments accordingly.
[0040] At step 120, the method may include receiving a user input from the icon selection bar or strip 24, and at step 130 a selected one of the icons 30 may appear over the map along with its associated operational plan overlay 34. The icon 30 may be repositionable, in this example by dragging, over the map 20. The selected icon within the icon strip 24 may be coloured or otherwise indicated as being in a selected state.
[0041] Each of the icons 30 may have one or more operational plan overlays 34 associated with each of the one or more icons 30, the one or more operational plan overlays 34 each being indicative of a path 36 to be taken by the unmanned or autonomous vehicle 11 to collect the data. The operational plan overlays 34 may be a two-dimensional projection of the overall path.
[0042] For example, if the feature of interest 22 is a tower or similar, the operational plan overlays 34 may be displayed as orbits about the feature of interest 22, as shown in Figure 2. The path 36 may be, or be derived from, a set of geographic coordinates either predetermined or calculated by the system 10, preferably at the remote system 14.
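One simple way such orbit coordinates could be derived from a feature centre is sketched below using a small-offset (equirectangular) approximation. This is illustrative only and is not the calculation method of the specification or of the cited application; the function name and parameters are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def orbit_waypoints(centre_lat, centre_lon, radius_m, n_points=12):
    """Return (lat, lon) pairs evenly spaced on a circle of radius_m
    metres about the feature centre, using a small-offset approximation
    valid for the short distances involved in an orbit."""
    waypoints = []
    for i in range(n_points):
        theta = 2 * math.pi * i / n_points
        # Metre offsets east and north of the centre point
        d_east = radius_m * math.cos(theta)
        d_north = radius_m * math.sin(theta)
        # Convert metre offsets to degree offsets; longitude degrees
        # shrink with the cosine of latitude
        d_lat = math.degrees(d_north / EARTH_RADIUS_M)
        d_lon = math.degrees(
            d_east / (EARTH_RADIUS_M * math.cos(math.radians(centre_lat)))
        )
        waypoints.append((centre_lat + d_lat, centre_lon + d_lon))
    return waypoints

# Example: a 30 m radius orbit about an assumed tower centre
path = orbit_waypoints(-33.8688, 151.2093, radius_m=30.0)
```

A full operational plan would pair each waypoint with an altitude and camera action; only the horizontal geometry is shown here.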
[0043] At step 140, the method includes receiving a repositioning input based on the user's movement of the selected icon 30. The icon 30 may be repositionable by a user, in this example by dragging. This may be required for various reasons, such as an in-situ hazard, a change in the location of the feature, or a weather event.
[0044] Movement of the selected icon 30 may be used to change the location, in particular a centre point, of the feature of interest 22 provided to the system 10, and at step 150 this requires recalculation of the path 36 by the system 10. For example, a change in position of the icon 30 may generate a new set of coordinates that is then sent to the remote system 14 to recalculate the path 36. It is also noted that the operational data set associated with the path 36 will also need to be updated, such as, for example, flight times, height, and data capture instructions such as camera angles, tilt and sequences of image capture.
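Conceptually, the geometric part of that recalculation amounts to translating the path by the same offset as the moved centre point. The sketch below shows only that translation; as the specification notes, the remote system 14 would in practice recalculate the full operational data set, and the function name here is hypothetical.

```python
def recentre_path(waypoints, old_centre, new_centre):
    """Shift every (lat, lon) waypoint by the offset between the old
    and new centre of the feature of interest (i.e. the icon drag)."""
    d_lat = new_centre[0] - old_centre[0]
    d_lon = new_centre[1] - old_centre[1]
    return [(lat + d_lat, lon + d_lon) for lat, lon in waypoints]

# Example: the icon is dragged a small distance north-east, so the
# single waypoint shifts by the same latitude/longitude offset
old = (-33.8688, 151.2093)
new = (-33.8683, 151.2098)
modified = recentre_path([(-33.8690, 151.2095)], old, new)
# modified[0] is approximately (-33.8685, 151.2100)
```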
[0045] At step 160, the method includes displaying an operational plan overlay 34 representative of the modified path 36. At step 170, a start or confirmation action may be received from the user, such as, but not limited to, a hard press to select the icon 30 or another similar action, to confirm the modified path 36, and at step 180 the system 10 may be configured to send operational data, including the modified path data, to the unmanned or autonomous vehicle 11. The unmanned or autonomous vehicle 11 may then perform a mission in accordance with the operational data and collect data about the feature of interest, such as images, video, lidar data or the like.
[0046] In addition to the above features, the interface 16 may also include a user input button 38 that opens a further window to allow viewing and modification of the operational data. For example, a vertical or three-dimensional profile of the path or modified path may be displayed. Another user input button 40, in this example a toggle, may enable a simulation mode in which a representative unmanned or autonomous vehicle 11 is displayed following the path over the map 20; this may assist in identifying hazards or other aspects that require further modification of the path.
[0047] The interface 16 may also include, for example, a live video footage window 42 displayed relative to the map 20. This may be used, for example, during a set-up phase in which the unmanned or autonomous vehicle 11 is operated, such as flying over the feature of interest, to allow a comparison of the map to the actual in-situ situation, which may be used to make decisions about the modification of the path data. One example of this may be when the feature of interest 22 is a tower and the interface allows vision from the drone to be displayed as it hovers above the feature of interest 22, allowing confirmation of the centre of the feature of interest 22.
[0048] The interface 16 may also be configured to show the real-time position of the drone relative to the icon and features of interest 22, and thereby provide information to allow repositioning of the icon 30. Other examples may include confirming a centre position from the drone, which in turn auto-aligns the icon 30 and regenerates the path data. Finally, a user input button 44 may be provided to geographically locate the local device 12 relative to the map 20.
[0049] Referring to Figure 4, a more specific example of a method 200 as facilitated by the system 10 is provided. In this example, the unmanned or autonomous vehicle is a drone and may be operated by a pilot. Figure 4 indicates a front end provided at the local device 12 and a back end provided by the remote system 14.
[0050] At method step 210, the pilot makes a selection from the icon strip 24 and drags and drops the selected icon 30 onto the map 20. The selected icon 30 may be representative of the feature to be mapped, known as an "asset". At step 220, the pilot may be required to enter or confirm parameters associated with the asset, such as its dimensions, type, location or the like.
[0051] At step 230, the system 10 communicates data received at the local device 12 to the remote system 14 and calculates the flight path data at the remote system 14. The calculation of such flight path data is described in Australian patent application no. 2017203165, "Method and System for Collection of Photographic Data", the contents of which are incorporated by reference, and is not described in further detail herein.
[0052] At step 240, the system 10 is configured to send the calculated values of the path 36 back to the local device 12 for display on the interface 16 as an operational plan overlay 34. The user may then move the icon 30, as described above, to modify the path if required. If modifications are made, the modification data is sent to the remote system 14, and the flight path data is recalculated and sent back for redisplay by the interface 16. At step 260, a start or confirmation action may be performed by the user at the interface 16, and at step 270 the operational data set may be sent to the drone so that it may perform the mission to collect data, in this example to map a tower structure.
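The front-end flow of method 200 can be summarised as a simple request/recalculate/confirm loop. The sketch below is illustrative only: `calculate_path` stands in for the back-end calculation at the remote system 14, and all names and the function shape are hypothetical, not part of the disclosure.

```python
def mission_workflow(calculate_path, asset_params, repositions, confirmed):
    """Illustrative sketch of the front-end flow of method 200.

    calculate_path -- callable standing in for the back-end path calculation
    asset_params   -- dict of asset parameters entered by the pilot (step 220)
    repositions    -- sequence of new centre points from icon drags
    confirmed      -- whether the pilot performed the start action (step 260)
    """
    # Steps 210-220: icon drop and entry of asset parameters
    params = dict(asset_params)
    # Steps 230-240: back end calculates flight path data for display
    path = calculate_path(params)
    # Each icon drag updates the centre and triggers recalculation
    # and redisplay of the overlay
    for new_centre in repositions:
        params["centre"] = new_centre
        path = calculate_path(params)
    # Steps 260-270: on confirmation the operational data set is sent
    # to the drone; otherwise nothing is transmitted
    return path if confirmed else None
```

For example, with a stub `calculate_path` the returned path reflects the last repositioned centre, and `None` is returned if the pilot never confirms.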
[0053] Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
[0054] The reference in this specification to any known matter or any prior publication is not, and should not be taken to be, an acknowledgment or admission or suggestion that the known matter or prior art publication forms part of the common general knowledge in the field to which this specification relates.
[0055] While specific examples of the invention have been described, it will be understood that the invention extends to alternative combinations of the features disclosed or evident from the disclosure provided herein.
[0056] Many and various modifications will be apparent to those skilled in the art without departing from the scope of the invention disclosed or evident from the disclosure provided herein.

Claims (15)

The claims defining the invention are as follows:
1. An interface for the control of an unmanned or autonomous vehicle to collect data about one or more features of interest, the interface including: a. A map showing the one or more features of interest; b. One or more icons representative of one or more operational plans to control the unmanned vehicle to collect the data; and c. One or more operational plan overlays associated with each of the one or more icons, the one or more operational plan overlays each being indicative of a path to be taken by the unmanned vehicle to collect the data, wherein a select one of the one or more icons is displayed on the interface in association with its associated one or more operational plan overlays to indicate the location of at least one of the one or more features of interest, the select one of the one or more icons being repositionable by a user relative to the one or more features of interest to modify the operational plan, and display a modified operational plan overlay on the interface with a modified path.
2. The interface according to claim 1, wherein the select one of the one or more icons is draggable relative to the one or more features of interest displayed on the map.
3. The interface according to claim 1, wherein the select one of the one or more icons is partially or fully translucent to enable viewing of the map.
4. The interface according to claim 1, wherein the interface includes a start selector to communicate path data based on at least one of the path and modified path to the unmanned vehicle.
5. The interface according to claim 4, wherein the start selector is provided by selection of the select one of the one or more icons displayed on the interface.
6. The interface according to claim 1, wherein the interface includes a simulation selector to enable display on the interface of a simulation of at least one of the path and modified path.
7. The interface according to claim 1, wherein the one or more icons are shaped to be representative of the one or more features of interest.
8. The interface according to claim 1, wherein the unmanned vehicle is a drone and the operational plan is a flight plan.
9. A system for the control of an unmanned or autonomous vehicle to collect data about one or more features of interest, the system being configurable to: a. Display a map on an interface of the system showing the one or more features of interest; b. Display on the interface one or more icons representative of one or more operational plans to control the unmanned vehicle to collect the data; c. In a selected state in which one of the one or more icons is selected by a user, determine an operational plan associated with the selected one of the one or more icons; d. Display an operational plan overlay indicative of the operational plan to show a path to be taken by the unmanned vehicle to collect the data; e. Enable a user to reposition a select one of the one or more icons relative to the one or more features of interest; f. Modify the operational plan based on the repositioning of the select one of the one or more icons; and g. Display a modified operational plan overlay on the interface with a modified path.
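Steps e-g of claim 9, in which repositioning the icon modifies the plan, could be realised by shifting every waypoint by the icon's displacement. A minimal sketch, assuming planar coordinates and a hypothetical function name:

```python
def reposition_plan(waypoints, old_icon_pos, new_icon_pos):
    """When the user drags the plan icon, translate every waypoint by the
    same displacement so the overlay (the modified path) follows the icon."""
    dx = new_icon_pos[0] - old_icon_pos[0]
    dy = new_icon_pos[1] - old_icon_pos[1]
    return [(x + dx, y + dy) for (x, y) in waypoints]
```

A richer implementation might instead regenerate the plan from scratch at the new position (e.g. to respect no-fly zones), but a rigid translation is the simplest reading of "modify the operational plan based on the repositioning".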
10. The system according to claim 9, wherein the system includes a local device with a screen on which the interface is displayable, in communication with a remote computer.
11. The system according to claim 10, wherein the local device includes a location device to determine the location of the local device relative to the map.
12. The system according to claim 9, wherein the system is configurable to communicate path data to the unmanned vehicle based on at least one of the path and modified path.
13. The system according to claim 12, wherein the interface includes a start selector configured upon selection thereof to send the path data.
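The path data sent upon activation of the start selector (claims 12-13) could be packaged as a simple serialisable payload. This sketch assumes a JSON message format of our own invention; real vehicles would typically use a vendor protocol such as a waypoint-mission upload:

```python
import json

def build_path_message(plan_id, waypoints, modified=False):
    """Package the current (or modified) path as a JSON payload that the
    start selector could transmit to the unmanned vehicle.

    `plan_id`, the field names, and the message shape are hypothetical.
    """
    return json.dumps({
        "plan_id": plan_id,
        "modified": modified,
        "waypoints": [{"x": x, "y": y} for (x, y) in waypoints],
    })
```

The interface's start selector would then hand this string to whatever transport links the local device to the vehicle.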
14. A method for the control of an unmanned or autonomous vehicle to collect data about one or more features of interest using an interface, the method including: a. Displaying a map on the interface showing the one or more features of interest; b. Displaying on the interface one or more icons representative of one or more operational plans to control the unmanned vehicle to collect the data; c. Receiving a user selection of one of the one or more icons to generate an operational plan associated with the selected one of the one or more icons; d. Displaying an operational plan overlay indicative of the operational plan to show a path to be taken by the unmanned vehicle; e. Receiving a user repositioning of the selected one of the one or more icons relative to the one or more features of interest; f. Generating a modified operational plan based on the repositioning of the selected one of the one or more icons; and g. Displaying the modified operational plan overlay on the interface with a modified path.
15. The method according to claim 14, wherein the method includes receiving input from a start selector; and sending path data based on at least one of the path and modified path to the unmanned vehicle.
AU2021105674A 2020-11-12 2021-08-17 Interface, System and Method for an Unmanned Vehicle Active AU2021105674A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2020904149A AU2020904149A0 (en) 2020-11-12 Interface, System and Method for an Unmanned Vehicle
AU2020904149 2020-11-12

Publications (1)

Publication Number Publication Date
AU2021105674A4 true AU2021105674A4 (en) 2021-10-14

Family

ID=78007584

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021105674A Active AU2021105674A4 (en) 2020-11-12 2021-08-17 Interface, System and Method for an Unmanned Vehicle

Country Status (1)

Country Link
AU (1) AU2021105674A4 (en)


Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)