CN112330167A - Unmanned vehicle task execution method, system, computer device and storage medium - Google Patents

Unmanned vehicle task execution method, system, computer device and storage medium

Info

Publication number
CN112330167A
Authority
CN
China
Prior art keywords
vehicle
unmanned
target area
task
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011257152.1A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (the applicant requested that the inventor's name not be published)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Youtu Innovation Co ltd
Original Assignee
Youtu Innovation Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Youtu Innovation Co ltd
Priority to CN202011257152.1A
Publication of CN112330167A
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations

Abstract

The application relates to a method, a system, computer equipment and a storage medium for task execution by an unmanned vehicle. The method comprises the following steps: the control center sends task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle respectively; the unmanned aerial vehicle, in response to the received first task instruction, conveys the unmanned vehicle specified by the first task instruction to the target area and releases it; the unmanned vehicle, in response to the received second task instruction, executes the specified task in the current target area into which it has been released, and sends a task completion notification to the unmanned aerial vehicle after execution is finished; if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, conveys the unmanned vehicle to the next target area and releases it, so that the unmanned vehicle returns to the step of responding to the received second task instruction and executing the specified task in the current target area into which it has been released, together with the subsequent steps. The method thus avoids the limitation on the unmanned vehicle's task execution.

Description

Unmanned vehicle task execution method, system, computer device and storage medium
Technical Field
The present application relates to the fields of computing, automatic control and robotics, and more particularly to a method, system, computer device and storage medium for task execution by an unmanned vehicle.
Background
With the development of science and technology, various unmanned ground vehicles (UGVs) have been developed. They can be used to execute different tasks and have very wide application. For example, a household cleaning robot can perform automatic household cleaning, and a warehouse handling robot can carry goods automatically.
In the traditional method, the unmanned vehicle can only execute tasks within a continuous area, and it is difficult for it to execute tasks across areas. However, in some application scenarios the unmanned vehicle is required to perform tasks in a plurality of areas, and if there is a large height difference, uneven surface relief, an insurmountable obstacle, or a long distance between adjacent areas, the unmanned vehicle cannot travel from one area to another to perform tasks, which limits the unmanned vehicle's task execution.
Disclosure of Invention
In view of the above technical problems, it is necessary to provide a method, a system, a computer device and a storage medium for unmanned vehicle task execution that avoid this limitation.
A method for an unmanned vehicle to execute tasks, the method comprising:
the control center respectively sends corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle;
the unmanned aerial vehicle responds to the received first task instruction, and conveys the unmanned vehicle specified by the first task instruction to a target area and releases the unmanned vehicle; the target area is a task execution area specified by the first task instruction for the unmanned vehicle;
the unmanned vehicle responds to the received second task instruction, executes, in the current target area into which it has been released, the task specified by the second task instruction for that area, and sends a task completion notification to the unmanned aerial vehicle after execution is finished;
if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, conveys the unmanned vehicle to the next target area and releases it, so as to return to the step in which the unmanned vehicle responds to the received second task instruction and executes, in the current target area into which it has been released, the task specified by the second task instruction for that area, together with the subsequent steps.
In one embodiment, the delivering the unmanned vehicle specified by the first mission instruction to the target area and releasing comprises:
the unmanned aerial vehicle flies to the original position of the unmanned vehicle specified by the first task instruction and grabs the unmanned vehicle;
the unmanned aerial vehicle flies to a target area;
and the unmanned aerial vehicle flies to a release position specified by the first task instruction in the target area and releases the unmanned vehicle.
In one embodiment, the step in which the unmanned aerial vehicle flies, in the target area, to the release position specified by the first task instruction and releases the unmanned vehicle includes:
the unmanned aerial vehicle searches a signal source of the release position in the target area according to the release position specified by the first task instruction;
the unmanned aerial vehicle flies above the release position according to the signal source;
the unmanned aerial vehicle finely adjusts the position of the unmanned aerial vehicle above the release position according to the graphic code arranged at the release position;
and after the fine adjustment is completed, the unmanned aerial vehicle releases the unmanned vehicle.
In one embodiment, after the unmanned vehicle finishes executing, the method further comprises:
the unmanned vehicle drives to a completion position specified by the second task instruction for the current target area;
the step in which, after receiving the task completion notification, the unmanned aerial vehicle conveys the unmanned vehicle to the next target area and releases it comprises:
after receiving the task completion notification, the unmanned aerial vehicle flies to the completion position;
the unmanned aerial vehicle grabs the unmanned vehicle at the completion position and conveys the unmanned vehicle to the next target area.
In one embodiment, the delivering the unmanned vehicle to the next target area and releasing comprises:
if the first task instruction comprises a task of conveying the unmanned vehicle to the next target area, the unmanned aerial vehicle grabs the unmanned vehicle and flies to the next target area specified by the first task instruction for the unmanned vehicle;
and the unmanned aerial vehicle flies to the release position specified by the first task instruction in the next target area and releases the unmanned vehicle.
In one embodiment, the delivering the unmanned vehicle to the next target area and releasing comprises:
if the first task instruction does not contain a task of conveying the unmanned vehicle to the next target area, the unmanned vehicle sends the second task instruction to the unmanned aerial vehicle;
the unmanned aerial vehicle grabs the unmanned vehicle and flies to the next target area specified by the second task instruction;
and the unmanned aerial vehicle flies, in the next target area, to the starting position specified by the second task instruction for the unmanned vehicle, and releases the unmanned vehicle.
In one embodiment, the unmanned aerial vehicle and the unmanned vehicle are respectively provided with an electromechanical coupling device;
the step in which the unmanned aerial vehicle grabs the unmanned vehicle includes:
the unmanned aerial vehicle and the unmanned vehicle respectively start respective electromechanical coupling devices, so that the unmanned aerial vehicle can grab the unmanned vehicle through the electromechanical coupling devices;
the step in which the unmanned aerial vehicle releases the unmanned vehicle includes:
the unmanned aerial vehicle and the unmanned vehicle respectively start respective electromechanical coupling devices so that the unmanned aerial vehicle releases the unmanned vehicle.
An unmanned vehicle task execution system, the system comprising a control center, at least one unmanned aerial vehicle and at least one unmanned vehicle, wherein:
the control center is used for respectively sending corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle;
the unmanned aerial vehicle is used for responding to the received first task instruction, and conveying the unmanned vehicle specified by the first task instruction to a target area and releasing the unmanned vehicle; the target area is a task execution area specified by the first task instruction for the unmanned vehicle;
the unmanned vehicle is used for responding to the received second task instruction, executing the task specified by the second task instruction aiming at the current target area in the released current target area, and sending a task completion notice to the unmanned aerial vehicle after the execution is finished;
the unmanned aerial vehicle is further used for, if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, conveying the unmanned vehicle to the next target area and releasing it after receiving the task completion notification;
the unmanned vehicle is further configured to take the next target area into which it has been released as the current target area, return to the step of responding to the received second task instruction, and execute, in the released current target area, the task specified by the second task instruction for that area.
A computer device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the unmanned vehicle task execution method of the embodiments of the present application.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform the steps of the unmanned vehicle task execution method of the embodiments of the present application.
According to the above unmanned vehicle task execution method, system, computer device and storage medium, the control center sends corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle respectively; the unmanned aerial vehicle, in response to the received first task instruction, conveys the unmanned vehicle specified by the first task instruction to the target area and releases it; the unmanned vehicle, in response to the received second task instruction, executes, in the current target area into which it has been released, the task specified by the second task instruction for that area, and sends a task completion notification to the unmanned aerial vehicle after execution is finished; if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, conveys the unmanned vehicle to the next target area and releases it, so that the method returns to the step in which the unmanned vehicle responds to the received second task instruction and executes, in the released current target area, the task specified for that area, together with the subsequent steps. The control center can thus control the unmanned aerial vehicle to carry the unmanned vehicle to different areas, so that the unmanned vehicle can execute tasks in different areas in sequence, which avoids the limitation on the unmanned vehicle's task execution.
Drawings
FIG. 1 is a diagram of an application environment for a method for an unmanned vehicle to perform tasks in one embodiment;
FIG. 2 is a schematic flow chart diagram illustrating a method for an unmanned vehicle to perform tasks in one embodiment;
FIG. 3 is a schematic diagram of a drone and unmanned vehicle pair performing their respective target area tasks in one embodiment;
fig. 4 is a schematic illustration of an unmanned aerial vehicle transporting the unmanned vehicle from a home position to a target area in one embodiment;
FIG. 5 is a schematic diagram of an embodiment in which a drone delivers drone vehicles to different target areas in sequence;
fig. 6 is a schematic illustration of an unmanned aerial vehicle transporting an unmanned vehicle from a target area back to an original position in one embodiment;
FIG. 7 is a schematic flow chart of a drone and unmanned vehicle pair performing a task in one embodiment;
FIG. 8 is a system architecture diagram of an unmanned vehicle task execution system, in one embodiment;
FIG. 9 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The unmanned vehicle task execution method can be applied to the application environment shown in fig. 1. The control center 102, the at least one drone 104, and the at least one unmanned vehicle 106 communicate with one another via a wireless communication network. The control center 102 may send corresponding task instructions to the at least one drone 104 and the at least one unmanned vehicle 106 respectively; the drone 104 may transport the unmanned vehicle to the target area in response to the received first task instruction, and the unmanned vehicle 106 may perform a task in the target area in response to the received second task instruction. The control center 102 may be one or more servers or terminals. The terminal may be, but is not limited to, a personal computer, notebook computer, smart phone, tablet computer, or portable wearable device. The drone 104 may be, but is not limited to, a rotary-wing drone, a fixed-wing drone, or an unmanned paraglider. The rotary-wing drone may be a quad-rotor or other multi-rotor drone. The unmanned vehicle 106 may be, but is not limited to, a wheeled unmanned vehicle or a tracked unmanned vehicle. The wireless communication network may be, but is not limited to, 4G, 5G, WiFi, WiMAX (Worldwide Interoperability for Microwave Access), Bluetooth, or Ultra-Wideband (UWB). Long-distance communication may use at least one of 4G, 5G, WiFi, WiMAX, and the like, and short-distance communication may use at least one of Bluetooth, Ultra-Wideband, and the like.
In one embodiment, as shown in fig. 2, there is provided a method of performing a task by an unmanned vehicle, comprising the steps of:
s202, the control center respectively sends corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle.
The control center is one or more servers or terminals on which a basic operating system is installed and which run a cooperative control program for the drones and unmanned vehicles. The basic operating system may be, for example, a Windows system or a Linux system. An Unmanned Aerial Vehicle (UAV), referred to here as a drone, is an aircraft that flies without a pilot on board. An Unmanned Ground Vehicle (UGV), referred to here as an unmanned vehicle, is a vehicle that can travel automatically.
In one embodiment, the drone may be any one of a rotary-wing drone, a fixed-wing drone, an unmanned paraglider, and the like. For example, the drone may be a quad-rotor or other multi-rotor drone.
In one embodiment, the unmanned vehicle may be any one of a wheeled unmanned vehicle, a tracked unmanned vehicle, and the like.
In one embodiment, the tasks that the drone may perform in response to the task instructions sent by the control center include at least one of automatic positioning, navigation, flight, coordinated formation flight, obstacle avoidance, hovering, three-dimensional mapping, searching for a specific target, finding and flying to a specified destination, grabbing a load, releasing a load, transporting a load, lighting, automatic charging, collecting ambient data, and receiving and sending information over a wireless communication network. In one embodiment, the environmental data includes at least one of temperature, barometric pressure, humidity, visual, acoustic, and ranging, among others.
In one embodiment, the tasks that the unmanned vehicle can perform in response to the task instructions sent by the control center include at least one of automatic positioning, navigation, driving, obstacle avoidance, coordinated formation driving, mapping, searching for specific targets, collecting surrounding environment data, finding and driving to a specified destination, surface surveying, water washing, dust removal, mopping, lighting, vital signal detection, automatic charging, coordinated grabbing by the unmanned aerial vehicle and receiving and sending information over a wireless communication network, and the like. In one embodiment, the environmental data includes at least one of temperature, barometric pressure, humidity, visual, acoustic, and ranging, among others.
Specifically, the control center sends corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle respectively, and different unmanned aerial vehicles and different unmanned vehicles receive respective task instructions respectively.
In one embodiment, the control center may send corresponding task instructions to the at least one drone and the at least one drone vehicle, respectively, through a coordinated control program for the drone and the drone vehicle.
It can be appreciated that the control center can send task instructions to the drone and drone vehicle remotely through a wireless communication network.
In one embodiment, the control center may allocate the tasks to be executed by the drones and unmanned vehicles according to the information about each area in which tasks are to be executed and the task types input by the user, obtain the task instructions corresponding to the different drones and unmanned vehicles, and send the corresponding task instructions to the drones and unmanned vehicles to which tasks have been allocated. For example, the allocation determines which drones are responsible for transporting which unmanned vehicles, and which unmanned vehicles are responsible for the tasks of which areas. It can be understood that the allocation can be made arbitrarily according to actual demand: as long as the tasks of all target areas can be completed, it does not matter which unmanned vehicle is responsible for which target area, and as long as all unmanned vehicles can be transported, it does not matter which drone is responsible for carrying which unmanned vehicle. That is, any allocation is acceptable, whether several unmanned vehicles are responsible for the same target area, one unmanned vehicle is responsible for several target areas, or one unmanned vehicle is responsible for one target area, and whether one drone is responsible for conveying several unmanned vehicles or only one.
The control center can optimize the task arrangement according to the information input by the user, the travel cost of the drones and unmanned vehicles, the gain from executing each single task, time constraints on task completion (for example, a task that needs to be completed by 7 o'clock in the morning), and the like, determine the optimal task allocation, and generate the task instructions. Task allocation refers to deciding when to schedule which drone to transport which unmanned vehicle to which area to execute a task.
For example, the areas input by the user in which tasks need to be executed are area 1, area 2, area 3 and area 4. After allocation, the control center determines that unmanned vehicle A is responsible for the tasks of area 1 and area 2, unmanned vehicles B and C are responsible for the tasks of area 3, unmanned vehicle D is responsible for the tasks of area 4, drone a is responsible for carrying unmanned vehicle A, drone b is responsible for carrying unmanned vehicles B and C, and drone c is responsible for carrying unmanned vehicle D. The control center then generates the task instruction of each drone and each unmanned vehicle to which a task has been allocated, and sends each task instruction to the drone or unmanned vehicle it corresponds to. For example, the control center sends drone a a task instruction for conveying unmanned vehicle A to area 1 and area 2, and sends unmanned vehicle D a task instruction for executing a task in area 4.
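The allocation in this example can be expressed as two simple mappings: unmanned vehicles to target areas, and drones to the unmanned vehicles they carry. The following minimal Python sketch shows one way the control center might turn such an allocation into per-drone and per-vehicle instructions; the dataclass and function names are illustrative assumptions, not part of the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class FirstTaskInstruction:   # sent to a drone (illustrative structure)
    drone_id: str
    carried_vehicles: dict    # unmanned vehicle id -> list of target areas

@dataclass
class SecondTaskInstruction:  # sent to an unmanned vehicle (illustrative structure)
    vehicle_id: str
    target_areas: list        # areas in which the vehicle performs its tasks

def allocate(area_assignments: dict, carrier_assignments: dict):
    """Build per-drone and per-vehicle instructions from a user-supplied allocation.

    area_assignments:    unmanned vehicle id -> list of target areas, e.g. {"A": ["area1", "area2"]}
    carrier_assignments: drone id -> list of unmanned vehicle ids,     e.g. {"a": ["A"]}
    """
    first = [FirstTaskInstruction(d, {v: area_assignments[v] for v in vehicles})
             for d, vehicles in carrier_assignments.items()]
    second = [SecondTaskInstruction(v, areas) for v, areas in area_assignments.items()]
    return first, second

# Allocation from the example in the text: drone a carries vehicle A (areas 1 and 2),
# drone b carries vehicles B and C (area 3), drone c carries vehicle D (area 4).
first, second = allocate(
    {"A": ["area1", "area2"], "B": ["area3"], "C": ["area3"], "D": ["area4"]},
    {"a": ["A"], "b": ["B", "C"], "c": ["D"]},
)
print(first[0])   # instruction telling drone a to carry vehicle A to areas 1 and 2
print(second[3])  # instruction telling vehicle D to execute its task in area 4
```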
In one embodiment, the control center may also specify the time to execute the task in the task instruction.
In one embodiment, the control center can also receive information returned by the drones and unmanned vehicles while they execute their task instructions, and display the information through a graphical interface. The information that can be received includes at least one of the spatio-temporal coordinates of a drone or unmanned vehicle, the environmental data collected by a drone or unmanned vehicle, the execution state of a drone's or unmanned vehicle's task instruction, the battery state of a drone or unmanned vehicle, and the like.
In one embodiment, the control center can analyze the received information to generate an analysis report. For example, while a designated unmanned vehicle carries out a survey task in ruined areas after a disaster, the control center can analyze the survey data returned by the unmanned vehicle to obtain an analysis report for post-disaster reconstruction.
In one embodiment, the control center may monitor the status of the drones and unmanned vehicles performing the task and raise an alarm when an abnormal condition is detected. In one embodiment, the abnormal condition includes at least one of: at least one of the unmanned vehicles and drones has not communicated with the control center for more than a predetermined time, and at least one of the unmanned vehicles and drones has battery power that drops too quickly.
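A monitoring loop of the kind described might be sketched as follows. The threshold values, the record layout, and the helper name are assumptions for illustration only; the disclosure does not fix concrete values.

```python
import time

# Illustrative thresholds; the patent text does not specify concrete values.
MAX_SILENCE_SECONDS = 60         # longest tolerated gap without communication
MAX_BATTERY_DROP_PER_MIN = 5.0   # percentage points per minute considered "too fast"

def check_for_anomalies(units, now=None):
    """Return alarm messages for drones / unmanned vehicles behaving abnormally.

    `units` is a list of dicts with keys: id, last_contact (epoch seconds),
    battery_history (list of (epoch seconds, percent) samples).
    """
    now = now or time.time()
    alarms = []
    for u in units:
        if now - u["last_contact"] > MAX_SILENCE_SECONDS:
            alarms.append(f"{u['id']}: no communication for {now - u['last_contact']:.0f} s")
        hist = u["battery_history"]
        if len(hist) >= 2:
            (t0, p0), (t1, p1) = hist[-2], hist[-1]
            drop_per_min = (p0 - p1) / max((t1 - t0) / 60.0, 1e-6)
            if drop_per_min > MAX_BATTERY_DROP_PER_MIN:
                alarms.append(f"{u['id']}: battery dropping at {drop_per_min:.1f} %/min")
    return alarms
```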
In one embodiment, the control center and its cooperative scheduling program may accept manual intervention when an abnormal situation occurs. For example, a new task instruction may be manually sent to at least one of the drone and the unmanned vehicle through the control center, or the task currently being executed may be manually interrupted through the control center.
S204, the unmanned aerial vehicle responds to the received first task instruction, and conveys the unmanned vehicle specified by the first task instruction to a target area and releases the unmanned vehicle; the target area is a task execution area specified by the first task instruction for the unmanned vehicle.
The first task instruction refers to a task instruction sent to the unmanned aerial vehicle by the control center. The target area is an area where the unmanned vehicle is to perform a task.
It is to be understood that different drones receive different first mission instructions.
It can be understood that it is difficult for the unmanned vehicle to travel between different target areas. For example, between target areas there may be a large height difference, uneven surface relief, insurmountable obstacles, or a long distance, so that the unmanned vehicle cannot travel from one target area to another to execute tasks. For example, for surface cleaning of large-scale photovoltaic panels, the unmanned vehicle needs to move from one set of panels to another, from one roof to another, or from one floor to another, positions it can hardly reach by itself.
In one embodiment, each target area may be a separate area, i.e., an area with no connection to the others. In another embodiment, the target areas may also be several sub-areas of one large area, where it is difficult for the unmanned vehicle to pass between the sub-areas (i.e., the target areas).
In one embodiment, if only one drone receives a first task instruction, that drone may, in response to the received first task instruction, deliver the unmanned vehicle specified by the instruction to the target area and release it.
In another embodiment, if multiple drones receive first task instructions, each drone may, in response to its own received first task instruction, deliver the unmanned vehicle designated by that instruction to the target area and release it.
In one embodiment, for each drone, at least one unmanned vehicle is specified by its first task instruction, i.e., the first task instruction indicates that the drone transports one or more unmanned vehicles. In one embodiment, if only one unmanned vehicle is specified by the first task instruction, the drone may, in response to the received first task instruction, transport that unmanned vehicle to the target area and release it. In another embodiment, if several unmanned vehicles are specified by the first task instruction, the drone may, in response to the received first task instruction, transport each specified unmanned vehicle in turn to the target area corresponding to that vehicle and release it.
In one embodiment, the drone may, in response to receiving the first mission instruction, grab the drone vehicle specified by the first mission instruction from the origin location, then transport to the target area and release. Such as: the home location may be a charging location of the unmanned vehicle.
In one embodiment, the task for which the first task instruction is indicative includes at least instructing the drone to transport a designated drone vehicle to a designated target area. In one embodiment, the delivery of the designated unmanned vehicle to the designated target area includes at least grasping, delivering, and releasing. In one embodiment, the task indicated by the first task instruction further comprises at least one of lighting, collecting ambient environment data, performing wireless communication tasks, ranging, searching for specific targets, map reconstruction, search and rescue, charging, and the like. In one embodiment, the environmental data includes at least one of temperature, barometric pressure, humidity, visual, and acoustic, among others. In one embodiment, the map reconstruction includes at least one of a tour mode and a map mode, among others.
In one embodiment, the first task instruction includes zone location coordinates of the target zone. The drone may, in response to the first mission command, deliver the drone vehicle specified by the first mission command to the target area according to the area location coordinates.
In one embodiment, the regional location coordinates may be latitude and longitude coordinates of the target region obtained from satellite (e.g., GPS or Beidou) or other wide-area positioning. In one embodiment, the region position coordinates may be the position coordinates of any point in the target region.
S206, in response to the received second task instruction, the unmanned vehicle executes, in the current target area into which it has been released, the task specified by the second task instruction for that area, and sends a task completion notification to the unmanned aerial vehicle after execution is finished.
The second task instruction refers to the task instruction sent by the control center to the unmanned vehicle. The task completion notification is used to notify the unmanned aerial vehicle that the unmanned vehicle has finished executing its task in the current target area.
In one embodiment, the task indicated by the second task instruction includes at least instructing the unmanned vehicle to perform the specified task in the target area. In one embodiment, the designated tasks include at least one of water washing, dust removal, mopping, surface surveying, vital signal detection, and the like. In one embodiment, the task indicated by the second task instruction further comprises at least one of collecting ambient data, performing a wireless communication task, lighting and charging, and the like.
In one embodiment, the unmanned vehicle may invoke different functional modules to perform different specified tasks in the target area. In one embodiment, the second task instruction carries different task descriptions (i.e., task specifications) for different specified tasks. For example, the task specification of a cleaning task includes at least one of a patrol mode, a washing mode, and the like. The patrol mode may be either a random mode or a fixed mode. The washing mode may be any one of water washing, dust removal, mopping, and the like. The task specification of a map reconstruction task includes at least one of a patrol mode, a map mode, and the like. The map mode may be two-dimensional or three-dimensional. The task specification of a search and rescue task includes at least one of the attributes of the search-and-rescue target, which sensors to enable for collecting environmental data, and the like.
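The task specifications enumerated above amount to a small, per-task-type structure. A minimal Python sketch follows; the field values and the module names on the vehicle object (clean, build_map, search) are chosen purely for illustration and are not taken from the disclosure.

```python
# Illustrative task descriptions mirroring the modes listed in the text.
CLEANING_TASK = {
    "type": "cleaning",
    "patrol_mode": "fixed",       # "random" or "fixed"
    "wash_mode": "dust_removal",  # "water_wash", "dust_removal" or "mopping"
}

MAP_RECONSTRUCTION_TASK = {
    "type": "map_reconstruction",
    "patrol_mode": "random",
    "map_mode": "3d",             # "2d" or "3d"
}

SEARCH_AND_RESCUE_TASK = {
    "type": "search_and_rescue",
    "target_attributes": {"category": "person"},
    "sensors": ["visual", "acoustic", "temperature"],
}

def dispatch(task, vehicle):
    """Invoke the functional module matching the task type (module names are assumptions)."""
    modules = {
        "cleaning": vehicle.clean,
        "map_reconstruction": vehicle.build_map,
        "search_and_rescue": vehicle.search,
    }
    return modules[task["type"]](task)
```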
It will be appreciated that different unmanned vehicles receive different second task instructions.
In one embodiment, for each unmanned vehicle, the second task instruction it receives specifies at least one (i.e., one or more) target area.
In one embodiment, the unmanned vehicle may execute, in the current target area into which it has been released, the task specified for that area in the received second task instruction.
In one embodiment, after the unmanned vehicle finishes execution, it may send a task completion notification to the control center, and the control center may forward the task completion notification to the unmanned aerial vehicle.
In another embodiment, after the unmanned vehicle finishes execution, it may send the task completion notification directly to the unmanned aerial vehicle.
S208, if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, conveys the unmanned vehicle to the next target area and releases it, so as to return to the step in which the unmanned vehicle, in response to the received second task instruction, executes, in the released current target area, the task specified by the second task instruction for that area, together with the subsequent steps.
In one embodiment, after receiving the task completion notification, the drone may determine, according to the received first task instruction, whether the unmanned vehicle that completed the task still has unexecuted tasks in target areas other than the current target area. If so, the drone may transport the unmanned vehicle to the next target area specified by the first task instruction and release it.
In one embodiment, if the unmanned vehicle sends its second task instruction to the unmanned aerial vehicle, then after receiving the task completion notification the unmanned aerial vehicle may determine, according to the second task instruction sent by the unmanned vehicle, whether that vehicle still has unexecuted tasks in target areas other than the current target area. If so, the drone may convey the unmanned vehicle to the next target area specified by the second task instruction and release it. It can be understood that this covers the case where the target areas in the drone's and the unmanned vehicle's respective instructions are inconsistent: the unmanned vehicle can send its own second task instruction to the drone so that the drone carries it to the next target area, which avoids problems such as the control center's remote instructions to the drone arriving late, or the instructions of the drone and the unmanned vehicle being inconsistent.
In one embodiment, after the unmanned aerial vehicle transports the unmanned vehicle to the next target area and releases it, the unmanned vehicle may take the released next target area as the current target area, execute in it the task specified by the second task instruction for that area, and, after execution is finished, send a task completion notification to the unmanned aerial vehicle.
In one embodiment, if the unmanned vehicle has no unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, may transport the unmanned vehicle back to its original position and release it. For example, the original position may be the charging location of the unmanned vehicle.
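The branching described in steps S206 to S208 (carry the vehicle onward if tasks remain, otherwise carry it back to its original position) might be sketched as follows. The objects and helper methods (fly_to, grab, release, completion_position, home_position, completed_areas) are assumptions for illustration, not the actual implementation.

```python
def on_task_completed(drone, vehicle, first_instruction, second_instruction=None):
    """Handle a task-completion notification received by the drone (illustrative sketch).

    first_instruction:  dict mapping an unmanned vehicle id to the ordered target areas
                        assigned to it in the drone's own instruction.
    second_instruction: the vehicle's own instruction, used as a fallback when the drone's
                        instruction lists no further areas (the "inconsistent instructions" case).
    """
    remaining = [a for a in first_instruction.get(vehicle.id, [])
                 if a not in vehicle.completed_areas]
    if not remaining and second_instruction is not None:
        remaining = [a for a in second_instruction.target_areas
                     if a not in vehicle.completed_areas]

    drone.fly_to(vehicle.completion_position)   # completion position of the finished area
    drone.grab(vehicle)
    if remaining:
        drone.fly_to(remaining[0].release_position)
        drone.release(vehicle)                  # the vehicle treats this as its new current area
    else:
        drone.fly_to(vehicle.home_position)     # e.g. the vehicle's charging location
        drone.release(vehicle)
```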
In one embodiment, after the unmanned vehicle finishes execution, it may travel to the completion position specified by the second task instruction, and the unmanned aerial vehicle may fly to the completion position, grab the unmanned vehicle, and transport it to the next target area or back to the original position.
In another embodiment, after the unmanned vehicle finishes execution, it may send its current position information to the unmanned aerial vehicle, and the unmanned aerial vehicle may fly to the position indicated by that information, grab the unmanned vehicle, and transport it to the next target area or back to the original position.
In the unmanned vehicle task execution method, the control center sends corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle respectively; the unmanned aerial vehicle, in response to the received first task instruction, conveys the unmanned vehicle specified by the first task instruction to the target area and releases it; the unmanned vehicle, in response to the received second task instruction, executes, in the current target area into which it has been released, the task specified by the second task instruction for that area, and sends a task completion notification to the unmanned aerial vehicle after execution is finished; if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, conveys the unmanned vehicle to the next target area and releases it, so that the method returns to the step in which the unmanned vehicle responds to the received second task instruction and executes, in the released current target area, the task specified for that area, together with the subsequent steps. The control center can thus control the unmanned aerial vehicle to carry the unmanned vehicle to different areas, so that the unmanned vehicle can execute tasks in different areas in sequence, which avoids the limitation on the unmanned vehicle's task execution. In addition, the scheme of this application only requires that the control center send task instructions to the unmanned aerial vehicle and the unmanned vehicle; it is simple to operate, low in cost, easy to maintain, and widely applicable, for example to cleaning of large-scale photovoltaic panels, high-altitude cleaning, search-and-rescue exploration, map reconstruction of complex areas, and the like.
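Taken together, the cooperation between the control center, the drones, and the unmanned vehicles can be summarized in the following Python-style sketch of the loop S202 to S208. All class and method names (allocate_tasks, receive, grab, fly_to, release, execute_task, notify_completion) are illustrative assumptions rather than parts of the disclosed implementation.

```python
def run_mission(control_center, drones, vehicles):
    """End-to-end flow of the disclosed method (illustrative sketch only).

    `drones` and `vehicles` are dicts keyed by id; all method names are assumptions."""
    # S202: the control center sends each drone and each unmanned vehicle its instruction.
    first_instructions, second_instructions = control_center.allocate_tasks()
    for drone_id, drone in drones.items():
        drone.receive(first_instructions[drone_id])
    for vehicle_id, vehicle in vehicles.items():
        vehicle.receive(second_instructions[vehicle_id])

    for drone_id, drone in drones.items():
        for vehicle_id, areas in first_instructions[drone_id].carried_vehicles.items():
            vehicle = vehicles[vehicle_id]
            for area in areas:
                # S204: the drone carries the vehicle to the target area and releases it.
                drone.grab(vehicle)
                drone.fly_to(area.release_position)
                drone.release(vehicle)
                # S206: the vehicle executes its task in the current area, then notifies the drone.
                vehicle.execute_task(area)
                vehicle.notify_completion(drone)
            # S208, terminal case: no further areas remain, so the vehicle is carried back home.
            drone.grab(vehicle)
            drone.fly_to(vehicle.home_position)
            drone.release(vehicle)
```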
In one embodiment, delivering the unmanned vehicle specified by the first task instruction to the target area and releasing it includes: the unmanned aerial vehicle flies to the original position of the unmanned vehicle specified by the first task instruction and grabs the unmanned vehicle; the unmanned aerial vehicle flies to the target area; and the unmanned aerial vehicle flies, in the target area, to the release position specified by the first task instruction and releases the unmanned vehicle.
In one embodiment, the drone may fly and hover over the original location of the drone vehicle specified by the first mission instruction, then fine tune to a location directly above and a predetermined distance from the drone vehicle, and then grab the drone vehicle.
In one embodiment, the drone can be finely tuned to a position directly above the drone vehicle and a preset distance from the drone vehicle by visual positioning.
In one embodiment, the preset distance may be set according to actual conditions. Such as: the predetermined distance may be any distance between 20 and 30 centimeters.
In one embodiment, after the unmanned aerial vehicle grabs the unmanned vehicle, it can fly with the unmanned vehicle to the target area, then find and fly to the release position in the target area, and release the unmanned vehicle.
In one embodiment, the first task instruction includes the area position coordinates of the target area and the release position coordinates within the target area. After grabbing the unmanned vehicle, the unmanned aerial vehicle can fly to the indicated target area according to the area position coordinates, then find and fly to the release position according to the release position coordinates, and release the unmanned vehicle.
The release position coordinates are the coordinates of the position in the target area where the unmanned aerial vehicle releases the unmanned vehicle.
In one embodiment, the regional location coordinates may be latitude and longitude coordinates obtained from satellite (e.g., GPS or Beidou) or other wide-area positioning. In one embodiment, the region position coordinates may be the position coordinates of any point in the target region. In one embodiment, the region position coordinates may also be the position coordinates of any point within a preset range outside the boundary of the target region, for example any point within 2 meters outside the boundary of the target area.
In one embodiment, the drone may perform in-flight tasks specified by the first task instruction during the flight. For example, the drone can collect ambient environment data or search for a specific target during the flight and send the collected information to the control center. The environmental data include at least one of temperature, barometric pressure, humidity, visual, acoustic, and ranging data, among others.
In one embodiment, if no other tasks need to be performed after the drone has delivered the drone vehicle, the drone may wait for the drone vehicle to perform the tasks in the target area.
In this embodiment, the unmanned aerial vehicle can carry the unmanned vehicle from its original position to the release position in the target area and release it, which avoids the problem that the unmanned vehicle cannot reach the target area by itself, thereby avoiding the limitation on the unmanned vehicle's task execution and enabling the unmanned vehicle to execute tasks in more scenarios.
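The delivery sequence of this embodiment (fly to the original position, grab, fly to the target area, fly to the release position, release) can be written as a short sketch. The DeliveryLeg fields and the drone methods are assumptions; the 0.25 m hover height simply mirrors the 20 to 30 cm preset distance mentioned above.

```python
from dataclasses import dataclass

@dataclass
class DeliveryLeg:
    """One carry-and-release leg from the drone's first task instruction (illustrative fields)."""
    origin: tuple          # (lat, lon) of the unmanned vehicle's original position
    area_coords: tuple     # (lat, lon) of any point in, or just outside, the target area
    release_coords: tuple  # (lat, lon) of the release position inside the target area

def deliver(drone, vehicle, leg: DeliveryLeg, hover_height_m: float = 0.25):
    """Carry the vehicle from its original position to the release position and set it down."""
    drone.fly_to(leg.origin)                    # coarse navigation to the vehicle's original position
    drone.hover_over(vehicle, hover_height_m)   # fine visual positioning above the vehicle
    drone.grab(vehicle)
    drone.fly_to(leg.area_coords)               # coarse navigation toward the target area
    drone.fly_to(leg.release_coords)            # approach the designated release position
    drone.release(vehicle)
```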
In one embodiment, the step in which the drone flies, in the target area, to the release position specified by the first task instruction and releases the unmanned vehicle includes: the drone searches for the signal source of the release position in the target area according to the release position specified by the first task instruction; the drone flies above the release position according to the signal source; the drone fine-tunes its position above the release position according to the graphic code arranged at the release position; and after the fine adjustment is completed, the drone releases the unmanned vehicle.
In one embodiment, the drone may search for a signal source of a release location in the target area according to the release location in the target area specified by the first task instruction, and then fly above the release location according to the strength of a signal sent by the signal source, that is, fly to a location with the strongest signal strength.
In one embodiment, the unmanned aerial vehicle may first fly into or near the target area according to the area position coordinate in the first task instruction, then search for the signal source of the release position according to the release position coordinate in the first task instruction, and then fly above the release position according to the strength of the signal sent by the signal source. In one embodiment, the drone may fly to a preset distance position above the release position, such as any distance between 30 centimeters and 2 meters.
In one embodiment, the drone may search for signal sources via bluetooth or Ultra Wideband (UWB), or the like.
In one embodiment, a graphic code can be set at the release position in advance, the unmanned aerial vehicle can scan the graphic code set in advance at the release position through a camera carried by the unmanned aerial vehicle, and the position of the unmanned aerial vehicle above the release position is finely adjusted according to the scanned graphic code, so that the unmanned aerial vehicle is finely adjusted to be right above the release position.
In one embodiment, the graphical code may be provided at a fixed location around the release position. Such as: the graphic codes can be respectively arranged at four corners of the release position. In one embodiment, the graphic code may be any one of a two-dimensional code, a bar code, or the like.
In one embodiment, at night or when the light is poor, the drone can turn on its own light source for illumination to assist the camera in scanning the graphic code. For example, an LED lamp can be turned on.
In one embodiment, after the fine adjustment is completed, the drone may adjust its hovering attitude, descend by ranging to a safe release distance from the ground (e.g., 10 to 20 centimeters), hover again, and then release the unmanned vehicle.
In this embodiment, the drone can fly above the release position according to the signal source at that position, and then fine-tune to directly above the release position according to the graphic code there, so that the drone can release the unmanned vehicle accurately in the target area. This avoids the inaccurate release positions that would otherwise result from the limited positioning accuracy of civilian satellite positioning systems, and improves the positional accuracy with which the drone carries the unmanned vehicle.
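As one possible reading of this embodiment, the coarse approach by signal strength, the fine adjustment over the graphic code, and the ranged descent could be combined as sketched below. All sensor-access methods, the pixel tolerance, and the release height are assumptions, not taken from the disclosure.

```python
def precise_release(drone, vehicle, release_coords,
                    align_tolerance_px=10, safe_release_height_m=0.15):
    """Approach the release position, fine-tune over the graphic code, descend, and release."""
    # 1. Coarse approach: search for the signal source (e.g. a Bluetooth/UWB beacon) near the
    #    coordinates given in the first task instruction and step toward the strongest signal.
    drone.fly_to(release_coords)
    while not drone.at_peak_signal_strength():
        drone.step_toward_stronger_signal()

    # 2. Fine adjustment: scan the graphic code (e.g. codes at the corners of the release spot)
    #    and shift until its image centre is within tolerance of the camera centre.
    if drone.is_dark():
        drone.turn_on_light()                    # assist the camera at night or in poor light
    offset = drone.scan_graphic_code_offset()    # pixel offset of code centre from image centre
    while max(abs(offset[0]), abs(offset[1])) > align_tolerance_px:
        drone.shift_by_pixel_offset(offset)
        offset = drone.scan_graphic_code_offset()

    # 3. Descend by ranging to a safe height (e.g. 10 to 20 cm), hover, then release the vehicle.
    drone.descend_to(safe_release_height_m)
    drone.hover()
    drone.release(vehicle)
```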
In one embodiment, after the unmanned vehicle finishes execution, the method further comprises: the unmanned vehicle travels to the completion position specified by the second task instruction for the current target area. The step in which, after receiving the task completion notification, the unmanned aerial vehicle conveys the unmanned vehicle to the next target area and releases it comprises: after receiving the task completion notification, the unmanned aerial vehicle flies to the completion position; the unmanned aerial vehicle grabs the unmanned vehicle at the completion position and conveys it to the next target area.
In one embodiment, the first task instruction includes grab position coordinates. The grab position coordinates are the coordinates of the position in the target area where the unmanned aerial vehicle grabs the unmanned vehicle. After the unmanned vehicle finishes execution, it can drive to the completion position specified by the second task instruction for the current target area. After receiving the task completion notification, the unmanned aerial vehicle can fly to the grab position according to the grab position coordinates specified in the first task instruction for that unmanned vehicle and the current target area, grab the unmanned vehicle at the grab position, and convey it to the next target area. It can be understood that, for the same unmanned vehicle in the same target area, the completion position specified in the second task instruction coincides with the grab position specified in the first task instruction.
In another embodiment, in the case where the unmanned aerial vehicle transports the unmanned vehicle according to the second task instruction sent by the unmanned vehicle, the unmanned aerial vehicle, after receiving the task completion notification, may fly to the completion position specified by the received second task instruction for the current target area, grab the unmanned vehicle, and transport it to the next target area.
In one embodiment, a signal source may be set in advance at the grab position. The unmanned aerial vehicle can search for the signal source at the grab position according to the grab position coordinates and, according to that signal source, fly to a preset distance above the grab position.
In one embodiment, the drone may fly above the grabbing position, i.e., to the position with the strongest signal strength, according to the strength of the signal sent by the signal source at the grabbing position.
In one embodiment, the drone may fly to a preset distance position above the grabbing position, such as any distance between 30 centimeters and 2 meters.
In one embodiment, the drone may search for signal sources via bluetooth or Ultra Wideband (UWB), or the like.
In one embodiment, a graphic code may be set at the grabbing position in advance, the unmanned aerial vehicle may scan the graphic code set at the grabbing position through a camera provided therewith, and the position of the unmanned aerial vehicle above the grabbing position is finely adjusted according to the scanned graphic code, so as to be finely adjusted to be directly above the grabbing position.
In one embodiment, the graphical code may be provided at a fixed location around the grasp location. Such as: graphic codes can be respectively arranged at four corners of the grabbing position. In one embodiment, the graphic code may be any one of a two-dimensional code, a bar code, or the like.
In one embodiment, at night or when the light is poor, the drone can turn on its own light source for illumination to assist the camera in scanning the graphic code. For example, an LED lamp can be turned on.
In this embodiment, after the unmanned vehicle finishes execution, it can drive to the completion position, and the unmanned aerial vehicle can fly to the completion position, grab the unmanned vehicle, and carry it to the next target area. The unmanned aerial vehicle can therefore grab the unmanned vehicle accurately and carry it to the next target area, which avoids the problem that the unmanned vehicle cannot drive to the next target area by itself and thus avoids the limitation on the unmanned vehicle's task execution.
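A compact sketch of this hand-back step follows (the vehicle parks at its completion position and notifies the drone; the drone flies to the matching grab position and picks it up). The instruction fields (completion_coords, grab_coords) and the vehicle/drone methods are assumptions; the assertion merely encodes the stated coincidence of the two positions.

```python
def hand_back(vehicle, drone, second_instruction, first_instruction, area_id):
    """Vehicle finishes its task, parks at the completion position, and is picked up."""
    completion_pos = second_instruction.completion_coords[area_id]
    grab_pos = first_instruction.grab_coords[area_id]
    assert grab_pos == completion_pos, "for the same vehicle and area the two positions coincide"

    vehicle.drive_to(completion_pos)
    vehicle.send_completion_notice(drone)   # directly, or relayed through the control center

    drone.fly_to(grab_pos)                  # coarse approach using the grab position coordinates
    drone.hover_over(vehicle)               # fine adjustment via signal source / graphic code as above
    drone.grab(vehicle)
```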
In one embodiment, conveying the unmanned vehicle to the next target area and releasing it comprises: if the first task instruction includes a task of conveying the unmanned vehicle to the next target area, the unmanned aerial vehicle grabs the unmanned vehicle and flies to the next target area specified by the first task instruction for that unmanned vehicle; and the unmanned aerial vehicle flies, in the next target area, to the release position specified by the first task instruction, and releases the unmanned vehicle.
Specifically, if the first task instruction includes a task of transporting the unmanned vehicle to the next target area, the unmanned aerial vehicle may grab the unmanned vehicle, fly to the next target area specified by the first task instruction for that unmanned vehicle, then fly to the release position in the next target area specified by the first task instruction, and release the unmanned vehicle.
In this embodiment, if the first task instruction includes a task of conveying the unmanned vehicle to the next target area, the unmanned aerial vehicle can, according to the first task instruction, carry the unmanned vehicle to the release position in the next target area and release it, which avoids the problem that the unmanned vehicle cannot travel to the next target area by itself and thus avoids the limitation on the unmanned vehicle's task execution.
In one embodiment, conveying the unmanned vehicle to the next target area and releasing it comprises: if the first task instruction does not contain a task of conveying the unmanned vehicle to the next target area, the unmanned vehicle sends the second task instruction to the unmanned aerial vehicle; the unmanned aerial vehicle grabs the unmanned vehicle and flies to the next target area specified by the second task instruction; and the unmanned aerial vehicle flies, in the next target area, to the starting position specified by the second task instruction for the unmanned vehicle, and releases the unmanned vehicle.
Specifically, if the first task instruction does not include a task of conveying the unmanned vehicle to the next target area, and the unmanned vehicle still has an unexecuted task in the next target area, the unmanned vehicle may send the second task instruction to the unmanned aerial vehicle. The unmanned aerial vehicle can then grab the unmanned vehicle according to the second task instruction sent by the unmanned vehicle, fly to the starting position of the unmanned vehicle in the next target area specified by the second task instruction, and release the unmanned vehicle. It can be understood that in this case, when the target areas in the respective instructions of the drone and the unmanned vehicle are inconsistent, the unmanned vehicle can send its own second task instruction to the drone, and the drone can convey the unmanned vehicle to the next target area specified by that second task instruction.
It can be appreciated that, in the case where the drone's first task instruction matches the unmanned vehicle's second task instruction, the starting position of the unmanned vehicle in the second task instruction coincides, for the same target area, with the release position of the unmanned vehicle in the first task instruction.
In this embodiment, if the first task instruction does not contain a task of conveying the unmanned vehicle to the next target area, the unmanned vehicle can send the second task instruction to the unmanned aerial vehicle, and the unmanned aerial vehicle can carry the unmanned vehicle to the starting position in the next target area and release it according to the second task instruction. This avoids the problem that the unmanned vehicle cannot travel to the next target area by itself and thus avoids the limitation on the unmanned vehicle's task execution. In addition, when the target areas in the instructions of the unmanned vehicle and the unmanned aerial vehicle are inconsistent, the unmanned vehicle can send its own instruction to the unmanned aerial vehicle, which avoids problems such as the control center's remote instructions to the drone arriving late or the instructions of the drone and the unmanned vehicle being inconsistent.
In one embodiment, the unmanned aerial vehicle and the unmanned vehicle are each provided with an electromechanical coupling device. The step in which the unmanned aerial vehicle grabs the unmanned vehicle includes: the unmanned aerial vehicle and the unmanned vehicle each activate their respective electromechanical coupling devices so that the unmanned aerial vehicle grabs the unmanned vehicle through the electromechanical coupling devices. The step in which the unmanned aerial vehicle releases the unmanned vehicle includes: the unmanned aerial vehicle and the unmanned vehicle each activate their respective electromechanical coupling devices so that the unmanned aerial vehicle releases the unmanned vehicle.
The electromechanical coupling device is a mechanical coupling device or an electromagnetic coupling device for connecting the unmanned aerial vehicle and the unmanned vehicle.
In one embodiment, the step in which the drone grabs the unmanned vehicle includes: the drone hovers above the unmanned vehicle, the drone and the unmanned vehicle each adjust their spatial attitudes, and they activate their respective electromechanical coupling devices so that the drone grabs the unmanned vehicle.
In one embodiment, the unmanned aerial vehicle can fly above the unmanned vehicle and hover, and then finely adjust to a position directly above the unmanned vehicle and at a preset distance from the unmanned vehicle through visual positioning, and the unmanned aerial vehicle and the unmanned vehicle respectively adjust respective spatial attitudes and start respective electromechanical coupling devices so that the unmanned aerial vehicle can grab the unmanned vehicle.
It can be understood that the electromechanical coupling devices on the drone and the drone vehicle are complementary.
In one embodiment, the electromechanical coupling device may be a hook. After the drone hovers above the unmanned vehicle, the drone and the unmanned vehicle can each adjust their spatial attitudes so that their hooks are aligned with each other, and then engage their respective hooks so that the unmanned vehicle hangs from the drone.
In one embodiment, the electromechanical coupling device may be an electromagnetic device. After the unmanned aerial vehicle hovers above the unmanned vehicle, the unmanned aerial vehicle and the unmanned vehicle can respectively adjust their spatial attitudes so that their electromagnetic devices are aligned with each other, and then activate the devices so that the unmanned vehicle is attached to the unmanned aerial vehicle by magnetic adsorption.
In one embodiment, the step of the unmanned aerial vehicle releasing the unmanned vehicle includes: the unmanned aerial vehicle flies above the release position and hovers, adjusts its hovering attitude, then descends by ranging to a position at a safe release distance from the ground and hovers again; the unmanned aerial vehicle and the unmanned vehicle then respectively activate their electromechanical coupling devices so that the unmanned aerial vehicle releases the unmanned vehicle. For example, the safe release distance may be any distance between 10 and 20 centimeters.
In one embodiment, the electromechanical coupling device may be a hook. The unmanned aerial vehicle and the unmanned vehicle can respectively actuate their hooks so that the hooks disengage from each other and the unmanned aerial vehicle releases the unmanned vehicle.
In one embodiment, the electromechanical coupling device may be an electromagnetic device. The unmanned aerial vehicle and the unmanned vehicle can respectively actuate their electromagnetic devices so that the devices lose magnetism and the unmanned aerial vehicle releases the unmanned vehicle.
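The grab and release behaviors described in the above embodiments can be summarized in the Python sketch below. The flight and coupling primitives (hover_above, visual_fine_position, set_attitude, engage, disengage, range_to_ground) are assumed names used only for illustration, and the 0.15 m constant simply stands in for the 10 to 20 centimeter safe release distance given as an example.

# Hypothetical sketch of the grab/release sequences; all primitives are assumed.
SAFE_RELEASE_DISTANCE_M = 0.15  # example value inside the 10-20 cm range mentioned above

def grab(drone, vehicle, preset_distance_m=0.5):
    drone.hover_above(vehicle.position)                      # hover above the unmanned vehicle
    drone.visual_fine_position(vehicle, preset_distance_m)   # fine-tune by visual positioning
    drone.set_attitude(aligned_with=vehicle)                 # both sides adjust their spatial attitude
    vehicle.set_attitude(aligned_with=drone)
    drone.coupling.engage()                                  # hook or electromagnet on the drone side
    vehicle.coupling.engage()                                # complementary device on the vehicle side

def release(drone, vehicle, release_position):
    drone.fly_to(release_position)                           # fly above the release position and hover
    drone.hover()
    while drone.range_to_ground() > SAFE_RELEASE_DISTANCE_M:
        drone.descend_step()                                 # descend by ranging to the safe distance
    drone.coupling.disengage()                               # hooks loosen / electromagnets lose magnetism
    vehicle.coupling.disengage()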
In this embodiment, electromechanical coupling devices are provided on the unmanned aerial vehicle and the unmanned vehicle respectively, so that the unmanned aerial vehicle can grab and release the unmanned vehicle and carry it to the target area. This avoids the problem that the unmanned vehicle cannot easily reach the target area by itself, and thereby avoids the limitation on the unmanned vehicle executing tasks.
It can be understood that the specific steps by which the unmanned aerial vehicle grabs and releases the unmanned vehicle are the same for every target area. The above embodiments only take grabbing the unmanned vehicle at its original position and at the completion position of the first target area, and releasing it in the first target area, as examples; the grabbing and releasing steps in the other target areas are identical and are not described one by one.
In one embodiment, the task for each target area in the first task instruction of the drone includes area position coordinates, release position coordinates, grasp position coordinates, and a task description. The task for each target area in the second task instruction of the unmanned vehicle comprises a completion position coordinate and a task description.
The task description in the first task instruction indicates which tasks the unmanned aerial vehicle needs to execute, and the task description in the second task instruction indicates which tasks the unmanned vehicle needs to execute. It will be appreciated that, for the same target area, the grabbing position coordinates in the first task instruction of the unmanned aerial vehicle are consistent with the completion position coordinates in the second task instruction of the unmanned vehicle it is responsible for.
It can be understood that, in the case where the task for each target area in the second task instruction includes only the completion position coordinates and the task description, the unmanned vehicle does not need to know the area position coordinates or the start position coordinates; it simply starts executing the task from the position where the unmanned aerial vehicle releases it.
In one embodiment, the second task instruction of the unmanned vehicle further includes an area position coordinate and a start position coordinate in the task for each target area.
The start position coordinates are the position coordinates at which the unmanned vehicle starts to execute the task in the target area. It will be appreciated that, for the same target area, the release position coordinates in the first task instruction of the unmanned aerial vehicle coincide with the start position coordinates in the second task instruction of the unmanned vehicle it is responsible for, and the area position coordinates in the two instructions are likewise consistent.
It can be understood that, in the case where the second task instruction of the unmanned vehicle further includes the area position coordinates and the start position coordinates in the task for each target area, these coordinates may be null; when they are null, the unmanned vehicle does not need to know them and simply executes the task from the position where the unmanned aerial vehicle releases it. When the second task instruction of the unmanned vehicle does contain the area position coordinates and the start position coordinates for each target area, if the instructions of the unmanned vehicle and the unmanned aerial vehicle are inconsistent or do not match, the unmanned vehicle can send its second task instruction to the unmanned aerial vehicle, and the unmanned aerial vehicle can convey and release the unmanned vehicle according to the area position coordinates and the start position coordinates in that second task instruction.
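The per-area task records described above could be laid out roughly as in the following Python sketch. The dataclass and field names are assumptions introduced only for illustration; the optional fields reflect the case in which the area position coordinates and start position coordinates may be null.

# Hypothetical data layout for the two task instructions; names are illustrative only.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Coord = Tuple[float, float]

@dataclass
class DroneAreaTask:              # one entry pi_i of the first task instruction
    area_position: Coord          # area position coordinates
    release_position: Coord       # where the unmanned vehicle is released
    grab_position: Coord          # where the vehicle is picked up after finishing
    description: str              # which tasks the unmanned aerial vehicle needs to execute

@dataclass
class VehicleAreaTask:            # one entry r_i of the second task instruction
    completion_position: Coord                 # where the vehicle waits when done
    description: str                           # which tasks the unmanned vehicle needs to execute
    area_position: Optional[Coord] = None      # may be null
    start_position: Optional[Coord] = None     # may be null; otherwise equals the release position

first_instruction: List[DroneAreaTask] = []    # (pi_1, ..., pi_n) for the unmanned aerial vehicle
second_instruction: List[VehicleAreaTask] = [] # (r_1, ..., r_n) for the unmanned vehicle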
The overall flow of the method in the embodiments of the present application is described below by taking as an example that one unmanned aerial vehicle and one unmanned vehicle are responsible for performing tasks in multiple areas.
As shown in fig. 3, when the unmanned aerial vehicle and the unmanned vehicle have not yet received a task, they wait at their respective original positions for task scheduling. The original position may be a charging position. After a user inputs a task through the control center, the control center can arrange the task according to the information input by the user, generate task instructions, and issue them to the corresponding unmanned aerial vehicle and unmanned vehicle respectively. Let ri denote the task of the unmanned vehicle in target area i, and let πi denote the task of the unmanned aerial vehicle of conveying the unmanned vehicle to target area i. If there are n areas with tasks to be executed in total, the control center sends the second task instruction (r1, r2, …, rn) to the unmanned vehicle and the first task instruction (π1, π2, …, πn) to the unmanned aerial vehicle. After receiving their respective task instructions, the unmanned vehicle and the unmanned aerial vehicle each start their systems and prepare to set off. The unmanned aerial vehicle extracts the task π1 for target area 1 from the first task instruction and starts to execute it, conveying the unmanned vehicle to target area 1. The unmanned vehicle then extracts the task r1 for target area 1 from the second task instruction and executes it in target area 1. After the unmanned vehicle finishes in target area 1, the unmanned aerial vehicle executes task π2 and conveys the unmanned vehicle to target area 2, where the unmanned vehicle executes task r2. This continues in turn until the unmanned vehicle has executed the tasks in all n target areas, after which the unmanned aerial vehicle conveys the unmanned vehicle back to its original position.
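As a rough Python sketch of this scheduling step, the control center could build the two instruction lists from the user's per-area input and dispatch them as below, reusing the illustrative dataclasses sketched earlier. The send method and the structure of area_specs are assumptions, not the disclosed implementation.

# Hypothetical sketch of the control center generating and dispatching instructions.
def dispatch_tasks(control_center, drone, vehicle, area_specs):
    """area_specs: one user-supplied specification per target area i = 1..n."""
    first_instruction = []    # (pi_1, ..., pi_n) for the unmanned aerial vehicle
    second_instruction = []   # (r_1, ..., r_n) for the unmanned vehicle
    for spec in area_specs:
        first_instruction.append(DroneAreaTask(
            area_position=spec["area_position"],
            release_position=spec["release_position"],
            grab_position=spec["completion_position"],   # grab point equals the vehicle's completion point
            description="convey the unmanned vehicle to the area and pick it up when notified"))
        second_instruction.append(VehicleAreaTask(
            completion_position=spec["completion_position"],
            description=spec["task"]))                    # e.g. area cleaning, target search, map reconstruction
    control_center.send(drone, first_instruction)
    control_center.send(vehicle, second_instruction)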
As shown in fig. 4, after the unmanned aerial vehicle and the unmanned vehicle each receive their task instructions, the unmanned aerial vehicle can grab the unmanned vehicle at its original position, automatically plan a flight route, fly to the first target area, and release the unmanned vehicle. The unmanned vehicle can start executing the specified task from the start position in the target area (namely the release position used by the unmanned aerial vehicle), and after execution is finished, drive to the completion position (namely the next grabbing position of the unmanned aerial vehicle).
In one embodiment, the unmanned vehicle may perform different designated tasks in different target areas, or the same designated task. As shown in fig. 5, the designated task in target area 1 is area cleaning, the designated task in target area 2 is target search, and the designated task in target area n is map reconstruction. It should be understood that these are merely examples; in practical applications the designated tasks may be other tasks, or the designated tasks of some target areas may be the same while those of other target areas differ.
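A minimal sketch of how the unmanned vehicle might map each area's task description onto a concrete behavior is given below; the handler method names are hypothetical and only mirror the examples of fig. 5.

# Hypothetical mapping from task descriptions to vehicle behaviors.
def run_designated_task(vehicle, task):
    handlers = {
        "area cleaning": vehicle.clean_area,          # target area 1 in the example
        "target search": vehicle.search_for_target,   # target area 2
        "map reconstruction": vehicle.rebuild_map,    # target area n
    }
    handler = handlers.get(task.description)
    if handler is None:
        raise ValueError(f"unknown designated task: {task.description}")
    handler()
    vehicle.drive_to(task.completion_position)        # then drive to the completion position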
As shown in fig. 5, after the unmanned vehicle performs the designated task in each target area, the unmanned aerial vehicle may transport the unmanned vehicle to the next target area.
As shown in fig. 6, after the unmanned vehicle performs the task of the last target area, the unmanned aerial vehicle may grab the unmanned vehicle at the completion position and then automatically plan a flight route to transport the unmanned vehicle back to its original position.
As shown in fig. 7, an overall flow diagram for one unmanned aerial vehicle and one unmanned vehicle performing tasks in n target areas is shown. First, the unmanned vehicle and the unmanned aerial vehicle respectively load the task information (r1, r2, …, rn) and (π1, π2, …, πn) from the received task instructions. Then, the unmanned aerial vehicle grabs the unmanned vehicle and carries it to target area 1, releasing it at a safe distance above the release position of target area 1; the unmanned vehicle executes its task according to r1 and the unmanned aerial vehicle executes its task according to π1. After the unmanned vehicle finishes executing its task, it drives to the completion position of target area 1 and notifies the unmanned aerial vehicle. The unmanned aerial vehicle then carries the unmanned vehicle to the next target area to execute the task according to the above steps, and so on, until the tasks of all n target areas have been executed. The unmanned aerial vehicle then grabs the unmanned vehicle, flies back to the original position, and releases the unmanned vehicle at a safe distance above the original position, after which the unmanned vehicle and the unmanned aerial vehicle each return to the charging area by themselves to charge.
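The flow of fig. 7 can be condensed into the Python sketch below, which reuses the illustrative helpers defined earlier (grab, release, run_designated_task). It is an assumed outline of the coordination loop, not the disclosed implementation; in particular, wait_for_completion_notification stands in for whatever signalling the vehicle uses to notify the unmanned aerial vehicle.

# Hypothetical end-to-end loop for one drone and one unmanned vehicle over n target areas.
def execute_mission(drone, vehicle, first_instruction, second_instruction, home_position):
    grab(drone, vehicle)                                   # pick the vehicle up at its original position
    for pi_i, r_i in zip(first_instruction, second_instruction):
        drone.fly_to(pi_i.area_position)                   # carry the vehicle to target area i
        release(drone, vehicle, pi_i.release_position)     # release at a safe distance above the release point
        run_designated_task(vehicle, r_i)                  # vehicle executes r_i and ends at the completion position
        drone.wait_for_completion_notification(vehicle)    # vehicle notifies the drone when the area is done
        grab(drone, vehicle)                               # pick it up again at the grab/completion position
    drone.fly_to(home_position)                            # after all n areas, carry the vehicle back home
    release(drone, vehicle, home_position)                 # both then return to the charging area by themselves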
It should be understood that, although the steps in the flowchart of fig. 2 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not performed in a strictly limited order and may be performed in other orders. Moreover, at least some of the steps in fig. 2 may include multiple sub-steps or stages; these are not necessarily performed at the same moment but may be performed at different times, and their order of execution is not necessarily sequential, as they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, there is provided an unmanned vehicle task performing system 800 comprising a control center, at least one unmanned aerial vehicle, and at least one unmanned vehicle, wherein:
The control center 802 is configured to send corresponding task instructions to the at least one unmanned aerial vehicle 804 and the at least one unmanned vehicle 806, respectively.
The unmanned aerial vehicle 804 is configured to, in response to the received first task instruction, convey the unmanned vehicle 806 specified by the first task instruction to a target area and release it; the target area is a task execution area specified by the first task instruction for the unmanned vehicle 806.
The unmanned vehicle 806 is configured to, in response to the received second task instruction, execute the task specified by the second task instruction for the current target area in the current target area to which it is released, and after execution is completed, send a task completion notification to the unmanned aerial vehicle 804.
The unmanned aerial vehicle 804 is further configured to, if the unmanned vehicle 806 still has unexecuted tasks in target areas other than the current target area, convey the unmanned vehicle 806 to the next target area and release it after receiving the task completion notification.
The unmanned vehicle 806 is further configured to take the next target area to which it is released as the current target area and return to the step of responding to the received second task instruction, executing the task specified by the second task instruction for the current target area in the current target area to which it is released.
In one embodiment, the unmanned aerial vehicle 804 is further configured to fly to the original position of the unmanned vehicle 806 specified by the first task instruction and grab the unmanned vehicle 806; fly to the target area; and, within the target area, fly to the release position specified by the first task instruction and release the unmanned vehicle 806.
In one embodiment, the unmanned aerial vehicle 804 is further configured to search for the signal source of the release position in the target area according to the release position specified by the first task instruction; fly above the release position according to the signal source; fine-tune its position above the release position according to a graphic code arranged at the release position; and, after the fine-tuning is completed, release the unmanned vehicle 806.
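This coarse-to-fine approach could look roughly like the Python sketch below. The locate_signal_source and detect_graphic_code helpers and the 5 cm tolerance are assumptions, since the disclosure does not specify how the signal source or the graphic code are processed; after this fine adjustment, the release sequence sketched earlier (descend to the safe release distance and disengage the coupling) would run.

# Hypothetical coarse-to-fine positioning above the release point; helper names are assumed.
def approach_release_position(drone, release_position):
    source = drone.locate_signal_source(release_position)     # search for the signal source at the release point
    drone.fly_above(source)                                    # coarse approach guided by the signal source
    while True:
        offset = drone.detect_graphic_code(release_position)  # graphic code arranged at the release position
        if offset.norm() < 0.05:                               # assumed 5 cm tolerance for "fine-tuned"
            break
        drone.adjust_position(offset)                          # nudge until centred over the code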
In one embodiment, the unmanned vehicle 806 is further configured to, after execution is completed, drive to the completion position specified by the second task instruction for the current target area. The unmanned aerial vehicle 804 is further configured to fly to the completion position after receiving the task completion notification, grab the unmanned vehicle 806 at the completion position, and convey the unmanned vehicle 806 to the next target area.
In one embodiment, the unmanned aerial vehicle 804 is further configured to, if the first task instruction includes a task of conveying the unmanned vehicle 806 to the next target area, grab the unmanned vehicle 806 and fly to the next target area specified by the first task instruction for the unmanned vehicle 806; and, in the next target area, fly to the release position specified by the first task instruction and release the unmanned vehicle 806.
In one embodiment, the unmanned vehicle 806 is further configured to send the second task instruction to the unmanned aerial vehicle 804 if the first task instruction does not include a task of conveying the unmanned vehicle 806 to the next target area. The unmanned aerial vehicle 804 is further configured to grab the unmanned vehicle 806 and fly to the next target area specified by the second task instruction; and, in the next target area, fly to the start position specified for the unmanned vehicle 806 by the second task instruction and release the unmanned vehicle 806.
In one embodiment, electromechanical coupling devices are provided on the unmanned aerial vehicle 804 and the unmanned vehicle 806 respectively. The unmanned aerial vehicle 804 and the unmanned vehicle 806 are further configured to respectively activate their electromechanical coupling devices so that the unmanned aerial vehicle 804 grabs the unmanned vehicle 806 through the devices, and to respectively activate them again so that the unmanned aerial vehicle 804 releases the unmanned vehicle 806.
In the unmanned vehicle task execution system, the control center sends corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle respectively. The unmanned aerial vehicle responds to the received first task instruction by conveying the unmanned vehicle specified by that instruction to a target area and releasing it. The unmanned vehicle responds to the received second task instruction by executing, in the current target area to which it is released, the task specified by the second task instruction for that area, and sends a task completion notification to the unmanned aerial vehicle after execution is finished. If the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, conveys the unmanned vehicle to the next target area and releases it, so that the unmanned vehicle returns to the step of responding to the received second task instruction and executing the specified task in the released current target area, together with the subsequent steps. The control center thus controls the unmanned aerial vehicle to carry the unmanned vehicle to different areas, so that the unmanned vehicle can execute tasks in different areas in turn, avoiding the limitation on the unmanned vehicle executing tasks. In addition, the scheme of the present application only requires the control center to send task instructions to the unmanned aerial vehicle and the unmanned vehicle; it is simple to operate, relatively low-cost and easy to maintain, and has wide applications, such as cleaning of large-scale photovoltaic panels, high-altitude cleaning, search and rescue exploration, and map reconstruction of complex areas.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 9. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, an operator network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement an unmanned vehicle mission performing method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the structure shown in fig. 9 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer devices to which the solution applies; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered as within the scope of this specification.
The above embodiments only express several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention patent. It should be noted that several variations and modifications can be made by those of ordinary skill in the art without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An unmanned vehicle task execution method, the method comprising:
the control center respectively sends corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle;
the unmanned aerial vehicle responds to the received first task instruction, and conveys the unmanned vehicle specified by the first task instruction to a target area and releases the unmanned vehicle; the target area is a task execution area specified by the first task instruction for the unmanned vehicle;
the unmanned vehicle responds to the received second task instruction, executes the task specified by the second task instruction for the current target area in the current target area to which it is released, and sends a task completion notification to the unmanned aerial vehicle after the execution is finished;
if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, the unmanned aerial vehicle, after receiving the task completion notification, conveys the unmanned vehicle to the next target area and releases the unmanned vehicle, so as to return to the step in which the unmanned vehicle responds to the received second task instruction and executes, in the current target area to which it is released, the task specified by the second task instruction for the current target area, and the subsequent steps.
2. The method of claim 1, wherein the delivering and releasing the drone vehicle specified by the first mission command to a target area comprises:
the unmanned aerial vehicle flies to the original position of the unmanned vehicle specified by the first task instruction and grabs the unmanned vehicle;
the unmanned aerial vehicle flies to a target area;
and the unmanned aerial vehicle flies, in the target area, to a release position specified by the first task instruction and releases the unmanned vehicle.
3. The method of claim 2, wherein the unmanned aerial vehicle flying, in the target area, to the release position specified by the first task instruction and releasing the unmanned vehicle comprises:
the unmanned aerial vehicle searches a signal source of the release position in the target area according to the release position specified by the first task instruction;
the unmanned aerial vehicle flies above the release position according to the signal source;
the unmanned aerial vehicle finely adjusts the position of the unmanned aerial vehicle above the release position according to the graphic code arranged at the release position;
and after the fine adjustment is completed, the unmanned aerial vehicle releases the unmanned vehicle.
4. The method of claim 1, wherein, after the unmanned vehicle finishes executing, the method further comprises:
the unmanned vehicle drives to a completion position specified by the second task instruction for the current target area;
and the unmanned aerial vehicle, after receiving the task completion notification, conveying the unmanned vehicle to a next target area and releasing the unmanned vehicle comprises:
after receiving the task completion notification, the unmanned aerial vehicle flies to the completion position;
the unmanned aerial vehicle grabs the unmanned vehicle at the completion position and conveys the unmanned vehicle to the next target area.
5. The method of claim 1, wherein said transporting the unmanned vehicle to a next target area and releasing comprises:
if the first task instruction comprises a task of conveying the unmanned vehicle to a next target area, the unmanned aerial vehicle grabs the unmanned vehicle and flies to the next target area specified by the first task instruction for the unmanned vehicle;
and the unmanned aerial vehicle flies, in the next target area, to a release position specified by the first task instruction and releases the unmanned vehicle.
6. The method of claim 1, wherein said transporting the unmanned vehicle to a next target area and releasing comprises:
if the first task instruction does not contain a task of conveying the unmanned vehicle to a next target area, the unmanned vehicle sends the second task instruction to the unmanned aerial vehicle;
the unmanned aerial vehicle grabs the unmanned vehicle and flies to the next target area specified by the second task instruction;
and the unmanned aerial vehicle flies, in the next target area, to the start position specified by the second task instruction for the unmanned vehicle, and releases the unmanned vehicle.
7. The method according to any one of claims 2 to 6, wherein electromechanical coupling devices are provided on the unmanned aerial vehicle and the unmanned vehicle, respectively;
the step of the unmanned aerial vehicle grabbing the unmanned vehicle comprises:
the unmanned aerial vehicle and the unmanned vehicle respectively start respective electromechanical coupling devices, so that the unmanned aerial vehicle can grab the unmanned vehicle through the electromechanical coupling devices;
and the step of the unmanned aerial vehicle releasing the unmanned vehicle comprises:
the unmanned aerial vehicle and the unmanned vehicle respectively start respective electromechanical coupling devices so that the unmanned aerial vehicle releases the unmanned vehicle.
8. An unmanned vehicle task execution system, characterized in that the system comprises a control center, at least one unmanned aerial vehicle, and at least one unmanned vehicle, wherein:
the control center is used for respectively sending corresponding task instructions to at least one unmanned aerial vehicle and at least one unmanned vehicle;
the unmanned aerial vehicle is configured to, in response to the received first task instruction, convey the unmanned vehicle specified by the first task instruction to a target area and release the unmanned vehicle; the target area is a task execution area specified by the first task instruction for the unmanned vehicle;
the unmanned vehicle is used for responding to the received second task instruction, executing the task specified by the second task instruction aiming at the current target area in the released current target area, and sending a task completion notice to the unmanned aerial vehicle after the execution is finished;
the unmanned aerial vehicle is further configured to, if the unmanned vehicle still has unexecuted tasks in target areas other than the current target area, convey the unmanned vehicle to a next target area and release the unmanned vehicle after receiving the task completion notification;
and the unmanned vehicle is further configured to take the next target area to which it is released as the current target area, and return to the operation of responding to the received second task instruction and executing, in the current target area to which it is released, the task specified by the second task instruction for the current target area.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011257152.1A 2020-11-11 2020-11-11 Unmanned vehicle task execution method, system, computer device and storage medium Withdrawn CN112330167A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011257152.1A CN112330167A (en) 2020-11-11 2020-11-11 Unmanned vehicle task execution method, system, computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011257152.1A CN112330167A (en) 2020-11-11 2020-11-11 Unmanned vehicle task execution method, system, computer device and storage medium

Publications (1)

Publication Number Publication Date
CN112330167A true CN112330167A (en) 2021-02-05

Family

ID=74319029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011257152.1A Withdrawn CN112330167A (en) 2020-11-11 2020-11-11 Unmanned vehicle task execution method, system, computer device and storage medium

Country Status (1)

Country Link
CN (1) CN112330167A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113325836A (en) * 2021-04-21 2021-08-31 优兔创新有限公司 Method for unmanned vehicle to execute task in task area and unmanned vehicle system
CN114372749A (en) * 2022-01-06 2022-04-19 北京京东乾石科技有限公司 Task processing method and device for unmanned vehicle
WO2022252429A1 (en) * 2021-06-02 2022-12-08 北京三快在线科技有限公司 Unmanned device control method and apparatus, storage medium, and electronic device


Similar Documents

Publication Publication Date Title
CN109917767B (en) Distributed unmanned aerial vehicle cluster autonomous management system and control method
CN112330167A (en) Unmanned vehicle task execution method, system, computer device and storage medium
US11455896B2 (en) Unmanned aerial vehicle power management
US11215986B2 (en) Automated drone systems
CN109478060B (en) Aerial work support and real-time management
US9973737B1 (en) Unmanned aerial vehicle assistant for monitoring of user activity
US9922306B1 (en) Mobile RFID reading systems
US9714139B1 (en) Managing inventory items via overhead drive units
US20200026720A1 (en) Construction and update of elevation maps
US9997080B1 (en) Decentralized air traffic management system for unmanned aerial vehicles
WO2018059325A1 (en) Flight control method and system for unmanned aerial vehicle
CN109891443A (en) System and method for using the internal reservoir content of one or more internal control monitoring unmanned shipment reservoirs
CN104881042B (en) A kind of multiple dimensioned air remote sensing test platform
WO2017131587A9 (en) System and method for controlling an unmanned vehicle and releasing a payload from the same
WO2016130716A2 (en) Geographic survey system for vertical take-off and landing (vtol) unmanned aerial vehicles (uavs)
WO2018028358A1 (en) Method, apparatus and system for implementing formation flying
CN104615143A (en) Unmanned aerial vehicle scheduling method
CN110203395B (en) Method and system for detecting intelligent child equipment carried by mother aircraft of unmanned aerial vehicle
CN104062980A (en) Onboard panorama monitoring system of unmanned aerial vehicle
CN112639735A (en) Distribution of calculated quantities
CN112817331A (en) Intelligent forestry information monitoring system based on multi-machine cooperation
CN113568427B (en) Unmanned aerial vehicle autonomous landing mobile platform method and system
Stampa et al. A scenario for a multi-UAV mapping and surveillance system in emergency response applications
Swain et al. Deep reinforcement learning based target detection for unmanned aerial vehicle
Wu et al. Design and implementation of an unmanned aerial and ground vehicle recharging system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210205

WW01 Invention patent application withdrawn after publication