CN113095714A - Task execution method and device for unmanned aerial vehicle, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113095714A
Authority
CN
China
Prior art keywords
task
unmanned aerial
aerial vehicle
model
execution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110468809.7A
Other languages
Chinese (zh)
Inventor
高文敏
陈杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN202110468809.7A priority Critical patent/CN113095714A/en
Publication of CN113095714A publication Critical patent/CN113095714A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311Scheduling, planning or task assignment for a person or group
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06312Adjustment or analysis of established resource schedule, e.g. resource or task levelling, or dynamic rescheduling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0637Strategic management or analysis, e.g. setting a goal or target of an organisation; Planning actions based on goals; Analysis or evaluation of effectiveness of goals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/103Workflow collaboration or project management

Abstract

The application provides a task execution method and device for an unmanned aerial vehicle, electronic equipment and a storage medium, and relates to the technical field of unmanned aerial vehicles. The method comprises the following steps: acquiring information of a task to be adapted and an unmanned aerial vehicle, wherein the information of the unmanned aerial vehicle comprises performance parameters of the unmanned aerial vehicle; determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters, wherein the performance parameters comprise an unmanned aerial vehicle model and a camera model; and if the adaptation result is that the task matches the unmanned aerial vehicle, controlling the unmanned aerial vehicle to execute the task. On one hand, the method improves the efficiency of matching tasks with unmanned aerial vehicles and reduces the time the matching consumes; on the other hand, the adaptation logic provides the user with a more accurate matching result, improving the accuracy of the task execution result.

Description

Task execution method and device for unmanned aerial vehicle, electronic equipment and storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to a task execution method and device of an unmanned aerial vehicle, electronic equipment and a storage medium.
Background
With their continuous development, unmanned aerial vehicles have been widely used in fields such as aerial photography, power inspection, environmental monitoring, forest fire prevention, disaster inspection, anti-terrorism and rescue, military reconnaissance, and battlefield assessment. As the types and performance of unmanned aerial vehicles have developed in recent years, each unmanned aerial vehicle can execute one or more different types of tasks, and improving the adaptation between the tasks to be executed and the unmanned aerial vehicles has become increasingly important for improving operation efficiency.
In the prior art, before a task is executed, the user is required to select an unmanned aerial vehicle to be adapted to the task according to his or her own experience and subjective judgment.
However, this approach leads to long adaptation times and poor adaptation between the task and the unmanned aerial vehicle.
Disclosure of Invention
An object of the application is to provide a task execution method, device, electronic device and storage medium for an unmanned aerial vehicle, aiming at the defects in the prior art, so as to solve the prior-art problems of time-consuming and poorly adapted matching between a task and an unmanned aerial vehicle.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a task execution method for an unmanned aerial vehicle, including:
acquiring information of a task to be adapted and an unmanned aerial vehicle, wherein the information of the unmanned aerial vehicle comprises: performance parameters of the drone;
determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle, wherein the performance parameters comprise: unmanned aerial vehicle model and camera model;
and if the adaptation result is that the task is matched with the unmanned aerial vehicle, controlling the unmanned aerial vehicle to execute the task.
Optionally, the determining, according to the performance parameter of the drone, an adaptation result of the task and the drone includes:
judging whether the model of the unmanned aerial vehicle and the model of the camera are matched with the task;
and determining an adaptation result of the task and the unmanned aerial vehicle according to the matching result.
Optionally, the determining, according to the matching result, an adaptation result of the task and the unmanned aerial vehicle includes:
if the unmanned aerial vehicle model and the camera model both match the task, determining that the adaptation result of the task and the unmanned aerial vehicle is: the task matches the unmanned aerial vehicle;
if the unmanned aerial vehicle model and/or the camera model does not match the task, determining that the adaptation result of the task and the unmanned aerial vehicle is: the task does not match the unmanned aerial vehicle.
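As an illustrative sketch only (not part of the patent disclosure; the function and parameter names, and the lookup-table representation, are hypothetical), the two-branch adaptation rule above could be expressed as:

```python
def adaptation_result(task, drone_model, camera_model,
                      supported_drones, supported_cameras):
    """Return True ("task matches the drone") only when both the drone
    model and the camera model are among the models predetermined for
    the task; any mismatch (drone and/or camera) yields "not matched"."""
    drone_ok = drone_model in supported_drones.get(task, set())
    camera_ok = camera_model in supported_cameras.get(task, set())
    return drone_ok and camera_ok
```

Representing the predetermined task-to-model associations as sets keeps the check a constant-time membership test per parameter.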
Optionally, if the adaptation result is that the task is matched with the drone, controlling the drone to execute the task includes:
if the adaptation result is that the task is matched with the unmanned aerial vehicle, displaying a first result, wherein the first result comprises: prompt information, user operable options;
controlling the drone to perform the task in response to a user confirmation operation input for the user-operable option.
Optionally, if the adaptation result is that the task is matched with the drone, displaying a first result, including:
determining the execution progress of the task;
and displaying a first result corresponding to the task according to the execution progress of the task.
Optionally, the displaying a first result corresponding to the task according to the execution progress of the task includes:
if the execution progress of the task is completed, the model of the current unmanned aerial vehicle is the same as the model of the unmanned aerial vehicle and the model of the camera used when the task is completed, and data in a storage module of the current unmanned aerial vehicle are changed, displaying first prompt information and a first user operable option, wherein the first prompt information comprises: detecting a change in data under the task, the first user-operable option comprising: re-executed or cancelled.
Optionally, the displaying a first result corresponding to the task according to the execution progress of the task includes:
if the execution progress of the task is partially completed, and the current unmanned aerial vehicle has the same camera model as the one used when the task was partially completed but a different unmanned aerial vehicle model, displaying second prompt information and a second user-operable option, wherein the second prompt information comprises: the current unmanned aerial vehicle model differs from the model of the unmanned aerial vehicle used for the partially completed task, and the second user-operable option comprises: breakpoint execution or re-execution;
if the execution progress of the task is partially completed, and the current unmanned aerial vehicle has a different camera model from the one used when the task was partially completed but the same unmanned aerial vehicle model, displaying third prompt information and a third user-operable option, wherein the third prompt information comprises: the current camera model differs from the one used for the partially completed task, and breakpoint execution may cause inconsistency in the collected data, and the third user-operable option comprises: breakpoint execution or re-execution;
if the execution progress of the task is partially completed, and the current unmanned aerial vehicle differs in both unmanned aerial vehicle model and camera model from those used when the task was partially completed, displaying fourth prompt information and a fourth user-operable option, wherein the fourth prompt information comprises: the current unmanned aerial vehicle model and camera model differ from those used for the partially completed task, and breakpoint execution may cause inconsistency in the collected data, and the fourth user-operable option comprises: breakpoint execution or re-execution.
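The progress-dependent display cases above (including the completed-task case described earlier) amount to a small dispatch table. A minimal sketch, with hypothetical function name, prompt strings, and option labels not taken from the patent:

```python
def first_result(progress, same_drone, same_camera, data_changed=False):
    """Map the task's execution progress and the drone/camera model
    comparisons to a (prompt, user-operable options) pair.

    Only the cases the description enumerates are covered; any other
    combination returns no prompt."""
    if progress == "completed" and same_drone and same_camera and data_changed:
        return ("a data change has been detected under the task",
                ["re-execute", "cancel"])
    if progress == "partial":
        if same_camera and not same_drone:
            return ("current drone model differs from the one used so far",
                    ["breakpoint-execute", "re-execute"])
        if same_drone and not same_camera:
            return ("camera model differs; breakpoint execution may yield "
                    "inconsistent data",
                    ["breakpoint-execute", "re-execute"])
        if not same_drone and not same_camera:
            return ("drone and camera models differ; breakpoint execution "
                    "may yield inconsistent data",
                    ["breakpoint-execute", "re-execute"])
    return (None, [])
```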
Optionally, the method further comprises:
if the adaptation result is that the task does not match the unmanned aerial vehicle, displaying a second result, wherein the second result comprises: prompt information.
Optionally, if the adaptation result is that the task is not matched with the drone, displaying a second result, including:
if the unmanned aerial vehicle model does not match the task and the camera model matches the task, displaying fifth prompt information, wherein the fifth prompt information comprises: the current unmanned aerial vehicle does not support the task;
if the unmanned aerial vehicle model matches the task and the camera model does not match the task, displaying sixth prompt information, wherein the sixth prompt information comprises: the current camera does not support the task;
if neither the unmanned aerial vehicle model nor the camera model matches the task, displaying seventh prompt information, wherein the seventh prompt information comprises: neither the current unmanned aerial vehicle nor the camera supports the task.
Optionally, said controlling said drone to perform said task in response to a confirmation operation entered by a user for said user-operable option, comprises:
in response to a confirmation operation of the user input for the user-operable option, controlling the drone to execute the task in the selected execution mode, the execution mode including: breakpoint execution or re-execution.
Optionally, the acquiring information of the task to be adapted and the unmanned aerial vehicle includes:
and responding to the confirmation operation of the task and the unmanned aerial vehicle input by the user on the task execution interface, and acquiring the information of the task and the unmanned aerial vehicle to be adapted.
Optionally, before the information of the task to be adapted and the unmanned aerial vehicle is acquired in response to the confirmation operation of the task and the unmanned aerial vehicle input by the user on the task execution interface, the method further includes:
responding to a task area drawn in a target map displayed on a task creation interface by a user, and displaying a recommended task to the user according to the content contained in the task area;
and creating at least one task in response to the recommended task confirmation operation input by the user.
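The area-based recommendation above could be realized as a lookup from content found in the drawn task area to candidate tasks. A minimal sketch; the rule mapping and all names are hypothetical assumptions, not disclosed in the patent:

```python
def recommend_tasks(area_contents, rules):
    """Recommend every task whose trigger content appears in the
    contents detected inside the user-drawn task area."""
    recommended = []
    for content, task in rules.items():
        if content in area_contents:
            recommended.append(task)
    return recommended
```

For example, with rules such as {"farmland": "crop survey", "power line": "power inspection"}, drawing an area that contains farmland would surface a crop-survey task for the user to confirm.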
Optionally, the determining, according to the performance parameter of the drone, an adaptation result of the task and the drone includes:
judging whether the unmanned aerial vehicle is connected to a remote controller;
and if so, determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle.
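The remote-controller check above is a simple guard placed in front of the adaptation logic. An illustrative sketch (hypothetical names; the drone is modeled as a plain dict for brevity):

```python
def adapt_if_connected(drone, task, adapt):
    """Run the adaptation check only when the drone is connected to a
    remote controller; otherwise no adaptation result is produced."""
    if not drone.get("rc_connected", False):
        return None
    return adapt(task, drone)
```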
In a second aspect, an embodiment of the present application further provides a task execution device for an unmanned aerial vehicle, including an acquisition module, a determination module, and an execution module;
the acquisition module is used for acquiring the information of the task to be adapted and the unmanned aerial vehicle, and the information of the unmanned aerial vehicle comprises: performance parameters of the drone;
the determining module is configured to determine an adaptation result of the task and the unmanned aerial vehicle according to a performance parameter of the unmanned aerial vehicle, where the performance parameter includes: unmanned aerial vehicle model and camera model;
and the execution module is used for controlling the unmanned aerial vehicle to execute the task if the adaptation result is that the task is matched with the unmanned aerial vehicle.
Optionally, the determining module is specifically configured to determine whether the model of the unmanned aerial vehicle and the model of the camera are matched with the task; and determining an adaptation result of the task and the unmanned aerial vehicle according to the matching result.
Optionally, the determining module is specifically configured to: if the unmanned aerial vehicle model and the camera model both match the task, determine that the adaptation result of the task and the unmanned aerial vehicle is: the task matches the unmanned aerial vehicle; if the unmanned aerial vehicle model and/or the camera model does not match the task, determine that the adaptation result of the task and the unmanned aerial vehicle is: the task does not match the unmanned aerial vehicle.
Optionally, the executing module is specifically configured to display a first result if the adaptation result is that the task is matched with the unmanned aerial vehicle, where the first result includes: prompt information, user operable options; controlling the drone to perform the task in response to a user confirmation operation input for the user-operable option.
Optionally, the execution module is specifically configured to determine an execution progress of the task; and displaying a first result corresponding to the task according to the execution progress of the task.
Optionally, the execution module is specifically configured to display first prompt information and a first user-operable option if the execution progress of the task is completed, the current unmanned aerial vehicle has the same unmanned aerial vehicle model and camera model as those used when the task was completed, and data in a storage module of the current unmanned aerial vehicle has changed, wherein the first prompt information comprises: a data change has been detected under the task, and the first user-operable option comprises: re-execute or cancel.
Optionally, the execution module is specifically configured to, if the execution progress of the task is partially completed, and the current unmanned aerial vehicle has the same camera model as the one used when the task was partially completed but a different unmanned aerial vehicle model, display second prompt information and a second user-operable option, wherein the second prompt information comprises: the current unmanned aerial vehicle model differs from the model of the unmanned aerial vehicle used for the partially completed task, and the second user-operable option comprises: breakpoint execution or re-execution;
if the execution progress of the task is partially completed, and the current unmanned aerial vehicle has a different camera model from the one used when the task was partially completed but the same unmanned aerial vehicle model, display third prompt information and a third user-operable option, wherein the third prompt information comprises: the current camera model differs from the one used for the partially completed task, and breakpoint execution may cause inconsistency in the collected data, and the third user-operable option comprises: breakpoint execution or re-execution;
if the execution progress of the task is partially completed, and the current unmanned aerial vehicle differs in both unmanned aerial vehicle model and camera model from those used when the task was partially completed, display fourth prompt information and a fourth user-operable option, wherein the fourth prompt information comprises: the current unmanned aerial vehicle model and camera model differ from those used for the partially completed task, and breakpoint execution may cause inconsistency in the collected data, and the fourth user-operable option comprises: breakpoint execution or re-execution.
Optionally, the executing module is further configured to display a second result if the adaptation result is that the task does not match the drone, wherein the second result comprises: prompt information.
Optionally, the executing module is specifically configured to display fifth prompt information if the unmanned aerial vehicle model does not match the task and the camera model matches the task, wherein the fifth prompt information comprises: the current unmanned aerial vehicle does not support the task;
display sixth prompt information if the unmanned aerial vehicle model matches the task and the camera model does not match the task, wherein the sixth prompt information comprises: the current camera does not support the task;
and display seventh prompt information if neither the unmanned aerial vehicle model nor the camera model matches the task, wherein the seventh prompt information comprises: neither the current unmanned aerial vehicle nor the camera supports the task.
Optionally, the execution module is specifically configured to, in response to a confirmation operation input by the user for the user-operable option, control the unmanned aerial vehicle to execute the task according to the selected execution manner, where the execution manner includes: breakpoint execution or re-execution.
Optionally, the obtaining module is specifically configured to obtain information of the task to be adapted and the unmanned aerial vehicle in response to a confirmation operation of the task and the unmanned aerial vehicle input by the user on the task execution interface.
Optionally, the apparatus further comprises: a display module and a creation module;
the display module is used for responding to a task area drawn in a target map displayed on a task creation interface by a user, and displaying a recommended task to the user according to the content contained in the task area;
the creating module is used for creating at least one task in response to the recommended task confirmation operation input by the user.
Optionally, the determining module is specifically configured to determine whether the drone is connected to a remote controller; and if so, determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor; when the electronic device is operated, the processor and the storage medium communicate via the bus, and the processor executes the machine-readable instructions to perform the steps of the method provided in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, performs the steps of the method as provided in the first aspect.
The beneficial effects of this application are:
the application provides a task execution method and device for an unmanned aerial vehicle, an electronic device and a storage medium, wherein the method comprises the following steps: acquire the information of the task and the unmanned aerial vehicle of treating the adaptation, unmanned aerial vehicle's information includes: performance parameters of the drone; determining an adaptation result of the task and the unmanned aerial vehicle according to performance parameters of the unmanned aerial vehicle, wherein the performance parameters comprise: unmanned aerial vehicle model and camera model; and if the adaptation result is that the task is matched with the unmanned aerial vehicle, controlling the unmanned aerial vehicle to execute the task. In this scheme, through carrying out intelligent matching judgement to task and unmanned aerial vehicle, can improve the matching efficiency of task and unmanned aerial vehicle on the one hand, it is consuming time to reduce to match, and on the other hand, through the adaptation logic, can provide comparatively accurate matching result for the user to improve the accuracy of task execution result.
In addition, the task selected by the user is judged against the unmanned aerial vehicle, and prompt information corresponding to the adaptation result is displayed, so that the user receives a friendly reminder about the match between the unmanned aerial vehicle and the task.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic diagram of a task execution system architecture of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a task execution method of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a task execution method of a drone according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application;
fig. 7 is a schematic diagram of a task and unmanned aerial vehicle matching interface provided in an embodiment of the present application;
fig. 8 is a schematic flowchart of a task execution method of a drone according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a task creation interface provided in an embodiment of the present application;
fig. 10 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application;
fig. 11 is a schematic diagram of a task execution device of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the drawings. It should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit its scope of protection; additionally, the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flowcharts may be performed out of order, and steps without a logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, a flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
Fig. 1 is a schematic diagram of a task execution system architecture of an unmanned aerial vehicle according to an embodiment of the present application; the task execution method of an unmanned aerial vehicle provided by the present application can be applied to this system. As shown in fig. 1, the system can include an unmanned aerial vehicle and a terminal device in communication connection with each other. An unmanned aerial vehicle management application is installed on the terminal device, and in the application's functional interface the user can select a task and the unmanned aerial vehicle to execute it. Based on the user's selection, adaptation logic judges whether the task and the unmanned aerial vehicle match and displays corresponding reminder information to the user; when the task and the unmanned aerial vehicle match, the unmanned aerial vehicle is controlled to execute the task.
Alternatively, the terminal device may be a mobile terminal, such as a mobile phone or a tablet, or a fixed terminal, such as a computer. The unmanned aerial vehicle includes an unmanned aerial vehicle body and a camera, and the tasks an unmanned aerial vehicle can execute differ according to its unmanned aerial vehicle model and camera model. Judging the match between a task and an unmanned aerial vehicle makes it possible to execute the target task with a better-matched unmanned aerial vehicle, thereby improving the accuracy of the task execution result.
Fig. 2 is a schematic flowchart of a task execution method of an unmanned aerial vehicle according to an embodiment of the present application; the execution subject of the method can be the terminal device. As shown in fig. 2, the method may include:
s201, acquiring information of a task to be adapted and an unmanned aerial vehicle, wherein the information of the unmanned aerial vehicle comprises: performance parameters of the drone.
Optionally, the information of the task to be adapted and the unmanned aerial vehicle may be automatically identified according to information input by the user, may be directly input by the user, or may be selected through a user interface. For example, the task name or the identification/model of the unmanned aerial vehicle may be typed in, or a selection may be made directly from a task list or an unmanned aerial vehicle list displayed on the user interface; this is not specifically limited here.
S202, determining an adaptation result of the task and the unmanned aerial vehicle according to performance parameters of the unmanned aerial vehicle, wherein the performance parameters comprise: unmanned aerial vehicle model and camera model.
Generally, a task can have a task type and a task state, and one task type can include a plurality of different tasks, each belonging to that type. Because different task types require different data to be collected, the unmanned aerial vehicle model and camera model to be used also differ; that is, different tasks require different unmanned aerial vehicles, and when an unmanned aerial vehicle and a task match each other, the unmanned aerial vehicle can execute the task with the greatest efficiency and accuracy. The task state may include: currently unexecuted, partially executed, or completed.
In an optional mode, the adaptation result of the task selected by the user and the unmanned aerial vehicle can be comprehensively judged according to the model of the unmanned aerial vehicle and the model of the camera installed on the unmanned aerial vehicle.
Wherein, the tasks that different drone models and camera models can perform may be predetermined.
And S203, if the adaptation result is that the task is matched with the unmanned aerial vehicle, controlling the unmanned aerial vehicle to execute the task.
In an implementation manner, when it is determined that the adaptation result is that the task matches the drone, a control instruction may be sent to the drone, and the control instruction may include: the task and the execution mode adopted by the task are executed, so that the unmanned aerial vehicle is controlled to execute the task according to the execution mode.
In summary, the task execution method for the unmanned aerial vehicle provided by this embodiment includes: acquiring information of a task to be adapted and an unmanned aerial vehicle, where the information of the unmanned aerial vehicle includes performance parameters of the drone; determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle, where the performance parameters include the unmanned aerial vehicle model and the camera model; and, if the adaptation result is that the task matches the unmanned aerial vehicle, controlling the unmanned aerial vehicle to execute the task. In this scheme, by performing intelligent matching judgment on the task and the unmanned aerial vehicle, on the one hand, the matching efficiency of the task and the unmanned aerial vehicle can be improved and the time consumed by matching reduced; on the other hand, through the adaptation logic, a comparatively accurate matching result can be provided for the user, thereby improving the accuracy of the task execution result.
Fig. 3 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application; optionally, in step S202, determining an adaptation result between the task and the drone according to the performance parameter of the drone may include:
S301, judging whether the model of the unmanned aerial vehicle and the model of the camera match the task.
Optionally, whether the model of the unmanned aerial vehicle and the model of the camera match the task may be determined according to a predetermined correspondence between different unmanned aerial vehicle models and camera models and the task types they support. When an unmanned aerial vehicle model and a camera model support a certain task type, they also support all tasks belonging to that task type, so it can be judged whether the unmanned aerial vehicle model and the camera model match the task.
Taking application to the agricultural field as an example, the task types supported by the unmanned aerial vehicle may include different farming tasks. Table 1 below shows the task types supported by different unmanned aerial vehicle models and camera models:
TABLE 1
(Table 1 is provided as an image in the original publication; it lists, for each combination of unmanned aerial vehicle model and camera model, the task types supported.)
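A minimal sketch of such a predetermined correspondence follows; the model names and task-type labels below are illustrative placeholders, not values taken from Table 1:

```python
# Hypothetical correspondence between (drone model, camera model) pairs and
# the task types they support; all names here are illustrative assumptions.
SUPPORTED_TASK_TYPES = {
    ("drone_A", "camera_X"): {"fruit_tree", "flat_ground"},
    ("drone_A", "camera_Y"): {"flat_ground"},
    ("drone_B", "camera_X"): {"inclined", "voyage"},
}


def supports(drone_model: str, camera_model: str, task_type: str) -> bool:
    """A pair that supports a task type supports every task of that type."""
    return task_type in SUPPORTED_TASK_TYPES.get((drone_model, camera_model), set())
```

Because matching is defined at the task-type level, checking a concrete task reduces to checking the task type it belongs to.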
S302, according to the matching result, determining an adaptation result of the task and the unmanned aerial vehicle.
Optionally, the matching result of the model of the drone and the model of the camera to the task includes any one of: the unmanned aerial vehicle model and the camera model are matched with the task, the unmanned aerial vehicle model is matched with the task, the camera model is matched with the task, and the unmanned aerial vehicle model and the camera model are not matched with the task.
Based on the obtained matching result, the adaptation result of the task and the unmanned aerial vehicle can be determined.
Fig. 4 is a schematic flowchart of a task execution method of a drone according to an embodiment of the present application; optionally, in step S302, determining an adaptation result between the task and the unmanned aerial vehicle according to the matching result may include:
S401, if both the model of the unmanned aerial vehicle and the model of the camera match the task, determining that the adaptation result of the task and the unmanned aerial vehicle is as follows: the task matches the unmanned aerial vehicle.
Typically, a drone may be considered to match a task if and only if both the drone model and the camera model support execution of the task.
Optionally, when it is determined that both the model of the unmanned aerial vehicle and the model of the camera support the task, both the model of the unmanned aerial vehicle and the model of the camera are matched with the task, and then it is determined that an adaptation result of the task and the unmanned aerial vehicle is: the task is matched with the unmanned aerial vehicle.
S402, if the model of the unmanned aerial vehicle and/or the model of the camera are not matched with the task, determining that the adaptation result of the task and the unmanned aerial vehicle is as follows: the task does not match the drone.
When any one of the model of the drone and the model of the camera does not support execution of a certain task, the drone is considered not to match the task.
Optionally, when it is determined that neither the model of the unmanned aerial vehicle nor the model of the camera supports the task, or that the model of the unmanned aerial vehicle supports the task but the model of the camera does not, or that the model of the unmanned aerial vehicle does not support the task but the model of the camera does, then the model of the unmanned aerial vehicle and/or the model of the camera do not match the task, and it is determined that the adaptation result of the task and the unmanned aerial vehicle is: the task does not match the unmanned aerial vehicle.
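The decision in steps S401-S402 can be sketched as follows; this is a simplified illustration of the described logic, not the patented implementation:

```python
def adaptation_result(drone_matches: bool, camera_matches: bool) -> str:
    # The task matches the drone only when BOTH the drone model and the
    # camera model match the task (S401); in every other case it does not (S402).
    return "matched" if (drone_matches and camera_matches) else "not matched"
```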
Fig. 5 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application; optionally, in step S203, if the adaptation result is that the task is matched with the drone, controlling the drone to execute the task includes:
S501, if the adaptation result is that the task matches the unmanned aerial vehicle, displaying a first result, wherein the first result comprises: prompt information and user operable options.
In some embodiments, when the determined adaptation result is that the task matches the drone, the first result may be displayed to the user, where the prompt information may be used to provide a more friendly reminder to the user so that the user can grasp the degree of matching between the selected task and the drone and the special circumstances that may be encountered when the drone performs the task. The user-operable options may be used to enable the user to select an execution mode for the drone when performing the task.
And S502, responding to the confirmation operation input by the user aiming at the user operable option, and controlling the unmanned aerial vehicle to execute the task.
Optionally, the user operable option may include at least two options, and the user may input a confirmation operation of the operable option according to the prompt information and the task requirement, and in response to the confirmation operation, the drone may be controlled to execute the task according to the selection of the user.
Fig. 6 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application; optionally, in step S501, if the adaptation result is that the task matches the unmanned aerial vehicle, displaying a first result may include:
S601, determining the execution progress of the task.
In an implementation manner, when the task selected by the user matches the drone, that is, the drone is available to execute the task, the execution progress of the task selected by the user, that is, the task state mentioned above, may be further determined, so as to determine whether the task is currently completed, unexecuted, or partially executed.
Alternatively, historical execution data for the task may be queried to determine the progress of execution of the task.
And S602, displaying a first result corresponding to the task according to the execution progress of the task.
When the task is at different execution progresses, a first result corresponding to the progress can be displayed, so that prompt information for the task in its different execution states is displayed accurately and in a refined manner, and the user can learn the execution conditions of the selected task and unmanned aerial vehicle in each state.
Fig. 7 is a schematic diagram of a task and unmanned aerial vehicle matching interface provided in an embodiment of the present application.
Optionally, in step S602, displaying a first result corresponding to the task according to the execution progress of the task may include: if the execution progress of the task is completed, the model of the unmanned aerial vehicle and the model of the camera used by the current unmanned aerial vehicle when the task is completed are the same, and data in a storage module of the current unmanned aerial vehicle are changed, displaying first prompt information and a first user operable option, wherein the first prompt information comprises: detecting a change in data under the task, the first user-operable option comprising: re-executed or cancelled.
As shown in fig. 7, when the selected task is matched with the unmanned aerial vehicle, that is, both the model of the unmanned aerial vehicle and the model of the camera support the task, if the execution progress of the task is not executed (the progress is 0%), no prompt information is displayed, and the unmanned aerial vehicle is directly controlled to execute the task.
When the execution progress of the task is completed (the progress is 100%), in one case, the prompt information is directly displayed as follows: this task has been performed to completion, with the user operable options displayed: re-executed or cancelled.
In another case, if the data in the storage module of the current drone has changed, the first prompt information displayed includes: a change in data under the task is detected, and the first user operable option includes: re-execute or cancel. In this case, because the data in the storage module of the current unmanned aerial vehicle has changed, breakpoint execution could to a certain extent make the data collected by the unmanned aerial vehicle before and after the breakpoint inconsistent, thereby interfering with normal data analysis; therefore the task can be carried out by re-execution, or the execution can be cancelled and another unmanned aerial vehicle selected to execute it.
Continuing with fig. 7, optionally, in step S602, displaying a first result corresponding to the task according to the execution progress of the task may further include: if the execution progress of the task is partially completed, the camera model used by the current unmanned aerial vehicle is the same as that used for the partially completed task, but the unmanned aerial vehicle models are different, displaying second prompt information and a second user operable option, wherein the second prompt information comprises: the current unmanned aerial vehicle model differs from the model of the unmanned aerial vehicle used for the partially completed part, and the second user operable option comprises: breakpoint execution or re-execution.
If the execution progress of the task is partially completed, the camera model used by the current unmanned aerial vehicle differs from that used for the partially completed task, but the unmanned aerial vehicle models are the same, displaying third prompt information and a third user operable option, wherein the third prompt information comprises: the current camera model differs from the camera model used for the partially completed part, so the collected data may be inconsistent under breakpoint execution; the third user operable option comprises: breakpoint execution or re-execution.
If the execution progress of the task is partially completed, and both the unmanned aerial vehicle model and the camera model of the current unmanned aerial vehicle differ from those used when the task was partially completed, displaying fourth prompt information and a fourth user operable option, wherein the fourth prompt information comprises: both the current unmanned aerial vehicle model and camera model differ from those used for the partially completed part, so the collected data may be inconsistent under breakpoint execution; the fourth user operable option comprises: breakpoint execution or re-execution.
It should be noted that, when either the unmanned aerial vehicle model or the camera model has changed from the one used for the completed part, breakpoint execution would leave the data collected for the completed part stored on the previously used unmanned aerial vehicle, while the data collected after the breakpoint is stored on the unmanned aerial vehicle selected by the user. The prompt information "breakpoint execution may result in inconsistency in the collected data" therefore also reminds the user to upload the data stored on the previously used unmanned aerial vehicle in time, so as to ensure the integrity of the data.
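The progress-dependent prompts above can be summarized in one dispatch function. The prompt strings are paraphrases of the source, the exact UI wording is an assumption, and the fall-through behavior for an identical drone and camera on a partially completed task is likewise assumed:

```python
def first_result(progress: int, same_drone: bool, same_camera: bool,
                 data_changed: bool):
    """Return (prompt, options) for a task already matched to the drone.

    progress is a percentage in [0, 100]; the branch structure follows the
    cases described for steps S601-S602, with paraphrased prompt text.
    """
    if progress == 0:
        return None, None  # not executed: no prompt, execute directly
    if progress == 100:
        if same_drone and same_camera and data_changed:
            return ("Detected a change in data under the task",
                    ["re-execute", "cancel"])
        return ("This task has been performed to completion",
                ["re-execute", "cancel"])
    # 0 < progress < 100: partially completed
    if not same_drone and same_camera:
        prompt = "The current drone model differs from the one used so far"
    elif same_drone and not same_camera:
        prompt = ("The current camera model differs from the one used so far; "
                  "data collected under breakpoint execution may be inconsistent")
    elif not same_drone and not same_camera:
        prompt = ("Both the drone and camera models differ from those used so "
                  "far; data collected under breakpoint execution may be "
                  "inconsistent")
    else:
        return None, None  # same drone and camera: resume without a prompt (assumption)
    return prompt, ["breakpoint execution", "re-execute"]
```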
Optionally, the method of the present application may further include: if the adaptation result is that the task is not matched with the unmanned aerial vehicle, displaying a second result, wherein the second result comprises: and prompting information.
When the task is judged to be not matched with the unmanned aerial vehicle, the displayed second result only comprises prompt information, namely, the user operable option is not included.
Continuing with fig. 7, optionally, if the adaptation result is that the task does not match the drone, displaying the second result may include: if the unmanned aerial vehicle model does not match the task while the camera model matches the task, displaying fifth prompt information, wherein the fifth prompt information comprises: the current drone does not support this task.
If the unmanned aerial vehicle model matches the task while the camera model does not match the task, displaying sixth prompt information, wherein the sixth prompt information comprises: the current camera does not support this task.
If the unmanned aerial vehicle model and the camera model are not matched with the task, displaying seventh prompt information, wherein the seventh prompt information comprises: neither current drones nor cameras support this task.
The cases in which the unmanned aerial vehicle does not match the task can generally be divided into three: the unmanned aerial vehicle model does not support the task, the camera model does not support the task, or neither the unmanned aerial vehicle model nor the camera model supports the task. The prompt information displayed in each case can differ. When the unmanned aerial vehicle model does not support the task, the displayed prompt information may include: the current unmanned aerial vehicle model does not support this task; when the camera model does not support the task, it may include: the current camera model does not support this task; when neither supports the task, it may include: the current unmanned aerial vehicle model and camera model do not support this task.
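Selecting among the fifth, sixth, and seventh prompts can be sketched as follows; the returned strings paraphrase the source text rather than reproduce exact UI wording:

```python
def mismatch_prompt(drone_matches: bool, camera_matches: bool) -> str:
    # Fifth, sixth, and seventh prompt information for the three unmatched
    # cases; the strings are paraphrases of the source.
    if not drone_matches and camera_matches:
        return "The current drone model does not support this task"
    if drone_matches and not camera_matches:
        return "The current camera model does not support this task"
    return "Neither the current drone model nor the camera model supports this task"
```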
In some embodiments, for the case where the drone does not match the task, the second result displayed may also include a user-actionable option, in which case the user-actionable option may be a confirmation option to confirm that the user knows the specific reason why the drone did not match the task.
Optionally, in step S502, in response to a confirmation operation of the user for the user-operable option input, controlling the drone to perform the task may include: in response to a confirmation operation input by the user for the user operable option, controlling the unmanned aerial vehicle to execute the task according to the selected execution mode, wherein the execution mode comprises the following steps: breakpoint execution or re-execution.
Optionally, in the case that the task matches the drone, the user may select a target execution mode by displayed user-operable options, wherein, as can be seen from the above analysis, the selectable execution modes may include: re-execution and breakpoint execution. And responding to the target execution mode input by the user, and controlling the unmanned aerial vehicle to execute the task according to the target execution mode.
Optionally, in step S201, the obtaining information of the task to be adapted and the unmanned aerial vehicle may include: and responding to the confirmation operation of the task and the unmanned aerial vehicle input by the user on the task execution interface, and acquiring the information of the task and the unmanned aerial vehicle to be adapted.
In an implementation manner, a user can select a task to be executed and a drone to be used in a task execution interface of the drone management application, and information of the task and the drone to be adapted can be acquired in response to confirmation operations of the user on the task and the drone. Of course, the specific implementation is not limited to this implementation.
Fig. 8 is a schematic flowchart of a task execution method of a drone according to an embodiment of the present application; fig. 9 is a schematic diagram of a task creation interface according to an embodiment of the present application. Optionally, in the foregoing step, before the task to be adapted and the information of the unmanned aerial vehicle are acquired in response to the confirmation operation of the task and the unmanned aerial vehicle input by the user on the task execution interface, the method of the present application may further include:
S801, in response to a task area drawn by the user in a target map displayed on the task creation interface, displaying a recommended task to the user according to the content contained in the task area.
In one implementation, in response to the user selecting a task type in the task creation interface, a target map may be displayed in the interface for the user to draw a mapping area on the map.
Alternatively, the user may manually map in the target map according to the needs of the task to be created, for example: and manually selecting a target area and determining a task area. In response to the task area drawn by the user, a task may be recommended to the user according to content contained in the task area.
In another implementation, the user can directly open the target map and perform mapping on the target map without first selecting the task type.
Optionally, according to the content included in the task area drawn by the user, the task recommended to the user may be, for example:
A) when 90% of the content in the task area is fruit trees, preferentially recommending fruit tree tasks;
B) when 90% of the content contained in the task area is a highway or a coastline (which can be added later), the voyage task is preferentially recommended;
C) when the content contained in the task area is a mountain land with large elevation fluctuation, an inclined task is recommended preferentially;
D) when the content contained in the task area is not fruit trees, not highways or coastlines, and not mountain land with large elevation fluctuation, the flat ground task is preferentially recommended.
That is, a preferred task may be recommended according to the types of objects included in the task area drawn by the user.
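Rules A)-D) can be sketched as a simple rule-based recommender. The dictionary keys, the elevation-fluctuation flag, and the output labels are assumptions for illustration:

```python
def recommend_task(fractions: dict, large_elevation_fluctuation: bool = False) -> str:
    """Recommend a task from the fractions (0.0-1.0) of object types inside
    the drawn task area, following rules A)-D) in priority order."""
    if fractions.get("fruit_tree", 0.0) >= 0.9:
        return "fruit tree task"                            # rule A
    if fractions.get("highway", 0.0) + fractions.get("coastline", 0.0) >= 0.9:
        return "voyage task"                                # rule B
    if large_elevation_fluctuation:
        return "inclined task"                              # rule C
    return "flat ground task"                               # rule D
```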
S802, at least one task is created in response to the recommended task confirmation operation input by the user.
In some embodiments, when the user approves the currently recommended task, the creation of the task may be completed in response to a confirmation operation of the recommended task input by the user.
In other embodiments, the task to be created may also be changed by the user himself or herself when the user thinks that there is a deviation in the recommended task.
Optionally, all tasks that may be involved may be created in advance by the above method, and when a user needs to execute a task, the user may log in the drone management application program first, and select a task to be executed from the created tasks displayed in the task execution interface.
Fig. 10 is a schematic flowchart of a task execution method of another unmanned aerial vehicle according to an embodiment of the present application; optionally, in step S202, determining an adaptation result between the task and the drone according to the performance parameter of the drone may include:
S1001, judging whether the unmanned aerial vehicle is connected to the remote controller.
Optionally, since the selected drone may be used to perform the task if and only if it is connected to the remote controller, before determining whether the task selected by the user matches the drone, it may first be determined whether the drone selected by the user is connected to the remote controller.
The remote controller may be a remote control device independent of the terminal device on which the unmanned aerial vehicle management application is installed; of course, the terminal device itself may also serve as the remote controller.
And S1002, if so, determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle.
Optionally, when it is determined that the drone is connected to the remote controller, a result of the adaptation of the drone to the task may be further determined.
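The connection check of Fig. 10 simply gates the adaptation judgment; a minimal sketch under the same simplified matching logic as above:

```python
def adapt_if_connected(connected: bool, drone_matches: bool,
                       camera_matches: bool) -> str:
    # S1001: the drone may be used only if it is connected to the remote
    # controller; only then is the adaptation result determined (S1002).
    if not connected:
        return "not connected"
    return "matched" if (drone_matches and camera_matches) else "not matched"
```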
Alternatively, the present application may take another realizable form: the user inputs only a task confirmation operation in the task execution interface, and an unmanned aerial vehicle corresponding to the task is automatically recommended to the user according to the predetermined support relationship between each unmanned aerial vehicle model and camera model and the tasks, so that that unmanned aerial vehicle is controlled to execute the task. The present application does not specifically limit this.
To sum up, the task execution method for the unmanned aerial vehicle provided by the embodiments of the application includes: acquiring information of a task to be adapted and an unmanned aerial vehicle, where the information of the unmanned aerial vehicle includes performance parameters of the drone; determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle, where the performance parameters include the unmanned aerial vehicle model and the camera model; and, if the adaptation result is that the task matches the unmanned aerial vehicle, controlling the unmanned aerial vehicle to execute the task. In this scheme, by performing intelligent matching judgment on the task and the unmanned aerial vehicle, on the one hand, the matching efficiency of the task and the unmanned aerial vehicle can be improved and the time consumed by matching reduced; on the other hand, through the adaptation logic, a comparatively accurate matching result can be provided for the user, thereby improving the accuracy of the task execution result.
In addition, matching judgment is carried out on the task selected by the user and the unmanned aerial vehicle, and prompt information corresponding to the adaptation result is displayed for the user, so that friendly unmanned aerial vehicle and task matching reminding can be provided for the user.
The following describes a device, a storage medium, and the like for executing the task execution method of the unmanned aerial vehicle provided by the present application, and the specific implementation process and technical effects thereof are referred to above, and are not described again below.
Fig. 11 is a schematic diagram of a task execution device of an unmanned aerial vehicle according to an embodiment of the present application, where functions implemented by the device correspond to steps executed by the foregoing method. The apparatus may be understood as the terminal, the server, or the processor of the server, or may be understood as a component which is independent of the server or the processor and implements the functions of the present application under the control of the server, and the apparatus may include: an acquisition module 110, a determination module 120, and an execution module 130;
an obtaining module 110, configured to obtain information of a task to be adapted and an unmanned aerial vehicle, where the information of the unmanned aerial vehicle includes: performance parameters of the drone;
a determining module 120, configured to determine an adaptation result of the task and the drone according to a performance parameter of the drone, where the performance parameter includes: unmanned aerial vehicle model and camera model;
and the execution module 130 is configured to control the unmanned aerial vehicle to execute the task if the adaptation result is that the task is matched with the unmanned aerial vehicle.
Optionally, the determining module 120 is specifically configured to determine whether the model of the unmanned aerial vehicle and the model of the camera are matched with the task; and determining an adaptation result of the task and the unmanned aerial vehicle according to the matching result.
Optionally, the determining module 120 is specifically configured to: if both the model of the unmanned aerial vehicle and the model of the camera match the task, determine that the adaptation result of the task and the unmanned aerial vehicle is: the task matches the unmanned aerial vehicle; and if the model of the unmanned aerial vehicle and/or the model of the camera do not match the task, determine that the adaptation result of the task and the unmanned aerial vehicle is: the task does not match the drone.
Optionally, the executing module 130 is specifically configured to display a first result if the adaptation result is that the task is matched with the unmanned aerial vehicle, where the first result includes: prompt information, user operable options; and controlling the unmanned aerial vehicle to execute the task in response to the confirmation operation input by the user for the user operable option.
Optionally, the execution module 130 is specifically configured to determine an execution progress of the task; and displaying a first result corresponding to the task according to the execution progress of the task.
Optionally, the executing module 130 is specifically configured to display first prompt information and a first user operable option if the execution progress of the task is completed, the model of the current unmanned aerial vehicle is the same as the model of the unmanned aerial vehicle and the model of the camera used when the task is completed, and data in the storage module of the current unmanned aerial vehicle has changed, where the first prompt information includes: detecting a change in data under the task, the first user-operable option comprising: re-executed or cancelled.
Optionally, the executing module 130 is specifically configured to, if the execution progress of the task is partially completed, the camera model used by the current unmanned aerial vehicle is the same as that used for the partially completed task, but the unmanned aerial vehicle models are different, display second prompt information and a second user operable option, wherein the second prompt information comprises: the current unmanned aerial vehicle model differs from the model of the unmanned aerial vehicle used for the partially completed part, and the second user operable option comprises: breakpoint execution or re-execution;
if the execution progress of the task is partially completed, the camera model used by the current unmanned aerial vehicle differs from that used for the partially completed task, but the unmanned aerial vehicle models are the same, display third prompt information and a third user operable option, wherein the third prompt information comprises: the current camera model differs from the camera model used for the partially completed part, so the collected data may be inconsistent under breakpoint execution; the third user operable option comprises: breakpoint execution or re-execution;
if the execution progress of the task is partially completed, and both the unmanned aerial vehicle model and the camera model of the current unmanned aerial vehicle differ from those used when the task was partially completed, display fourth prompt information and a fourth user operable option, wherein the fourth prompt information comprises: both the current unmanned aerial vehicle model and camera model differ from those used for the partially completed part, so the collected data may be inconsistent under breakpoint execution; the fourth user operable option comprises: breakpoint execution or re-execution.
Optionally, the executing module 130 is further configured to display a second result if the adaptation result is that the task does not match the drone, where the second result includes: and prompting information.
Optionally, the executing module 130 is specifically configured to display fifth prompt information if the model of the unmanned aerial vehicle is not matched with the task, and the model of the camera is matched with the task, where the fifth prompt information includes: the current unmanned aerial vehicle does not support the task;
if the unmanned aerial vehicle model matches with the task, and when the camera model does not match with the task, then display sixth prompt information, sixth prompt information includes: the current camera does not support the task;
if the unmanned aerial vehicle model and the camera model are not matched with the task, displaying seventh prompt information, wherein the seventh prompt information comprises: neither current drones nor cameras support this task.
Optionally, the executing module 130 is specifically configured to, in response to a confirmation operation input by the user for the user operable option, control the unmanned aerial vehicle to execute the task according to the selected executing mode, where the executing mode includes: breakpoint execution or re-execution.
Optionally, the obtaining module 110 is specifically configured to obtain information of the task to be adapted and the unmanned aerial vehicle in response to a confirmation operation of the task and the unmanned aerial vehicle input by the user on the task execution interface.
Optionally, the apparatus further comprises: a display module and a creation module;
the display module is used for responding to a task area drawn in a target map displayed on a task creation interface by a user and displaying a recommended task to the user according to the content contained in the task area;
and the creating module is used for creating at least one task in response to the recommended task confirmation operation input by the user.
Optionally, the determining module 120 is specifically configured to determine whether the drone is connected to the remote controller; if so, determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more digital signal processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. As another example, when one of the above modules is implemented in the form of a processing element invoking program code, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
The modules may be connected or in communication with each other via a wired or wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may comprise a connection over a LAN, WAN, bluetooth, ZigBee, NFC, or the like, or any combination thereof. Two or more modules may be combined into a single module, and any one module may be divided into two or more units. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to corresponding processes in the method embodiments, and are not described in detail in this application.
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application; the electronic device may be a terminal such as a computer with a data processing function.
The apparatus may include: a processor 801 and a memory 802.
The memory 802 is used for storing programs, and the processor 801 calls the programs stored in the memory 802 to execute the above-mentioned method embodiments. The specific implementation and technical effects are similar, and are not described herein again.
The memory 802 stores therein program code that, when executed by the processor 801, causes the processor 801 to perform various steps of the method for task execution of a drone according to various exemplary embodiments of the present application described in the section "exemplary methods" above in this specification.
The Processor 801 may be a general-purpose Processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware components, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present Application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
Memory 802, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The memory may include at least one type of storage medium, for example a flash memory, a hard disk, a multimedia card, a card-type memory, a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a magnetic memory, a magnetic disk, an optical disk, and so on. The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 802 in the embodiments of the present application may also be circuitry or any other device capable of performing a storage function, for storing program instructions and/or data.
Optionally, the present application also provides a program product, such as a computer readable storage medium, comprising a program which, when being executed by a processor, is adapted to carry out the above-mentioned method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer (which may be a personal computer, a server, or a network device) or a processor to execute some steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program code.

Claims (16)

1. A task execution method of an unmanned aerial vehicle is characterized by comprising the following steps:
acquiring information of a task to be adapted and an unmanned aerial vehicle, wherein the information of the unmanned aerial vehicle comprises: performance parameters of the drone;
determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle, wherein the performance parameters comprise: unmanned aerial vehicle model and camera model;
and if the adaptation result is that the task is matched with the unmanned aerial vehicle, controlling the unmanned aerial vehicle to execute the task.
2. The method of claim 1, wherein determining the result of the task adapting to the drone according to the performance parameters of the drone comprises:
judging whether the model of the unmanned aerial vehicle and the model of the camera are matched with the task;
and determining an adaptation result of the task and the unmanned aerial vehicle according to the matching result.
3. The method of claim 2, wherein determining the result of the task adapting to the drone according to the matching result comprises:
if the unmanned aerial vehicle model and the camera model both match the task, determining that the adaptation result of the task and the unmanned aerial vehicle is: the task matches the unmanned aerial vehicle;
and if the unmanned aerial vehicle model and/or the camera model does not match the task, determining that the adaptation result of the task and the unmanned aerial vehicle is: the task does not match the unmanned aerial vehicle.
4. The method of claim 1, wherein if the adaptation result is that the task matches the drone, controlling the drone to execute the task comprises:
if the adaptation result is that the task is matched with the unmanned aerial vehicle, displaying a first result, wherein the first result comprises: prompt information, user operable options;
controlling the drone to perform the task in response to a user confirmation operation input for the user-operable option.
5. The method of claim 4, wherein if the adaptation result is that the task matches the drone, displaying a first result comprises:
determining the execution progress of the task;
and displaying a first result corresponding to the task according to the execution progress of the task.
6. The method of claim 5, wherein displaying a first result corresponding to the task according to the execution progress of the task comprises:
if the execution progress of the task is completed, the unmanned aerial vehicle model and camera model of the current unmanned aerial vehicle are the same as those used when the task was completed, and data in a storage module of the current unmanned aerial vehicle has changed, displaying first prompt information and a first user-operable option, wherein the first prompt information includes: a data change is detected under the task, and the first user-operable option includes: re-execution or cancellation.
7. The method of claim 5, wherein displaying a first result corresponding to the task according to the execution progress of the task comprises:
if the execution progress of the task is partially completed, the camera model of the current unmanned aerial vehicle is the same as the camera model used when the task was partially completed, and the unmanned aerial vehicle models are different, displaying second prompt information and a second user-operable option, wherein the second prompt information includes: the current unmanned aerial vehicle model differs from the one used for the partially completed task, and the second user-operable option includes: breakpoint execution or re-execution;
if the execution progress of the task is partially completed, the camera model of the current unmanned aerial vehicle is different from the camera model used when the task was partially completed, and the unmanned aerial vehicle models are the same, displaying third prompt information and a third user-operable option, wherein the third prompt information includes: the current camera model differs from the one used for the partially completed task and breakpoint execution may cause inconsistency in the collected data, and the third user-operable option includes: breakpoint execution or re-execution;
and if the execution progress of the task is partially completed, and both the unmanned aerial vehicle model and the camera model of the current unmanned aerial vehicle are different from those used when the task was partially completed, displaying fourth prompt information and a fourth user-operable option, wherein the fourth prompt information includes: the current unmanned aerial vehicle model and camera model differ from those used for the partially completed task and breakpoint execution may cause inconsistency in the collected data, and the fourth user-operable option includes: breakpoint execution or re-execution.
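The progress-dependent prompts of claims 6 and 7 amount to a lookup keyed on the execution progress and on whether the drone and camera models match the earlier run. A hedged sketch (the labels and return shape are invented for illustration, not patent text):

```python
def first_result(progress: str, same_drone: bool, same_camera: bool,
                 data_changed: bool = False):
    """Choose the prompt and the user-operable options from the task's
    execution progress and the model comparison (claims 6-7 sketch)."""
    if progress == "completed" and same_drone and same_camera and data_changed:
        # first prompt information / first user-operable option
        return ("data changed under this task", ["re-execute", "cancel"])
    if progress == "partially completed":
        if same_camera and not same_drone:
            # second prompt / option
            return ("drone model differs from the partial run",
                    ["breakpoint", "re-execute"])
        if same_drone and not same_camera:
            # third prompt / option
            return ("camera model differs; breakpoint may yield inconsistent data",
                    ["breakpoint", "re-execute"])
        if not same_drone and not same_camera:
            # fourth prompt / option
            return ("drone and camera models differ; breakpoint may yield inconsistent data",
                    ["breakpoint", "re-execute"])
    return (None, [])  # no special prompt in the remaining cases
```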
8. The method of claim 4, further comprising:
if the adaptation result is that the task is not matched with the unmanned aerial vehicle, displaying a second result, wherein the second result includes: prompt information.
9. The method of claim 8, wherein if the adaptation result is that the task does not match the drone, displaying a second result comprises:
if the unmanned aerial vehicle model does not match the task and the camera model matches the task, displaying fifth prompt information, wherein the fifth prompt information includes: the current unmanned aerial vehicle does not support the task;
if the unmanned aerial vehicle model matches the task and the camera model does not match the task, displaying sixth prompt information, wherein the sixth prompt information includes: the current camera does not support the task;
and if neither the unmanned aerial vehicle model nor the camera model matches the task, displaying seventh prompt information, wherein the seventh prompt information includes: neither the current unmanned aerial vehicle nor the camera supports the task.
10. The method of claim 4, wherein said controlling the drone to perform the task in response to a confirmation operation by a user input for the user-operable option comprises:
in response to a confirmation operation of the user input for the user-operable option, controlling the drone to execute the task in the selected execution mode, the execution mode including: breakpoint execution or re-execution.
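Claim 10's two execution modes differ only in where the flight resumes. A minimal sketch, assuming the task stores its route as a waypoint list with a recorded breakpoint index (both the representation and the field names are assumptions):

```python
def execute_with_mode(task: dict, mode: str) -> list:
    """Return the segment of the route the drone still has to fly:
    from the recorded breakpoint for 'breakpoint' mode, or from the
    beginning for 're-execute' mode (claim 10 sketch)."""
    waypoints = task["waypoints"]
    start = task.get("breakpoint", 0) if mode == "breakpoint" else 0
    return waypoints[start:]
```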
11. The method of claim 1, wherein the obtaining information about the task to be adapted and the drone comprises:
and responding to the confirmation operation of the task and the unmanned aerial vehicle input by the user on the task execution interface, and acquiring the information of the task and the unmanned aerial vehicle to be adapted.
12. The method of claim 11, wherein before the obtaining the information of the task and the drone to be adapted in response to the confirmation operation of the task and the drone input by the user on the task execution interface, the method further comprises:
responding to a task area drawn in a target map displayed on a task creation interface by a user, and displaying a recommended task to the user according to the content contained in the task area;
and creating at least one task in response to the recommended task confirmation operation input by the user.
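Claim 12's recommendation step maps the content found inside the user-drawn task area to task types. The content-to-task catalog below is invented purely for illustration; the patent does not enumerate the mapping:

```python
def recommend_tasks(task_area_contents: list) -> list:
    """Sketch of the recommendation step: look up each recognized content
    type in the drawn task area and collect the matching task types."""
    catalog = {
        "farmland": "plant-protection spraying",  # hypothetical mapping
        "orchard": "orchard survey",
        "building": "3D mapping",
    }
    return [catalog[c] for c in task_area_contents if c in catalog]
```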
13. The method of claim 1, wherein determining the result of the task adapting to the drone according to the performance parameters of the drone comprises:
judging whether the unmanned aerial vehicle is connected to a remote controller;
and if so, determining an adaptation result of the task and the unmanned aerial vehicle according to the performance parameters of the unmanned aerial vehicle.
14. A task execution device of an unmanned aerial vehicle, comprising: the device comprises an acquisition module, a determination module and an execution module;
the acquisition module is used for acquiring the information of the task to be adapted and the unmanned aerial vehicle, and the information of the unmanned aerial vehicle comprises: performance parameters of the drone;
the determining module is configured to determine an adaptation result of the task and the unmanned aerial vehicle according to a performance parameter of the unmanned aerial vehicle, where the performance parameter includes: unmanned aerial vehicle model and camera model;
and the execution module is used for controlling the unmanned aerial vehicle to execute the task if the adaptation result is that the task is matched with the unmanned aerial vehicle.
15. An electronic device, comprising: a processor, a storage medium and a bus, the storage medium storing program instructions executable by the processor, the processor and the storage medium communicating via the bus when the electronic device is running, and the processor executing the program instructions to perform the steps of the method according to any one of claims 1 to 13.
16. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 13.
CN202110468809.7A 2021-04-28 2021-04-28 Task execution method and device for unmanned aerial vehicle, electronic equipment and storage medium Pending CN113095714A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110468809.7A CN113095714A (en) 2021-04-28 2021-04-28 Task execution method and device for unmanned aerial vehicle, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110468809.7A CN113095714A (en) 2021-04-28 2021-04-28 Task execution method and device for unmanned aerial vehicle, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113095714A true CN113095714A (en) 2021-07-09

Family

ID=76681406

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110468809.7A Pending CN113095714A (en) 2021-04-28 2021-04-28 Task execution method and device for unmanned aerial vehicle, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113095714A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105763620A (en) * 2016-04-01 2016-07-13 北京飞蝠科技有限公司 Method and system for matching unmanned aerial vehicle and pilot
CN110531788A (en) * 2019-09-24 2019-12-03 北京佰才邦技术有限公司 Cruise control method, device and the electronic equipment of unmanned plane
CN111443727A (en) * 2020-03-18 2020-07-24 东北农业大学 Flight control management system and method for multi-rotor unmanned aerial vehicle
CN111506115A (en) * 2020-05-27 2020-08-07 广州机械科学研究院有限公司 Unmanned aerial vehicle cluster regulation and control method and device
CN112468588A (en) * 2020-11-27 2021-03-09 广州极飞科技有限公司 Unmanned aerial vehicle parameter setting method and device, user terminal and storage medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116050813A (en) * 2023-03-31 2023-05-02 深圳市城市公共安全技术研究院有限公司 Control method and equipment for photovoltaic operation and maintenance system and readable storage medium
CN116050813B (en) * 2023-03-31 2023-06-06 深圳市城市公共安全技术研究院有限公司 Control method and equipment for photovoltaic operation and maintenance system and readable storage medium

Similar Documents

Publication Publication Date Title
US10120030B2 (en) Trace data recording system, trace data recording server, trace data recording method, and information storage medium
US10748354B2 (en) Communication method, device, and system for vehicle remote diagnosis
US10345769B2 (en) Method and apparatus for controlling smart home device
US20160371075A1 (en) Method for software updating of vehicle components
US20160364225A1 (en) Centralized system for software updating vehicle components
CN110796544A (en) Asset management wind control engine configuration method and device
CN104376739A (en) Parking space guide system and method
US10775846B2 (en) Electronic device for providing information related to smart watch and method for operating the same
CN113095714A (en) Task execution method and device for unmanned aerial vehicle, electronic equipment and storage medium
CN113108826A (en) Carbon emission metering system, platform and method
WO2017049893A1 (en) Application program testing method, testing apparatus, and mobile terminal
EP3610371A1 (en) Program release packages including program updates
JP4883314B2 (en) Data trace system using PLC
CN105096104A (en) Form operation authority control method, apparatus and office automation system
CN110457101B (en) Information processing method, electronic equipment and computer readable storage medium
CN109725785A (en) Task execution situation method for tracing, device, equipment and readable storage medium storing program for executing
CN111190618A (en) Flash method and device for Electronic Control Unit (ECU), diagnostic equipment and storage medium
CN111507852A (en) Method, device, medium and equipment for determining insurance scheme based on big data
CN112046784B (en) Unmanned aerial vehicle positioner endurance time analysis method and device and unmanned aerial vehicle positioner
KR102640393B1 (en) Apparatus for managing wages and method thereof
CN107203471B (en) Joint debugging method, service platform and computer storage medium
CN108174369B (en) Microphone starting method and device of early education equipment, early education equipment and storage medium
CN115062178A (en) Method and device for arranging inspection images of unmanned aerial vehicle equipment and computer equipment
CN109710339B (en) Information processing method and device
US20160217059A1 (en) Debug device, debug method, and debug program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210709