WO2021168793A1 - Operation planning method and apparatus, and storage medium - Google Patents


Info

Publication number
WO2021168793A1
WO2021168793A1 (PCT/CN2020/077193, CN2020077193W)
Authority
WO
WIPO (PCT)
Prior art keywords
work
area block
target sub-block
job
straight line
Prior art date
Application number
PCT/CN2020/077193
Other languages
English (en)
French (fr)
Inventor
赵力尧
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2020/077193
Publication of WO2021168793A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions

Definitions

  • This application relates to the technical field of unmanned aerial vehicles, and in particular to a method, device and storage medium for operation planning.
  • When the drone is flying in an operation area, it can work on the operation objects in that area; for example, it can spray pesticides on the fruit trees in a fruit-tree area.
  • When working on the operation objects in an operation area, the drone's operation process (for example, its flight route) must be planned in order to achieve automatic operation.
  • However, current operation planning for drones working on objects in an operation area is not intelligent enough: the drone generally flies along a flight route of fixed shape, so precise operation cannot be achieved.
  • In addition, some operation-planning algorithms have complex logic, require heavy computation, and consume substantial computing resources.
  • the embodiments of the present application provide a method, device, and storage medium for operation planning, which can realize the precise operation of the drone, and at the same time, the logic of operation planning is simple and avoids consuming a large amount of computing resources.
  • In a first aspect, an embodiment of the present application provides an operation planning method, which includes: obtaining the positions of the operation objects in an operation area block; dividing the operation area block into a plurality of sub-operation area blocks along a first straight-line direction; determining target sub-operation area blocks among the plurality of sub-operation area blocks according to the positions of the operation objects, where a target sub-operation area block is a sub-operation area block that contains an operation object; determining the operation order of the operation objects within each target sub-operation area block and the operation order between the target sub-operation area blocks; and, according to those orders, controlling the drone to work on the operation objects in the operation area block.
  • the operations include one or more of spraying operations, photographing operations, and substance collection operations.
  • In one embodiment, the method further includes: obtaining multiple frames of images collected when a surveying and mapping unmanned aerial vehicle flies over the operation area block; obtaining the positions of the operation objects in the operation area block then comprises obtaining those positions from the multiple frames of images.
  • In one embodiment, the method further includes: dividing the target area into a plurality of operation area blocks along a second straight-line direction; obtaining the multiple frames of images collected when the surveying and mapping unmanned aerial vehicle flies over the operation area block then comprises obtaining the multiple frames of images collected when it flies over the target area.
  • In one embodiment, the method further includes: determining the two operation objects that are farthest apart in the second straight-line direction in the next target sub-operation area block; determining the distances between the last-worked operation object in the previous target sub-operation area block and those two operation objects; and taking, of the two, the operation object with the smaller distance to the last-worked object as the first operation object of the next target sub-operation area block.
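  The selection rule above can be sketched in a few lines of Python. This is an illustrative sketch, not the patent's implementation: the use of Euclidean distance, the axis conventions (second straight-line direction taken as the x axis), and all names are assumptions.

```python
# Sketch: in the next target sub-block, take the two work objects farthest
# apart along the second straight-line direction (the extremes of the sweep),
# then start with whichever is closer to the last object worked in the
# previous sub-block.
import math

def first_object_of_next_block(last_worked, next_block_objects):
    xs = sorted(next_block_objects, key=lambda p: p[0])
    extremes = (xs[0], xs[-1])  # farthest pair along the second direction
    return min(extremes, key=lambda p: math.dist(p, last_worked))

last = (3.0, 1.0)
print(first_object_of_next_block(last, [(0.5, 2.2), (2.8, 2.4), (1.5, 2.3)]))
# -> (2.8, 2.4): the right-hand extreme is nearer to the last-worked object
```

  Starting each block at its nearer extreme keeps the transition leg between blocks short, which is the intuition behind the claim's distance comparison.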
  • In one embodiment, obtaining the positions of the operation objects in the operation area block comprises obtaining the positions of the operation objects in the target area. The method further includes: dividing the target area into a plurality of target area blocks along the second straight-line direction; determining, from the multiple target area blocks and according to the positions of the operation objects, the operation area blocks that contain operation objects; and determining the operation order between the operation area blocks, where each next operation area block determined by that order moves away from its previous operation area block in the second straight-line direction. Controlling the drone then comprises controlling it to work on the operation objects according to the operation order of the operation objects within each target sub-operation area block, the operation order between the target sub-operation area blocks, and the operation order between the operation area blocks.
  • In one embodiment, the method further includes: determining the two target sub-operation area blocks farthest apart in the first straight-line direction in the next operation area block; determining, in each of those two target sub-operation area blocks, the operation object closest to the previous operation area block in the second straight-line direction; and selecting one of those two operation objects as the first operation object of the next operation area block.
  • In one embodiment, selecting one of the two operation objects closest to the previous operation area block in the second straight-line direction as the first operation object of the next operation area block comprises: determining the last-worked operation object of the previous operation area block; determining the distances between that last-worked object and the two candidate operation objects; and selecting the candidate with the smaller distance as the first operation object of the next operation area block.
  • In one embodiment, the method further includes: determining the two target sub-operation area blocks farthest apart in the first straight-line direction among the target sub-operation area blocks; determining, in each of those two blocks, the two operation objects farthest apart in the second straight-line direction; determining the distance between the drone's reference position before the operation starts and each of those operation objects; and taking the operation object with the smallest distance as the first operation object of the operation area block.
  • In one embodiment, the method further includes: determining the two operation area blocks farthest apart in the second straight-line direction among the operation area blocks that contain operation objects; determining, in each of those two blocks, the two target sub-operation area blocks farthest apart in the first straight-line direction; determining, in each of those target sub-operation area blocks, the two operation objects farthest apart in the second straight-line direction; determining the distance between the drone's reference position before the operation starts and each of those operation objects; and taking the operation object with the smallest distance as the first operation object of the target area.
  • In one embodiment, the reference position is the drone's position when it is powered on, or its position when it receives the start-operation instruction sent by the control terminal.
  • an embodiment of the present application provides a work planning device, and the work planning device includes:
  • a memory, used to store a computer program; and
  • a processor, which calls the computer program to perform the following operations:
  • the target sub-work area block is a sub-work area block that includes the work object among the multiple sub-work area blocks;
  • the drone is controlled to perform operations on the operation objects in the operation area block.
  • In a third aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed, the operation planning method described above is implemented.
  • In the embodiments of the present application, the operation planning device divides the operation area block into a plurality of sub-operation area blocks along the first straight-line direction, determines the target sub-operation area blocks among them according to the obtained positions of the operation objects in the operation area block, and finally controls the drone to work on the operation objects in the operation area block according to the determined operation order of the operation objects within each target sub-operation area block and the operation order between the target sub-operation area blocks.
  • Because the determined operation orders together cover every operation object in the operation area block, precise operation on each operation object can be achieved; at the same time, the operation-planning logic is simple and avoids consuming large amounts of computing resources.
  • FIG. 1 is a schematic structural diagram of a semantic map provided by an embodiment of this application.
  • Figure 2a is a schematic diagram of an application scenario provided by an embodiment of the application.
  • Figure 2b is a schematic diagram of another application scenario provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of the architecture of a job planning system provided by an embodiment of the application.
  • FIG. 4 is a schematic flowchart of a job planning method provided by an embodiment of the application.
  • FIG. 5a is a schematic structural diagram of a work area block provided by an embodiment of this application.
  • FIG. 5b is a schematic structural diagram of a plurality of divided sub-work area blocks provided by an embodiment of the application.
  • FIG. 5c is a schematic structural diagram of another divided multiple sub-work area blocks provided by an embodiment of the application.
  • FIG. 5d is a schematic structural diagram of a determined target sub-work area block provided by an embodiment of this application.
  • FIG. 6 is a schematic flowchart of another operation planning method provided by an embodiment of the application.
  • FIG. 7 is a schematic flowchart of a method for determining the work object of the first work in the next target sub-work area block according to an embodiment of the application;
  • FIG. 8 is a schematic structural diagram of a job object of the first job in a determined next target sub-work area block provided by an embodiment of the application;
  • FIG. 9 is a schematic flowchart of another operation planning method provided by an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a structure of multiple target area blocks according to an embodiment of the present invention.
  • FIG. 11 is a schematic flowchart of a method for determining a job sequence between job area blocks according to an embodiment of the present invention
  • FIG. 12 is a schematic structural diagram of a work area block provided by an embodiment of the present invention.
  • FIG. 13 is a schematic flow chart of a method for determining the task object of the first task in a target area according to an embodiment of the present invention
  • FIG. 14 is a schematic structural diagram of a work area block provided by an embodiment of the present invention.
  • FIG. 15 is a schematic structural diagram of a job planning device provided by an embodiment of the present invention.
  • Semantic map A includes the job objects that need to be worked on.
  • The operation objects are represented by dots in Figure 1, namely operation object A, operation object B, operation object C, operation object D, operation object E, and operation object F.
  • The operation path planned for the UAV is path A (marked with arrows), which covers all operation areas in semantic map A but does not pass through every operation object. For example, flying along this path the drone only passes beside operation object A, so it cannot work on operation object A precisely. Therefore, when the UAV operates according to a path obtained by such a planning method, it cannot accurately work on every operation object.
  • the embodiment of the present application proposes a method of operation planning.
  • The operation planning device divides the operation area block into a plurality of sub-operation area blocks along a first straight-line direction, determines the target sub-operation area blocks among them according to the obtained positions of the operation objects, and finally controls the drone to work on the operation objects in the operation area block according to the determined operation order within each target sub-operation area block and the operation order between the target sub-operation area blocks.
  • This method enables the drone to reach every operation object in the operation area block, so precise operation on the operation objects can be achieved.
  • this method of job planning is simple in logic and avoids consuming a lot of computing resources.
  • FIG. 2a is a scene diagram of the spraying operation of the drone A on the sparse orchard A, where the spraying operation can be Including spraying pesticides or spraying water, etc.
  • Some sparsely planted fruit trees, fruit tree A through fruit tree H, grow in sparse orchard A.
  • The operation planning device obtains the position of each fruit tree in the operation area block corresponding to sparse orchard A (that is, the positions of the operation objects), divides the operation area block into multiple sub-operation area blocks, and determines the target sub-operation area blocks according to those positions.
  • It then controls UAV A to spray the fruit trees according to the determined operation order of the fruit trees within each target sub-operation area block and the operation order between the target sub-operation area blocks. Connecting the operation order gives the arrowed path in Figure 2a; the path passes through every operation object, so spraying the fruit trees in this order achieves precise spraying of each fruit tree in sparse orchard A.
  • Fig. 2b is a schematic diagram of drone A photographing target points on bridge A. The points of bridge A to be photographed are points A, B, and C, which are the operation objects drone A needs to shoot.
  • The operation planning device determines the operation order of the shooting points based on the semantic map of bridge A. The flight path of UAV A when photographing bridge A in that order is shown by the arrowed line in Figure 2b; the path passes through every shooting point, so the UAV can photograph bridge A precisely.
  • FIG. 3 is a schematic structural diagram of a job planning system according to an embodiment of the present invention.
  • the operation planning system 30 is composed of an operation planning device 301 and an unmanned aerial vehicle 302.
  • The operation planning device 301 may be a control terminal capable of controlling the drone, such as a mobile phone, a computer, or a remote-control device.
  • The operation planning device 301 can execute the operation planning method described in the embodiments of this application and control the drone 302 to work on the operation objects in the operation area block according to the operation order of the operation objects within each target sub-operation area block and the operation order between the target sub-operation area blocks.
  • Alternatively, the operation planning device 301 may send the operation order of the operation objects within each target sub-operation area block and the operation order between the target sub-operation area blocks to the UAV 302, and the UAV 302 works on the operation objects in the operation area block according to those orders.
  • Alternatively, the operation planning device 301 may be a device inside the UAV 302; the UAV 302 then executes the operation planning method described in the embodiments of this application and works on the operation objects in the operation area block according to the operation order within each target sub-operation area block and the operation order between the target sub-operation area blocks.
  • the embodiment of the present application takes the operation planning device 301 as the execution subject as an example for description.
  • the semantic map is a collection of the map and the label information of the objects on the map.
  • The surveying and mapping drone surveys and photographs a designated area (such as the operation area block described in the embodiments of this application), processes the captured pictures to obtain a 3D reconstruction model, and then uses machine-learning methods to identify and label the objects in the 3D model, thereby obtaining a semantic map.
  • an embodiment of the present invention proposes a job planning method as shown in FIG. 4, and the job planning method may include S401-S406:
  • the operation planning device obtains the position of the operation object in the operation area block.
  • the operation area block is an area block that requires drones to perform operations.
  • the operation area block is shown in Figure 5a.
  • the operation area block includes operation objects.
  • the operation objects are represented by dots in Figure 5a.
  • Obtaining the positions of the operation objects in the operation area block provides the precondition for the drone's subsequent precise operation.
  • The drone's operations include one or more of spraying operations, photographing operations, and substance collection operations, which are not limited here.
  • the operation planning device divides the operation area block into a plurality of sub-operation area blocks in the first linear direction.
  • the operation planning device divides the operation area block into a plurality of sub-operation area blocks in the vertical axis direction, and the sub-operation area blocks are as shown in Fig. 5b.
  • the operation planning device divides the operation area block into a plurality of sub-operation area blocks in the horizontal axis direction, and the sub-operation area blocks are as shown in Fig. 5c.
  • the initial position of the drone is the position when the drone is turned on, or the position when the drone receives the start operation instruction sent by the control terminal.
  • the operation planning device divides the operation area block into a plurality of sub-operation area blocks of equal height or equal width in the first straight line direction. Or, the operation planning device divides the operation area block into a plurality of sub-operation area blocks of unequal height or unequal width in the first linear direction, which is not limited here.
  • The sub-operation area blocks shown in Fig. 5b and Fig. 5c are of equal height and equal width, respectively.
  • the operation planning device divides the operation area block into a plurality of sub-operation area blocks in the first straight line direction, which can refine the operation area of the drone.
  • the work planning device determines a target sub-work area block among the multiple sub-work area blocks according to the position of the work object in the work area block.
  • the target sub-work area block is a sub-work area block including a work object among the plurality of sub-work area blocks. That is, the sub-work area block that does not include the job object is not the target sub-work area block.
  • The positions of the operation objects in the operation area block are shown by the dots in Figure 5d, including dot A, dot B, dot C, dot D, dot E, and dot F.
  • the operation planning device determines the target sub-operation area blocks including the operation object as: target sub-operation area block A, target sub-operation area block B, and target sub-operation area Block C, target sub-work area block D.
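  The division step (S402) and target-block selection step (S403) above can be sketched together. This is an illustrative sketch under assumed conventions (first straight-line direction taken as the y axis, equal-height strips, and hypothetical names), not the patent's reference implementation.

```python
# Sketch: divide a rectangular work area block into equal-height strips along
# the first straight-line direction (y axis) and keep only the strips that
# contain at least one work object -- the "target sub-work area blocks".

def target_sub_blocks(objects, y_min, y_max, num_strips):
    """objects: list of (x, y) work-object positions inside the block."""
    height = (y_max - y_min) / num_strips
    strips = {}  # strip index -> list of objects falling in that strip
    for x, y in objects:
        idx = min(int((y - y_min) / height), num_strips - 1)
        strips.setdefault(idx, []).append((x, y))
    # Strips absent from the dict contain no work object, so they are
    # not target sub-blocks and are simply skipped.
    return strips

objects = [(1.0, 0.5), (2.0, 0.7), (3.0, 2.5), (1.5, 3.8)]
blocks = target_sub_blocks(objects, y_min=0.0, y_max=4.0, num_strips=4)
print(sorted(blocks))  # -> [0, 2, 3]; strips 1 is empty, so not a target
```

  Skipping empty strips is what keeps the later ordering logic cheap: only blocks that actually contain work objects ever enter the route.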
  • S404 The operation planning device determines the operation sequence of the operation objects in the target sub-operation area block.
  • The operation order of the operation objects within each target sub-operation area block is determined such that each next operation object moves away from its previous operation object along a second straight-line direction perpendicular to the first straight-line direction.
  • By determining the operation order of the operation objects within each target sub-operation area block, the operation planning device ensures that every operation object in the block is assigned a place in the order.
  • For example, in Fig. 5d the operation planning device determines the operation order within target sub-operation area blocks A through D. In target sub-operation area block C, the first-worked object is dot D and the next-worked object is dot E; that is, dot D is the previous operation object and dot E is the next one. Working in this order covers both dot D and dot E.
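  The within-block ordering described above amounts to sorting the block's objects along the second straight-line direction. A minimal sketch, assuming the second direction is the x axis and using illustrative names:

```python
# Sketch: order the work objects inside one target sub-block so that each
# next object is farther from its predecessor along the second straight-line
# direction (x axis). A plain sort by x achieves this; `reverse` flips the
# sweep, which is useful for serpentine traversal across blocks.

def order_within_block(objects, reverse=False):
    return sorted(objects, key=lambda p: p[0], reverse=reverse)

block_c = [(2.0, 2.5), (0.5, 2.2)]          # e.g. dots D and E in Fig. 5d
print(order_within_block(block_c))           # sweep left to right
```

  Because the order is a monotone sweep, every object in the block is visited exactly once, matching the "covers both dot D and dot E" property above.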
  • the operation planning device determines the operation sequence between the target sub-operation area blocks.
  • After determining the operation order within each target sub-operation area block, the operation planning device also determines the operation order between the target sub-operation area blocks, where each next target sub-operation area block determined by that order moves away from its previous target sub-operation area block in the first straight-line direction. This ensures that every target sub-operation area block is assigned a place in the order.
  • the work planning device determines the work sequence among the target sub work area blocks as target sub work area block D, target sub work area block C, target sub work area block B, The target sub-work area block A, that is, the work planning device subsequently performs operations on the work objects in each target sub-work area block according to the determined work sequence between the target sub-work area blocks.
  • Target sub-operation area block C moves away from target sub-operation area block D in the first straight-line direction (the vertical-axis direction), target sub-operation area block B moves away from block C, and target sub-operation area block A moves away from block B.
  • By working on the target sub-operation area blocks in this order, the operation planning device covers every target sub-operation area block in the operation area block.
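  Combining the between-block order with the within-block sweep gives the full route. The sketch below alternates the sweep direction so each block starts near where the previous one ended; this alternation is an assumption consistent with the "nearest extreme" rule described elsewhere in this application, and all names are illustrative.

```python
# Sketch: visit target sub-blocks so each next block is farther along the
# first straight-line direction (increasing strip index here), sweeping the
# objects inside each block along the second direction and alternating the
# sweep direction between consecutive blocks (serpentine traversal).

def plan_route(blocks):
    """blocks: dict strip_index -> list of (x, y) work objects."""
    route = []
    for i, idx in enumerate(sorted(blocks)):
        sweep = sorted(blocks[idx], key=lambda p: p[0], reverse=(i % 2 == 1))
        route.extend(sweep)
    return route

blocks = {0: [(1.0, 0.5), (3.0, 0.6)], 2: [(2.5, 2.4), (0.5, 2.6)]}
print(plan_route(blocks))
```

  The route touches every object exactly once, which is what makes this kind of planning both precise and cheap compared with coverage paths of fixed shape.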
  • the operation planning device controls the drone to perform operations on the operation objects in the operation area block according to the operation sequence of the operation objects in the target sub-operation area block and the operation sequence between the target sub-operation area blocks.
  • After determining the operation order of the operation objects within each target sub-operation area block and the operation order between the target sub-operation area blocks, the operation planning device controls the drone to work on the operation objects in the operation area block according to those orders.
  • Together, these two orders cover every operation object in the operation area block; therefore, controlling the drone according to them achieves precise operation on the operation area block.
  • In the embodiments of the present application, the operation planning device obtains the positions of the operation objects in the operation area block, divides the block into a plurality of sub-operation area blocks along the first straight-line direction, determines the target sub-operation area blocks according to those positions, determines the operation order of the operation objects within each target sub-operation area block and the operation order between the target sub-operation area blocks, and finally controls the UAV to work on the operation objects accordingly.
  • This operation planning method lets the UAV reach every operation object in the operation area block, achieving precise operation; at the same time, the planning logic is simple and avoids consuming large amounts of computing resources.
  • FIG. 6 is a schematic flowchart of another operation planning method provided by an embodiment of the present application.
  • In this embodiment, the operation planning device obtains the multiple frames of images collected when the surveying and mapping UAV flies over the operation area block, and derives the positions of the operation objects in the operation area block from those images.
  • The operation planning device also determines the first operation object of the operation area block before determining the operation order of the operation objects within the target sub-operation area blocks.
  • the job planning methods shown in Figure 6 include but are not limited to S601-S611:
  • the operation planning device acquires multiple frames of images collected when the surveying and mapping unmanned aerial vehicle flies over the operation area block.
  • the surveying and mapping unmanned aerial vehicle is an unmanned aerial vehicle that surveys, maps, and photographs the operation area.
  • the surveying and mapping unmanned aerial vehicle flies above the work area block, it takes pictures of the work area block and collects multiple frames of images about the work area block.
  • The surveying and mapping unmanned aerial vehicle may be the same unmanned aerial vehicle that performs the planned operation, or it may be a different unmanned aerial vehicle.
  • After the surveying and mapping drone obtains the semantic map, it can send the semantic map, or the multi-frame images it contains, to the operation planning device, so that the operation planning device can obtain the positions of the operation objects in the operation area block.
  • the job planning device may acquire the multi-frame images, acquire point cloud information according to the multi-frame images, and acquire the position of the job object according to the point cloud.
  • For example, the point cloud may be input into a preset recognition model (for example, a neural network model), and the positions of the operation objects are obtained from the model's output.
  • the operation planning device obtains multiple frames of images collected by the surveying and mapping UAV flying over the operation area block.
  • each frame of the multi-frame image includes part or all of the information about the work area block, and the information includes the position information of the work object.
  • In one embodiment, the operation planning device divides the operation area into a plurality of operation area blocks along the second straight-line direction; obtaining the multi-frame images collected when the surveying and mapping unmanned aerial vehicle flies over the operation area block then comprises obtaining the multi-frame images collected when it flies over the operation area.
  • the work planning device acquires the position of the work object in the work area block according to the multi-frame images.
  • the job planning device obtains the position of the job object in the job area block according to each sparse point in the multi-frame image, that is, the position of each sparse point in the multi-frame image is the position of the job object.
  • the operation planning device divides the operation area block into a plurality of sub-operation area blocks in the first linear direction.
  • the operation planning device determines the target sub-work area block among the multiple sub-work area blocks according to the position of the work object in the work area block.
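The division into sub-work area blocks and the selection of target sub-work area blocks can be sketched as follows. This is a minimal illustration, not the patent's implementation: work-object positions are assumed to be `(x, y)` tuples, the first straight line direction is taken as the x axis, and `strip_width` is a hypothetical tuning parameter.

```python
from collections import defaultdict

def target_sub_blocks(objects, strip_width):
    """Bucket work objects into strips along x (the first straight line
    direction); a strip that contains at least one object is a
    'target sub-work area block'."""
    strips = defaultdict(list)
    for x, y in objects:
        strips[int(x // strip_width)].append((x, y))
    # only non-empty strips are returned, keyed by strip index
    return dict(strips)

blocks = target_sub_blocks([(0.5, 1.0), (0.7, 3.0), (2.4, 0.2)], strip_width=1.0)
# strips 0 and 2 contain objects; strip 1 is empty, so it is not a target block
```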
  • for steps S603-S604 in the embodiment of the present application, refer to the execution process of steps S402-S403 in the foregoing embodiment; details are not repeated in the embodiment of the present application.
  • the operation planning device determines, from the target sub-operation area blocks, the two target sub-operation area blocks that are farthest apart in the first straight line direction.
  • for example, the operation planning device determines that the two target sub-operation area blocks farthest apart in the first straight line direction among the four target sub-operation area blocks are target sub-operation area block A and target sub-operation area block D.
  • the operation planning device determines, in each of the two target sub-operation area blocks, the two operation objects that are farthest apart in the second straight line direction.
  • for example, the operation planning device determines that the two target sub-operation area blocks farthest apart in the first straight line direction in FIG. 5d are target sub-operation area block A and target sub-operation area block D, and that the two work objects farthest apart in the second straight line direction in block A are dot A and dot B.
  • target sub-work area block D contains only one work object, so the work planning device determines that the work object in block D is dot F.
  • the operation planning device determines the distance between the reference position of the unmanned aerial vehicle before starting operation and each of the two operation objects with the furthest distance in the second straight line direction in each target sub-operation area block.
  • the reference position of the drone before starting operation is the initial position of the drone, that is, the reference position is the position when the drone is turned on or the position when it receives the operation start instruction sent by the control terminal.
  • the reference position of the drone before the start of operation is obtained by the positioning of the drone's Global Positioning System (GPS).
  • the operation planning device determines the reference position of the drone before starting the operation; then, according to that reference position and the positions of the two operation objects farthest apart in the second straight line direction in each target sub-operation area block determined in S606, it computes the distance from the reference position to each of those operation objects in order to determine the first operation object.
  • for example, the operation planning device obtains via GPS that the reference position of the UAV before the operation is position A; according to position A and the positions of dot A, dot B and dot F in the operation area block, it determines the distances from the UAV at position A to dot A, dot B and dot F, shown in FIG. 5d as l1, l2 and l3.
  • the work planning device determines the work object with the smallest distance as the first work object of the work area block, and at the same time takes the target sub-work area block where that first work object is located as the first target sub-work area block.
  • for example, the work planning device determines that the work object with the smallest distance from the reference position is dot A, so dot A is taken as the first work object of the work area block, and target sub-work area block A where dot A is located is taken as the first target sub-work area block; that is, in the work area block, the drone first starts working at dot A in target sub-work area block A.
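The selection in S607-S608 amounts to a nearest-candidate search. A hedged sketch, assuming planar `(x, y)` coordinates and Euclidean distance (the patent does not fix a distance metric), with hypothetical coordinates standing in for dots A, B and F:

```python
import math

def first_work_object(reference, candidates):
    """Pick the candidate work object closest to the drone's reference
    position (its position at power-on, or on receiving the start-operation
    instruction from the control terminal)."""
    return min(candidates, key=lambda p: math.dist(reference, p))

candidates = [(0.0, 0.0), (0.0, 4.0), (6.0, 1.0)]  # stand-ins for dots A, B, F
start = (-1.0, 0.5)                                 # reference position
first = first_work_object(start, candidates)        # the nearest dot wins
```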
  • the operation planning device determines the operation sequence of the operation objects in the target sub-operation area block.
  • the operation planning device determines the operation sequence between the target sub-operation area blocks.
  • for steps S609-S610 in the embodiment of the present application, refer to the execution process of steps S404-S405 in the foregoing embodiment; details are not repeated in the embodiment of the present application.
  • the operation planning device may also execute the method shown in FIG. 7 after performing S610.
  • the embodiment shown in FIG. 7 mainly describes a method for the operation planning device to determine the first operation object in the next target sub-operation area block; the method includes the following steps:
  • the operation planning device determines the two operation objects with the furthest distance in the second straight line direction in the next target sub-operation area block.
  • after the operation planning device determines the operation sequence between the target sub-work area blocks, it can identify, among those blocks, the previous target sub-work area block and the next target sub-work area block.
  • the previous and the next target sub-operation area block are defined relative to each other: once the drone finishes the objects in the current target sub-operation area block, that block becomes the previous one, and the target sub-work area block to be worked after it is the next one. Therefore, after determining the operation sequence between the target sub-work area blocks, the work planning device can determine the two work objects that are farthest apart in the second straight line direction in the next target sub-work area block.
  • the work planning device determines that the two work objects farthest in the second straight line direction in the target sub-work area block B are dot B and dot C.
  • the operation planning device determines the distance between the operation object of the last operation and the two operation objects in the last target sub-operation area block.
  • the last target sub-work area block is the target sub-work area block A
  • the last job in the target sub-work area block A is dot A.
  • the distances between dot A and the two job objects, dot B and dot C, are shown as the straight lines l3 and l4 in FIG. 8.
  • the work planning device determines, of the two work objects, the one with the smaller distance to the last-worked object as the first work object in the next target sub-work area block.
  • for example, the operation planning device takes dot B as the first operation object in the next target sub-operation area block B; that is, after the drone finishes working on dot A in target sub-area block A, it first works on dot B in target sub-work area block B and then on dot C.
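The rule of S701-S703 can be sketched as below; the coordinates are hypothetical, the second straight line direction is taken as the y axis, and Euclidean distance is assumed:

```python
import math

def next_block_entry(last_object, next_block_objects):
    """Within the next target sub-work area block, take the two objects
    farthest apart along y (the second straight line direction) and enter
    at whichever lies closer to the object worked last."""
    by_y = sorted(next_block_objects, key=lambda p: p[1])
    ends = (by_y[0], by_y[-1])  # the two y-extreme objects
    return min(ends, key=lambda p: math.dist(last_object, p))

# stand-ins: dot A was worked last at (0, 0); dots B and C are in the next block
entry = next_block_entry((0.0, 0.0), [(1.0, 0.5), (1.0, 5.0)])
```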
  • the operation planning device controls the drone to operate the operation objects in the operation area block according to the operation sequence of the operation objects in the target sub-operation area block and the operation sequence between the target sub-operation area blocks.
  • for step S611 in the embodiment of the present application, refer to the execution process of step S406 in the foregoing embodiment; details are not repeated in the embodiment of the present application.
  • in summary, the operation planning device acquires the multi-frame images collected by the surveying and mapping UAV flying over the operation area block, obtains the positions of the operation objects in the block from those images, divides the block into multiple sub-work area blocks in the first straight line direction, and determines the target sub-work area blocks according to the object positions.
  • it then determines, from the target sub-work area blocks, the two blocks farthest apart in the first straight line direction, determines in each of them the two objects farthest apart in the second straight line direction, computes the distance from the drone's reference position before starting operation to each of those objects, and takes the object with the smallest distance as the first work object of the work area block.
  • finally, the operation planning device determines the operation sequence of the objects within each target sub-operation area block and the sequence between the blocks, and controls the drone accordingly. Because the drone visits every work object in the work area block according to these sequences, precise operation of the work area block is achieved.
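Taken together, the two sequences described above amount to a serpentine (boustrophedon) sweep. A compact sketch under the same assumptions as before (x is the first straight line direction, y the second); this is one plausible reading of the patent rather than its definitive implementation:

```python
def plan_route(sub_blocks):
    """Serpentine plan: visit target sub-work area blocks in x order, and
    within each block sweep the objects along y, reversing the sweep
    direction from block to block so each next object moves away from the
    previous one.
    sub_blocks: {strip_index: [(x, y), ...]} with only non-empty strips."""
    route, ascending = [], True
    for idx in sorted(sub_blocks):
        ordered = sorted(sub_blocks[idx], key=lambda p: p[1], reverse=not ascending)
        route.extend(ordered)
        ascending = not ascending
    return route

route = plan_route({0: [(0, 2), (0, 1)], 1: [(1, 0), (1, 3)]})
```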
  • FIG. 9 is a schematic flowchart of another job planning method provided by an embodiment of the present invention.
  • in the embodiment shown in FIG. 9, the work planning device first determines, from the multiple target area blocks, the work area blocks that include work objects, and then determines the work sequence of the work objects within each target sub-work area block, the work sequence between the target sub-work area blocks, and the work sequence between the work area blocks.
  • the job planning method shown in FIG. 9 includes, but is not limited to, steps S901-S910:
  • the operation planning device acquires multiple frames of images collected when the surveying and mapping unmanned aerial vehicle is flying over the target area.
  • the target area is the entire area where the drone needs to operate.
  • the target area is indicated in FIG. 10.
  • the operation planning device divides the target area into a plurality of target area blocks along the second straight line direction.
  • the job planning device divides the target area in the multi-frame images into multiple target area blocks in the horizontal axis direction.
  • the operation planning device obtains the position of the operation object in the target area block.
  • the work planning device determines a work area block including the work object from a plurality of target area blocks according to the position of the work object in the target area block.
  • the work planning device determines the target area block containing the work object as the work area block.
  • the work area block includes a work area block A, a work area block B, and a work area block C.
  • the operation planning device divides the operation area block into a plurality of sub-operation area blocks in the first linear direction.
  • the work planning device determines the target sub-work area block among the multiple sub-work area blocks according to the position of the work object in the work area block.
  • the operation planning device determines the operation sequence of the operation objects in the target sub-operation area block.
  • the operation planning device determines the operation sequence between the target sub-operation area blocks.
  • for steps S905-S908 in the embodiment of the present application, refer to the execution process of steps S402-S405 in the foregoing embodiment; details are not repeated in the embodiment of the present application.
  • the operation planning device determines the operation sequence between the operation area blocks.
  • the operation planning device determines the operation sequence between the operation area blocks, wherein, according to that sequence, each next operation area block lies farther from its previous operation area block in the second straight line direction.
  • the operation planning device determines the two target sub-operation areas farthest in the first straight line direction in the next operation area block.
  • for example, the work planning device determines that work area block A is the previous work area block and work area block B is the next work area block; it then determines that the two target sub-work areas in block B farthest apart in the first straight line direction are target sub-work area A and target sub-work area B.
  • the operation planning device determines from each target sub-operation area of the two target sub-operation areas a task object that is closest to the previous operation area block in the second straight line direction.
  • for example, the work planning device determines that, in target sub-work area A, the work object closest to the previous work area block A in the second straight line direction is work object B, that is, dot B; and that, in target sub-work area B, the work object closest to the previous work area block A in the second straight line direction is work object C, that is, dot C.
  • in S1103, the work planning device determines one of the two work objects closest to the previous work area block in the second straight line direction as the first work object in the next work area block.
  • specifically, the operation planning device determines the object worked last in the previous operation area block; it then determines the distance between that last-worked object and each of the two work objects closest to the previous work area block in the second straight line direction; finally, it determines the one of those two with the smaller distance to the last-worked object as the first work object in the next work area block.
  • for example, the job planning device determines that the object worked last in the previous job area block A is work object A; the distances from work object A to the two objects closest to block A in the second straight line direction, work object B and work object C, are shown as l5 and l6 in FIG. 12. Since the distance from work object B to work object A is smaller than that from work object C to work object A, work object B is determined as the first work object in the next work area block B; that is, after the drone finishes work object A in work area block A, it first works on work object B in work area block B.
  • the operation planning device controls the drone to work on the objects in the work area blocks according to the work sequence of the objects within each target sub-work area block, the work sequence between the target sub-work area blocks, and the work sequence between the work area blocks.
  • the method shown in FIG. 13 is a method for the operation planning device to determine the first operation object in the target area, and it includes the following steps:
  • the work planning device determines the two work area blocks with the farthest distance in the second straight line direction from the work area blocks including the work object.
  • for example, from work area block A, work area block B and work area block C, which include work objects, the work planning device determines that the two work area blocks farthest apart in the second straight line direction are work area block A and work area block B.
  • the operation planning device determines the two target sub-operation areas with the furthest distance in the first straight line direction from each operation area block of the two operation area blocks.
  • for example, the operation planning device determines from operation area block A that the two target sub-operation areas farthest apart in the first straight line direction are target sub-operation area A and target sub-operation area B, and determines from operation area block B that they are target sub-operation area C and target sub-operation area D.
  • the operation planning device determines the two operation objects that are the farthest in the second straight line direction in each target sub-operation area of the two target sub-operation areas corresponding to each operation area block.
  • for example, the work planning device determines that the two work objects farthest apart in the second straight line direction in target sub-work area A are work object A and work object B, and that in target sub-work area B they are work object C and work object D.
  • it is determined that target sub-work area D contains only one work object, so the sole candidate among the objects farthest apart in the second straight line direction there is work object G.
  • the operation planning device determines the distance between the reference position of the drone before starting operation and each of the two operation objects that are the furthest in the second straight line direction in each target sub-operation area.
  • the reference position is the position where the drone is when it is turned on or when it receives the start operation instruction sent by the control terminal.
  • for example, the position of the drone when it is turned on is position A, marked by the five-pointed star shown in FIG. 14; the operation planning device can therefore determine, based on this reference position of the drone before the operation starts, the distance between position A and each of the job objects identified in S1303.
  • the operation planning device determines the operation object with the smallest distance as the first operation object in the target area.
  • that is, among the two operation objects farthest apart in the second straight line direction in each target sub-operation area, the operation planning device determines the one closest to the drone's reference position before starting operation, and takes it as the first operation object in the target area; when starting the operation, the drone first works on that object and on the work area block where it is located.
  • for example, the work planning device determines work object A as the first work object in the target area; that is, the drone first works on work object A and on work area block A where work object A is located.
  • controlling the drone to work on the objects in the work area blocks according to the work sequence of the objects within each target sub-work area block, the sequence between the target sub-work area blocks, and the sequence between the work area blocks includes: the operation planning device controls the drone according to the first operation object in the target area together with those three sequences, thereby achieving precise operation on all the work objects in the multiple work area blocks.
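The multi-block method of FIG. 9 extends the same idea one level up. The sketch below is a simplified, hypothetical reading: objects are grouped into work area blocks by y band (second straight line direction) and into sub-blocks by x strip, and the strip order alternates between blocks so each next sub-block moves away from the previous one; the within-strip sweep direction is kept fixed here for brevity, whereas the patent also alternates it. `block_h` and `strip_w` are assumed parameters.

```python
from collections import defaultdict

def plan_target_area(objects, block_h, strip_w):
    """Two-level serpentine sketch: visit work area blocks in y order; within
    each block visit x strips, reversing the strip order from block to block;
    within each strip sweep the objects in ascending y (simplification)."""
    blocks = defaultdict(lambda: defaultdict(list))
    for x, y in objects:
        blocks[int(y // block_h)][int(x // strip_w)].append((x, y))
    route, x_ascending = [], True
    for bi in sorted(blocks):                          # block order along y
        for si in sorted(blocks[bi], reverse=not x_ascending):
            route.extend(sorted(blocks[bi][si], key=lambda p: p[1]))
        x_ascending = not x_ascending                  # serpentine over strips
    return route

route = plan_target_area([(0.2, 0.1), (1.5, 0.2), (0.3, 1.4)],
                         block_h=1.0, strip_w=1.0)
```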
  • the operation planning device determines the operation area block including the operation object from a plurality of target area blocks, and then determines the operation sequence of the operation object in the target sub-operation area block, and the operation sequence between the target sub-operation area blocks. The operation sequence between the operation sequence and the operation area block. Finally, the drone is controlled to perform operations on each operation object in each operation area block according to the above operation sequence, so as to realize the operation of each operation object in multiple operation area blocks. Accurate operation.
  • FIG. 15 is a schematic structural diagram of a job planning device provided by an embodiment of the present invention.
  • the job planning device 150 described in the embodiment of the present invention includes a processor 1501 and a memory 1502, and the processor 1501 and the memory 1502 are connected by one or more communication buses.
  • the above-mentioned processor 1501 may be a central processing unit (Central Processing Unit, CPU); the processor may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor, etc.
  • the processor 1501 is configured to support the job planning apparatus in executing the corresponding functions of the job planning device in the methods described in FIG. 4, FIG. 6, FIG. 7, FIG. 9 or FIG. 11.
  • the aforementioned memory 1502 may include a read-only memory and a random access memory, and provides computer programs and data to the processor 1501.
  • a part of the memory 1502 may also include a non-volatile random access memory.
  • the target sub-work area block is a sub-work area block that includes the work object among the multiple sub-work area blocks;
  • the drone is controlled to perform operations on the operation objects in the operation area block.
  • the operations include one or more of spraying operations, photographing operations, and substance collection operations.
  • the processor 1501 is further configured to divide the work area into a plurality of work area blocks along the second straight line direction; the processor 1501 is also configured to obtain the multi-frame images collected when the surveying and mapping unmanned aerial vehicle flies over the work area.
  • the processor 1501 is further configured to determine the two job objects that are the furthest away in the second straight line direction in the next target sub-work area block; the processor 1501 is further configured to determine the previous target sub-work area. The distance between the job object of the last job and the two job objects in the job area block; the processor 1501 is also used to determine the job object with the smaller distance between the two job objects and the job object of the last job as the next The job object of the first job in a target sub-work area block.
  • the processor 1501 is further configured to obtain the positions of the job objects in the target area blocks; to divide the target area into a plurality of target area blocks along the second straight line direction; to determine, according to the positions of the job objects in the target area blocks, the work area blocks including work objects from the multiple target area blocks; to determine the work sequence between the work area blocks, wherein each next work area block determined according to that sequence lies farther from its previous work area block in the second straight line direction; and to control the drone to work on the objects in the work area blocks according to the work sequence of the objects within the target sub-work area blocks, the sequence between the target sub-work area blocks, and the sequence between the work area blocks.
  • the processor 1501 is further configured to determine the two target sub-work areas farthest apart in the first straight line direction in the next work area block; to determine, in each of those two target sub-work areas, the work object closest to the previous work area block in the second straight line direction; and to determine one of those two work objects as the first work object in the next work area block.
  • the processor 1501 is also configured to determine the object worked last in the previous job area block; to determine the distance between that last-worked object and each of the two work objects closest to the previous work area block in the second straight line direction; and to determine, of those two, the one with the smaller distance to the last-worked object as the first work object in the next work area block.
  • the processor 1501 is further configured to determine, from the target sub-work area blocks, the two that are farthest apart in the first straight line direction; to determine, in each of those two blocks, the two work objects farthest apart in the second straight line direction; to determine the distance between the drone's reference position before starting work and each of those work objects; and to determine the object with the smallest distance as the first work object of the work area block.
  • the processor 1501 is further configured to determine, from the work area blocks that include work objects, the two that are farthest apart in the second straight line direction; to determine, in each of those two work area blocks, the two target sub-work areas farthest apart in the first straight line direction; to determine, in each target sub-work area corresponding to each work area block, the two work objects farthest apart in the second straight line direction; to determine the distance between the drone's reference position before starting operation and each of those work objects; and to determine the object with the smallest distance as the first work object of the target area.
  • the reference position is the position when the drone is turned on or the position when the start operation instruction sent by the control terminal is received.
  • the embodiment of the present application also provides a readable storage medium, and the readable storage medium stores a computer program.
  • when the computer program is executed by a processor, it can implement the job planning method described in the embodiments of the present application corresponding to FIG. 4, FIG. 6, FIG. 7, FIG. 9 or FIG. 11, which will not be repeated here.
  • the computer-readable storage medium may be an internal storage unit of the job planning apparatus described in any of the foregoing embodiments, such as a hard disk or memory of a device.
  • the computer-readable storage medium may also be an external storage device of the image processing device, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash card, etc. equipped on the device.
  • the computer-readable storage medium may also include both an internal storage unit of the image processing device and an external storage device.
  • the computer-readable storage medium is used to store the computer program and other programs and data required by the image processing device.
  • the computer-readable storage medium can also be used to temporarily store data that has been output or will be output.
  • the program can be stored in a readable storage medium; when executed, it may include the procedures of the above-mentioned method embodiments.
  • the storage medium can be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A job planning method, apparatus (301) and storage medium. The job planning method includes: a job planning apparatus acquires the positions of the work objects in a work area block; divides the work area block into a plurality of sub-work area blocks in a first straight line direction; determines, according to the positions of the work objects in the work area block, the target sub-work area blocks among the plurality of sub-work area blocks; determines the work sequence of the work objects within each target sub-work area block; determines the work sequence between the target sub-work area blocks; and, according to the work sequence of the work objects within the target sub-work area blocks and the work sequence between the target sub-work area blocks, controls an unmanned aerial vehicle to work on the work objects in the work area block. With the embodiments of the present invention, precise operation on each work object in the work area block can be achieved, while the planning logic is simple and avoids consuming large amounts of computing resources.

Description

Job Planning Method, Apparatus and Storage Medium
Technical Field
This application relates to the technical field of unmanned aerial vehicles, and in particular to a job planning method, apparatus and storage medium.
Background
While flying in a work area, an unmanned aerial vehicle (drone) can perform operations on the work objects in that area; for example, a drone can spray pesticides on the fruit trees in a fruit tree area. When working on the objects in a work area, the drone's work process (for example, its flight route) must be planned so that the work can be carried out automatically. However, current planning of drone operations on the objects in a work area is not intelligent enough: the drone generally flies along a flight route of fixed shape, so precise operation cannot be achieved. In addition, in the prior art, some job planning algorithms have complex logic, require a large amount of computation, and consume substantial computing resources.
Therefore, how to achieve both precise operation and easily implemented planning in drone job planning has become an urgent problem to be solved.
Summary of the Invention
The embodiments of the present application provide a job planning method, apparatus and storage medium, which can achieve precise drone operation while keeping the planning logic simple and avoiding the consumption of large amounts of computing resources.
In a first aspect, an embodiment of the present application provides a job planning method, the method including:
acquiring the positions of the work objects in a work area block;
dividing the work area block into a plurality of sub-work area blocks in a first straight line direction;
determining, according to the positions of the work objects in the work area block, the target sub-work area blocks among the plurality of sub-work area blocks, wherein a target sub-work area block is a sub-work area block, among the plurality, that includes a work object;
determining the work sequence of the work objects within each target sub-work area block, wherein each next work object determined according to that sequence lies farther from its previous work object in a second straight line direction perpendicular to the first straight line direction;
determining the work sequence between the target sub-work area blocks, wherein each next target sub-work area block determined according to that sequence lies farther from its previous target sub-work area block in the first straight line direction;
controlling the drone to work on the objects in the work area block according to the work sequence of the work objects within the target sub-work area blocks and the work sequence between the target sub-work area blocks.
在一种实现方式中,作业包括喷洒作业、拍摄作业和物质采集作业中的一种或多种。
在一种实现方式中,所述方法还包括:获取测绘无人飞行器在作业区域块上方飞行时采集到的多帧图像;获取作业区域块中作业对象的位置的具体实施方式为:根据多帧图像获取作业区域块中作业对象的位置。
在一种实现方式中,所述方法还包括:将目标区域沿第二直线方向划分成多个作业区域块;获取测绘无人飞行器在作业区域块上方飞行时采集到的多帧图像的具体实现方式为:获取测绘无人飞行器在作业区域上方飞行时采集到的多帧图像。
在一种实现方式中,所述方法还包括:确定下一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;确定上一个目标子作业区域块中最后作业的作业对象与两个作业对象之间的距离;将两个作业对象中与最后作业的作业对象之间的距离较小的作业对象确定为下一个目标子作业区域块中最先作业的作业对象。
在一种实现方式中,获取作业区域块中作业对象的位置的具体实现方式为:获取目标区域块中作业对象的位置;所述方法还包括:将目标区域沿第二直线方向划分成多个目标区域块;根据目标区域块中作业对象的位置从多个目标区域块中确定包括作业对象的作业区域块;确定在作业区域块之间的作业顺序,其中,按照作业区域块之间的作业顺序确定每个下一个作业区域块在第二直线方向上远离其上一个作业区域块;根据目标子作业区域块中作业对象的作业顺序和在目标子作业区域块之间的作业顺序控制无人机对作业区域块中的作业对象进行作业的具体实现方式为:根据目标子作业区域块中作业对象的作业顺序、目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对作业区域块中的作业对象进行作业。
在一种实现方式中,所述方法还包括:确定下一个作业区域块中在第一直线方向上最远的两个目标子作业区域;从两个目标子作业区域中每一个目标子作业区域中确定一个在第二直线方向上最靠近上一个作业区域块的作业对象;从两个在第二直线方向上最靠近上一个作业区域块的作业对象中,确定一个作业对象,作为下一个作业区域块中最先作业的作业对象。
在一种实现方式中,从两个在第二直线方向上最靠近上一个作业区域块的作业对象中确定一个作业对象,作为下一个作业区域块中最先作业的作业对象的具体实现方式为:确定上一个作业区域块中最后作业的作业对象;确定最后作业的作业对象与两个在第二直线方向上最靠近上一个作业区域块的作业对象之间的距离;将两个在第二直线方向上最靠近上一个作业区域块的作业对象中与最后作业的作业对象距离较小的在第二直线方向上最靠近上一个作业区域块的作业对象,确定为下一个作业区域块中最先作业的作业对象。
在一种实现方式中,所述方法还包括:从目标子作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域块;确定两个目标子作业区域块中每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;确定无人机开始作业前的参考位置与每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;将距离最小的作业对象确定为作业区域块的最先作业的作业对象。
在一种实现方式中,所述方法还包括:从包括作业对象的作业区域块中确定在第二直线方向上距离最远的两个作业区域块;从两个作业区域块的每一个作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域;确定每一个作业区域块对应的两个目标子作业区域中每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象;确定无人机开始作业前的参考位置与每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;将距离最小的作业对象确定为目标区域的最先作业的作业对象。
在一种实现方式中,参考位置为无人机开机时所处的位置或接收到控制终端发送的开始作业指令时所处的位置。
第二方面,本申请实施例提供了一种作业规划装置,所述作业规划装置包括:
存储器,用于存储计算机程序;
处理器,调用计算机程序,用于执行以下操作:
获取作业区域块中作业对象的位置;
在第一直线方向将作业区域块划分为多个子作业区域块;
根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块,其中,目标子作业区域块为多个子作业区域块中包括作业对象的子作业区域块;
确定目标子作业区域块中作业对象的作业顺序,其中,按照作业对象的作业顺序确定的每个下一个作业对象在与第一直线方向垂直的第二直线方向上远离其上一个作业对象;
确定在目标子作业区域块之间的作业顺序,其中,按照目标子作业区域块之间的作业顺序确定的每个下一个目标子作业区域块在第一直线方向上远离其上一个目标子作业区域块;
根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的作业对象进行作业。
第三方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质上存储有计算机程序,所述计算机程序在被执行时,实现如第一方面本申请实施例所述的作业规划方法。
本申请实施例中,作业规划装置在第一直线方向将作业区域块划分为多个子作业区域块,然后根据获取到的作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块,最后根据确定的目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的作业对象进行作业。通过本申请实施例,作业规划装置可以根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的作业对象进行作业,由于确定的目标子作业区域块中的作业对象的作业顺序和目标子作业区域块之间的作业顺序包括了作业区域块中的各个作业对象,因此可实现对作业区域块中各个作业对象的精准作业,同时作业规划的逻辑简单,避免消耗大量的计算资源。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本申请实施例提供的一种语义地图的结构示意图;
图2a为本申请实施例提供的一种应用场景示意图;
图2b为本申请实施例提供的另一种应用场景示意图;
图3为本申请实施例提供的一种作业规划系统的架构示意图;
图4为本申请实施例提供的一种作业规划方法的流程示意图;
图5a为本申请实施例提供的一种作业区域块的结构示意图;
图5b为本申请实施例提供的一种划分的多个子作业区域块的结构示意图;
图5c为本申请实施例提供的另一种划分的多个子作业区域块的结构示意图;
图5d为本申请实施例提供的一种确定的目标子作业区域块的结构示意图;
图6为本申请实施例提供的另一种作业规划方法的流程示意图;
图7为本申请实施例提供的一种确定下一个目标子作业区域块中最先作业的作业对象的方法的流程示意图;
图8为本申请实施例提供的一种确定的下一个目标子作业区域块中最先作业的作业对象的结构示意图;
图9为本发明实施例提供的另一种作业规划方法的流程示意图;
图10为本发明实施例提供的一种多个目标区域块的结构示意图;
图11为本发明实施例提供的一种确定在作业区域块之间的作业顺序的方法流程示意图;
图12为本发明实施例提供的一种作业区域块的结构示意图;
图13为本发明实施例提供的一种确定目标区域的最先作业的作业对象的方法流程示意图;
图14为本发明实施例提供的一种作业区域块的结构示意图;
图15为本发明实施例提供的一种作业规划装置的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例进行阐述。
目前,无人机一般采用对整片区域全覆盖的作业规划方法进行作业,该方法规划出的路径无法精准的遍历作业区域中的作业对象,导致无人机在进行作业时可能从作业对象旁边飞过,从而错过对作业对象的作业。比如,语义地图A如图1所示,语义地图A包括了需要作业的作业对象,作业对象用图1中的圆点表示,即作业对象A、作业对象B、作业对象C、作业对象D、作业对象E以及作业对象F,无人机作业规划的作业路径为带有箭头的路径A,该路径覆盖了语义地图A中的所有作业区域,但未将各个作业对象包含在此路径中,比如,无人机按照此路径进行飞行时,从作业对象A的旁边飞过,从而不能对作业对象A进行精准作业。因此,当无人机按照该作业规划方法得到的作业路径进行作业时,不能精准地对每个作业对象进行作业。
本申请实施例提出了一种作业规划方法,作业规划装置在第一直线方向将作业区域块划分为多个子作业区域块,然后根据获取到的作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块,最后根据确定的目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的作业对象进行作业。该方法可使得无人机对作业区域块中的每个作业对象都进行作业,因此可实现对作业区域块中作业对象的精准作业。同时,这种作业规划的方法逻辑简单,避免消耗大量的计算资源。
在一种实现方式中,本申请实施例提出的作业规划方法可以应用于图2a所示的场景中,图2a为无人机A对稀疏果树园A喷洒作业的场景图,其中,喷洒作业可以包括喷洒农药或喷洒水等。稀疏果树园A中种植了一些稀疏的果树,包括果树A-果树H,作业规划装置获取稀疏果树园A对应的作业区域块中各个果树的位置,即获取作业区域块中作业对象的位置,并将作业区域块划分为多个子作业区域块,根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块,最后根据确定的目标子作业区域块中果树的作业顺序和目标子作业区域块之间的作业顺序,控制无人机A对作业区域块中的果树进行作业。将该作业规划方法确定的作业顺序连接起来,得到无人机的路径为图2a中带箭头的路径,可以看出该路径中包括了各个作业对象,因此无人机按照该方法确定的作业顺序对各个果树进行喷洒作业,可实现对稀疏果树园A中每个果树的精准喷洒作业。
在一种实现方式中,本申请实施例提出的作业规划方法可以应用于图2b所示的场景中,图2b为无人机A对桥梁A进行目标点的拍照任务示意图,桥梁A中需要拍摄的点为A点、B点以及C点,A点、B点以及C点即为无人机A需要拍摄作业的作业对象,作业规划装置根据关于桥梁A的语义地图,确定出关于桥梁A的语义地图中的每个作业对象的作业顺序,最后按照此作业顺序控制无人机A对关于桥梁A的语义地图中的作业对象进行作业。无人机A在按照该作业顺序对桥梁A进行拍摄作业时的飞行路径如图2b中带有箭头的直线所示,该飞行路径包括每个拍摄点,使得无人机能精准的对桥梁A中的A点、B点以及C点进行拍摄作业。
为了更好的理解本发明实施例公开的一种作业规划方法、装置及存储介质,下面首先对本发明实施例适用的作业规划系统的架构进行描述。
请参见图3,图3是本发明实施例提供的一种作业规划系统的架构示意图。如图3所示,该作业规划系统30由作业规划装置301和无人机302组成。其中,作业规划装置301可以是手机、电脑、遥控设备等中的一个或多个能够控制无人机的控制终端。作业规划装置301可执行本申请实施例所述的作业规划方法,根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机302对作业区域块中的作业对象进行作业。可选的,作业规划装置301可将目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序发送给无人机302,由无人机302根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,对作业区域块中的作业对象进行作业。
可选的,作业规划装置301可以为无人机302中的装置,无人机302可执行本申请实施例所述的作业规划方法,根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,对作业区域块中的作业对象进行作业。其中,本申请实施例以作业规划装置301作为执行主体为例进行阐述。
本申请实施例中,语义地图是地图和地图上物体的标注信息的合集。测绘无人机可对指定区域(如本申请实施例中所述的作业区域块)进行测绘拍照,并对测绘所得照片进行处理,获得三维重建模型,再用机器学习的方法对该三维重建模型里的物体进行识别和标注,从而获得语义地图。
基于上述描述,本发明实施例提出一种如图4所示的作业规划方法,该作业规划方法可以包括S401-S406:
S401:作业规划装置获取作业区域块中作业对象的位置。
作业区域块为需要无人机进行作业的区域块,例如,作业区域块如图5a所示,作业区域块中包括作业对象,作业对象如图5a中各圆点表示,作业规划装置获取作业区域块中作业对象的位置,可为后续无人机的精准作业提供前提条件。
在一种实现方式中,无人机的作业包括喷洒作业、拍摄作业和物质采集作业中的一种或多种,在此不做限定。
S402:作业规划装置在第一直线方向将作业区域块划分为多个子作业区域块。
在一种实现方式中,若无人机的初始位置位于作业区域块的左方或者右方位置,如图5b所示,无人机A位于作业区域块的A处或B处,则第一直线方向为竖轴方向,因此作业规划装置在竖轴方向将作业区域块划分为多个子作业区域块,子作业区域块如图5b中所示。
在一种实现方式中,若无人机的初始位置位于作业区域块的上方或者下方位置,如图5c所示,无人机A位于作业区域块的C处或D处,则第一直线方向为横轴方向,因此作业规划装置在横轴方向将作业区域块划分为多个子作业区域块,子作业区域块如图5c中所示。
在一种实现方式中,无人机的初始位置为无人机开机时所处的位置,或为无人机接收到控制终端发送的开始作业指令时所处的位置。
作业规划装置在第一直线方向上将作业区域块划分为多个等高或等宽的子作业区域块。或者,作业规划装置在第一直线方向上将作业区域块划分为多个不等高或不等宽的子作业区域块,在此不做限定。图5b/图5c所示的子作业区域块为等高/等宽的子作业区域块。
作业规划装置在第一直线方向将作业区域块划分为多个子作业区域块,可细化无人机的作业区域。
S403:作业规划装置根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块。
作业规划装置根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块。目标子作业区域块为多个子作业区域块中包括作业对象的子作业区域块。也就是说,不包括作业对象的子作业区域块不为目标子作业区域块。
例如,在图5d中,作业区域块中作业对象的位置为如图5d中的圆点所示,包括圆点A、圆点B、圆点C、圆点D、圆点E和圆点F,作业规划装置根据作业区域块中圆点A-圆点E的位置,确定包括作业对象的目标子作业区域块分别为:目标子作业区域块A、目标子作业区域块B、目标子作业区域块C、目标子作业区域块D。
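S402-S403 的划分与筛选逻辑可用如下 Python 片段示意。其中假设第一直线方向为横轴 x,条带宽度 strip_w、函数名与坐标形式均为本文之外的示意性假设,并非本申请限定的实现:

```python
# 示意:按第一直线方向(假设为 x 轴)将作业区域块划分为等宽条带,
# 并仅保留包含作业对象的条带,即目标子作业区域块。
def find_target_strips(objects, x_min, strip_w):
    """objects: [(x, y), ...];返回 {条带序号: 条带内作业对象列表},只含非空条带。"""
    strips = {}
    for x, y in objects:
        idx = int((x - x_min) // strip_w)      # 作业对象落入的条带序号
        strips.setdefault(idx, []).append((x, y))
    return strips                              # 不含作业对象的条带不会出现在结果中

objs = [(0.5, 1.0), (0.7, 3.0), (2.2, 0.5), (4.8, 2.0)]
targets = find_target_strips(objs, x_min=0.0, strip_w=1.0)
# 条带 1、3 不含作业对象,因而不是目标子作业区域块
```

按对象位置一次遍历即可完成划分与筛选,这也体现了本申请"逻辑简单、计算量小"的特点。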
S404:作业规划装置确定目标子作业区域块中作业对象的作业顺序。
作业规划装置确定出目标子作业区域块后,确定目标子作业区域块中作业对象的作业顺序。其中,按照作业对象的作业顺序确定的每个下一个作业对象在与第一直线方向垂直的第二直线方向上远离其上一个作业对象。作业规划装置通过确定目标子作业区域块中作业对象的作业顺序的方式,可以保证目标子作业区域中每个作业对象的作业顺序都被确定。
例如,在图5d所示的作业区域块中,作业规划装置确定目标子作业区域块A-D中作业对象的作业顺序,比如作业规划装置确定目标子作业区域块C中先作业的作业对象为圆点D,再作业的作业对象为圆点E,即圆点D为上一个作业对象,圆点E为下一个作业对象。作业规划装置确定的在目标子作业区域块C中作业对象的作业顺序中,下一个作业对象圆点E在与第一直线方向上垂直的第二直线方向上远离其上一个作业对象,即远离圆点D,该目标子作业区域块中作业对象的作业顺序实现了对作业对象圆点D和圆点E的全部作业。
S405:作业规划装置确定在目标子作业区域块之间的作业顺序。
作业规划装置在确定目标子作业区域块中作业对象的作业顺序之后,还需要确定在目标子作业区域块之间的作业顺序,其中,按照目标子作业区域块之间的作业顺序确定的每个下一个目标子作业区域块在第一直线方向上远离其上一个目标子作业区域块,以保证各个目标子作业区域块的作业顺序都被确定。
例如,在图5d所示的作业区域块中,作业规划装置确定在目标子作业区域块之间的作业顺序为目标子作业区域块D、目标子作业区域块C、目标子作业区域块B、目标子作业区域块A,即作业规划装置在后续按照确定的所述目标子作业区域块之间的作业顺序对每个目标子作业区域块中的作业对象进行作业。其中,目标子作业区域块C在第一直线方向(竖轴方向)上远离目标子作业区域块D,目标子作业区域块B在第一直线方向上远离目标子作业区域块C,目标子作业区域块A在第一直线方向上远离目标子作业区域块B,作业规划装置按照该作业顺序对每个目标子作业区域块进行作业,可实现对作业区域块中每个目标子作业区域块的全部作业。
S406:作业规划装置根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的作业对象进行作业。
作业规划装置在确定目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序后,根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的对象进行作业。目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序包括了作业区域块中所有作业对象的作业顺序,因此作业规划装置根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的对象进行作业,可实现对作业区域块的精准作业。
在本申请实施例中,作业规划装置获取作业区域块中作业对象的位置,在第一直线方向将作业区域块划分为多个子作业区域块,并根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块,然后确定目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,最后作业规划装置根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的对象进行作业。该作业规划方法使得无人机对作业区域块中的每个作业对象都进行了作业,实现了对作业区域块中作业对象的精准作业。同时,这种作业规划的方法逻辑简单,避免消耗大量的计算资源。
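将 S402-S406 串联起来,图4 的整体规划可概括为一个"蛇形"遍历:条带间沿第一直线方向排序,条带内沿第二直线方向排序并在相邻条带间交替升/降序。下面给出一个极简 Python 示意(坐标轴约定、条带宽度与条带内起始方向均为假设,并非本申请限定的实现):

```python
# 示意:条带间按第一直线方向(x)排序,条带内按第二直线方向(y)排序并交替升/降序,
# 使每个下一个作业对象沿 y 远离其上一个作业对象,且相邻条带间的衔接距离较短。
def plan_route(objects, strip_w):
    strips = {}
    for x, y in objects:
        strips.setdefault(int(x // strip_w), []).append((x, y))
    route, ascending = [], True
    for idx in sorted(strips):                     # 目标子作业区域块之间的作业顺序
        row = sorted(strips[idx], key=lambda p: p[1], reverse=not ascending)
        route.extend(row)                          # 条带内作业对象的作业顺序
        ascending = not ascending                  # 蛇形换向
    return route

route = plan_route([(0.2, 0.0), (0.4, 2.0), (1.5, 1.0), (1.6, 3.0)], strip_w=1.0)
# 先升序作业第 0 条带,再降序作业第 1 条带,路径经过每个作业对象
```

该路径覆盖每个作业对象本身而非整片区域,与图1所示全覆盖式规划的区别正在于此。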
请参见图6,图6是本申请实施例提供的另一种作业规划方法的流程示意图,图6所示实施例与图4所示实施例的不同点在于:作业规划装置在第一直线方向将作业区域块划分为多个子作业区域块之前,作业规划装置可获取测绘无人飞行器在作业区域块上方飞行时采集到的多帧图像,并根据所述多帧图像获取作业区域块中作业对象的位置。同时,作业规划装置在确定目标子作业区域块中作业对象的作业顺序之前,还会确定作业区域块的最先作业的作业对象。
图6所示作业规划方法包括但不限于S601-S611:
S601:作业规划装置获取测绘无人飞行器在作业区域块上方飞行时采集到的多帧图像。
测绘无人飞行器为对作业区域块进行测绘拍摄的无人飞行器。测绘无人飞行器在作业区域块上方飞行时,对作业区域块进行拍摄,采集关于作业区域块的多帧图像。
在一种实现方式中,测绘无人飞行器可以为进行作业规划的无人机。可选的,测绘无人飞行器也可以是该作业规划所涉及的无人机之外的其他无人飞行器。其中,测绘无人飞行器获得语义地图后,可将该语义地图或该语义地图中的多帧图像发送给作业规划装置,使得作业规划装置获取作业区域块中作业对象的位置。或者作业规划装置可以获取所述多帧图像,根据所述多帧图像获取点云信息,并根据所述点云获取所述作业对象的位置。进一步地,可以将所述点云输入预设的识别模型(例如神经网络模型),并获取所述识别模型输出的所述作业对象的位置。
作业规划装置获取测绘无人飞行器在作业区域块上方飞行时采集到的多帧图像。其中,多帧图像中的每帧图像都包括部分或全部的关于作业区域块的信息,该信息包括作业对象的位置信息。
在一种实现方式中,作业规划装置将作业区域沿第二直线方向划分成多个作业区域块;获取测绘无人飞行器在作业区域块上方飞行时采集到的多帧图像,包括:获取测绘无人飞行器在作业区域上方飞行时采集到的多帧图像。
S602:作业规划装置根据多帧图像获取作业区域块中作业对象的位置。
作业规划装置根据多帧图像中的各稀疏点获取作业区域块中作业对象的位置,即多帧图像中各稀疏点的位置为作业对象的位置。
S603:作业规划装置在第一直线方向将作业区域块划分为多个子作业区域块。
S604:作业规划装置根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块。
本申请实施例中的步骤S603-S604具体可参见上述实施例中步骤S402-S403的执行过程,本申请实施例不再赘述。
S605:作业规划装置从目标子作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域。
例如,在图5d中,作业规划装置从四个目标子作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域为目标子作业区域A和目标子作业区域D。
S606:作业规划装置确定两个目标子作业区域中每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象。
例如,在S605中,作业规划装置确定图5d中在第一直线方向上距离最远的两个目标子作业区域为目标子作业区域块A和目标子作业区域块D,目标子作业区域块A中在第二直线方向上距离最远的两个作业对象为圆点A和圆点B,目标子作业区域块D中只包含一个作业对象,作业规划装置确定的在目标子作业区域块D中的作业对象为圆点F。
S607:作业规划装置确定无人机开始作业前的参考位置与每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离。
无人机开始作业前的参考位置为无人机的初始位置,也即参考位置为无人机开机时所处的位置或接收到控制终端发送的开始作业指令时所处的位置。无人机开始作业前的参考位置是由无人机的全球定位系统(Global Positioning System,GPS)定位得到的。
作业规划装置确定无人机开始作业前的参考位置后,根据所述参考位置和在S606中确定的每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象的位置,确定无人机开始作业前的参考位置与每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离,以确定最先开始作业的作业对象。例如,在图5d所示的作业区域块中,作业规划装置通过GPS得到的无人机A在开始作业前的参考位置为A处,然后根据A处的位置和作业区域块中圆点A、圆点B和圆点F的位置,确定无人机A在A处与圆点A、圆点B和圆点F的距离,具体如图5d中所标示的l1、l2、l3所示。
S608:作业规划装置将距离最小的作业对象确定为作业区域块的最先作业的作业对象。
作业规划装置将距离最小的作业对象确定为作业区域块的最先作业的作业对象,同时,作业规划装置将确定的最先作业的作业对象所在的目标子作业区域块作为最先作业的目标子作业区域块。
例如,在图5d中,作业规划装置确定开始作业前的参考位置与每一个目标子作业区域中每一个作业对象的距离最小的作业对象为圆点A,因此将圆点A作为作业区域块中最先作业的作业对象,同时也将圆点A所在的目标子作业区域块A作为最先作业的目标子作业区域块,即在作业区域块中,无人机A先从目标子作业区域块A中的作业对象圆点A开始作业。
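S605-S608 选取最先作业的作业对象的过程可示意如下。其中假设第一直线方向为 x、第二直线方向为 y,并以欧氏距离作为距离度量,这些约定均为本文假设:

```python
import math

# 示意:从 x 方向上最远的两个目标子作业区域块中,各取 y 方向上距离最远的
# 两个作业对象作为候选,选与参考位置距离最小者作为作业区域块最先作业的作业对象。
def first_object(strips, ref):
    """strips: {条带序号: [(x, y), ...]};ref: 无人机开始作业前的参考位置。"""
    lo, hi = min(strips), max(strips)              # x 方向上最远的两个条带
    candidates = []
    for idx in {lo, hi}:
        ys = sorted(strips[idx], key=lambda p: p[1])
        candidates += [ys[0], ys[-1]]              # y 方向上距离最远的两个作业对象
    return min(candidates, key=lambda p: math.dist(p, ref))

strips = {0: [(0.2, 0.0), (0.4, 4.0)], 3: [(3.5, 2.0)]}
start = first_object(strips, ref=(0.0, 0.0))       # 距参考位置最近的候选对象
```

条带内只有一个作业对象时(如上例条带 3),两个"最远"候选退化为同一对象,逻辑依然成立。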
S609:作业规划装置确定目标子作业区域块中作业对象的作业顺序。
S610:作业规划装置确定在目标子作业区域块之间的作业顺序。
本申请实施例中的步骤S609-S610具体可参见上述实施例中步骤S404-S405的执行过程,本申请实施例不再赘述。
作业规划装置在S610之后,还可以执行图7所示的方法,图7所示实施例主要为作业规划装置确定下一个目标子作业区域块中最先作业的作业对象的方法,所述方法包括以下步骤:
S701:作业规划装置确定下一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象。
作业规划装置确定在目标子作业区域块之间的作业顺序之后,就会确定出在目标子作业区域块之间的上一个目标子作业区域块、下一个目标子作业区域块。上一个目标子作业区域块、下一个目标子作业区域块都是相对存在的,无人机对当前目标子作业区域块中的作业对象作业结束后,当前目标子作业区域块就成为上一个目标子作业区域块,在上一个目标子作业区域块后需要作业的目标子作业区域块为下一个目标子作业区域块。因此,作业规划装置确定在目标子作业区域块之间的作业顺序之后,可以确定下一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象。
例如,在图8中,当无人机对当前目标子作业区域块A作业结束后,目标子作业区域块A为上一个目标子作业区域块,下一个需要作业的目标子作业区域块为目标子作业区域块B,作业规划装置确定目标子作业区域块B中在第二直线方向上距离最远的两个作业对象为圆点B和圆点C。
S702:作业规划装置确定上一个目标子作业区域块中最后作业的作业对象与两个作业对象之间的距离。
例如,如图8所示,上一个目标子作业区域块为目标子作业区域块A,在目标子作业区域块A中最后作业的作业对象为圆点A,圆点A与两个作业对象圆点B和圆点C之间的距离如图8中的直线l3、l4所示。
S703:作业规划装置将两个作业对象中与最后作业的作业对象之间的距离较小的作业对象确定为下一个目标子作业区域块中最先作业的作业对象。
例如,从图8中可以看出,在下一个目标子作业区域块B的两个作业对象圆点B和圆点C中,与上一个目标子作业区域块A中最后作业对象圆点A的距离较小的为圆点B,因此作业规划装置将圆点B作为下一个目标子作业区域块B中最先作业的作业对象,也即当无人机对目标子作业区域块A中的作业对象圆点A作业结束后,对目标子作业区域块B中的作业对象圆点B先进行作业,再对作业对象圆点C进行作业。
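S701-S703 的条带间衔接逻辑可示意如下(坐标形式与欧氏距离度量为本文假设):

```python
import math

# 示意:在下一个目标子作业区域块中取第二直线方向(y)上距离最远的两个作业对象,
# 与上一个条带最后作业的作业对象比较距离,较近者即下一个条带最先作业的作业对象。
def entry_object(next_strip, last_obj):
    ys = sorted(next_strip, key=lambda p: p[1])
    a, b = ys[0], ys[-1]                           # y 方向上距离最远的两个作业对象
    return a if math.dist(a, last_obj) <= math.dist(b, last_obj) else b

nxt = entry_object([(1.0, 0.5), (1.0, 3.0)], last_obj=(0.0, 0.0))
```

从距上一个作业对象较近的一端进入下一个条带,可缩短条带间的空飞距离。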
S611:作业规划装置根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的作业对象进行作业。
本申请实施例中的步骤S611具体可参见上述实施例中步骤S406的执行过程,本申请实施例不再赘述。
在本申请实施例中,作业规划装置获取测绘无人飞行器在作业区域块上方飞行时采集到的多帧图像,根据多帧图像获取作业区域块中作业对象的位置,在第一直线方向将作业区域块划分为多个子作业区域块,根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块,从目标子作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域,确定两个目标子作业区域中每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象,确定无人机开始作业前的参考位置与每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离,将距离最小的作业对象确定为作业区域块的最先作业的作业对象,作业规划装置确定目标子作业区域块中作业对象的作业顺序,作业规划装置根据目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序,控制无人机对作业区域块中的对象进行作业。无人机按照目标子作业区域块中作业对象的作业顺序和目标子作业区域块之间的作业顺序对作业区域块进行作业,可对作业区域块中的每个作业对象进行作业,因此也可实现对作业区域块的精准作业。
请参见图9,图9是本发明实施例提供的另一种作业规划方法的流程示意图,图9所示实施例与图4、图6、图7所示实施例的不同点在于,当无人机需要进行作业的作业区域块的数量为多个时,作业规划装置先从多个目标区域块中确定出包括作业对象的作业区域块,然后再确定目标子作业区域块中作业对象的作业顺序、目标子作业区域块之间的作业顺序以及在作业区域块之间的作业顺序。
图9所示作业规划方法包括但不限于S901-S910:
S901:作业规划装置获取测绘无人飞行器在目标区域上方飞行时采集到的多帧图像。
当作业区域块的数量大于1时,目标区域为无人机需要进行作业的整个区域。例如,目标区域为图10中所标示的区域。
S902:作业规划装置将目标区域沿第二直线方向划分成多个目标区域块。
如图10所示,第一直线方向为竖轴方向,第二直线方向为横轴方向,因此,作业规划装置将多帧图像中的目标区域沿横轴方向划分为多个目标区域块。
S903:作业规划装置获取目标区域块中作业对象的位置。
S904:作业规划装置根据目标区域块中作业对象的位置从多个目标区域块中确定包括作业对象的作业区域块。
具体的,作业规划装置将包含有作业对象的目标区域块确定为作业区域块。例如在图10中,作业区域块包括作业区域块A、作业区域块B以及作业区域块C。
S905:作业规划装置在第一直线方向将作业区域块划分为多个子作业区域块。
S906:作业规划装置根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块。
S907:作业规划装置确定目标子作业区域块中作业对象的作业顺序。
S908:作业规划装置确定在目标子作业区域块之间的作业顺序。
本申请实施例中的步骤S905-S908具体可参见上述实施例中步骤S402-S405的执行过程,本申请实施例不再赘述。
S909:作业规划装置确定在作业区域块之间的作业顺序。
作业规划装置确定在作业区域块之间的作业顺序,其中,按照作业区域块之间的作业顺序确定每个下一个作业区域块在第二直线方向上远离其上一个作业区域块。
具体的,作业规划装置确定在作业区域块之间的作业顺序的详细步骤如图11所示,具体步骤如下:
S1101:作业规划装置确定下一个作业区域块中在第一直线方向上最远的两个目标子作业区域。
例如,在图12中,作业规划装置确定作业区域块A为上一个作业区域块,作业区域块B为下一个作业区域块,然后作业规划装置确定作业区域块B中在第一直线方向上最远的两个目标子作业区域为目标子作业区域A和目标子作业区域B。
S1102:作业规划装置从两个目标子作业区域中每一个目标子作业区域中确定一个在第二直线方向上最靠近上一个作业区域块的作业对象。
例如,在图12中,作业规划装置在目标子作业区域A中确定的一个在第二直线方向上最靠近上一个作业区域块A的作业对象为作业对象B,即圆点B,作业规划装置在目标子作业区域B中确定的一个在第二直线方向上最靠近上一个作业区域块A的作业对象为作业对象C,即圆点C。
S1103:作业规划装置从两个在第二直线方向上最靠近上一个作业区域块的作业对象中,确定一个作业对象,作为下一个作业区域块中最先作业的作业对象。
具体的,作业规划装置确定上一个作业区域块中最后作业的作业对象;然后作业规划装置确定最后作业的作业对象与两个在第二直线方向上最靠近上一个作业区域块的作业对象之间的距离;最后作业规划装置将两个在第二直线方向上最靠近上一个作业区域块的作业对象中与最后作业的作业对象距离较小的在第二直线方向上最靠近上一个作业区域块的作业对象,确定为下一个作业区域块中最先作业的作业对象。
比如,在图12中,作业规划装置确定上一个作业区域块A中最后作业的作业对象为作业对象A,最后作业的作业对象A与两个在第二直线方向上最靠近上一个作业区域块的作业对象B和作业对象C之间的距离分别如图12中的l5、l6所示,可以看出作业对象B与作业对象A的距离小于作业对象C与作业对象A的距离,因此,将作业对象B确定作为下一个作业区域块B中最先作业的作业对象,即无人机在对作业区域块A中的作业对象A进行作业结束后,先对作业区域块B中的作业对象B进行作业。
S910:作业规划装置根据目标子作业区域块中作业对象的作业顺序、在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对作业区域块中的作业对象进行作业。
作业规划装置在根据目标子作业区域块中作业对象的作业顺序、在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对作业区域块中的对象进行作业之前,还可以执行如图13所示的步骤。图13所示方法为作业规划装置确定目标区域的最先作业的作业对象的方法,该方法包括以下步骤:
S1301:作业规划装置从包括作业对象的作业区域块中确定在第二直线方向上距离最远的两个作业区域块。
例如,如图14所示,作业规划装置从包括作业对象的作业区域块A、作业区域块B以及作业区域块C中确定在第二直线方向上距离最远的两个作业区域块为作业区域块A和作业区域块B。
S1302:作业规划装置从两个作业区域块的每一个作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域。
例如,如图14所示,作业规划装置从作业区域块A中确定在第一直线方向上距离最远的两个目标子作业区域为目标子作业区域A和目标子作业区域B,从作业区域块B中确定在第一直线方向上距离最远的两个目标子作业区域为目标子作业区域C和目标子作业区域D。
S1303:作业规划装置确定每一个作业区域块对应的两个目标子作业区域中每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象。
例如,如图14所示,作业规划装置确定目标子作业区域A中在第二直线方向上距离最远的两个作业对象为作业对象A和作业对象B,确定目标子作业区域B中在第二直线方向上距离最远的两个作业对象为作业对象C和作业对象D,确定目标子作业区域C中在第二直线方向上距离最远的两个作业对象为作业对象E和作业对象F;目标子作业区域D中只有一个作业对象,因此确定出的作业对象仅为作业对象G。
S1304:作业规划装置确定无人机开始作业前的参考位置与每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离。
参考位置为无人机开机时所处的位置或接收到控制终端发送的开始作业指令时所处的位置。例如,如图14所示,无人机开机时所处的位置为图14中所示的五角星所处的位置A处,因此作业规划装置可以根据无人机开始作业前的参考位置和S1303中确定的各个作业对象,确定A点与各个作业对象之间的距离。
S1305:作业规划装置将距离最小的作业对象确定为目标区域的最先作业的作业对象。
作业规划装置从上述确定的无人机开始作业前的参考位置与每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离中,确定出作业对象与无人机开始作业前的参考位置的距离最小的作业对象,并将该作业对象确定为目标区域的最先作业的作业对象,即无人机开始对作业区域进行作业时最先对该作业对象和该作业对象所在的作业区域块进行作业。比如,在图14中,作业对象A与位置A处的距离最短,则作业规划装置将作业对象A确定为目标区域的最先作业的作业对象,即无人机在对作业对象进行作业时先对作业对象A以及作业对象A所在的作业区域块A进行作业。
因此,作业规划装置根据目标子作业区域块中作业对象的作业顺序、在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对作业区域块中的对象进行作业包括:作业规划装置根据目标区域的最先作业的作业对象、目标子作业区域块中作业对象的作业顺序、在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对作业区域块中的对象进行作业,实现对多个作业区域块中的所有作业对象的精准作业。
在申请实施例中,作业规划装置从多个目标区域块中确定出包括作业对象的作业区域块,然后确定出目标子作业区域块中作业对象的作业顺序、在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序,最后控制无人机根据上述作业顺序对每个作业区域块中的每个作业对象进行作业,实现了对多个作业区域块中每个作业对象的精准作业。
请参见图15,图15是本发明实施例提供的一种作业规划装置的结构示意图,本发明实施例中所描述的作业规划装置150,包括:处理器1501和存储器1502,处理器1501和存储器1502通过一条或多条通信总线连接。
上述处理器1501可以是中央处理单元(Central Processing Unit,CPU),该处理器还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等,处理器1501被配置为支持作业规划装置执行图4、图6、图7、图9或图11所述方法中作业规划装置相应的功能。
上述存储器1502可以包括只读存储器和随机存取存储器,并向处理器1501提供计算机程序和数据。存储器1502的一部分还可以包括非易失性随机存取存储器。其中,所述处理器1501调用所述计算机程序时用于执行:
获取作业区域块中作业对象的位置;
在第一直线方向将所述作业区域块划分为多个子作业区域块;
根据作业区域块中作业对象的位置确定多个子作业区域块中的目标子作业区域块,其中,目标子作业区域块为多个子作业区域块中包括作业对象的子作业区域块;
确定目标子作业区域块中作业对象的作业顺序,其中,按照作业对象的作业顺序确定的每个下一个作业对象在与第一直线方向垂直的第二直线方向上远离其上一个作业对象;
确定在目标子作业区域块之间的作业顺序,其中,按照目标子作业区域块之间的作业顺序确定的每个下一个目标子作业区域块在第一直线方向上远离其上一个目标子作业区域块;
根据目标子作业区域块中作业对象的作业顺序和所述目标子作业区域块之间的作业顺序,控制无人机对所述作业区域块中的作业对象进行作业。
在一种实现方式中,作业包括喷洒作业、拍摄作业和物质采集作业中的一种或多种。
在一种实现方式中,处理器1501,还用于将作业区域沿第二直线方向划分成多个作业区域块;处理器1501,还用于获取测绘无人飞行器在作业区域上方飞行时采集到的多帧图像。
在一种实现方式中,处理器1501,还用于确定下一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;处理器1501,还用于确定上一个目标子作业区域块中最后作业的作业对象与两个作业对象之间的距离;处理器1501,还用于将两个作业对象中与最后作业的作业对象之间的距离较小的作业对象确定为下一个目标子作业区域块中最先作业的作业对象。
在一种实现方式中,处理器1501,还用于获取目标区域块中作业对象的位置;处理器1501,还用于将目标区域沿第二直线方向划分成多个目标区域块;处理器1501,还用于根据目标区域块中作业对象的位置从多个目标区域块中确定包括作业对象的作业区域块;处理器1501,还用于确定在作业区域块之间的作业顺序,其中,按照作业区域块之间的作业顺序确定每个下一个作业区域块在第二直线方向上远离其上一个作业区域块;处理器1501,还用于根据目标子作业区域块中作业对象的作业顺序、在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对作业区域块中的作业对象进行作业。
在一种实现方式中,处理器1501,还用于确定下一个作业区域块中在第一直线方向上最远的两个目标子作业区域;处理器1501,还用于从两个目标子作业区域中每一个目标子作业区域中确定一个在第二直线方向上最靠近上一个作业区域块的作业对象;处理器1501,还用于从两个在第二直线方向上最靠近上一个作业区域块的作业对象中,确定一个作业对象确定作为下一个作业区域块中最先作业的作业对象。
在一种实现方式中,处理器1501,还用于确定上一个作业区域块中最后作业的作业对象;处理器1501,还用于确定最后作业的作业对象与两个在第二直线方向上最靠近上一个作业区域块的作业对象之间的距离;处理器1501,还用于将两个在第二直线方向上最靠近上一个作业区域块的作业对象中与最后作业的作业对象距离较小的在第二直线方向上最靠近上一个作业区域块的作业对象,确定为下一个作业区域块中最先作业的作业对象。
在一种实现方式中,处理器1501,还用于从目标子作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域块;处理器1501,还用于确定两个目标子作业区域块中每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;处理器1501,还用于确定无人机开始作业前的参考位置与每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;处理器1501,还用于将距离最小的作业对象确定为作业区域块的最先作业的作业对象。
在一种实现方式中,处理器1501,还用于从包括作业对象的作业区域块中确定在第二直线方向上距离最远的两个作业区域块;处理器1501,还用于从两个作业区域块的每一个作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域;处理器1501,还用于确定每一个作业区域块对应的两个目标子作业区域中每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象;处理器1501,还用于确定无人机开始作业前的参考位置与每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;处理器1501,还用于将距离最小的作业对象确定为目标区域的最先作业的作业对象。
在一种实现方式中,参考位置为无人机开机时所处的位置或接收到控制终端发送的开始作业指令时所处的位置。
本申请实施例还提供一种可读存储介质,所述可读存储介质存储有计算机程序,所述计算机程序被处理器执行时,可以用于实现本申请实施例图4、图6、图7、图9或图11所对应实施例中描述的作业规划方法,在此不再赘述。
所述计算机可读存储介质可以是前述任一实施例所述的作业规划装置的内部存储单元,例如设备的硬盘或内存。所述计算机可读存储介质也可以是所述作业规划装置的外部存储设备,例如所述设备上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步地,所述计算机可读存储介质还可以既包括所述作业规划装置的内部存储单元也包括外部存储设备。所述计算机可读存储介质用于存储所述计算机程序以及所述作业规划装置所需的其他程序和数据。所述计算机可读存储介质还可以用于暂时地存储已经输出或者将要输出的数据。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一可读取存储介质中,所述程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)或随机存储记忆体(Random Access Memory,RAM)等。
以上所揭露的仅为本申请较佳实施例而已,当然不能以此来限定本申请之权利范围,因此依本申请权利要求所作的等同变化,仍属本申请所涵盖的范围。

Claims (23)

  1. 一种作业规划方法,其特征在于,包括:
    获取作业区域块中作业对象的位置;
    在第一直线方向将所述作业区域块划分为多个子作业区域块;
    根据所述作业区域块中作业对象的位置确定所述多个子作业区域块中的目标子作业区域块,其中,所述目标子作业区域块为所述多个子作业区域块中包括所述作业对象的子作业区域块;
    确定所述目标子作业区域块中所述作业对象的作业顺序,其中,按照所述作业对象的作业顺序确定的每个下一个作业对象在与第一直线方向垂直的第二直线方向上远离其上一个作业对象;
    确定在所述目标子作业区域块之间的作业顺序,其中,按照所述目标子作业区域块之间的作业顺序确定的每个下一个目标子作业区域块在第一直线方向上远离其上一个目标子作业区域块;
    根据所述目标子作业区域块中作业对象的作业顺序和所述目标子作业区域块之间的作业顺序,控制无人机对所述作业区域块中的作业对象进行作业。
  2. 根据权利要求1所述的方法,其特征在于,所述作业包括喷洒作业、拍摄作业和物质采集作业中的一种或多种。
  3. 根据权利要求1或2所述的方法,其特征在于,所述方法还包括:
    获取测绘无人飞行器在所述作业区域块上方飞行时采集到的多帧图像;
    所述获取作业区域块中作业对象的位置,包括:
    根据所述多帧图像获取所述作业区域块中作业对象的位置。
  4. 根据权利要求3所述的方法,其特征在于,所述方法还包括:
    将作业区域沿第二直线方向划分成多个所述作业区域块;
    所述获取测绘无人飞行器在所述作业区域块上方飞行时采集到的多帧图像,包括:
    获取所述测绘无人飞行器在所述作业区域上方飞行时采集到的多帧图像。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述方法还包括:
    确定所述下一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;
    确定所述上一个目标子作业区域块中最后作业的作业对象与所述两个作业对象之间的距离;
    将所述两个作业对象中与所述最后作业的作业对象之间的距离较小的作业对象确定为所述下一个目标子作业区域块中最先作业的作业对象。
  6. 根据权利要求1-5任一项所述的方法,其特征在于,
    所述获取作业区域块中作业对象的位置,包括:
    获取目标区域块中作业对象的位置;
    所述方法还包括:
    将目标区域沿第二直线方向划分成多个所述目标区域块;
    根据所述目标区域块中作业对象的位置从所述多个目标区域块中确定包括作业对象的作业区域块;
    确定在作业区域块之间的作业顺序,其中,按照所述作业区域块之间的作业顺序确定每个下一个作业区域块在第二直线方向上远离其上一个作业区域块;
    所述根据所述目标子作业区域块中作业对象的作业顺序和所述在目标子作业区域块之间的作业顺序控制无人机对所述作业区域块中的作业对象进行作业,包括:
    根据所述目标子作业区域块中作业对象的作业顺序、所述在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对所述作业区域块中的作业对象进行作业。
  7. 根据权利要求6所述的方法,其特征在于,所述方法还包括:
    确定所述下一个作业区域块中在第一直线方向上最远的两个目标子作业区域;
    从所述两个目标子作业区域中每一个目标子作业区域中确定一个在第二直线方向上最靠近所述上一个作业区域块的作业对象;
    从两个在第二直线方向上最靠近所述上一个作业区域块的作业对象中,确定一个作业对象确定作为所述下一个作业区域块中最先作业的作业对象。
  8. 根据权利要求7所述的方法,其特征在于,
    所述从两个在第二直线方向上最靠近所述上一个作业区域块的作业对象中,确定一个作业对象,作为所述下一个作业区域块中最先作业的作业对象,包括:
    确定所述上一个作业区域块中最后作业的作业对象;
    确定所述最后作业的作业对象与所述两个在第二直线方向上最靠近所述上一个作业区域块的作业对象之间的距离;
    将所述两个在第二直线方向上最靠近所述上一个作业区域块的作业对象中与所述最后作业的作业对象距离较小的在第二直线方向上最靠近所述上一个作业区域块的作业对象,确定为所述下一个作业区域块中最先作业的作业对象。
  9. 根据权利要求1-3任一项所述的方法,其特征在于,所述方法还包括:
    从所述目标子作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域块;
    确定所述两个目标子作业区域块中每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;
    确定无人机开始作业前的参考位置与每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;
    将所述距离最小的作业对象确定为所述作业区域块的最先作业的作业对象。
  10. 根据权利要求6-8任一项所述的方法,其特征在于,所述方法还包括:
    从包括作业对象的作业区域块中确定在第二直线方向上距离最远的两个作业区域块;
    从所述两个作业区域块的每一个作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域;
    确定每一个作业区域块对应的所述两个目标子作业区域中每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象;
    确定无人机开始作业前的参考位置与每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;
    将所述距离最小的作业对象确定为所述目标区域的最先作业的作业对象。
  11. 根据权利要求9或10所述的方法,其特征在于,所述参考位置为无人机开机时所处的位置或接收到控制终端发送的开始作业指令时所处的位置。
  12. 一种作业规划装置,其特征在于,包括存储器和处理器;
    所述存储器,用于存储程序代码;
    所述处理器,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
    获取作业区域块中作业对象的位置;
    在第一直线方向将所述作业区域块划分为多个子作业区域块;
    根据所述作业区域块中作业对象的位置确定所述多个子作业区域块中的目标子作业区域块,其中,所述目标子作业区域块为所述多个子作业区域块中包括所述作业对象的子作业区域块;
    确定所述目标子作业区域块中所述作业对象的作业顺序,其中,按照所述作业对象的作业顺序确定的每个下一个作业对象在与第一直线方向垂直的第二直线方向上远离其上一个作业对象;
    确定在所述目标子作业区域块之间的作业顺序,其中,按照所述目标子作业区域块之间的作业顺序确定的每个下一个目标子作业区域块在第一直线方向上远离其上一个目标子作业区域块;
    根据所述目标子作业区域块中作业对象的作业顺序和所述目标子作业区域块之间的作业顺序,控制无人机对所述作业区域块中的作业对象进行作业。
  13. 根据权利要求12所述的作业规划装置,其特征在于,所述作业包括喷洒作业、拍摄作业和物质采集作业中的一种或多种。
  14. 根据权利要求12或13所述的作业规划装置,其特征在于,所述处理器还用于执行以下操作:
    获取测绘无人飞行器在所述作业区域块上方飞行时采集到的多帧图像;
    所述处理器获取作业区域块中作业对象的位置,具体为:
    根据所述多帧图像获取所述作业区域块中作业对象的位置。
  15. 根据权利要求14所述的作业规划装置,其特征在于,所述处理器还用于执行以下操作:
    将作业区域沿第二直线方向划分成多个所述作业区域块;
    所述处理器获取测绘无人飞行器在所述作业区域块上方飞行时采集到的多帧图像,具体用于执行以下操作:
    获取所述测绘无人飞行器在所述作业区域上方飞行时采集到的多帧图像。
  16. 根据权利要求12-15任一项所述的作业规划装置,其特征在于,所述处理器还用于执行以下操作:
    确定所述下一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;
    确定所述上一个目标子作业区域块中最后作业的作业对象与所述两个作业对象之间的距离;
    将所述两个作业对象中与所述最后作业的作业对象之间的距离较小的作业对象确定为所述下一个目标子作业区域块中最先作业的作业对象。
  17. 根据权利要求12-16任一项所述的作业规划装置,其特征在于,
    所述处理器获取作业区域块中作业对象的位置,具体执行以下操作:
    获取目标区域块中作业对象的位置;
    所述处理器还用于执行以下操作:
    将目标区域沿第二直线方向划分成多个所述目标区域块;
    根据所述目标区域块中作业对象的位置从所述多个目标区域块中确定包括作业对象的作业区域块;
    确定在作业区域块之间的作业顺序,其中,按照所述作业区域块之间的作业顺序确定每个下一个作业区域块在第二直线方向上远离其上一个作业区域块;
    所述处理器根据所述目标子作业区域块中作业对象的作业顺序和所述在目标子作业区域块之间的作业顺序控制无人机对所述作业区域块中的对象进行作业,具体执行以下操作:
    根据所述目标子作业区域块中作业对象的作业顺序、所述在目标子作业区域块之间的作业顺序和作业区域块之间的作业顺序控制无人机对所述作业区域块中的对象进行作业。
  18. 根据权利要求17所述的作业规划装置,其特征在于,所述处理器还用于执行以下操作:
    确定所述下一个作业区域块中在第一直线方向上最远的两个目标子作业区域;
    从所述两个目标子作业区域中每一个目标子作业区域中确定一个在第二直线方向上最靠近所述上一个作业区域块的作业对象;
    从两个在第二直线方向上最靠近所述上一个作业区域块的作业对象中,确定一个作业对象,作为所述下一个作业区域块中最先作业的作业对象。
  19. 根据权利要求18所述的作业规划装置,其特征在于,
    所述处理器从两个在第二直线方向上最靠近所述上一个作业区域块的作业对象中,确定一个作业对象,作为所述下一个作业区域块中最先作业的作业对象,具体用于执行以下操作:
    确定所述上一个作业区域块中最后作业的作业对象;
    确定所述最后作业的作业对象与所述两个在第二直线方向上最靠近所述上一个作业区域块的作业对象之间的距离;
    将所述两个在第二直线方向上最靠近所述上一个作业区域块的作业对象中与所述最后作业的作业对象距离较小的在第二直线方向上最靠近所述上一个作业区域块的作业对象,确定为所述下一个作业区域块中最先作业的作业对象。
  20. 根据权利要求12-14任一项所述的作业规划装置,其特征在于,所述处理器还用于执行以下操作:
    从所述目标子作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域块;
    确定所述两个目标子作业区域块中每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象;
    确定无人机开始作业前的参考位置与每一个目标子作业区域块中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;
    将所述距离最小的作业对象确定为所述作业区域块的最先作业的作业对象。
  21. 根据权利要求17-19任一项所述的作业规划装置,其特征在于,所述处理器还用于执行以下操作:
    从包括作业对象的作业区域块中确定在第二直线方向上距离最远的两个作业区域块;
    从所述两个作业区域块的每一个作业区域块中确定在第一直线方向上距离最远的两个目标子作业区域;
    确定每一个作业区域块对应的所述两个目标子作业区域中每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象;
    确定无人机开始作业前的参考位置与每一个目标子作业区域中在第二直线方向上距离最远的两个作业对象中每一个作业对象之间的距离;
    将所述距离最小的作业对象确定为所述目标区域的最先作业的作业对象。
  22. 根据权利要求20或21所述的作业规划装置,其特征在于,所述参考位置为无人机开机时所处的位置或接收到控制终端发送的开始作业指令时所处的位置。
  23. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机程序,所述计算机程序在被执行时,实现如权利要求1至11任一项所述的作业规划方法。
PCT/CN2020/077193 2020-02-28 2020-02-28 一种作业规划方法、装置及存储介质 WO2021168793A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/077193 WO2021168793A1 (zh) 2020-02-28 2020-02-28 一种作业规划方法、装置及存储介质


Publications (1)

Publication Number Publication Date
WO2021168793A1 true WO2021168793A1 (zh) 2021-09-02

Family

ID=77489739

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077193 WO2021168793A1 (zh) 2020-02-28 2020-02-28 一种作业规划方法、装置及存储介质

Country Status (1)

Country Link
WO (1) WO2021168793A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023082066A1 (zh) * 2021-11-09 2023-05-19 深圳市大疆创新科技有限公司 作业规划方法、控制装置、控制终端及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10068489B2 (en) * 2016-08-31 2018-09-04 Skycatch, Inc. Managing energy during flight of unmanned aerial vehicles for safe return to ground
CN108919832A (zh) * 2018-07-23 2018-11-30 京东方科技集团股份有限公司 无人机作业航线规划方法、无人机施药方法及装置
CN109035871A (zh) * 2018-07-17 2018-12-18 深圳常锋信息技术有限公司 无人机飞行路线规划方法、装置、系统及智能终端
CN109407701A (zh) * 2018-11-30 2019-03-01 广州极飞科技有限公司 控制作业的方法、系统及装置




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20922462

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20922462

Country of ref document: EP

Kind code of ref document: A1